Embodied Cognition for Autonomous Interactive Robots
Author(s) - Guy Hoffman
Publication year - 2012
Publication title - Topics in Cognitive Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.191
H-Index - 56
eISSN - 1756-8765
pISSN - 1756-8757
DOI - 10.1111/j.1756-8765.2012.01218.x
Subject(s) - embodied cognition , robot , embodied agent , cognitive robotics , motor cognition , cognition , cognitive science , computer science , perception , modular design , artificial intelligence , human–computer interaction , human–robot interaction , psychology , social cognition , neuroscience , operating system
In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low‐level mechanisms such as dynamics and navigation. In contrast, most human‐like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation‐based model of top‐down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human–robot interaction based on recent psychological and neurological findings.
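To make the three principles concrete, the sketch below shows one possible way a perception-action loop could reflect them in code. This is a minimal illustration under assumed names (ModalPercept, EmbodiedAgent, perceive, act are invented for this example), not the framework or the robot implementations described in the article.

```python
# Hypothetical sketch of an EC-inspired perception-action loop; all class and
# method names are assumptions for illustration, not the paper's framework.

from dataclasses import dataclass, field


@dataclass
class ModalPercept:
    """(a) Modal perceptual representation: features stay tied to their
    sensory modality rather than being recoded into amodal symbols."""
    modality: str                      # e.g. "vision", "audio", "proprioception"
    features: dict = field(default_factory=dict)
    salience: float = 0.0


class EmbodiedAgent:
    def __init__(self):
        self.percepts: list[ModalPercept] = []
        self.expected: dict[str, float] = {}   # top-down expectations per modality

    def perceive(self, raw: dict) -> None:
        """Turn raw sensor readings into modal percepts, biased top-down."""
        self.percepts = []
        for modality, features in raw.items():
            # (c) Simulation-based top-down biasing: percepts in the currently
            # simulated/expected modality receive a salience boost.
            bias = self.expected.get(modality, 0.0)
            self.percepts.append(
                ModalPercept(modality=modality, features=features, salience=1.0 + bias)
            )

    def act(self) -> str:
        """(b) Action/perception integration: action selection reads the modal
        percepts directly, and acting updates what is expected next."""
        if not self.percepts:
            return "idle"
        target = max(self.percepts, key=lambda p: p.salience)
        # Simulating the outcome of attending to this modality biases the
        # next perception cycle toward it.
        self.expected = {target.modality: 0.5}
        return f"attend_to_{target.modality}"


if __name__ == "__main__":
    agent = EmbodiedAgent()
    agent.perceive({"vision": {"face_detected": True}, "audio": {"speech": False}})
    print(agent.act())   # "attend_to_vision"
    agent.perceive({"vision": {"face_detected": True}, "audio": {"speech": True}})
    print(agent.act())   # vision now carries the top-down bias from the prior action
```

The key design point the sketch tries to capture is that perception, action, and expectation share one loop and one modality-tagged representation, rather than passing amodal symbols between separate perception and reasoning modules.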