The basis of shared intentions in human and robot cognition

Cited by: 31
Authors
Dominey, Peter Ford [1 ]
Warneken, Felix [2 ]
Affiliations
[1] INSERM, U846, Stem Cell & Brain Res Inst, Robot Cognit Lab, F-69675 Bron, France
[2] Harvard Univ, Dept Psychol, Cambridge, MA 02138 USA
Keywords
Human-robot cooperation; Development; Social cognition; Spoken language; Vision; Action; Imitation; GRAMMATICAL CONSTRUCTION; IMITATION; CHILDREN; INFANTS; EVENTS;
DOI
10.1016/j.newideapsych.2009.07.006
Chinese Library Classification (CLC): B84 [Psychology]
Discipline classification code: 04; 0402
Abstract
There is a fundamental difference between robots that are equipped with sensory, motor and cognitive capabilities and simulations or non-embodied cognitive systems. Via their perceptual and motor capabilities, these robotic systems can interact with humans in an increasingly "natural" way, physically interacting with shared objects in cooperative action settings. Indeed, such cognitive robotic systems offer developmental psychologists a unique opportunity to implement their theories and test their hypotheses on systems that are becoming increasingly "at home" in the sensory-motor and social worlds where those hypotheses are relevant. The current research results from an interaction between computational neuroscience and robotics on the one hand, and developmental psychology on the other. One key finding from developmental psychology is that, compared with other primates, humans appear to have a unique ability and motivation to share goals and intentions with others. This ability is expressed in cooperative behavior very early in life and appears to be the basis for the subsequent development of social cognition. Here we attempt to identify a set of core functional elements of cooperative behavior and the corresponding shared intentional representations. We then begin to specify how these capabilities can be implemented in a robotic system, the Cooperator, and tested in human-robot interaction experiments. Based on the results of these experiments, we discuss the mutual benefits for both fields of the interaction between robotics and developmental psychology. (C) 2009 Elsevier Ltd. All rights reserved.
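The abstract describes the Cooperator's shared intentional representations only at a high level. As a purely illustrative sketch (not taken from the paper), the Python fragment below assumes that a shared plan can be encoded as an ordered sequence of steps, each attributed to one of the two agents, with a role-reversal operation that swaps those attributions; every name and structure here is a hypothetical stand-in, not the system's actual implementation.

from dataclasses import dataclass
from typing import List, Optional

# Illustrative assumption: a cooperative plan is an ordered list of steps,
# each step assigned to either the "human" or the "robot".

@dataclass
class Step:
    agent: str    # "human" or "robot"
    action: str   # e.g. "place the green block on the red block"

@dataclass
class SharedPlan:
    goal: str
    steps: List[Step]

    def next_step_for(self, agent: str) -> Optional[Step]:
        # First step in the plan attributed to the given agent, if any.
        for step in self.steps:
            if step.agent == agent:
                return step
        return None

    def reverse_roles(self) -> "SharedPlan":
        # Swap the two agents' roles while keeping the goal and the
        # action order, in the spirit of role reversal in cooperative games.
        swapped = [Step("human" if s.agent == "robot" else "robot", s.action)
                   for s in self.steps]
        return SharedPlan(self.goal, swapped)

# Example: a two-step stacking game, then the same game with roles reversed.
plan = SharedPlan(
    goal="build the block tower together",
    steps=[Step("human", "place the red block"),
           Step("robot", "place the green block on the red block")],
)
print(plan.next_step_for("robot").action)                   # robot's contribution
print(plan.reverse_roles().next_step_for("robot").action)   # after role reversal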
Pages: 260-274
Number of pages: 15
Related papers
50 records in total
  • [1] Human-robot negotiation of intentions based on virtual fixtures for shared task execution
    Wei, Dong
    Zhou, Hua
    Yang, Hua Yong
    2020 17TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS (UR), 2020, : 297 - 302
  • [2] Toward a construction-based account of shared intentions in social cognition
    Dominey, PF
    BEHAVIORAL AND BRAIN SCIENCES, 2005, 28 (05) : 696 - +
  • [3] Projecting Robot Intentions into Human Environments
    Andersen, Rasmus S.
    Madsen, Ole
    Moeslund, Thomas B.
    Amor, Heni Ben
    2016 25TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2016, : 294 - 301
  • [4] Understanding human-robot teams in light of all-human teams: Aspects of team interaction and shared cognition
    Demir, Mustafa
    McNeese, Nathan J.
    Cooke, Nancy J.
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, 2020, 140
  • [5] The neural basis of human moral cognition
    Moll, J
    Zahn, R
    de Oliveira-Souza, R
    Krueger, F
    Grafman, J
    NATURE REVIEWS NEUROSCIENCE, 2005, 6 (10) : 799 - 809
  • [6] Simulating others: the basis of human cognition?
    Cowley, SJ
    LANGUAGE SCIENCES, 2004, 26 (03) : 273 - 299
  • [7] The neural basis of human moral cognition
    Jorge Moll
    Roland Zahn
    Ricardo de Oliveira-Souza
    Frank Krueger
    Jordan Grafman
    Nature Reviews Neuroscience, 2005, 6 : 799 - 809
  • [8] Structure of Cognition and Basis of Human Cognizance
    Sterzinger, O.
ARCHIV FUR DIE GESAMTE PSYCHOLOGIE, 1930, 74 (3-4): 583 - 583
  • [9] Shared intentions and shared responsibility
    Sadler, Brook Jenkins
    MIDWEST STUDIES IN PHILOSOPHY VOLUME XXX: SHARED INTENTIONS AND COLLECTIVE RESPONSIBILITY, 2006, 30 : 115 - 144
  • [10] The Role of Intentions in Human-Robot Interaction
    Thill, Serge
    Ziemke, Tom
    COMPANION OF THE 2017 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'17), 2017, : 427 - 428