Biologically inspired multimodal integration: Interferences in a human-robot interaction game

Cited by: 7
Authors
Sauser, Eric L. [1 ]
Billard, Aude G. [1 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, LASA, Learning Algorithms & Syst Lab, CH-1015 Lausanne, Switzerland
Source
2006 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-12 | 2006
Funding
Swiss National Science Foundation
Keywords
DOI
10.1109/IROS.2006.282283
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject classification code
0812
Abstract
This paper presents a biologically inspired approach to multimodal integration and decision-making in the context of human-robot interaction. More specifically, we address the principle of ideomotor compatibility, by which observing the movements of others influences the quality of one's own performance. This fundamental human ability is likely linked with imitation, social interaction, the transfer of manual skills, and possibly mind reading. We present a robotic control model capable of multimodal integration and decision-making, and of replicating a stimulus-response compatibility task originally designed to measure the effect of ideomotor compatibility on human behavior. The model consists of a neural network based on the dynamic field approach, known for its natural capacity for stimulus enhancement as well as cooperative and competitive interactions within and across sensorimotor representations. Finally, we discuss how the capacity for ideomotor facilitation can provide the robot with human-like behavior, but at the expense of several drawbacks, such as hesitation and even mistakes.
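The "dynamic field approach" named in the abstract refers to Amari-style dynamic neural fields, in which localized excitation and broad lateral inhibition produce the stimulus enhancement and competition the authors describe. The sketch below is not the paper's implementation; the kernel shape, parameter values, and the two-Gaussian stimulus are illustrative assumptions chosen to show how a stronger input (e.g. a compatible observed movement) wins the competition within the field.

```python
import numpy as np

def mexican_hat(n, sigma=3.0, excite=2.0, inhibit=0.5):
    """Circular lateral-interaction kernel: short-range excitation
    minus a constant global inhibition (a 'Mexican hat' profile)."""
    x = np.arange(n)
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, n - d)                  # wrap-around distance
    return excite * np.exp(-d**2 / (2 * sigma**2)) - inhibit

def simulate_field(stimulus, steps=300, dt=0.1, tau=1.0, h=-1.0):
    """Euler-integrate a 1-D Amari field:
        tau * du/dt = -u + h + S(x) + sum_j w(x, x_j) f(u_j)
    with a step-threshold output nonlinearity f."""
    n = len(stimulus)
    w = mexican_hat(n)
    u = np.full(n, h)                         # field starts at resting level h
    for _ in range(steps):
        f = (u > 0).astype(float)             # threshold nonlinearity
        u += (dt / tau) * (-u + h + stimulus + w @ f)
    return u

# Two competing inputs, e.g. a target cue and an observed movement:
n = 100
x = np.arange(n)
stimulus = 3.0 * np.exp(-(x - 70)**2 / 50.0) + 2.0 * np.exp(-(x - 30)**2 / 50.0)
u = simulate_field(stimulus)
peak = int(np.argmax(u))   # field settles on a peak at the stronger input
```

Because every active unit inhibits every other one while exciting only its neighbors, the field amplifies the stronger bump and suppresses activity elsewhere, which is the cooperative/competitive mechanism the abstract attributes to sensorimotor representations.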
Pages: 5619 / +
Number of pages: 2
Related papers
50 total
  • [21] A unified multimodal control framework for human-robot interaction
    Cherubini, Andrea
    Passama, Robin
    Fraisse, Philippe
    Crosnier, Andre
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2015, 70 : 106 - 115
  • [22] Multimodal QOL Estimation During Human-Robot Interaction
    Nakagawa, Satoshi
    Kuniyoshi, Yasuo
    2024 IEEE INTERNATIONAL CONFERENCE ON DIGITAL HEALTH, ICDH 2024, 2024, : 23 - 32
  • [23] A Multimodal Human-Robot Interaction Manager for Assistive Robots
    Abbasi, Bahareh
    Monaikul, Natawut
    Rysbek, Zhanibek
    Di Eugenio, Barbara
    Zefran, Milos
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 6756 - 6762
  • [24] DiGeTac Unit for Multimodal Communication in Human-Robot Interaction
    Al, Gorkem Anil
    Martinez-Hernandez, Uriel
    IEEE SENSORS LETTERS, 2024, 8 (05)
  • [25] Probabilistic Multimodal Modeling for Human-Robot Interaction Tasks
    Campbell, Joseph
    Stepputtis, Simon
    Amor, Heni Ben
    ROBOTICS: SCIENCE AND SYSTEMS XV, 2019,
  • [26] Multimodal Target Prediction for Rapid Human-Robot Interaction
    Mitra, Mukund
    Patil, Ameya
    Mothish, G. V. S.
    Kumar, Gyanig
    Mukhopadhyay, Abhishek
    Murthy, L. R. D.
    Chakrabarti, Partha Pratim
    Biswas, Pradipta
    COMPANION PROCEEDINGS OF 2024 29TH ANNUAL CONFERENCE ON INTELLIGENT USER INTERFACES, IUI 2024 COMPANION, 2024, : 18 - 23
  • [27] A dialogue manager for multimodal human-robot interaction and learning of a humanoid robot
    Holzapfel, Hartwig
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2008, 35 (06): : 528 - 535
  • [28] Hierarchical Psychologically Inspired Planning for Human-Robot Interaction Tasks
    Kiselev, Gleb
    Panov, Aleksandr
    INTERACTIVE COLLABORATIVE ROBOTICS (ICR 2019), 2019, 11659 : 150 - 160
  • [29] Vocal Human-Robot Interaction Inspired by Battle Management Language
    Ciesielski, Agata
    Yeh, Bryanna
    Gordge, Kelles
    Basescu, Max
    Tunstel, Edward
    2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017, : 3379 - 3384
  • [30] Employing User-Generated Content to Enhance Human-Robot Interaction in a Human-Robot Trust Game
    Liang, Yuhua
    Lee, Seungcheol Austin
    ELEVENTH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN ROBOT INTERACTION (HRI'16), 2016, : 469 - 470