Embodied Affect for Real-World Human-Robot Interaction

Cited by: 2
Author
Canamero, Lola [1]
Affiliation
[1] Univ Hertfordshire, Dept Comp Sci, Embodied Emot Cognit & Inter Act Lab, Hatfield, Herts, England
Keywords
Emotion modeling; Embodied AI; Autonomous social robots; Developmental robotics; HRI;
DOI
10.1145/3319502.3374843
Chinese Library Classification (CLC)
TP3 [Computing Technology; Computer Technology]
Subject Classification Code
0812
Abstract
The potential of robots to support humans in many aspects of our daily lives is increasingly acknowledged. Despite clear progress in social robotics and human-robot interaction, realizing this potential still faces numerous scientific and technical challenges, many of them linked to the difficulty of dealing with the complexity of the real world. Achieving real-world human-robot interaction requires, on the one hand, identifying and addressing real-world (e.g., stakeholders') needs and application areas and, on the other hand, making our robots operational in the real world. In this talk, I will discuss some of the contributions that Embodied Artificial Intelligence can make towards this goal, illustrating my arguments with examples of my own and my group's research on HRI using embodied autonomous affective robots in areas such as developmental robotics, healthcare, and computational psychiatry. Embodied AI, so far little explored in HRI, started as an alternative to "symbolic AI" (a "paradigm change") in how the notion of "intelligence" and the interactions of embodied agents with the real world are conceived and modeled. Its emphasis on autonomy, adaptation, interaction with dynamic environments, sensorimotor loops and coordination, learning from interaction, and, more generally, on using and exploiting the real world as, in Rodney Brooks' words, "its own best model", makes it highly relevant to achieving real-world HRI.
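To make the notions of autonomy, sensorimotor loops, and adaptation referenced above more concrete, the following minimal Python sketch illustrates one common way an embodied affective loop can be organized: simulated internal variables decay over time, their deficits act as drives, and the most urgent drive biases behaviour selection on each cycle. All names (energy, seek_food, etc.) are hypothetical and purely illustrative; this is a sketch of the general idea, not the implementation used in the author's work.

    import random

    SETPOINT = 1.0  # assumed ideal value for every internal variable

    # Hypothetical internal variables and the behaviours that restore them.
    internal = {"energy": 0.9, "warmth": 0.8, "social_contact": 0.7}
    behaviours = {"energy": "seek_food",
                  "warmth": "seek_heat_source",
                  "social_contact": "approach_human"}

    def drives(state):
        """Drive intensity = deficit between the set point and the current value."""
        return {var: max(0.0, SETPOINT - value) for var, value in state.items()}

    def select_behaviour(state):
        """Winner-take-all: act on the most urgent deficit."""
        d = drives(state)
        most_urgent = max(d, key=d.get)
        return behaviours[most_urgent], most_urgent

    def step(state):
        """One pass of the sensorimotor loop: decay, select, act."""
        for var in state:                      # internal variables decay over time
            state[var] -= random.uniform(0.0, 0.1)
        behaviour, target = select_behaviour(state)
        state[target] = min(SETPOINT, state[target] + 0.3)  # acting reduces the deficit
        return behaviour

    if __name__ == "__main__":
        for t in range(5):
            print(t, step(internal), {k: round(v, 2) for k, v in internal.items()})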
Pages: 459 - 459
Number of pages: 1
Related Papers
50 records in total
  • [31] A Framework for Affect-Based Natural Human-Robot Interaction
    Villani, Valeria
    Sabattini, Lorenzo
    Secchi, Cristian
    Fantuzzi, Cesare
    2018 27th IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2018), 2018: 1038-1044
  • [32] Human-Robot Interaction
    Jia, Yunyi
    Zhang, Biao
    Li, Miao
    King, Brady
    Meghdari, Ali
    Journal of Robotics, 2018, 2018
  • [33] Human-Robot Interaction
    Sidobre, Daniel
    Broquere, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    Advanced Bimanual Manipulation: Results from the DEXMART Project, 2012, 80: 123+
  • [34] Human-Robot Interaction
    Sethumadhavan, Arathi
    Ergonomics in Design, 2012, 20(3): 27+
  • [35] Human-robot interaction
    Murphy, R. R.
    Nomura, T.
    Billard, A.
    Burke, J. L.
    IEEE Robotics and Automation Magazine, 2010, 17(2): 85-89
  • [36] Human-robot interaction
    Kosuge, K.
    Hirata, Y.
    IEEE ROBIO 2004: Proceedings of the IEEE International Conference on Robotics and Biomimetics, 2004: 8-11
  • [37] Human-robot interaction
    Sidobre, Daniel
    Broquère, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    Springer Tracts in Advanced Robotics (STAR), 2012, 80: 123-172
  • [38] Using Embodied Multimodal Fusion to Perform Supportive and Instructive Robot Roles in Human-Robot Interaction
    Giuliani, Manuel
    Knoll, Alois
    International Journal of Social Robotics, 2013, 5: 345-356
  • [39] Using Embodied Multimodal Fusion to Perform Supportive and Instructive Robot Roles in Human-Robot Interaction
    Giuliani, Manuel
    Knoll, Alois
    International Journal of Social Robotics, 2013, 5(3): 345-356
  • [40] Human-Robot Interaction
    Ivaldi, Serena
    Pateraki, Maria
    ERCIM News, 2018, (114): 6-7