Body Language in Affective Human-Robot Interaction

Cited by: 12
Authors
Stoeva, Darja [1 ]
Gelautz, Margrit [1 ]
Affiliations
[1] TU Wien, Institute of Visual Computing & Human-Centered Technology, Vienna, Austria
Source
HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020
Keywords
human-robot interaction; body language; robotics; affective computing; social signals; EMOTION; RECOGNITION; PERCEPTION;
DOI
10.1145/3371382.3377432
Chinese Library Classification
TP3 [Computing technology; computer technology]
Discipline classification code
0812
Abstract
Social human-robot interaction is concerned with exploring the ways in which social interaction can be achieved between a human and a sociable robot. Affect plays an important role in interaction, as it helps interactants coordinate and signals the success of the communication. Designing socially intelligent robots requires communicative competence, which includes the exchange of both verbal and non-verbal cues. This project will focus on non-verbal communication, more specifically on body movements, postures and gestures as a means of conveying socially affective information. Using the affective grounding perspective, which conceptualizes emotion as a coordination mechanism, together with honest signals as a measure of the dynamics of the interaction, and the robot Pepper, we aim to develop a system able to communicate affect, with the goal of enhancing affective human-robot interaction.
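To make the intended pipeline concrete, the following is a minimal sketch of how a coarse affect label could be mapped to Pepper body language through the NAOqi Python SDK (qi). The affect labels, the recognize_affect() stub, the robot IP address, and the specific animation paths are illustrative assumptions; the paper itself does not prescribe an implementation.

```python
# Minimal sketch: mapping a recognized affective state to a Pepper body-language
# response via the NAOqi Python SDK. Affect labels, animation paths, and the
# recognize_affect() stub are illustrative placeholders, not taken from the paper.
import qi

# Hypothetical mapping from coarse affect labels to Pepper animation-library paths.
AFFECT_TO_ANIMATION = {
    "positive": "animations/Stand/Emotions/Positive/Happy_1",
    "negative": "animations/Stand/Emotions/Negative/Sad_1",
    "neutral":  "animations/Stand/Gestures/Hey_1",
}


def recognize_affect(frame):
    """Placeholder for a body-language-based affect recognizer (camera/skeleton input)."""
    return "neutral"


def main(robot_ip="192.168.1.10", port=9559):
    session = qi.Session()
    session.connect("tcp://{}:{}".format(robot_ip, port))

    posture = session.service("ALRobotPosture")
    animation = session.service("ALAnimationPlayer")

    posture.goToPosture("Stand", 0.6)            # start from a neutral stance
    state = recognize_affect(frame=None)         # would take live sensor input
    animation.run(AFFECT_TO_ANIMATION[state])    # express the matching body language


if __name__ == "__main__":
    main()
```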
Pages: 606-608 (3 pages)
Related papers (50 in total)
  • [41] Generalizing and Formalizing Precisiation Language to Facilitate Human-Robot Interaction
    Nakama, Takehiko
    Munoz, Enrique
    LeBlanc, Kevin
    Ruspini, Enrique
    COMPUTATIONAL INTELLIGENCE, IJCCI 2013, 2016, 613 : 381 - 397
  • [42] Computer simulation of human-robot interaction through natural language
    Khummongkol, Rojanee
    Yokota, Masao
    ARTIFICIAL LIFE AND ROBOTICS, 2016, 21 (04) : 510 - 519
  • [43] Second Workshop on Natural Language Generation for Human-Robot Interaction
    Buschmeier, Hendrik
    Foster, Mary Ellen
    Gkatzia, Dimitra
    HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020, : 646 - 647
  • [44] Structured learning for spoken language understanding in human-robot interaction
    Bastianelli, Emanuele
    Castellucci, Giuseppe
    Croce, Danilo
    Basili, Roberto
    Nardi, Daniele
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2017, 36 (5-7): : 660 - 683
  • [45] Toward Ethical Natural Language Generation for Human-Robot Interaction
    Williams, Tom
    COMPANION OF THE 2018 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'18), 2018, : 281 - 282
  • [46] Multi-modal Language Models for Human-Robot Interaction
    Janssens, Ruben
    COMPANION OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024 COMPANION, 2024, : 109 - 111
  • [47] Human-robot interaction through joint robot planning with large language models
    Asuzu, Kosi
    Singh, Harjinder
    Idrissi, Moad
    INTELLIGENT SERVICE ROBOTICS, 2025, : 261 - 277
  • [48] Human Body Pose Interpretation and Classification for Social Human-Robot Interaction
    McColl, Derek
    Zhang, Zhe
    Nejat, Goldie
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2011, 3 (03) : 313 - 332
  • [50] Mood contagion of robot body language in human robot interaction
    Xu, Junchao
    Broekens, Joost
    Hindriks, Koen
    Neerincx, Mark A.
    AUTONOMOUS AGENTS AND MULTI-AGENT SYSTEMS, 2015, 29 : 1216 - 1248