Multimodal Adapted Robot Behavior Synthesis within a Narrative Human-Robot Interaction

Cited by: 0
Authors
Aly, Amir [1 ]
Tapus, Adriana [1 ]
Affiliations
[1] ENSTA ParisTech, Robot & Comp Vis Lab, 828 Blvd Marechaux, F-91120 Palaiseau, France
Keywords
SPEECH; UTTERANCES; EXPRESSION; MODELS
DOI
Not available
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In human-human interaction, the three modalities of communication (i.e., verbal, nonverbal, and paraverbal) are naturally coordinated to enhance the meaning of the conveyed message. In this paper, we aim to create a similar coordination between these communication modalities so that the robot behaves as naturally as possible. The proposed system uses a set of videos to elicit specific target emotions in a human user, after which an interactive narrative begins (i.e., an interactive discussion between the participant and the robot about the content of each video). During each interaction experiment, the expressive humanoid robot ALICE engages the participant and generates a multimodal behavior adapted to the emotional content of the projected video, using speech, head-arm metaphoric gestures, and/or facial expressions. The interactive speech of the robot is synthesized with Mary-TTS (a text-to-speech toolkit), which is also used, in parallel, to generate adapted head-arm gestures [1]. The synthesized multimodal robot behavior is evaluated by the interacting human at the end of each emotion-eliciting experiment. The obtained results confirm the positive effect of the multimodality of the generated robot behavior on the interaction.
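For readers who want to try the speech side of such a pipeline, the sketch below shows how text can be sent to a MaryTTS server for synthesis. This is a minimal illustration assuming a locally running MaryTTS 5.x instance with its standard /process HTTP endpoint and a stock English voice; it is not the authors' implementation, and the gesture generation that the paper runs in parallel is only indicated by a comment.

# Minimal sketch: request synthesized speech from a locally running MaryTTS server.
# Assumptions: MaryTTS 5.x HTTP interface on the default port 59125; the voice name
# and output file are illustrative placeholders, not the authors' configuration.
import requests

MARY_URL = "http://localhost:59125/process"

def synthesize(text, voice="cmu-slt-hsmm", locale="en_US"):
    """Return WAV audio bytes for `text` using the MaryTTS /process endpoint."""
    params = {
        "INPUT_TEXT": text,
        "INPUT_TYPE": "TEXT",
        "OUTPUT_TYPE": "AUDIO",
        "AUDIO": "WAVE_FILE",
        "LOCALE": locale,
        "VOICE": voice,
    }
    response = requests.get(MARY_URL, params=params)
    response.raise_for_status()
    return response.content

if __name__ == "__main__":
    wav = synthesize("That scene was really moving, wasn't it?")
    with open("utterance.wav", "wb") as f:
        # Play this file on the robot while a gesture controller runs in parallel.
        f.write(wav)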
Pages: 2986-2993
Number of pages: 8