Multimodal Approach to Affective Human-Robot Interaction Design with Children

Cited by: 16
Authors
Okita, Sandra Y. [1 ]
Ng-Thow-Hing, Victor [2 ]
Sarvadevabhatla, Ravi K. [2 ]
Affiliations
[1] Columbia Univ, Teachers Coll, Dept Math Sci & Technol, New York, NY 10027 USA
[2] Honda Res Inst Inc, Mountain View, CA USA
Keywords
Human-robot interaction; human-robot communication; affective interaction; young children; social robots; collaboration
DOI
10.1145/2030365.2030370
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Two studies examined different features of humanoid robots and their influence on children's affective behavior. The first study looked at interaction styles and general features of robots. The second study looked at how the robot's attention influences children's behavior and engagement. Using activities familiar to young children (e.g., table setting, storytelling), the first study found that a cooperative interaction style elicited more oculesic behavior and social engagement. The second study found that quality of attention, type of attention, and length of interaction influence affective behavior and engagement. For quality of attention, Wizard-of-Oz (WoZ) control elicited the most affective behavior, but automatic attention worked as well as WoZ when the interaction was short. For type of attention, moving from nonverbal to verbal attention increased children's oculesic behavior, utterances, and physiological responses. Affective interactions did not seem to depend on a single mechanism, but rather on a well-chosen confluence of technical features.
Pages: 29