Closing the Feedback Loop: The Relationship Between Input and Output Modalities in Human-Robot Interactions

Cited: 0
Authors
Markovich, Tamara [1 ]
Honig, Shanee [1 ]
Oron-Gilad, Tal [1 ]
Affiliations
[1] Ben Gurion Univ Negev, Beer Sheva, Israel
Keywords
Human-robot interaction; Feedback loop; Navigation task; Feedback by motion cues; Stimulus-response compatibility
DOI
10.1007/978-3-030-42026-0_3
Chinese Library Classification: TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes: 0808; 0809
Abstract
Previous studies have suggested that the communication modalities used for human control and robot feedback influence human-robot interactions. However, these studies generally focused on one side of the communication, overlooking the relationship between control and feedback modalities. We aim to understand whether the relationship between a user's control modality and a robot's feedback modality influences the quality of the interaction and, if so, to find the most compatible pairings. In a laboratory Wizard-of-Oz experiment, participants were asked to guide a robot through a maze using either hand gestures or vocal commands. The robot provided vocal or motion feedback to the users across the experimental conditions, forming different combinations of control-feedback modalities. We found that the combinations of control-feedback modalities affected the quality of the human-robot interaction (subjective experience and efficiency) in different ways. Participants showed less worry and were slower when they controlled the robot by voice and received vocal feedback, compared to gestural control with vocal feedback. In addition, they felt more distress and were faster when they controlled the robot by gestures and received motion feedback, compared to vocal control with motion feedback. We also found that providing feedback improves the quality of human-robot interaction. In this paper we detail the procedure and results of this experiment.
Pages: 29-42 (14 pages)
Related Papers (50 records total)
  • [32] Interactions and motions in human-robot coordination
    Luh, JYS
    Hu, SY
    ICRA '99: IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, PROCEEDINGS, 1999: 3171-3176
  • [33] Cognitive Telepresence in Human-Robot Interactions
    Harutyunyan, Vahagn
    Manohar, Vimitha
    Gezehei, Issak
    Crandall, Jacob W.
    JOURNAL OF HUMAN-ROBOT INTERACTION, 2012, 1 (02): 158-182
  • [34] Errors in Human-Robot Interactions and Their Effects on Robot Learning
    Kim, Su Kyoung
    Kirchner, Elsa Andrea
    Schlossmueller, Lukas
    Kirchner, Frank
    FRONTIERS IN ROBOTICS AND AI, 2020, 7
  • [35] Design of an Entertainment Robot with Multimodal Human-Robot Interactions
    Jean, Jong-Hann
    Chen, Kuan-Ting
    Shih, Kuang-Yao
    Lin, Hsiu-Li
    2008 PROCEEDINGS OF SICE ANNUAL CONFERENCE, VOLS 1-7, 2008: 1378-1382
  • [36] The Effect of Anthropomorphization and Gender of a Robot on Human-Robot Interactions
    Ye, Hongjun
    Jeong, Haeyoung
    Zhong, Wenting
    Bhatt, Siddharth
    Izzetoglu, Kurtulus
    Ayaz, Hasan
    Suri, Rajneesh
    ADVANCES IN NEUROERGONOMICS AND COGNITIVE ENGINEERING, 2020, 953: 357-362
  • [37] HUMAN-ROBOT: FROM INTERACTION TO RELATIONSHIP
    Grandgeorge, Marine
    Duhaut, Dominique
    FIELD ROBOTICS, 2012: 339-346
  • [38] Update of Human-Robot Relationship Based on Ethologically Inspired Human-Robot Communication History
    Kanai, Honoka
    Niitsuma, Mihoko
    2016 25TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2016: 324-330
  • [39] Fully Automatic Analysis of Engagement and Its Relationship to Personality in Human-Robot Interactions
    Salam, Hanan
    Celiktutan, Oya
    Hupont, Isabelle
    Gunes, Hatice
    Chetouani, Mohamed
    IEEE ACCESS, 2017, 5: 705-721
  • [40] Comparing alternative modalities in the context of multimodal human-robot interaction
    Saren, Suprakas
    Mukhopadhyay, Abhishek
    Ghose, Debasish
    Biswas, Pradipta
    JOURNAL ON MULTIMODAL USER INTERFACES, 2024, 18 (01): 69-85