A Literature Survey of How to Convey Transparency in Co-Located Human-Robot Interaction

Cited by: 9
Authors
Schoett, Svenja Y. [1 ]
Amin, Rifat Mehreen [1 ]
Butz, Andreas [1 ]
Affiliations
[1] Ludwig Maximilians Univ Munchen, Frauenlobstr 7A, D-80337 Munich, Germany
Keywords
transparency; HRI; modality; understandability; trust
DOI
10.3390/mti7030025
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In human-robot interaction, transparency is essential to ensure that humans understand and trust robots. Understanding is vital from an ethical perspective and benefits interaction, e.g., through appropriate trust. While there is research on explanations and their content, the methods used to convey the explanations are underexplored. It remains unclear which approaches are used to foster understanding. To this end, we contribute a systematic literature review exploring how robot transparency is fostered in papers published in the ACM Digital Library and IEEE Xplore. We found that researchers predominantly rely on monomodal visual or verbal explanations to foster understanding. Commonly, these explanations are external, as opposed to being integrated in the robot design. This paper provides an overview of how transparency is communicated in human-robot interaction research and derives a classification with concrete recommendations for communicating transparency. Our results establish a solid base for consistent, transparent human-robot interaction designs.
Pages: 20
Related Papers
50 items in total
  • [21] Autonomy in Physical Human-Robot Interaction: A Brief Survey
    Selvaggio, Mario
    Cognetti, Marco
    Nikolaidis, Stefanos
    Ivaldi, Serena
    Siciliano, Bruno
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (04) : 7989 - 7996
  • [22] Human-Robot Interaction and Sexbots: A Systematic Literature Review
    Gonzalez-Gonzalez, Carina Soledad
    Gil-Iranzo, Rosa Maria
    Paderewski-Rodriguez, Patricia
    SENSORS, 2021, 21 (01) : 1 - 18
  • [23] Improving Transparency in Physical Human-Robot Interaction Using an Impedance Compensator
    Lee, Kyeong Ha
    Baek, Seung Guk
    Lee, Hyuk Jin
    Choi, Hyouk Ryeol
    Moon, Hyungpil
    Koo, Ja Choon
    2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2017, : 3591 - 3596
  • [24] How to Improve Human-Robot Interaction with Conversational Fillers
    Wigdor, Noel
    de Greeff, Joachim
    Looije, Rosemarijn
    Neerincx, Mark A.
    2016 25TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2016, : 219 - 224
  • [25] How social distance shapes human-robot interaction
    Kim, Yunkyung
    Mutlu, Bilge
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, 2014, 72 (12) : 783 - 795
  • [26] Human-Robot Interaction
    Jia, Yunyi
    Zhang, Biao
    Li, Miao
    King, Brady
    Meghdari, Ali
    JOURNAL OF ROBOTICS, 2018, 2018
  • [27] Screen Feedback in Human-Robot Interaction: How to Enhance Robot Expressiveness
    Mirnig, Nicole
    Tan, Yeow Kee
    Chang, Tai Wen
    Chua, Yuan Wei
    Tran Anh Dung
    Li, Haizhou
    Tscheligi, Manfred
    2014 23RD IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (IEEE RO-MAN), 2014, : 224 - 230
  • [28] Human-Robot Interaction
    Sidobre, Daniel
    Broquere, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    ADVANCED BIMANUAL MANIPULATION: RESULTS FROM THE DEXMART PROJECT, 2012, 80 : 123 - +
  • [29] Human-Robot Interaction
    Sethumadhavan, Arathi
    ERGONOMICS IN DESIGN, 2012, 20 (03) : 27 - +
  • [30] Human-robot interaction
    Murphy, R. R.
    Nomura, T.
    Billard, A.
    Burke, J. L.
    IEEE ROBOTICS AND AUTOMATION MAGAZINE, 2010, 17 (02) : 85 - 89