Human Preferences for Robot Eye Gaze in Human-to-Robot Handovers

Cited by: 0
Authors
Tair Faibish
Alap Kshirsagar
Guy Hoffman
Yael Edan
Affiliations
[1] Ben-Gurion University of the Negev, Department of Industrial Engineering and Management and the ABC Robotics Initiative
[2] Cornell University, Sibley School of Mechanical and Aerospace Engineering
Keywords
Human-robot handovers; Human-robot interaction; Robot eye gaze; Human-human handovers; Non-verbal communication
DOI
Not available
Abstract
This paper investigates human preferences for a robot’s eye gaze behavior during human-to-robot handovers. We studied gaze patterns for all three phases of the handover process: reach, transfer, and retreat, whereas previous work focused only on the reach phase. Additionally, we investigated whether the object’s size or fragility, or the human’s posture, affects the human’s preferences for the robot’s gaze. A public dataset of human-human handovers was analyzed to obtain the most frequent gaze behaviors that human receivers perform. These were then used to program the robot’s receiver gaze behaviors. In two sets of user studies (video and in-person), a collaborative robot exhibited these gaze behaviors while receiving an object from a human. In the video studies, 72 participants watched and compared videos of handovers between a human actor and a robot demonstrating each of the three gaze behaviors. In the in-person studies, a different set of 72 participants physically performed object handovers with the robot and evaluated their perception of the handovers for the robot’s different gaze behaviors. Results showed that, for both observers and participants in a handover, when the robot exhibited Face-Hand-Face gaze (gazing at the giver’s face and then at the giver’s hand during the reach phase, and back at the giver’s face during the retreat phase), participants considered the handover to be more likable, anthropomorphic, and communicative of timing (p < 0.0001). However, we did not find evidence of any effect of the object’s size or fragility or the giver’s posture on the gaze preference.
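Although the paper reports a user study rather than an algorithm, the Face-Hand-Face behavior described in the abstract maps naturally onto a small phase-driven gaze controller. The following Python sketch is not from the paper; the names HandoverPhase, GazeTarget, and the 0.5 switch point within the reach phase are illustrative assumptions only, meant to show one way such a gaze schedule could be encoded.

# Minimal sketch (assumed, not the authors' implementation) of a Face-Hand-Face
# receiver gaze schedule keyed to the three handover phases named in the abstract.
from enum import Enum, auto


class HandoverPhase(Enum):
    REACH = auto()     # giver extends the object toward the robot
    TRANSFER = auto()  # object changes hands
    RETREAT = auto()   # giver withdraws after releasing the object


class GazeTarget(Enum):
    GIVER_FACE = auto()
    GIVER_HAND = auto()


def face_hand_face_gaze(phase: HandoverPhase, reach_progress: float) -> GazeTarget:
    """Return the gaze target for the Face-Hand-Face behavior.

    During REACH the robot first looks at the giver's face, then shifts to the
    giver's hand as the reach progresses; it stays on the hand through TRANSFER
    and returns to the face during RETREAT. The 0.5 switch point is an assumed
    value for illustration only.
    """
    if phase is HandoverPhase.REACH:
        return GazeTarget.GIVER_FACE if reach_progress < 0.5 else GazeTarget.GIVER_HAND
    if phase is HandoverPhase.TRANSFER:
        return GazeTarget.GIVER_HAND
    return GazeTarget.GIVER_FACE  # RETREAT


if __name__ == "__main__":
    # Walk through one handover and print the commanded gaze target per step.
    schedule = [
        (HandoverPhase.REACH, 0.2),
        (HandoverPhase.REACH, 0.8),
        (HandoverPhase.TRANSFER, 1.0),
        (HandoverPhase.RETREAT, 1.0),
    ]
    for phase, progress in schedule:
        print(phase.name, "->", face_hand_face_gaze(phase, progress).name)

In a real system the switch from face to hand would more likely be driven by tracked hand position or an estimated time to transfer rather than a fixed progress threshold; the fixed threshold here only keeps the sketch self-contained.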
Pages: 995-1012
Page count: 17
Related Papers
50 records in total
  • [1] Human Preferences for Robot Eye Gaze in Human-to-Robot Handovers
    Faibish, Tair
    Kshirsagar, Alap
    Hoffman, Guy
    Edan, Yael
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2022, 14(4): 995-1012
  • [2] Robot Gaze Behaviors in Human-to-Robot Handovers
    Kshirsagar, Alap
    Lim, Melanie
    Christian, Shemar
    Hoffman, Guy
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5(4): 6552-6558
  • [3] Human Grasp Classification for Reactive Human-to-Robot Handovers
    Yang, Wei
    Paxton, Chris
    Cakmak, Maya
    Fox, Dieter
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020: 11123-11130
  • [4] Reactive Human-to-Robot Handovers of Arbitrary Objects
    Yang, Wei
    Paxton, Chris
    Mousavian, Arsalan
    Chao, Yu-Wei
    Cakmak, Maya
    Fox, Dieter
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021: 3118-3124
  • [5] Learning Human-to-Robot Dexterous Handovers for Anthropomorphic Hand
    Duan, Haonan
    Wang, Peng
    Li, Yiming
    Li, Daheng
    Wei, Wei
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2023, 15(3): 1224-1238
  • [6] Towards safe human-to-robot handovers of unknown containers
    Pang, Yik Lung
    Xompero, Alessio
    Oh, Changjae
    Cavallaro, Andrea
    2021 30TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2021: 51-58
  • [7] Expressing and Inferring Action Carefulness in Human-to-Robot Handovers
    Lastrico, Linda
    Duarte, Nuno Ferreira
    Carfi, Alessandro
    Rea, Francesco
    Sciutti, Alessandra
    Mastrogiovanni, Fulvio
    Santos-Victor, Jose
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023: 9824-9831
  • [8] Learning Human-to-Robot Handovers from Point Clouds
    Christen, Sammy
    Yang, Wei
    Perez-D'Arpino, Claudia
    Hilliges, Otmar
    Fox, Dieter
    Chao, Yu-Wei
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023: 9654-9664
  • [9] Reactive Human-to-Robot Dexterous Handovers for Anthropomorphic Hand
    Duan, Haonan
    Wang, Peng
    Yang, Yifan
    Li, Daheng
    Wei, Wei
    Luo, Yongkang
    Deng, Guoqiang
    IEEE TRANSACTIONS ON ROBOTICS, 2025, 41: 742-761
  • [10] Model Predictive Control for Fluid Human-to-Robot Handovers
    Yang, Wei
    Sundaralingam, Balakumar
    Paxton, Chris
    Akinola, Iretiayo
    Chao, Yu-Wei
    Cakmak, Maya
    Fox, Dieter
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022: 6956-6962