Cooperative gestures for industry: Exploring the efficacy of robot hand configurations in expression of instructional gestures for human-robot interaction

Cited by: 27
Authors
Sheikholeslami, Sara [1 ]
Moon, AJung [1 ]
Croft, Elizabeth A. [1 ]
Affiliations
[1] Univ British Columbia, Dept Mech Engn, 6250 Appl Sci Lane, Vancouver, BC V6T 1Z4, Canada
Source
Keywords
Human-robot communication; gesture; nonverbal; industrial assembly; recognition; communication
DOI
10.1177/0278364917709941
CLC classification
TP24 [Robotics]
Subject classification
080202; 1405
Abstract
Fast and reliable communication between human worker(s) and robotic assistants is essential for successful collaboration between the agents. This is especially true for typically noisy manufacturing environments that render verbal communication less effective. In this work, we investigate the efficacy of nonverbal communication capabilities of robotic manipulators that have poseable, three-fingered end-effectors (hands). We explore the extent to which different poses of a typical robotic gripper can effectively communicate instructional messages during human-robot collaboration. Within the context of a collaborative car door assembly task, we conducted a series of three studies. We first observed the type of hand configurations that humans use to nonverbally instruct another person (Study 1, N = 17); based on the observation, we examined how well human gestures with frequently used hand configurations are understood by recipients of the message (Study 2, N = 140). Finally, we implemented the most human-recognized human hand configurations on a seven-degree-of-freedom robotic manipulator to investigate the efficacy of having human-inspired hand poses on a robotic hand compared to an unposed hand (Study 3, N = 100). Contributions of this work include presentation of a set of hand configurations humans commonly use to instruct another person in a collaborative assembly scenario, as well as recognition rate and recognition confidence measures for the gestures that humans and robots express using different hand configurations. Results indicate that most gestures are better recognized with a higher level of confidence when displayed with a posed robot hand.
Pages: 699-720
Page count: 22
Related papers (50 records)
  • [1] Exploring the Effect of Robot Hand Configurations in Directional Gestures for Human-Robot Interaction
    Sheikholeslami, Sara
    Moon, AJung
    Croft, Elizabeth A.
    [J]. 2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 3594 - 3599
  • [2] Pantomimic Gestures for Human-Robot Interaction
    Burke, Michael
    Lasenby, Joan
    [J]. IEEE TRANSACTIONS ON ROBOTICS, 2015, 31 (05) : 1225 - 1237
  • [3] Conversational Gestures in Human-Robot Interaction
    Bremner, Paul
    Pipe, Anthony
    Melhuish, Chris
    Fraser, Mike
    Subramanian, Sriram
    [J]. 2009 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2009), VOLS 1-9, 2009, : 1645+
  • [4] Simultaneous Segmentation and Recognition of Hand Gestures for Human-Robot Interaction
    Vasquez Chavarria, Harold
    Jair Escalante, Hugo
    Enrique Sucar, L.
    [J]. 2013 16TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS (ICAR), 2013
  • [5] An Underwater Human-Robot Interaction Using Hand Gestures for Fuzzy Control
    Jiang, Yu
    Peng, Xianglong
    Xue, Mingzhu
    Wang, Chong
    Qi, Hong
    [J]. INTERNATIONAL JOURNAL OF FUZZY SYSTEMS, 2021, 23 (06) : 1879 - 1889
  • [6] Pointing Gestures for Human-Robot Interaction with the Humanoid Robot Digit
    Lorentz, Viktor
    Weiss, Manuel
    Hildebrand, Kristian
    Boblan, Ivo
    [J]. 2023 32ND IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, RO-MAN, 2023, : 1886 - 1892
  • [7] Integration of Gestures and Speech in Human-Robot Interaction
    Meena, Raveesh
    Jokinen, Kristiina
    Wilcock, Graham
    [J]. 3RD IEEE INTERNATIONAL CONFERENCE ON COGNITIVE INFOCOMMUNICATIONS (COGINFOCOM 2012), 2012, : 673 - 678
  • [8] Human-Robot Interaction Using Pointing Gestures
    Tolgyessy, Michal
    Dekan, Martin
    Hubinsky, Peter
    [J]. ISCSIC'18: PROCEEDINGS OF THE 2ND INTERNATIONAL SYMPOSIUM ON COMPUTER SCIENCE AND INTELLIGENT CONTROL, 2018
  • [9] Incremental learning of gestures for human-robot interaction
    Okada, Shogo
    Kobayashi, Yoichi
    Ishibashi, Satoshi
    Nishida, Toyoaki
    [J]. AI & SOCIETY, 2010, 25 (02) : 155 - 168
  • [10] Recognizing Touch Gestures for Social Human-Robot Interaction
    Altuglu, Tugce Balli
    Altun, Kerem
    [J]. ICMI'15: PROCEEDINGS OF THE 2015 ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2015, : 407 - 413