Gesture-based human-robot interaction for human assistance in manufacturing

Cited by: 88
Authors:
Neto, Pedro [1]
Simao, Miguel [1,2]
Mendes, Nuno [1]
Safeea, Mohammad [1,2]
Affiliations:
[1] Univ Coimbra, Dept Mech Engn, Coimbra, Portugal
[2] Arts & Metiers, Lille, France
Keywords:
Human-robot interaction; Collaborative robotics; Gesture recognition; Intuitive interfaces
DOI:
10.1007/s00170-018-2788-x
CLC classification:
TP [Automation technology; computer technology]
Discipline code:
0812
Abstract:
The paradigm for robot usage has changed in the last few years, from a scenario in which robots work in isolation to one where robots collaborate with human beings, exploiting and combining the best abilities of robots and humans. The development and acceptance of collaborative robots depend heavily on reliable and intuitive human-robot interaction (HRI) on the factory floor. This paper proposes a gesture-based HRI framework in which a robot assists a human co-worker by delivering tools and parts, and by holding objects for an assembly operation. Wearable sensors, inertial measurement units (IMUs), are used to capture the human upper-body gestures. Captured data are segmented into static and dynamic blocks using an unsupervised sliding-window approach. The static and dynamic data blocks feed an artificial neural network (ANN) for static, dynamic, and composed gesture classification. For the HRI interface, we propose a parameterization robotic task manager (PRTM), in which, guided by the system's speech and visual feedback, the co-worker selects/validates robot options using gestures. Experiments in an assembly operation demonstrated the efficiency of the proposed solution.
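The abstract's segmentation step (splitting a continuous IMU stream into static and dynamic blocks with an unsupervised sliding window) can be sketched roughly as below. This is an illustrative reconstruction, not the paper's implementation: the window size, the variance-based motion criterion, and the threshold value are all assumptions made here for the sketch.

```python
import numpy as np

def segment_motion(signal, window=11, threshold=0.05):
    """Label each sample of a 1-D IMU magnitude signal as static (0)
    or dynamic (1), based on the variance inside a sliding window
    centered on the sample. Unsupervised: no labeled training data."""
    n = len(signal)
    half = window // 2
    labels = np.zeros(n, dtype=int)
    for i in range(n):
        lo = max(0, i - half)
        hi = min(n, i + half + 1)
        # High local variance => the sensor is moving (dynamic block).
        if np.var(signal[lo:hi]) > threshold:
            labels[i] = 1
    return labels

# Toy stream: 50 still samples followed by 50 samples of motion noise.
rng = np.random.default_rng(0)
stream = np.concatenate([np.zeros(50), rng.normal(0.0, 1.0, 50)])
blocks = segment_motion(stream)
```

Runs of consecutive 0s and 1s in `blocks` then delimit the static and dynamic segments that would be passed on to the gesture classifier.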
Pages: 119-135
Page count: 17
Related Papers
50 records total
  • [21] A Deictic Gesture-Based Human-Robot Interface for In Situ Task Specification in Construction
    Yoon, Sungboo
    Park, Jinsik
    Park, Moonseo
    Ahn, Changbum R.
    [J]. COMPUTING IN CIVIL ENGINEERING 2023-DATA, SENSING, AND ANALYTICS, 2024, : 445 - 452
  • [22] Challenges in Annotating Gesture-Based Cognitive Status in Human-Robot Collaboration Datasets
    Daigler, Logan
    Higger, Mark
    Mott, Terran
    Williams, Tom
    [J]. COMPANION OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024 COMPANION, 2024, : 364 - 368
  • [23] Research on multimodal human-robot interaction based on speech and gesture
    Deng Yongda
    Li Fang
    Xin Huang
    [J]. COMPUTERS & ELECTRICAL ENGINEERING, 2018, 72 : 443 - 454
  • [24] Human-robot interaction - Facial gesture recognition
    Rudall, BH
    [J]. ROBOTICA, 1996, 14 : 596 - 597
  • [25] Space, Speech, and Gesture in Human-Robot Interaction
    Mead, Ross
    [J]. ICMI '12: PROCEEDINGS OF THE ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2012, : 333 - 336
  • [26] Estimation of gesture pointing for human-robot interaction
    Chen R.
    Fei M.
    Yang A.
    [J]. Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, 2023, 44 (03): : 200 - 208
  • [27] Gesture Mimicry in Social Human-Robot Interaction
    Stolzenwald, Janis
    Bremner, Paul
    [J]. 2017 26TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2017, : 430 - 436
  • [28] Gesture recognition based on context awareness for human-robot interaction
    Hong, Seok-Ju
    Setiawan, Nurul Arif
    Kim, Song-Gook
    Lee, Chil-Woo
    [J]. ADVANCES IN ARTIFICIAL REALITY AND TELE-EXISTENCE, PROCEEDINGS, 2006, 4282 : 1 - +
  • [29] Gesture recognition based on arm tracking for human-robot interaction
    Sigalas, Markos
    Baltzakis, Haris
    Trahanias, Panos
    [J]. IEEE/RSJ 2010 INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2010), 2010, : 5424 - 5429
  • [30] Gesture Learning Based on A Topological Approach for Human-Robot Interaction
    Obo, Takenori
    Takizawa, Kazuma
    [J]. 2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,