Gesture-based human-robot interaction for human assistance in manufacturing

Cited by: 0
Authors
Pedro Neto
Miguel Simão
Nuno Mendes
Mohammad Safeea
Affiliations
[1] University of Coimbra, Department of Mechanical Engineering
[2] Arts et Métiers
Keywords
Human-robot interaction; Collaborative robotics; Gesture recognition; Intuitive interfaces;
DOI
Not available
Abstract
The paradigm for robot usage has changed in recent years, from a scenario in which robots work in isolation to one in which robots collaborate with human beings, exploiting and combining the best abilities of robots and humans. The development and acceptance of collaborative robots depend heavily on reliable and intuitive human-robot interaction (HRI) on the factory floor. This paper proposes a gesture-based HRI framework in which a robot assists a human co-worker by delivering tools and parts and by holding objects during an assembly operation. Wearable sensors, namely inertial measurement units (IMUs), capture the human's upper-body gestures. The captured data are segmented into static and dynamic blocks using an unsupervised sliding-window approach. These static and dynamic blocks feed an artificial neural network (ANN) that classifies static, dynamic, and composed gestures. For the HRI interface, we propose a parameterization robotic task manager (PRTM) in which, guided by the system's speech and visual feedback, the co-worker selects and validates robot options using gestures. Experiments in an assembly operation demonstrated the efficiency of the proposed solution.
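The abstract mentions segmenting the IMU stream into static and dynamic blocks with an unsupervised sliding-window approach before feeding an ANN classifier. As a rough illustration only (the paper's actual segmentation criterion is not described here), the sketch below splits a multi-channel IMU stream into static/dynamic blocks by thresholding per-window motion variance; the function name, window size, and threshold are assumptions made for this example.

import numpy as np

def segment_imu_stream(samples, window=20, motion_thresh=0.05):
    """Illustrative sliding-window segmentation of an IMU stream into
    'static' and 'dynamic' blocks. This is a hypothetical threshold-based
    variant, not the paper's unsupervised method.

    samples: (N, D) array of IMU readings (e.g., accelerometer/gyro channels).
    Returns a list of (label, start_index, end_index) blocks.
    """
    labels = []
    for start in range(0, len(samples) - window + 1, window):
        win = samples[start:start + window]
        # Mean per-channel standard deviation as a simple motion score.
        motion = float(np.mean(np.std(win, axis=0)))
        labels.append("dynamic" if motion > motion_thresh else "static")

    # Merge consecutive windows that share a label into larger blocks.
    blocks, block_start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[block_start]:
            blocks.append((labels[block_start], block_start * window, i * window))
            block_start = i
    return blocks

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    still = rng.normal(0.0, 0.01, size=(100, 6))   # low-variance "static" segment
    moving = rng.normal(0.0, 0.5, size=(100, 6))   # high-variance "dynamic" gesture
    stream = np.vstack([still, moving, still])
    print(segment_imu_stream(stream))

In a full pipeline such as the one described in the abstract, blocks produced by a segmentation step like this would then be passed to the gesture classifier (an ANN in the paper) rather than printed.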
Pages: 119-135 (16 pages)
Related papers (50 in total)
  • [1] Neto, Pedro; Simao, Miguel; Mendes, Nuno; Safeea, Mohammad. Gesture-based human-robot interaction for human assistance in manufacturing. International Journal of Advanced Manufacturing Technology, 2019, 101(1-4): 119-135.
  • [2] Doisy, Guillaume; Jevtic, Aleksandar; Bodiroza, Sasa. Spatially Unconstrained, Gesture-Based Human-Robot Interaction. Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2013), 2013: 117+.
  • [3] Uimonen, Mikael; Kemppi, Paul; Hakanen, Taru. A Gesture-based Multimodal Interface for Human-Robot Interaction. 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 2023: 165-170.
  • [4] Yin, Xiaoming; Zhu, Xing. Hand posture recognition in gesture-based human-robot interaction. ICIEA 2006: 1st IEEE Conference on Industrial Electronics and Applications, Vols 1-3, Proceedings, 2006: 397-402.
  • [5] Yin, Xiaoming; Zhu, Xing. Hand posture recognition in gesture-based human-robot interaction. 2006 1st IEEE Conference on Industrial Electronics and Applications, Vols 1-3, 2006: 835+.
  • [6] Chang, Jen-Yen; Tejero-de-Pablos, Antonio; Harada, Tatsuya. Improved Optical Flow for Gesture-based Human-robot Interaction. 2019 International Conference on Robotics and Automation (ICRA), 2019: 7983-7989.
  • [7] Chiarella, Davide; Bibuli, Marco; Bruzzone, Gabriele; Caccia, Massimo; Ranieri, Andrea; Zereik, Enrica; Marconi, Lucia; Cutugno, Paola. A Novel Gesture-Based Language for Underwater Human-Robot Interaction. Journal of Marine Science and Engineering, 2018, 6(3).
  • [8] Hoffman, Guy; Weinberg, Gil. Gesture-based Human-Robot Jazz Improvisation. 2010 IEEE International Conference on Robotics and Automation (ICRA), 2010: 582-587.
  • [9] Xu, Yong; Guillemot, Matthieu; Nishida, Toyoaki. An experiment study of gesture-based human-robot interface. 2007 IEEE/ICME International Conference on Complex Medical Engineering, Vols 1-4, 2007: 457-463.
  • [10] Liang, Yinhao; Du, Guanglong; Li, Chunquan; Chen, Chuxin; Wang, Xueqian; Liu, Peter X. A Gesture-Based Natural Human-Robot Interaction Interface With Unrestricted Force Feedback. IEEE Transactions on Instrumentation and Measurement, 2022, 71.