Efficient Hand Gesture Recognition for Human-Robot Interaction

Cited by: 16
Authors
Peral, Marc [1 ]
Sanfeliu, Alberto [1 ]
Garrell, Anais [1 ]
Affiliations
[1] Inst Robot & Informat Ind CSIC UPC, Barcelona 08028, Spain
Funding
EU Horizon 2020;
Keywords
Deep learning; gesture recognition; human-robot interaction;
DOI
10.1109/LRA.2022.3193251
Chinese Library Classification (CLC)
TP24 [Robotics];
Discipline Code
080202 ; 1405 ;
Abstract
In this paper, we present an efficient and reliable deep-learning approach that allows users to communicate with robots via hand gesture recognition. Unlike other works that rely on external devices such as gloves [1] or joysticks [2] to tele-operate robots, the proposed approach uses only visual information to recognize the user's instructions, which are encoded in a set of pre-defined hand gestures. In particular, the method consists of two modules that work sequentially: the first extracts 2D hand landmarks (i.e., joint positions), and the second predicts the hand gesture from a temporal representation of those landmarks. The approach has been validated on a recent state-of-the-art dataset, where it outperformed other methods that require multiple pre-processing steps such as optical flow and semantic segmentation. Our method achieves an accuracy of 87.5% and runs at 10 frames per second. Finally, we conducted real-life experiments with our IVO robot to validate the framework during the interaction process.
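The two-stage pipeline described in the abstract (per-frame 2D landmark extraction followed by classification over a temporal window) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the names `HandLandmarkBuffer` and `classify_window` are hypothetical, the 21-joint layout follows the common 2D hand-skeleton convention (e.g., MediaPipe Hands), and the nearest-centroid classifier stands in for the paper's learned temporal model.

```python
# Hypothetical sketch of a two-stage gesture pipeline: per-frame 2D hand
# landmarks are buffered into a sliding temporal window, then classified.
from collections import deque

import numpy as np

NUM_JOINTS = 21  # common 2D hand-skeleton convention (wrist + 4 joints x 5 fingers)
WINDOW = 10      # number of frames per temporal representation


class HandLandmarkBuffer:
    """Accumulates per-frame (21, 2) landmark arrays into a sliding window."""

    def __init__(self, window=WINDOW):
        self.frames = deque(maxlen=window)

    def push(self, landmarks):
        lm = np.asarray(landmarks, dtype=np.float32)
        assert lm.shape == (NUM_JOINTS, 2)
        # Center on the wrist (joint 0) so hand translation is factored out.
        self.frames.append(lm - lm[0])

    def window_features(self):
        """Stack the window into a (WINDOW, 21, 2) temporal representation."""
        if len(self.frames) < self.frames.maxlen:
            return None  # not enough history yet
        return np.stack(self.frames)


def classify_window(features, centroids):
    """Toy stand-in for the temporal classifier: nearest centroid."""
    flat = features.reshape(-1)
    dists = [np.linalg.norm(flat - c.reshape(-1)) for c in centroids]
    return int(np.argmin(dists))
```

In practice the first stage would be a landmark detector running on camera frames, and the second a trained network over the stacked window; the buffering and windowing logic above shows how the two modules connect.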
Pages: 10272 - 10279
Page count: 8
Related Papers
50 records in total
  • [41] Rich and Robust Human-Robot Interaction on Gesture Recognition for Assembly Tasks
    Lim, Gi Hyun
    Pedrosa, Eurico
    Amaral, Filipe
    Lau, Nuno
    Pereira, Artur
    Dias, Paulo
    Azevedo, Jose Luis
    Cunha, Bernardo
    Reis, Luis Paulo
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC), 2017, : 159 - 164
  • [42] The Influence of Speed and Position in Dynamic Gesture Recognition for Human-Robot Interaction
    Carlos Castillo, Jose
    Alonso-Martin, Fernando
    Caceres-Dominguez, David
    Malfaz, Maria
    Salichs, Miguel A.
    [J]. JOURNAL OF SENSORS, 2019, 2019
  • [43] Coherence in One-Shot Gesture Recognition for Human-Robot Interaction
    Cabrera, Maria E.
    Voyles, Richard M.
    Wachs, Juan P.
    [J]. COMPANION OF THE 2018 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'18), 2018, : 75 - 76
  • [44] Human-Robot Interaction and Cooperation Through People Detection and Gesture Recognition
    Pereira, Flavio Garcia
    Vassallo, Raquel Frizera
    Teatini Salles, Evandro Ottoni
    [J]. JOURNAL OF CONTROL AUTOMATION AND ELECTRICAL SYSTEMS, 2013, 24 (03) : 187 - 198
  • [45] A Vision-based Gesture Recognition System for Human-Robot Interaction
    Zhang, Jianjie
    Zhao, Mingguo
    [J]. 2009 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO 2009), VOLS 1-4, 2009, : 2096 - 2101
  • [46] A Hand Gesture Recognition System based on GMM Method for Human-robot Interface
    Ho, Yihsin
    Nishitani, Takao
    Yamaguchi, Toru
    Sato-Shimokawara, Eri
    Tagawa, Norio
    [J]. 2013 SECOND INTERNATIONAL CONFERENCE ON ROBOT, VISION AND SIGNAL PROCESSING (RVSP), 2013, : 291 - 294
  • [47] Vision-Based Hand Gesture Recognition for Human-Robot Collaboration: A Survey
    Xia, Zanwu
    Lei, Qujiang
    Yang, Yang
    Zhang, Hongda
    He, Yue
    Wang, Weijun
    Huang, Minghui
    [J]. CONFERENCE PROCEEDINGS OF 2019 5TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND ROBOTICS (ICCAR), 2019, : 198 - 205
  • [48] Finger identification and hand posture recognition for human-robot interaction
    Yin, Xiaoming
    Xie, Ming
    [J]. IMAGE AND VISION COMPUTING, 2007, 25 (08) : 1291 - 1300
  • [49] Simultaneous Segmentation and Recognition of Hand Gestures for Human-Robot Interaction
    Vasquez Chavarria, Harold
    Jair Escalante, Hugo
    Enrique Sucar, L.
    [J]. 2013 16TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS (ICAR), 2013,
  • [50] Context-aware hand gesture interaction for human-robot collaboration in construction
    Wang, Xin
    Veeramani, Dharmaraj
    Dai, Fei
    Zhu, Zhenhua
    [J]. COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, 2024, 39 (22) : 3489 - 3504