Human Gesture Recognition with a Flow-based Model for Human Robot Interaction

Cited by: 1
Authors
Liu, Lanmiao [1 ]
Yu, Chuang [2 ]
Song, Siyang [3 ]
Su, Zhidong [4 ]
Tapus, Adriana [5 ]
Affiliations
[1] Tech Univ Darmstadt, Dept Comp Sci, Darmstadt, Hessen, Germany
[2] Univ Manchester, Cognit Robot Lab, Manchester, England
[3] Univ Cambridge, Affect Intelligence & Robot Lab, Cambridge, England
[4] Oklahoma State Univ, Lab Adv Sensing Computat & Control, Stillwater, OK USA
[5] Inst Polytech Paris, ASR Robot Lab, ENSTA Paris, Palaiseau, France
Keywords
Social Robot; Gesture recognition; Flow-based model;
DOI
10.1145/3568294.3580145
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Human skeleton-based gesture classification plays a dominant role in social robotics. Learning the variety of human skeleton-based gestures can help the robot interact continuously and appropriately in natural human-robot interaction (HRI). In this paper, we propose a Flow-based model to classify human gesture actions from skeletal data. Instead of inferring new human skeleton actions from noisy data with a retrained model, our end-to-end model can expand the set of gesture labels recognized from noisy data without retraining. Initially, our model detects five human gesture actions (i.e., come on, right up, left up, hug, and noise-random action). The accuracy of our online human gesture recognition system matches that of the offline one, and both attain 100% accuracy on the first four actions. Our proposed method is more efficient for inferring new human gesture actions without retraining, achieving about 90% accuracy on the noise-random action. The gesture recognition system has been applied to drive the robot's reaction to human gestures, which is promising for facilitating natural human-robot interaction.
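The abstract's key idea is that a flow-based model assigns an explicit likelihood to each skeleton pose, so unfamiliar inputs can be routed to a "noise" label without retraining. The sketch below illustrates that mechanism only; it is a hypothetical minimal stand-in (a one-layer affine flow per class, fit on toy features), not the paper's actual architecture, and all names (`AffineFlow`, `FlowGestureClassifier`, the threshold value) are illustrative assumptions.

```python
import numpy as np


class AffineFlow:
    """Toy one-layer affine flow z = (x - shift) / scale mapping class data
    to a standard-normal base distribution (a minimal stand-in for a
    deeper normalizing flow)."""

    def fit(self, X):
        self.shift = X.mean(axis=0)
        self.scale = X.std(axis=0) + 1e-6  # avoid division by zero
        return self

    def log_prob(self, x):
        # Change of variables: log p(x) = log N(z; 0, I) - sum(log scale)
        z = (x - self.shift) / self.scale
        return -0.5 * np.sum(z**2 + np.log(2 * np.pi)) - np.sum(np.log(self.scale))


class FlowGestureClassifier:
    """Fits one flow per gesture class; a pose is assigned the class with
    the highest log-likelihood, or 'noise' if every class scores below a
    threshold -- so new out-of-distribution labels need no retraining."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.flows = {}

    def fit(self, X, y):
        for label in np.unique(y):
            self.flows[label] = AffineFlow().fit(X[y == label])
        return self

    def predict(self, x):
        scores = {lbl: f.log_prob(x) for lbl, f in self.flows.items()}
        best = max(scores, key=scores.get)
        return "noise" if scores[best] < self.threshold else best


# Toy "skeleton features": two gesture classes in a 4-D feature space.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 0.1, size=(200, 4))
X1 = rng.normal(1.0, 0.1, size=(200, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

clf = FlowGestureClassifier(threshold=-20.0).fit(X, y)
print(clf.predict(np.full(4, 0.05)))  # in-distribution, near class 0
print(clf.predict(np.full(4, 5.0)))   # far from both classes -> "noise"
```

The likelihood threshold is what the abstract's roughly 90% noise-action accuracy hinges on: poses that no class flow explains well fall below it and are rejected rather than forced into a known label.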
Pages: 548-551
Page count: 4
Related Papers
50 items in total
  • [1] Human-robot interaction based on gesture and movement recognition
    Li, Xing
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2020, 81
  • [2] Dynamic Gesture Recognition for Human Robot Interaction
    Lee-Ferng, Jong
    Ruiz-del-Solar, Javier
    Verschae, Rodrigo
    Correa, Mauricio
    2009 6TH LATIN AMERICAN ROBOTICS SYMPOSIUM, 2009, : 57 - 64
  • [3] Gesture recognition based on arm tracking for human-robot interaction
    Sigalas, Markos
    Baltzakis, Haris
    Trahanias, Panos
    IEEE/RSJ 2010 INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2010), 2010, : 5424 - 5429
  • [4] Human robot interaction for manipulation tasks based on stroke gesture recognition
    Li, Jiajun
    Tao, Jianguo
    Ding, Liang
    Gao, Haibo
    Deng, Zongquan
    Luo, Yang
    Li, Zhandong
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2017, 44 (06): : 700 - 710
  • [5] Gesture recognition based on context awareness for human-robot interaction
    Hong, Seok-Ju
    Setiawan, Nurul Arif
    Kim, Song-Gook
    Lee, Chil-Woo
    ADVANCES IN ARTIFICIAL REALITY AND TELE-EXISTENCE, PROCEEDINGS, 2006, 4282 : 1 - +
  • [6] Human-robot interaction - Facial gesture recognition
    Rudall, BH
    ROBOTICA, 1996, 14 : 596 - 597
  • [7] Online Dynamic Gesture Recognition for Human Robot Interaction
    Xu, Dan
    Wu, Xinyu
    Chen, Yen-Lun
    Xu, Yangsheng
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2015, 77 (3-4) : 583 - 596
  • [8] Static Hand Gesture Recognition for Human Robot Interaction
    Uwineza, Josiane
    Ma, Hongbin
    Li, Baokui
    Jin, Ying
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2019, PT II, 2019, 11741 : 417 - 430
  • [10] Gesture spotting and recognition for human-robot interaction
    Yang, Hee-Deok
    Park, A-Yeon
    Lee, Seong-Whan
    IEEE TRANSACTIONS ON ROBOTICS, 2007, 23 (02) : 256 - 270