Pointing and Commanding Gesture Recognition in 3D for Human-Robot Interaction

Cited by: 0
Authors
Rahman, Abid [1]
Al Mahmud, Jubayer [1]
Hasanuzzaman, Md. [1]
Affiliations
[1] Univ Dhaka, Dept Comp Sci & Engn, 1 Lalbagh, Dhaka 1000, Bangladesh
Keywords
Pointing gesture recognition; 3D dynamic commanding gesture recognition; HMM; Kinect skeletal tracking; human-robot interaction; robot navigation
DOI
Not available
CLC Classification
TM (Electrical engineering); TN (Electronics and communication technology)
Discipline Codes
0808; 0809
Abstract
This paper proposes and develops a Kinect-based pointing and commanding gesture recognition system in 3D for human-robot interaction. The system comprises three subsystems: pointing gesture recognition, 3D dynamic gesture recognition, and robot navigation. Kinect skeletal tracking is used to track the hand (palm), shoulder, and elbow joints of a human in 3D coordinate space. The 3D coordinates of these joints are then used to detect pointing gestures and estimate the pointing direction. To detect dynamic hand gestures, the right palm joint is tracked and its 3D coordinate sequence is recorded. These coordinates are geometrically translated by a reference point and normalized to form the feature vector that is fed into a Hidden Markov Model (HMM) based classifier for training and classification. The HMMs are trained with the Baum-Welch algorithm. The system is trained and tested with 5-fold cross-validation on 500 instances of 5 predefined gestures performed by 10 volunteers, and achieves an overall accuracy of 94.4% in recognizing dynamic gestures. The paper also proposes and implements two separate algorithms for robot navigation using the recognized pointing and commanding gestures. A simple simulator with a graphical user interface is developed for testing the proposed interaction system, and it works successfully.
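The two geometric steps the abstract describes (estimating a pointing direction from tracked joints, and translating/normalizing a palm trajectory into an HMM feature vector) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the record does not specify which joint pair defines the pointing ray, which reference point is used, or how normalization is done, so the shoulder-to-hand vector and the max-absolute-value scaling below are assumptions.

```python
import numpy as np

def pointing_direction(shoulder, hand):
    """Estimate the pointing direction as the unit vector from the
    shoulder joint to the hand (palm) joint in 3D Kinect coordinates.
    (Assumption: the paper may use a different joint pair, e.g. elbow-hand.)"""
    v = np.asarray(hand, dtype=float) - np.asarray(shoulder, dtype=float)
    return v / np.linalg.norm(v)

def gesture_features(palm_sequence, reference):
    """Translate a recorded 3D palm-joint trajectory by a reference point
    and scale it into a fixed range, then flatten it into the feature
    vector fed to the HMM classifier.
    (Assumption: the paper's exact normalization scheme is not given.)"""
    pts = np.asarray(palm_sequence, dtype=float) - np.asarray(reference, dtype=float)
    scale = np.abs(pts).max()
    if scale > 0:          # avoid division by zero for a stationary hand
        pts = pts / scale
    return pts.reshape(-1)  # one flat vector per gesture instance
```

In practice, one such feature vector would be extracted per recorded gesture instance, and one HMM per gesture class would be trained with Baum-Welch; classification then picks the class whose HMM assigns the highest likelihood.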
Pages: 10
Related Papers
50 results (first 10 listed)
  • [1] Lai, Yuhui; Wang, Chen; Li, Yanan; Ge, Shuzhi Sam; Huang, Deqing. 3D Pointing Gesture Recognition for Human-Robot Interaction. Proceedings of the 28th Chinese Control and Decision Conference (2016 CCDC), 2016: 4959-4964.
  • [2] Al Mahmud, Jubayer; Das, Bandhan Chandra; Shin, Jungpil; Hasib, Khan Md; Sadik, Rifat; Mridha, M. F. 3D Gesture Recognition and Adaptation for Human-Robot Interaction. IEEE Access, 2022, 10: 116485-116513.
  • [3] Nickel, K.; Seemann, E.; Stiefelhagen, R. 3D-Tracking of Head and Hands for Pointing Gesture Recognition in a Human-Robot Interaction Scenario. Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Proceedings, 2004: 565-570.
  • [4] Chen, Renjun; Fei, Minrui; Yang, Aolei. Estimation of Gesture Pointing for Human-Robot Interaction. Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, 2023, 44(03): 200-208.
  • [5] Hu, Zhixian; Xu, Yingtian; Lin, Waner; Wang, Ziya; Sun, Zhenglong. Augmented Pointing Gesture Estimation for Human-Robot Interaction. 2022 IEEE International Conference on Robotics and Automation (ICRA 2022), 2022: 6416-6422.
  • [6] Nickel, K.; Stiefelhagen, R. Real-Time Person Tracking and Pointing Gesture Recognition for Human-Robot Interaction. Computer Vision in Human-Computer Interaction, Proceedings, 2004, 3058: 28-38.
  • [7] Rudall, B. H. Human-Robot Interaction - Facial Gesture Recognition. Robotica, 1996, 14: 596-597.
  • [8] Yang, Hee-Deok; Park, A-Yeon; Lee, Seong-Whan. Gesture Spotting and Recognition for Human-Robot Interaction. IEEE Transactions on Robotics, 2007, 23(02): 256-270.
  • [9] Gao, Qing; Chen, Yongquan; Ju, Zhaojie; Liang, Yi. Dynamic Hand Gesture Recognition Based on 3D Hand Pose Estimation for Human-Robot Interaction. IEEE Sensors Journal, 2022, 22(18): 17421-17430.
  • [10] Nickel, Kai; Stiefelhagen, Rainer. Visual Recognition of Pointing Gestures for Human-Robot Interaction. Image and Vision Computing, 2007, 25(12): 1875-1884.