Hand Gesture Control for Human-Computer Interaction with Deep Learning

Cited by: 8
Authors
Chua, S. N. David [1 ]
Chin, K. Y. Richard [1 ]
Lim, S. F. [1 ]
Jain, Pushpdant [2 ]
Affiliations
[1] Univ Malaysia Sarawak, Fac Engn, Kota Samarahan 94300, Malaysia
[2] VIT Bhopal Univ, Sch Mech Engn, Sehore 466114, India
Keywords
Hand gesture; Human-computer interaction; Deep learning; Object detection
DOI
10.1007/s42835-021-00972-6
CLC classification
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808 ; 0809 ;
Abstract
The use of gesture control has numerous advantages over physical hardware. However, it has yet to gain popularity, as most gesture control systems require extra sensors or depth cameras to detect or capture gesture movement before a meaningful signal can be triggered for the corresponding course of action. This research proposes a hand gesture control system that combines an object detection algorithm, YOLOv3, with handcrafted rules to achieve dynamic gesture control of a computer. The system uses a single RGB camera for hand gesture recognition and localization. The dataset of gestures used for training, and their corresponding commands, was custom designed by the authors owing to the lack of standard gestures specifically for human-computer interaction. Algorithms that integrate gesture commands with virtual mouse and keyboard input through the Pynput library in Python were developed to handle commands such as mouse control and media control. The YOLOv3 model achieved a mean average precision (mAP) of 96.68% on the test set. Rule-based algorithms for gesture interpretation were successfully implemented to transform static gesture recognition into dynamic gesture control.
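The rule-based interpretation step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact rules: the gesture class names ("palm", "fist"), the motion threshold, and the command strings are all hypothetical assumptions. Per-frame detections from a detector such as YOLOv3 (a class label plus a bounding-box centre) are buffered, and handcrafted rules on the motion of a stable gesture class turn static recognitions into dynamic commands.

```python
def interpret(history, move_threshold=40):
    """Map a short history of static detections to a dynamic command.

    history: list of (label, (cx, cy)) tuples, oldest first, where each
             tuple is one frame's detected gesture class and its
             bounding-box centre in pixels (illustrative format).
    Returns a command string, or None if no rule fires.
    """
    if len(history) < 2:
        return None  # need at least two frames to observe motion
    first_label, (x0, y0) = history[0]
    last_label, (x1, y1) = history[-1]
    if first_label != last_label:
        return None  # rules fire only while the gesture class is stable
    dx = x1 - x0
    if last_label == "palm":
        # horizontal palm swipe -> media navigation
        if dx > move_threshold:
            return "media_next"
        if dx < -move_threshold:
            return "media_previous"
        return None
    if last_label == "fist":
        # a held fist is treated as a static click gesture
        return "mouse_left_click"
    return None
```

In the real system the returned command would then be dispatched to the operating system through Pynput, e.g. pressing a media key with `pynput.keyboard.Controller` or clicking with `pynput.mouse.Controller`; the dispatch is kept abstract here so the rule logic stands alone.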
Pages: 1961-1970
Page count: 10
Journal: Journal of Electrical Engineering & Technology, 2022, Vol. 17