Multi-modal user interface combining eye tracking and hand gesture recognition

Cited by: 0
Authors
Hansol Kim
Kun Ha Suh
Eui Chul Lee
Affiliations
Department of Computer Science, Sangmyung University
Source
Journal on Multimodal User Interfaces, 2017, 11(3): 241-250
Keywords
Eye tracking; Hand gesture controller; Leap motion controller; Hand gesture recognition; Multi-modal interaction;
DOI
Not available
Abstract
Many studies on eye tracking have been conducted in diverse research areas. Nevertheless, eye tracking remains limited by low accuracy and severe cursor vibration caused by pupil tremor. Furthermore, because almost all selection interactions, such as click events, rely on dwell time or eye blinking, eye tracking suffers from both time consumption and involuntary blinks. In this paper, we therefore propose a multi-modal interaction method that combines eye tracking with hand gesture recognition using a commercial hand gesture controller. The method performs global, intuitive navigation with eye tracking and local, detailed navigation with the hand gesture controller, and it supports intuitive hand gestures for mouse-button clicking. Experimental results indicate that the time to target small points is significantly reduced with the proposed method. In particular, the method is advantageous on large displays with high spatial resolution. The proposed clicking interaction and modality-switching concept also showed an accurate recognition rate and a positive training effect, respectively.
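The coarse-to-fine idea in the abstract — gaze for global jumps, hand gestures for local refinement — can be sketched as a simple cursor-control loop. This is an illustration only, not the authors' implementation; the switching radius, the 2-D cursor model, and all names here are assumptions for the example.

```python
# Illustrative sketch (not the paper's code) of gaze/hand modality switching:
# far from the target the cursor snaps to the gaze point (global navigation);
# near the target, small hand-gesture deltas refine it (local navigation).
from dataclasses import dataclass
import math

SWITCH_RADIUS = 50.0  # px; assumed hand-over distance from cursor to target


@dataclass
class Cursor:
    x: float
    y: float


def step(cursor, gaze, hand_delta, target):
    """Advance the cursor one step and return the modality that was used."""
    if math.hypot(cursor.x - target[0], cursor.y - target[1]) > SWITCH_RADIUS:
        cursor.x, cursor.y = gaze          # coarse: follow the eyes
        return "gaze"
    dx, dy = hand_delta                    # fine: follow the hand
    cursor.x += dx
    cursor.y += dy
    return "hand"


cursor = Cursor(0.0, 0.0)
target = (300.0, 200.0)
m1 = step(cursor, gaze=(290.0, 195.0), hand_delta=(0.0, 0.0), target=target)
m2 = step(cursor, gaze=(290.0, 195.0), hand_delta=(5.0, 3.0), target=target)
print(m1, m2, (cursor.x, cursor.y))  # gaze hand (295.0, 198.0)
```

The first step starts far from the target, so the cursor jumps to the gaze point; the second step is within the switching radius, so the hand delta applies instead.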
Pages: 241-250
Page count: 9
Related papers (50 total)
  • [1] Multi-modal user interface combining eye tracking and hand gesture recognition
    Kim, Hansol
    Suh, Kun Ha
    Lee, Eui Chul
    [J]. JOURNAL ON MULTIMODAL USER INTERFACES, 2017, 11 (03) : 241 - 250
  • [2] Gesture Recognition on a New Multi-Modal Hand Gesture Dataset
    Schak, Monika
    Gepperth, Alexander
    [J]. PROCEEDINGS OF THE 11TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS (ICPRAM), 2021, : 122 - 131
  • [3] Multi-modal user interaction method based on gaze tracking and gesture recognition
    Lee, Heekyung
    Lim, Seong Yong
    Lee, Injae
    Cha, Jihun
    Cho, Dong-Chan
    Cho, Sunyoung
    [J]. SIGNAL PROCESSING-IMAGE COMMUNICATION, 2013, 28 (02) : 114 - 126
  • [4] Mudra: A Multi-Modal Smartwatch Interactive System with Hand Gesture Recognition and User Identification
    Guo, Kaiwen
    Zhou, Hao
    Tian, Ye
    Zhou, Wangqiu
    Ji, Yusheng
    Li, Xiang-Yang
    [J]. IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (IEEE INFOCOM 2022), 2022, : 100 - 109
  • [5] Gesture Recognition and Multi-modal Fusion on a New Hand Gesture Dataset
    Schak, Monika
    Gepperth, Alexander
    [J]. PATTERN RECOGNITION APPLICATIONS AND METHODS, ICPRAM 2021, ICPRAM 2022, 2023, 13822 : 76 - 97
  • [6] Multi-modal zero-shot dynamic hand gesture recognition
    Rastgoo, Razieh
    Kiani, Kourosh
    Escalera, Sergio
    Sabokrou, Mohammad
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 247
  • [7] MULTI-MODAL LEARNING FOR GESTURE RECOGNITION
    Cao, Congqi
    Zhang, Yifan
    Lu, Hanqing
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO (ICME), 2015,
  • [8] Multi-modal fusion for robust hand gesture recognition based on heterogeneous networks
    Zou, Yongxiang
    Cheng, Long
    Han, Lijun
    Li, Zhengwei
    [J]. SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2023, 66 (11) : 3219 - 3230