A multimodal virtual keyboard using eye-tracking and hand gesture detection

Citations: 0
Authors
Cecotti, H. [1 ]
Meena, Y. K. [2 ]
Prasad, G. [2 ]
Affiliations
[1] Fresno State Univ, Coll Sci & Math, Dept Comp Sci, Fresno, CA 93740 USA
[2] Ulster Univ, Intelligent Syst Res Ctr, Magee Campus, Derry, Londonderry, Northern Ireland
Keywords
DOI: none available
CLC number: R318 [Biomedical Engineering]
Subject classification code: 0831
Abstract
A large number of people with disabilities rely on assistive technologies to communicate with their families, use social media, and maintain a social life. Despite a significant increase in novel assistive technologies, robust, non-invasive, and inexpensive solutions still need to be proposed and optimized in relation to the physical abilities of the users. Reliable and robust identification of intentional visual commands is an important issue in the development of eye-movement based user interfaces. The detection of a command with an eye-tracking system can be achieved with a dwell time. Yet, a large number of people can use simple hand gestures as a switch to select a command. We propose a new virtual keyboard based on the detection of ten commands. The keyboard includes all the letters of the Latin script (upper and lower case), punctuation marks, digits, and a delete button. To select an item in the keyboard, the user points at it with the gaze and selects it with a hand gesture. The system has been evaluated across eight healthy subjects with five predefined hand gestures and a button for the selection. The results support the conclusion that a subject's performance, in terms of speed and information transfer rate (ITR), depends on the choice of the hand gesture. The best gesture for each subject provides a mean performance of 8.77±2.90 letters per minute, which corresponds to an ITR of 57.04±14.55 bits per minute. The results highlight that the hand gesture assigned for the selection of an item is inter-subject dependent.
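The information transfer rate reported in the abstract is, for selection-based interfaces of this kind, commonly computed with Wolpaw's formula: ITR per selection = log2(N) + P·log2(P) + (1−P)·log2((1−P)/(N−1)) for N commands selected with accuracy P, scaled by the selection rate. The sketch below is an illustration of that standard formula, not code from the paper; the function names and the clamping of below-chance accuracy to zero bits are my own choices.

```python
import math

def wolpaw_itr_per_selection(n_commands: int, accuracy: float) -> float:
    """Wolpaw's ITR in bits per selection for a choice among
    n_commands made with probability `accuracy` of being correct.
    At-or-below-chance accuracy is clamped to 0 bits."""
    n, p = n_commands, accuracy
    if p >= 1.0:
        return math.log2(n)   # perfect accuracy: log2(N) bits per selection
    if p <= 1.0 / n:
        return 0.0            # at or below chance: no information transferred
    return (math.log2(n)
            + p * math.log2(p)
            + (1.0 - p) * math.log2((1.0 - p) / (n - 1)))

def itr_bits_per_minute(n_commands: int, accuracy: float,
                        selections_per_minute: float) -> float:
    """Scale the per-selection ITR by the selection rate."""
    return wolpaw_itr_per_selection(n_commands, accuracy) * selections_per_minute

# Illustration: with the 10-command keyboard at perfect accuracy, each
# selection carries log2(10) ≈ 3.32 bits, so roughly 17.2 error-free
# selections per minute yield an ITR near the reported 57 bits/min.
print(round(itr_bits_per_minute(10, 1.0, 17.17), 2))  # → 57.04
```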
Pages: 3330-3333 (4 pages)
Related papers (50 in total)
  • [1] Exploration of the Virtual Reality Teleportation Methods Using Hand-Tracking, Eye-Tracking, and EEG
    Kim, Jinwook
    Jang, Hyunyoung
    Kim, Dooyoung
    Lee, Jeongmi
    [J]. INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2023, 39 (20) : 4112 - 4125
  • [2] Gesture objects detection and tracking for virtual text entry keyboard interface
    Yadav, Kuldeep Singh
    Monsley, Anish K.
    Laskar, Rabul Hussain
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (04) : 5317 - 5342
  • [3] Gesture objects detection and tracking for virtual text entry keyboard interface
    Kuldeep Singh Yadav
    Anish Monsley K.
    Rabul Hussain Laskar
    [J]. Multimedia Tools and Applications, 2023, 82 : 5317 - 5342
  • [4] Rendering Optimizations for Virtual Reality Using Eye-Tracking
    Matthews, Shawn
    Uribe-Quevedo, Alvaro
    Theodorou, Alexander
    [J]. 2020 22ND SYMPOSIUM ON VIRTUAL AND AUGMENTED REALITY (SVR 2020), 2020, : 398 - 405
  • [5] Change detection in desktop virtual environments: An eye-tracking study
    Karacan, Hacer Uke
    Cagiltay, Kursat
    Tekman, H. Gurkan
    [J]. COMPUTERS IN HUMAN BEHAVIOR, 2010, 26 (06) : 1305 - 1313
  • [6] Lung nodule detection using eye-tracking
    Antonelli, Michela
    Yang, Guang-Zhong
    [J]. 2007 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOLS 1-7, 2007, : 1021 - +
  • [7] Design of User-Friendly Virtual Thai Keyboard Based on Eye-Tracking Controlled System
    Tantisatirapong, Suchada
    Phothisonothai, Montri
    [J]. 2018 18TH INTERNATIONAL SYMPOSIUM ON COMMUNICATIONS AND INFORMATION TECHNOLOGIES (ISCIT), 2018, : 359 - 362
  • [8] Eye-tracking on virtual reality: a survey
    Moreno-Arjonilla, Jesus
    Lopez-Ruiz, Alfonso
    Jimenez-Perez, J. Roberto
    Callejas-Aguilera, Jose E.
    Jurado, Juan M.
    [J]. VIRTUAL REALITY, 2024, 28 (01)
  • [9] Eye-tracking on virtual reality: a survey
    Jesús Moreno-Arjonilla
    Alfonso López-Ruiz
    J. Roberto Jiménez-Pérez
    José E. Callejas-Aguilera
    Juan M. Jurado
    [J]. Virtual Reality, 2024, 28
  • [10] Joint Attention Simulation Using Eye-Tracking and Virtual Humans
    Courgeon, Matthieu
    Rautureau, Gilles
    Martin, Jean-Claude
    Grynszpan, Ouriel
    [J]. IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2014, 5 (03) : 238 - 250