A gaze-based virtual keyboard using a mouth switch for command selection

Cited: 0
Authors
Soundarajan, S. [1]
Cecotti, H. [1]
Affiliations
[1] Fresno State Univ, Dept Comp Sci, Coll Sci & Math, Fresno, CA 93740 USA
Keywords
DOI
Not available
CLC Classification
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Portable eye-trackers provide an efficient way to access a user's point of gaze on a computer screen. Thanks to eye-tracking, gaze-based virtual keyboards can be developed by taking into account constraints related to gaze-detection accuracy. In this paper, we propose a new gaze-based virtual keyboard where every letter can be accessed directly through a single command. In addition, we propose a USB mouth switch connected through a computer mouse, with the mouth switch replacing the left-click button. This approach is intended to tackle the Midas touch problem in eye-tracking for people who are severely disabled. Performance is evaluated on 10 participants by comparing the following three conditions: gaze detection with the mouth switch, gaze detection with dwell time based on the distance to the closest command, and gaze detection within the surface of the command box. Finally, a workload assessment using the NASA-TLX test was conducted for the different conditions. The results revealed that the proposed approach with the mouth switch provides better performance in terms of typing speed (36.6 ± 8.4 letters/minute) compared to the other conditions, and a high acceptance as an input device.
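The three selection conditions described in the abstract, a confirmation click from the mouth switch versus two dwell-time variants, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration of that selection logic, not the authors' implementation: the key layout, the get_gaze_point and switch_pressed helpers, and the assumption that the third condition counts dwell only while gaze stays inside the command box are all placeholders.

```python
# Hypothetical sketch of the three selection conditions from the abstract.
# Key layout and helper callables are assumptions, not the paper's code.
import math
import time

DWELL_TIME = 1.0  # seconds of sustained gaze required to trigger a selection


def nearest_key(gaze, keys):
    """Return the key whose center is closest to the gaze point."""
    return min(keys, key=lambda k: math.hypot(gaze[0] - k["cx"], gaze[1] - k["cy"]))


def inside_key(gaze, key):
    """True if the gaze point falls within the key's bounding box."""
    return (abs(gaze[0] - key["cx"]) <= key["w"] / 2 and
            abs(gaze[1] - key["cy"]) <= key["h"] / 2)


def select_letter(keys, get_gaze_point, switch_pressed, mode="switch"):
    """Run one selection under one of three conditions:
    'switch'        - gaze picks the nearest key; the mouth switch (seen by the
                      OS as a left mouse click) confirms it,
    'dwell_nearest' - dwell time on the nearest key, regardless of box boundaries,
    'dwell_inside'  - dwell time counted only while gaze stays inside the key box.
    """
    dwell_target, dwell_start = None, None
    while True:
        gaze = get_gaze_point()            # (x, y) from the eye-tracker
        target = nearest_key(gaze, keys)

        if mode == "switch":
            if switch_pressed():           # mouth switch wired as left click
                return target["letter"]
        else:
            hit = target if mode == "dwell_nearest" else (
                target if inside_key(gaze, target) else None)
            if hit is not None and hit is dwell_target:
                if time.monotonic() - dwell_start >= DWELL_TIME:
                    return hit["letter"]
            else:                          # gaze moved: restart the dwell timer
                dwell_target, dwell_start = hit, time.monotonic()
        time.sleep(0.01)                   # poll at ~100 Hz
```

Under this sketch, the mouth-switch condition needs no dwell timer at all, which is consistent with the abstract's finding that it yields the fastest typing speed.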
Pages: 3334-3337
Number of pages: 4
Related papers
50 records in total
  • [21] Gaze-Based Graphical Password Using Webcam
    Tiwari, Abhishek
    Pal, Rajarshi
    INFORMATION SYSTEMS SECURITY, ICISS 2018, 2018, 11281 : 448 - 461
  • [22] Gaze Typing in Virtual Reality: Impact of Keyboard Design, Selection Method, and Motion
    Rajanna, Vijay
    Hansen, John Paulin
    2018 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS (ETRA 2018), 2018,
  • [23] The impact of visual and motor space size on gaze-based target selection
    Wang, Qi-Jun
    Ma, Xiao-Xing
    Lu, Yi-Ni
    Wang, Du-Ming
    Sun, Yu-Hao
    AUSTRALIAN JOURNAL OF PSYCHOLOGY, 2024, 76 (01)
  • [24] A passive BCI for monitoring the intentionality of the gaze-based moving object selection
    Zhao, Darisy G.
    Vasilyev, Anatoly N.
    Kozyrskiy, Bogdan L.
    Melnichuk, Eugeny V.
    Isachenko, Andrey V.
    Velichkovsky, Boris M.
    Shishkin, Sergei L.
    JOURNAL OF NEURAL ENGINEERING, 2021, 18 (02)
  • [25] LAIF: A logging and interaction framework for gaze-based interfaces in virtual entertainment environments
    Nacke, Lennart E.
    Stellmach, Sophie
    Sasse, Dennis
    Niesenhaus, Joerg
    Dachselt, Raimund
    ENTERTAINMENT COMPUTING, 2011, 2 (04) : 265 - 273
  • [26] Using Variable Dwell Time to Accelerate Gaze-Based Web Browsing with Two-Step Selection
    Chen, Zhaokang
    Shi, Bertram E.
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2019, 35 (03) : 240 - 255
  • [27] PedVR: Simulating Gaze-Based Interactions between a Real User and Virtual Crowds
    Narang, Sahil
    Best, Andrew
    Randhavane, Tanmay
    Shapiro, Ari
    Manocha, Dinesh
    22ND ACM CONFERENCE ON VIRTUAL REALITY SOFTWARE AND TECHNOLOGY (VRST 2016), 2016, : 91 - 100
  • [28] An Explanation of Fitts' Law-like Performance in Gaze-Based Selection Tasks Using a Psychophysics Approach
    Schuetz, Immo
    Murdison, T. Scott
    MacKenzie, Kevin J.
    Zannoli, Marina
    CHI 2019: PROCEEDINGS OF THE 2019 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2019,
  • [29] Climbing Keyboard: A Tilt-Based Selection Keyboard Entry for Virtual Reality
    Huang, Junfeng
    Sun, Minghui
    Qin, Jun
    Gao, BoYu
    Qin, Guihe
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2024, 40 (05) : 1327 - 1338
  • [30] Gaze-based Anxiety Sensitive Virtual Social Communication Platform for Individuals with Autism
    Babu, Pradeep Raj Krishnappa
    Lahiri, Uttama
    EXTENDED ABSTRACTS OF THE 2022 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2022, 2022,