A multiscript gaze-based assistive virtual keyboard

Cited by: 0
Authors
Cecotti, H. [1 ]
Meena, Y. K. [2 ]
Bhushan, B. [2 ]
Dutta, A. [2 ]
Prasad, G. [3 ]
Affiliations
[1] Fresno State Univ, Coll Sci & Math, Dept Comp Sci, Fresno, CA 93740 USA
[2] Indian Inst Technol IIT, Ctr Mechatron, Dept Humanities & Social Sci, Kanpur, Uttar Pradesh, India
[3] Ulster Univ, Intelligent Syst Res Ctr, Coleraine, Londonderry, Northern Ireland
DOI
10.1109/embc.2019.8856446
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Discipline classification code
0831;
Abstract
The recent development of inexpensive and accurate eye-trackers enables the creation of gaze-based virtual keyboards that can be used by a large population of people with disabilities in developing countries. Thanks to eye-tracking technology, gaze-based virtual keyboards can be designed around the constraints of gaze-detection accuracy and the target display device. In this paper, we propose a new multimodal, multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed according to the script. Traditionally, virtual keyboards are assessed for a single language (e.g., English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate when participants spell a sentence using their gaze to point at a command and a dedicated mouth switch to select it. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, despite it being the participants' secondary language.
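The abstract reports performance as an information transfer rate but this record does not state the formula used. As a point of reference, the Wolpaw definition is the standard ITR measure in gaze- and BCI-based text entry; the sketch below computes it under that assumption, with illustrative parameter names (`n_targets`, `accuracy`, `seconds_per_selection`) that are not taken from the paper.

```python
import math

def itr_bits_per_min(n_targets: int, accuracy: float, seconds_per_selection: float) -> float:
    """Wolpaw information transfer rate, in bits per minute.

    n_targets: number of selectable commands (e.g., keys in the layout)
    accuracy:  probability of a correct selection, in (0, 1]
    seconds_per_selection: average time to complete one selection
    """
    if accuracy == 1.0:
        # Error terms vanish when every selection is correct.
        bits_per_selection = math.log2(n_targets)
    else:
        bits_per_selection = (
            math.log2(n_targets)
            + accuracy * math.log2(accuracy)
            + (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_targets - 1))
        )
    selections_per_minute = 60.0 / seconds_per_selection
    return bits_per_selection * selections_per_minute
```

For instance, a 4-command layout selected perfectly once every 60 s carries log2(4) = 2 bits per selection, i.e., 2 bits/min at that pace.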
Pages: 1306-1309
Page count: 4