A calibrated, real-time eye gaze tracking system as an assistive system for persons with motor disability

Cited: 0
Authors
Sesin, A [1 ]
Adjouadi, M [1 ]
Ayala, M [1 ]
Affiliation
[1] Florida Int Univ, Dept Elect & Comp Engn, Miami, FL 33174 USA
Keywords
eye gaze tracking; human-computer interaction; web browsing; editing;
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This study focuses on the design of an integrated, real-time assistive system as an alternate human-computer interface (HCI) that can be used by individuals with severe motor disabilities. The integrated aspect of the design was initially based on the use of the eye-gaze tracking (EGT) system [1] to obtain eye coordinates, which are sent to the computer interface, where they are normalized into mouse coordinates according to the current monitor resolution. The novelty of this work is that it simplifies the eye-tracking technique to achieve near real-time and user-friendly interaction. Besides moving the cursor, the user can also trigger the mouse's right click, implemented by keeping the eyes closed for a few milliseconds; this time interval is established so as to distinguish the action from a regular blink. The ultimate objective was to seek an HCI that allows individuals with severe motor disabilities to interact with a computer using only eye movement, with options for practical web browsing and editing.
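The two mechanisms the abstract describes — scaling raw EGT eye coordinates to mouse coordinates by the current monitor resolution, and distinguishing a deliberate eye closure (right click) from an involuntary blink by its duration — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the `BLINK_MAX_MS` threshold are assumptions for the sketch (the paper only states that the interval is chosen to exceed a regular blink).

```python
def egt_to_mouse(x_cam, y_cam, cam_w, cam_h, screen_w, screen_h):
    """Normalize raw EGT eye coordinates into mouse coordinates by
    scaling from the tracker's coordinate space to the current
    monitor resolution, then clamping so the cursor stays on screen."""
    mx = int(x_cam / cam_w * screen_w)
    my = int(y_cam / cam_h * screen_h)
    return (min(max(mx, 0), screen_w - 1),
            min(max(my, 0), screen_h - 1))


# Assumed threshold: eye closures shorter than this are treated as
# regular blinks; the paper does not publish its exact value.
BLINK_MAX_MS = 400


def classify_closure(duration_ms):
    """Interpret a sustained eye closure as a right-click request;
    anything shorter is an involuntary blink and is ignored."""
    return "right_click" if duration_ms > BLINK_MAX_MS else "blink"
```

For example, a gaze estimate at the center of a 640x480 tracker frame maps to the center of a 1920x1080 monitor, and a 100 ms closure is classified as a blink while a 600 ms closure triggers a right click.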
Pages: 399-404 (6 pages)
Related Papers (50 total)
  • [1] Real-time facial and eye gaze tracking system
    Park, KR
    Kim, J
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2005, E88D (06): : 1231 - 1238
  • [2] Remote eye gaze tracking system as a computer interface for persons with severe motor disability
    Adjouadi, M
    Sesin, A
    Ayala, M
    Cabrerizo, M
    [J]. COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS: PROCEEDINGS, 2004, 3118 : 761 - 769
  • [3] A distributed real time eye-gaze tracking system
    García, A
    Sánchez, FM
    Pérez, A
    Pedraza, JL
    Méndez, R
    Córdoba, ML
    Muñoz, ML
    [J]. ETFA 2003: IEEE CONFERENCE ON EMERGING TECHNOLOGIES AND FACTORY AUTOMATION, VOL 2, PROCEEDINGS, 2003, : 545 - 551
  • [4] Real time auto-focus algorithm for eye gaze tracking system
    Liu, Ruian
    Jin, Shijiu
    Wu, Xiaorong
    [J]. 2007 INTERNATIONAL SYMPOSIUM ON INTELLIGENT SIGNAL PROCESSING AND COMMUNICATION SYSTEMS, VOLS 1 AND 2, 2007, : 666 - +
  • [5] The evaluation of eye gaze using an eye tracking system in simulation training of real-time ultrasound-guided venipuncture
    Tatsuru, Kaji
    Keisuke, Yano
    Shun, Onishi
    Mayu, Matsui
    Ayaka, Nagano
    Masakazu, Murakami
    Koshiro, Sugita
    Toshio, Harumatsu
    Koji, Yamada
    Waka, Yamada
    Makoto, Matsukubo
    Mitsuru, Muto
    Kazuhiko, Nakame
    Satoshi, Ieiri
    [J]. JOURNAL OF VASCULAR ACCESS, 2022, 23 (03): : 360 - 364
  • [6] A Real-Time Eye Gaze Tracking Based Digital Mouse
    Kwak, SeHyun
    Lee, Daeho
    Kim, Siwon
    Park, Junghoon
    [J]. INNOVATIVE MOBILE AND INTERNET SERVICES IN UBIQUITOUS COMPUTING, IMIS 2024, 2024, 214 : 39 - 46
  • [7] RITnet: Real-time Semantic Segmentation of the Eye for Gaze Tracking
    Chaudhary, Aayush K.
    Kothari, Rakshit
    Acharya, Manoj
    Dangi, Shusil
    Nair, Nitinraj
    Bailey, Reynold
    Kanan, Christopher
    Diaz, Gabriel
    Pelz, Jeff B.
    [J]. 2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW), 2019, : 3698 - 3702
  • [8] Low Cost Real-time Eye Tracking System for Motorsports
    Xia, Yuanjie
    Lunardi, Andrew
    Heidari, Hadi
    Ghannam, Rami
    [J]. 2022 29TH IEEE INTERNATIONAL CONFERENCE ON ELECTRONICS, CIRCUITS AND SYSTEMS (IEEE ICECS 2022), 2022,
  • [9] An eye-gaze tracking system for people with motor disabilities
    Kim, DH
    Kim, JH
    Chung, MJ
    [J]. INTEGRATION OF ASSISTIVE TECHNOLOGY IN THE INFORMATION AGE, 2001, 9 : 249 - 255
  • [10] PRECISE NON-INTRUSIVE REAL-TIME GAZE TRACKING SYSTEM FOR EMBEDDED SETUPS
    Garcia-Dopico, Antonio
    Perez, Antonio
    Luis Pedraza, Jose
    Luisa Cordoba, Maria
    [J]. COMPUTING AND INFORMATICS, 2017, 36 (02) : 257 - 282