A System for Web Browsing by Eye-Gaze Input

Cited by: 6
Authors
Abe, Kiyohiko
Owada, Kosuke [1 ]
Ohi, Shoichi [1 ]
Ohyama, Minoru [1 ]
Affiliations
[1] Tokyo Denki Univ, Tokyo, Japan
Keywords
eye-gaze input; image analysis; natural light; Web browser; welfare device;
DOI
10.1002/ecj.10110
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes
0808; 0809
Abstract
We have developed an eye-gaze input system for people with severe physical disabilities, such as patients with amyotrophic lateral sclerosis (ALS). The system uses a personal computer and a home video camera to detect eye-gaze under natural light. It detects both vertical and horizontal eye-gaze by simple image analysis and requires no special image processing units or sensors. We have also developed a platform for eye-gaze input based on this system. In this paper, we propose a new Web browsing system for physically disabled computer users as an application of the eye-gaze input platform. The proposed system uses a method of direct indicator selection, in which indicators are categorized by function and organized hierarchically; users select the appropriate function by switching between indicator groups. The system also analyzes the locations of selectable objects on Web pages, such as hyperlinks, radio buttons, and edit boxes. Because these locations are stored, the mouse cursor can skip directly to a candidate input object, enabling faster Web browsing. (c) 2008 Wiley Periodicals, Inc. Electron Comm Jpn, 91(5): 11-18, 2008; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecj.10110
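The cursor-skipping behavior described in the abstract can be illustrated with a minimal sketch: given an estimated gaze point and the stored locations of selectable objects, move the cursor to the nearest one. The object names, coordinates, and the choice of Euclidean distance are illustrative assumptions, not details from the paper.

```python
import math

def nearest_object(gaze, objects):
    """Return the stored object whose center is closest to the gaze point."""
    return min(objects, key=lambda o: math.dist(gaze, o["center"]))

# Hypothetical pre-analyzed selectable objects on a Web page.
objects = [
    {"name": "hyperlink", "center": (120, 40)},
    {"name": "radio button", "center": (300, 200)},
    {"name": "edit box", "center": (500, 420)},
]

# A rough gaze estimate near the radio button snaps the cursor to it.
print(nearest_object((310, 180), objects)["name"])  # radio button
```

Snapping to the nearest stored object means the gaze estimate only needs to be accurate to within the spacing between objects, which is what makes this approach practical under coarse natural-light eye tracking.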
Pages: 11-18
Page count: 8
Related papers
50 in total
  • [41] Eye-Gaze and Mouse-Movements on Web Search as Indicators of Cognitive Impairment
    Gwizdka, Jacek; Tessmer, Rachel; Chan, Yao-Cheng; Radhakrishnan, Kavita; Henry, Maya L.
    INFORMATION SYSTEMS AND NEUROSCIENCE, NEUROIS RETREAT 2022, 2022, 58: 187-200
  • [42] Speed and accuracy of eye-gaze pointing
    Chi, CF; Lin, CL
    PERCEPTUAL AND MOTOR SKILLS, 1997, 85 (02): 705-718
  • [43] Eye-gaze word-processing
    Frey, LA; White, KP; Hutchinson, TE
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS, 1990, 20 (04): 944-950
  • [44] An eigenspace approach to eye-gaze estimation
    Bebis, G; Fujimura, K
    PARALLEL AND DISTRIBUTED COMPUTING SYSTEMS, 2000: 604-609
  • [45] Eye-gaze information input based on pupillary response to visual stimulus with luminance modulation
    Muto, Yumiko; Miyoshi, Hideka; Kaneko, Hirohiko
    PLOS ONE, 2020, 15 (01)
  • [46] Development of a Screen Keyboard System with Radially-Arranged Keys and its Effectiveness for Typing Including Eye-Gaze Input
    Ogata, Kohichi; Nozute, Shigeyoshi
    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2024, 19 (08): 1377-1386
  • [47] Navigating through Google Maps using an eye-gaze interface system
    Putra, Hanif Fermanda; Ogata, Kohichi
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2022, 18 (02): 417-432
  • [48] Eye-gaze orienting to auditory and tactile targets
    Soto-Faraco, S; Kingstone, A
    JOURNAL OF PSYCHOPHYSIOLOGY, 2005, 19 (01): 61-61
  • [49] Eye-Gaze Tracking based Interaction in India
    Biswas, Pradipta; Langdon, Pat
    6TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN COMPUTER INTERACTION, IHCI 2014, 2014, 39: 59-66
  • [50] Feasibility of Longitudinal Eye-Gaze Tracking in the Workplace
    Hutt, S.; Stewart, A. E. B.; Gregg, J.; Mattingly, S.; D'Mello, S. K.
    Proceedings of the ACM on Human-Computer Interaction, 2022, 6 (ETRA)