Proposal of Eye-gaze Recognize Method for Input Interface without Infra-red Ray Equipment

Cited by: 0
Authors
Fukushima, Kazuki [1 ]
Shirahama, Naruki [2 ]
Institutions
[1] Kitakyushu Natl Coll Technol, Dept Control Engn, Adv Engn, Kitakyushu, Fukuoka, Japan
[2] Kitakyushu Natl Coll Technol, Dept Elect & Control, Kitakyushu, Fukuoka, Japan
Keywords
assistive technology; eye-gaze; web-camera; image processing; welfare support-equipment; OpenCV;
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The main purpose of this study is to develop an inexpensive eye-gaze input interface for disabled people. Eye-gaze input suits many situations and places little load on the user because it is a non-contact interface. Many eye-gaze projects have been studied recently, and most eye-gaze input products use infra-red (IR) light to detect the iris. However, the potential harm of IR to human eyes has been pointed out, and an interface that requires an IR-equipped camera forces users to purchase specific devices. We therefore use only a PC and a camera without an IR function. Our system uses the motion template, one of the motion-tracking functions of the OpenCV library. Because this function recognizes only motion, not the point on the monitor that the user is watching, we built a calibration function that relates the eye gaze to the monitor. We propose two methods for recognizing eye gaze. Both require a binary image of the iris, which shows the iris shape; we expect that the user's visual point can be calculated from this image. One method computes the point from the centre of gravity of the iris region; the other uses a rectangular approximation. We conducted experiments on several subjects with both methods and compared the results to determine which method is more suitable and how accurate the calibration function is. In the experiments, the function placed a blue target at random positions on the monitor, changing its position at regular intervals; the user stared at the target and we checked the accuracy. If the function and methods are sound, the function should correctly recognize that the user is staring at or near the target. To improve accuracy, we will study how to detect the iris more reliably in future work.
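The two gaze estimators described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: given a binary iris image, it estimates the iris centre either by the centre of gravity of the iris pixels or by the centre of their bounding rectangle, and maps that centre to monitor coordinates with a simple linear calibration. All function names, the threshold-free synthetic image, and the linear calibration model are assumptions for illustration (the paper itself works with OpenCV).

```python
import numpy as np

def iris_center_gravity(binary):
    # Method 1: centre of gravity of the iris pixels (non-zero entries).
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

def iris_center_rect(binary):
    # Method 2: centre of the axis-aligned bounding rectangle
    # enclosing the iris pixels (rectangular approximation).
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None
    return ((xs.min() + xs.max()) / 2.0, (ys.min() + ys.max()) / 2.0)

def calibrate(eye_pt, eye_lo, eye_hi, screen_wh):
    # Hypothetical linear calibration: map the observed range of iris
    # centres (eye_lo..eye_hi, found while the user looks at known
    # targets) to monitor coordinates of size screen_wh.
    sx = (eye_pt[0] - eye_lo[0]) / (eye_hi[0] - eye_lo[0]) * screen_wh[0]
    sy = (eye_pt[1] - eye_lo[1]) / (eye_hi[1] - eye_lo[1]) * screen_wh[1]
    return (sx, sy)

# Synthetic binary "iris": a filled circle of radius 10 centred at (40, 30).
yy, xx = np.mgrid[0:60, 0:80]
iris = ((xx - 40) ** 2 + (yy - 30) ** 2 <= 100).astype(np.uint8)

print(iris_center_gravity(iris))  # (40.0, 30.0)
print(iris_center_rect(iris))     # (40.0, 30.0)
print(calibrate((40.0, 30.0), (30, 20), (50, 40), (1920, 1080)))
```

For a symmetric blob both estimators agree; they differ when the binarized iris is clipped by the eyelid, which is one reason to compare them experimentally as the paper does.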
Pages: 429-434 (6 pages)