NAVIGATING THROUGH GOOGLE MAPS USING AN EYE-GAZE INTERFACE SYSTEM

Cited by: 1
Authors
Putra, Hanif Fermanda [1 ]
Ogata, Kohichi [2 ]
Affiliations
[1] Kumamoto Univ, Grad Sch Sci & Technol, Chuo Ku, Kurokami 2-39-1, Kumamoto 8608555, Japan
[2] Kumamoto Univ, Fac Adv Sci & Technol, Chuo Ku, Kurokami 2-39-1, Kumamoto 8608555, Japan
Funding
Japan Society for the Promotion of Science
Keywords
Eye-gaze; Eye movement; Input system; Virtual tour; Navigation; Algorithm
DOI
10.24507/ijicic.18.02.417
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Over the decades, researchers and companies have developed many tools for human-computer interaction. One such tool, the eye-tracking device, has the potential to be used in a wide range of daily-life applications; yet, although numerous eye-tracking units are on the market, they are rarely used in day-to-day life. In this study, we present the potential of using an eye-tracking device and its interface as a navigation controller for Google Maps, a popular navigation system. Our goal is to translate eye-gaze movement into joystick-like movement. We compare the performance of the eye-gaze tracking system and a joystick in several short tasks and find little difference. The study also shows that a beginner does not require extensive training to control the eye-gaze system smoothly.
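The abstract's core idea of translating eye-gaze movement into joystick-like movement can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the authors' implementation: the offset of the gaze point from the screen centre is treated as a joystick deflection, with a dead zone so that steady fixation near the centre does not drift the map (the function name, dead-zone size, and speed scale are all assumptions).

```python
def gaze_to_pan(gaze_x, gaze_y, screen_w, screen_h, dead_zone=0.15, speed=10.0):
    """Convert a gaze position (in pixels) into a (dx, dy) map-pan step.

    The offset from the screen centre acts like a joystick deflection:
    gazing near the centre does nothing (dead zone); gazing toward an
    edge pans the map in that direction, faster the further out.
    """
    # Normalise the offset from the screen centre to [-1, 1] on each axis.
    nx = (gaze_x - screen_w / 2) / (screen_w / 2)
    ny = (gaze_y - screen_h / 2) / (screen_h / 2)
    # Suppress small deflections so fixation noise does not move the map.
    dx = 0.0 if abs(nx) < dead_zone else nx * speed
    dy = 0.0 if abs(ny) < dead_zone else ny * speed
    return dx, dy
```

In a real system, the returned (dx, dy) step would be fed to the map's pan control on each gaze sample, giving the continuous, joystick-like motion the study compares against a physical joystick.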
Pages: 417-432
Page count: 16