Robust Head Mounted Wearable Eye Tracking System for Dynamical Calibration

Cited by: 1
Authors:
Lanata, Antonio [1 ]
Greco, Alberto [1 ]
Valenza, Gaetano [1 ]
Scilingo, Enzo Pasquale [1 ]
Affiliation:
[1] Univ Pisa, Res Ctr E Piaggio, I-56100 Pisa, Italy
Source:
JOURNAL OF EYE MOVEMENT RESEARCH, 2015, Vol. 8, No. 5
Keywords:
Eye tracking; head movement; vestibulo-ocular reflex; gaze point; statistical comparison; dynamical calibration; MOOD RECOGNITION; MOVEMENTS;
DOI: not available
Chinese Library Classification: R77 [Ophthalmology]
Discipline code: 100212
Abstract:
In this work, a new head-mounted eye tracking system is presented. Based on computer vision techniques, the system integrates eye images and head movement in real time to perform robust gaze-point tracking. Nystagmus movements due to the vestibulo-ocular reflex are monitored and integrated. The system proposed here is a strongly improved version of a previous platform, called HATCAM, which was robust against changes in illumination conditions. The new version, called HAT-Move, is equipped with an accurate inertial measurement unit that detects head movement, enabling gaze estimation even in dynamic conditions. HAT-Move performance was investigated in a group of healthy subjects in both static and dynamic conditions, i.e., with the head kept still or free to move. Evaluation was performed in terms of the amplitude of the angular error between the real coordinates of the fixated points and those computed by the system, in two experimental setups: laboratory settings and a 3D virtual reality (VR) scenario. The results showed that HAT-Move achieves an angular gaze error of about 1 degree along both the horizontal and vertical directions.
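The angular-error metric used in the evaluation can be illustrated with a short sketch. This is not code from the paper; it shows one standard way to compute the angle between the true and estimated gaze rays from an assumed eye origin, where the function name, point coordinates, and origin are all illustrative assumptions:

```python
import numpy as np

def angular_error_deg(true_point, est_point, eye_origin=(0.0, 0.0, 0.0)):
    """Angle in degrees between the gaze ray toward the true target
    and the ray toward the system's estimated gaze point."""
    origin = np.asarray(eye_origin, dtype=float)
    v_true = np.asarray(true_point, dtype=float) - origin
    v_est = np.asarray(est_point, dtype=float) - origin
    # Cosine of the angle between the two rays, clipped for numerical safety
    cos_a = np.dot(v_true, v_est) / (np.linalg.norm(v_true) * np.linalg.norm(v_est))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Target 1 m ahead; an estimate off by ~1.7 cm horizontally
# corresponds to roughly the 1-degree error reported in the abstract.
err = angular_error_deg([0.0, 0.0, 1.0], [0.017, 0.0, 1.0])
```

Expressing accuracy as an angle rather than a pixel or metric offset makes the result independent of viewing distance, which is why it is the usual figure of merit for gaze trackers.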
Pages: 15