Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles

Cited by: 0
Authors
Velisar, Anca [1 ]
Shanidze, Natela M. [1 ]
Affiliations
[1] Smith Kettlewell Eye Res Inst, 2318 Fillmore St, San Francisco, CA 94115 USA
Keywords
Eye movements; Head-mounted eye tracking; Wearable eye tracking; Mobile eye tracking; Data quality; Head movement; Body movement; Calibration; Accuracy; Precision; Gaze; Vision
DOI
10.3758/s13428-023-02150-0
Chinese Library Classification: B841 [Research methods in psychology]
Subject classification code: 040201
Abstract
Head-mounted, video-based eye tracking is becoming increasingly common and has promise in a range of applications. Here, we provide a practical and systematic assessment of the sources of measurement uncertainty for one such device, the Pupil Core, in three eye-tracking domains: (1) the 2D scene camera image; (2) the physical rotation of the eye relative to the scene camera 3D space; and (3) the external projection of the estimated gaze point location onto the target plane or in relation to world coordinates. We also assess eye camera motion during active tasks relative to the eye and the scene camera, an important consideration as the rigid arrangement of eye and scene camera is essential for proper alignment of the detected gaze. We find that eye camera motion, improper gaze point depth estimation, and erroneous eye models can all lead to added noise that must be considered in the experimental design. Further, while calibration accuracy and precision estimates can help assess data quality in the scene camera image, they may not be reflective of errors and variability in gaze point estimation. These findings support the importance of eye model constancy for comparisons across experimental conditions and suggest additional assessments of data reliability may be warranted for experiments that require the gaze point or measure eye movements relative to the external world.
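The abstract contrasts calibration accuracy and precision measured in the scene camera image with errors in the externally projected gaze point. As a minimal sketch of how such accuracy (mean angular offset from a fixated target) and RMS sample-to-sample precision are commonly computed from 3D gaze direction estimates, the code below assumes NumPy arrays of gaze and target unit vectors; the function names, variable names, and synthetic data are illustrative assumptions, not the paper's code or the Pupil Core API.

```python
# Sketch only: conventional angular accuracy and RMS sample-to-sample
# precision for 3D gaze direction estimates expressed in scene camera space.
# Array names and the single-fixation framing are illustrative assumptions.
import numpy as np

def angular_error_deg(a, b):
    """Angle in degrees between corresponding unit vectors in rows of a and b."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    cos = np.clip(np.sum(a * b, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def accuracy_deg(gaze_dirs, target_dir):
    """Accuracy: mean angular offset between gaze samples and the fixated target direction."""
    return float(np.mean(angular_error_deg(gaze_dirs, target_dir[None, :])))

def rms_s2s_precision_deg(gaze_dirs):
    """Precision: RMS of angular distances between successive gaze samples."""
    d = angular_error_deg(gaze_dirs[:-1], gaze_dirs[1:])
    return float(np.sqrt(np.mean(d ** 2)))

# Example with synthetic data: roughly 0.5 deg of isotropic noise around one target direction.
rng = np.random.default_rng(0)
target = np.array([0.0, 0.0, 1.0])
gaze = target + rng.normal(scale=np.radians(0.5), size=(500, 3))
print(accuracy_deg(gaze, target), rms_s2s_precision_deg(gaze))
```

Accuracy captures systematic offset (for example, from an erroneous eye model), while sample-to-sample precision captures frame-to-frame noise; as the abstract notes, good values of these scene-camera metrics do not necessarily carry over to the projected gaze point in the target plane or world coordinates.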
Pages: 53-79
Page count: 27