Cleaning up systematic error in eye-tracking data by using required fixation locations

Cited by: 114
Authors
Hornof, AJ [1 ]
Halverson, T [1 ]
Affiliation
[1] Univ Oregon, Dept Comp & Informat Sci, Eugene, OR 97403 USA
DOI
10.3758/BF03195487
CLC number
B841 [Psychology research methods];
Subject classification code
040201;
Abstract
In the course of running an eye-tracking experiment, one computer system or subsystem typically presents the stimuli to the participant and records manual responses, and another collects the eye movement data, with little interaction between the two during the course of the experiment. This article demonstrates how the two systems can interact with each other to facilitate a richer set of experimental designs and applications and to produce more accurate eye-tracking data. In an eye-tracking study, a participant is periodically instructed to look at specific screen locations, or explicit required fixation locations (RFLs), in order to calibrate the eye tracker to the participant. The design of an experimental procedure will also often produce a number of implicit RFLs: screen locations that the participant must look at within a certain window of time or at a certain moment in order to successfully and correctly accomplish a task, but without explicit instructions to fixate those locations. In these windows of time or at these moments, the disparity between the fixations recorded by the eye tracker and the screen locations corresponding to implicit RFLs can be examined, and the results of the comparison can be used for a variety of purposes. This article shows how the disparity can be used to monitor the deterioration in the accuracy of the eye tracker calibration and to automatically invoke a recalibration procedure when necessary. This article also demonstrates how the disparity will vary across screen regions and participants and how each participant's unique error signature can be used to reduce the systematic error in the eye movement data collected for that participant.
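The correction scheme sketched in the abstract lends itself to a straightforward implementation. Below is a minimal Python sketch, assuming fixation/RFL pairs have already been extracted from the recorded data; the Sample class, the 256-pixel region grid, the ten-sample window, and the 40-pixel recalibration threshold are illustrative assumptions, not parameters reported in the article.

    # Sketch: per-participant correction of systematic eye-tracking error
    # using implicit required fixation locations (RFLs). Names, the region
    # grid, and thresholds are illustrative assumptions.
    from dataclasses import dataclass
    from collections import defaultdict
    from statistics import mean

    @dataclass
    class Sample:
        fix_x: float   # fixation location reported by the eye tracker (pixels)
        fix_y: float
        rfl_x: float   # screen location the task required the participant to fixate
        rfl_y: float

    def region_of(x, y, cell=256):
        # Bucket a screen location into a coarse grid cell (assumed 256-px cells).
        return (int(x // cell), int(y // cell))

    def error_signature(samples):
        # Mean disparity (dx, dy) between recorded fixations and implicit RFLs,
        # computed separately for each screen region.
        by_region = defaultdict(list)
        for s in samples:
            by_region[region_of(s.rfl_x, s.rfl_y)].append(
                (s.fix_x - s.rfl_x, s.fix_y - s.rfl_y))
        return {r: (mean(dx for dx, _ in v), mean(dy for _, dy in v))
                for r, v in by_region.items()}

    def correct(fix_x, fix_y, signature):
        # Subtract the participant's regional error estimate from a raw fixation.
        dx, dy = signature.get(region_of(fix_x, fix_y), (0.0, 0.0))
        return fix_x - dx, fix_y - dy

    def needs_recalibration(samples, threshold_px=40.0):
        # Flag a drifting calibration when the mean disparity magnitude over the
        # most recent implicit RFLs exceeds an (assumed) pixel threshold.
        recent = samples[-10:]
        dists = [((s.fix_x - s.rfl_x) ** 2 + (s.fix_y - s.rfl_y) ** 2) ** 0.5
                 for s in recent]
        return bool(dists) and mean(dists) > threshold_px

In use, error_signature would be rebuilt as new implicit RFLs accrue during the session, and needs_recalibration would be checked after each trial so that a recalibration procedure can be invoked automatically when the disparity grows too large.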
Pages: 592-604
Page count: 13
Related papers
50 records in total
  • [21] INTER-EYE: Interactive Error Compensation for Eye-Tracking Devices
    De Cecco, Mariolino
    Zanetti, Matteo
    Fornaser, Alberto
    Leuci, Malvina
    Conci, Nicola
    [J]. PROCEEDINGS OF THE 12TH INTERNATIONAL AIVELA CONFERENCE ON VIBRATION MEASUREMENTS BY LASER AND NONCONTACT TECHNIQUES: ADVANCES AND APPLICATIONS, 2016, 1740
  • [22] Reducing Fixation Error Due to Natural Head Movement in a Webcam-based Eye-tracking Method
    Kunz, Manuela
    Syed, Arsalan
    Fraser, Kathleen C.
    Wallace, Bruce
    Goubran, Rafik
    Knoefel, Frank
    Thomas, Neil
    [J]. 2023 IEEE SENSORS APPLICATIONS SYMPOSIUM, SAS, 2023,
  • [23] Eye-tracking and Instagram: The effect of viewing context on fixation duration
    Merta, Rhiannon L.
    [J]. INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2023, 58 : 651 - 652
  • [24] The Layout of Web Pages: A Study on the Relation between Information Forms and Locations Using Eye-Tracking
    Li, Mi
    Song, Yangyang
    Lu, Shengfu
    Zhong, Ning
    [J]. ACTIVE MEDIA TECHNOLOGY, PROCEEDINGS, 2009, 5820 : 207 - 216
  • [25] Quantification of fixation stability of upward gaze in myasthenia gravis by using an eye-tracking system
    Mihara, Miharu
    Kakeue, Ken
    Fujita, Kazuya
    Tamura, Ryoi
    Hayashi, Atsushi
    [J]. INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 2016, 57 (12)
  • [26] EYE TRACKING AND THE TRANSLATION PROCESS: REFLECTIONS ON THE ANALYSIS AND INTERPRETATION OF EYE-TRACKING DATA
    Hvelplund, Kristian Tangsgaard
    [J]. MONTI, 2014, : 201 - 223
  • [27] Mining Eye-Tracking Data for Text Summarization
    Taieb-Maimon, Meirav
    Romanovski-Chernik, Aleksandr
    Last, Mark
    Litvak, Marina
    Elhadad, Michael
    [J]. INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2024, 40 (17) : 4887 - 4905
  • [28] Using Eye-Tracking and Form Completion Data to Optimize Form Instructions
    Alton, Noel T.
    Rinn, Caitlin
    Summers, Kathryn
    Straub, Kath
    [J]. 2014 IEEE INTERNATIONAL PROFESSIONAL COMMUNICATION CONFERENCE (IPCC), 2014,
  • [29] Infancy Guidelines for Publishing Eye-Tracking Data
    Oakes, Lisa M.
    [J]. INFANCY, 2010, 15 (01) : 1 - 5
  • [30] TAUPE: Visualizing and analyzing eye-tracking data
    De Smet, Benoit
    Lempereur, Lorent
    Sharafi, Zohreh
    Gueheneuc, Yann-Gael
    Antoniol, Giuliano
    Habra, Naji
    [J]. SCIENCE OF COMPUTER PROGRAMMING, 2014, 79 : 260 - 278