MYFix: Automated Fixation Annotation of Eye-Tracking Videos

Cited by: 0
Authors
Alinaghi, Negar [1 ]
Hollendonner, Samuel [1 ]
Giannopoulos, Ioannis [1 ]
Affiliations
[1] Vienna Univ Technol, Res Div Geoinformat, Wiedner Hauptstr 8-E120, A-1040 Vienna, Austria
Keywords
automatic fixation annotation; object detection; semantic segmentation; outdoor mobile eye-tracking; gaze
DOI
10.3390/s24092666
Chinese Library Classification (CLC)
O65 [Analytical Chemistry]
Subject Classification Codes
070302; 081704
Abstract
In mobile eye-tracking research, the automatic annotation of fixation points is an important yet difficult task, especially in varied and dynamic environments such as outdoor urban landscapes, where both the observer and the scene are in constant motion. This paper presents a novel approach that integrates two foundation models, YOLOv8 and Mask2Former, into a pipeline that automatically annotates fixation points without requiring additional training or fine-tuning. The pipeline leverages YOLO's extensive training on the MS COCO dataset for object detection and Mask2Former's training on the Cityscapes dataset for semantic segmentation. This integration not only streamlines the annotation process but also improves accuracy and consistency, yielding reliable annotations even in complex scenes with multiple objects side by side or at different depths. Validation through two experiments demonstrates its efficiency: the pipeline achieves 89.05% accuracy in a controlled data-collection experiment and 81.50% accuracy in a real-world outdoor wayfinding scenario. With an average runtime of 1.61 ± 0.35 s per frame, our approach stands as a robust solution for automatic fixation annotation.
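The fusion described in the abstract — object detections for discrete objects, semantic segmentation as a dense fallback — can be sketched as a per-fixation lookup. The heuristic below (preferring the smallest detection box that encloses the fixation, which tends to pick the nearer object when boxes overlap at different depths) is an illustrative assumption, not the paper's exact rule; `detections` and `seg_labels` stand in for YOLOv8 and Mask2Former outputs.

```python
def annotate_fixation(fix_x, fix_y, detections, seg_labels):
    """Label a fixation point by fusing detection and segmentation outputs.

    detections: list of (label, x1, y1, x2, y2) boxes, e.g. from YOLOv8.
    seg_labels: 2D grid of per-pixel class names, e.g. from Mask2Former.

    Heuristic (an assumption, not the paper's exact rule): prefer the
    smallest detection box enclosing the fixation, since with objects at
    different depths the nearer one usually has the tighter box; fall
    back to the semantic-segmentation label at that pixel.
    """
    hits = [(label, (x2 - x1) * (y2 - y1))
            for label, x1, y1, x2, y2 in detections
            if x1 <= fix_x <= x2 and y1 <= fix_y <= y2]
    if hits:
        return min(hits, key=lambda h: h[1])[0]   # smallest enclosing box
    return seg_labels[int(fix_y)][int(fix_x)]     # segmentation fallback


# Toy frame: a "car" box containing a "person" box, on a "road" background.
detections = [("person", 10, 10, 50, 50), ("car", 0, 0, 100, 100)]
seg_labels = [["road"] * 200 for _ in range(100)]

print(annotate_fixation(20, 20, detections, seg_labels))   # person (smaller box)
print(annotate_fixation(80, 80, detections, seg_labels))   # car
print(annotate_fixation(150, 5, detections, seg_labels))   # road (no box hit)
```

Running the heuristic per video frame at each fixation's pixel coordinates is one plausible way to obtain the per-fixation labels the paper evaluates.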
Pages: 21