Detecting Mental Fatigue from Eye-Tracking Data Gathered While Watching Video

Cited by: 10
Authors:
Yamada, Yasunori [1]
Kobayashi, Masatomo [1]
Affiliation:
[1] IBM Res Tokyo, Tokyo, Japan
Source:
ARTIFICIAL INTELLIGENCE IN MEDICINE, AIME 2017 | 2017 / Vol. 10259
Funding:
Japan Science and Technology Agency (JST);
Keywords:
Mental fatigue; Cognitive fatigue; Feature selection; Natural viewing; Free viewing; Visual attention model; CLASSIFICATION; ATTENTION; SELECTION;
DOI
10.1007/978-3-319-59758-4_34
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Subject Classification Codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
Monitoring mental fatigue is of increasing importance for improving cognitive performance and health outcomes. Previous models using eye-tracking data allow inference of fatigue in cognitive tasks, such as driving, but they require the user to engage in a specific cognitive task. A model capable of estimating fatigue from eye-tracking data in natural-viewing situations, when an individual is not performing cognitive tasks, has many potential applications. Here, we collected eye-tracking data from 18 adults as they watched video clips (simulating the situation of watching TV programs) before and after performing cognitive tasks. Using these data, we built a fatigue-detection model that includes novel feature sets and an automated feature selection method. With eye-tracking data from only 30 seconds of video viewing, our model could determine whether a person was fatigued with 91.0% accuracy in 10-fold cross-validation (chance: 50%). Through a comparison with a model incorporating the feature sets used in previous studies, we showed that our model improved detection accuracy by up to 13.9 percentage points (from 77.1% to 91.0%).
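For illustration only, the following is a minimal Python/scikit-learn sketch of the kind of evaluation setup the abstract describes: eye-tracking features computed over 30-second viewing windows, an automated feature-selection step, and a binary fatigue classifier scored with 10-fold cross-validation. The feature layout, the SelectKBest selector, and the SVM classifier are assumptions made for this sketch; the paper's actual feature sets and selection method are its own contribution and are not reproduced here.

# Minimal sketch (not the authors' implementation): classifying 30-second
# eye-tracking windows as fatigued vs. rested with automated feature
# selection and 10-fold cross-validation.
# The feature names, SelectKBest selector, and SVM classifier are
# illustrative assumptions, not the method used in the paper.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: one row per 30-second viewing window; columns are
# eye-tracking features (e.g., fixation duration, saccade amplitude,
# blink rate, pupil-diameter statistics).
X = rng.normal(size=(180, 40))       # 180 windows x 40 candidate features
y = rng.integers(0, 2, size=180)     # 1 = after tasks (fatigued), 0 = before

pipeline = Pipeline([
    ("scale", StandardScaler()),                 # normalize feature ranges
    ("select", SelectKBest(f_classif, k=10)),    # automated feature selection
    ("clf", SVC(kernel="rbf", C=1.0)),           # binary fatigue classifier
])

# 10-fold cross-validation, mirroring the evaluation protocol in the abstract.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(pipeline, X, y, cv=cv, scoring="accuracy")
print(f"Mean accuracy: {scores.mean():.3f} (chance = 0.5)")

Keeping the feature-selection step inside the pipeline means it is re-fit within each cross-validation fold, so features are never selected using the held-out windows.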
Pages: 295-304
Page count: 10
Related Papers
50 records in total
  • [31] Combining Eye-tracking Data with an Analysis of Video Content from Free-viewing a Video of a Walk in an Urban Park Environment
    Amati, Marco
    McCarthy, Chris
    Parmehr, Ebadat Ghanbari
    Sita, Jodi
    JOVE-JOURNAL OF VISUALIZED EXPERIMENTS, 2019, (147):
  • [32] Effects of data preprocessing on detecting autism in adults using web-based eye-tracking data
    Khalaji, Erfan
    Eraslan, Sukru
    Yesilada, Yeliz
    Yaneva, Victoria
    BEHAVIOUR & INFORMATION TECHNOLOGY, 2023, 42 (14) : 2476 - 2484
  • [33] Eye-Tracking in Assessment of the Mental Workload of Harvester Operators
    Naskrent, Bartlomiej
    Grzywinski, Witold
    Polowy, Krzysztof
    Tomczak, Arkadiusz
    Jelonek, Tomasz
    INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH, 2022, 19 (09)
  • [34] Subtitles in VR 360° video. Results from an eye-tracking experiment
    Brescia-Zapata, Marta
    Krejtz, Krzysztof
    Duchowski, Andrew T.
    Hughes, Chris J.
    Orero, Pilar
    PERSPECTIVES-STUDIES IN TRANSLATION THEORY AND PRACTICE, 2025, 33 (02): : 357 - 379
  • [35] EYE TRACKING AND THE TRANSLATION PROCESS: REFLECTIONS ON THE ANALYSIS AND INTERPRETATION OF EYE-TRACKING DATA
    Hvelplund, Kristian Tangsgaard
    MONTI, 2014, : 201 - 223
  • [36] Mining Eye-Tracking Data for Text Summarization
    Taieb-Maimon, Meirav
    Romanovski-Chernik, Aleksandr
    Last, Mark
    Litvak, Marina
    Elhadad, Michael
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2024, 40 (17) : 4887 - 4905
  • [37] TAUPE: Visualizing and analyzing eye-tracking data
    De Smet, Benoit
    Lempereur, Lorent
    Sharafi, Zohreh
    Gueheneuc, Yann-Gael
    Antoniol, Giuliano
    Habra, Naji
    SCIENCE OF COMPUTER PROGRAMMING, 2014, 79 : 260 - 278
  • [38] Infancy Guidelines for Publishing Eye-Tracking Data
    Oakes, Lisa M.
    INFANCY, 2010, 15 (01) : 1 - 5
  • [39] Static visualization of temporal eye-tracking data
Räihä, KJ
    Aula, A
    Majaranta, P
    Rantala, H
    Koivunen, K
    HUMAN-COMPUTER INTERACTION - INTERACT 2005, PROCEEDINGS, 2005, 3585 : 946 - 949
  • [40] Campus environments and mental restoration: eye-tracking evidence from dynamic stimuli
    Wang, Mengrui
    Zhang, Shuting
    Zhou, Xiang
    ENGINEERING CONSTRUCTION AND ARCHITECTURAL MANAGEMENT, 2025,