A review of machine learning in scanpath analysis for passive gaze-based interaction

Cited by: 1
Authors
Selim, Abdulrahman Mohamed [1 ]
Barz, Michael [1 ,2 ]
Bhatti, Omair Shahzad [1 ]
Alam, Hasan Md Tusfiqur [1 ]
Sonntag, Daniel [1 ,2 ]
Affiliations
[1] German Research Center for Artificial Intelligence (DFKI), Interactive Machine Learning Department, Saarbrücken, Germany
[2] Carl von Ossietzky University of Oldenburg, Applied Artificial Intelligence, Oldenburg, Germany
Keywords
machine learning; eye tracking; scanpath; passive gaze-based interaction; literature review; eye-movement patterns; tracking; prediction; classification; recognition; algorithms; visualization; accuracy; networks; search
DOI
10.3389/frai.2024.1391745
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The scanpath is an important concept in eye tracking. It refers to a person's eye movements over a period of time, commonly represented as a series of alternating fixations and saccades. Machine learning has been increasingly used for the automatic interpretation of scanpaths over the past few years, particularly in research on passive gaze-based interaction, i.e., interfaces that implicitly observe and interpret human eye movements, with the goal of improving the interaction. This literature review investigates research on machine learning applications in scanpath analysis for passive gaze-based interaction between 2012 and 2022, starting from 2,425 publications and focusing on 77 publications. We provide insights into research domains and common learning tasks in passive gaze-based interaction and present common machine learning practices, from data collection and preparation to model selection and evaluation. We discuss commonly followed practices and identify gaps and challenges, especially concerning emerging machine learning topics, to guide future research in the field.
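To make the scanpath representation described above concrete, the following minimal Python sketch models a scanpath as an alternating sequence of fixations and saccades and flattens it into a per-fixation feature sequence, a common input format for sequence models. The sketch is illustrative only; the class and field names (Fixation, Saccade, Scanpath, to_feature_sequence) are assumptions and are not taken from the reviewed publications.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Fixation:
    x: float             # horizontal gaze position (e.g., screen pixels)
    y: float             # vertical gaze position
    duration_ms: float   # how long the gaze rested at this location

@dataclass
class Saccade:
    amplitude_deg: float  # angular distance of the rapid eye movement
    duration_ms: float

@dataclass
class Scanpath:
    # Illustrative sketch: saccades[i] connects fixations[i] to fixations[i + 1].
    fixations: List[Fixation]
    saccades: List[Saccade]

    def to_feature_sequence(self) -> List[Tuple[float, float, float]]:
        # Flatten into per-fixation feature vectors, a typical input
        # representation for sequence models in scanpath classification.
        return [(f.x, f.y, f.duration_ms) for f in self.fixations]

# Example: a three-fixation scanpath with two connecting saccades.
sp = Scanpath(
    fixations=[Fixation(100, 200, 250), Fixation(400, 220, 180), Fixation(390, 600, 300)],
    saccades=[Saccade(8.5, 40), Saccade(10.2, 45)],
)
print(sp.to_feature_sequence())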
Pages: 28
Related Papers (50 records in total; first 10 shown below)
  • [1] Microscopic Analysis Using Gaze-Based Interaction
    Fruehberger, Peter
    Klaus, Edmund
    Beyerer, Juergen
    INTERNATIONAL MULTIDISCIPLINARY MICROSCOPY CONGRESS, 2014, 154 : 195 - 200
  • [2] Multimodal Gaze-based Interaction
    Pfeiffer, Thies
    Wachsmuth, Ipke
    AT-AUTOMATISIERUNGSTECHNIK, 2013, 61 (11) : 770 - 776
  • [3] Gaze-based Interaction for Virtual Environments
    Jimenez, Jorge
    Gutierrez, Diego
    Latorre, Pedro
    JOURNAL OF UNIVERSAL COMPUTER SCIENCE, 2008, 14 (19) : 3085 - 3098
  • [4] Gaze-Based Interaction for VR Environments
    Piotrowski, Patryk
    Nowosielski, Adam
    IMAGE PROCESSING AND COMMUNICATIONS: TECHNIQUES, ALGORITHMS AND APPLICATIONS, 2020, 1062 : 41 - 48
  • [5] Leveraging Implicit Gaze-Based User Feedback for Interactive Machine Learning
    Bhatti, Omair
    Barz, Michael
    Sonntag, Daniel
    ADVANCES IN ARTIFICIAL INTELLIGENCE, KI 2022, 2022, 13404 : 9 - 16
  • [6] A Scrolling Approach for Gaze-Based Interaction
    Schniederjann, Florian
    Korthing, Lars
    Broesterhaus, Jonas
    Mertens, Robert
    2019 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM 2019), 2019, : 233 - 234
  • [7] Improving usability for video analysis using gaze-based interaction
    Hild, Jutta
    Peinsipp-Byma, Elisabeth
    Klaus, Edmund
    FULL MOTION VIDEO (FMV) WORKFLOWS AND TECHNOLOGIES FOR INTELLIGENCE, SURVEILLANCE, AND RECONNAISSANCE (ISR) AND SITUATIONAL AWARENESS, 2012, 8386
  • [8] Gaze-based interaction: A 30 year retrospective
    Duchowski, Andrew T.
    COMPUTERS & GRAPHICS-UK, 2018, 73 : 59 - 69
  • [9] Interactive Assessment Tool for Gaze-based Machine Learning Models in Information Retrieval
    Valdunciel, Pablo
    Bhatti, Omair Shahzad
    Barz, Michael
    Sonntag, Daniel
    CHIIR'22: PROCEEDINGS OF THE 2022 CONFERENCE ON HUMAN INFORMATION INTERACTION AND RETRIEVAL, 2022, : 332 - 336
  • [10] Gaze-Based Interaction for Interactive Storytelling in VR
    Drewes, Heiko
    Mueller, Evelyn
    Rothe, Sylvia
    Hussmann, Heinrich
    AUGMENTED REALITY, VIRTUAL REALITY, AND COMPUTER GRAPHICS, 2021, 12980 : 91 - 108