Human classifier: Observers can deduce task solely from eye movements

Citations: 0
Authors
Brett Bahle
Mark Mills
Michael D. Dodd
Affiliations
[1] University of Iowa, Department of Psychological and Brain Sciences
[2] University of Nebraska
Keywords
Categorization; Visual search; Eye movements; Cognitive
DOI
Not available
Abstract
Computer classifiers have been successful at classifying various tasks using eye movement statistics. However, the question of human classification of task from eye movements has rarely been studied. Across two experiments, we examined whether humans could classify task based solely on the eye movements of other individuals. In Experiment 1, human classifiers were shown one of three sets of eye movements: Fixations, which were displayed as blue circles, with larger circles indicating longer fixation durations; Scanpaths, which were displayed as yellow arrows; and Videos, in which a neon green dot moved around the screen. There was an additional Scene manipulation in which eye movement properties were displayed either on the original scene where the task (Search, Memory, or Rating) was performed or on a black background in which no scene information was available. Experiment 2 used similar methods but only displayed Fixations and Videos with the same Scene manipulation. The results of both experiments showed successful classification of Search. Interestingly, Search was best classified in the absence of the original scene, particularly in the Fixation condition. Memory was also classified above chance, with the strongest classification occurring with Videos in the presence of the scene. Additional analyses on the pattern of correct responses in these two conditions demonstrated which eye movement properties successful classifiers were using. These findings demonstrate conditions under which humans can extract information from eye movement characteristics, in addition to providing insight into the relative success/failure of previous computer classifiers.
Pages: 1415-1425 (10 pages)
Related Articles (50 total)
  • [31] What We Can Learn about Reading Development from the Analysis of Eye Movements
    Korneev A.A.
    Matveeva E.Y.
    Akhutina T.V.
    [J]. Human Physiology, 2018, 44(2): 183-190
  • [32] Task-switching, inhibitory control and the role of trait anxiety: Evidence from eye-movements during an antisaccade task
    Ansari, Tahereh L.
    Derakshan, Nazanin
    [J]. Psychophysiology, 2007, 44: S23-S24
  • [33] Exploring Human Cognition From Eye-Movements: Is There Unconscious Visual Information?
    Mathema, Rujeena
    Lind, Pedro G.
    Lencastre, Pedro
    [J]. Proceedings of the 5th ACM Workshop on Intelligent Cross-Data Analysis and Retrieval, ICDAR 2024, 2024: 19-26
  • [34] Features extraction from human eye movements via echo state network
    Koprinkova-Hristova, Petia
    Stefanova, Miroslava
    Genova, Bilyana
    Bocheva, Nadejda
    Kraleva, Radoslava
    Kralev, Velin
    [J]. Neural Computing & Applications, 2020, 32(9): 4213-4226
  • [36] Can negative emotion of task-irrelevant working memory representation affect its attentional capture? A study of eye movements
    Huang Yuesheng
    Zhang Bao
    Fan Xinhua
    Huang Jie
    [J]. Acta Psychologica Sinica, 2021, 53(1): 26-37
  • [37] Task effects reveal cognitive flexibility responding to frequency and predictability: Evidence from eye movements in reading and proofreading
    Schotter, Elizabeth R.
    Bicknell, Klinton
    Howard, Ian
    Levy, Roger
    Rayner, Keith
    [J]. Cognition, 2014, 131(1): 1-27
  • [38] Anxiety interferes with effective inhibition of distracting stimuli: Evidence from eye-movements during a reading task
    Shoker, Leor
    Derakshan, Nazanin
    [J]. Psychophysiology, 2007, 44: S24
  • [39] Towards Modeling Human Attention from Eye Movements for Neural Source Code Summarization
    Bansal, Aakash
    Sharif, Bonita
    McMillan, Collin
    [J]. Proceedings of the ACM on Human-Computer Interaction, 2023, 7(ETRA)
  • [40] Readers can identify the meanings of words without looking at them: Evidence from regressive eye movements
    Schotter, Elizabeth R.
    Fennell, Anna Marie
    [J]. Psychonomic Bulletin & Review, 2019, 26(5): 1697-1704