Contextual cueing of visual search reflects the acquisition of an optimal, one-for-all oculomotor scanning strategy

Cited by: 7
Authors:
Werner Seitz
Artyom Zinchenko
Hermann J. Müller
Thomas Geyer
Affiliations:
[1] Ludwig-Maximilians-Universität München, Department Psychologie
[2] Ludwig-Maximilians-Universität München, Munich Center for Neurosciences – Brain & Mind
[3] Ludwig-Maximilians-Universität München, NICUM
DOI: 10.1038/s44271-023-00019-8
Abstract
Visual search improves when a target is encountered repeatedly at a fixed location within a stable distractor arrangement (spatial context), compared to non-repeated contexts. The standard account attributes this contextual-cueing effect to the acquisition of display-specific long-term memories, which, when activated by the current display, cue attention to the target location. Here we present an alternative, procedural-optimization account, according to which contextual facilitation arises from the acquisition of generic oculomotor scanning strategies, optimized with respect to the entire set of displays, with frequently searched displays accruing greater weight in the optimization process. To decide between these alternatives, we examined measures of the similarity, across time-on-task, of the spatio-temporal sequences of fixations through repeated and non-repeated displays. We found scanpath similarity to increase generally with learning, but more for repeated versus non-repeated displays. This pattern contradicts display-specific guidance, but supports one-for-all scanpath optimization.
Related papers
  • [1] Contextual cueing in co-active visual search: Joint action allows acquisition of task-irrelevant context
    Zang, Xuelian
    Zinchenko, Artyom
    Wu, Jiao
    Zhu, Xiuna
    Fang, Fang
    Shi, Zhuanghua
    Attention, Perception, & Psychophysics, 2022, 84(4): 1114-1129