Language-driven anticipatory eye movements in virtual reality

Cited by: 0
Authors
Nicole Eichert
David Peeters
Peter Hagoort
Affiliations
[1] Max Planck Institute for Psycholinguistics, Donders Institute for Brain, Cognition, and Behavior
[2] University of Oxford
[3] Radboud University
Source
Behavior Research Methods, 2018, Volume 50
Keywords
Virtual Reality; Prediction; Language Comprehension; Eyetracking; Visual World;
DOI
Not available
Abstract
Predictive language processing is often studied by measuring eye movements as participants look at objects on a computer screen while they listen to spoken sentences. This variant of the visual-world paradigm has revealed that information encountered by a listener at a spoken verb can give rise to anticipatory eye movements to a target object, which is taken to indicate that people predict upcoming words. The ecological validity of such findings remains questionable, however, because these computer experiments used two-dimensional stimuli that were mere abstractions of real-world objects. Here we present a visual-world paradigm study in a three-dimensional (3-D) immersive virtual reality environment. Despite significant changes in the stimulus materials and the different mode of stimulus presentation, language-mediated anticipatory eye movements were still observed. These findings thus indicate that people do predict upcoming words during language comprehension in a more naturalistic setting where natural depth cues are preserved. Moreover, the results confirm the feasibility of using eyetracking in rich and multimodal 3-D virtual environments.
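The core dependent measure behind such anticipatory effects can be illustrated with a minimal sketch. The Python snippet below computes, for a single trial, the proportion of the verb-to-noun window during which the listener fixated the target object; the data structure, field names, and timing values are hypothetical and chosen for illustration only, not taken from the authors' materials or analysis pipeline.

```python
# Minimal sketch of a typical visual-world analysis step: the proportion of
# anticipatory fixations on the target object in the window between verb onset
# and target-noun onset. Field names, time units (ms), and the data layout are
# illustrative assumptions, not the authors' actual pipeline.

from dataclasses import dataclass

@dataclass
class Fixation:
    start: float      # fixation onset, ms from sentence onset
    end: float        # fixation offset, ms from sentence onset
    object_id: str    # label of the fixated object ("target", "distractor_1", ...)

def anticipatory_target_proportion(fixations, verb_onset, noun_onset):
    """Share of the verb-to-noun window spent fixating the target object."""
    window = noun_onset - verb_onset
    if window <= 0:
        raise ValueError("noun_onset must follow verb_onset")
    on_target = 0.0
    for fix in fixations:
        # Overlap of this fixation with the anticipatory window.
        overlap = min(fix.end, noun_onset) - max(fix.start, verb_onset)
        if overlap > 0 and fix.object_id == "target":
            on_target += overlap
    return on_target / window

# Example trial: the listener starts fixating the target ~300 ms after verb
# onset, i.e. before the target noun is spoken -- an anticipatory eye movement.
trial = [
    Fixation(start=900, end=1400, object_id="distractor_1"),
    Fixation(start=1500, end=2300, object_id="target"),
]
print(anticipatory_target_proportion(trial, verb_onset=1200, noun_onset=2400))  # ~0.67
```

In practice such proportions would be aggregated over trials and participants and compared against looks to distractor objects or chance level; values reliably above that baseline before target-noun onset are what count as language-driven anticipatory eye movements.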
Pages: 1102-1115
Related articles
50 records in total
  • [1] Language-driven anticipatory eye movements in virtual reality
    Eichert, Nicole
    Peeters, David
    Hagoort, Peter
    [J]. BEHAVIOR RESEARCH METHODS, 2018, 50 (03) : 1102 - 1115
  • [2] Effect of repetition proportion on language-driven anticipatory eye movements
    Britt, Allison E.
    Mirman, Daniel
    Kornilov, Sergey A.
    Magnuson, James S.
    [J]. ACTA PSYCHOLOGICA, 2014, 145 : 128 - 138
  • [3] Anticipatory eye movements and Specific Language Impairment
    Jusic, Ines Galic
    Palmovic, Marijan
    [J]. SUVREMENA LINGVISTIKA, 2010, 36 (70): 195 - 208
  • [4] Modulation of scene consistency and task demand on language-driven eye movements for audio-visual integration
    Yu, Wan-Yun
    Tsai, Jie-Li
    [J]. ACTA PSYCHOLOGICA, 2016, 171 : 1 - 16
  • [5] Vergence eye movements in virtual reality
    McAnally, Ken
    Grove, Philip
    Wallis, Guy
    [J]. DISPLAYS, 2024, 83
  • [6] Language-driven system design
    Mauw, S
    Wiersma, WT
    Willemse, TAC
    [J]. INTERNATIONAL JOURNAL OF SOFTWARE ENGINEERING AND KNOWLEDGE ENGINEERING, 2004, 14 (06) : 625 - 663
  • [7] Anticipatory eye movements in Congkak
    Chong, Sheryl
    Mennie, Neil
    [J]. I-PERCEPTION, 2011, 2 (04): 335 - 335
  • [8] Verbal and nonverbal predictors of language-mediated anticipatory eye movements
    Rommers, Joost
    Meyer, Antje S.
    Huettig, Falk
    [J]. ATTENTION, PERCEPTION, & PSYCHOPHYSICS, 2015, 77 : 720 - 730
  • [9] A tutorial: Analyzing eye and head movements in virtual reality
    Bischof, Walter F.
    Anderson, Nicola C.
    Kingstone, Alan
    [J]. BEHAVIOR RESEARCH METHODS, 2024,