Using eye-tracking to study audio-visual perceptual integration

Cited: 5
Authors
Xiao, Mei [1 ]
Wong, May [1 ]
Umali, Michelle [2 ]
Pomplun, Marc [1 ]
Affiliations
[1] Univ Massachusetts, Dept Comp Sci, Boston, MA 02125 USA
[2] Columbia Univ, Ctr Neurobiol & Behav, New York, NY 10032 USA
DOI
10.1068/p5731
Chinese Library Classification: R77 [Ophthalmology]
Discipline Code: 100212
Abstract
Perceptual integration of audio-visual stimuli is fundamental to our everyday conscious experience. Eye-movement analysis may be a suitable tool for studying such integration, since eye movements respond to auditory as well as visual input. Previous studies have shown that additional auditory cues in visual-search tasks can guide eye movements more efficiently and reduce their latency. However, these auditory cues were task-relevant since they indicated the target position and onset time. Therefore, the observed effects may have been due to subjects using the cues as additional information to maximize their performance, without perceptually integrating them with the visual displays. Here, we combine a visual-tracking task with a continuous, task-irrelevant sound from a stationary source to demonstrate that audio-visual perceptual integration affects low-level oculomotor mechanisms. Auditory stimuli of constant, increasing, or decreasing pitch were presented. All sound categories induced more smooth-pursuit eye movement than silence, with the greatest effect occurring with stimuli of increasing pitch. A possible explanation is that integration of the visual scene with continuous sound creates the perception of continuous visual motion. Increasing pitch may amplify this effect through its common association with accelerating motion.
Pages: 1391 - 1395
Page count: 5
Related papers (50 records)
  • [1] AUDIO-VISUAL ATTENTION: EYE-TRACKING DATASET AND ANALYSIS TOOLBOX
    Marighetto, Pierre
    Coutrot, Antoine
    Riche, Nicolas
    Guyader, Nathalie
    Mancas, Matei
    Gosselin, Bernard
    Laganiere, Robert
    [J]. 2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2017, : 1802 - 1806
  • [2] Emotional Sounds Guide Visual Attention to Emotional Pictures: An Eye-Tracking Study With Audio-Visual Stimuli
    Gerdes, Antje B. M.
    Alpers, Georg W.
    Braun, Hanna
    Koehler, Sabrina
    Nowak, Ulrike
    Treiber, Lisa
    [J]. EMOTION, 2021, 21 (04) : 679 - 692
  • [3] Audio-visual interactive evaluation of the forest landscape based on eye-tracking experiments
    Liu, Yiping
    Hu, Mengjun
    Zhao, Bing
    [J]. URBAN FORESTRY & URBAN GREENING, 2019, 46
  • [4] Perceptual thresholds of audio-visual spatial coherence for a variety of audio-visual objects
    Stenzel, Hanne
    Jackson, Philip J. B.
    [J]. 2018 AES INTERNATIONAL CONFERENCE ON AUDIO FOR VIRTUAL AND AUGMENTED REALITY, 2018,
  • [5] Why is it difficult for children and adults to follow a person's eye gaze in polynomial social relationships with compound audio-visual stimuli: An eye-tracking study
    Oka, Misaki
    Omori, Mikimasa
    [J]. PLOS ONE, 2023, 18 (08):
  • [6] Audio-visual integration in schizophrenia
    de Gelder, B
    Vroomen, J
    Annen, L
    Masthof, E
    Hodiamont, P
    [J]. SCHIZOPHRENIA RESEARCH, 2003, 59 (2-3) : 211 - 218
  • [7] Effects of aging on audio-visual speech integration
    Huyse, Aurelie
    Leybaert, Jacqueline
    Berthommier, Frederic
    [J]. JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, 2014, 136 (04): : 1918 - 1931
  • [8] Joint Audio-Visual Tracking Using Particle Filters
    Dmitry N. Zotkin
    Ramani Duraiswami
    Larry S. Davis
    [J]. EURASIP Journal on Advances in Signal Processing, 2002
  • [9] Comparing Online Webcam- and Laboratory-Based Eye-Tracking for the Assessment of Infants' Audio-Visual Synchrony Perception
    Banki, Anna
    de Eccher, Martina
    Falschlehner, Lilith
    Hoehl, Stefanie
    Markova, Gabriela
    [J]. FRONTIERS IN PSYCHOLOGY, 2022, 12
  • [10] Joint audio-visual tracking using particle filters
    Zotkin, DN
    Duraiswami, R
    Davis, LS
    [J]. EURASIP JOURNAL ON APPLIED SIGNAL PROCESSING, 2002, 2002 (11) : 1154 - 1164