Eye movement of perceivers during audiovisual speech perception

Cited by: 166
|
Authors
Vatikiotis-Bateson, E
Eigsti, IM
Yano, S
Munhall, KG
Affiliations
[1] ATR Human Informat Proc Res Labs, Kyoto 61902, Japan
[2] Univ Rochester, Rochester, NY USA
[3] NHK Res Labs, Kinuta, Japan
[4] Queens Univ, Kingston, ON, Canada
Source
PERCEPTION & PSYCHOPHYSICS | 1998, Vol. 60, Issue 6
DOI
10.3758/BF03211929
Chinese Library Classification (CLC) number
B84 [Psychology];
Discipline classification code
04; 0402;
Abstract
Perceiver eye movements were recorded during audiovisual presentations of extended monologues. Monologues were presented at different image sizes and with different levels of acoustic masking noise. Two clear targets of gaze fixation were identified: the eyes and the mouth. Regardless of image size, perceivers of both Japanese and English gazed more at the mouth as masking noise levels increased. However, even at the highest noise levels and largest image sizes, subjects gazed at the mouth only about half the time. For the eye target, perceivers typically gazed at one eye more than the other, and the tendency became stronger at higher noise levels. English perceivers displayed more variety of gaze-sequence patterns (e.g., left eye to mouth to left eye to right eye) and persisted in using them at higher noise levels than did Japanese perceivers. No segment-level correlations were found between perceiver eye motions and phoneme identity of the stimuli.
Pages: 926 - 940
Page count: 15
Related papers
50 records in total
  • [31] Audiovisual Binding for Speech Perception in Noise and in Aging
    Ganesh, Attigodu Chandrashekara
    Berthommier, Frederic
    Schwartz, Jean-Luc
    LANGUAGE LEARNING, 2018, 68 : 193 - 220
  • [32] Audiovisual speech perception of multilingual learners of Japanese
    Woodman, Katarina
    Manalo, Emmanuel
    INTERNATIONAL JOURNAL OF MULTILINGUALISM, 2024,
  • [33] Influences of selective adaptation on perception of audiovisual speech
    Dias, James W.
    Cook, Theresa C.
    Rosenblum, Lawrence D.
    JOURNAL OF PHONETICS, 2016, 56 : 75 - 84
  • [34] Neural processing of asynchronous audiovisual speech perception
    Stevenson, Ryan A.
    Altieri, Nicholas A.
    Kim, Sunah
    Pisoni, David B.
    James, Thomas W.
    NEUROIMAGE, 2010, 49 (04) : 3308 - 3318
  • [35] Spatial frequency requirements for audiovisual speech perception
    Munhall, KG
    Kroos, C
    Jozan, G
    Vatikiotis-Bateson, E
    PERCEPTION & PSYCHOPHYSICS, 2004, 66 (04): : 574 - 583
  • [36] Audiovisual perception of interrupted speech by nonnative listeners
    Yang, Jing
    Nagaraj, Naveen K.
    Magimairaj, Beula M.
    ATTENTION PERCEPTION & PSYCHOPHYSICS, 2024, 86 (05) : 1763 - 1776
  • [37] A NETWORK ANALYSIS OF AUDIOVISUAL AFFECTIVE SPEECH PERCEPTION
    Jansma, H.
    Roebroeck, A.
    Munte, T. F.
    NEUROSCIENCE, 2014, 256 : 230 - 241
  • [39] Subliminal Smells Modulate Audiovisual Speech Perception
    Chen, Jennifer
    Wang, Jin
    Chen, Denise
    CHEMICAL SENSES, 2015, 40 (07) : 641 - 642
  • [40] The Principle of Inverse Effectiveness in Audiovisual Speech Perception
    van de Rijt, Luuk P. H.
    Roye, Anja
    Mylanus, Emmanuel A. M.
    van Opstal, A. John
    van Wanrooij, Marc M.
    FRONTIERS IN HUMAN NEUROSCIENCE, 2019, 13