High visual resolution matters in audiovisual speech perception, but only for some

Cited by: 17
Authors
Alsius, Agnes [1]
Wayne, Rachel V. [1]
Paré, Martin [2]
Munhall, Kevin G. [1,2]
Affiliations
[1] Queen's Univ, Dept Psychol, Humphrey Hall, 62 Arch St, Kingston, ON K7L 3N6, Canada
[2] Queen's Univ, Ctr Neurosci Studies, Kingston, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Speech perception; Audiovisual integration; Speechreading skill; Spatial frequency; Eye gaze; Word recognition; Hearing; Integration; Noise; Distance; Intelligibility; Distinctiveness; Comprehension; Performance; Reception
DOI
10.3758/s13414-016-1109-4
Chinese Library Classification (CLC)
B84 [Psychology]
Discipline code
04; 0402
Abstract
The basis for individual differences in the degree to which visual speech input enhances comprehension of acoustically degraded speech is largely unknown. Previous research indicates that fine facial detail is not critical for visual enhancement when auditory information is available; however, these studies did not examine individual differences in ability to make use of fine facial detail in relation to audiovisual speech perception ability. Here, we compare participants based on their ability to benefit from visual speech information in the presence of an auditory signal degraded with noise, modulating the resolution of the visual signal through low-pass spatial frequency filtering and monitoring gaze behavior. Participants who benefited most from the addition of visual information (high visual gain) were more adversely affected by the removal of high spatial frequency information, compared to participants with low visual gain, for materials with both poor and rich contextual cues (i.e., words and sentences, respectively). Differences as a function of gaze behavior between participants with the highest and lowest visual gains were observed only for words, with participants with the highest visual gain fixating longer on the mouth region. Our results indicate that the individual variance in audiovisual speech in noise performance can be accounted for, in part, by better use of fine facial detail information extracted from the visual signal and increased fixation on mouth regions for short stimuli. Thus, for some, audiovisual speech perception may suffer when the visual input (in addition to the auditory signal) is less than perfect.
Pages: 1472-1487
Page count: 16
Related papers
50 items in total
  • [41] Spatial frequency requirements for audiovisual speech perception
    Munhall, KG
    Kroos, C
    Jozan, G
    Vatikiotis-Bateson, E
    PERCEPTION & PSYCHOPHYSICS, 2004, 66(4): 574-583
  • [42] Neural processing of asynchronous audiovisual speech perception
    Stevenson, Ryan A.
    Altieri, Nicholas A.
    Kim, Sunah
    Pisoni, David B.
    James, Thomas W.
    NEUROIMAGE, 2010, 49(4): 3308-3318
  • [44] A network analysis of audiovisual affective speech perception
    Jansma, H.
    Roebroeck, A.
    Münte, T. F.
    NEUROSCIENCE, 2014, 256: 230-241
  • [45] Audiovisual perception of interrupted speech by nonnative listeners
    Yang, Jing
    Nagaraj, Naveen K.
    Magimairaj, Beula M.
    ATTENTION, PERCEPTION & PSYCHOPHYSICS, 2024: 1763-1776
  • [46] Subliminal Smells Modulate Audiovisual Speech Perception
    Chen, Jennifer
    Wang, Jin
    Chen, Denise
    CHEMICAL SENSES, 2015, 40(7): 641-642
  • [47] Perception of intersensory synchrony in audiovisual speech: Not that special
    Vroomen, Jean
    Stekelenburg, Jeroen J.
    COGNITION, 2011, 118(1): 75-83
  • [48] The word superiority effect in audiovisual speech perception
    Fort, Mathilde
    Spinelli, Elsa
    Savariaux, Christophe
    Kandel, Sonia
    SPEECH COMMUNICATION, 2010, 52(6): 525-532
  • [49] Audiovisual gating and the time course of speech perception
    Munhall, KG
    Tohkura, Y
    JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, 1998, 104(1): 530-539
  • [50] The Principle of Inverse Effectiveness in Audiovisual Speech Perception
    van de Rijt, Luuk P. H.
    Roye, Anja
    Mylanus, Emmanuel A. M.
    van Opstal, A. John
    van Wanrooij, Marc M.
    FRONTIERS IN HUMAN NEUROSCIENCE, 2019, 13