The contribution of dynamic visual cues to audiovisual speech perception

Cited by: 11
Authors
Jaekl, Philip [1 ,2 ]
Pesquita, Ana [3 ]
Alsius, Agnes [4 ]
Munhall, Kevin [4 ]
Soto-Faraco, Salvador [5 ,6 ]
Affiliations
[1] Univ Rochester, Ctr Visual Sci, Rochester, NY 14627 USA
[2] Univ Rochester, Dept Brain & Cognit Sci, Rochester, NY USA
[3] Univ British Columbia, Dept Psychol, UBC Vis Lab, Vancouver, BC, Canada
[4] Queens Univ, Dept Psychol, Kingston, ON K7L 3N6, Canada
[5] Univ Pompeu Fabra, Dept Informat Technol & Commun, Ctr Brain & Cognit, Barcelona, Spain
[6] ICREA, Barcelona, Spain
Funding
Natural Sciences and Engineering Research Council of Canada; European Research Council;
Keywords
Speech-in-noise; Visual form; Visual motion; Visual pathways; Biological motion; Configural; Audiovisual enhancement; BIOLOGICAL MOTION PERCEPTION; CRITICAL FEATURES; VISIBLE SPEECH; FROM-MOTION; FORM; FACES; RECOGNITION; INTEGRATION; ADAPTATION; MECHANISMS;
DOI
10.1016/j.neuropsychologia.2015.06.025
Chinese Library Classification
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Discipline classification codes
03; 0303; 030303; 04; 0402;
Abstract
Seeing a speaker's facial gestures can significantly improve speech comprehension, especially in noisy environments. However, the nature of the visual information from the speaker's facial movements that is relevant for this enhancement is still unclear. Like auditory speech signals, visual speech signals unfold over time and contain both dynamic configural information and luminance-defined local motion cues, two information sources that are thought to engage anatomically and functionally separate visual systems. Whereas some past studies have highlighted the importance of local, luminance-defined motion cues in audiovisual speech perception, the contribution of dynamic configural information signalling changes in form over time has not yet been assessed. We therefore attempted to single out the contribution of dynamic configural information to audiovisual speech processing. To this end, we measured word identification performance in noise using unimodal auditory stimuli and audiovisual stimuli. In the audiovisual condition, speaking faces were presented as point-light displays obtained via motion capture of the original talker. Point-light displays were either isoluminant, to minimise the contribution of effective luminance-defined local motion information, or rendered with added luminance contrast, allowing the combined effect of dynamic configural cues and local motion cues. Audiovisual enhancement was found in both the isoluminant and the luminance-contrast conditions compared to the auditory-only condition, demonstrating, for the first time, the specific contribution of dynamic configural cues to audiovisual speech improvement. These findings imply that globally processed changes in a speaker's facial shape contribute significantly to the perception of articulatory gestures and the analysis of audiovisual speech. © 2015 Elsevier Ltd. All rights reserved.
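As a purely illustrative aside (not the authors' stimulus code), the sketch below shows how the two point-light rendering conditions described in the abstract might be approximated: dots that nominally match a mid-grey background in luminance but differ in hue (isoluminant), versus dots with added luminance contrast. All marker coordinates, colours, and parameter values are hypothetical placeholders, and true isoluminance would require per-observer calibration (e.g. flicker photometry).

# Illustrative sketch only -- not the authors' stimulus code.
# Renders one static frame of a hypothetical point-light face in two ways:
#   (1) "isoluminant": dots differ from the mid-grey background in hue but
#       (nominally) not in luminance, minimising luminance-defined motion cues;
#   (2) "luminance contrast": dots are brighter than the background, adding
#       luminance-defined cues on top of the configural (shape) information.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Hypothetical 2-D facial marker positions (stand-ins for motion-capture data).
markers_xy = rng.uniform(0.2, 0.8, size=(30, 2))

background_grey = 0.5
fig, axes = plt.subplots(1, 2, figsize=(8, 4))

for ax, condition in zip(axes, ["isoluminant", "luminance contrast"]):
    ax.set_facecolor((background_grey,) * 3)
    if condition == "isoluminant":
        # Reddish dots chosen to roughly match the background's luminance;
        # real isoluminance would be calibrated per observer.
        dot_colour = (0.65, 0.42, 0.42)
    else:
        # Dots brighter than the background: adds luminance-defined contrast.
        dot_colour = (0.9, 0.9, 0.9)
    ax.scatter(markers_xy[:, 0], markers_xy[:, 1], s=40, color=dot_colour)
    ax.set_title(condition)
    ax.set_xticks([])
    ax.set_yticks([])

plt.tight_layout()
plt.show()

Animating the marker positions frame by frame would carry the dynamic configural (shape-change) cues the study targets; in the isoluminant rendering those changes would be conveyed with minimal luminance-defined local motion.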
Pages: 402-410
Number of pages: 9