Face viewing behavior predicts multisensory gain during speech perception

Cited by: 0
Authors
Johannes Rennig
Kira Wegner-Clemens
Michael S. Beauchamp
Affiliations
[1] Baylor College of Medicine, Department of Neurosurgery and Core for Advanced MRI
Source
Psychonomic Bulletin & Review, 2020, 27(01)
Keywords
Audiovisual; Face; Multisensory; Speech perception; Eye tracking;
DOI
Not available
Abstract
Visual information from the face of an interlocutor complements auditory information from their voice, enhancing intelligibility. However, there are large individual differences in the ability to comprehend noisy audiovisual speech. Another axis of individual variability is the extent to which humans fixate the mouth or the eyes of a viewed face. We speculated that across a lifetime of face viewing, individuals who prefer to fixate the mouth of a viewed face might accumulate stronger associations between visual and auditory speech, resulting in improved comprehension of noisy audiovisual speech. To test this idea, we assessed interindividual variability in two tasks. Participants (n = 102) varied greatly in their ability to understand noisy audiovisual sentences (accuracy from 2–58%) and in the time they spent fixating the mouth of a talker enunciating clear audiovisual syllables (3–98% of total time). These two variables were positively correlated: a 10% increase in time spent fixating the mouth equated to a 5.6% increase in multisensory gain. This finding demonstrates an unexpected link, mediated by histories of visual exposure, between two fundamental human abilities: processing faces and understanding speech.
Pages: 70-77 (7 pages)
Related papers (50 total)
  • [1] Face viewing behavior predicts multisensory gain during speech perception
    Rennig, Johannes; Wegner-Clemens, Kira; Beauchamp, Michael S.
    Psychonomic Bulletin & Review, 2020, 27(01): 70-77
  • [2] Infants' Perceptual Insensitivity to the Other-Race-Face in Multisensory Speech Perception
    Ujiie, Yuta; Kanazawa, So; Yamaguchi, Masami K.
    i-Perception, 2019, 10: 44-45
  • [3] Multisensory and sensorimotor interactions in speech perception
    Tiippana, Kaisa; Möttönen, Riikka; Schwartz, Jean-Luc
    Frontiers in Psychology, 2015, 6
  • [4] Editorial: Multisensory speech in perception and production
    Sanchez, Kauyumari; Neergaard, Karl David; Dias, James W.
    Frontiers in Human Neuroscience, 2024, 18
  • [5] Multisensory and lexical information in speech perception
    Dorsi, Josh; Lacey, Simon; Sathian, K.
    Frontiers in Human Neuroscience, 2024, 17
  • [6] Cascading and Multisensory Influences on Speech Perception Development
    Choi, Dawoon; Black, Alexis K.; Werker, Janet F.
    Mind, Brain, and Education, 2018, 12(04): 212-223
  • [7] Multisensory perception during locomotion
    Ernst, Marc; Souman, Jan; Frissen, Ilja
    International Journal of Psychology, 2008, 43(3-4): 182
  • [8] Multisensory interactions of face and vocal information during perception and memory in ventrolateral prefrontal cortex
    Romanski, Lizabeth M.; Sharma, Keshov K.
    Philosophical Transactions of the Royal Society B: Biological Sciences, 2023, 378(1886)
  • [9] Assessing multisensory interactions during audio-visual speech perception using ERP
    Winneke, Axel H.; Phillips, Natalie A.
    Psychophysiology, 2007, 44: S36