An ALE Meta-Analysis on the Audiovisual Integration of Speech Signals

Cited by: 30
Authors
Erickson, Laura C. [1 ,2 ]
Heeg, Elizabeth [1 ]
Rauschecker, Josef P. [2 ]
Turkeltaub, Peter E. [1 ,3 ]
Affiliations
[1] Georgetown Univ, Med Ctr, Dept Neurol, Washington, DC 20007 USA
[2] Georgetown Univ, Med Ctr, Dept Neurosci, Washington, DC 20007 USA
[3] MedStar Natl Rehabil Hosp, Div Res, Washington, DC USA
Funding
U.S. National Science Foundation; U.S. National Institutes of Health;
Keywords
cross-modal; language; superior temporal sulcus; activation likelihood estimation; multisensory; auditory dorsal stream; inferior frontal gyrus; asynchronous; incongruent; SUPERIOR TEMPORAL SULCUS; CROSS-MODAL INTEGRATION; HUMAN NEURAL SYSTEM; AUDITORY-CORTEX; MULTISENSORY INTEGRATION; SENSORY SUBSTITUTION; BROCAS AREA; INTERINDIVIDUAL DIFFERENCES; VISUAL INFORMATION; COGNITIVE CONTROL;
DOI
10.1002/hbm.22572
CLC Number
Q189 [Neuroscience];
Subject Classification Code
071006;
Abstract
The brain improves speech processing through the integration of audiovisual (AV) signals. Situations involving AV speech integration may be crudely dichotomized into those where auditory and visual inputs contain (1) equivalent, complementary signals (validating AV speech) or (2) inconsistent, different signals (conflicting AV speech). This simple framework may allow the systematic examination of broad commonalities and differences between AV neural processes engaged by various experimental paradigms frequently used to study AV speech integration. We conducted an activation likelihood estimation meta-analysis of 22 functional imaging studies comprising 33 experiments, 311 subjects, and 347 foci examining conflicting versus validating AV speech. Experimental paradigms included content congruency, timing synchrony, and perceptual measures, such as the McGurk effect or synchrony judgments, across AV speech stimulus types (sublexical to sentence). Colocalization of conflicting AV speech experiments revealed consistency across at least two contrast types (e.g., synchrony and congruency) in a network of dorsal stream regions in the frontal, parietal, and temporal lobes. There was consistency across all contrast types (synchrony, congruency, and percept) in the bilateral posterior superior/middle temporal cortex. Although fewer studies were available, validating AV speech experiments were localized to other regions, such as ventral stream visual areas in the occipital and inferior temporal cortex. These results suggest that while equivalent, complementary AV speech signals may evoke activity in regions related to the corroboration of sensory input, conflicting AV speech signals recruit widespread dorsal stream areas likely involved in the resolution of conflicting sensory signals. Hum Brain Mapp 35:5587-5605, 2014. (c) 2014 Wiley Periodicals, Inc.
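For readers unfamiliar with activation likelihood estimation (ALE), the sketch below illustrates only the core computation commonly described for coordinate-based meta-analysis: each reported focus is modeled as a peak-normalized 3D Gaussian, the foci of one experiment are combined into a modeled activation (MA) map by a voxel-wise maximum, and the ALE map is the probabilistic union of MA maps across experiments. This is a minimal illustration, not the authors' actual pipeline; in practice, kernel widths depend on sample size and the resulting map is tested against a permutation-based null distribution, both omitted here. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(sigma_mm, voxel_mm=2.0, truncate=3.0):
    # Isotropic 3D Gaussian, peak-normalized so a single focus contributes a probability <= 1.
    radius = int(truncate * sigma_mm / voxel_mm)
    coords = np.arange(-radius, radius + 1) * voxel_mm
    x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
    kernel = np.exp(-(x**2 + y**2 + z**2) / (2.0 * sigma_mm**2))
    return kernel / kernel.max()

def modeled_activation_map(foci_ijk, shape, sigma_mm, voxel_mm=2.0):
    # One experiment's MA map: voxel-wise maximum over the kernels of all its foci.
    # Assumes each focus lies at least `radius` voxels from the volume edge (toy simplification).
    ma = np.zeros(shape)
    kernel = gaussian_kernel(sigma_mm, voxel_mm)
    r = kernel.shape[0] // 2
    for i, j, k in foci_ijk:
        patch = ma[i - r:i + r + 1, j - r:j + r + 1, k - r:k + r + 1]
        np.maximum(patch, kernel, out=patch)
    return ma

def ale_map(experiments_foci, shape, sigmas_mm):
    # ALE at each voxel: probability that at least one experiment activates it,
    # i.e., the probabilistic union of the experiments' MA maps.
    ma_maps = [modeled_activation_map(foci, shape, sigma)
               for foci, sigma in zip(experiments_foci, sigmas_mm)]
    return 1.0 - np.prod([1.0 - ma for ma in ma_maps], axis=0)

# Toy usage: two hypothetical "experiments" with nearby foci on a small grid.
if __name__ == "__main__":
    shape = (40, 40, 40)
    exp1 = [(20, 20, 20), (12, 25, 18)]
    exp2 = [(21, 19, 20)]
    ale = ale_map([exp1, exp2], shape, sigmas_mm=[5.0, 5.0])
    print(ale.max())  # convergence is highest where the two experiments overlap
```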
Pages: 5587-5605
Number of pages: 19
Related Articles
50 records in total
  • [41] Speech and non-speech measures of audiovisual integration are not correlated
    Wilbiks, Jonathan M. P.
    Brown, Violet A.
    Strand, Julia F.
    ATTENTION PERCEPTION & PSYCHOPHYSICS, 2022, 84 (06) : 1809 - 1819
  • [42] Brain activation elicited by acute stress: An ALE meta-analysis
    Qiu, Yidan
    Fan, Zhiling
    Zhong, Miao
    Yang, Jinlong
    Wu, Kun
    Hu, Huiqing
    Zhang, Ruibin
    Guo, Yu
    Lee, Tatia M. C.
    Huang, Ruiwang
    NEUROSCIENCE AND BIOBEHAVIORAL REVIEWS, 2022, 132 : 706 - 724
  • [43] Integration and Temporal Processing of Asynchronous Audiovisual Speech
    Simon, David M.
    Wallace, Mark T.
    JOURNAL OF COGNITIVE NEUROSCIENCE, 2018, 30 (03) : 319 - 337
  • [44] A neuromagnetic study of the integration of audiovisual speech in the brain
    Sams, M.
    Levanen, S.
    BRAIN TOPOGRAPHY TODAY, 1997, 1147 : 47 - 53
  • [45] Attention to touch weakens audiovisual speech integration
    Alsius, Agnes
    Navarra, Jordi
    Soto-Faraco, Salvador
    EXPERIMENTAL BRAIN RESEARCH, 2007, 183 (03) : 399 - 404
  • [46] Speech Cues Contribute to Audiovisual Spatial Integration
    Bishop, Christopher W.
    Miller, Lee M.
    PLOS ONE, 2011, 6 (08):
  • [49] Assessing the role of attention in the audiovisual integration of speech
    Navarra, Jordi
    Alsius, Agnes
    Soto-Faraco, Salvador
    Spence, Charles
    INFORMATION FUSION, 2010, 11 (01) : 4 - 11
  • [50] A measure for assessing the effects of audiovisual speech integration
    Altieri, Nicholas
    Townsend, James T.
    Wenger, Michael J.
    BEHAVIOR RESEARCH METHODS, 2014, 46 (02) : 406 - 415