Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations

Cited by: 33
Authors
Kayser, Stephanie J. [1 ]
Philiastides, Marios G. [1 ]
Kayser, Christoph [1 ]
Affiliation
[1] Univ Glasgow, Inst Neurosci & Psychol, Glasgow, Lanark, Scotland
Funding
European Research Council; Biotechnology and Biological Sciences Research Council (UK);
Keywords
Audio-visual; EEG; Single trial decoding; Sensory decision making; Motion discrimination; PERCEPTUAL DECISION-MAKING; ANTERIOR CINGULATE CORTEX; MULTISENSORY INTEGRATION; AUDITORY MOTION; AUDIOVISUAL MOTION; CUE INTEGRATION; EVIDENCE ACCUMULATION; LOW-LEVEL; OSCILLATIONS; SIGNALS;
DOI
10.1016/j.neuroimage.2017.01.010
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Discipline code
071006;
Abstract
Sensory discriminations, such as judgements about visual motion, often benefit from multisensory evidence. Despite many reports of enhanced brain activity during multisensory conditions, it remains unclear which dynamic processes implement the multisensory benefit for an upcoming decision in the human brain. Specifically, it remains difficult to attribute perceptual benefits to specific processes, such as early sensory encoding, the transformation of sensory representations into a motor response, or to more unspecific processes such as attention. We combined an audio-visual motion discrimination task with the single-trial mapping of dynamic sensory representations in EEG activity to localize when and where multisensory congruency facilitates perceptual accuracy. Our results show that a congruent sound facilitates the encoding of motion direction in occipital sensory - as opposed to parieto-frontal - cortices, and facilitates later - as opposed to early (i.e. below 100 ms) - sensory activations. This multisensory enhancement was visible as an earlier rise of motion-sensitive activity in middle-occipital regions about 350 ms from stimulus onset, which reflected the better discriminability of motion direction from brain activity and correlated with the perceptual benefit provided by congruent multisensory information. This supports a hierarchical model of multisensory integration in which the enhancement of relevant sensory cortical representations is transformed into a more accurate choice.
Pages: 31-41
Page count: 11
Related papers
50 records
  • [1] Characteristic sounds facilitate visual search
    Iordanescu, Lucica
    Guzman-Martinez, Emmanuel
    Grabowecky, Marcia
    Suzuki, Satoru
    PSYCHONOMIC BULLETIN & REVIEW, 2008, 15 (03) : 548 - 554
  • [3] Sounds Activate Visual Cortex and Improve Visual Discrimination
    Feng, Wenfeng
    Stoermer, Viola S.
    Martinez, Antigona
    McDonald, John J.
    Hillyard, Steven A.
    JOURNAL OF NEUROSCIENCE, 2014, 34 (29): : 9817 - 9824
  • [4] VISUAL DISCRIMINATION OF CERTAIN CONSONANT SOUNDS
    NEELLEY, JN
    VAUGHN, BE
    QUARTERLY JOURNAL OF SPEECH, 1969, 55 (03) : 301 - 307
  • [5] Filling-in visual motion with sounds
    Vaeljamaee, A.
    Soto-Faraco, S.
    ACTA PSYCHOLOGICA, 2008, 129 (02) : 249 - 254
  • [6] Crossmodal enhancement of visual orientation discrimination by looming sounds requires functional activation of primary visual areas: A case study
    Cecere, Roberto
    Romei, Vincenzo
    Bertini, Caterina
    Ladavas, Elisabetta
    NEUROPSYCHOLOGIA, 2014, 56 : 350 - 358
  • [7] Stable Statistical Representations Facilitate Visual Search
    Corbett, Jennifer E.
    Melcher, David
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-HUMAN PERCEPTION AND PERFORMANCE, 2014, 40 (05) : 1915 - 1925
  • [8] Linguistic representations of motion depend on visual perceptual representations of motion
    Lu, Shena
    Sun, Yanliang
    Huang, Shuyue
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2023, 58 : 273 - 273
  • [9] Indiscriminable sounds determine the direction of visual motion
    Kobayashi, Maori
    Teramoto, Wataru
    Hidaka, Souta
    Sugita, Yoichi
    SCIENTIFIC REPORTS, 2012, 2