Robust EEG-Based Decoding of Auditory Attention With High-RMS-Level Speech Segments in Noisy Conditions

Authors
Wang, Lei [1 ,2 ]
Wu, Ed X. [2 ]
Chen, Fei [1 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Elect & Elect Engn, Shenzhen, Peoples R China
[2] Univ Hong Kong, Dept Elect & Elect Engn, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
EEG; temporal response function (TRF); auditory attention decoding; speech RMS-level segments; signal-to-noise ratio; CORTICAL ENTRAINMENT; NORMAL-HEARING; INTELLIGIBILITY; TRACKING; COMPREHENSION; OSCILLATIONS; RESPONSES; BRAIN; DELTA; THETA;
DOI
10.3389/fnhum.2020.557534
Chinese Library Classification
Q189 [Neuroscience]
Discipline Classification Code
071006
Abstract
The attended speech stream can be detected robustly, even in adverse auditory scenarios, through auditory attentional modulation, and can be decoded from electroencephalographic (EEG) data. Speech segmentation based on relative root-mean-square (RMS) intensity can be used to estimate segmental contributions to perception in noisy conditions, and high-RMS-level segments carry crucial information for speech perception. Hence, this study investigated the effect of high-RMS-level speech segments on auditory attention decoding performance under various signal-to-noise ratio (SNR) conditions. Scalp EEG signals were recorded while subjects attended to one speech stream in a mixture narrated concurrently by two Mandarin speakers. The temporal response function (TRF) was used to identify the attended speech from EEG responses tracking the temporal envelope of intact speech and the temporal envelope of high-RMS-level speech segments alone, respectively. Decoding performance was then analyzed under various SNR conditions by comparing EEG correlations with the attended and ignored speech streams. The accuracy of auditory attention decoding based on the temporal envelope of high-RMS-level speech segments was not inferior to that based on the temporal envelope of intact speech. Cortical activity correlated more strongly with attended than with ignored speech across the SNR conditions. These results suggest that EEG recordings corresponding to high-RMS-level speech segments carry crucial information for identifying and tracking attended speech in the presence of background noise. The study also showed that, with the modulation of auditory attention, attended speech can be decoded more robustly from neural activity than from behavioral measures over a wide range of SNRs.
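To make the pipeline concrete, the following is a minimal Python sketch of relative-RMS segmentation and correlation-based attention decoding in the spirit of the TRF/stimulus-reconstruction approach summarized above. The frame length, the 0 dB threshold used to define "high-RMS-level" frames, the envelope cutoff and sampling rate, the lag range, and the ridge parameter are illustrative assumptions, not the settings reported in the paper.

```python
# Sketch only: thresholds, window lengths, and the ridge parameter are
# illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt, resample_poly


def high_rms_mask(speech, fs, win_ms=16.0, thresh_db=0.0):
    """Flag frames whose RMS is at or above `thresh_db` relative to the
    utterance-level RMS (one common definition of high-RMS-level segments,
    assumed here)."""
    win = int(fs * win_ms / 1e3)
    n_frames = len(speech) // win
    frames = speech[: n_frames * win].reshape(n_frames, win)
    frame_rms = np.sqrt(np.mean(frames ** 2, axis=1) + 1e-12)
    overall_rms = np.sqrt(np.mean(speech ** 2) + 1e-12)
    rel_db = 20.0 * np.log10(frame_rms / overall_rms)
    mask = np.repeat(rel_db >= thresh_db, win)
    return np.pad(mask, (0, len(speech) - len(mask)), constant_values=False)


def envelope(speech, fs, fs_eeg=64, cutoff_hz=8.0):
    """Broadband temporal envelope: |Hilbert|, low-pass, downsample to EEG rate."""
    env = np.abs(hilbert(speech))
    b, a = butter(3, cutoff_hz / (fs / 2), btype="low")
    env = filtfilt(b, a, env)
    return resample_poly(env, fs_eeg, fs)


def lagged(eeg, max_lag):
    """Stack time-lagged copies of EEG (samples x channels) for a backward model."""
    n, c = eeg.shape
    out = np.zeros((n, c * (max_lag + 1)))
    for k in range(max_lag + 1):
        out[k:, k * c:(k + 1) * c] = eeg[: n - k]
    return out


def decode_attention(eeg, env_attended, env_ignored, max_lag=16, lam=1e3):
    """Reconstruct the speech envelope from EEG by ridge regression and pick
    the stream with the higher Pearson correlation. In practice the decoder
    weights are estimated on held-out training trials (cross-validation)."""
    X = lagged(eeg, max_lag)
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ env_attended)
    recon = X @ w
    r_att = np.corrcoef(recon, env_attended)[0, 1]
    r_ign = np.corrcoef(recon, env_ignored)[0, 1]
    return ("attended" if r_att > r_ign else "ignored"), r_att, r_ign
```

To restrict decoding to high-RMS-level segments as described above, one could apply `high_rms_mask` to each speech stream, retain only the masked samples of the envelope (downsampled to the EEG rate), and compute the correlations over those samples only; this mirrors the comparison between intact-speech and high-RMS-level-segment envelopes in the study.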
Pages: 13