Transformer-Based Model for Auditory EEG Decoding

Cited by: 0
|
Authors
Chen, Jiaxin [1 ]
Liu, Yin-Long [1 ]
Feng, Rui [1 ]
Yuan, Jiahong [1 ,2 ]
Ling, Zhen-Hua [1 ,2 ]
Affiliations
[1] Univ Sci & Technol China, Natl Engn Res Ctr Speech & Language Informat Proc, Hefei, Peoples R China
[2] Univ Sci & Technol China, Interdisciplinary Res Ctr Linguist Sci, Hefei, Peoples R China
Keywords
EEG; speech decoding; Transformer-based models; match-mismatch; regression; SPEECH;
DOI
10.1007/978-981-96-1045-7_11
CLC number
O42 [Acoustics];
Subject classification codes
070206; 082403;
Abstract
During speech perception, a listener's electroencephalographic (EEG) signals are synchronized with acoustic features such as the speech envelope. This neural tracking mechanism can be used to decode speech information from EEG signals, and much work has been devoted to investigating it. Given the limited fitting ability of linear models, many deep learning-based models have been proposed in this field. Recently, Transformer-based models have shown significant potential in EEG tasks. The Auditory EEG Decoding Challenge 2023 released two tasks that associate a person's EEG signals with the speech they are listening to: match-mismatch and regression. In this paper, two Transformer-based models are proposed for these downstream auditory tasks. Convolution layers and the self-attention mechanism are used together to extract both local features and global dependencies. For the match-mismatch task, the Transformer-Dilated Convolution Network is proposed to classify which speech segment matches the EEG segment. For the regression task, we design the Transformer-Conformer Network to reconstruct the speech envelope. Results show that our proposed models outperform the baseline on both tasks. In addition, the Transformer-Conformer Network outperforms all challenge teams on the regression track.
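The abstract's core ideas (convolution for local features, self-attention for global dependencies, and similarity-based matching for the match-mismatch task) can be illustrated with a toy NumPy sketch. This is not the paper's architecture: all projections here are identity or fixed stand-ins for the learned layers, and the cosine-similarity decision rule is an illustrative assumption.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # x: (T, d). Single-head self-attention with identity Q/K/V
    # projections (a stand-in for learned weights): captures
    # global dependencies across all time steps.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)          # (T, T) pairwise similarities
    return softmax(scores, axis=-1) @ x    # (T, d)

def conv1d(x, kernel):
    # Depthwise temporal convolution: local features per channel.
    # x: (T, d), kernel: (k,)
    return np.stack([np.convolve(x[:, c], kernel, mode="same")
                     for c in range(x.shape[1])], axis=1)

def embed(x, kernel):
    # Combine local (conv) and global (attention) context, then
    # mean-pool over time into a fixed-length embedding.
    h = conv1d(x, kernel) + self_attention(x)
    return h.mean(axis=0)

def match_mismatch(eeg, speech_a, speech_b, kernel):
    # Score each candidate speech segment by cosine similarity with
    # the EEG embedding; the higher-scoring segment is "matched".
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    e = embed(eeg, kernel)
    sa = cos(e, embed(speech_a, kernel))
    sb = cos(e, embed(speech_b, kernel))
    return "A" if sa > sb else "B"

rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 4))
matched = eeg.copy()                        # identical segment
mismatched = rng.standard_normal((32, 4))   # unrelated segment
kernel = np.array([0.25, 0.5, 0.25])
print(match_mismatch(eeg, matched, mismatched, kernel))
```

In the paper's actual models, the convolution and attention components are learned end-to-end (with dilated convolutions for the classification model and Conformer blocks for the regression model), rather than fixed as above.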
Pages: 129-143
Page count: 15
Related Papers
50 records
  • [1] Decoding selective auditory attention with EEG using a transformer model
    Xu, Zihao
    Bai, Yanru
    Zhao, Ran
    Hu, Hongmei
    Ni, Guangjian
    Ming, Dong
    METHODS, 2022, 204 : 410 - 417
  • [2] Transformer-Based Unified Neural Network for Quality Estimation and Transformer-Based Re-decoding Model for Machine Translation
    Chen, Cong
    Zong, Qinqin
    Luo, Qi
    Qiu, Bailian
    Li, Maoxi
    MACHINE TRANSLATION, CCMT 2020, 2020, 1328 : 66 - 75
  • [3] EEG Classification with Transformer-Based Models
    Sun, Jiayao
    Xie, Jin
    Zhou, Huihui
    2021 IEEE 3RD GLOBAL CONFERENCE ON LIFE SCIENCES AND TECHNOLOGIES (IEEE LIFETECH 2021), 2021, : 92 - 93
  • [4] Transformer-based ensemble deep learning model for EEG-based emotion recognition
    Xiaopeng Si
    Dong Huang
    Yulin Sun
    Shudi Huang
    He Huang
    Dong Ming
    Brain Science Advances, 2023, 9 (03) : 210 - 223
  • [5] Decoding selective auditory attention with EEG using a transformer model (vol 204, pg 410, 2022)
    Xu, Zihao
    Bai, Yanru
    Zhao, Ran
    Hu, Hongmei
    Ni, Guangjian
    Ming, Dong
    METHODS, 2022, 205 : 157 - 157
  • [6] Flexible Patched Brain Transformer model for EEG decoding
    Klein, Timon
    Minakowski, Piotr
    Sager, Sebastian
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [7] HEAD-SYNCHRONOUS DECODING FOR TRANSFORMER-BASED STREAMING ASR
    Li, Mohan
    Zorila, Catalin
    Doddipatla, Rama
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 5909 - 5913
  • [8] Transformer-based Learned Image Compression for Joint Decoding and Denoising
    Chen, Yi-Hsin
    Ho, Kuan-Wei
    Tsai, Shiau-Rung
    Lin, Guan-Hsun
    Gnutti, Alessandro
    Peng, Wen-Hsiao
    Leonardi, Riccardo
    2024 PICTURE CODING SYMPOSIUM, PCS 2024, 2024,
  • [9] BaseNet: A transformer-based toolkit for nanopore sequencing signal decoding
    Li, Qingwen
    Sun, Chen
    Wang, Daqian
    Lou, Jizhong
    COMPUTATIONAL AND STRUCTURAL BIOTECHNOLOGY JOURNAL, 2024, 23 : 3430 - 3444
  • [10] A Transformer-Based Spatial-Temporal Sleep Staging Model Through Raw EEG
    Shi, Guang
    Chen, Zheng
    Zhang, Renyuan
    2021 INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE BIG DATA AND INTELLIGENT SYSTEMS (HPBD&IS), 2021, : 110 - 115