STAnet: A Spatiotemporal Attention Network for Decoding Auditory Spatial Attention From EEG

Cited by: 56
Authors
Su, Enze [1 ]
Cai, Siqi [2 ]
Xie, Longhan [1 ]
Li, Haizhou [2 ,3 ,4 ]
Schultz, Tanja [5 ]
Affiliations
[1] South China Univ Technol, Shien Ming Wu Sch Intelligent Engn, Guangzhou 510460, Guangdong, Peoples R China
[2] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore, Singapore
[3] Chinese Univ Hong Kong Shenzhen, Sch Data Sci, Shenzhen, Peoples R China
[4] Univ Bremen, Machine Listening Lab, Bremen, Germany
[5] Univ Bremen, Cognit Syst Lab, Bremen, Germany
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography; Spatiotemporal phenomena; Feature extraction; Brain modeling; Decoding; Speech enhancement; Pipelines; Auditory attention; brain-computer interface; electroencephalography; spatial attention; temporal attention; CORTICAL REPRESENTATION; SELECTIVE ATTENTION; ATTENDED SPEECH; TRACKING; BRAIN; MODULATION; RESPONSES; DYNAMICS; CHAOS; HAND;
DOI
10.1109/TBME.2022.3140246
Chinese Library Classification
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Objective: Humans are able to localize the source of a sound. This enables them to direct attention to a particular speaker in a cocktail party. Psycho-acoustic studies show that the sensory cortices of the human brain respond differently to the location of sound sources, and that auditory attention itself is a dynamic, temporally evolving brain activity. In this work, we seek to build a computational model that uses both the spatial and temporal information manifested in EEG signals for auditory spatial attention detection (ASAD). Methods: We propose an end-to-end spatiotemporal attention network, denoted as STAnet, to detect auditory spatial attention from EEG. The STAnet is designed to dynamically assign differentiated weights to EEG channels through a spatial attention mechanism, and to temporal patterns in EEG signals through a temporal attention mechanism. Results: We report ASAD experiments on two publicly available datasets. The STAnet outperforms other competitive models by a large margin under various experimental conditions. Its attention decision with a 1-second decision window outperforms that of state-of-the-art techniques with a 10-second decision window. Experimental results also demonstrate that the STAnet achieves competitive performance on EEG signals ranging from 64 down to as few as 16 channels. Conclusion: This study provides evidence suggesting that efficient low-density EEG online decoding is within reach. Significance: This study also marks an important step towards the practical implementation of ASAD in real-life applications.
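To make the idea of combining a channel-wise (spatial) attention mechanism with a time-wise (temporal) attention mechanism concrete, below is a minimal sketch in PyTorch. It is an illustrative assumption only: the class names, layer sizes, and the simple 1-D convolutional feature extractor are hypothetical and do not reproduce the architecture reported in the paper.

```python
# Minimal sketch of a spatial + temporal attention model for EEG-based ASAD.
# Assumption: all module names, dimensions, and the conv backbone are illustrative,
# not the STAnet architecture described in the paper.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Assigns a learned weight to each EEG channel (hypothetical design)."""

    def __init__(self, n_channels: int):
        super().__init__()
        self.score = nn.Linear(n_channels, n_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        summary = x.mean(dim=2)                               # per-channel summary
        weights = torch.softmax(self.score(summary), dim=1)   # (batch, channels)
        return x * weights.unsqueeze(2)                       # re-weight channels


class TemporalAttention(nn.Module):
    """Weights the time steps of a feature sequence (hypothetical design)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.score = nn.Linear(d_model, 1)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, time, d_model)
        weights = torch.softmax(self.score(h), dim=1)          # (batch, time, 1)
        return (h * weights).sum(dim=1)                        # pooled features


class SpatioTemporalAttentionNet(nn.Module):
    """End-to-end left/right auditory-spatial-attention classifier from raw EEG."""

    def __init__(self, n_channels: int = 64, d_model: int = 32, n_classes: int = 2):
        super().__init__()
        self.spatial = SpatialAttention(n_channels)
        # 1-D convolution over time turns the re-weighted channels into features.
        self.conv = nn.Conv1d(n_channels, d_model, kernel_size=9, padding=4)
        self.temporal = TemporalAttention(d_model)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time), e.g. one EEG decision window
        x = self.spatial(x)
        h = torch.relu(self.conv(x)).transpose(1, 2)           # (batch, time, d_model)
        z = self.temporal(h)
        return self.classifier(z)                              # logits over directions


if __name__ == "__main__":
    # Example: 64-channel EEG, 1-second decision window at an assumed 128 Hz rate.
    model = SpatioTemporalAttentionNet(n_channels=64)
    eeg = torch.randn(8, 64, 128)
    print(model(eeg).shape)  # torch.Size([8, 2])
```

The sketch also shows why short decision windows are feasible in principle: both attention stages pool over whatever window length is provided, so the same model can be applied to 1-second or 10-second segments without structural changes.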
Pages: 2233 - 2242
Number of pages: 10
Related Papers
50 records in total
  • [1] Decoding auditory attention from EEG using a convolutional neural network
    An, Winko W.
    Pei, Alexander
    Noyce, Abigail L.
    Shinn-Cunningham, Barbara
    2021 43RD ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE & BIOLOGY SOCIETY (EMBC), 2021, : 6586 - 6589
  • [2] Auditory Attention Decoding from EEG using Convolutional Recurrent Neural Network
    Fu, Zhen
    Wang, Bo
    Wu, Xihong
    Chen, Jing
    29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021), 2021, : 970 - 974
  • [3] DECODING AUDITORY ATTENTION FROM EEG DATA USING CEPSTRAL ANALYSIS
    Alickovic, Emina
    Mendoza, Carlos Francisco
    Segar, Andrew
    Sandsten, Maria
    Skoglund, Martin A.
    2023 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW, 2023,
  • [4] Decoding the anatomical network of spatial attention
    Smith, David V.
    Clithero, John A.
    Rorden, Christopher
    Karnath, Hans-Otto
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2013, 110 (04) : 1518 - 1523
  • [5] MAtt: A Manifold Attention Network for EEG Decoding
    Pan, Yue-Ting
    Chou, Jing-Lun
    Wei, Chun-Shu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [6] EEG-based Auditory Attention Detection with Spatiotemporal Graph and Graph Convolutional Network
    Wang, Ruicong
    Cai, Siqi
    Li, Haizhou
    INTERSPEECH 2023, 2023, : 1144 - 1148
  • [7] What are we really decoding? Unveiling biases in EEG-based decoding of the spatial focus of auditory attention
    Rotaru, Iustina
    Geirnaert, Simon
    Heintz, Nicolas
    van de Ryck, Iris
    Bertrand, Alexander
    Francart, Tom
    JOURNAL OF NEURAL ENGINEERING, 2024, 21 (01)
  • [8] Decoding selective auditory attention with EEG using a transformer model
    Xu, Zihao
    Bai, Yanru
    Zhao, Ran
    Hu, Hongmei
    Ni, Guangjian
    Ming, Dong
    METHODS, 2022, 204 : 410 - 417
  • [9] Auditory Attention Decoding in Four-Talker Environment with EEG
    Yan, Yujie
    Xu, Xiran
    Zhu, Haolin
    Tian, Pei
    Ge, Zhongshu
    Wu, Xihong
    Chen, Jing
    INTERSPEECH 2024, 2024, : 432 - 436
  • [10] Congruent audiovisual speech enhances auditory attention decoding with EEG
    Fu, Zhen
    Wu, Xihong
    Chen, Jing
    JOURNAL OF NEURAL ENGINEERING, 2019, 16 (06)