Dual Attention Relation Network With Fine-Tuning for Few-Shot EEG Motor Imagery Classification

Cited by: 8
Authors
An, Sion [1]
Kim, Soopil [1]
Chikontwe, Philip [1]
Park, Sang Hyun [1]
Affiliations
[1] Daegu Gyeongbuk Inst Sci & Technol DGIST, Daegu 42988, South Korea
Keywords
Brain-computer interfaces; electroencephalography (EEG); few-shot classification; meta-learning; motor imagery (MI); NEURAL-NETWORKS;
DOI
10.1109/TNNLS.2023.3287181
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, motor imagery (MI) electroencephalography (EEG) classification techniques using deep learning have shown improved performance over conventional techniques. However, improving the classification accuracy on unseen subjects is still challenging due to intersubject variability, scarcity of labeled unseen subject data, and low signal-to-noise ratio (SNR). In this context, we propose a novel two-way few-shot network able to efficiently learn how to learn representative features of unseen subject categories and classify them with limited MI EEG data. The pipeline includes an embedding module that learns feature representations from a set of signals, a temporal-attention module to emphasize important temporal features, an aggregation-attention module for key support signal discovery, and a relation module for final classification based on relation scores between a support set and a query signal. In addition to the unified learning of feature similarity and a few-shot classifier, our method can emphasize informative features in support data relevant to the query, which generalizes better on unseen subjects. Furthermore, we propose to fine-tune the model before testing by arbitrarily sampling a query signal from the provided support set to adapt to the distribution of the unseen subject. We evaluate our proposed method with three different embedding modules on cross-subject and cross-dataset classification tasks using brain-computer interface (BCI) competition IV 2a, 2b, and GIST datasets. Extensive experiments show that our model significantly improves over the baselines and outperforms existing few-shot approaches.
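The abstract outlines a pipeline of an embedding module, a temporal-attention module, an aggregation-attention module that weights support signals by their relevance to the query, and a relation module that scores the query against each class. The following is a minimal illustrative sketch of that flow in NumPy; it is not the authors' implementation, and all names (`temporal_attention`, `relation_scores`) and design choices (a linear-tanh embedding, norm-based temporal weights, cosine relation scores) are assumptions made for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def embed(x, W):
    # Toy embedding module: linear map over EEG channels, (T, C) -> (T, D).
    return np.tanh(x @ W)

def temporal_attention(feat):
    # Weight time steps (here simply by feature norm) and pool: (T, D) -> (D,).
    weights = softmax(np.linalg.norm(feat, axis=1))
    return weights @ feat

def relation_scores(support, query, W):
    # support: {class: list of (T, C) trials}; query: one (T, C) trial.
    q = temporal_attention(embed(query, W))
    scores = {}
    for cls, trials in support.items():
        embs = np.stack([temporal_attention(embed(s, W)) for s in trials])
        # Aggregation attention: emphasize support trials similar to the query.
        w = softmax(embs @ q)
        proto = w @ embs
        # Relation module stand-in: cosine similarity between prototype and query.
        scores[cls] = float(proto @ q /
                            (np.linalg.norm(proto) * np.linalg.norm(q) + 1e-8))
    return scores

T, C, D = 50, 8, 16          # time steps, EEG channels, embedding dim
W = rng.normal(size=(C, D)) / np.sqrt(C)
support = {c: [rng.normal(size=(T, C)) for _ in range(5)]
           for c in ("left_hand", "right_hand")}  # 2-way 5-shot episode
query = rng.normal(size=(T, C))
print(relation_scores(support, query, W))
```

The fine-tuning step described in the abstract would correspond to holding one trial out of the support set as a pseudo-query and updating the model on that episode before testing on the unseen subject.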
Pages: 1-15 (15 pages)
Related Papers (50 total)
  • [1] Few-Shot Relation Learning with Attention for EEG-based Motor Imagery Classification
    An, Sion
    Kim, Soopil
    Chikontwe, Philip
    Park, Sang Hyun
    [J]. 2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 10933 - 10938
  • [2] Hybrid Fine-Tuning Strategy for Few-Shot Classification
    Zhao, Lei
    Ou, Zhonghua
    Zhang, Lixun
    Li, Shuxiao
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [3] Fine-Tuning for Few-Shot Image Classification by Multimodal Prototype Regularization
    Wu, Qianhao
    Qi, Jiaxin
    Zhang, Dong
    Zhang, Hanwang
    Tang, Jinhui
    [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 8543 - 8556
  • [4] Singular Value Fine-tuning: Few-shot Segmentation requires Few-parameters Fine-tuning
    Sun, Yanpeng
    Chen, Qiang
    He, Xiangyu
    Wang, Jian
    Feng, Haocheng
    Han, Junyu
    Ding, Errui
    Cheng, Jian
    Li, Zechao
    Wang, Jingdong
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [5] Total Relation Network with Attention for Few-Shot Image Classification
    Li X.-X.
    Liu Z.-Y.
    Wu J.-J.
    Cao J.
    Ma Z.-Y.
    [J]. Jisuanji Xuebao/Chinese Journal of Computers, 2023, 46 (02): : 371 - 384
  • [6] Adaptive fine-tuning strategy for few-shot learning
    Zhuang, Xinkai
    Shao, Mingwen
    Gao, Wei
    Yang, Jianxin
    [J]. JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (06)
  • [7] Embedding Hallucination for Few-Shot Language Fine-tuning
    Jian, Yiren
    Gao, Chongyang
    Vosoughi, Soroush
    [J]. NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 5522 - 5530
  • [8] Network Pruning and Fine-tuning for Few-shot Industrial Image Anomaly Detection
    Zhang, Jie
    Suganuma, Masanori
    Okatani, Takayuki
    [J]. 2023 IEEE 21ST INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS, INDIN, 2023,
  • [9] EFTNet: an efficient fine-tuning method for few-shot segmentation
    Li, Jiaguang
    Wang, Yubo
    Gao, Zihan
    Wei, Ying
    [J]. APPLIED INTELLIGENCE, 2024, 54 (19) : 9488 - 9507
  • [10] Cross Attention Network for Few-shot Classification
    Hou, Ruibing
    Chang, Hong
    Ma, Bingpeng
    Shan, Shiguang
    Chen, Xilin
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32