An adaptive cross-class transfer learning framework with two-level alignment

Cited by: 2
Authors
Xu, Dong-qin [1]
Sun, Yan-jun [1]
Li, Ming-ai [1,2,3]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[2] Beijing Key Lab Computat Intelligence & Intelligen, Beijing 100124, Peoples R China
[3] Minist Educ, Engn Res Cent Digital Community, Beijing 100124, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Motor imagery; Cross-class; Two-level alignment; Transfer learning; Knowledge distillation; BRAIN-COMPUTER INTERFACES; DOMAIN ADAPTATION; SELECTION;
DOI
10.1016/j.bspc.2023.105155
CLC Number
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
Because of insufficient data, cross-class transfer learning holds promise for motor imagery electroencephalogram (MI-EEG) based rehabilitation engineering, but reducing cross-class variability between the source and target domains and finding the best class-to-class transitive correspondence (CCTC) have not been effectively addressed. In this paper, we propose an adaptive cross-class transfer learning (ACTL) framework with two-level alignment (TLA) for MI decoding. Using the fast partitioning around medoids method, the central samples of each class are extracted from the source domain and the target domain respectively; they are then used to construct a central alignment matrix that realizes the first-level domain alignment for every possible CCTC case. The second-level subject alignment is performed by Euclidean alignment on each class of the aligned source domain. Furthermore, the central samples of the target domain, together with the two-level aligned source domain, are mapped into tangent space; the resulting features are fed to a teacher convolutional neural network (TCNN), the TCNN and the transferred student CNN (SCNN) are trained successively, and the parameters of their distillation loss are optimized automatically by a scaling-based grid search method. Experiments on a public MI-EEG dataset with multiple subjects and MI tasks show that the proposed framework finds the optimal SCNN associated with the best CCTC, achieving a statistically significant average classification accuracy of 86.37%. The results suggest that TLA helps to increase the distribution similarity between different domains, and that the knowledge distillation embedded in the ACTL framework greatly simplifies the SCNN while outperforming the more complex TCNN.
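As a concrete illustration of two of the steps named in the abstract, the sketch below (not the authors' code) shows Euclidean alignment of EEG trials and a temperature-scaled knowledge-distillation loss. The function names, array shapes, and the default alpha/temp values are assumptions made for the example; alpha and temp merely stand in for the distillation-loss parameters that the paper tunes with its scaling-based grid search.

```python
# Minimal sketch, not the paper's implementation: Euclidean alignment of EEG
# trials and a soft-target distillation loss. Shapes, names and defaults are
# illustrative assumptions.
import numpy as np


def euclidean_alignment(trials):
    """Whiten trials so their mean spatial covariance becomes the identity.

    trials: array of shape (n_trials, n_channels, n_samples).
    Returns R^{-1/2} @ X_i for every trial X_i, where R is the arithmetic
    mean of the per-trial covariance matrices.
    """
    covs = np.stack([x @ x.T / x.shape[1] for x in trials])
    r_mean = covs.mean(axis=0)
    # Inverse square root via eigendecomposition (R assumed positive definite).
    vals, vecs = np.linalg.eigh(r_mean)
    r_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    return np.stack([r_inv_sqrt @ x for x in trials])


def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, temp=4.0):
    """Soft-target KL term plus hard-label cross-entropy, weighted by alpha.

    alpha and temp stand in for the distillation-loss parameters optimized by
    the paper's grid search; 0.5 and 4.0 are placeholders, not reported values.
    """
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    p_t = softmax(teacher_logits / temp)   # softened teacher targets
    p_s = softmax(student_logits / temp)   # softened student predictions
    kl = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=1))
    p_hard = softmax(student_logits)
    ce = -np.mean(np.log(p_hard[np.arange(len(labels)), labels] + 1e-12))
    return alpha * temp ** 2 * kl + (1.0 - alpha) * ce
```

Per the abstract, the second-level alignment would be applied class by class to the already centrally aligned source domain, and a loss of this form would be used to train the SCNN against the outputs of the trained TCNN.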
Pages: 22
Related papers
50 records in total
  • [1] Active Learning with Cross-Class Knowledge Transfer
    Guo, Yuchen
    Ding, Guiguang
    Wang, Yuqi
    Jin, Xiaoming
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 1624 - 1630
  • [2] Active Learning with Cross-Class Similarity Transfer
    Guo, Yuchen
    Ding, Guiguang
    Gao, Yue
    Han, Jungong
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1338 - 1344
  • [3] Joint Semantic and Latent Attribute Modelling for Cross-Class Transfer Learning
    Peng, Peixi
    Tian, Yonghong
    Xiang, Tao
    Wang, Yaowei
    Pontil, Massimiliano
    Huang, Tiejun
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2018, 40 (07) : 1625 - 1638
  • [4] Cross-Class Feature Augmentation for Class Incremental Learning
    Kim, Taehoon
    Park, Jaeyoo
    Han, Bohyung
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13168 - 13176
  • [5] A Two-Level Transfer Learning Algorithm for Evolutionary Multitasking
    Ma, Xiaoliang
    Chen, Qunjian
    Yu, Yanan
    Sun, Yiwen
    Ma, Lijia
    Zhu, Zexuan
    FRONTIERS IN NEUROSCIENCE, 2020, 13
  • [6] Traces of Class/Cross-Class Structure Pervade Deep Learning Spectra
    Papyan, Vardan
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [7] Unsupervised domain adaptive segmentation algorithm based on two-level category alignment
    Dong, Wenyong
    Liang, Zhixue
    Wang, Liping
    Tian, Gang
    Long, Qianhui
    NEURAL NETWORKS, 2024, 177
  • [8] Cross-Class Online Talks: Learning Beyond Classroom Walls
    Hashemi, S. Sofkova
    EDULEARN14: 6TH INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2014, : 1754 - 1754
  • [9] Imbalanced Classification Method Based on Cross-Class Sample Migration Framework
    Yu, Haibo
    Liu, Jing
    Li, Qiangwei
    Gao, Xin
    Tan, Huang
    Chen, Tianyang
    Computer Engineering and Applications, 2024, 60 (16) : 143 - 158