MaeFE: Masked Autoencoders Family of Electrocardiogram for Self-Supervised Pretraining and Transfer Learning

Cited by: 4
Authors
Zhang, Huaicheng [1 ]
Liu, Wenhan [1 ]
Shi, Jiguang [1 ]
Chang, Sheng [1 ]
Wang, Hao [1 ]
He, Jin [1 ]
Huang, Qijun [1 ]
Institutions
[1] Wuhan Univ, Sch Phys & Technol, Wuhan 430072, Hubei, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electrocardiography (ECG); mask autoencoder (MAE); pretraining; self-supervised learning; transfer learning;
DOI
10.1109/TIM.2022.3228267
CLC Classification Code
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Electrocardiogram (ECG) is a universal diagnostic tool for heart disease and can provide data for deep learning. The scarcity of labeled data is a major challenge for medical artificial intelligence diagnosis, because acquiring labeled medical data is time-consuming and costly and requires medical specialists. As a generative self-supervised learning method, the masked autoencoder (MAE) is capable of addressing these problems. This article proposes the MAE family of ECG (MaeFE). Considering the temporal and spatial features of ECG, MaeFE contains three customized masking modes: the masked time autoencoder (MTAE), the masked lead autoencoder (MLAE), and the masked lead and time autoencoder (MLTAE). MTAE and MLAE emphasize temporal features and spatial features, respectively, while MLTAE is a multihead architecture that combines the two. In the pretraining stage, ECG signals from the pretraining dataset are divided into patches and partially masked; the encoder converts the unmasked patches into tokens, and the decoder reconstructs the masked ones. In the downstream task, the pretrained encoder is used as a classifier for arrhythmia classification on the downstream dataset; this process constitutes transfer learning. MaeFE outperforms the state-of-the-art self-supervised learning methods SimCLR, MoCo, CLOCS, and MaskUNet in downstream tasks, and MTAE shows the best overall performance. Compared to contrastive learning models, MTAE achieves at least a 5.18%, 11.80%, and 3.23% increase in accuracy (Acc), Macro-F1, and area under the curve (AUC), respectively, using the linear probe, and it outperforms other models by 8.99% in Acc, 20.18% in Macro-F1, and 7.13% in AUC using fine-tuning. Experiments on multilabel arrhythmia classification, another downstream task, further demonstrate the generalization ability of MaeFE. According to the experimental results, MaeFE is efficient and robust in downstream tasks. By overcoming the scarcity of labeled data, MaeFE surpasses other self-supervised learning methods and achieves satisfying performance. Consequently, the algorithm in this article is on track to play a major role in practical applications.
Pages: 15
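The abstract describes the MAE-style pretraining pipeline: the ECG is split into patches, a large fraction of them is masked, an encoder embeds only the visible patches, and a decoder reconstructs the masked ones. Below is a minimal PyTorch-style sketch of that idea for the masked-time-autoencoder (MTAE) case. It is not the authors' implementation; the module names, dimensions, mask ratio, and the omission of positional embeddings are illustrative assumptions.

    # Minimal sketch (not the authors' code) of masked-time pretraining for ECG:
    # split the signal into temporal patches, mask a random subset, encode the
    # visible patches, and reconstruct the masked ones with a light decoder.
    import torch
    import torch.nn as nn

    class MaskedTimeAutoencoderSketch(nn.Module):
        def __init__(self, n_leads=12, patch_len=50, dim=64, mask_ratio=0.75):
            super().__init__()
            self.patch_len = patch_len
            self.mask_ratio = mask_ratio
            self.embed = nn.Linear(n_leads * patch_len, dim)      # patch -> token
            self.encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
                num_layers=2,
            )
            self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
            self.decoder = nn.Linear(dim, n_leads * patch_len)    # token -> patch

        def forward(self, x):
            # x: (batch, n_leads, signal_len); signal_len divisible by patch_len
            b, c, t = x.shape
            n = t // self.patch_len
            patches = (x.reshape(b, c, n, self.patch_len)
                        .permute(0, 2, 1, 3)
                        .reshape(b, n, -1))

            # randomly choose which temporal patches stay visible
            n_keep = int(n * (1 - self.mask_ratio))
            idx = torch.rand(b, n, device=x.device).argsort(dim=1)
            keep, masked = idx[:, :n_keep], idx[:, n_keep:]

            # encode only the visible patches
            d = patches.size(-1)
            visible = torch.gather(patches, 1, keep.unsqueeze(-1).expand(-1, -1, d))
            tokens = self.encoder(self.embed(visible))

            # append learned mask tokens and restore the original patch order
            full = torch.cat([tokens, self.mask_token.expand(b, n - n_keep, -1)], dim=1)
            restore = idx.argsort(dim=1)
            full = torch.gather(full, 1, restore.unsqueeze(-1).expand(-1, -1, full.size(-1)))
            recon = self.decoder(full)

            # reconstruction loss is computed on the masked patches only
            target = torch.gather(patches, 1, masked.unsqueeze(-1).expand(-1, -1, d))
            pred = torch.gather(recon, 1, masked.unsqueeze(-1).expand(-1, -1, d))
            return nn.functional.mse_loss(pred, target)

    # usage sketch: a batch of 12-lead ECG of length 500 samples -> 10 patches of 50
    if __name__ == "__main__":
        model = MaskedTimeAutoencoderSketch()
        loss = model(torch.randn(4, 12, 500))
        loss.backward()

In the transfer-learning stage described in the abstract, only the pretrained encoder would be kept and combined with a classification head for arrhythmia classification; the decoder and mask tokens are discarded.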