Self-Supervised Contrastive Learning for Medical Time Series: A Systematic Review

Times Cited: 16
Authors
Liu, Ziyu [1 ]
Alavi, Azadeh [1 ]
Li, Minyi [2 ]
Zhang, Xiang [3 ]
Affiliations
[1] RMIT, Sch Comp Technol, Melbourne, Vic 3000, Australia
[2] Coles, Melbourne, Vic 3123, Australia
[3] Univ N Carolina, Dept Comp Sci, Charlotte, NC 28223 USA
Funding
U.S. National Science Foundation (NSF);
Keywords
self-supervised learning; medical time series; deep learning; healthcare; pretext tasks; contrastive learning; systematic review; RESEARCH RESOURCE; SLEEP; EEG; HEALTH; DATABASE; CLASSIFICATION; FRAMEWORK; CHILDREN; DATASET;
DOI
10.3390/s23094221
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
Medical time series are sequential data collected over time that measure health-related signals, such as electroencephalography (EEG), electrocardiography (ECG), and intensive care unit (ICU) readings. Analyzing medical time series to identify latent patterns and trends can uncover highly valuable insights for enhancing diagnosis, treatment, risk assessment, and understanding of disease progression. However, data mining in medical time series is heavily limited by sample annotation, which is time-consuming, labor-intensive, and expert-dependent. To mitigate this challenge, the emerging paradigm of self-supervised contrastive learning, which has shown great success since 2020, is a promising solution. Contrastive learning aims to learn representative embeddings by contrasting positive and negative samples, without requiring explicit labels. Here, we conducted a systematic review, following the PRISMA standards, of how contrastive learning alleviates label scarcity in medical time series. We searched five scientific databases (IEEE, ACM, Scopus, Google Scholar, and PubMed) and retrieved 1908 papers based on the inclusion criteria. After applying the exclusion criteria and screening at the title, abstract, and full-text levels, we carefully reviewed 43 papers in this area. Specifically, this paper outlines the pipeline of contrastive learning, including pre-training, fine-tuning, and testing. We provide a comprehensive summary of the augmentations applied to medical time series data, the architectures of pre-training encoders, the types of fine-tuning classifiers and clustering methods, and the popular contrastive loss functions. Moreover, we present an overview of the different data types used in medical time series, highlight the medical applications of interest, and provide a comprehensive table of 51 public datasets that have been used in this field. In addition, we discuss promising future directions, such as guidance for effective augmentation design, a unified framework for analyzing hierarchical time series, and methods for processing multimodal data. Despite being in its early stages, self-supervised contrastive learning has shown great potential in overcoming the need for expert-created annotations in medical time series research.
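As a concrete illustration of the pipeline the abstract describes (augment, encode, contrast), the sketch below shows one minimal self-supervised pre-training step in PyTorch. It is not taken from the reviewed papers: the jitter and scaling augmentations, the NT-Xent (InfoNCE) loss, the placeholder encoder, and all hyperparameters are common choices in this literature and are assumptions here, not the review's prescribed method.

```python
# Minimal sketch of contrastive pre-training on unlabeled medical time
# series: augment each recording twice, encode both views, and pull the
# resulting pair together against all other samples in the batch.
import torch
import torch.nn.functional as F

def jitter(x, sigma=0.03):
    """Additive Gaussian noise, a common time-series augmentation."""
    return x + sigma * torch.randn_like(x)

def scaling(x, sigma=0.1):
    """Random per-channel amplitude scaling; x has shape (N, C, L)."""
    factor = 1.0 + sigma * torch.randn(x.shape[0], x.shape[1], 1)
    return x * factor

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss over positive pairs (z1[i], z2[i]); every other
    embedding in the batch serves as a negative."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d)
    sim = z @ z.t() / temperature                        # cosine similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float("-inf"))           # drop self-similarity
    # The positive for row i is row i+n, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

if __name__ == "__main__":
    # Demo with a random unlabeled batch: 8 recordings, 2 channels, 256 steps.
    x = torch.randn(8, 2, 256)
    encoder = torch.nn.Sequential(      # placeholder encoder; the review
        torch.nn.Flatten(),             # surveys CNN and Transformer choices
        torch.nn.Linear(2 * 256, 64),
    )
    loss = nt_xent(encoder(jitter(x)), encoder(scaling(x)))
    print(loss.item())
```

In the fine-tuning stage that follows pre-training, a small classifier would then be trained on top of the frozen or partially unfrozen encoder using whatever limited labels are available.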
Pages: 34
Related Papers
50 records in total
  • [41] Contrasting Contrastive Self-Supervised Representation Learning Pipelines
    Kotar, Klemen
    Ilharco, Gabriel
    Schmidt, Ludwig
    Ehsani, Kiana
    Mottaghi, Roozbeh
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 9929 - 9939
  • [42] Contrastive self-supervised learning: review, progress, challenges and future research directions
    Kumar, Pranjal
    Rawat, Piyush
    Chauhan, Siddhartha
INTERNATIONAL JOURNAL OF MULTIMEDIA INFORMATION RETRIEVAL, 2022, 11 (04) : 461 - 488
  • [43] Malicious Traffic Identification with Self-Supervised Contrastive Learning
    Yang, Jin
    Jiang, Xinyun
    Liang, Gang
    Li, Siyu
    Ma, Zicheng
    SENSORS, 2023, 23 (16)
  • [44] Self-Supervised Learning on Graphs: Contrastive, Generative, or Predictive
    Wu, Lirong
    Lin, Haitao
    Tan, Cheng
    Gao, Zhangyang
    Li, Stan Z.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (04) : 4216 - 4235
  • [45] Contrastive Self-Supervised Learning: A Survey on Different Architectures
    Khan, Adnan
    AlBarri, Sarah
    Manzoor, Muhammad Arslan
    PROCEEDINGS OF 2ND IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (ICAI 2022), 2022, : 1 - 6
  • [46] Contrastive Self-Supervised Learning for Skeleton Action Recognition
    Gao, Xuehao
    Yang, Yang
    Du, Shaoyi
    NEURIPS 2020 WORKSHOP ON PRE-REGISTRATION IN MACHINE LEARNING, VOL 148, 2020, 148 : 51 - 61
  • [47] CONTRASTIVE SELF-SUPERVISED LEARNING FOR WIRELESS POWER CONTROL
    Naderializadeh, Navid
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 4965 - 4969
  • [48] Self-Supervised Contrastive Learning for Unsupervised Phoneme Segmentation
    Kreuk, Felix
    Keshet, Joseph
    Adi, Yossi
    INTERSPEECH 2020, 2020, : 3700 - 3704
  • [49] Self-supervised contrastive learning for implicit collaborative filtering
    Song, Shipeng
    Liu, Bin
    Teng, Fei
    Li, Tianrui
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 139