Self-Supervised Transformer for Sparse and Irregularly Sampled Multivariate Clinical Time-Series

Cited by: 31
|
Authors
Tipirneni, Sindhu [1 ]
Reddy, Chandan K. [1 ]
Affiliations
[1] Virginia Tech, 900 N Glebe Rd, Arlington, VA 22203 USA
Funding
U.S. National Science Foundation;
Keywords
Time-series; neural networks; deep learning; healthcare; Transformer; self-supervised learning;
DOI
10.1145/3516367
Chinese Library Classification
TP [Automation & Computer Technology];
Discipline Classification Code
0812;
Abstract
Multivariate time-series data are frequently observed in critical care settings and are typically characterized by sparsity (missing information) and irregular time intervals. Existing approaches for learning representations in this domain handle these challenges by either aggregation or imputation of values, which in turn suppresses fine-grained information and adds undesirable noise/overhead to the machine learning model. To tackle this problem, we propose a Self-supervised Transformer for Time-Series (STraTS) model, which overcomes these pitfalls by treating time-series as a set of observation triplets instead of using the standard dense matrix representation. It employs a novel Continuous Value Embedding technique to encode continuous time and variable values without the need for discretization. It is composed of a Transformer component with multi-head attention layers, which enable it to learn contextual triplet embeddings while avoiding the problems of recurrence and vanishing gradients that occur in recurrent architectures. In addition, to tackle the problem of limited availability of labeled data (which is typically observed in many healthcare applications), STraTS utilizes self-supervision by leveraging unlabeled data to learn better representations, using time-series forecasting as an auxiliary proxy task. Experiments on real-world multivariate clinical time-series benchmark datasets demonstrate that STraTS has better prediction performance than state-of-the-art methods for mortality prediction, especially when labeled data is limited. Finally, we also present an interpretable version of STraTS, which can identify important measurements in the time-series data. Our data preprocessing and model implementation codes are available at https://github.com/sindhura97/STraTS.
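The abstract's core idea — representing a sparse series as (time, variable, value) triplets and embedding the continuous scalars with a small network rather than binning them — can be sketched as follows. This is an illustrative NumPy mock-up, not the authors' implementation: the one-hidden-layer embedding, the dimension `d = 8`, and the variable names are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative choice)

# Hypothetical Continuous Value Embedding (CVE): a scalar (a timestamp
# or a measurement value) is mapped to a d-dimensional vector by a small
# feed-forward network, so no discretization/binning is needed.
W1 = rng.normal(size=(1, d))
b1 = np.zeros(d)
W2 = rng.normal(size=(d, d))

def cve(x: float) -> np.ndarray:
    """Embed a continuous scalar as a d-dimensional vector."""
    return np.tanh(np.array([x]) @ W1 + b1) @ W2

# A sparse, irregularly sampled clinical series as observation triplets
# (time, variable, value) -- no dense matrix, no imputed entries.
triplets = [
    (0.5, "heart_rate", 82.0),
    (1.7, "sbp", 118.0),
    (4.2, "heart_rate", 90.0),
]

# Learned per-variable embeddings (randomly initialized here).
var_emb = {v: rng.normal(size=d) for v in {t[1] for t in triplets}}

# Initial triplet embedding: combine variable, value, and time embeddings.
# These would then feed into the Transformer's multi-head attention layers.
embeddings = np.stack(
    [var_emb[v] + cve(val) + cve(t) for t, v, val in triplets]
)
print(embeddings.shape)  # (3, 8)
```

Because each observation carries its own continuous timestamp embedding, irregular sampling needs no special handling downstream: the attention layers simply operate on the set of triplet embeddings.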
Pages: 17
Related Papers
50 records in total
  • [41] InverseTime: A Self-Supervised Technique for Semi-Supervised Classification of Time Series
    Goyo, Manuel Alejandro
    Nanculef, Ricardo
    Valle, Carlos
    [J]. IEEE Access, 2024, 12 : 165081 - 165093
  • [42] Semi-supervised Time Series Classification Model with Self-supervised Learning
    Xi, Liang
    Yun, Zichao
    Liu, Han
    Wang, Ruidong
    Huang, Xunhua
    Fan, Haoyi
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2022, 116
  • [43] Sparse transformer with local and seasonal adaptation for multivariate time series forecasting
    Zhang, Yifan
    Wu, Rui
    Dascalu, Sergiu M.
    Harris Jr, Frederick C.
    [J]. SCIENTIFIC REPORTS, 2024, 14 (01)
  • [44] Time pattern reconstruction for classification of irregularly sampled time series
    Sun, Chenxi
    Li, Hongyan
    Song, Moxian
    Cai, Derun
    Zhang, Baofeng
    Hong, Shenda
    [J]. Pattern Recognition, 2024, 147
  • [46] Self-Supervised Autoregressive Domain Adaptation for Time Series Data
    Ragab, Mohamed
    Eldele, Emadeldeen
    Chen, Zhenghua
    Wu, Min
    Kwoh, Chee-Keong
    Li, Xiaoli
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (01) : 1341 - 1351
  • [47] Self-Supervised Pre-training for Time Series Classification
    Shi, Pengxiang
    Ye, Wenwen
    Qin, Zheng
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [48] Transformation cost spectrum for irregularly sampled time series
    Ozdes, Celik
    Eroglu, Deniz
    [J]. The European Physical Journal Special Topics, 2023, 232 : 35 - 46
  • [49] Latent ODEs for Irregularly-Sampled Time Series
    Rubanova, Yulia
    Chen, Ricky T. Q.
    Duvenaud, David
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [50] TransDBC: Transformer for Multivariate Time-Series based Driver Behavior Classification
    Vyas, Jayant
    Bhardwaj, Nishit
    Bhumika
    Das, Debasis
    [J]. 2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,