Self-Supervised Transformer for Sparse and Irregularly Sampled Multivariate Clinical Time-Series

Cited by: 31
Authors
Tipirneni, Sindhu [1 ]
Reddy, Chandan K. [1 ]
Affiliation
[1] Virginia Tech, 900 N Glebe Rd, Arlington, VA 22203 USA
Funding
US National Science Foundation;
Keywords
Time-series; neural networks; deep learning; healthcare; Transformer; self-supervised learning;
DOI
10.1145/3516367
Chinese Library Classification (CLC): TP [Automation and computer technology];
Discipline code: 0812;
Abstract
Multivariate time-series data are frequently observed in critical care settings and are typically characterized by sparsity (missing information) and irregular time intervals. Existing approaches for learning representations in this domain handle these challenges by either aggregation or imputation of values, which in turn suppresses the fine-grained information and adds undesirable noise/overhead into the machine learning model. To tackle this problem, we propose a Self-supervised Transformer for Time-Series (STraTS) model, which overcomes these pitfalls by treating time-series as a set of observation triplets instead of using the standard dense matrix representation. It employs a novel Continuous Value Embedding technique to encode continuous time and variable values without the need for discretization. It is composed of a Transformer component with multi-head attention layers, which enable it to learn contextual triplet embeddings while avoiding the problems of recurrence and vanishing gradients that occur in recurrent architectures. In addition, to tackle the problem of limited availability of labeled data (which is typically observed in many healthcare applications), STraTS utilizes self-supervision by leveraging unlabeled data to learn better representations, using time-series forecasting as an auxiliary proxy task. Experiments on real-world multivariate clinical time-series benchmark datasets demonstrate that STraTS has better prediction performance than state-of-the-art methods for mortality prediction, especially when labeled data are limited. Finally, we also present an interpretable version of STraTS, which can identify important measurements in the time-series data. Our data preprocessing and model implementation codes are available at https://github.com/sindhura97/STraTS.
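The abstract's core idea, representing a sparse series as (time, variable, value) triplets and embedding continuous scalars without discretization, can be sketched as follows. This is a minimal illustration, not the paper's exact architecture: the embedding dimension, the one-hidden-layer tanh form of the Continuous Value Embedding, and all variable names here are assumptions for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding dimension (assumed for illustration)

# A sparse, irregularly sampled series as observation triplets
# (time, variable, value) instead of a dense time x variable matrix,
# so no imputation or temporal aggregation is required.
triplets = [
    (0.5, "heart_rate", 82.0),
    (1.2, "sbp", 118.0),
    (4.7, "heart_rate", 95.0),  # irregular gap between observations
]

def cve(x, W1, b1, W2):
    """Continuous Value Embedding (assumed form): map a continuous
    scalar (a time or a value) to R^D with a small feed-forward net,
    avoiding any discretization into bins."""
    return np.tanh(x * W1 + b1) @ W2

W1, b1, W2 = rng.normal(size=D), rng.normal(size=D), rng.normal(size=(D, D))
var_emb = {"heart_rate": rng.normal(size=D), "sbp": rng.normal(size=D)}

# Each triplet's embedding combines time, variable, and value embeddings;
# the resulting *set* of vectors is what the Transformer's multi-head
# attention layers would contextualize.
embedded = np.stack([
    cve(t, W1, b1, W2) + var_emb[v] + cve(x, W1, b1, W2)
    for t, v, x in triplets
])
print(embedded.shape)  # (3, 8)
```

Note that the triplet set can have any length per patient, which is what lets the model skip the padding/imputation step that a dense matrix representation would force.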
Pages: 17
Related Papers
50 in total
  • [1] Compatible Transformer for Irregularly Sampled Multivariate Time Series
    Wei, Yuxi
    Peng, Juntong
    He, Tong
    Xu, Chenxin
    Zhang, Jian
    Pan, Shirui
    Chen, Siheng
    [J]. arXiv, 2023,
  • [2] TransEHR: Self-Supervised Transformer for Clinical Time Series Data
    Xu, Yanbo
    Xu, Shangqing
    Ramprassad, Manav
    Tumanov, Alexey
    Zhang, Chao
    [J]. MACHINE LEARNING FOR HEALTH, ML4H, VOL 225, 2023, 225 : 623 - 635
  • [3] Self-supervised Classification of Clinical Multivariate Time Series using Time Series Dynamics
    Yehuda, Yakir
    Freedman, Daniel
    Radinsky, Kira
    [J]. PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 5416 - 5427
  • [4] Irregularly sampled seismic data interpolation with self-supervised learning
    Fang, Wenqian
    Fu, Lihua
    Wu, Mengyi
    Yue, Jingnan
    Li, Hongwei
    [J]. GEOPHYSICS, 2023, 88 (03) : V175 - V185
  • [5] Time Series as Images: Vision Transformer for Irregularly Sampled Time Series
    Li, Zekun
    Li, Shiyang
    Yan, Xifeng
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [6] Mixture of multivariate Gaussian processes for classification of irregularly sampled satellite image time-series
    Constantin, Alexandre
    Fauvel, Mathieu
    Girard, Stephane
    [J]. STATISTICS AND COMPUTING, 2022, 32 (05)
  • [8] Contrastive learning based self-supervised time-series analysis
    Poppelbaum, Johannes
    Chadha, Gavneet Singh
    Schwung, Andreas
    [J]. APPLIED SOFT COMPUTING, 2022, 117
  • [9] Self-supervised pre-training on industrial time-series
    Biggio, Luca
    Kastanis, Iason
    [J]. 2021 8TH SWISS CONFERENCE ON DATA SCIENCE, SDS, 2021, : 56 - 57
  • [10] Compatible Transformer for Irregularly Sampled Multivariate Time Series
    Wei, Yuxi
    Peng, Juntong
    He, Tong
    Xu, Chenxin
    Zhang, Jian
    Pan, Shirui
    Chen, Siheng
    [J]. 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, ICDM 2023, 2023, : 1409 - 1414