SAITS: Self-attention-based imputation for time series

Cited by: 49
Authors
Du, Wenjie [1]
Cote, David [2]
Liu, Yan [1]
Affiliations
[1] Concordia Univ, Montreal, PQ H3G 1M8, Canada
[2] Ciena Corp, Ottawa, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Time series; Missing values; Imputation model; Self-attention; Neural network; Multiple imputation
DOI
10.1016/j.eswa.2023.119619
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Missing data in time series is a pervasive problem that obstructs advanced analysis. A popular solution is imputation, where the fundamental challenge is to determine what values should be filled in. This paper proposes SAITS, a novel method based on the self-attention mechanism for missing value imputation in multivariate time series. Trained with a joint-optimization approach, SAITS learns missing values from a weighted combination of two diagonally-masked self-attention (DMSA) blocks. DMSA explicitly captures both the temporal dependencies and feature correlations between time steps, which improves imputation accuracy and training speed. Meanwhile, the weighted-combination design enables SAITS to dynamically assign weights to the representations learned by the two DMSA blocks according to the attention map and the missingness information. Extensive experiments quantitatively and qualitatively demonstrate that SAITS efficiently outperforms state-of-the-art methods on the time-series imputation task and reveal SAITS' potential to improve the learning performance of pattern recognition models on incomplete real-world time-series data.
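To make the diagonally-masked self-attention (DMSA) idea in the abstract concrete, below is a minimal sketch of a single-head DMSA layer in PyTorch. It only illustrates the diagonal-masking step: each time step is blocked from attending to itself, so its representation must be reconstructed from the other time steps. The class name, single-head formulation, and hyperparameters are illustrative assumptions and do not reproduce the authors' full SAITS architecture.

    # Illustrative sketch only (PyTorch); not the authors' implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DiagonallyMaskedSelfAttention(nn.Module):
        """Single-head self-attention whose score matrix has a masked diagonal,
        so each time step is reconstructed from the other time steps."""

        def __init__(self, d_model: int):
            super().__init__()
            self.q_proj = nn.Linear(d_model, d_model)
            self.k_proj = nn.Linear(d_model, d_model)
            self.v_proj = nn.Linear(d_model, d_model)
            self.scale = d_model ** 0.5

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, n_steps, d_model)
            q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
            scores = torch.matmul(q, k.transpose(-2, -1)) / self.scale  # (batch, n_steps, n_steps)
            n_steps = scores.size(-1)
            diag = torch.eye(n_steps, dtype=torch.bool, device=scores.device)
            scores = scores.masked_fill(diag, float("-inf"))  # block attention to self
            attn = F.softmax(scores, dim=-1)
            return torch.matmul(attn, v)

    # Usage sketch: 8 series, 24 time steps, features already projected to d_model = 32
    # (input projection and output layers omitted).
    layer = DiagonallyMaskedSelfAttention(d_model=32)
    out = layer(torch.randn(8, 24, 32))

In SAITS proper, two such DMSA blocks are combined and their outputs weighted using the attention map and the missingness mask; that weighted-combination and the joint-optimization training are intentionally left out of this sketch.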
Pages: 15
Related Papers
50 items in total
  • [31] Temporal Self-Attention-Based Residual Network for An Environmental Sound Classification
    Tripathi, Achyut Mani
    Paul, Konark
    [J]. INTERSPEECH 2022, 2022, : 1516 - 1520
  • [32] PROXIMITY-AWARE SELF-ATTENTION-BASED SEQUENTIAL LOCATION RECOMMENDATION
    Luo, Xuan
    Huang, Mingqing
    Lv, Rui
    Zhao, Hui
    [J]. INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING, INFORMATION AND CONTROL, 2024, 20 (05): 1277 - 1299
  • [33] Attention-based generative adversarial networks for aquaponics environment time series data imputation
    Zhong, K.
    Sun, X.
    Liu, G.
    Jiang, Y.
    Ouyang, Y.
    Wang, Y.
    [J]. Information Processing in Agriculture, 2024, 11 (04) : 542 - 551
  • [34] Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein GAN and attention
    Festag, Sven
    Spreckelsen, Cord
    [J]. JOURNAL OF BIOMEDICAL INFORMATICS, 2023, 139
  • [35] Syntax-Enhanced Self-Attention-Based Semantic Role Labeling
    Zhang, Yue
    Wang, Rui
    Si, Luo
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 616 - 626
  • [36] A dual-head attention model for time series data imputation
    Zhang, Yifan
    Thorburn, Peter J.
    [J]. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2021, 189
  • [37] Text Simplification with Self-Attention-Based Pointer-Generator Networks
    Li, Tianyu
    Li, Yun
    Qiang, Jipeng
    Yuan, Yun-Hao
    [J]. NEURAL INFORMATION PROCESSING (ICONIP 2018), PT V, 2018, 11305 : 537 - 545
  • [38] A self-attention-based fusion framework for facial expression recognition in wavelet domain
    Indolia, Sakshi
    Nigam, Swati
    Singh, Rajiv
    [J]. VISUAL COMPUTER, 2024, 40 (09): 6341 - 6357
  • [39] Self-Attention-Based Deep Feature Fusion for Remote Sensing Scene Classification
    Cao, Ran
    Fang, Leyuan
    Lu, Ting
    He, Nanjun
    [J]. IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2021, 18 (01) : 43 - 47
  • [40] A Dynamic Self-Attention-Based Fault Diagnosis Method for Belt Conveyor Idlers
    Liu, Yi
    Miao, Changyun
    Li, Xianguo
    Ji, Jianhua
    Meng, Dejun
    Wang, Yimin
    [J]. MACHINES, 2023, 11 (02)