SAITS: Self-attention-based imputation for time series

Cited by: 49
Authors
Du, Wenjie [1 ]
Cote, David [2 ]
Liu, Yan [1 ]
Affiliations
[1] Concordia Univ, Montreal, PQ H3G 1M8, Canada
[2] Ciena Corp, Ottawa, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Time series; Missing values; Imputation model; Self-attention; Neural network; MULTIPLE IMPUTATION;
DOI
10.1016/j.eswa.2023.119619
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Missing data in time series is a pervasive problem that puts obstacles in the way of advanced analysis. A popular solution is imputation, where the fundamental challenge is to determine what values should be filled in. This paper proposes SAITS, a novel method based on the self-attention mechanism for missing value imputation in multivariate time series. Trained by a joint-optimization approach, SAITS learns missing values from a weighted combination of two diagonally-masked self-attention (DMSA) blocks. DMSA explicitly captures both the temporal dependencies and feature correlations between time steps, which improves imputation accuracy and training speed. Meanwhile, the weighted-combination design enables SAITS to dynamically assign weights to the representations learned by the two DMSA blocks according to the attention map and the missingness information. Extensive experiments quantitatively and qualitatively demonstrate that SAITS efficiently outperforms state-of-the-art methods on the time-series imputation task and reveal SAITS' potential to improve the learning performance of pattern recognition models on incomplete real-world time-series data.
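
As a concrete illustration of the diagonal-masking idea described in the abstract: the sketch below masks the diagonal of the self-attention score matrix so that each time step cannot attend to its own (possibly missing) input and must be reconstructed from the other time steps. This is a minimal, single-head, PyTorch-style sketch with illustrative function names and shapes; it is not the authors' reference implementation of SAITS.

    # Minimal sketch of diagonally-masked self-attention (DMSA); illustrative only.
    import torch

    def diagonally_masked_self_attention(x, w_q, w_k, w_v):
        """x: (batch, steps, d_model); w_q/w_k/w_v: (d_model, d_k) projection weights."""
        q, k, v = x @ w_q, x @ w_k, x @ w_v
        scores = q @ k.transpose(-2, -1) / (k.size(-1) ** 0.5)  # (batch, steps, steps)
        # Mask the diagonal so a time step cannot attend to itself and must be
        # estimated from the other time steps -- the core idea of DMSA.
        diag = torch.eye(x.size(1), dtype=torch.bool, device=x.device)
        scores = scores.masked_fill(diag, float("-inf"))
        attn = torch.softmax(scores, dim=-1)
        return attn @ v, attn

    # Usage with random data (hypothetical shapes): 2 samples, 48 steps, 64 features.
    x = torch.randn(2, 48, 64)
    w_q, w_k, w_v = (torch.randn(64, 64) for _ in range(3))
    out, attn_map = diagonally_masked_self_attention(x, w_q, w_k, w_v)
    print(out.shape, attn_map.shape)  # torch.Size([2, 48, 64]) torch.Size([2, 48, 48])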
Pages: 15
Related Papers
50 records in total
  • [1] Dual-branch cross-dimensional self-attention-based imputation model for multivariate time series
    Fang, Le
    Xiang, Wei
    Zhou, Yuan
    Fang, Juan
    Chi, Lianhua
    Ge, Zongyuan
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 279
  • [2] Spatiotemporal Self-Attention-Based LSTNet for Multivariate Time Series Prediction
    Wang, Dezheng
    Chen, Congyan
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2023, 2023
  • [3] A Self-Attention-Based Imputation Technique for Enhancing Tabular Data Quality
    Lee, Do-Hoon
    Kim, Han-joon
    [J]. DATA, 2023, 8 (06)
  • [4] Missing Value Imputation for Radar-Derived Time-Series Tracks of Aerial Targets Based on Improved Self-Attention-Based Network
    Song, Zihao
    Zhou, Yan
    Cheng, Wei
    Liang, Futai
    Zhang, Chenhao
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (03): 3349 - 3376
  • [5] Self-Attention-Based Multivariate Anomaly Detection for CPS Time Series Data with Adversarial Autoencoders
    Li, Qiwen
    Yan, Tijin
    Yuan, Huanhuan
    Xia, Yuanqing
    [J]. 2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 4251 - 4256
  • [6] Temporal self-attention-based Conv-LSTM network for multivariate time series prediction
    Fu, En
    Zhang, Yinong
    Yang, Fan
    Wang, Shuying
    [J]. NEUROCOMPUTING, 2022, 501 : 162 - 173
  • [7] Self-attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Bian, Feng
    Wan, Huaiyu
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (11): 8737 - 8754
  • [8] Self-attention-based Group Recommendation
    Yang, Xiaoping
    Shi, Yuliang
    [J]. PROCEEDINGS OF 2020 IEEE 4TH INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC 2020), 2020, : 2540 - 2546
  • [9] STING: Self-attention based Time-series Imputation Networks using GAN
    Oh, Eunkyu
    Kim, Taehun
    Ji, Yunhu
    Khyalia, Sushil
    [J]. 2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021, : 1264 - 1269