SAITS: Self-attention-based imputation for time series

Cited by: 49
Authors
Du, Wenjie [1 ]
Cote, David [2 ]
Liu, Yan [1 ]
Affiliations
[1] Concordia Univ, Montreal, PQ H3G 1M8, Canada
[2] Ciena Corp, Ottawa, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Time series; Missing values; Imputation model; Self-attention; Neural network; MULTIPLE IMPUTATION;
DOI
10.1016/j.eswa.2023.119619
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Missing data in time series is a pervasive problem that obstructs advanced analysis. A popular solution is imputation, where the fundamental challenge is to determine what values should be filled in. This paper proposes SAITS, a novel method based on the self-attention mechanism for missing value imputation in multivariate time series. Trained with a joint-optimization approach, SAITS learns missing values from a weighted combination of two diagonally-masked self-attention (DMSA) blocks. DMSA explicitly captures both the temporal dependencies and the feature correlations between time steps, which improves imputation accuracy and training speed. Meanwhile, the weighted-combination design enables SAITS to dynamically assign weights to the representations learned by the two DMSA blocks according to the attention map and the missingness information. Extensive experiments quantitatively and qualitatively demonstrate that SAITS efficiently outperforms state-of-the-art methods on the time-series imputation task and reveal SAITS' potential to improve the learning performance of pattern-recognition models on incomplete real-world time-series data.
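As an illustration of the mechanism described in the abstract, the following is a minimal sketch (in PyTorch, assumed here; it is not the authors' released implementation) of diagonally-masked self-attention: ordinary scaled dot-product self-attention whose score matrix has its diagonal masked out, so each time step must be reconstructed from the other time steps rather than from itself. The module name, tensor shapes, and dimensions are illustrative assumptions, and the weighted combination of two DMSA blocks and the joint-optimization training are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Minimal, hypothetical sketch of a diagonally-masked self-attention (DMSA) layer.
class DiagonallyMaskedSelfAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** 0.5

    def forward(self, x):
        # x: (batch, time_steps, d_model), an embedded multivariate time series
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) / self.scale  # (batch, T, T)
        # Mask the diagonal: a time step may not attend to itself, so any value
        # imputed at that step is estimated from the remaining time steps.
        diag = torch.eye(x.size(1), dtype=torch.bool, device=x.device)
        scores = scores.masked_fill(diag, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)

# Example usage: 8 series, 24 time steps, 64-dimensional embeddings.
x = torch.randn(8, 24, 64)
out = DiagonallyMaskedSelfAttention(64)(x)
print(out.shape)  # torch.Size([8, 24, 64])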
Pages: 15
Related Papers
50 records in total
  • [1] Dual-branch cross-dimensional self-attention-based imputation model for multivariate time series
    Fang, Le
    Xiang, Wei
    Zhou, Yuan
    Fang, Juan
    Chi, Lianhua
    Ge, Zongyuan
    KNOWLEDGE-BASED SYSTEMS, 2023, 279
  • [2] Spatiotemporal Self-Attention-Based LSTNet for Multivariate Time Series Prediction
    Wang, Dezheng
    Chen, Congyan
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2023, 2023
  • [3] A Self-Attention-Based Imputation Technique for Enhancing Tabular Data Quality
    Lee, Do-Hoon
    Kim, Han-joon
    DATA, 2023, 8 (06)
  • [4] Missing Value Imputation for Radar-Derived Time-Series Tracks of Aerial Targets Based on Improved Self-Attention-Based Network
    Song, Zihao
    Zhou, Yan
    Cheng, Wei
    Liang, Futai
    Zhang, Chenhao
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (03): 3349-3376
  • [5] Self-attention-based graph transformation learning for anomaly detection in multivariate time series
    Wang, Qiushi
    Zhu, Yueming
    Sun, Zhicheng
    Li, Dong
    Ma, Yunbin
    COMPLEX & INTELLIGENT SYSTEMS, 2025, 11 (5)
  • [6] Self-Attention-Based Multivariate Anomaly Detection for CPS Time Series Data with Adversarial Autoencoders
    Li, Qiwen
    Yan, Tijin
    Yuan, Huanhuan
    Xia, Yuanqing
    2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022: 4251-4256
  • [7] Temporal self-attention-based Conv-LSTM network for multivariate time series prediction
    Fu, En
    Zhang, Yinong
    Yang, Fan
    Wang, Shuying
    NEUROCOMPUTING, 2022, 501: 162-173
  • [8] Self-attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Bian, Feng
    Wan, Huaiyu
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (11): 8737-8754
  • [9] Self-attention-based Group Recommendation
    Yang, Xiaoping
    Shi, Yuliang
    PROCEEDINGS OF 2020 IEEE 4TH INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC 2020), 2020: 2540-2546