SAITS: Self-attention-based imputation for time series

Cited by: 49
Authors
Du, Wenjie [1 ]
Cote, David [2 ]
Liu, Yan [1 ]
Affiliations
[1] Concordia Univ, Montreal, PQ H3G 1M8, Canada
[2] Ciena Corp, Ottawa, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Time series; Missing values; Imputation model; Self-attention; Neural network; Multiple imputation;
DOI
10.1016/j.eswa.2023.119619
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Missing data in time series is a pervasive problem that puts obstacles in the way of advanced analysis. A popular solution is imputation, where the fundamental challenge is to determine what values should be filled in. This paper proposes SAITS, a novel method based on the self-attention mechanism for missing value imputation in multivariate time series. Trained by a joint-optimization approach, SAITS learns missing values from a weighted combination of two diagonally-masked self-attention (DMSA) blocks. DMSA explicitly captures both the temporal dependencies and feature correlations between time steps, which improves imputation accuracy and training speed. Meanwhile, the weighted-combination design enables SAITS to dynamically assign weights to the learned representations from the two DMSA blocks according to the attention map and the missingness information. Extensive experiments quantitatively and qualitatively demonstrate that SAITS efficiently outperforms state-of-the-art methods on the time-series imputation task and reveal its potential to improve the learning performance of pattern-recognition models on incomplete real-world time-series data.
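The abstract describes two architectural ideas: diagonally-masked self-attention (DMSA), in which a time step may not attend to itself and so must be reconstructed from the other time steps, and a learned weighted combination of the representations produced by two DMSA blocks. The sketch below illustrates only the diagonal-masking idea as a single-head PyTorch module. It is a minimal, assumption-laden simplification, not the authors' implementation (which uses multi-head attention, stacked DMSA blocks, and the joint-optimization training mentioned above); the class name DiagonallyMaskedSelfAttention and its interface are chosen here purely for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class DiagonallyMaskedSelfAttention(nn.Module):
    # Illustrative single-head self-attention whose attention matrix has its
    # diagonal masked, so each time step is reconstructed from the other
    # time steps rather than from its own (possibly missing) input.

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -- embedded series in which missing
        # entries are assumed to have been filled with placeholders upstream.
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        # Mask the diagonal: no time step may attend to itself.
        diag = torch.eye(x.size(1), dtype=torch.bool, device=x.device)
        scores = scores.masked_fill(diag, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)


# Minimal usage example: 32 series of 24 steps embedded into 64 dimensions.
block = DiagonallyMaskedSelfAttention(d_model=64)
out = block(torch.randn(32, 24, 64))   # out.shape == (32, 24, 64)

Under this reading, the diagonal mask is what forces each step's estimate to be inferred from the remaining steps, which is how the abstract's claim of explicitly capturing temporal dependencies and feature correlations between time steps can be understood.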
Pages: 15
Related Papers
50 records in total
• [31] Jung, Seungwon; Moon, Jaeuk; Park, Sungwoo; Hwang, Eenjun. Self-Attention-Based Deep Learning Network for Regional Influenza Forecasting. IEEE Journal of Biomedical and Health Informatics, 2022, 26(2): 922-933.
• [32] Zhang, Yue; Wang, Rui; Si, Luo. Syntax-Enhanced Self-Attention-Based Semantic Role Labeling. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019), 2019: 616-626.
• [33] Festag, Sven; Spreckelsen, Cord. Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein GAN and attention. Journal of Biomedical Informatics, 2023, 139.
• [34] Zhong, Keyang; Sun, Xueqian; Liu, Gedi; Jiang, Yifeng; Ouyang, Yi; Wang, Yang. Attention-based generative adversarial networks for aquaponics environment time series data imputation. Information Processing in Agriculture, 2024, 11(4): 542-551.
• [35] Zhang, Yifan; Thorburn, Peter J. A dual-head attention model for time series data imputation. Computers and Electronics in Agriculture, 2021, 189.
• [36] Li, Tianyu; Li, Yun; Qiang, Jipeng; Yuan, Yun-Hao. Text Simplification with Self-Attention-Based Pointer-Generator Networks. Neural Information Processing (ICONIP 2018), Part V, 2018, 11305: 537-545.
• [37] Kocakulak, Mustafa; Avci, Adem; Acir, Nurettin. Automated vein verification using self-attention-based convolutional neural networks. Expert Systems with Applications, 2023, 230.
• [38] Gao, Peng; Yue, Xiao; Chen, Wei; Chen, Dufeng; Wang, Li; Zhang, Tingxiu. Self-attention-Based Dual-Branch Person Re-identification. Advanced Intelligent Computing Technology and Applications (ICIC 2024), Part IV, 2024, 14865: 210-219.
• [39] Liu, Yi; Miao, Changyun; Li, Xianguo; Ji, Jianhua; Meng, Dejun; Wang, Yimin. A Dynamic Self-Attention-Based Fault Diagnosis Method for Belt Conveyor Idlers. Machines, 2023, 11(2).
• [40] Cao, Ran; Fang, Leyuan; Lu, Ting; He, Nanjun. Self-Attention-Based Deep Feature Fusion for Remote Sensing Scene Classification. IEEE Geoscience and Remote Sensing Letters, 2021, 18(1): 43-47.