SAITS: Self-attention-based imputation for time series

Cited by: 49
Authors
Du, Wenjie [1 ]
Cote, David [2 ]
Liu, Yan [1 ]
Affiliations
[1] Concordia Univ, Montreal, PQ H3G 1M8, Canada
[2] Ciena Corp, Ottawa, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Time series; Missing values; Imputation model; Self-attention; Neural network; MULTIPLE IMPUTATION;
DOI
10.1016/j.eswa.2023.119619
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Missing data in time series is a pervasive problem that puts obstacles in the way of advanced analysis. A popular solution is imputation, where the fundamental challenge is to determine what values should be filled in. This paper proposes SAITS, a novel method based on the self-attention mechanism for missing value imputation in multivariate time series. Trained by a joint-optimization approach, SAITS learns missing values from a weighted combination of two diagonally-masked self-attention (DMSA) blocks. DMSA explicitly captures both the temporal dependencies and feature correlations between time steps, which improves imputation accuracy and training speed. Meanwhile, the weighted-combination design enables SAITS to dynamically assign weights to the learned representations from two DMSA blocks according to the attention map and the missingness information. Extensive experiments quantitatively and qualitatively demonstrate that SAITS outperforms the state-of-the-art methods on the time-series imputation task efficiently and reveal SAITS' potential to improve the learning performance of pattern recognition models on incomplete time-series data from the real world.
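The diagonal masking idea described in the abstract can be illustrated in a few lines: zeroing out each time step's attention to itself forces the model to reconstruct every step from the *other* steps, which is what makes the mechanism useful for imputation. The following is a minimal NumPy sketch of that masking trick only (function and variable names are ours, and the real SAITS model uses learned query/key/value projections and multiple heads, which are omitted here):

```python
import numpy as np

def diagonally_masked_attention(x):
    """Scaled dot-product self-attention over time steps with the
    diagonal masked out, so each step attends only to other steps.
    x: array of shape (T, d) -- T time steps, d features.
    Returns the attended output and the (T, T) attention weights.
    Illustrative sketch only, not the authors' implementation."""
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)            # (T, T) similarity scores
    np.fill_diagonal(scores, -np.inf)        # mask self-attention on the diagonal
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ x, weights              # each step = mix of the other steps
```

Because `exp(-inf) = 0`, the diagonal of the resulting weight matrix is exactly zero while each row still sums to one, so a missing step's representation is built entirely from its neighbors.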
Pages: 15
Related Papers
50 records in total
  • [41] Self-attention-Based Dual-Branch Person Re-identification
    Gao, Peng
    Yue, Xiao
    Chen, Wei
    Chen, Dufeng
    Wang, Li
    Zhang, Tingxiu
    [J]. ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT IV, ICIC 2024, 2024, 14865 : 210 - 219
  • [42] Enhancing heart disease prediction using a self-attention-based transformer model
    Rahman, Atta Ur
    Alsenani, Yousef
    Zafar, Adeel
    Ullah, Kalim
    Rabie, Khaled
    Shongwe, Thokozani
    [J]. SCIENTIFIC REPORTS, 2024, 14 (01)
  • [43] Hierarchical multimodal self-attention-based graph neural network for DTI prediction
    Bian, Jilong
    Lu, Hao
    Dong, Guanghui
    Wang, Guohua
    [J]. BRIEFINGS IN BIOINFORMATICS, 2024, 25 (04)
  • [44] Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts
    Srivastava, Prashant
    Bej, Saptarshi
    Yordanova, Kristina
    Wolkenhauer, Olaf
    [J]. BIOMOLECULES, 2021, 11 (11)
  • [45] A Self-Attention-Based Deep Reinforcement Learning Approach for AGV Dispatching Systems
    Wei, Qinglai
    Yan, Yutian
    Zhang, Jie
    Xiao, Jun
    Wang, Cong
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (06) : 7911 - 7922
  • [46] Mask-Based Neural Beamforming for Moving Speakers With Self-Attention-Based Tracking
    Ochiai, Tsubasa
    Delcroix, Marc
    Nakatani, Tomohiro
    Araki, Shoko
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 835 - 848
  • [47] An eXplainable Self-Attention-Based Spatial–Temporal Analysis for Human Activity Recognition
    Meena, Tanushree
    Sarawadekar, Kishor
    [J]. IEEE SENSORS JOURNAL, 2024, 24 (01) : 635 - 644
  • [48] A Self-Attention-Based Neural Network for Predicting Immune Checkpoint Inhibitors Response
    Liu, J.
    Islam, M. T.
    Xing, L.
    [J]. INTERNATIONAL JOURNAL OF RADIATION ONCOLOGY BIOLOGY PHYSICS, 2023, 117 (02): : E475 - E476
  • [49] Enhancing heart disease prediction using a self-attention-based transformer model
    Atta Ur Rahman
    Yousef Alsenani
    Adeel Zafar
    Kalim Ullah
    Khaled Rabie
    Thokozani Shongwe
    [J]. SCIENTIFIC REPORTS, 2024, 14 (01)
  • [50] A self-attention-based fusion framework for facial expression recognition in wavelet domain
    Indolia, Sakshi
    Nigam, Swati
    Singh, Rajiv
    [J]. VISUAL COMPUTER, 40 (09): 6341 - 6357