Self-Attentive Moving Average for Time Series Prediction

Cited: 7
Authors
Su, Yaxi [1 ]
Cui, Chaoran [1 ]
Qu, Hao [2 ]
Affiliations
[1] Shandong Univ Finance & Econ, Sch Comp Sci & Technol, 7366 East Second Ring Rd, Jinan 250014, Peoples R China
[2] Shandong Univ, Sch Software, Shunhua Rd, Jinan 250101, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Issue 07
Funding
National Natural Science Foundation of China;
Keywords
time series prediction; self-attention mechanism; moving average; multi-scale indicator bilinear fusion; OPPORTUNITIES; CHALLENGES; NETWORKS;
DOI
10.3390/app12073602
Chinese Library Classification
O6 [Chemistry];
Subject Classification Code
0703 ;
Abstract
Time series prediction has been studied for decades due to its potential in a wide range of applications. As one of the most popular technical indicators, the moving average summarizes the overall changing patterns over a past period and is frequently used to predict the future trend of a time series. However, traditional moving average indicators average the time series data with equal or predefined weights, ignoring the subtle differences in the importance of different time steps. Moreover, the same fixed weights are applied across different time series, regardless of the differences in their inherent characteristics. In addition, when moving averages of different scales are used to predict future trends, the interactions between the dimensions of different indicators are ignored. In this paper, we propose a learning-based moving average indicator, called the self-attentive moving average (SAMA). After encoding the input signals of a time series with recurrent neural networks, we introduce the self-attention mechanism to adaptively determine the data weights at different time steps for calculating the moving average. Furthermore, we use multiple self-attention heads to model SAMA indicators of different scales, and finally combine them through a bilinear fusion network for time series prediction. Extensive experiments on two real-world datasets demonstrate the effectiveness of our approach. The data and code of our work have been released.
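To make the core idea concrete, the following is a minimal, hypothetical NumPy sketch of an attention-weighted moving average. It is not the authors' implementation: the single-query formulation, the projection matrices `Wq` and `Wk`, and the use of raw inputs instead of RNN encodings are all simplifying assumptions, and the multi-head, multi-scale bilinear fusion described in the abstract is omitted.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def self_attentive_moving_average(x, Wq, Wk):
    """Attention-weighted moving average over a window.

    x  : (T, d) array of encoded time steps (assumed; the paper
         encodes inputs with an RNN first).
    Wq, Wk : (d, d) learned projection matrices (hypothetical names).
    Returns a (d,) weighted average of the T steps.
    """
    q = x[-1] @ Wq                       # query from the most recent step
    k = x @ Wk                           # one key per past step
    scores = k @ q / np.sqrt(x.shape[1]) # scaled dot-product attention
    w = softmax(scores)                  # adaptive per-step data weights
    return w @ x                         # replaces the equal-weight average
```

Unlike a simple moving average, the weights `w` here depend on the data itself, so different time steps (and different series) receive different importance; if all steps are identical, the result degenerates to the ordinary average.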
Pages: 12