Time-Series Neural Network: A High-Accuracy Time-Series Forecasting Method Based on Kernel Filter and Time Attention

Cited: 7
Authors
Zhang, Lexin [1 ]
Wang, Ruihan [1 ]
Li, Zhuoyuan [1 ]
Li, Jiaxun [1 ]
Ge, Yichen [1 ]
Wa, Shiyun [2 ]
Huang, Sirui [1 ]
Lv, Chunli [1 ]
Affiliations
[1] China Agr Univ, Beijing 100083, Peoples R China
[2] Imperial Coll London, Appl Computat Sci & Engn, South Kensington Campus, London SW7 2AZ, England
Funding
National Natural Science Foundation of China;
Keywords
time-series forecasting; deep learning; time-series neural network; time attention; MACHINE; MODELS;
DOI
10.3390/info14090500
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This research introduces a novel high-accuracy time-series forecasting method, namely the Time Neural Network (TNN), which is based on a kernel filter and a time attention mechanism. The TNN model is designed and implemented to account for the complex characteristics of time-series data, such as non-linearity, high dimensionality, and long-term dependence. The key innovations of the TNN model lie in the incorporation of the time attention mechanism and the kernel filter, which allow the model to allocate different weights to the features at each time point and to extract high-level features from the time-series data, thereby improving predictive accuracy. Additionally, an adaptive weight generator is integrated into the model, enabling it to automatically adjust weights based on the input features. Mainstream time-series forecasting models such as Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks are employed as baselines, and comprehensive comparative experiments are conducted. The results indicate that the TNN model significantly outperforms the baseline models in both long-term and short-term prediction tasks; specifically, the RMSE, MAE, and R² reach 0.05, 0.23, and 0.95, respectively. Remarkably, even for complex time-series data containing a large amount of noise, the TNN model maintains a high prediction accuracy.
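The abstract names three components (a kernel filter, a time attention mechanism, and an adaptive weight generator) without giving implementation details. The PyTorch sketch below is only one plausible, hypothetical realization of those ideas for illustration: the class name `TimeAttentionTNNSketch`, the layer sizes, and the exact wiring are assumptions, not the authors' published code.

```python
# Hypothetical sketch of the TNN ideas from the abstract (kernel filter +
# time attention + adaptive weight generator). Not the authors' code.
import torch
import torch.nn as nn


class TimeAttentionTNNSketch(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, kernel_size: int = 3):
        super().__init__()
        # "Kernel filter": assumed here to be a 1-D convolution that extracts
        # higher-level features along the time axis.
        self.kernel_filter = nn.Conv1d(
            n_features, hidden, kernel_size, padding=kernel_size // 2
        )
        # "Adaptive weight generator": produces per-feature weights from the
        # raw input so features can be re-weighted adaptively.
        self.weight_gen = nn.Sequential(
            nn.Linear(n_features, n_features), nn.Sigmoid()
        )
        # "Time attention": scores each time step and softmax-normalizes the
        # scores so different time points receive different weights.
        self.attn_score = nn.Linear(hidden, 1)
        self.head = nn.Linear(hidden, 1)  # one-step-ahead forecast

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features)
        x = x * self.weight_gen(x)                       # adaptive feature weights
        h = self.kernel_filter(x.transpose(1, 2))        # (batch, hidden, time)
        h = torch.relu(h).transpose(1, 2)                # (batch, time, hidden)
        attn = torch.softmax(self.attn_score(h), dim=1)  # weights per time step
        context = (attn * h).sum(dim=1)                  # weighted sum over time
        return self.head(context).squeeze(-1)            # (batch,)


# Usage sketch: forecast the next value from a window of 24 steps x 8 features.
model = TimeAttentionTNNSketch(n_features=8)
window = torch.randn(32, 24, 8)
prediction = model(window)  # shape: (32,)
```

In this reading, the attention weights are normalized over the time dimension, which matches the abstract's description of allocating different weights to features at each time point; other placements of the attention block would be equally consistent with the text.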
Pages: 18