DESCINet: A hierarchical deep convolutional neural network with skip connection for long time series forecasting

Cited by: 7
Authors
Silva, Andre Quintiliano Bezerra [1 ]
Goncalves, Wesley Nunes [2 ]
Matsubara, Edson Takashi [2 ]
Affiliations
[1] Fed Inst Educ Sci & Technol Mato Grosso Do Sul, BR-79240000 Campo Grande, MS, Brazil
[2] Univ Fed Mato Grosso do Sul, BR-79010210 Campo Grande, MS, Brazil
Keywords
Long sequence time-series forecasting; Skip connections; Convolutional neural networks (CNNs); Deep learning (DL); SINGULARITIES; PREDICTION; DYNAMICS; LSTM;
DOI
10.1016/j.eswa.2023.120246
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Time series forecasting is the process of predicting future values of a time series from knowledge of its past data. Although several models exist for short-term prediction, forecasting long temporal sequences remains an open problem. Recent studies have applied Transformer-based solutions to long time-series forecasting and achieved good results. In time series modeling, the goal is to capture the temporal relationships within an ordered, continuous sequence of points. While positional encoding and embedding sub-series as tokens can help Transformers preserve some ordering information, their permutation-invariant self-attention mechanism can still lead to a loss of temporal information. In this paper, we use convolutional networks arranged in a binary tree structure, with skip connections between the levels of the tree, which allows greater precision and more efficient training. The proposed model was evaluated on five real-life datasets. Experimental results show that it significantly improves forecast accuracy relative to existing solutions.
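The abstract describes the architecture only at a high level. As an illustration of the general idea, the sketch below (plain PyTorch, not the authors' code) shows one way a binary tree of 1-D convolutional blocks with skip connections between levels could be wired for forecasting; the even/odd splitting scheme, the module names (ConvBlock, TreeLevel, TreeCNNForecaster), and all layer sizes are assumptions made for the example.

# Minimal, hypothetical PyTorch sketch of a binary-tree arrangement of 1-D
# convolutional blocks with skip connections between tree levels. Module
# names, the even/odd splitting scheme, and all sizes are illustrative
# assumptions, not the published DESCINet implementation.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Small 1-D convolutional block applied to one sub-series."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.LeakyReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
        )

    def forward(self, x):  # x: (batch, channels, length)
        return self.net(x)


class TreeLevel(nn.Module):
    """One level of the binary tree: split the series into even/odd
    sub-series, let them update each other through ConvBlocks, recurse,
    then merge with a skip connection back to the level's input."""

    def __init__(self, channels: int, depth: int):
        super().__init__()
        self.even_block = ConvBlock(channels)
        self.odd_block = ConvBlock(channels)
        self.children_ = (
            nn.ModuleList([TreeLevel(channels, depth - 1) for _ in range(2)])
            if depth > 1 else None
        )

    def forward(self, x):
        even, odd = x[..., ::2], x[..., 1::2]    # binary split along time
        even = even + self.even_block(odd)       # cross-updates with residuals
        odd = odd + self.odd_block(even)
        if self.children_ is not None:           # recurse down the tree
            even = self.children_[0](even)
            odd = self.children_[1](odd)
        out = torch.zeros_like(x)                # interleave back in order
        out[..., ::2] = even
        out[..., 1::2] = odd
        return out + x                           # skip connection across the level


class TreeCNNForecaster(nn.Module):
    """Binary-tree encoder followed by a linear head mapping the processed
    look-back window to the forecast horizon."""

    def __init__(self, channels: int, lookback: int, horizon: int, depth: int = 3):
        super().__init__()
        self.tree = TreeLevel(channels, depth)
        self.head = nn.Linear(lookback, horizon)

    def forward(self, x):  # x: (batch, channels, lookback)
        return self.head(self.tree(x))


# Example: 32 windows of a 7-variable series, look-back 96, horizon 48.
# The look-back length must be divisible by 2**depth for the even/odd splits.
model = TreeCNNForecaster(channels=7, lookback=96, horizon=48, depth=3)
forecast = model(torch.randn(32, 7, 96))
print(forecast.shape)  # torch.Size([32, 7, 48])

The residual additions inside and across each level stand in for the skip connections mentioned in the abstract; the published model may split and recombine the sub-series differently.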
Pages: 12
Related papers
50 records in total
  • [21] Dilated convolutional neural networks for time series forecasting
    Borovykh, Anastasia
    Bohte, Sander
    Oosterlee, Cornelis W.
    JOURNAL OF COMPUTATIONAL FINANCE, 2019, 22 (04) : 73 - 101
  • [22] Convolutional Neural Networks for Energy Time Series Forecasting
    Koprinska, Irena
    Wu, Dengsong
    Wang, Zheng
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [23] 1D Quantum Convolutional Neural Network for Time Series Forecasting and Classification
    Alejandra Rivera-Ruiz, Mayra
    Leticia Juarez-Osorio, Sandra
    Mendez-Vazquez, Andres
    Mauricio Lopez-Romero, Jose
    Rodriguez-Tello, Eduardo
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, MICAI 2023, PT I, 2024, 14391 : 17 - 35
  • [24] Blind Image Quality Assessment via Deep Recursive Convolutional Network with Skip Connection
    Yan, Qingsen
    Sun, Jinqiu
    Su, Shaolin
    Zhu, Yu
    Li, Haisen
    Zhang, Yanning
    PATTERN RECOGNITION AND COMPUTER VISION, PT II, 2018, 11257 : 51 - 61
  • [25] SRNET: A Shallow Skip Connection Based Convolutional Neural Network Design for Resolving Singularities
    Yasrab, Robail
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2019, 34 : 924 - 938
  • [26] SRNET: A Shallow Skip Connection Based Convolutional Neural Network Design for Resolving Singularities
    Yasrab, Robail
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2019, 34 (04) : 924 - 938
  • [27] NHITS: Neural Hierarchical Interpolation for Time Series Forecasting
    Challu, Cristian
    Olivares, Kin G.
    Oreshkin, Boris N.
    Ramirez, Federico Garza
    Mergenthaler-Canseco, Max
    Dubrawski, Artur
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 6, 2023, : 6989 - 6997
  • [28] Explainable online ensemble of deep neural network pruning for time series forecasting
    Saadallah, Amal
    Jakobs, Matthias
    Morik, Katharina
    MACHINE LEARNING, 2022, 111 (09) : 3459 - 3487
  • [29] Explainable online ensemble of deep neural network pruning for time series forecasting
    Saadallah, Amal
    Jakobs, Matthias
    Morik, Katharina
    MACHINE LEARNING, 2022, 111 : 3459 - 3487
  • [30] A neural network based time series forecasting
    Jana, PK
    PROCEEDINGS OF INTERNATIONAL CONFERENCE ON INTELLIGENT SENSING AND INFORMATION PROCESSING, 2004, : 329 - 331