Does Catastrophic Forgetting Negatively Affect Financial Predictions?

Cited by: 0
Authors
Zurli, Alberto [1 ,2 ]
Bertugli, Alessia [3 ]
Credi, Jacopo [2 ]
Institutions
[1] Univ Modena Reggio & Emilia, Modena, Italy
[2] Axyon AI, Modena, Italy
[3] Univ Trento, Trento, Italy
Funding
EU Horizon 2020
Keywords
DOI
10.1007/978-3-031-25599-1_37
Chinese Library Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Nowadays, financial markets produce large amounts of data in the form of historical time series, which quantitative researchers have recently attempted to predict with deep learning models. These models are constantly updated with new incoming data in an online fashion. However, artificial neural networks tend to exhibit poor adaptability, fitting the most recently seen trends without retaining information from previous ones. Continual learning studies this problem, called catastrophic forgetting, aiming to preserve knowledge acquired in the past and exploit it when learning new trends. This paper evaluates and highlights continual learning techniques applied to financial historical time series in a binary classification context (upward or downward trend). The main state-of-the-art algorithms have been evaluated on data derived from a practical scenario, highlighting how continual learning techniques yield better performance in the financial field than conventional online approaches (code is available at https://github.com/albertozurli/cl_timeseries).
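The online-update setting the abstract describes can be illustrated with a minimal replay-based sketch; experience replay is one standard family of continual learning techniques, though the names below (`ReplayClassifier`, `update`, `predict`) are illustrative and not taken from the paper or its repository. A toy logistic-regression trend classifier rehearses a few buffered past samples at every online update instead of fitting only the latest trend:

```python
import numpy as np

rng = np.random.default_rng(0)

class ReplayClassifier:
    """Toy online logistic-regression trend classifier with a small replay
    buffer (a simplified stand-in for the paper's deep models)."""

    def __init__(self, dim, buffer_size=64, lr=0.1):
        self.w = np.zeros(dim)      # weights
        self.b = 0.0                # bias
        self.buffer = []            # stored (x, y) pairs from past trends
        self.buffer_size = buffer_size
        self.lr = lr

    def _sgd_step(self, x, y):
        # one gradient step on the logistic loss
        p = 1.0 / (1.0 + np.exp(-(self.w @ x + self.b)))
        g = p - y
        self.w -= self.lr * g * x
        self.b -= self.lr * g

    def update(self, x, y, replay=4):
        # learn from the new sample, then rehearse a few buffered ones,
        # which counteracts drifting entirely toward the latest trend
        self._sgd_step(x, y)
        if self.buffer:
            k = min(replay, len(self.buffer))
            for i in rng.choice(len(self.buffer), size=k, replace=False):
                self._sgd_step(*self.buffer[i])
        # simplified reservoir-style insertion keeps the buffer mixed
        if len(self.buffer) < self.buffer_size:
            self.buffer.append((x, y))
        else:
            self.buffer[rng.integers(self.buffer_size)] = (x, y)

    def predict(self, x):
        return int(self.w @ x + self.b > 0)   # 1 = upward, 0 = downward
```

Without the rehearsal loop this reduces to the conventional online baseline the paper compares against; the replay buffer is what preserves information from earlier trends.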
Pages: 501-515
Page count: 15