Do not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting

Cited by: 11
Authors:
Kurmi, Vinod K. [1 ]
Patro, Badri N. [1 ,3 ]
Subramanian, Venkatesh K. [1 ]
Namboodiri, Vinay P. [2 ]
Affiliations:
[1] IIT Kanpur, Kanpur, Uttar Pradesh, India
[2] Univ Bath, Bath, Avon, England
[3] Google, Mountain View, CA 94043 USA
DOI: 10.1109/WACV48630.2021.00078
CLC Number: TP18 [Artificial Intelligence Theory];
Discipline Codes: 081104 ; 0812 ; 0835 ; 1405 ;
Abstract
One of the major limitations of deep learning models is that they suffer catastrophic forgetting in incremental learning scenarios. Several approaches have been proposed to tackle the problem of incremental learning. Most of these methods are based on knowledge distillation and do not adequately utilize the information provided by older task models, such as uncertainty estimates in their predictions. Predictive uncertainty provides distributional information that can be applied to mitigate catastrophic forgetting in a deep learning framework. In the proposed work, we consider a Bayesian formulation to obtain the data and model uncertainties. We also incorporate a self-attention framework to address the incremental learning problem. We define distillation losses in terms of aleatoric uncertainty and self-attention, and we present ablation analyses of these losses. Furthermore, we obtain better accuracy on standard benchmarks.
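The abstract describes distillation losses weighted by aleatoric (data) uncertainty estimated from the old task model. The paper's exact loss is not given in this record, so the following is only a minimal illustrative sketch in NumPy, under common assumptions from heteroscedastic-loss formulations: the old model predicts a per-sample log-variance alongside its logits, and the per-sample distillation KL divergence is down-weighted where that predicted variance is high. The function name and the temperature parameter `T` are hypothetical, not from the paper.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def uncertainty_weighted_distillation(old_logits, new_logits, old_log_var, T=2.0):
    """Hypothetical sketch of an aleatoric-uncertainty-weighted distillation loss.

    old_logits:  (N, C) logits from the frozen old-task model
    new_logits:  (N, C) logits from the model being trained
    old_log_var: (N,) per-sample log-variance predicted by the old model
    T:           softmax temperature for distillation
    """
    p_old = softmax(old_logits / T)
    log_p_new = np.log(softmax(new_logits / T) + 1e-12)
    # Per-sample KL(old || new): the usual distillation target mismatch.
    kl = np.sum(p_old * (np.log(p_old + 1e-12) - log_p_new), axis=-1)
    # Inverse aleatoric variance: trust the old model less on noisy samples.
    precision = np.exp(-old_log_var)
    # Log-variance term regularizes against predicting infinite uncertainty.
    return np.mean(precision * kl + old_log_var)
```

When the new model matches the old model exactly, the KL term vanishes and only the log-variance regularizer remains; samples with high predicted aleatoric uncertainty contribute proportionally less to the distillation signal.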
Pages: 736 - 745
Page count: 10
Related Papers
21 in total
  • [1] Mitigating Catastrophic Forgetting with Complementary Layered Learning
    Mondesire, Sean
    Wiegand, R. Paul
    [J]. ELECTRONICS, 2023, 12 (03)
  • [2] Explain to Not Forget: Defending Against Catastrophic Forgetting with XAI
    Ede, Sami
    Baghdadlian, Serop
    Weber, Leander
    An Nguyen
    Zanca, Dario
    Samek, Wojciech
    Lapuschkin, Sebastian
    [J]. MACHINE LEARNING AND KNOWLEDGE EXTRACTION, CD-MAKE 2022, 2022, 13480 : 1 - 18
  • [3] Unsupervised Neuron Selection for Mitigating Catastrophic Forgetting in Neural Networks
    Goodrich, Ben
    Arel, Itamar
    [J]. 2014 IEEE 57TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2014, : 997 - 1000
  • [4] Neuron Clustering for Mitigating Catastrophic Forgetting in Feedforward Neural Networks
    Goodrich, Ben
    Arel, Itamar
    [J]. 2014 IEEE SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE IN DYNAMIC AND UNCERTAIN ENVIRONMENTS (CIDUE), 2014, : 62 - 68
  • [5] CONSISTENCY IS THE KEY TO FURTHER MITIGATING CATASTROPHIC FORGETTING IN CONTINUAL LEARNING
    Bhat, Prashant
    Zonooz, Bahram
    Arani, Elahe
    [J]. CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199
  • [6] Forget Me Not: Reducing Catastrophic Forgetting for Domain Adaptation in Reading Comprehension
    Xu, Ying
    Zhong, Xu
    Yepes, Antonio Jose Jimeno
    Lau, Jey Han
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [7] Ensemble Learning in Fixed Expansion Layer Networks for Mitigating Catastrophic Forgetting
    Coop, Robert
    Mishtal, Aaron
    Arel, Itamar
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (10) : 1623 - 1634
  • [8] Mitigating Catastrophic Forgetting in Deep Transfer Learning for Fingerprinting Indoor Positioning
    Pan, Heng
    Wei, Shuang
    He, Di
    Xiao, Zhuoling
    Arai, Shintaro
    [J]. 2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS, 2023,
  • [9] Sequential Covariance-Matrix Estimation with Application to Mitigating Catastrophic Forgetting
    Lancewicki, Tomer
    Goodrich, Ben
    Arel, Itamar
    [J]. 2015 IEEE 14TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2015, : 628 - 633
  • [10] Mitigating Catastrophic Forgetting in Deep Learning in a Streaming Setting Using Historical Summary
    Dash, Sajal
    Yin, Junqi
    Shankar, Mallikarjun
    Wang, Feiyi
    Feng, Wu-chun
    [J]. PROCEEDINGS OF THE 7TH INTERNATIONAL WORKSHOP ON DATA ANALYSIS AND REDUCTION FOR BIG SCIENTIFIC DATA (DRBSD-7), 2021, : 11 - 18