Beyond Not-Forgetting: Continual Learning with Backward Knowledge Transfer

Cited by: 0
Authors
Lin, Sen [1 ]
Yang, Li [1 ]
Fan, Deliang [1 ]
Zhang, Junshan [2 ]
Affiliations
[1] Arizona State Univ, Sch ECEE, Tempe, AZ 85287 USA
[2] Univ Calif Davis, Dept ECE, Davis, CA 95616 USA
Funding
U.S. National Science Foundation
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
By learning a sequence of tasks continually, an agent in continual learning (CL) can improve the learning performance of both a new task and 'old' tasks by leveraging forward knowledge transfer and backward knowledge transfer, respectively. However, most existing CL methods focus on addressing catastrophic forgetting in neural networks by minimizing the modification of the learnt model for old tasks. This inevitably limits backward knowledge transfer from the new task to the old tasks, because judicious model updates could also improve the learning performance of the old tasks. To tackle this problem, we first theoretically analyze the conditions under which updating the learnt model of old tasks could be beneficial for CL and lead to backward knowledge transfer, based on gradient projection onto the input subspaces of old tasks. Building on this theoretical analysis, we next develop a ContinUal learning method with Backward knowlEdge tRansfer (CUBER) for a fixed-capacity neural network without data replay. In particular, CUBER first characterizes task correlation to identify the positively correlated old tasks in a layer-wise manner, and then selectively modifies the learnt model of the old tasks when learning the new task. Experimental studies show that CUBER achieves positive backward knowledge transfer on several existing CL benchmarks, for the first time without data replay, whereas the related baselines still suffer from catastrophic forgetting (negative backward knowledge transfer). This superior backward knowledge transfer accordingly translates into higher accuracy.
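To make the mechanism described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of gradient projection onto the orthogonal complement of old tasks' input subspaces (GPM-style), gated by a layer-wise correlation test. The function names, the cosine-similarity criterion, and the threshold tau are illustrative simplifications for a single linear layer; they are not the authors' actual CUBER algorithm or its theoretical condition.

```python
import torch
import torch.nn.functional as F

def gpm_projection(grad: torch.Tensor, basis: torch.Tensor) -> torch.Tensor:
    """Project a linear layer's weight gradient out of the old tasks'
    input subspace, as in GPM-style continual learning.

    grad:  (out_dim, in_dim) weight gradient for the new task
    basis: (in_dim, k) orthonormal basis of the old tasks' input subspace
    """
    return grad - grad @ basis @ basis.T

def layer_correlation(new_grad: torch.Tensor, old_grad: torch.Tensor) -> torch.Tensor:
    # Cosine similarity of flattened layer gradients; a stand-in for the
    # paper's layer-wise task-correlation criterion (assumption).
    return F.cosine_similarity(new_grad.flatten(), old_grad.flatten(), dim=0)

def cuber_style_update(grad, basis, old_grads, tau=0.0):
    # Hypothetical rule: if the new task is positively correlated with some
    # old task at this layer (similarity > tau), keep the in-subspace
    # gradient component so shared weights may also improve on the old
    # task (backward transfer); otherwise project it out to prevent
    # forgetting, as plain gradient projection would.
    if any(layer_correlation(grad, g) > tau for g in old_grads):
        return grad
    return gpm_projection(grad, basis)

# Toy usage: one linear layer with in_dim=8, out_dim=4, subspace rank k=3.
g_new = torch.randn(4, 8)
M = torch.linalg.qr(torch.randn(8, 3)).Q   # orthonormal basis columns
g_old = [torch.randn(4, 8)]
g_step = cuber_style_update(g_new, M, g_old, tau=0.0)
```

The design intuition matches the abstract: projecting every gradient fully out of old tasks' subspaces guarantees not-forgetting but forbids backward transfer, so the gate selectively relaxes the projection for positively correlated old tasks.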
Pages: 13
Related papers
50 records in total
  • [21] Continual Learning for Instance Segmentation to Mitigate Catastrophic Forgetting
    Lee, Jeong Jun
    Lee, Seung Il
    Kim, Hyun
    18TH INTERNATIONAL SOC DESIGN CONFERENCE 2021 (ISOCC 2021), 2021, : 85 - 86
  • [22] Probing Representation Forgetting in Supervised and Unsupervised Continual Learning
    Davari, MohammadReza
    Asadi, Nader
    Mudur, Sudhir
    Aljundi, Rahaf
    Belilovsky, Eugene
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 16691 - 16700
  • [23] Biocultural learning - beyond ecological knowledge transfer
    Garavito-Bermudez, Diana
    JOURNAL OF ENVIRONMENTAL PLANNING AND MANAGEMENT, 2020, 63 (10) : 1791 - 1810
  • [24] CONSISTENCY IS THE KEY TO FURTHER MITIGATING CATASTROPHIC FORGETTING IN CONTINUAL LEARNING
    Bhat, Prashant
    Zonooz, Bahram
    Arani, Elahe
    CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199
  • [25] A Novel Class-wise Forgetting Detector in Continual Learning
    Pham, Xuan Cuong
    Liew, Alan Wee-chung
    Wang, Can
    2021 INTERNATIONAL CONFERENCE ON DIGITAL IMAGE COMPUTING: TECHNIQUES AND APPLICATIONS (DICTA 2021), 2021, : 518 - 525
  • [26] CONTINUAL LEARNING BEYOND A SINGLE MODEL
Doan, Thang
    Mirzadeh, Seyed Iman
    Farajtabar, Mehrdad
    CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 232, 2023, 232 : 961 - 991
  • [27] Preempting Catastrophic Forgetting in Continual Learning Models by Anticipatory Regularization
    El Khatib, Alaa
    Karray, Fakhri
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [28] Understanding Catastrophic Forgetting of Gated Linear Networks in Continual Learning
    Munari, Matteo
    Pasa, Luca
    Zambon, Daniele
    Alippi, Cesare
    Navarin, Nicolo
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [29] Cross-Regional Fraud Detection via Continual Learning With Knowledge Transfer
    Li, Yujie
    Yang, Xin
    Gao, Qiang
    Wang, Hao
    Zhang, Junbo
    Li, Tianrui
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 7865 - 7877
  • [30] Beyond Prompt Learning: Continual Adapter for Efficient Rehearsal-Free Continual Learning
    Gao, Xinyuan
    Dong, Songlin
    He, Yuhang
    Wang, Qiang
    Gong, Yihong
    COMPUTER VISION - ECCV 2024, PT LXXXV, 2025, 15143 : 89 - 106