Selective Freezing for Efficient Continual Learning

Cited by: 0
Authors
Sorrenti, Amelia [1 ]
Bellitto, Giovanni [1 ]
Salanitri, Federica Proietto [1 ]
Pennisi, Matteo [1 ]
Spampinato, Concetto [1 ]
Palazzo, Simone [1 ]
Affiliations
[1] Univ Catania, PeRCeiVe Lab, Catania, Italy
Keywords
NEURAL-NETWORKS; SYSTEMS;
DOI
10.1109/ICCVW60793.2023.00381
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper aims to tackle the challenges of continual learning, where sequential learning from a stream of tasks can lead to catastrophic forgetting. Simultaneously, it addresses the need to reduce the computational demands of large-scale deep learning models to mitigate their environmental impact. To achieve this twofold objective, we propose a method that combines selective layer freezing with fast adaptation in a continual learning context. We begin by conducting an extensive analysis of layer freezing in continual learning, revealing that certain configurations allow for freezing a substantial portion of the model without significant accuracy degradation. Leveraging this insight, we introduce a novel approach that optimizes plasticity on new tasks while preserving stability on previous tasks by dynamically identifying a subset of layers to freeze during training. Experimental results demonstrate the effectiveness of our approach in achieving competitive performance with manually-tuned freezing strategies. Moreover, we quantitatively estimate the reduction in computation and energy requirements achieved through our freezing strategy by considering the number of parameters and updates required for model training.
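The abstract's core idea, dynamically choosing a subset of layers to freeze so that plasticity on the new task is preserved while training cost drops, can be illustrated with a minimal sketch. This is not the authors' algorithm: the gradient-norm selection criterion, the layer names, and the `budget` parameter below are assumptions made purely for the example.

```python
def select_frozen_layers(grad_norms, param_counts, budget):
    """Greedily freeze the least-active layers (smallest gradient norms)
    until the cumulative count of frozen parameters would exceed `budget`.
    Returns the frozen layer names and the number of per-step parameter
    updates saved by not training them."""
    order = sorted(grad_norms, key=grad_norms.get)  # least active first
    frozen, frozen_params = [], 0
    for name in order:
        if frozen_params + param_counts[name] > budget:
            break
        frozen.append(name)
        frozen_params += param_counts[name]
    return frozen, frozen_params

# Toy example: per-layer mean gradient norms observed early on the new task,
# and each layer's parameter count (all values are hypothetical).
grad_norms = {"conv1": 0.02, "conv2": 0.05, "conv3": 0.40, "fc": 0.90}
param_counts = {"conv1": 9_400, "conv2": 18_800, "conv3": 37_600, "fc": 5_130}

frozen, saved = select_frozen_layers(grad_norms, param_counts, budget=30_000)
# frozen -> ["conv1", "conv2"]; saved -> 28_200 parameters untouched per update
```

The saved-parameter count is the kind of quantity the paper uses to estimate reductions in computation and energy: fewer trainable parameters means fewer gradient computations and optimizer updates per step.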
Pages: 3542-3551
Page count: 10
Related Papers
50 results total
  • [1] Efficient Spiking Neural Networks with Sparse Selective Activation for Continual Learning
    Shen, Jiangrong
    Ni, Wenyao
    Xu, Qi
    Tang, Huajin
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 1, 2024, : 611 - 619
  • [2] Continual learning with selective nets
    Luu, Hai Tung
    Szemenyei, Marton
    APPLIED INTELLIGENCE, 2025, 55 (07)
  • [3] Efficient Architecture Search for Continual Learning
    Gao, Qiang
    Luo, Zhipeng
    Klabjan, Diego
    Zhang, Fengli
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (11) : 8555 - 8565
  • [4] EsaCL: An Efficient Continual Learning Algorithm
    Ren, Weijieying
    Honavar, Vasant G.
    PROCEEDINGS OF THE 2024 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2024, : 163 - 171
  • [5] Memory Efficient Continual Learning with Transformers
    Ermis, Beyza
    Zappella, Giovanni
    Wistuba, Martin
    Rawal, Aditya
    Archambeau, Cedric
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [6] Beyond Prompt Learning: Continual Adapter for Efficient Rehearsal-Free Continual Learning
    Gao, Xinyuan
    Dong, Songlin
    He, Yuhang
    Wang, Qiang
    Gong, Yihong
    COMPUTER VISION - ECCV 2024, PT LXXXV, 2025, 15143 : 89 - 106
  • [7] Computationally Efficient Rehearsal for Online Continual Learning
    Davalas, Charalampos
    Michail, Dimitrios
    Diou, Christos
    Varlamis, Iraklis
    Tserpes, Konstantinos
    IMAGE ANALYSIS AND PROCESSING, ICIAP 2022, PT III, 2022, 13233 : 39 - 49
  • [8] Visually Grounded Continual Language Learning with Selective Specialization
    Ahrens, Kyra
    Bengtson, Lennart
    Lee, Jae Hee
    Wermter, Stefan
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 7037 - 7054
  • [9] Continual learning and its industrial applications: A selective review
    Lian, J.
    Choi, K.
    Veeramani, B.
    Hu, A.
    Murli, S.
    Freeman, L.
    Bowen, E.
    Deng, X.
    WILEY INTERDISCIPLINARY REVIEWS-DATA MINING AND KNOWLEDGE DISCOVERY, 2024, 14 (06)
  • [10] Efficient continual learning at the edge with progressive segmented training
    Du, Xiaocong
    Venkataramanaiah, Shreyas Kolala
    Li, Zheng
    Suh, Han-Sok
    Yin, Shihui
    Krishnan, Gokul
    Liu, Frank
    Seo, Jae-sun
    Cao, Yu
    NEUROMORPHIC COMPUTING AND ENGINEERING, 2022, 2 (04)