Selective Freezing for Efficient Continual Learning

Cited by: 0
Authors
Sorrenti, Amelia [1 ]
Bellitto, Giovanni [1 ]
Salanitri, Federica Proietto [1 ]
Pennisi, Matteo [1 ]
Spampinato, Concetto [1 ]
Palazzo, Simone [1 ]
Affiliations
[1] Univ Catania, PeRCeiVe Lab, Catania, Italy
Keywords
NEURAL-NETWORKS; SYSTEMS;
DOI
10.1109/ICCVW60793.2023.00381
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
This paper aims to tackle the challenges of continual learning, where sequential learning from a stream of tasks can lead to catastrophic forgetting. Simultaneously, it addresses the need to reduce the computational demands of large-scale deep learning models to mitigate their environmental impact. To achieve this twofold objective, we propose a method that combines selective layer freezing with fast adaptation in a continual learning context. We begin by conducting an extensive analysis of layer freezing in continual learning, revealing that certain configurations allow for freezing a substantial portion of the model without significant accuracy degradation. Leveraging this insight, we introduce a novel approach that optimizes plasticity on new tasks while preserving stability on previous tasks by dynamically identifying a subset of layers to freeze during training. Experimental results demonstrate the effectiveness of our approach in achieving competitive performance with manually tuned freezing strategies. Moreover, we quantitatively estimate the reduction in computation and energy requirements achieved through our freezing strategy by considering the number of parameters and updates required for model training.
Pages: 3542-3551
Page count: 10
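
A minimal PyTorch sketch of the general idea described in the abstract: freeze a subset of layers so they no longer receive gradient updates, then estimate the savings as the fraction of parameters still being trained. The sequential toy model, the fixed prefix-freezing policy, and the freeze_ratio parameter are illustrative assumptions, not the authors' implementation; the paper selects the frozen subset dynamically during training, which this sketch does not reproduce.

# Sketch only: fixed prefix freezing on a toy model, assumed for illustration.
import torch
import torch.nn as nn

def freeze_prefix(model: nn.Sequential, freeze_ratio: float) -> None:
    """Freeze the first `freeze_ratio` fraction of layers in `model`."""
    layers = list(model.children())
    n_frozen = int(len(layers) * freeze_ratio)
    for layer in layers[:n_frozen]:
        for p in layer.parameters():
            p.requires_grad = False  # excluded from gradient updates

def trainable_fraction(model: nn.Module) -> float:
    """Fraction of parameters that still receive updates."""
    total = sum(p.numel() for p in model.parameters())
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    return trainable / total

# Toy stand-in for a deep network; any nn.Sequential works here.
model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 10),
)

freeze_prefix(model, freeze_ratio=0.5)  # freeze the first half of the layers
print(f"trainable parameter fraction: {trainable_fraction(model):.2%}")

# Only un-frozen parameters go to the optimizer, so per-step update cost
# shrinks roughly in proportion to the trainable fraction.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)

Freezing a contiguous prefix, as here, also lets backpropagation stop at the first trainable layer, so the saving in gradient computation can exceed the saving in trainable parameters alone.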