Learning Without Forgetting

Cited by: 437
Authors
Li, Zhizhong [1 ]
Hoiem, Derek [1 ]
Affiliation
[1] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
Source
Computer Vision - ECCV 2016
Keywords
Convolutional neural networks; Transfer learning; Multitask learning; Deep learning; Visual recognition
DOI
10.1007/978-3-319-46493-0_37
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises where we add new capabilities to a Convolutional Neural Network (CNN), but the training data for its existing capabilities is unavailable. We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities. Our method performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques, and it performs similarly to multitask learning that uses the original task data we assume to be unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning as standard practice for improved new task performance.
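The core idea the abstract describes, training on new-task data only while keeping the old tasks' behavior, is commonly realized by recording the old task head's outputs on the new-task images before training and then penalizing drift from those recorded responses. Below is a minimal PyTorch-style sketch under that reading; the model/head names, the temperature, and the loss weighting are illustrative assumptions, and the drift penalty is written as a KL divergence, a common stand-in for the temperature-scaled distillation loss described in the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, recorded_logits, T=2.0):
    """Keep the old-task head's outputs on new-task images close to the
    responses recorded before training (temperature-softened)."""
    log_p = F.log_softmax(new_logits / T, dim=1)
    q = F.softmax(recorded_logits / T, dim=1)
    return F.kl_div(log_p, q, reduction="batchmean") * (T * T)

def lwf_step(backbone, old_head, new_head, optimizer, images, new_labels,
             recorded_old_logits, lambda_old=1.0):
    """One training step using new-task data only.

    `backbone` is the shared CNN, `old_head`/`new_head` are task-specific
    classifiers, and `recorded_old_logits` are the old head's outputs on
    these images, computed once before training (names are illustrative).
    """
    optimizer.zero_grad()
    features = backbone(images)
    loss_new = F.cross_entropy(new_head(features), new_labels)   # new-task supervision
    loss_old = distillation_loss(old_head(features), recorded_old_logits)  # preserve old task
    loss = loss_new + lambda_old * loss_old
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the recorded logits come from a single forward pass over the new-task images with the original network, after which the shared backbone and both heads are updated jointly with the combined loss, so no old-task data is touched.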
Pages: 614-629
Page count: 16
Related papers
50 records in total
  • [21] Forest fire and smoke detection using deep learning-based learning without forgetting
    Sathishkumar, Veerappampalayam Easwaramoorthy
    Cho, Jaehyuk
    Subramanian, Malliga
    Naren, Obuli Sai
    FIRE ECOLOGY, 2023, 19 (01)
  • [23] Adaptive Learning without Forgetting via Low-Complexity Convex Networks
    Javid, Alireza M.
    Liang, Xinyue
    Skoglund, Mikael
    Chatterjee, Saikat
    28TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2020), 2021, : 1623 - 1627
  • [24] Pseudo-rehearsal: Achieving deep reinforcement learning without catastrophic forgetting
    Atkinson, Craig
    McCane, Brendan
    Szymanski, Lech
    Robins, Anthony
    NEUROCOMPUTING, 2021, 428 : 291 - 307
  • [25] BEYOND WITHOUT FORGETTING: MULTI-TASK LEARNING FOR CLASSIFICATION WITH DISJOINT DATASETS
    Hong, Yan
    Niu, Li
    Zhang, Jianfu
    Zhang, Liqing
    2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020
  • [26] NETWORK ADAPTATION STRATEGIES FOR LEARNING NEW CLASSES WITHOUT FORGETTING THE ORIGINAL ONES
    Taitelbaum, Hagai
    Chechik, Gal
    Goldberger, Jacob
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3637 - 3641
  • [27] Learning Without Forgetting: A New Framework for Network Cyber Security Threat Detection
    Karn, Rupesh Raj
    Kudva, Prabhakar
    Elfadel, Ibrahim M.
    IEEE ACCESS, 2021, 9 : 137042 - 137062
  • [28] Learning-Without-Forgetting via Memory Index in Incremental Object Detection
    Zhou, Haixin
    Ye, Biaohua
    Lai, JianHuang
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT VIII, 2024, 14432 : 448 - 459
  • [29] Without forgetting the Imam.
    Tamney, JB
    JOURNAL FOR THE SCIENTIFIC STUDY OF RELIGION, 1998, 37 (04) : 767 - 768
  • [30] Without forgetting the Drouot district
    [Anonymous]
    CONNAISSANCE DES ARTS, 2009, (669): 129 - 129