Learning Without Forgetting

Cited by: 437
Authors
Li, Zhizhong [1]
Hoiem, Derek [1]
Affiliations
[1] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
Keywords
Convolutional neural networks; Transfer learning; Multitask learning; Deep learning; Visual recognition
DOI
10.1007/978-3-319-46493-0_37
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises when we add new capabilities to a Convolutional Neural Network (CNN) but the training data for its existing capabilities is unavailable. We propose our Learning without Forgetting method, which uses only new-task data to train the network while preserving its original capabilities. Our method performs favorably compared with the commonly used feature-extraction and fine-tuning adaptation techniques, and performs similarly to multitask learning that uses the original-task data we assume is unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning as standard practice for improved new-task performance.
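The idea sketched in the abstract can be made concrete as a combined objective: before training, the old network's outputs on the new-task images are recorded; training then minimizes cross-entropy on the new task plus a knowledge-distillation term that keeps the old-task head close to those recorded responses. Below is a minimal NumPy sketch of that loss under these assumptions; the temperature `T` and loss-balance weight `lambda_o` follow the paper's notation (the paper reports T = 2), but the function and argument names here are illustrative, not the authors' code.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; T > 1 softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def lwf_loss(new_logits, new_labels, old_logits, recorded_old_logits,
             lambda_o=1.0, T=2.0):
    """Learning-without-Forgetting objective (illustrative sketch).

    new_logits:          current outputs of the new-task head
    new_labels:          integer ground-truth labels for the new task
    old_logits:          current outputs of the old-task head
    recorded_old_logits: outputs of the original network on the same
                         new-task images, recorded before training
    """
    # Standard cross-entropy on the new task.
    p_new = softmax(new_logits)
    n = len(new_labels)
    ce = -np.mean(np.log(p_new[np.arange(n), new_labels] + 1e-12))

    # Knowledge-distillation term: hold the old head's softened
    # predictions near the recorded responses.
    p_target = softmax(recorded_old_logits, T)
    p_old = softmax(old_logits, T)
    kd = -np.mean(np.sum(p_target * np.log(p_old + 1e-12), axis=-1))

    return ce + lambda_o * kd
```

The distillation term is minimized exactly when the old head reproduces the recorded responses, which is how the method preserves original capabilities without touching original-task data.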
Pages: 614-629
Page count: 16
Related papers
50 records in total
  • [41] LEARNING REMEMBERING + FORGETTING
    KIMBLE, D
    SCIENCE, 1964, 146 (365) : 1605
  • [42] TIMING, LEARNING, AND FORGETTING
    SHIMP, CP
    ANNALS OF THE NEW YORK ACADEMY OF SCIENCES, 1984, 423 (MAY) : 346 - 360
  • [43] Adaptive determination of the amount of forgetting in the learning algorithm with forgetting
    Watanabe, E
    PROGRESS IN CONNECTIONIST-BASED INFORMATION SYSTEMS, VOLS 1 AND 2, 1998, : 237 - 240
  • [44] An Incremental Learning of YOLOv3 Without Catastrophic Forgetting for Smart City Applications
    ul Haq, Qazi Mazhar
    Ruan, Shanq-Jang
    Haq, Muhammad Amirul
    Karam, Said
    Shieh, Jeng Lun
    Chondro, Peter
    Gao, De-Qin
    IEEE CONSUMER ELECTRONICS MAGAZINE, 2022, 11 (05) : 56 - 63
  • [45] Cuepervision: self-supervised learning for continuous domain adaptation without catastrophic forgetting
    Schutera, Mark
    Hafner, Frank M.
    Abhau, Jochen
    Hagenmeyer, Veit
    Mikut, Ralf
    Reischl, Markus
    IMAGE AND VISION COMPUTING, 2021, 106
  • [46] Memory Replay GANs: learning to generate images from new categories without forgetting
    Wu, Chenshen
    Herranz, Luis
    Liu, Xialei
    Wang, Yaxing
    van de Weijer, Joost
    Raducanu, Bogdan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [47] Episodic memory based continual learning without catastrophic forgetting for environmental sound classification
    Karam S.
    Ruan S.-J.
    Haq Q.M.
    Li L.P.-H.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (04) : 4439 - 4449
  • [48] Learning without forgetting by leveraging transfer learning for detecting COVID-19 infection from CT images
    Subramanian, Malliga
    Sathishkumar, Veerappampalayam Easwaramoorthy
    Cho, Jaehyuk
    Shanmugavadivel, Kogilavani
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [49] Doctors without masks: forgetting or reinvesting?
    Giorgina Barbara Piccoli
    Journal of Nephrology, 2023, 36 : 1223 - 1224