Generative Continual Concept Learning

Cited by: 0
Authors
Rostami, Mohammad [1 ]
Kolouri, Soheil [2 ]
McClelland, James [3 ]
Pilly, Praveen [2 ]
Affiliations
[1] Univ Penn, Philadelphia, PA 19104 USA
[2] HRL Labs LLC, Malibu, CA USA
[3] Stanford Univ, Stanford, CA 94305 USA
Keywords
DOI
Not available
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
After learning a concept, humans can continually generalize it to new domains by observing only a few labeled instances, without interfering with previously learned knowledge. In contrast, learning concepts efficiently in a continual-learning setting remains an open challenge for current artificial-intelligence algorithms, which require persistent model retraining. Inspired by the Parallel Distributed Processing and Complementary Learning Systems theories, we develop a computational model that can efficiently expand its previously learned concepts to new domains using a few labeled samples. We couple the new form of a concept to its previously learned forms in an embedding space to enable effective continual learning. In doing so, the model learns a generative distribution in the embedding space that is shared across tasks and models the abstract concepts. This shared distribution allows the model to generate pseudo-data points that replay past experience and thereby mitigate catastrophic forgetting.
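To make the mechanism described in the abstract concrete, the following is a minimal PyTorch sketch of generative replay through a shared embedding space. It is not the authors' implementation: it assumes the shared generative distribution can be approximated by one diagonal Gaussian per concept, and every name in it (the network sizes, fit_concept_gaussians, sample_pseudo_data, adapt_to_new_domain) is hypothetical.

# Hedged sketch of pseudo-data replay via a shared embedding distribution.
# Assumption: one diagonal Gaussian per concept stands in for the paper's
# learned generative distribution; all names below are hypothetical.
import torch
import torch.nn as nn

DIM_X, DIM_Z, N_CLASSES = 784, 32, 10  # e.g. flattened 28x28 images

encoder = nn.Sequential(nn.Linear(DIM_X, 256), nn.ReLU(), nn.Linear(256, DIM_Z))
decoder = nn.Sequential(nn.Linear(DIM_Z, 256), nn.ReLU(), nn.Linear(256, DIM_X))
classifier = nn.Linear(DIM_Z, N_CLASSES)

def fit_concept_gaussians(x, y):
    """Fit one diagonal Gaussian per concept in the shared embedding space
    (assumes several samples per concept)."""
    with torch.no_grad():
        z = encoder(x)
    return {int(c): (z[y == c].mean(0), z[y == c].std(0) + 1e-6)
            for c in y.unique()}

def sample_pseudo_data(stats, n_per_class):
    """Decode samples from the stored Gaussians into pseudo-data points
    that stand in for past experience."""
    xs, ys = [], []
    for c, (mu, sigma) in stats.items():
        z = mu + sigma * torch.randn(n_per_class, DIM_Z)
        with torch.no_grad():
            xs.append(decoder(z))
        ys.append(torch.full((n_per_class,), c, dtype=torch.long))
    return torch.cat(xs), torch.cat(ys)

def adapt_to_new_domain(x_few, y_few, stats, steps=200):
    """Couple a few labeled samples from the new domain with replayed
    pseudo-data so past concepts are not catastrophically forgotten."""
    params = (list(encoder.parameters()) + list(decoder.parameters())
              + list(classifier.parameters()))
    opt = torch.optim.Adam(params, lr=1e-3)
    ce, mse = nn.CrossEntropyLoss(), nn.MSELoss()
    for _ in range(steps):
        x_old, y_old = sample_pseudo_data(stats, n_per_class=8)
        x = torch.cat([x_few, x_old])
        y = torch.cat([y_few, y_old])
        z = encoder(x)
        # Classify in the embedding space and reconstruct the input, so the
        # embedding keeps supporting both recognition and generation.
        loss = ce(classifier(z), y) + mse(decoder(z), x)
        opt.zero_grad()
        loss.backward()
        opt.step()

In a full system, the per-concept Gaussians would presumably be re-fit after each adaptation step on embeddings of both the new samples and the replayed data, so the shared distribution tracks each concept as it expands to new domains.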
Pages: 5545-5552
Page count: 8