Contrastive Supervised Distillation for Continual Representation Learning

Cited by: 4
Authors
Barletti, Tommaso [1 ]
Biondi, Niccolo [1 ]
Pernici, Federico [1 ]
Bruni, Matteo [1 ]
Del Bimbo, Alberto [1 ]
Affiliations
[1] Università degli Studi di Firenze, Media Integration and Communication Center (MICC), Dipartimento di Ingegneria dell'Informazione, Florence, Italy
Funding
European Union Horizon 2020
Keywords
Representation learning; Continual learning; Image retrieval; Visual search; Contrastive learning; Distillation
DOI
10.1007/978-3-031-06427-2_50
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a novel training procedure for the continual representation learning problem, in which a neural network model is learned sequentially while alleviating catastrophic forgetting in visual search tasks. Our method, called Contrastive Supervised Distillation (CSD), reduces feature forgetting while learning discriminative features. This is achieved by leveraging label information in a distillation setting in which the student model is learned contrastively from the teacher model. Extensive experiments show that CSD performs favorably in mitigating catastrophic forgetting, outperforming current state-of-the-art methods. Our results also provide further evidence that feature forgetting evaluated in visual retrieval tasks is not as catastrophic as in classification tasks. Code at: https://github.com/NiccoBiondi/ContrastiveSupervisedDistillation.
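The exact loss is defined in the paper and the linked repository; purely as a rough illustration, a supervised contrastive distillation term of the kind the abstract describes might look like the PyTorch sketch below. The function name csd_loss, its signature, and the temperature value are illustrative assumptions, not the authors' published code.

# Illustrative sketch (not the authors' code): each student embedding is
# pulled toward teacher embeddings of the same class and pushed away from
# teacher embeddings of other classes.
import torch
import torch.nn.functional as F

def csd_loss(student_feats, teacher_feats, labels, temperature=0.1):
    # student_feats, teacher_feats: (B, D) embeddings; labels: (B,)
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    logits = s @ t.T / temperature  # (B, B) student-teacher similarities
    same_class = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # average log-likelihood of same-class teacher anchors per sample
    pos = (log_prob * same_class).sum(1) / same_class.sum(1).clamp(min=1)
    return -pos.mean()

In a continual step one would typically freeze the teacher (the model from the previous task) and combine such a term with the usual classification loss, e.g. loss = ce_loss + lam * csd_loss(student(x), teacher(x).detach(), y), where lam is a hypothetical weighting hyperparameter.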
Pages: 597-609 (13 pages)
Related Papers (50 in total)
  • [1] Multilingual Representation Distillation with Contrastive Learning
    Tan, Weiting
    Heffernan, Kevin
    Schwenk, Holger
    Koehn, Philipp
    17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023, 2023: 1477-1490
  • [2] Continual Learning Based on Knowledge Distillation and Representation Learning
    Chen, Xiu-Yan
    Liu, Jian-Wei
    Li, Wen-Tao
    Artificial Neural Networks and Machine Learning - ICANN 2022, Pt IV, 2022, 13532: 27-38
  • [3] SCL-IKD: intermediate knowledge distillation via supervised contrastive representation learning
    Sharma, Saurabh
    Lodhi, Shikhar Singh
    Chandra, Joydeep
    Applied Intelligence, 2023, 53 (23): 28520-28541
  • [5] CONTRASTIVE LEARNING FOR ONLINE SEMI-SUPERVISED GENERAL CONTINUAL LEARNING
    Michel, Nicolas
    Negrel, Romain
    Chierchia, Giovanni
    Bercher, Jean-Francois
    2022 IEEE International Conference on Image Processing, ICIP, 2022: 1896-1900
  • [6] Supervised contrastive learning for graph representation enhancement
    Ghayekhloo, Mohadeseh
    Nickabadi, Ahmad
    Neurocomputing, 2024, 588
  • [8] Mutual mentor: Online contrastive distillation network for general continual learning
    Wang, Qiang
    Ji, Zhong
    Li, Jin
    Pang, Yanwei
    Neurocomputing, 2023, 537: 37-48
  • [9] Continual semi-supervised learning through contrastive interpolation consistency
    Boschini, Matteo
    Buzzega, Pietro
    Bonicelli, Lorenzo
    Porrello, Angelo
    Calderara, Simone
    Pattern Recognition Letters, 2022, 162: 9-14
  • [10] Probing Representation Forgetting in Supervised and Unsupervised Continual Learning
    Davari, MohammadReza
    Asadi, Nader
    Mudur, Sudhir
    Aljundi, Rahaf
    Belilovsky, Eugene
    2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2022), 2022: 16691-16700