Online Class Incremental Contrastive Learning Based on Incremental Mixup-induced Universum

Cited by: 0
|
Authors
Liu, Yu-Wei [1 ]
Chen, Song-Can [1 ]
Affiliations
[1] College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing,211106, China
Source
Ruan Jian Xue Bao/Journal of Software | 2024 / Vol. 35 / No. 12
Keywords
Adversarial machine learning - Cache memory - Federated learning - Unsupervised learning;
DOI
10.13328/j.cnki.jos.007094
Abstract
Online class-incremental learning aims to learn new classes effectively in data-stream scenarios while keeping the model within small-cache and small-batch constraints. However, owing to the one-pass nature of data streams, the category information in a small batch cannot be exploited through repeated passes as in offline learning. To alleviate this problem, current studies combine multiple data augmentations with contrastive learning for model training. Nevertheless, under the small-cache and small-batch constraints, existing methods that select and store data randomly struggle to obtain diverse negative samples, which restricts the model's discriminability. Previous studies have shown that hard negative samples are the key to improving contrastive learning performance, but this is rarely explored in online learning scenarios. The Universum data proposed in traditional Universum learning provide a simple yet intuitive way to construct hard negative samples. Specifically, this study previously proposed mixup-induced Universum (MIU) with certain coefficients, which effectively improves the performance of offline contrastive learning. Inspired by this, this study introduces MIU into online scenarios. Unlike the previous statically generated Universum, data-stream scenarios pose additional challenges. First, as the number of classes keeps growing, the conventional approach of statically generating Universum from globally given classes no longer applies, so Universum must be redefined and generated dynamically. This study therefore proposes to recursively generate MIU with maximum entropy relative to the seen (local) classes, termed incremental MIU (IMIU), and allocates an additional small cache for it so that the overall memory limit is still met. Second, the generated IMIU is mixed up again with the positive samples in the small batch to produce diverse, high-quality hard negative samples.
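The two generation steps described above can be sketched as follows. This is a minimal illustration under the assumption that MIU mixes samples from two different seen classes with a coefficient near 0.5 (the maximum-entropy choice over the two classes); the function names and the simplified second mixing step are placeholders, not the paper's actual implementation.

```python
import numpy as np

def mixup_universum(x_a, x_b, lam=0.5):
    """Mix two samples drawn from *different* seen classes.

    With lam = 0.5 the mixed sample is equally attributable to
    both classes, i.e. its label distribution over the two seen
    classes has maximum entropy -- the property IMIU targets
    relative to the locally seen classes."""
    return lam * x_a + (1.0 - lam) * x_b

def mix_hard_negative(universum, positive, lam=0.5):
    """Mix a Universum sample with an in-batch positive sample
    again, yielding a hard negative that lies close to the
    positive yet belongs to no single seen class."""
    return lam * universum + (1.0 - lam) * positive
```

In the actual algorithm, IMIU is generated recursively as new classes arrive and is kept in its own small cache; the sketch above only shows the mixing arithmetic.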
Finally, by combining the above steps, the IMIU-based contrastive learning (IUCL) algorithm is developed. Comparison experiments on the standard datasets CIFAR-10, CIFAR-100, and Mini-ImageNet verify the effectiveness of the proposed algorithm. © 2024 Chinese Academy of Sciences. All rights reserved.
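As a rough illustration of how such hard negatives enter a contrastive objective, the sketch below uses a generic InfoNCE-style loss; the exact IUCL objective, temperature, and normalization are as specified in the paper, and the names here are assumptions for illustration only.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, tau=0.1):
    """Generic InfoNCE-style contrastive loss for one anchor.

    `negatives` is a (k, d) array that, in a method like IUCL,
    would include the IMIU-derived hard negatives alongside
    ordinary in-batch and cached negatives.  All vectors are
    assumed to be L2-normalized."""
    pos_sim = np.exp(anchor @ positive / tau)
    neg_sim = np.exp(negatives @ anchor / tau).sum()
    return -np.log(pos_sim / (pos_sim + neg_sim))
```

Hard negatives that lie closer to the anchor raise this loss, which is why mixing Universum data toward the positives yields a stronger training signal than random negatives.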
Pages: 5544-5557