Online Class Incremental Contrastive Learning Based on Incremental Mixup-induced Universum

Cited by: 0
Authors
Liu, Yu-Wei [1 ]
Chen, Song-Can [1 ]
Affiliation
[1] College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
Source
Ruan Jian Xue Bao/Journal of Software | 2024, Vol. 35, No. 12
Keywords
Adversarial machine learning; Cache memory; Federated learning; Unsupervised learning
DOI
10.13328/j.cnki.jos.007094
Abstract
Online class-incremental learning aims to learn new classes effectively in data-stream scenarios while keeping the model within small-cache and small-batch constraints. However, because data streams are one-pass, the class information in a small batch cannot be exploited through repeated passes over the data as in offline learning. To alleviate this problem, current studies combine multiple data augmentations with contrastive learning for model training. Nevertheless, under the small-cache and small-batch constraints, existing methods that select and store data randomly struggle to obtain diverse negative samples, which restricts the model's discriminability. Previous studies have shown that hard negative samples are key to improving contrastive learning performance, yet they are rarely explored in online learning scenarios. The Universum data introduced in traditional Universum learning, i.e., samples that belong to none of the target classes, offer a simple and intuitive way to construct hard negative samples. Specifically, this study previously proposed mixup-induced Universum (MIU) with certain mixing coefficients, which effectively improves the performance of offline contrastive learning. Inspired by this, the study introduces MIU into the online setting, where, unlike the previously statically generated Universum, data-stream scenarios pose additional challenges. First, because the number of classes keeps increasing, the conventional approach of statically generating the Universum from a globally given set of classes becomes inapplicable, and the Universum must be redefined and generated dynamically. This study therefore proposes to recursively generate the MIU with maximum entropy relative to the seen (local) classes, termed incremental MIU (IMIU), and equips it with an additional small cache so that the overall memory limit is still met. Second, the generated IMIU is mixed up again with the positive samples in the small batch to produce diverse and high-quality hard negative samples. Finally, combining the above steps yields the IMIU-based contrastive learning (IUCL) algorithm. Comparison experiments on the standard datasets CIFAR-10, CIFAR-100, and Mini-ImageNet verify the validity of the proposed algorithm. © 2024 Chinese Academy of Sciences. All rights reserved.
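To make the described pipeline concrete, below is a minimal PyTorch sketch reconstructed purely from the abstract. All names (update_imiu, hard_negatives, info_nce), the uniform mixing coefficients, and the 0.5 remix ratio are illustrative assumptions, not the authors' actual IUCL implementation.

import torch
import torch.nn.functional as F

def update_imiu(imiu: torch.Tensor, num_seen: int, new_sample: torch.Tensor) -> torch.Tensor:
    # Recursive IMIU update (assumed form): when a sample of the (num_seen+1)-th
    # class arrives, fold it into the running mixture so every seen class keeps
    # an equal coefficient 1/(num_seen+1). A uniform mixture gives the induced
    # soft label maximum entropy over the seen (local) classes, as the abstract requires.
    k = num_seen
    return (k / (k + 1)) * imiu + (1.0 / (k + 1)) * new_sample

def hard_negatives(imiu: torch.Tensor, positives: torch.Tensor, remix: float = 0.5) -> torch.Tensor:
    # Mix the cached IMIU with the in-batch positive samples again to obtain
    # diverse hard negatives that lie near, but off, the positive classes.
    return remix * positives + (1.0 - remix) * imiu

def info_nce(anchor_z: torch.Tensor, pos_z: torch.Tensor, neg_z: torch.Tensor,
             tau: float = 0.07) -> torch.Tensor:
    # Standard InfoNCE-style contrastive loss: for each anchor, the augmented
    # view is the positive (index 0); the IMIU-derived hard negatives fill the rest.
    anchor_z, pos_z, neg_z = (F.normalize(t, dim=-1) for t in (anchor_z, pos_z, neg_z))
    pos_sim = (anchor_z * pos_z).sum(dim=-1, keepdim=True) / tau   # (B, 1)
    neg_sim = anchor_z @ neg_z.T / tau                             # (B, K)
    logits = torch.cat([pos_sim, neg_sim], dim=1)
    targets = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, targets)

In a training step, anchor_z and pos_z would come from two augmented views of stream or cache samples passed through the encoder; the IMIU tensor would live in its own small cache, updated by update_imiu as new classes appear, and neg_z would be the encoding of hard_negatives(imiu, batch).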
Pages: 5544-5557
Related Papers
50 records in total
  • [21] Online Class-Incremental Continual Learning with Adversarial Shapley Value
    Shim, Dongsub
    Mai, Zheda
    Jeong, Jihwan
    Sanner, Scott
    Kim, Hyunwoo
    Jang, Jongseong
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 9630 - 9638
  • [22] Instance-level and Class-level Contrastive Incremental Learning for Image Classification
    Han, Jia-yi
    Liu, Jian-wei
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [23] Fisher Discriminant Analysis Random Forest for Online Class Incremental Learning
    Xiong, Wang
    Wang, Yijie
    Cheng, Li
    2018 IEEE INT CONF ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, UBIQUITOUS COMPUTING & COMMUNICATIONS, BIG DATA & CLOUD COMPUTING, SOCIAL COMPUTING & NETWORKING, SUSTAINABLE COMPUTING & COMMUNICATIONS, 2018, : 597 - 604
  • [24] Versatile Incremental Learning: Towards Class and Domain-Agnostic Incremental Learning
    Park, Min-Yeong
    Lee, Jae-Ho
    Park, Gyeong-Moon
    COMPUTER VISION - ECCV 2024, PT XXXI, 2025, 15089 : 271 - 288
  • [25] Leveraging joint incremental learning objective with data ensemble for class incremental learning
    Mazumder, Pratik
    Karim, Mohammed Asad
    Joshi, Indu
    Singh, Pravendra
    NEURAL NETWORKS, 2023, 161 : 202 - 212
  • [26] DiffClass: Diffusion-Based Class Incremental Learning
    Meng, Zichong
    Zhang, Jie
    Yang, Changdi
    Zhan, Zheng
    Zhao, Pu
    Wang, Yanzhi
    COMPUTER VISION - ECCV 2024, PT LXXXVII, 2025, 15145 : 142 - 159
  • [27] Class-Incremental Learning based on Label Generation
    Shao, Yijia
    Guo, Yiduo
    Zhao, Dongyan
    Liu, Bing
    61ST CONFERENCE OF THE THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023, : 1263 - 1276
  • [28] Class-Incremental Learning Based on Anomaly Detection
    Zhang, Lijuan
    Yang, Xiaokang
    Zhang, Kai
    Li, Yong
    Li, Fu
    Li, Jun
    Li, Dongming
    IEEE ACCESS, 2023, 11 : 69423 - 69438
  • [29] Dual Contrastive Learning Framework for Incremental Text Classification
    Wang, Yigong
    Wang, Zhuoyi
    Lin, Yu
    Guo, Jinghui
    Halim, Sadaf Md
    Khan, Latifur
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 194 - 206
  • [30] Attenuating Catastrophic Forgetting by Joint Contrastive and Incremental Learning
    Ferdinand, Quentin
    Clement, Benoit
    Oliveau, Quentin
    Le Chenadec, Gilles
    Papadakis, Panagiotis
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 3781 - 3788