Class-Incremental Learning Method With Fast Update and High Retainability Based on Broad Learning System

Citations: 5
Authors
Du, Jie [1 ]
Liu, Peng [2 ]
Vong, Chi-Man [2 ]
Chen, Chuangquan [3 ]
Wang, Tianfu [1 ]
Chen, C. L. Philip [4 ,5 ]
Affiliations
[1] Shenzhen Univ, Natl Reg Key Technol Engn Lab Med Ultrasound, Guangdong Key Lab Biomed Measurements & Ultrasound, Sch Biomed Engn,Hlth Sci Ctr, Shenzhen 518060, Peoples R China
[2] Univ Macau, Dept Comp & Informat Sci, Macau, Peoples R China
[3] Wuyi Univ, Fac Intelligent Mfg, Jiangmen 529020, Peoples R China
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou, Peoples R China
[5] South China Univ Technol, Pazhou Lab, Guangzhou 510335, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Training; Task analysis; Learning systems; Data models; Predictive models; Correlation; Support vector machines; Broad learning system (BLS); catastrophic forgetting; class correlations; class-incremental learning (CIL); recursive update rule;
DOI
10.1109/TNNLS.2023.3259016
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Machine learning aims to generate a predictive model from a training dataset with a fixed number of known classes. However, many real-world applications (such as health monitoring and elderly care) involve data streams in which new data arrive continually within a short time. Such new data may even belong to previously unknown classes. Hence, class-incremental learning (CIL) is necessary: it incrementally and rapidly updates an existing model with the data of new classes while retaining the existing knowledge of old classes. However, most current CIL methods are built on deep models that require a computationally expensive training and update process. In addition, deep-learning-based CIL (DCIL) methods typically employ stochastic gradient descent (SGD) as the optimizer, which forgets the old knowledge to a certain extent. In this article, a broad learning system-based CIL (BLS-CIL) method with fast update and high retainability of old class knowledge is proposed. Traditional BLS is a fast and effective shallow neural network, but it does not work well on CIL tasks. Our proposed BLS-CIL overcomes these issues and provides the following: 1) high accuracy due to our novel class-correlation loss function that considers the correlations between old and new classes; 2) significantly short training/update time due to the newly derived closed-form solution for our class-correlation loss, which requires no iterative optimization; and 3) high retainability of old class knowledge due to our newly derived recursive update rule for CIL (RULL), which does not replay the exemplars of all old classes, in contrast to exemplar-replaying methods with the SGD optimizer. The proposed BLS-CIL has been evaluated over 12 real-world datasets, including seven tabular/numerical datasets and six image datasets, and the compared methods include one shallow network and seven classical or state-of-the-art DCIL methods. Experimental results show that our BLS-CIL significantly improves the classification performance over a shallow network by a large margin (8.80%-48.42%). It also achieves comparable or even higher accuracy than the DCIL methods, while greatly reducing the training time from hours to minutes and the update time from minutes to seconds.
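The abstract's two computational claims (a closed-form solution for the output weights and a recursive, replay-free update rule) follow the general pattern of ridge regression combined with a recursive-least-squares block update. The sketch below is illustrative only and is not the authors' BLS-CIL implementation: it omits the class-correlation loss, and all function names, dimensions, and hyperparameters are assumptions.

```python
# Illustrative sketch only -- NOT the authors' BLS-CIL code. It shows a
# BLS-style random feature map, a closed-form ridge solution for the output
# weights, and an RLS-style block update that absorbs new-class data without
# replaying old exemplars. The paper's class-correlation loss is omitted.
import numpy as np

rng = np.random.default_rng(0)

def broad_features(X, W_feat, W_enh):
    """Map inputs to [feature nodes | enhancement nodes], as in a basic BLS."""
    Z = np.tanh(X @ W_feat)          # feature nodes
    H = np.tanh(Z @ W_enh)           # enhancement nodes built on Z
    return np.hstack([Z, H])

# ---- initial training on 3 old classes (closed-form ridge regression) ----
d, n_feat, n_enh, lam = 20, 64, 64, 1e-2
W_feat = rng.standard_normal((d, n_feat))
W_enh = rng.standard_normal((n_feat, n_enh))

X_old = rng.standard_normal((500, d))
Y_old = np.eye(3)[rng.integers(0, 3, 500)]        # one-hot targets, 3 classes

A = broad_features(X_old, W_feat, W_enh)          # (500, K) design matrix
K = A.shape[1]
P = np.linalg.inv(A.T @ A + lam * np.eye(K))      # cached regularized inverse
W_out = P @ A.T @ Y_old                           # closed-form output weights

# ---- class-incremental step: data of one previously unseen class ----
X_new = rng.standard_normal((100, d))
A_new = broad_features(X_new, W_feat, W_enh)
Y_new = np.hstack([np.zeros((100, 3)), np.ones((100, 1))])  # new class column

W_out = np.hstack([W_out, np.zeros((K, 1))])      # widen output layer

# Block recursive-least-squares update: only (A_new, Y_new) are touched,
# so no old-class exemplars need to be stored or replayed.
S = np.eye(A_new.shape[0]) + A_new @ P @ A_new.T
G = P @ A_new.T @ np.linalg.inv(S)                # gain matrix
W_out += G @ (Y_new - A_new @ W_out)              # correct the weights
P -= G @ A_new @ P                                # refresh cached inverse
```

The update for P and W_out is the standard Woodbury-identity block form of recursive least squares, which is why the cost of an incremental step depends only on the new batch size, not on the size of the accumulated old-class data.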
Pages: 11332-11345
Page count: 14
Related Papers
50 records in total
  • [41] A survey on few-shot class-incremental learning
    Tian, Songsong
    Li, Lusi
    Li, Weijun
    Ran, Hang
    Ning, Xin
    Tiwari, Prayag
    [J]. NEURAL NETWORKS, 2024, 169 : 307 - 324
  • [42] RMM: Reinforced Memory Management for Class-Incremental Learning
    Liu, Yaoyao
    Schiele, Bernt
    Sun, Qianru
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [43] On the Stability-Plasticity Dilemma of Class-Incremental Learning
    Kim, Dongwan
    Han, Bohyung
    [J]. 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 20196 - 20204
  • [44] Dynamic Task Subspace Ensemble for Class-Incremental Learning
    Zhang, Weile
    He, Yuanjian
    Cong, Yulai
    [J]. ARTIFICIAL INTELLIGENCE, CICAI 2023, PT II, 2024, 14474 : 322 - 334
  • [45] Class-incremental learning for multi-organ segmentation
    Chen, Junyu
    Frey, Eric
    Du, Yong
    [J]. JOURNAL OF NUCLEAR MEDICINE, 2022, 63
  • [46] Mixup-Inspired Video Class-Incremental Learning
    Long, Jinqiang
    Gao, Yizhao
    Lu, Zhiwu
    [J]. 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, ICDM 2023, 2023, : 1181 - 1186
  • [47] Distilling Causal Effect of Data in Class-Incremental Learning
    Hu, Xinting
    Tang, Kaihua
    Miao, Chunyan
    Hua, Xian-Sheng
    Zhang, Hanwang
    [J]. 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 3956 - 3965
  • [48] FOSTER: Feature Boosting and Compression for Class-Incremental Learning
    Wang, Fu-Yun
    Zhou, Da-Wei
    Ye, Han-Jia
    Zhan, De-Chuan
    [J]. COMPUTER VISION, ECCV 2022, PT XXV, 2022, 13685 : 398 - 414
  • [49] Class-Incremental Generalized Zero-Shot Learning
    Sun, Zhenfeng
    Feng, Rui
    Fu, Yanwei
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (24) : 38233 - 38247