Class-Incremental Learning Method With Fast Update and High Retainability Based on Broad Learning System

Cited by: 5
Authors
Du, Jie [1 ]
Liu, Peng [2 ]
Vong, Chi-Man [2 ]
Chen, Chuangquan [3 ]
Wang, Tianfu [1 ]
Chen, C. L. Philip [4 ,5 ]
Affiliations
[1] Shenzhen Univ, Natl Reg Key Technol Engn Lab Med Ultrasound, Guangdong Key Lab Biomed Measurements & Ultrasound, Sch Biomed Engn,Hlth Sci Ctr, Shenzhen 518060, Peoples R China
[2] Univ Macau, Dept Comp & Informat Sci, Macau, Peoples R China
[3] Wuyi Univ, Fac Intelligent Mfg, Jiangmen 529020, Peoples R China
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou, Peoples R China
[5] South China Univ Technol, Pazhou Lab, Guangzhou 510335, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Training; Task analysis; Learning systems; Data models; Predictive models; Correlation; Support vector machines; Broad learning system (BLS); catastrophic forgetting; class correlations; class-incremental learning (CIL); recursive update rule;
DOI
10.1109/TNNLS.2023.3259016
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Machine learning aims to generate a predictive model from a training dataset with a fixed number of known classes. However, many real-world applications (such as health monitoring and elderly care) produce data streams in which new data arrive continually within a short time. Such new data may even belong to previously unknown classes. Hence, class-incremental learning (CIL) is necessary: it incrementally and rapidly updates an existing model with the data of new classes while retaining the existing knowledge of old classes. However, most current CIL methods are built on deep models that require a computationally expensive training and update process. In addition, deep-learning-based CIL (DCIL) methods typically employ stochastic gradient descent (SGD) as the optimizer, which forgets the old knowledge to a certain extent. In this article, a broad learning system-based CIL (BLS-CIL) method with fast update and high retainability of old-class knowledge is proposed. Traditional BLS is a fast and effective shallow neural network, but it does not work well on CIL tasks. Our proposed BLS-CIL overcomes these issues and provides the following: 1) high accuracy due to our novel class-correlation loss function that considers the correlations between old and new classes; 2) significantly short training/update time due to the newly derived closed-form solution for our class-correlation loss, without iterative optimization; and 3) high retainability of old-class knowledge due to our newly derived recursive update rule for CIL (RULL), which does not replay the exemplars of all old classes, in contrast to exemplar-replaying methods with the SGD optimizer. The proposed BLS-CIL has been evaluated over 12 real-world datasets, including six tabular/numerical datasets and six image datasets, and the compared methods include one shallow network and seven classical or state-of-the-art DCIL methods.
Experimental results show that our BLS-CIL improves classification performance over a shallow network by a large margin (8.80%-48.42%). It also achieves comparable or even higher accuracy than DCIL methods, while greatly reducing the training time from hours to minutes and the update time from minutes to seconds.
Pages: 11332-11345 (14 pages)