Design of lightweight incremental ensemble learning algorithm

Cited by: 0
Authors
Ding J. [1]
Tang J. [1]
Yu Z. [1]
Affiliations
[1] School of Electronic Engineering, Xidian University, Xi'an
Keywords
Classification and regression tree (CART); Computational complexity; Emitter classification; Ensemble learning; Open set recognition
DOI
10.12305/j.issn.1001-506X.2021.04.01
Abstract
Conventional classification and regression tree (CART) models can recognize new categories only by retraining the entire model, which greatly increases the training cost when the number of sample categories is large. To solve this problem, a lightweight incremental ensemble learning algorithm is proposed. When new categories enter the training set, they can be classified simply by adding CART base classifiers with open set recognition ability to the original ensemble; no retraining is required, so the computational complexity is reduced and the learning process is simplified. Simulation experiments set against an emitter classification background show that the algorithm maintains a classification accuracy above 90% when the signal-to-noise ratio is -4 dB or higher, and that it significantly reduces the training cost compared with conventional CART when the number of categories to be classified is large. © 2021, Editorial Office of Systems Engineering and Electronics. All rights reserved.
Pages: 861-867
Number of pages: 6
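
The abstract only describes the idea at a high level. The sketch below is a minimal, illustrative reading of how such an incremental CART ensemble could be organized, not the authors' implementation: it assumes scikit-learn's DecisionTreeClassifier as the CART base learner and uses a probability threshold plus an explicit "other" class as a crude stand-in for the open set recognition mechanism the paper relies on. The class name `IncrementalCARTEnsemble`, the parameter `reject_threshold`, and the combination rule are hypothetical.

```python
# Hypothetical sketch of a lightweight incremental CART ensemble.
# Assumptions: scikit-learn's DecisionTreeClassifier stands in for CART,
# and open set recognition is approximated by a confidence threshold
# plus an explicit OTHER class. Not the authors' implementation.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

OTHER = -1  # placeholder label meaning "not one of this member's own categories"


class IncrementalCARTEnsemble:
    """Grows by one CART member per batch of new categories;
    existing members are never retrained."""

    def __init__(self, reject_threshold=0.6):
        self.members = []                    # list of (tree, own category set)
        self.reject_threshold = reject_threshold

    def add_categories(self, X_new, y_new, X_known=None):
        """Fit one new CART on the newly arrived categories only. If samples of
        previously known categories are supplied, they are relabeled as a generic
        OTHER class so the new member can reject them (a crude stand-in for the
        open set recognition ability of the base classifiers)."""
        X_new, y_new = np.asarray(X_new), np.asarray(y_new)
        if X_known is not None and len(X_known) > 0:
            X = np.vstack([X_new, X_known])
            y = np.concatenate([y_new, np.full(len(X_known), OTHER)])
        else:
            X, y = X_new, y_new
        tree = DecisionTreeClassifier().fit(X, y)   # CART-style tree
        self.members.append((tree, set(np.unique(y_new))))

    def predict(self, X):
        """Each member votes only for its own categories and rejects otherwise;
        the ensemble keeps the most confident accepted vote (ties go to the
        member added later), and returns OTHER when every member rejects."""
        X = np.asarray(X)
        best_conf = np.zeros(len(X))
        out = np.full(len(X), OTHER)
        for tree, own in self.members:
            proba = tree.predict_proba(X)
            conf = proba.max(axis=1)
            labels = tree.classes_[proba.argmax(axis=1)]
            accept = (conf >= self.reject_threshold) & np.isin(labels, list(own))
            better = accept & (conf >= best_conf)
            best_conf[better] = conf[better]
            out[better] = labels[better]
        return out


# Usage: start with categories {0, 1}, then add category 2 without retraining.
rng = np.random.default_rng(0)
X01 = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(4, 1, (50, 4))])
y01 = np.repeat([0, 1], 50)
X2, y2 = rng.normal(-4, 1, (50, 4)), np.full(50, 2)

ens = IncrementalCARTEnsemble()
ens.add_categories(X01, y01)                # initial ensemble member
ens.add_categories(X2, y2, X_known=X01)     # new category, old member untouched
print(ens.predict(np.vstack([X01[:3], X2[:3]])))  # roughly [0 0 0 2 2 2]
```

The design choice mirrors the abstract's claim: each arriving batch of categories costs only one additional base-classifier fit, so the existing ensemble is never retrained and the incremental training cost stays independent of the number of categories already learned.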