Confidence-weighted Learning for Feature Evolution

Cited by: 0
Authors
Liu Y.-F. [1,2]
Li W.-B. [1]
Gao Y. [1]
Affiliations
[1] State Key Laboratory for Novel Software Technology (Nanjing University), Nanjing
[2] College of Mathematics and Information Engineering, Longyan University, Longyan
Source
Ruan Jian Xue Bao/Journal of Software | 2022 / Vol. 33 / No. 4
Keywords
Classification; Evolvable features; Machine learning; Online learning; Second-order confidence-weighted;
DOI
10.13328/j.cnki.jos.006480
Abstract
Compared with traditional online learning over a fixed feature space, feature evolvable learning assumes that features do not vanish or appear in an arbitrary way; rather, as the hardware devices that gather the data are exchanged, the old features disappear and the new features emerge at the same time. However, existing feature evolvable algorithms exploit only the first-order information of data streams and ignore the second-order information, which captures the correlations between features and can significantly improve classification performance. A confidence-weighted learning for feature evolution (CWFE) algorithm is proposed to solve this problem. First, second-order confidence-weighted learning for data streams is introduced to update the prediction model. Next, to benefit the learned model, a linear mapping is learned during the overlap period so that the old features can be recovered. Then, the existing model is updated with the recovered old features and, at the same time, a new predictive model is learned with the new features. Furthermore, two ensemble methods are introduced to combine these two models. Finally, empirical studies show superior performance over state-of-the-art feature evolvable algorithms. © Copyright 2022, Institute of Software, the Chinese Academy of Sciences. All rights reserved.
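The abstract outlines the CWFE pipeline only at a high level. The following is a minimal sketch of that flow under stated assumptions: an AROW-style second-order update stands in for the paper's exact confidence-weighted rule, a least-squares mapping recovers old features from new ones during the overlap period, and a simple weighted combination stands in for the paper's two ensemble methods. Names such as `SecondOrderLinear`, `learn_recovery_map`, and the regularizer `r` are illustrative, not taken from the paper.

```python
# Minimal sketch of a feature-evolvable pipeline with a second-order
# (confidence-weighted family) update. Illustrative only; not the paper's code.
import numpy as np


class SecondOrderLinear:
    """Linear classifier with a Gaussian weight distribution (mean, covariance)."""

    def __init__(self, dim, r=1.0):
        self.mu = np.zeros(dim)      # mean weight vector
        self.sigma = np.eye(dim)     # weight covariance (feature correlations)
        self.r = r                   # assumed regularizer of the update step

    def predict(self, x):
        return np.sign(self.mu @ x)

    def update(self, x, y):
        # Second-order update: more aggressive along low-confidence directions.
        margin = y * (self.mu @ x)
        v = x @ self.sigma @ x       # predictive variance of this example
        beta = 1.0 / (v + self.r)
        alpha = max(0.0, 1.0 - margin) * beta
        sigma_x = self.sigma @ x
        self.mu += alpha * y * sigma_x
        self.sigma -= beta * np.outer(sigma_x, sigma_x)


def learn_recovery_map(X_new, X_old):
    """Least-squares linear mapping from new features to old features,
    estimated on the overlap period where both feature sets are observed."""
    M, *_ = np.linalg.lstsq(X_new, X_old, rcond=None)
    return M                         # shape: (d_new, d_old)


# --- Usage sketch on synthetic data ----------------------------------------
rng = np.random.default_rng(0)
d_old, d_new, T1, T_overlap = 5, 8, 200, 30

# Stage 1: stream carries old features only; train the first model online.
old_model = SecondOrderLinear(d_old)
for _ in range(T1):
    x = rng.normal(size=d_old)
    y = np.sign(x[0] + 0.1 * rng.normal())
    old_model.update(x, y)

# Overlap period: both feature sets arrive; learn the recovery mapping.
X_new_ov = rng.normal(size=(T_overlap, d_new))
X_old_ov = X_new_ov @ rng.normal(size=(d_new, d_old))   # synthetic relation
M = learn_recovery_map(X_new_ov, X_old_ov)

# Stage 2: only new features arrive. Keep the old model alive on recovered
# old features, train a fresh model on the new features, and combine the two
# (a fixed weighted vote stands in for the paper's ensemble methods).
new_model = SecondOrderLinear(d_new)
w_old, w_new = 0.5, 0.5
x_new = rng.normal(size=d_new)
score = w_old * (old_model.mu @ (x_new @ M)) + w_new * (new_model.mu @ x_new)
pred = np.sign(score)
# When the true label y arrives, both models can be updated:
#   old_model.update(x_new @ M, y); new_model.update(x_new, y)
```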
Pages: 1315-1325
Page count: 10