Manifold Nearest Neighbor Sample Envelope and Hierarchical Multitype Transform Algorithm for Ensemble Learning

Cited: 0
Authors
Yan, Fang [1 ]
Ma, Jie [1 ]
Li, Yong-Ming [1 ]
Wang, Pin [1 ]
Qin, Jian [1 ]
Liu, Cheng-Yu [1 ]
Affiliations
[1] School of Microelectronics and Communication Engineering, Chongqing University, Chongqing 400044, China
Abstract
Ensemble learning is an important branch and research hotspot in machine learning. The current main paradigm of ensemble learning algorithms is to obtain multiple sample subsets from the original sample set, train base classifiers on them separately, and then integrate the base classifiers' results. The main problem with this paradigm is that the diversity among subsets is significantly reduced, since all subsets are derived from the original sample set. This problem is especially serious when the original sample set is small, the sampling ratio is large, and the degree of imbalance is high. In addition, the improvement in the separability of the sample subsets obtained by resampling is limited when the separability of the original sample set is low. To solve this problem, this paper proposes a manifold nearest neighbor sample envelope and hierarchical multitype transformation algorithm for ensemble learning. It aims to improve the diversity and separability of the sample subsets by transforming the original sample set into a differentiated, hierarchical enveloped sample set through an envelopment mechanism and a multitype sample transformation. First, a manifold nearest neighbor sample envelope mechanism is designed to transform the original samples into sample envelopes. Second, a multitype sample transformation is performed on the sample envelopes to reconstruct and generate hierarchical envelope samples. Third, an inter-layer consistency preservation mechanism based on joint structure domain adaptation is designed to preserve the distribution consistency of the samples before and after the transformation, thus improving the ability of the envelope samples to represent the original samples. Fourth, feature dimensionality reduction and base classifier training are performed separately for each layer of the envelope sample set. Finally, the classification results are obtained using a two-dimensional decision fusion mechanism. More than ten datasets and several representative algorithms are used for validation in the experiments. The results show that, compared with the original sample set, the proposed algorithm improves the diversity of the sample subsets, which in turn improves ensemble learning performance, with an accuracy gain of up to 18.56%. Compared with related ensemble learning algorithms, the accuracy of the proposed algorithm is higher by up to 7.56%. This paper provides a new idea for improving existing ensemble learning algorithms, and it is valuable for shifting the paradigm of ensemble learning directly based on original samples to a new paradigm of ensemble learning based on hierarchical envelope samples. © 2024 Chinese Institute of Electronics. All rights reserved.
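The abstract only names the stages of the pipeline; the following is a minimal Python sketch of that paradigm, assuming scikit-learn. The paper's envelope construction, multitype transformation, joint structure domain adaptation, and two-dimensional decision fusion are not specified in the abstract, so crude stand-ins are used here (neighbor averaging, per-layer neighborhood sizes, PCA, and majority voting). All function names and parameters below are illustrative, not taken from the paper.

# Illustrative sketch, not the authors' implementation: an "envelope" is
# approximated by averaging a sample with its k nearest neighbors, and the
# hierarchical layers differ only in the neighborhood size k.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier


def envelope_layer(X, k):
    # Stand-in for the manifold nearest neighbor sample envelope:
    # replace each sample by the mean of itself and its k nearest neighbors.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)            # idx[:, 0] is the sample itself
    return X[idx].mean(axis=1)


def fit_envelope_ensemble(X, y, layer_ks=(3, 5, 7), n_components=5):
    # One envelope "layer" per neighborhood size; each layer gets its own
    # dimensionality reduction and base classifier (hypothetical choices).
    models = []
    for k in layer_ks:
        Xk = envelope_layer(X, k)
        n_comp = min(n_components, Xk.shape[0], Xk.shape[1])
        clf = make_pipeline(PCA(n_components=n_comp),
                            DecisionTreeClassifier(random_state=0))
        clf.fit(Xk, y)
        models.append((k, clf))
    return models


def predict_envelope_ensemble(models, X):
    # Majority vote across layers, standing in for the paper's
    # two-dimensional decision fusion; assumes non-negative integer labels.
    votes = np.stack([clf.predict(envelope_layer(X, k)) for k, clf in models])
    return np.apply_along_axis(lambda v: np.bincount(v.astype(int)).argmax(),
                               0, votes)


# Example usage on a toy dataset:
# from sklearn.datasets import load_breast_cancer
# from sklearn.model_selection import train_test_split
# X, y = load_breast_cancer(return_X_y=True)
# Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
# models = fit_envelope_ensemble(Xtr, ytr)
# accuracy = (predict_envelope_ensemble(models, Xte) == yte).mean()

In this sketch the diversity among base classifiers comes only from the different neighborhood size used in each layer; in the paper that role is played by the hierarchical multitype transformation together with the consistency-preserving domain adaptation step.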
DOI: 10.12263/DZXB.20231002
Pages: 4125-4141
Related Papers (50 in total)
  • [41] Base Model Combination Algorithm for Resolving Tied Predictions for K-Nearest Neighbor OVA Ensemble Models
    Lutu, Patricia E. N.
    Engelbrecht, Andries P.
    INFORMS JOURNAL ON COMPUTING, 2013, 25 (03) : 517 - 526
  • [42] Scalable stellar evolution forecasting: Deep learning emulation versus hierarchical nearest-neighbor interpolation
    Maltsev, K.
    Schneider, F.R.N.
    Röpke, F.K.
    Jordan, A.I.
    Qadir, G.A.
    Kerzendorf, W.E.
    Riedmiller, K.
    van der Smagt, P.
    Astronomy and Astrophysics, 2024, 681
  • [43] Non-Intrusive Load Classification and Recognition Using Soft-Voting Ensemble Learning Algorithm With Decision Tree, K-Nearest Neighbor Algorithm and Multilayer Perceptron
    Yang, Nien-Che
    Sung, Ke-Lin
    IEEE ACCESS, 2023, 11 : 94506 - 94520
  • [44] Regional Typhoon Track Prediction Using Ensemble k-Nearest Neighbor Machine Learning in the GIS Environment
    Tamamadin, Mamad
    Lee, Changkye
    Kee, Seong-Hoon
    Yee, Jurng-Jae
    REMOTE SENSING, 2022, 14 (21)
  • [45] Design Exploration of ASIP Architectures for the K-Nearest Neighbor Machine-Learning Algorithm
    Jamma, Dunia
    Ahmed, Omar
    Areibi, Shawki
    Grewal, Gary
    Molloy, Nicholas
    2016 28TH INTERNATIONAL CONFERENCE ON MICROELECTRONICS (ICM 2016), 2016, : 57 - 60
  • [46] State prediction of algae reproduction in seawater based on learning algorithm of fuzzy nearest neighbor clustering
    Zhang Y.
    Li P.
    Wu Y.
    Dongnan Daxue Xuebao (Ziran Kexue Ban)/Journal of Southeast University (Natural Science Edition), 2011, 41 (SUPPL. 1) : 32 - 35
  • [47] A novel deep learning-based brain tumor detection using the Bagging ensemble with K-nearest neighbor
    Archana, K. V.
    Komarasamy, G.
    JOURNAL OF INTELLIGENT SYSTEMS, 2023, 32 (01)
  • [48] Implementing gate operations between uncoupled qubits in linear nearest neighbor arrays using a learning algorithm
    Garigipati, Rudrayya Chowdary
    Kumar, Preethika
    QUANTUM INFORMATION PROCESSING, 2013, 12 (07) : 2291 - 2308
  • [49] C-approximate nearest neighbor query algorithm based on learning for high-dimensional data
    Yuan, Pei-Sen
    Sha, Chao-Feng
    Wang, Xiao-Ling
    Zhou, Ao-Ying
    Ruan Jian Xue Bao/Journal of Software, 2012, 23 (08) : 2018 - 2031
  • [50] A k-Nearest Neighbor Based Multi-Instance Multi-Label Learning Algorithm
    Zhang, Min-Ling
    22ND INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2010), PROCEEDINGS, VOL 2, 2010, : 207 - 212