Manifold Nearest Neighbor Sample Envelope and Hierarchical Multitype Transform Algorithm for Ensemble Learning

Cited: 0
Authors
Yan, Fang [1 ]
Ma, Jie [1 ]
Li, Yong-Ming [1 ]
Wang, Pin [1 ]
Qin, Jian [1 ]
Liu, Cheng-Yu [1 ]
Affiliations
[1] School of Microelectronics and Communication Engineering, Chongqing University, Chongqing 400044, China
Abstract
Ensemble learning is an important branch and research hotspot in machine learning. The current main paradigm of ensemble learning algorithms is to obtain multiple sample subsets from the original sample set, train base classifiers on them separately, and then integrate the base classifiers' results. The main problem with this paradigm is that the diversity among subsets is significantly reduced, since all subsets are derived from the same original sample set. This problem is especially serious when the original sample set is small, the sampling ratio is large, and the degree of class imbalance is high. In addition, the improvement in the separability of the sample subsets obtained by resampling is limited when the separability of the original sample set is low. To solve this problem, this paper proposes a manifold nearest neighbor sample envelope and hierarchical multitype transformation algorithm for ensemble learning. It aims to improve the diversity and separability of the sample subsets by transforming the original sample set into a hierarchical, differentiated enveloped sample set through an envelopment mechanism and a multitype sample transformation. First, a manifold nearest neighbor sample envelope mechanism is designed to transform the original samples into sample envelopes. Second, a multitype sample transformation is performed on the sample envelopes to reconstruct and generate hierarchical envelope samples. Third, an inter-layer consistency preservation mechanism based on joint structure domain adaptation is designed to preserve the distribution consistency of the samples before and after the transformation, thus improving how well the envelope samples represent the original samples. Fourth, feature dimensionality reduction and base classifier training are performed separately for each layer of the envelope sample set. Finally, the classification results are obtained using a two-dimensional decision fusion mechanism. More than ten datasets and several representative algorithms are used for experimental validation. The results show that, compared with the original sample set, the proposed algorithm improves the diversity of the sample subsets and thereby the ensemble learning performance, with an accuracy improvement of up to 18.56%. Compared with related ensemble learning algorithms, the accuracy of the proposed algorithm improves by up to 7.56%. This paper provides a new idea for improving existing ensemble learning algorithms, and it is valuable for transforming the paradigm of ensemble learning based directly on original samples into a new paradigm of ensemble learning based on hierarchical envelope samples. © 2024 Chinese Institute of Electronics. All rights reserved.
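To make the pipeline in the abstract concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes an Isomap embedding for the manifold neighborhood search, three ad-hoc transforms (mean, distance-weighted mean, PCA compression of the stacked envelope) standing in for the multitype sample transformation, logistic regression as the base classifier, and plain probability averaging in place of the two-dimensional decision fusion; the inter-layer consistency preservation step is omitted entirely.

```python
# Structural sketch only (assumed design, not the paper's method): build envelope
# samples from manifold nearest neighbors, derive several transformed layers,
# train one base classifier per layer, and fuse by averaging class probabilities.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.manifold import Isomap
from sklearn.neighbors import NearestNeighbors


def build_envelopes(X, k=5):
    """Group every sample with its k nearest neighbors in an Isomap embedding."""
    emb = Isomap(n_neighbors=k, n_components=min(5, X.shape[1])).fit_transform(X)
    _, idx = NearestNeighbors(n_neighbors=k + 1).fit(emb).kneighbors(emb)
    return X[idx]  # shape (n_samples, k + 1, n_features); row i = sample i plus neighbors


def multitype_layers(envelopes):
    """Derive several layers of transformed samples from each envelope (assumed transforms)."""
    n, m, d = envelopes.shape
    mean_layer = envelopes.mean(axis=1)                  # plain envelope average
    w = np.linspace(1.0, 0.5, m)[None, :, None]          # nearer neighbors weigh more
    weighted_layer = (envelopes * w).sum(axis=1) / w.sum()
    pca_layer = PCA(n_components=d).fit_transform(envelopes.reshape(n, -1))
    return [mean_layer, weighted_layer, pca_layer]


# Train one base classifier per layer and fuse by averaging class probabilities.
# For brevity this demo fits and scores on the same data; a real experiment would
# use a proper train/test split or cross-validation.
X, y = load_iris(return_X_y=True)
layers = multitype_layers(build_envelopes(X, k=5))
classifiers = [LogisticRegression(max_iter=1000).fit(L, y) for L in layers]
fused = np.mean([c.predict_proba(L) for c, L in zip(classifiers, layers)], axis=0)
print("resubstitution accuracy:", (fused.argmax(axis=1) == y).mean())
```

The intent of the sketch is only to show how per-layer training plus late fusion differs from resampling a single original sample set; the specific envelope transforms and fusion rule here are placeholders.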
DOI: 10.12263/DZXB.20231002
Pages: 4125-4141
Related papers (50 in total)
  • [1] Hierarchical manifold sample envelope transformation model for ensemble classification
    Ma, Jie
    Chen, Hong
    Li, Yongming
    Wang, Pin
    Liu, Chengyu
    Shen, Yinghua
    Pedrycz, Witold
    Wang, Wei
    Li, Fan
    COMPUTERS & ELECTRICAL ENGINEERING, 2025, 123
  • [2] Ensemble learning approach in Improved K Nearest Neighbor algorithm for Text Categorization
    Iswarya, P.
    Radha, V.
    2015 INTERNATIONAL CONFERENCE ON INNOVATIONS IN INFORMATION, EMBEDDED AND COMMUNICATION SYSTEMS (ICIIECS), 2015,
  • [3] Ensemble numeric prediction of nearest-neighbor learning
    He L.
    Song Q.
    Shen J.
    Hai Z.
    Information Technology Journal, 2010, 9 (03) : 535 - 544
  • [4] DISCONA: distributed sample compression for nearest neighbor algorithm
    Rybicki, Jedrzej
    Frenklach, Tatiana
    Puzis, Rami
    APPLIED INTELLIGENCE, 2023, 53 (17) : 19976 - 19989
  • [5] DISCONA: distributed sample compression for nearest neighbor algorithm
    Jedrzej Rybicki
    Tatiana Frenklach
    Rami Puzis
    Applied Intelligence, 2023, 53 : 19976 - 19989
  • [6] Hierarchical distance learning by stacking nearest neighbor classifiers
    Ozay, Mete
    Yarman-Vural, Fatos Tunay
    INFORMATION FUSION, 2016, 29 : 14 - 31
  • [7] Imbalanced Ensemble Algorithm Based on Envelope Learning and Hierarchical Structure Consistency Mechanism
    Li F.
    Zhang X.-H.
    Li Y.-M.
    Wang P.
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2024, 52 (03): 751 - 761
  • [8] Manifold learning using Euclidean k-nearest neighbor graphs
    Costa, JA
    Hero, AO
    2004 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL III, PROCEEDINGS: IMAGE AND MULTIDIMENSIONAL SIGNAL PROCESSING SPECIAL SESSIONS, 2004, : 988 - 991
  • [9] Manifold neighboring envelope sample generation mechanism for imbalanced ensemble classification
    Wang, Yiwen
    Li, Yongming
    Shen, Yinghua
    Li, Fan
    Wang, Pin
    INFORMATION SCIENCES, 2024, 679
  • [10] A Brief Review of Nearest Neighbor Algorithm for Learning and Classification
    Taunk, Kashvi
    De, Sanjukta
    Verma, Srishti
    Swetapadma, Aleena
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING AND CONTROL SYSTEMS (ICCS), 2019, : 1255 - 1260