A Double Pruning Algorithm for Classification Ensembles

Cited by: 0
Authors
Soto, Victor [1 ]
Martinez-Munoz, Gonzalo [1 ]
Hernandez-Lobato, Daniel [1 ]
Suarez, Alberto [1 ]
Institutions
[1] Univ Autonoma Madrid, EPS, E-28049 Madrid, Spain
Source
Keywords
ensemble pruning; instance-based pruning; ensemble learning; decision trees;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
This article introduces a double pruning algorithm that can be used to reduce the storage requirements, speed up the classification process, and improve the performance of parallel ensembles. A key element in the design of the algorithm is the estimation of the class label that the ensemble assigns to a given test instance by polling only a fraction of its classifiers. Instead of applying this form of dynamic (instance-based) pruning to the original ensemble, we propose to apply it to a subset of classifiers selected using standard ensemble pruning techniques. The pruned subensemble is built by first modifying the order in which classifiers are aggregated in the ensemble and then selecting the first classifiers in the ordered sequence. Experiments on benchmark problems illustrate the improvements that can be obtained with this technique. Specifically, using a bagging ensemble of 101 CART trees as a starting point, only the 21 trees of the pruned ordered ensemble need to be stored in memory. Depending on the classification task, on average, only 5 to 12 of these 21 classifiers are queried to compute the predictions. The generalization performance achieved by this double pruning algorithm is similar to pruned ordered bagging and significantly better than standard bagging.
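The two stages described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it uses scikit-learn bagging, a greedy training-error ordering heuristic (one of several standard ordering criteria; the article's exact criterion may differ), and a deterministic early-stopping rule in place of the article's statistical instance-based pruning. The ensemble size (25), subensemble size (7), and dataset are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

def order_by_greedy_error(estimators, X, y, n_classes=2):
    """Reorder ensemble members so each prefix of the sequence greedily
    minimizes training error (a common ordering heuristic)."""
    preds = [est.predict(X).astype(int) for est in estimators]
    remaining = list(range(len(estimators)))
    order = []
    votes = np.zeros((len(X), n_classes))
    rows = np.arange(len(X))
    while remaining:
        best, best_err = None, np.inf
        for i in remaining:
            trial = votes.copy()
            trial[rows, preds[i]] += 1  # add candidate's votes
            err = np.mean(trial.argmax(axis=1) != y)
            if err < best_err:
                best, best_err = i, err
        order.append(best)
        votes[rows, preds[best]] += 1
        remaining.remove(best)
    return order

def predict_with_early_stop(subensemble, x, n_classes=2):
    """Poll members one at a time; stop once the leading class can no
    longer be overtaken by the outstanding votes (a deterministic
    stand-in for the article's statistical instance-based pruning)."""
    counts = np.zeros(n_classes, dtype=int)
    for k, est in enumerate(subensemble, start=1):
        counts[int(est.predict(x.reshape(1, -1))[0])] += 1
        outstanding = len(subensemble) - k
        runner_up = np.sort(counts)[-2]
        if counts.max() > runner_up + outstanding:
            return int(counts.argmax()), k  # decided early
    return int(counts.argmax()), len(subensemble)

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                        random_state=0).fit(X, y)
order = order_by_greedy_error(bag.estimators_, X, y)
pruned = [bag.estimators_[i] for i in order[:7]]  # first pruning: keep 7 of 25
label, queried = predict_with_early_stop(pruned, X[0])  # second pruning
```

At prediction time only `queried` of the 7 retained trees are evaluated, mirroring the abstract's observation that, on average, far fewer classifiers than the stored subensemble size need to be polled.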
Pages: 104-113
Number of pages: 10
Related papers (50 in total)
  • [21] A New Multilayer Perceptron Pruning Algorithm for Classification and Regression Applications
    Thomas, Philippe
    Suhner, Marie-Christine
    NEURAL PROCESSING LETTERS, 2015, 42 (02) : 437 - 458
  • [23] Hierarchical Pruning of Deep Ensembles with Focal Diversity
    Wu, Yanzhao
    Chow, Ka-Ho
    Wei, Wenqi
    Liu, Ling
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2024, 15 (01)
  • [24] Effective pruning of neural network classifier ensembles
    Lazarevic, A
    Obradovic, Z
    IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001, : 796 - 801
  • [25] Collective-agreement-based pruning of ensembles
    Rokach, Lior
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2009, 53 (04) : 1015 - 1026
  • [26] Pruning CNN's with Linear Filter Ensembles
    Sandor, Csanad
    Pavel, Szabolcs
    Csato, Lehel
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 1435 - 1442
  • [27] Method for pruning Bagging ensembles and its applications
    School of Economics and Finance, Xi'an Jiaotong University, Xi'an 710049, China
    Xitong Gongcheng Lilun yu Shijian (Systems Engineering - Theory & Practice), 2008, (7) : 105 - 110
  • [28] Pruning optimum-path forest ensembles using metaheuristic optimization for land-cover classification
    Nachif Fernandes, Silas Evandro
    de Souza, Andre Nunes
    Gastaldello, Danilo Sinkiti
    Pereira, Danillo Roberto
    Papa, Joao Paulo
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2017, 38 (20) : 5736 - 5762
  • [29] An improved ensemble pruning for mammogram classification using modified Bees algorithm
    Qasem, Ashwaq
    Abdullah, Siti Norul Huda Sheikh
    Sahran, Shahnorbanun
    Albashish, Dheeb
    Goudarzi, Shidrokh
    Arasaratnam, Shantini
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (12) : 10093 - 10116
  • [30] Feature Pruning for Partial Discharge Classification using IndFeat and ReliefF Algorithm
    Raymond, Wong Jee Keen
    Sing, Lau Theng
    Kin, Lai Weng
    Meng, Goh Kam
    Illias, Hazlee Azil
    Abu Bakar, Ab Halim
    2018 IEEE 2ND INTERNATIONAL CONFERENCE ON DIELECTRICS (ICD), 2018,