Pruning and dynamic scheduling of cost-sensitive ensembles

Cited by: 0
Authors
Fan, W [1 ]
Chu, F [1 ]
Wang, HX [1 ]
Yu, PS [1 ]
Affiliation
[1] IBM Corp., T. J. Watson Research Center, Hawthorne, NY 10532, USA
Source
EIGHTEENTH NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-02)/FOURTEENTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE (IAAI-02), PROCEEDINGS | 2002
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Previous research has shown that an averaging ensemble can scale up learning over very large cost-sensitive datasets with linear speedup, independent of the learning algorithm, while achieving the same or even better accuracy than a single model computed from the entire dataset. However, one major drawback is its inefficiency in prediction, since every base model in the ensemble has to be consulted in order to produce a final prediction. In this paper, we propose several approaches to reduce the number of base classifiers. Among the methods explored, our empirical studies show that the benefit-based greedy approach can safely remove more than 90% of the base models while maintaining or even exceeding the prediction accuracy of the original ensemble. Assuming that each base classifier consumes one unit of prediction time, removing 90% of the base classifiers translates to a prediction speedup of 10 times. On top of pruning, we propose a novel dynamic scheduling approach to further reduce the "expected" number of classifiers employed in prediction. It measures the confidence of a prediction made by a subset of classifiers in the pruned ensemble; this confidence is used to decide whether more classifiers are needed to produce a prediction that matches the original unpruned ensemble. This approach reduces the "expected" number of classifiers by another 25% to 75% without loss of accuracy.
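The dynamic scheduling idea described in the abstract can be illustrated with a short sketch: consult the pruned ensemble's base models one at a time, track the running average score, and stop as soon as the running prediction is confident enough. The sketch below is only an assumed illustration, not the authors' implementation; the names (`DynamicEnsemble`, `confidence_threshold`) are hypothetical, base models are assumed to return a probability-like score in [0, 1], and the margin-from-0.5 confidence measure stands in for whatever confidence statistic the paper actually uses.

```python
from typing import Callable, Sequence


class DynamicEnsemble:
    """Hypothetical dynamic scheduler over a pruned averaging ensemble.

    Base models are consulted in order (ideally sorted by estimated benefit);
    evaluation stops early once the running prediction looks confident.
    """

    def __init__(self, base_models: Sequence[Callable[[object], float]],
                 confidence_threshold: float = 0.9):
        self.base_models = list(base_models)          # pruned ensemble
        self.confidence_threshold = confidence_threshold

    def predict(self, x):
        total, used, avg = 0.0, 0, 0.5
        for model in self.base_models:
            total += model(x)                          # score assumed in [0, 1]
            used += 1
            avg = total / used
            # Confidence = distance of the running average from the 0.5
            # decision boundary, rescaled to [0, 1] (an assumed measure).
            if abs(avg - 0.5) * 2 >= self.confidence_threshold:
                break                                  # stop consulting models
        return int(avg >= 0.5), used                   # label, models consulted


if __name__ == "__main__":
    # Toy usage: constant "models"; the first score is already decisive,
    # so the remaining models are never consulted.
    models = [lambda x: 0.95, lambda x: 0.90, lambda x: 0.10]
    print(DynamicEnsemble(models, confidence_threshold=0.7).predict(x=None))
```

The early exit is what shrinks the expected number of classifiers: easy examples stop after a handful of models, while only ambiguous ones fall back to the full pruned ensemble, which is consistent with the 25% to 75% reduction reported in the abstract.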
Pages: 146 - 151
Page count: 6
Related papers
50 records in total
  • [1] COST-SENSITIVE DECISION TREE WITH PROBABILISTIC PRUNING MECHANISM
    Zhao, Hong
    Li, Xiang-Ju
    Xu, Zi-Long
    Zhu, William
    PROCEEDINGS OF 2015 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), VOL. 1, 2015, : 81 - 87
  • [2] Cost-sensitive decision trees with pre-pruning
    Du, Jun
    Cai, Zhihua
    Ling, Charles X.
    ADVANCES IN ARTIFICIAL INTELLIGENCE, 2007, 4509 : 171+
  • [3] Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter
    Tong, Lei
    Liu, Zhihua
    Jiang, Zheheng
    Zhou, Feixiang
    Chen, Long
    Lyu, Jialin
    Zhang, Xiangrong
    Zhang, Qianni
    Sadka, Abdul
    Wang, Yinhai
    Li, Ling
    Zhou, Huiyu
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 1898 - 1911
  • [4] Cost-sensitive decision tree ensembles for effective imbalanced classification
    Krawczyk, Bartosz
    Wozniak, Michal
    Schaefer, Gerald
    APPLIED SOFT COMPUTING, 2014, 14 : 554 - 562
  • [5] CC4.5: cost-sensitive decision tree pruning
    Cai, J
    Durkin, J
    Cai, Q
    Data Mining VI: Data Mining, Text Mining and Their Business Applications, 2005, : 239 - 245
  • [6] Evolutionary Optimisation of Classifiers and Classifier Ensembles for Cost-Sensitive Pattern Recognition
    Schaefer, Gerald
    2013 IEEE 8TH INTERNATIONAL SYMPOSIUM ON APPLIED COMPUTATIONAL INTELLIGENCE AND INFORMATICS (SACI 2013), 2013, : 343 - 346
  • [7] Comparative study of classifier ensembles for cost-sensitive credit risk assessment
    Chen, Ning
    Ribeiro, Bernardete
    Chen, An
    INTELLIGENT DATA ANALYSIS, 2015, 19 (01) : 127 - 144
  • [8] Pruning Ensembles with Cost Constraints
    Krawczyk, Bartosz
    Wozniak, Michal
    INTELLIGENT INFORMATION AND DATABASE SYSTEMS, PT I, 2015, 9011 : 503 - 512
  • [9] Cost-sensitive decision trees with post-pruning and competition for numeric data
    Min, F.
    Binary Information Press, (09)
  • [10] A GENETIC PROGRAMMING-BASED LEARNING ALGORITHMS FOR PRUNING COST-SENSITIVE CLASSIFIERS
    Nikdel, Zahra
    Beigy, Hamid
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2012, 11 (02)