Classifier subset selection based on classifier representation and clustering ensemble

Cited by: 4
Authors
Li, Danyang [1 ]
Zhang, Zhuhong [1 ]
Wen, Guihua [2 ]
Affiliations
[1] Guizhou Univ, Big Data & Informat Engn Coll, Guiyang 550002, Guizhou, Peoples R China
[2] South China Univ Technol, Comp Sci & Engn Coll, Guangzhou 511400, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Ensemble pruning; Classifier representation; Clustering ensemble; Classifier ensemble; Diversity; Accuracy; Model; Similarity; Margin
DOI
10.1007/s10489-023-04572-x
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Ensemble pruning can improve the performance and reduce the storage requirements of an ensemble system. Most ensemble pruning approaches remove low-quality or redundant classifiers by evaluating the classifiers' competence and relationships through their predictions. However, how best to represent classifiers and create ensemble diversity remains an open research problem in ensemble pruning. To address this issue, we examine whether properties other than predictions can represent classifiers and propose a new classifier selection method, classifier-representation- and clustering-ensemble-based ensemble pruning (CRCEEP). In the proposed method, two new classifier-representation-learning methods, local-space-based and relative-transformation-based representation, are proposed to obtain more information about classifiers. CRCEEP incorporates a clustering ensemble method to group classifiers and prune redundant learners. Finally, accurate and diverse classifiers are integrated to improve classification performance. Extensive experiments were carried out on UCI datasets, and the results verify CRCEEP's effectiveness and the necessity of classifier representation.
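The abstract outlines a three-stage pipeline: learn a representation for each base classifier, group the representations with a clustering ensemble, and keep accurate, diverse classifiers from the groups for the final combination. The Python sketch below is a minimal, hypothetical illustration of that kind of pipeline, not the authors' CRCEEP implementation: the paper's local-space- and relative-transformation-based representations are replaced by plain validation-set prediction vectors, and the pool size, number of clusters, number of clustering runs, and dataset are arbitrary illustrative choices.

# Hypothetical CRCEEP-style sketch: (1) represent classifiers, (2) cluster the
# representations with a simple clustering ensemble, (3) prune to one accurate
# classifier per cluster and combine the survivors by majority vote.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans, AgglomerativeClustering

rng = np.random.RandomState(0)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Build a pool of bagged decision trees (pool size 30 is an arbitrary choice).
pool = []
for _ in range(30):
    idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)
    pool.append(DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr[idx], y_tr[idx]))

# (1) Classifier representation: each classifier is represented by its vector of
# predictions on a held-out validation set (a stand-in for the paper's
# local-space- and relative-transformation-based representations).
reps = np.array([clf.predict(X_val) for clf in pool], dtype=float)

# (2) Clustering ensemble: run k-means with several seeds, accumulate a
# co-association matrix, and cut it with average-linkage agglomerative clustering.
n_clf, n_runs, n_groups = len(pool), 10, 5
co_assoc = np.zeros((n_clf, n_clf))
for seed in range(n_runs):
    labels = KMeans(n_clusters=n_groups, n_init=10, random_state=seed).fit_predict(reps)
    co_assoc += (labels[:, None] == labels[None, :])
co_assoc /= n_runs
final_labels = AgglomerativeClustering(
    n_clusters=n_groups, metric="precomputed", linkage="average"  # scikit-learn >= 1.2
).fit_predict(1.0 - co_assoc)

# (3) Pruning and integration: keep the most accurate classifier in each cluster,
# then combine the selected classifiers by majority vote on the test set.
val_acc = np.array([clf.score(X_val, y_val) for clf in pool])
selected = [np.where(final_labels == c)[0][np.argmax(val_acc[final_labels == c])]
            for c in np.unique(final_labels)]
votes = np.array([pool[i].predict(X_te) for i in selected])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)  # binary labels, so a mean vote suffices
print("pruned ensemble size:", len(selected))
print("test accuracy:", float((y_pred == y_te).mean()))

The co-association matrix used here is one common way to fuse several clusterings into a single consensus grouping; the paper's actual clustering-ensemble procedure and selection rule may differ.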
Pages: 20730-20752
Number of pages: 23
Related Papers
50 records in total
  • [1] Classifier subset selection based on classifier representation and clustering ensemble
    Danyang Li
    Zhuhong Zhang
    Guihua Wen
    [J]. Applied Intelligence, 2023, 53 : 20730 - 20752
  • [2] Classifier ensemble selection based on affinity propagation clustering
    Meng, Jun
    Hao, Han
    Luan, Yushi
    [J]. JOURNAL OF BIOMEDICAL INFORMATICS, 2016, 60 : 234 - 242
  • [3] Classifier Ensemble with Relevance-Based Feature Subset Selection
    Zhao, Junyang
    Zhang, Zhili
    Chang, Zhenjun
    Liu, Dianjian
    [J]. 2017 2ND INTERNATIONAL CONFERENCE ON IMAGE, VISION AND COMPUTING (ICIVC 2017), 2017, : 1137 - 1141
  • [4] Classifier Ensemble Framework Based on Clustering
    Parvin, Hamid
    Parvin, Sajad
    Rezaei, Zahra
    Mohamadi, Moslem
    [J]. DISTRIBUTED COMPUTING AND ARTIFICIAL INTELLIGENCE, 2012, 151 : 743 - 750
  • [5] Classifier Ensemble Framework Based on Clustering Method
    Parvin, Hamid
    Parvin, Sajad
    Rezaei, Zahra
    Mohamadi, Moslem
    [J]. EMERGING TRENDS AND APPLICATIONS IN INFORMATION COMMUNICATION TECHNOLOGIES, 2012, 281 : 338 - 348
  • [6] Proposing a classifier ensemble framework based on classifier selection and decision tree
    Parvin, Hamid
    MirnabiBaboli, Miresmaeil
    Alinejad-Rokny, Hamid
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2015, 37 : 34 - 42
  • [7] Adaptive Selection of Classifier Ensemble Based on GMDH
    Xiao, Jin
    He, Changzheng
    [J]. 2008 INTERNATIONAL SEMINAR ON FUTURE INFORMATION TECHNOLOGY AND MANAGEMENT ENGINEERING, PROCEEDINGS, 2008, : 61 - 64
  • [8] Dynamic Classifier Ensemble Selection Based on GMDH
    Xiao, Jin
    He, Changzheng
    [J]. INTERNATIONAL JOINT CONFERENCE ON COMPUTATIONAL SCIENCES AND OPTIMIZATION, VOL 1, PROCEEDINGS, 2009, : 731 - 734
  • [9] Ensemble Selection based on Classifier Prediction Confidence
    Tien Thanh Nguyen
    Anh Vu Luong
    Manh Truong Dang
    Liew, Alan Wee-Chung
    McCall, John
    [J]. PATTERN RECOGNITION, 2020, 100
  • [10] Integrate Classifier Diversity Evaluation to Feature Selection Based Classifier Ensemble Reduction
    Yao, Gang
    Chao, Fei
    Zeng, Hualin
    Shi, Minghui
    Jiang, Min
    Zhou, Changle
    [J]. 2014 14TH UK WORKSHOP ON COMPUTATIONAL INTELLIGENCE (UKCI), 2014, : 37 - 43