Comparison of Bagging and Boosting Algorithms on Sample and Feature Weighting

Cited by: 0
Authors
Shirai, Satoshi [1 ]
Kudo, Mineichi [1 ]
Nakamura, Atsuyoshi [1 ]
Affiliation
[1] Hokkaido Univ, Div Comp Sci, Grad Sch Informat Sci & Technol, Sapporo, Hokkaido 0600814, Japan
Source
Keywords
RANDOM SUBSPACE METHOD;
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We compared boosting with bagging using learning algorithms of different strengths in order to improve the performance of the set of classifiers to be fused. Our experimental results showed that boosting worked well with weak algorithms, whereas bagging, especially feature-based bagging, worked well with strong algorithms. On the basis of these observations we developed a mixed fusion method in which randomly chosen features are used with a standard boosting method. As a result, it was confirmed that the proposed fusion method worked well regardless of the learning algorithm.
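The following is a minimal sketch of the mixed fusion idea summarized in the abstract: each ensemble member is a standard booster trained on a randomly chosen feature subset, and the members are fused by voting. It is not the authors' implementation; the class name RandomSubspaceBoosting, the subset ratio, the member count, the use of scikit-learn's AdaBoostClassifier, and the majority-vote fusion are all illustrative assumptions.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier


class RandomSubspaceBoosting:
    """Fuse several standard boosters, each trained on a random feature subset (sketch)."""

    def __init__(self, n_members=10, feature_ratio=0.5, random_state=0):
        self.n_members = n_members          # number of boosted members to fuse (assumed value)
        self.feature_ratio = feature_ratio  # fraction of features given to each member (assumed value)
        self.rng = np.random.default_rng(random_state)
        self.members = []                   # list of (feature indices, fitted booster) pairs

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        n_features = X.shape[1]
        k = max(1, int(self.feature_ratio * n_features))
        self.members = []
        for _ in range(self.n_members):
            # draw a random feature subset for this member (the feature-based bagging part)
            idx = self.rng.choice(n_features, size=k, replace=False)
            # run a standard boosting method on that subset
            booster = AdaBoostClassifier(n_estimators=50)
            booster.fit(X[:, idx], y)
            self.members.append((idx, booster))
        return self

    def predict(self, X):
        X = np.asarray(X)
        # fuse member predictions by unweighted majority vote (assumes integer class labels)
        votes = np.stack([booster.predict(X[:, idx]) for idx, booster in self.members])
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes.astype(int))

Usage would follow the usual scikit-learn pattern, e.g. RandomSubspaceBoosting().fit(X_train, y_train).predict(X_test).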
Pages: 22-31
Number of pages: 10