Efficient optimal linear boosting of a pair of classifiers

Cited by: 3
Authors:
Boyarshinov, Victor [1 ]
Magdon-Ismail, Malik [1 ]
Institution:
[1] Rensselaer Polytech Inst, Comp Sci Dept, Troy, NY 12180 USA
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2007, Vol. 18, No. 2
Funding:
US National Science Foundation;
Keywords:
intersection; leave-one-out error; minimum weight; point set; separability;
DOI:
10.1109/TNN.2006.881707
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Discipline classification codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
Boosting is a meta-learning algorithm which takes as input a set of classifiers and combines these classifiers to obtain a better classifier. We consider the combinatorial problem of efficiently and optimally boosting a pair of classifiers by reducing this problem to that of constructing the optimal linear separator for two sets of points in two dimensions. Specifically, let each point x ∈ R^2 be assigned a weight W(x) > 0, where the weighting function can be an arbitrary positive function. We give efficient (low-order polynomial time) algorithms for constructing an optimal linear "separator" l, defined as follows. Let Q be the set of points misclassified by l. Then, the weight of Q, defined as the sum of the weights of the points in Q, is minimized. If W(x) = 1 for all points, then the resulting separator minimizes (exactly) the misclassification error. Without an increase in computational complexity, our algorithm can be extended to output the leave-one-out error, an unbiased estimate of the expected performance of the resulting boosted classifier.
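The objective in the abstract can be illustrated with a naive O(n^3) brute-force baseline — not the paper's efficient algorithm, just a sketch of the problem being solved. It relies on the standard observation that some minimum-weight-error line can be perturbed until it passes arbitrarily close to two of the input points, so enumerating lines through point pairs (nudged to either side, in both orientations) suffices. All function names below are illustrative.

```python
import itertools

def weighted_error(points, labels, weights, a, b, c):
    """Weighted misclassification of the separator a*x + b*y + c >= 0 -> +1."""
    err = 0.0
    for (x, y), lab, w in zip(points, labels, weights):
        pred = 1 if a * x + b * y + c >= 0 else -1
        if pred != lab:
            err += w
    return err

def best_separator_bruteforce(points, labels, weights):
    """O(n^3) baseline: try every line through a pair of points, nudged
    slightly to each side, in both orientations; return the minimum
    weight of the misclassified set Q."""
    eps = 1e-9
    # Trivial separators: classify everything as +1, or everything as -1.
    best = min(weighted_error(points, labels, weights, 0, 0, 1),
               weighted_error(points, labels, weights, 0, 0, -1))
    for i, j in itertools.combinations(range(len(points)), 2):
        (x1, y1), (x2, y2) = points[i], points[j]
        a, b = y1 - y2, x2 - x1            # normal to the line through p_i, p_j
        c = -(a * x1 + b * y1)
        for sa, sb, sc in ((a, b, c), (-a, -b, -c)):   # both orientations
            for shift in (eps, -eps):                  # both sides of the pair
                best = min(best, weighted_error(points, labels, weights,
                                                sa, sb, sc + shift))
    return best
```

With unit weights this is exactly the misclassification count the abstract mentions for W(x) = 1; for a linearly separable configuration it returns 0, and for the XOR configuration on the unit square it returns 1 (one point must be misclassified). The paper's contribution is achieving this optimum, plus the leave-one-out error, in low-order polynomial time rather than by this cubic enumeration.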
Pages: 317 - 328 (12 pages)
Related papers
(50 records total)
  • [1] Bagging, Boosting and the Random Subspace Method for Linear Classifiers
    Skurichina, Marina
    Duin, Robert P. W.
    PATTERN ANALYSIS AND APPLICATIONS, 2002, 5 (02) : 121 - 135
  • [2] Privacy-Preserving Boosting with Random Linear Classifiers
    Sharma, Sagar
    Chen, Keke
    PROCEEDINGS OF THE 2018 ACM SIGSAC CONFERENCE ON COMPUTER AND COMMUNICATIONS SECURITY (CCS'18), 2018, : 2294 - 2296
  • [4] Optimal linear ensemble of binary classifiers
    Ahsen, Mehmet Eren
    Vogel, Robert
    Stolovitzky, Gustavo
    BIOINFORMATICS ADVANCES, 2024, 4 (01):
  • [5] Boosting adaptive linear weak classifiers for online learning and tracking
    Parag, Toufiq
    Porikli, Fatih
    Elgammal, Ahmed
    2008 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, VOLS 1-12, 2008, : 1665 - +
  • [6] OPTIMAL TRAINING OF THRESHOLDED LINEAR CORRELATION CLASSIFIERS
    HILDEBRANDT, TH
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1991, 2 (06): : 577 - 588
  • [7] Object detection using ensemble of linear classifiers with fuzzy adaptive boosting
    Kim, Kisang
    Choi, Hyung-Il
    Oh, Kyoungsu
    EURASIP JOURNAL ON IMAGE AND VIDEO PROCESSING, 2017,
  • [9] Boosting classifiers regionally
    Maclin, R
    FIFTEENTH NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-98) AND TENTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE (IAAI-98) - PROCEEDINGS, 1998, : 700 - 705
  • [10] Boosting textual compression in optimal linear time
    Ferragina, P
    Giancarlo, R
    Manzini, G
    Sciortino, M
    JOURNAL OF THE ACM, 2005, 52 (04) : 688 - 713