The use of boosting to improve LVQ ATR classifiers

Cited by: 0
Authors
Lim, S [1 ]
Nasrabadi, NM [1 ]
Mersereau, RM [1 ]
Institution
[1] USA, Res Lab, Adelphi, MD 20783 USA
Keywords
boosting; automatic target recognition; FLIR imagery; algorithm fusion; composite classifiers;
DOI
10.1117/12.461671
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Numbers
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Boosting has emerged as a popular combination technique for refining weak classifiers. Since it was pioneered by Freund(1) and Schapire,(2) numerous variations of the AdaBoost algorithm have emerged, such as Breiman's arc-fs algorithm.(3) The central theme of these methods is the generation of an ensemble of classifiers from a weak learning algorithm using modified versions of the original training set, with emphasis placed on the more difficult instances. The validation stage then aggregates the results from each element of the ensemble using some predetermined rule. In this paper, the wavelet-decomposition-based codebook classifier proposed by Chan et al.(4) is used as the learning algorithm. Starting with the whole training set, the training set is modified at each iteration by resampling the original training data with replacement. The weights used in the resampling are determined using different algorithms, including AdaBoost and arc-fs. The accuracies of the resulting ensembles are then evaluated using various combination techniques, such as simple voting and weighted sums.
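The boosting loop the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: decision stumps on 1-D data stand in for the wavelet-decomposition codebook weak learner, and the function names (`fit_stump`, `boost`, `predict`) are placeholders. Each round resamples the training set with replacement according to the current instance weights, then reweights to emphasize the misclassified instances, as in AdaBoost.

```python
# AdaBoost-style boosting via weighted resampling with replacement.
# Decision stumps stand in for the weak learner; labels are +1 / -1.
import math
import random

def fit_stump(X, y):
    """Return the best threshold stump h(x) = sign if x >= t else -sign."""
    best = None  # (error_count, threshold, sign)
    for t in sorted(set(X)):
        for sign in (1, -1):
            err = sum((sign if x >= t else -sign) != yi
                      for x, yi in zip(X, y))
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x, t=t, sign=sign: sign if x >= t else -sign

def boost(X, y, rounds=5, seed=0):
    """Build an ensemble of (alpha, stump) pairs."""
    rng = random.Random(seed)
    n = len(X)
    w = [1.0 / n] * n                               # uniform start
    ensemble = []
    for _ in range(rounds):
        # Resample the original training set with replacement,
        # weighted toward the currently difficult instances.
        idx = rng.choices(range(n), weights=w, k=n)
        h = fit_stump([X[i] for i in idx], [y[i] for i in idx])
        # Weighted error measured on the *original* training set.
        err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
        err = min(max(err, 1e-10), 1.0 - 1e-10)     # avoid log(0)
        alpha = 0.5 * math.log((1.0 - err) / err)   # classifier weight
        ensemble.append((alpha, h))
        # Reweight: increase weight on misclassified instances.
        w = [wi * math.exp(-alpha * yi * h(xi))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted-sum combination rule; simple voting would drop alpha."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```

The weighted-sum rule in `predict` corresponds to one of the combination techniques the abstract mentions; dropping the `alpha` factor would give the simple-voting variant.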
Pages: 99-107
Page count: 5
Related Papers
50 records total
  • [31] Efficient optimal linear boosting of a pair of classifiers
    Boyarshinov, Victor
    Magdon-Ismail, Malik
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18 (02): : 317 - 328
  • [32] Boosting RVM classifiers for large data sets
    Silva, Catarina
    Ribeir, Bernardete
    Sung, Andrew H.
    ADAPTIVE AND NATURAL COMPUTING ALGORITHMS, PT 2, 2007, 4432 : 228 - +
  • [33] From hyperplanes to large margin classifiers: Applications to SAR ATR
    Zhao, Q
    Principe, JC
    Xu, DX
    AUTOMATIC TARGET RECOGNITION IX, 1999, 3718 : 101 - 109
  • [34] Performance measures for ATR systems with multiple classifiers and multiple labels
    Schubert, Christine M.
    Thorsen, Steven
    Oxley, Mark E.
    Bauer, Kenneth W., Jr.
    SIGNAL PROCESSING, SENSOR FUSION, AND TARGET RECOGNITION XV, 2006, 6235
  • [35] Adapting a Pedestrian Detector by Boosting LDA Exemplar Classifiers
    Xu, Jiaolong
    Vazquez, David
    Ramos, Sebastian
    Lopez, Antonio M.
    Ponsa, Daniel
    2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW), 2013, : 688 - 693
  • [36] Bagging, Boosting and the Random Subspace Method for Linear Classifiers
    Marina Skurichina
    Robert P. W. Duin
    Pattern Analysis & Applications, 2002, 5 : 121 - 135
  • [37] Privacy-Preserving Boosting with Random Linear Classifiers
    Sharma, Sagar
    Chen, Keke
    PROCEEDINGS OF THE 2018 ACM SIGSAC CONFERENCE ON COMPUTER AND COMMUNICATIONS SECURITY (CCS'18), 2018, : 2294 - 2296
  • [38] Object Detection by Simple Fuzzy Classifiers Generated by Boosting
    Gabryel, Marcin
    Korytkowski, Marcin
    Scherer, Rafal
    Rutkowski, Leszek
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT I, 2013, 7894 : 540 - 547
  • [39] Boosting Classifiers Built from Different Subsets of Features
    Janodet, Jean-Christophe
    Sebban, Marc
    Suchier, Henri-Maxime
    FUNDAMENTA INFORMATICAE, 2009, 96 (1-2) : 89 - 109
  • [40] Boosting Shape Classifiers Accuracy by Considering the Inverse Shape
    Pierard, Sebastien
    Lejeune, Antoine
    Van Droogenbroeck, Marc
    JOURNAL OF PATTERN RECOGNITION RESEARCH, 2016, 11 (01): : 41 - 54