Robust Bayesian linear classifier ensembles

Cited by: 0
Authors: Cerquides, J [1]; de Màntaras, RL
Affiliations:
[1] Univ Barcelona, Dept Matemat Aplicada & Anal, Barcelona, Spain
[2] CSIC, Spanish Council Sci Res, Artificial Intelligence Res Inst, IIIA, Madrid, Spain
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Ensemble classifiers combine the classification results of several classifiers. Simple ensemble methods, such as uniform averaging over a set of models, usually improve on selecting the single best model. Probabilistic classifiers usually restrict the set of models that can be learnt in order to lower computational costs. In these restricted spaces, where incorrect modeling assumptions may be made, uniform averaging sometimes performs even better than Bayesian model averaging. Linear mixtures over sets of models provide a space that includes uniform averaging as a particular case. We develop two algorithms for learning maximum a posteriori weights for linear mixtures, based on expectation maximization and on constrained optimization. We provide a nontrivial example of the utility of these two algorithms by applying them to one-dependence estimators. We develop the conjugate distribution for one-dependence estimators and empirically show that uniform averaging is clearly superior to Bayesian model averaging for this family of models. We then empirically show that the maximum a posteriori linear mixture weights improve accuracy significantly over uniform aggregation.
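The expectation-maximization approach mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes each base classifier's predicted probability for the true label of each training example has been precomputed into a matrix, and it uses a symmetric Dirichlet prior (a hypothetical choice here) to obtain a MAP rather than a maximum-likelihood estimate of the mixture weights.

```python
import numpy as np

def map_mixture_weights(probs, alpha0=1.0, n_iter=200, tol=1e-9):
    """EM sketch for MAP weights of a linear mixture of classifiers.

    probs: (N, K) array; probs[i, k] = p_k(y_i | x_i), the probability
      the k-th base classifier assigns to the true label of example i.
    alpha0: symmetric Dirichlet prior parameter (assumed >= 1 here;
      alpha0 = 1 reduces to the maximum-likelihood weights).
    """
    n, k = probs.shape
    w = np.full(k, 1.0 / k)  # start from uniform averaging
    for _ in range(n_iter):
        # E-step: responsibility of model k for example i
        weighted = probs * w                              # (N, K)
        resp = weighted / weighted.sum(axis=1, keepdims=True)
        # M-step: closed-form MAP update under the Dirichlet prior
        new_w = resp.sum(axis=0) + (alpha0 - 1.0)
        new_w /= new_w.sum()
        if np.max(np.abs(new_w - w)) < tol:
            return new_w
        w = new_w
    return w
```

Starting from the uniform mixture makes the connection to uniform averaging explicit: EM then shifts weight toward the base models that assign higher probability to the observed labels.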
Pages: 72 - 83
Page count: 12