Boosted Bayesian network classifiers

Cited by: 0
Authors
Yushi Jing
Vladimir Pavlović
James M. Rehg
Affiliations
[1] Georgia Institute of Technology, College of Computing
[2] Rutgers University, Department of Computer Science
Source
Machine Learning | 2008 / Vol. 73
Keywords
Bayesian network classifiers; AdaBoost; Ensemble models; Structure learning;
DOI
Not available
Abstract
The use of Bayesian networks for classification problems has received a significant amount of recent attention. Although computationally efficient, the standard maximum likelihood learning method tends to be suboptimal due to the mismatch between its optimization criterion (data likelihood) and the actual goal of classification (label prediction accuracy). Recent approaches to optimizing classification performance during parameter or structure learning show promise, but lack the favorable computational properties of maximum likelihood learning. In this paper we present boosted Bayesian network classifiers, a framework that combines discriminative data-weighting with generative training of intermediate models. We show that boosted Bayesian network classifiers encompass the basic generative models in isolation, but improve their classification performance when the model structure is suboptimal. We also demonstrate that structure learning is beneficial in the construction of boosted Bayesian network classifiers. On a large suite of benchmark datasets, this approach outperforms generative graphical models such as naive Bayes and TAN in classification accuracy. Boosted Bayesian network classifiers perform comparably to or better than other discriminatively trained graphical models, including ELR and BNC. Furthermore, boosted Bayesian networks require significantly less training time than the ELR and BNC algorithms.
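The framework described in the abstract, in its simplest instance, alternates generative (maximum likelihood) training of a Bayesian network classifier with AdaBoost-style reweighting of the training data. The sketch below illustrates that idea with Gaussian naive Bayes weak learners and discrete AdaBoost; the dataset, the use of scikit-learn, and all parameter values are illustrative assumptions, not the paper's actual experimental setup (which uses structure learning and TAN models as well).

```python
# Illustrative sketch (not the paper's implementation): discrete AdaBoost
# over naive Bayes weak learners, each fit by weighted maximum likelihood.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

def boosted_naive_bayes(X, y, n_rounds=10):
    """Each round: generative training under current example weights,
    then discriminative reweighting of misclassified examples."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform example weights
    models, alphas = [], []
    for _ in range(n_rounds):
        nb = GaussianNB().fit(X, y, sample_weight=w)
        pred = nb.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err == 0:                      # perfect weak learner: keep it and stop
            models.append(nb); alphas.append(1.0)
            break
        if err >= 0.5:                    # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1 - err) / err)
        models.append(nb); alphas.append(alpha)
        # emphasize misclassified examples, as in standard AdaBoost
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
    return models, alphas

def predict(models, alphas, X):
    # weighted vote of the ensemble; labels assumed to be {0, 1}
    score = sum(a * (2 * m.predict(X) - 1) for m, a in zip(models, alphas))
    return (score > 0).astype(int)

# toy binary classification problem (illustrative only)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
models, alphas = boosted_naive_bayes(Xtr, ytr)
acc = np.mean(predict(models, alphas, Xte) == yte)
```

Each weak learner is trained purely generatively (weighted maximum likelihood), which keeps the per-round cost low; the discriminative signal enters only through the example weights, matching the computational advantage over ELR and BNC claimed in the abstract.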
Pages: 155-184
Page count: 29
Related papers
50 records in total
  • [1] Boosted Bayesian network classifiers
    Jing, Yushi
    Pavlovic, Vladimir
    Rehg, James M.
    [J]. MACHINE LEARNING, 2008, 73 (02): 155-184
  • [2] Bayesian network classifiers
    Friedman, N.
    Geiger, D.
    Goldszmidt, M.
    [J]. MACHINE LEARNING, 1997, 29 (2-3): 131-163
  • [3] Boosted Network Classifiers for Local Feature Selection
    Hancock, Timothy
    Mamitsuka, Hiroshi
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (11): 1767-1778
  • [4] Comparing Bayesian network classifiers
    Cheng, J.
    Greiner, R.
    [J]. UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 1999: 101-108
  • [5] Adaptive Bayesian network classifiers
    Castillo, Gladys
    Gama, Joao
    [J]. INTELLIGENT DATA ANALYSIS, 2009, 13 (01): 39-59
  • [6] Approximate Bayesian network classifiers
    Slezak, D.
    Wróblewski, J.
    [J]. ROUGH SETS AND CURRENT TRENDS IN COMPUTING, PROCEEDINGS, 2002, 2475: 365-372
  • [7] Learning Bayesian classifiers from dependency network classifiers
    Gamez, Jose A.
    Mateo, Juan L.
    Puerta, Jose M.
    [J]. ADAPTIVE AND NATURAL COMPUTING ALGORITHMS, PT 1, 2007, 4431: 806-+
  • [8] Discrete Bayesian Network Classifiers: A Survey
    Bielza, Concha
    Larranaga, Pedro
    [J]. ACM COMPUTING SURVEYS, 2014, 47 (01)
  • [9] Asynchronous dynamic Bayesian network classifiers
    Wang, Shuang-Cheng
    Zhang, Li
    Zheng, Fei
    [J]. Jisuanji Xuebao/Chinese Journal of Computers, 2020, 43 (09): 1737-1754