Stochastic optimization for Bayesian network classifiers

Cited by: 0
Authors
Yi Ren
LiMin Wang
XiongFei Li
Meng Pang
JunYang Wei
Affiliations
[1] Jilin University,College of Software
[2] Jilin University,Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education
[3] Jilin University,College of Computer Science and Technology
Source
Applied Intelligence | 2022, Vol. 52
Keywords
Bayesian network classifiers; Ensemble learning; Stochastic optimization; Random sampling;
DOI: not available
Abstract
Reducing the complexity of the network topology and making the learned joint probability distribution fit the data are two important but conflicting goals in learning a Bayesian network classifier (BNC). By transforming a single high-order topology into a set of low-order ones, ensemble learning algorithms can cover more of the hypotheses implicit in the training data and help achieve a tradeoff between bias and variance. Resampling the training data can diversify the ensemble's member classifiers, but the information lost in resampling may bias the estimate of the conditional probability distribution and thus introduce insignificant rather than significant dependency relationships into the network topology of the BNC. In this paper, we propose to learn from the training data as a whole and to apply a heuristic search strategy that flexibly identifies the significant conditional dependencies, from which the attribute order is determined implicitly. Random sampling is introduced to make each member of the ensemble “unstable” while still fully representing the conditional dependencies. Experimental evaluation on 40 UCI datasets shows that the proposed algorithm, called random Bayesian forest (RBF), achieves remarkable classification performance compared to extended versions of state-of-the-art out-of-core BNCs (e.g., SKDB, WATAN, WAODE, SA2DE, SASA2DE and IWAODE).
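As a rough illustration of the ensemble idea described in the abstract, the sketch below builds an ensemble of one-dependence Bayesian classifiers in which every member is trained on the full training set and diversity comes from randomly sampling a parent attribute for each attribute rather than from resampling the data. It is a minimal sketch under assumed conditions (discrete, non-negative integer-coded attributes; uniform random parent selection; Laplace smoothing); the class name, parameters, and parent-sampling scheme are illustrative assumptions, not the RBF algorithm from the paper.

```python
import numpy as np


class RandomDependenceEnsemble:
    """Illustrative ensemble of one-dependence Bayesian classifiers.

    Assumptions (not from the paper): attributes are discrete and encoded
    as non-negative integers; each member randomly samples one parent
    attribute (or none) per attribute; all members use the whole training
    set, so diversity comes from structure, not from data resampling.
    """

    def __init__(self, n_members=10, alpha=1.0, seed=0):
        self.n_members = n_members
        self.alpha = alpha  # Laplace smoothing
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_, y_idx = np.unique(y, return_inverse=True)
        n, d = X.shape
        k = len(self.classes_)
        self.cards_ = [int(X[:, j].max()) + 1 for j in range(d)]

        # Class prior shared by all members.
        self.log_prior_ = np.log(
            (np.bincount(y_idx, minlength=k) + self.alpha) / (n + k * self.alpha)
        )

        self.members_ = []
        for _ in range(self.n_members):
            # Randomly sample a parent for each attribute (-1 means no parent),
            # disallowing an attribute to be its own parent.
            parents = [int(self.rng.integers(-1, d)) for _ in range(d)]
            parents = [-1 if p == j else p for j, p in enumerate(parents)]
            tables = []
            for j in range(d):
                p = parents[j]
                if p == -1:
                    # Estimate P(x_j | y) from counts indexed by (class, value).
                    t = np.full((k, self.cards_[j]), self.alpha)
                    np.add.at(t, (y_idx, X[:, j]), 1.0)
                else:
                    # Estimate P(x_j | x_p, y), indexed by (class, parent value, value).
                    t = np.full((k, self.cards_[p], self.cards_[j]), self.alpha)
                    np.add.at(t, (y_idx, X[:, p], X[:, j]), 1.0)
                t /= t.sum(axis=-1, keepdims=True)
                tables.append(np.log(t))
            self.members_.append((parents, tables))
        return self

    def predict(self, X):
        X = np.asarray(X)
        n = X.shape[0]
        scores = np.zeros((n, len(self.classes_)))
        for parents, tables in self.members_:
            member_scores = np.tile(self.log_prior_, (n, 1))
            for j, p in enumerate(parents):
                if p == -1:
                    member_scores += tables[j][:, X[:, j]].T
                else:
                    member_scores += tables[j][:, X[:, p], X[:, j]].T
            scores += member_scores  # sum members' log-posteriors
        return self.classes_[np.argmax(scores, axis=1)]
```

Averaging log-posteriors across members is one simple way to combine the ensemble; the paper's actual structure search and member weighting may differ.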
Pages: 15496-15516 (20 pages)