Learning structured bayesian networks: Combining abstraction hierarchies and tree-structured conditional probability tables

Cited by: 9
Authors
desJardins, Marie [1]
Rathod, Priyang [1 ]
Getoor, Lise [2 ]
Affiliations
[1] Univ Maryland Baltimore Cty, Dept Comp Sci & Elect Engn, Baltimore, MD 21228 USA
[2] Univ Maryland, Dept Comp Sci, College Pk, MD 20742 USA
Keywords
machine learning; Bayesian networks; abstraction hierarchies; background knowledge; clustering; MDL
DOI
10.1111/j.1467-8640.2007.00320.x
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Context-specific independence representations, such as tree-structured conditional probability distributions, capture local independence relationships among the random variables in a Bayesian network (BN). Local independence relationships among the random variables can also be captured by using attribute-value hierarchies to find an appropriate abstraction level for the values used to describe the conditional probability distributions. Capturing this local structure is important because it reduces the number of parameters required to represent the distribution. This can lead to more robust parameter estimation and structure selection, more efficient inference algorithms, and more interpretable models. In this paper, we introduce Tree-Abstraction-Based Search (TABS), an approach for learning a data distribution by inducing the graph structure and parameters of a BN from training data. TABS combines tree structure and attribute-value hierarchies to compactly represent conditional probability tables. To construct the attribute-value hierarchies, we investigate two data-driven techniques: a global clustering method, which uses all of the training data to build the attribute-value hierarchies and can be performed as a preprocessing step; and a local clustering method, which uses only the local network structure to learn attribute-value hierarchies. We present empirical results for three real-world domains, finding that (1) combining tree structure and attribute-value hierarchies improves generalization accuracy while significantly reducing the number of parameters in the learned networks, and (2) data-derived hierarchies perform as well as or better than expert-provided hierarchies.
Pages: 1-22
Number of pages: 22
Related papers
50 records in total
  • [1] Learning Tree-structured Approximations for Conditional Random Fields
    Skurikhin, Alexei N.
    [J]. 2014 IEEE APPLIED IMAGERY PATTERN RECOGNITION WORKSHOP (AIPR), 2014,
  • [2] Tree-Structured Bayesian Networks for Wrapped Cauchy Directional Distributions
    Leguey, Ignacio
    Bielza, Concha
    Larrañaga, Pedro
    [J]. ADVANCES IN ARTIFICIAL INTELLIGENCE, CAEPIA 2016, 2016, 9868 : 207 - 216
  • [3] Bayesian Optimization with Tree-structured Dependencies
    Jenatton, Rodolphe
    Archambeau, Cedric
    Gonzalez, Javier
    Seeger, Matthias
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [4] NML Computation Algorithms for Tree-Structured Multinomial Bayesian Networks
    Kontkanen, Petri
    Wettig, Hannes
    Myllymäki, Petri
    [J]. EURASIP JOURNAL ON BIOINFORMATICS AND SYSTEMS BIOLOGY, 2007, (01)
  • [5] Tree-structured Bayesian network learning with application to scene classification
    Wang, Z. F.
    Wang, Z. H.
    Xie, W. J.
    [J]. ELECTRONICS LETTERS, 2011, 47 (09) : 540 - 541
  • [6] Additive Tree-Structured Covariance Function for Conditional Parameter Spaces in Bayesian Optimization
    Ma, Xingchen
    Blaschko, Matthew B.
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 1015 - 1024
  • [7] Tree-Structured Binary Neural Networks
    Serbetci, Ayse
    Akgul, Yusuf Sinan
    [J]. 29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [8] Locating faults in tree-structured networks
    Leckie, C
    Dale, M
    [J]. IJCAI-97 - PROCEEDINGS OF THE FIFTEENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOLS 1 AND 2, 1997, : 434 - 439
  • [9] Learning Bayesian Networks with Low Rank Conditional Probability Tables
    Barik, Adarsh
    Honorio, Jean
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [10] Improving Uncertainty Quantification of Variance Networks by Tree-Structured Learning
    Ma, Wenxuan
    Yan, Xing
    Zhang, Kun
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, : 1 - 15