A stochastic variance-reduced coordinate descent algorithm for learning sparse Bayesian network from discrete high-dimensional data

Cited by: 1
Authors
Shajoonnezhad, Nazanin [1 ]
Nikanjam, Amin [2 ]
Affiliations
[1] KN Toosi Univ Technol, Tehran, Iran
[2] Polytech Montreal, Montreal, PQ, Canada
Keywords
Bayesian networks; Sparse structure learning; Stochastic gradient descent; Constrained optimization; DIRECTED ACYCLIC GRAPHS; PENALIZED ESTIMATION; REGULARIZATION; CONNECTIVITY;
DOI
10.1007/s13042-022-01674-9
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper addresses the problem of learning a sparse-structured Bayesian network from high-dimensional discrete data. Compared to continuous Bayesian networks, learning a discrete Bayesian network is challenging due to its large parameter space. While many approaches have been developed for learning continuous Bayesian networks, few have been proposed for discrete ones. In this paper, we formulate Bayesian network learning as an optimization problem and propose a score function that guarantees the learnt structure to be a sparse directed acyclic graph. In addition, we implement a block-wise stochastic coordinate descent algorithm to optimize the score function, employing a variance-reduction method so that the algorithm remains efficient on high-dimensional data. The proposed approach is applied to synthetic data from well-known benchmark networks, and the quality, scalability, and robustness of the constructed networks are measured. The results reveal that our algorithm outperforms several well-known competing methods.
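The abstract combines two ingredients: a sparsity-inducing score function and a block-wise stochastic coordinate descent optimizer with variance reduction. As an illustration only, the sketch below shows the general pattern on a stand-in objective: an SVRG-style variance-reduced stochastic gradient estimate for one randomly chosen coordinate, followed by a soft-thresholding (proximal) step for an L1 penalty. The lasso objective, all function names, and the parameter values here are assumptions for demonstration, not the paper's actual discrete-BN score or algorithm.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrink z toward 0 by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def svrg_coordinate_descent(X, y, lam=0.01, step=0.01, epochs=30, seed=0):
    """Illustrative SVRG-style stochastic coordinate descent for
       min_w (1/2n)||Xw - y||^2 + lam*||w||_1  (a stand-in objective)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Snapshot point: compute one full gradient per epoch (SVRG anchor).
        w_snap = w.copy()
        full_grad = X.T @ (X @ w_snap - y) / n
        for _ in range(n):
            i = rng.integers(n)   # random training sample
            j = rng.integers(d)   # random coordinate (block of size 1)
            # Variance-reduced estimate of the j-th partial derivative:
            # stochastic grad at w, minus stochastic grad at snapshot,
            # plus the full gradient at the snapshot.
            g = (X[i, j] * (X[i] @ w - y[i])
                 - X[i, j] * (X[i] @ w_snap - y[i])
                 + full_grad[j])
            # Gradient step on coordinate j, then the L1 proximal step.
            w[j] = soft_threshold(w[j] - step * g, step * lam)
    return w
```

The variance-reduced estimate `g` is unbiased and its variance shrinks as `w` approaches the snapshot, which is what lets such methods use a constant step size where plain SGD would need a decaying one.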
Pages: 947-958 (12 pages)
Related Papers (10 of 50 shown)
  • [1] Shajoonnezhad, Nazanin; Nikanjam, Amin. A stochastic variance-reduced coordinate descent algorithm for learning sparse Bayesian network from discrete high-dimensional data. International Journal of Machine Learning and Cybernetics, 2023, 14: 947-958
  • [2] Zhu, Rongda; Wang, Lingxiao; Zhai, Chengxiang; Gu, Quanquan. High-Dimensional Variance-Reduced Stochastic Gradient Expectation-Maximization Algorithm. International Conference on Machine Learning, Vol 70, 2017
  • [3] Jothimurugesan, Ellango; Tahmasbi, Ashraf; Gibbons, Phillip B.; Tirthapura, Srikanta. Variance-Reduced Stochastic Gradient Descent on Streaming Data. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018
  • [4] Huang, Shuai; Li, Jing; Ye, Jieping; Fleisher, Adam; Chen, Kewei; Wu, Teresa; Reiman, Eric. A Sparse Structure Learning Algorithm for Gaussian Bayesian Network Identification from High-Dimensional Data. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(6): 1328-1342
  • [5] Peng, Yaqiong; Hao, Zhiyu; Yun, Xiaochun. Lock-Free Parallelization for Variance-Reduced Stochastic Gradient Descent on Streaming Data. IEEE Transactions on Parallel and Distributed Systems, 2020, 31(9): 2220-2231
  • [6] Park, H.; Konishi, S. Robust Coordinate Descent Algorithm Robust Solution Path for High-dimensional Sparse Regression Modeling. Communications in Statistics - Simulation and Computation, 2016, 45(1): 115-129
  • [7] Wang, Yangyang; Gao, Xiaoguang; Ru, Xinxin. Local Bayesian network structure learning for high-dimensional data. Systems Engineering and Electronics, 2024, 46(8): 2676-2685
  • [8] Wang, Yangyang; Gao, Xiaoguang; Sun, Pengzhan; Ru, Xinxin; Wang, Jihan. Local Bayesian Network Structure Learning for High-Dimensional Data. 2024 9th International Conference on Control and Robotics Engineering (ICCRE 2024), 2024: 259-263
  • [9] Hepperger, Peter. Pricing high-dimensional Bermudan options using variance-reduced Monte Carlo methods. Journal of Computational Finance, 2013, 16(3): 99-126
  • [10] Hoyle, D. C.; Rattray, M. PCA learning for sparse high-dimensional data. Europhysics Letters, 2003, 62(1): 117-123