Learning Latent Variable Gaussian Graphical Model for Biomolecular Network with Low Sample Complexity

Times Cited: 0
Authors
Wang, Yanbo [1 ]
Liu, Quan [1 ]
Yuan, Bo [1 ]
Affiliation
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Covariance estimation; Selection; Lasso; Regularization; Expression; Sparsity
DOI
10.1155/2016/2078214
CLC Classification
Q [Biological Sciences];
Discipline Classification Code
07; 0710; 09;
Abstract
Learning a Gaussian graphical model with latent variables is ill-posed when the available samples are insufficient, and the estimation therefore has to be appropriately regularized. A common choice is the convex ℓ1 norm plus the nuclear norm to regularize the search. However, these additive convex regularizers do not always achieve the best estimator performance, especially when the sample complexity is low. In this paper, we consider a concave additive regularization that does not require the strong irrepresentable condition. We use concave regularization to correct the intrinsic estimation biases of both the Lasso and the nuclear-norm penalty. We establish the proximity operators for our concave regularizers, which induce sparsity and low-rankness, respectively. In addition, we extend our method to the decomposition of fused structured sparsity plus low rank, providing a powerful tool for models with temporal information. Specifically, we develop a nontrivially modified alternating direction method of multipliers (ADMM) with at least local convergence. Finally, we validate our method on both synthetic and real data. In an application to reconstructing two-stage cancer networks, the "Warburg effect" can be revealed directly.
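The estimator described above rests on a sparse-plus-low-rank decomposition in which each ADMM update applies a proximity operator: soft-thresholding for the sparsity-inducing penalty and singular-value thresholding for the low-rank penalty. Below is a minimal illustrative sketch of these two building blocks, not the authors' implementation, and it uses the standard convex proximity operators rather than the concave regularizers proposed in the paper; the function names and the toy usage are assumptions made for illustration.

import numpy as np

def prox_l1(X, tau):
    # Elementwise soft-thresholding: proximity operator of tau * ||X||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def prox_nuclear(X, tau):
    # Singular-value thresholding: proximity operator of tau * ||X||_* (nuclear norm).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    S = prox_l1(M, 0.5)             # many entries pushed exactly to zero (sparse part)
    L = prox_nuclear(M, 1.0)        # small singular values removed (low-rank part)
    print(np.count_nonzero(S), np.linalg.matrix_rank(L))

In a full ADMM scheme, these two proximal steps would alternate with a log-determinant update for the Gaussian likelihood and a dual-variable update; the concave penalties in the paper replace the soft-thresholding and singular-value-thresholding rules with their own proximity operators.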
Pages: 13