Ising Models with Latent Conditional Gaussian Variables

Cited: 0
Authors:
Nussbaum, Frank [1 ]
Giesen, Joachim [1 ]
Affiliations:
[1] Friedrich Schiller Univ Jena, Inst Informat, Jena, Germany
Source:
Keywords:
Ising Models; Latent Variables; Sparse and Low-Rank Matrices; Maximum-Entropy Principle; High-Dimensional Consistency; SELECTION;
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract:
Ising models describe the joint probability distribution of a vector of binary feature variables. Typically, not all variables interact with each other, and one is interested in learning the presumably sparse network structure of the interacting variables. In the presence of latent variables, however, the conventional approach of learning a sparse model can fail, because the latent variables induce indirect interactions among the observed variables. When there are only a few latent conditional Gaussian variables, these spurious interactions contribute an additional low-rank component to the interaction parameters of the observed Ising model. We therefore propose to learn a sparse + low-rank decomposition of the parameters of an Ising model by solving a convex regularized likelihood problem. We show that the same problem arises as the dual of a maximum-entropy problem with a new type of relaxation, in which the sample means collectively need to match the expected values only up to a given tolerance. The solution of the convex optimization problem has consistency properties in the high-dimensional setting, where the number of observed binary variables and the number of latent conditional Gaussian variables are allowed to grow with the number of training samples.
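The sparse + low-rank decomposition described in the abstract is typically encouraged by combining an l1 penalty (promoting sparsity) with a nuclear-norm penalty (promoting low rank). The following NumPy sketch illustrates this generic idea on a synthetic matrix-denoising toy problem, not the paper's actual Ising likelihood; the objective, penalty weights, and alternating proximal-gradient scheme are illustrative assumptions, chosen only to show the roles of the two proximal operators:

```python
import numpy as np

def soft_threshold(X, t):
    # Elementwise soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def svd_threshold(X, t):
    # Singular-value soft-thresholding: proximal operator of t * ||.||_*.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def sparse_plus_lowrank(M, lam=0.1, mu=0.5, step=0.5, iters=500):
    # Toy objective (an illustrative stand-in for the regularized likelihood):
    #   minimize 0.5 * ||S + L - M||_F^2 + lam * ||S||_1 + mu * ||L||_*
    # solved by alternating proximal gradient steps on S and L.
    S = np.zeros_like(M)
    L = np.zeros_like(M)
    for _ in range(iters):
        S = soft_threshold(S - step * (S + L - M), step * lam)
        L = svd_threshold(L - step * (S + L - M), step * mu)
    return S, L

# Synthetic ground truth: a sparse matrix plus a rank-one matrix.
rng = np.random.default_rng(0)
S0 = np.zeros((20, 20))
S0[rng.integers(0, 20, 15), rng.integers(0, 20, 15)] = 2.0
u = rng.standard_normal((20, 1))
M = S0 + (u @ u.T) / 5.0

S, L = sparse_plus_lowrank(M)
print("residual:", np.linalg.norm(S + L - M))
```

In the paper's setting the smooth term would be the negative log-likelihood of the observed Ising model rather than a squared Frobenius fit, but the split into an l1-penalized sparse part and a nuclear-norm-penalized low-rank part is the same.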
Pages: 13