Convergence Rates of Smooth Message Passing with Rounding in Entropy-Regularized MAP Inference

Cited by: 0
Authors
Lee, Jonathan N. [1 ]
Pacchiano, Aldo [2 ]
Jordan, Michael I. [2 ]
Affiliations
[1] Stanford Univ, Stanford, CA 94305 USA
[2] Univ Calif Berkeley, Berkeley, CA USA
Keywords
PROBABILISTIC INFERENCE; MINIMIZATION; RELAXATIONS
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Maximum a posteriori (MAP) inference is a fundamental computational paradigm for statistical inference. In the setting of graphical models, MAP inference entails solving a combinatorial optimization problem to find the most likely configuration of the discrete-valued model. Linear programming (LP) relaxations in the Sherali-Adams hierarchy are widely used to attempt to solve this problem, and smooth message passing algorithms have been proposed to solve regularized versions of these LPs with great success. This paper leverages recent work in entropy-regularized LPs to analyze convergence rates of a class of edge-based smooth message passing algorithms to ε-optimality in the relaxation. With an appropriately chosen regularization constant, we present a theoretical guarantee on the number of iterations sufficient to recover the true integral MAP solution when the LP is tight and the solution is unique.
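For context, a minimal sketch of the relaxation the abstract refers to, in one common form (the symbols θ, μ, η, H and the exact placement of the regularizer are illustrative assumptions, not taken from this record): for a pairwise model with potential vector θ, the first-level Sherali-Adams (local polytope) LP relaxation and its entropy-regularized counterpart can be written as

    % local polytope LP relaxation of MAP inference
    \max_{\mu \in \mathcal{L}(G)} \; \langle \theta, \mu \rangle
    % entropy-regularized version targeted by smooth message passing
    \max_{\mu \in \mathcal{L}(G)} \; \langle \theta, \mu \rangle + \tfrac{1}{\eta} H(\mu)
    % \mathcal{L}(G): pseudo-marginals that are nonnegative, normalized,
    % and edge-node consistent (symmetrically in x_i and x_j)
    \mathcal{L}(G) = \Bigl\{ \mu \ge 0 \;:\; \textstyle\sum_{x_i} \mu_i(x_i) = 1, \;\; \sum_{x_j} \mu_{ij}(x_i, x_j) = \mu_i(x_i) \Bigr\}

Here H is an entropy term over the pseudo-marginals μ and η > 0 trades smoothness against fidelity to the LP: larger η brings the regularized optimum closer to the LP optimum but makes the smooth problem harder to optimize. In the regime the abstract describes, where the LP is tight and its optimum is a unique integral vertex, rounding an ε-optimal solution of the regularized problem, for suitably chosen η and ε, recovers the exact MAP configuration; the paper's guarantee bounds the number of message passing iterations sufficient to reach such an ε.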
Pages: 3003-3013
Page count: 11
Related papers (9 total)
  • [1] Accelerated Message Passing for Entropy-Regularized MAP Inference
    Lee, Jonathan N.
    Pacchiano, Aldo
    Bartlett, Peter
    Jordan, Michael I.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [2] CONVERGENCE OF ENTROPY-REGULARIZED NATURAL POLICY GRADIENT WITH LINEAR FUNCTION APPROXIMATION
    Cayci, Semih
    He, Niao
    Srikant, R.
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (03) : 2729 - 2755
  • [3] Matryoshka Policy Gradient for Entropy-Regularized RL: Convergence and Global Optimality
    Ged, Francois G.
    Veiga, Maria Han
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [4] Convergence rate of entropy-regularized multi-marginal optimal transport costs
    Nenna, Luca
    Pegon, Paul
    CANADIAN JOURNAL OF MATHEMATICS-JOURNAL CANADIEN DE MATHEMATIQUES, 2024
  • [5] Smooth and Strong: MAP Inference with Linear Convergence
    Meshi, Ofer
    Mahdavi, Mehrdad
    Schwing, Alexander G.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [6] Variational Message Passing Neural Network for Maximum-A-Posteriori (MAP) Inference
    Cui, Zijun
    Wang, Hanjing
    Gao, Tian
    Talamadupula, Kartik
    Ji, Qiang
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 180, 2022, 180 : 464 - 474
  • [7] Distributed Inference With Variational Message Passing in Gaussian Graphical Models: Tradeoffs in Message Schedules and Convergence Conditions
    Li, Bin
    Wu, Nan
    Wu, Yik-Chung
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 2021 - 2035
  • [8] Polynomial-Time Constrained Message Passing for Exact MAP Inference on Discrete Models with Global Dependencies
    Bauer, Alexander
    Nakajima, Shinichi
    Mueller, Klaus-Robert
    MATHEMATICS, 2023, 11 (12)
  • [9] Convergence rates comparison of sum-product decoding of RA codes under different message-passing schedules
    Tong, S
    Bai, BM
    Wang, XM
    IEEE COMMUNICATIONS LETTERS, 2005, 9 (06) : 543 - 545