A Kullback-Leibler View of Maximum Entropy and Maximum Log-Probability Methods

Citations: 21
Authors
Abbas, Ali E. [1 ]
Cadenbach, Andrea H. [2 ]
Salimi, Ehsan [3 ]
Affiliations
[1] Univ Southern Calif, Ind & Syst Engn & Publ Policy, Los Angeles, CA 90089 USA
[2] Univ Missouri, Supply Chain & Analyt, St Louis, MO 63121 USA
[3] Univ Southern Calif, Ind & Syst Engn, Los Angeles, CA 90007 USA
Source
ENTROPY | 2017, Vol. 19, Issue 05
Funding
U.S. National Science Foundation
Keywords
entropy; minimum cross entropy; joint probability distribution; DISTRIBUTIONS; INFORMATION
DOI
10.3390/e19050232
Chinese Library Classification: O4 [Physics]
Discipline code: 0702
Abstract
Entropy methods provide a convenient general approach to constructing a probability distribution from partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback-Leibler divergence subject to the given constraints. This general principle encompasses a wide variety of distributions, and generalizes other methods that have been proposed independently. There remains, however, some confusion about the breadth of entropy methods in the literature. In particular, the asymmetry of the Kullback-Leibler divergence provides two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.
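The two special cases mentioned in the abstract follow from which argument of the KL divergence is set to the uniform distribution u on n outcomes: D(p || u) = log n − H(p), so minimizing it maximizes the entropy H(p), while D(u || p) = −log n − (1/n) Σ log pᵢ, so minimizing it maximizes the sum of log-probabilities. A minimal numerical sketch of these two identities (the distribution `p` and helper `kl` are illustrative, not from the paper):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

n = 4
u = np.full(n, 1.0 / n)             # uniform target distribution
p = np.array([0.1, 0.2, 0.3, 0.4])  # an arbitrary example distribution

# D(p || u) = log n - H(p): minimizing this direction maximizes entropy.
entropy = -float(np.sum(p * np.log(p)))
assert np.isclose(kl(p, u), np.log(n) - entropy)

# D(u || p) = -log n - (1/n) * sum(log p): minimizing this direction
# maximizes the sum of log-probabilities.
assert np.isclose(kl(u, p), -np.log(n) - np.mean(np.log(p)))
```

Because the divergence is asymmetric, the two minimizations generally select different distributions under the same constraints, which is the comparison the paper carries out.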
Pages: 14