A Kullback-Leibler View of Maximum Entropy and Maximum Log-Probability Methods

Cited by: 21
Authors
Abbas, Ali E. [1 ]
Cadenbach, Andrea H. [2 ]
Salimi, Ehsan [3 ]
Affiliations
[1] Univ Southern Calif, Ind & Syst Engn & Publ Policy, Los Angeles, CA 90089 USA
[2] Univ Missouri, Supply Chain & Analyt, St Louis, MO 63121 USA
[3] Univ Southern Calif, Ind & Syst Engn, Los Angeles, CA 90007 USA
Source
ENTROPY | 2017, Vol. 19, No. 5
Funding
U.S. National Science Foundation
Keywords
entropy; minimum cross entropy; joint probability distribution
DOI
10.3390/e19050232
CLC Number
O4 [Physics]
Discipline Code
0702
Abstract
Entropy methods provide a convenient and general approach to assigning a probability distribution given partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback-Leibler divergence from a target distribution subject to the given constraints. This general principle encompasses a wide variety of distributions and generalizes other methods that have been proposed independently. There remains, however, some confusion in the literature about the breadth of entropy methods. In particular, the asymmetry of the Kullback-Leibler divergence yields two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.
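The two special cases named in the abstract can be seen directly from the algebra of the KL divergence against a uniform target: minimizing D(p‖u) over p is equivalent to maximizing the Shannon entropy H(p), while minimizing D(u‖p) is equivalent to maximizing the sum of log-probabilities. The following minimal sketch (not from the paper; variable names are illustrative) verifies the first identity numerically and shows the asymmetry D(p‖u) ≠ D(u‖p):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats; 0*log(0) treated as 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

n = 4
u = np.full(n, 1.0 / n)                 # uniform target distribution
p = np.array([0.5, 0.25, 0.15, 0.10])   # an example distribution

# Identity: D(p || u) = log(n) - H(p), so minimizing D(p || u)
# over p is the same as maximizing the entropy H(p).
H = -np.sum(p * np.log(p))
print(np.isclose(kl(p, u), np.log(n) - H))  # True

# Asymmetry: reversing the arguments gives a different objective,
# whose minimization maximizes sum(log p_i) instead (max log-probability).
print(kl(p, u), kl(u, p))
```

Because the two directions of the divergence are different objectives, they generally select different distributions under the same constraints, which is the comparison the paper carries out.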
Pages: 14