Kullback-Leibler Divergence and Moment Matching for Hyperspherical Probability Distributions

Cited by: 0
Authors
Kurz, Gerhard [1 ]
Pfaff, Florian [1 ]
Hanebeck, Uwe D. [1 ]
Affiliations
[1] Karlsruhe Inst Technol, Inst Anthropomat & Robot, ISAS, Intelligent Sensor Actuator Syst Lab, D-76021 Karlsruhe, Germany
Keywords
von Mises-Fisher distribution; Watson distribution; parameter estimation
DOI: not available
Chinese Library Classification: TP301 [theory, methods]
Discipline code: 081202
Abstract
When approximating one probability density with another density, it is desirable to minimize the information loss of the approximation as quantified by, e.g., the Kullback-Leibler divergence (KLD). It has been known for some time that in the case of the Gaussian distribution, matching the first two moments of the original density yields the optimal approximation in terms of minimizing the KLD. In this paper, we will show that a similar property can be proven for certain hyperspherical probability distributions, namely the von Mises-Fisher and the Watson distribution. This result has profound implications for moment-based filtering on the unit hypersphere as it shows that moment-based approaches are optimal in the information-theoretic sense.
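The known Gaussian result stated above can be checked numerically: among all Gaussian approximations q of a density p, the one matching the mean and variance of p minimizes KL(p || q). A minimal sketch follows, using an illustrative two-component Gaussian mixture as p (the mixture weights, means, and standard deviations are arbitrary choices, not from the paper) and comparing the moment-matched Gaussian against perturbed alternatives.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p(x):
    """Illustrative target density: a two-component Gaussian mixture."""
    return 0.5 * normal_pdf(x, -1.0, 0.7) + 0.5 * normal_pdf(x, 1.5, 1.0)

# Numerical integration grid (wide enough that the mixture tails are negligible).
dx = 0.005
xs = [-10.0 + i * dx for i in range(4001)]

# First two moments of p, computed by quadrature.
mean = sum(x * p(x) for x in xs) * dx
var = sum((x - mean) ** 2 * p(x) for x in xs) * dx

def kld_to_gaussian(mu, sigma):
    """KL(p || N(mu, sigma^2)), approximated by quadrature."""
    total = 0.0
    for x in xs:
        px = p(x)
        if px > 1e-300:
            total += px * math.log(px / normal_pdf(x, mu, sigma)) * dx
    return total

kl_matched = kld_to_gaussian(mean, math.sqrt(var))
# Moving either parameter away from the matched moments increases the KLD.
kl_off_mean = kld_to_gaussian(mean + 0.3, math.sqrt(var))
kl_off_var = kld_to_gaussian(mean, 1.3 * math.sqrt(var))
print(kl_matched < kl_off_mean and kl_matched < kl_off_var)  # True
```

The paper's contribution is the analogous statement on the unit hypersphere: for the von Mises-Fisher and Watson families, the moment-matched member is likewise the KLD-optimal approximation.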
Pages: 2087-2094 (8 pages)