Kullback-Leibler Divergence and Moment Matching for Hyperspherical Probability Distributions

Cited by: 0
Authors
Kurz, Gerhard [1 ]
Pfaff, Florian [1 ]
Hanebeck, Uwe D. [1 ]
Affiliations
[1] Karlsruhe Inst Technol, Inst Anthropomat & Robot, ISAS, Intelligent Sensor Actuator Syst Lab, D-76021 Karlsruhe, Germany
Keywords
von Mises-Fisher distribution; Watson distribution; parameter estimation
DOI
Not available
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Classification Code
081202
Abstract
When approximating one probability density with another density, it is desirable to minimize the information loss of the approximation as quantified by, e.g., the Kullback-Leibler divergence (KLD). It has been known for some time that in the case of the Gaussian distribution, matching the first two moments of the original density yields the optimal approximation in terms of minimizing the KLD. In this paper, we will show that a similar property can be proven for certain hyperspherical probability distributions, namely the von Mises-Fisher and the Watson distribution. This result has profound implications for moment-based filtering on the unit hypersphere as it shows that moment-based approaches are optimal in the information-theoretic sense.
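The sketch below is not from the paper; it is a minimal Python illustration of the moment-matching idea for the von Mises-Fisher (VMF) case: the empirical first moment fixes the mean direction, and the concentration follows by (approximately) inverting A_d(kappa) = I_{d/2}(kappa) / I_{d/2-1}(kappa). The function name fit_vmf_by_moment_matching, the use of the Banerjee et al. closed-form approximation for the inverse, and the SciPy sampler in the demo are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_vmf_by_moment_matching(samples):
    """Moment-match a von Mises-Fisher (VMF) density to unit-vector samples.

    The VMF first moment is E[x] = A_d(kappa) * mu with
    A_d(kappa) = I_{d/2}(kappa) / I_{d/2-1}(kappa), so matching the
    empirical mean determines mu and, implicitly, kappa.
    """
    n, d = samples.shape
    m = samples.mean(axis=0)               # empirical first moment
    r_bar = np.linalg.norm(m)              # mean resultant length, in (0, 1)
    mu = m / r_bar                          # estimated mean direction
    # Closed-form approximation to A_d^{-1}(r_bar) (Banerjee et al., 2005);
    # an exact moment match would invert A_d numerically instead.
    kappa = r_bar * (d - r_bar ** 2) / (1.0 - r_bar ** 2)
    return mu, kappa

# Minimal check (assumes SciPy >= 1.11 for the vonmises_fisher sampler):
if __name__ == "__main__":
    from scipy.stats import vonmises_fisher
    rng = np.random.default_rng(0)
    mu_true = np.array([0.0, 0.0, 1.0])
    x = vonmises_fisher(mu_true, 20.0).rvs(5000, random_state=rng)
    print(fit_vmf_by_moment_matching(x))   # mu near [0, 0, 1], kappa near 20
```

Under the paper's result, the density obtained by matching the relevant moment is also the KLD-optimal approximation within the VMF family; the analogous statement holds for the Watson distribution.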
Pages: 2087-2094
Number of pages: 8