Kullback-Leibler Divergence and Moment Matching for Hyperspherical Probability Distributions

Cited: 0
Authors
Kurz, Gerhard [1 ]
Pfaff, Florian [1 ]
Hanebeck, Uwe D. [1 ]
Affiliations
[1] Karlsruhe Inst Technol, Inst Anthropomat & Robot, ISAS, Intelligent Sensor Actuator Syst Lab, D-76021 Karlsruhe, Germany
Keywords
von Mises-Fisher distribution; Watson distribution; parameter estimation
DOI
Not available
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
When approximating one probability density with another density, it is desirable to minimize the information loss of the approximation as quantified by, e.g., the Kullback-Leibler divergence (KLD). It has been known for some time that in the case of the Gaussian distribution, matching the first two moments of the original density yields the optimal approximation in terms of minimizing the KLD. In this paper, we will show that a similar property can be proven for certain hyperspherical probability distributions, namely the von Mises-Fisher and the Watson distribution. This result has profound implications for moment-based filtering on the unit hypersphere as it shows that moment-based approaches are optimal in the information-theoretic sense.
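The moment-matching property described in the abstract can be illustrated with a small fitting routine. The sketch below is an illustration under stated assumptions, not the paper's own method: it estimates the von Mises-Fisher mean direction from the normalized first sample moment and recovers the concentration from the mean resultant length via the widely used approximation of Banerjee et al. (2005). The function name fit_vmf_moment_match and the use of NumPy are illustrative choices.

```python
import numpy as np

def fit_vmf_moment_match(samples):
    """Fit a von Mises-Fisher distribution by moment matching (sketch).

    The mean direction is the normalized first sample moment; the
    concentration kappa is recovered from the mean resultant length
    with the approximation of Banerjee et al. (2005). Assumes the rows
    of `samples` already lie on the unit hypersphere.
    """
    samples = np.asarray(samples)        # shape (n, d)
    d = samples.shape[1]
    first_moment = samples.mean(axis=0)  # first moment E[x]
    r_bar = np.linalg.norm(first_moment) # mean resultant length in [0, 1)
    mu = first_moment / r_bar            # estimated mean direction
    kappa = r_bar * (d - r_bar**2) / (1.0 - r_bar**2)  # approximate concentration
    return mu, kappa

# Usage: roughly concentrated samples around the north pole of S^2
# (normalized offset Gaussians, not exact vMF draws, but enough to
# exercise the estimator).
rng = np.random.default_rng(0)
x = rng.normal(loc=[0.0, 0.0, 5.0], scale=1.0, size=(1000, 3))
x /= np.linalg.norm(x, axis=1, keepdims=True)  # project onto the unit sphere
mu_hat, kappa_hat = fit_vmf_moment_match(x)
print("mu ~", mu_hat, "kappa ~", kappa_hat)
```

Per the abstract, a moment-matched fit of this kind is, within the von Mises-Fisher family, also optimal in the Kullback-Leibler sense, which is what justifies moment-based filtering on the unit hypersphere.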
Pages: 2087 - 2094
Page count: 8
Related Papers
50 records in total
  • [21] Kullback-Leibler divergence for evaluating bioequivalence
    Dragalin, V
    Fedorov, V
    Patterson, S
    Jones, B
    [J]. STATISTICS IN MEDICINE, 2003, 22 (06) : 913 - 930
  • [22] Source Resolvability with Kullback-Leibler Divergence
    Nomura, Ryo
    [J]. 2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 2042 - 2046
  • [23] Kullback-Leibler divergence: A quantile approach
    Sankaran, P. G.
    Sunoj, S. M.
    Nair, N. Unnikrishnan
    [J]. STATISTICS & PROBABILITY LETTERS, 2016, 111 : 72 - 79
  • [24] The Kullback-Leibler divergence and nonnegative matrices
    Boche, Holger
    Stanczak, Slawomir
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2006, 52 (12) : 5539 - 5545
  • [25] A decision cognizant Kullback-Leibler divergence
    Ponti, Moacir
    Kittler, Josef
    Riva, Mateus
    de Campos, Teofilo
    Zor, Cemre
    [J]. PATTERN RECOGNITION, 2017, 61 : 470 - 478
  • [26] AN INVOLUTION INEQUALITY FOR THE KULLBACK-LEIBLER DIVERGENCE
    Pinelis, Iosif
    [J]. MATHEMATICAL INEQUALITIES & APPLICATIONS, 2017, 20 (01) : 233 - 235
  • [27] Anomaly detection based on probability density function with Kullback-Leibler divergence
    Wang, Wei
    Zhang, Baoju
    Wang, Dan
    Jiang, Yu
    Qin, Shan
    [J]. SIGNAL PROCESSING, 2016, 126 : 12 - 17
  • [28] Model Fusion with Kullback-Leibler Divergence
    Claici, Sebastian
    Yurochkin, Mikhail
    Ghosh, Soumya
    Solomon, Justin
    [J]. 25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019
  • [29] Kullback-Leibler Divergence Measure for Multivariate Skew-Normal Distributions
    Contreras-Reyes, Javier E.
    Arellano-Valle, Reinaldo B.
    [J]. ENTROPY, 2012, 14 (09) : 1606 - 1626
  • [30] Robust Kullback-Leibler Divergence and Universal Hypothesis Testing for Continuous Distributions
    Yang, Pengfei
    Chen, Biao
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2019, 65 (04) : 2360 - 2373