Kullback-Leibler Divergence and Moment Matching for Hyperspherical Probability Distributions

Cited by: 0
Authors
Kurz, Gerhard [1]
Pfaff, Florian [1]
Hanebeck, Uwe D. [1]
Affiliation
[1] Karlsruhe Institute of Technology (KIT), Institute for Anthropomatics and Robotics, Intelligent Sensor-Actuator-Systems Laboratory (ISAS), D-76021 Karlsruhe, Germany
Keywords
von Mises-Fisher distribution; Watson distribution; parameter estimation
DOI
Not available
Chinese Library Classification (CLC)
TP301 [Theory and Methods]
Subject classification code
081202
Abstract
When approximating one probability density with another density, it is desirable to minimize the information loss of the approximation as quantified by, e.g., the Kullback-Leibler divergence (KLD). It has been known for some time that in the case of the Gaussian distribution, matching the first two moments of the original density yields the optimal approximation in terms of minimizing the KLD. In this paper, we will show that a similar property can be proven for certain hyperspherical probability distributions, namely the von Mises-Fisher and the Watson distribution. This result has profound implications for moment-based filtering on the unit hypersphere as it shows that moment-based approaches are optimal in the information-theoretic sense.
Pages: 2087-2094
Number of pages: 8
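
Illustration of the moment-matching idea from the abstract. The following Python snippet is a minimal sketch, not the paper's own implementation: it fits a von Mises-Fisher distribution to unit-vector samples by matching the first moment, fixing the mean direction from the sample mean resultant vector and solving A_d(kappa) = r_bar for the concentration via a Banerjee-style closed-form initialization refined by Newton steps. The function name fit_vmf_moment_matching and the iteration count are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of moment matching for the
# von Mises-Fisher (vMF) distribution on the unit hypersphere S^(d-1).
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function I_v


def fit_vmf_moment_matching(samples):
    """Fit a vMF density by matching its first moment to the sample mean.

    samples: (n, d) array of unit vectors.
    Returns (mu, kappa) such that E[x] = A_d(kappa) * mu equals the sample
    mean resultant vector, with A_d(kappa) = I_{d/2}(kappa) / I_{d/2-1}(kappa).
    """
    n, d = samples.shape
    resultant = samples.sum(axis=0)
    r_bar = np.linalg.norm(resultant) / n          # mean resultant length in [0, 1)
    mu = resultant / np.linalg.norm(resultant)     # moment-matched mean direction

    # Closed-form initial guess for the concentration (Banerjee et al., 2005).
    kappa = max(r_bar * (d - r_bar**2) / (1.0 - r_bar**2), 1e-6)

    # Newton iterations on A_d(kappa) = r_bar; the exponential scaling of
    # `ive` cancels in the ratio, so it equals I_{d/2} / I_{d/2-1}.
    for _ in range(25):
        a = ive(d / 2, kappa) / ive(d / 2 - 1, kappa)
        a_prime = 1.0 - a**2 - (d - 1) / kappa * a   # derivative of A_d(kappa)
        kappa = max(kappa - (a - r_bar) / a_prime, 1e-6)
    return mu, kappa


# Usage example: samples concentrated around the north pole of the unit sphere.
rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 3)) + 4.0 * np.array([0.0, 0.0, 1.0])
x /= np.linalg.norm(x, axis=1, keepdims=True)       # project onto the unit sphere
mu_hat, kappa_hat = fit_vmf_moment_matching(x)
```

For the Watson distribution, which is antipodally symmetric and therefore has a vanishing first moment, the analogous procedure matches the second moment (the scatter matrix E[x xᵀ]) rather than the mean resultant vector.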