Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions

Cited by: 30
Authors
Bouhlel, Nizar [1 ]
Dziri, Ali [2 ]
Affiliations
[1] ENSTA Bretagne, Lab STICC, F-29200 Brest, France
[2] Conservatoire Natl Arts & Metiers, F-75003 Paris, France
Keywords
Multivariate generalized Gaussian distribution; Kullback-Leibler divergence; Lauricella function;
DOI
10.1109/LSP.2019.2915000
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronic technology, communication technology]
Subject classification codes
0808; 0809
摘要
The Kullback-Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool in many signal and image processing applications. Until now, no explicit form of the KLD between MGGDs has been known, and in practice it is either estimated by expensive Monte-Carlo stochastic integration or approximated. The main contribution of this letter is a closed-form expression of the KLD between two zero-mean MGGDs. Based on the Lauricella series, a simple way of computing the KLD numerically is presented. Finally, we show that the Monte-Carlo approximation of the KLD converges to its theoretical value as the number of samples goes to infinity.
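The following Python sketch illustrates the Monte-Carlo estimation of the KLD mentioned in the abstract, assuming the standard zero-mean MGGD parameterization with scatter matrix Sigma and shape parameter beta (beta = 1 recovers the multivariate Gaussian). The function names, the Gamma-based radial sampling scheme, and the example parameters are illustrative assumptions; the closed-form Lauricella-series expression derived in the letter is not reproduced here.

import numpy as np
from scipy.special import gammaln

def mggd_logpdf(x, sigma, beta):
    """Log-density of a zero-mean MGGD at the rows of x (shape (n, m))."""
    m = sigma.shape[0]
    sigma_inv = np.linalg.inv(sigma)
    _, logdet = np.linalg.slogdet(sigma)
    quad = np.einsum('ni,ij,nj->n', x, sigma_inv, x)   # x^T Sigma^{-1} x
    log_norm = (gammaln(m / 2) + np.log(beta)
                - (m / 2) * np.log(np.pi)
                - gammaln(m / (2 * beta))
                - (m / (2 * beta)) * np.log(2)
                - 0.5 * logdet)
    return log_norm - 0.5 * quad ** beta

def mggd_sample(n, sigma, beta, rng):
    """Draw n zero-mean MGGD samples via the stochastic representation
    x = r * A u, with u uniform on the unit sphere and r^(2*beta)/2 ~ Gamma(m/(2*beta))."""
    m = sigma.shape[0]
    u = rng.standard_normal((n, m))
    u /= np.linalg.norm(u, axis=1, keepdims=True)      # uniform direction on the sphere
    g = rng.gamma(shape=m / (2 * beta), scale=1.0, size=n)
    r = (2 * g) ** (1 / (2 * beta))                    # radial part
    a = np.linalg.cholesky(sigma)                      # Sigma = A A^T
    return (r[:, None] * u) @ a.T

def kld_monte_carlo(sigma_p, beta_p, sigma_q, beta_q, n=200_000, seed=0):
    """Estimate KL(p || q) = E_p[log p(X) - log q(X)] by sampling from p."""
    rng = np.random.default_rng(seed)
    x = mggd_sample(n, sigma_p, beta_p, rng)
    return np.mean(mggd_logpdf(x, sigma_p, beta_p) - mggd_logpdf(x, sigma_q, beta_q))

if __name__ == "__main__":
    # Hypothetical example parameters, chosen only for illustration.
    sigma_p = np.array([[1.0, 0.3], [0.3, 1.0]])
    sigma_q = np.array([[1.5, 0.0], [0.0, 0.8]])
    print(kld_monte_carlo(sigma_p, 0.8, sigma_q, 1.2))

As the abstract states, such a Monte-Carlo estimate converges to the theoretical KLD as the number of samples grows, which is how the letter's closed-form expression can be checked numerically.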
Pages: 1021-1025
Number of pages: 5
Related papers
50 records in total
  • [1] On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions
    Zhang, Yufeng
    Pan, Jialu
    Li, Kenli
    Liu, Wanwei
    Chen, Zhenbang
    Liu, Xinwang
    Wang, Ji
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [2] The Kullback-Leibler Divergence Between Lattice Gaussian Distributions
    Nielsen, Frank
    [J]. JOURNAL OF THE INDIAN INSTITUTE OF SCIENCE, 2022, 102 (04) : 1177 - 1188
  • [3] KULLBACK-LEIBLER DISTANCE BETWEEN COMPLEX GENERALIZED GAUSSIAN DISTRIBUTIONS
    Nafornita, Corina
    Berthoumieu, Yannick
    Nafornita, Ioan
    Isar, Alexandru
    [J]. 2012 PROCEEDINGS OF THE 20TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2012, : 1850 - 1854
  • [4] Incipient fault detection for geological drilling processes using multivariate generalized Gaussian distributions and Kullback-Leibler divergence
    Li, Yupeng
    Cao, Weihua
    Hu, Wenkai
    Xiong, Ying
    Wu, Min
    [J]. CONTROL ENGINEERING PRACTICE, 2021, 117
  • [5] Distributions of the Kullback-Leibler divergence with applications
    Belov, Dmitry I.
    Armstrong, Ronald D.
    [J]. BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2011, 64 (02): : 291 - 309
  • [6] Kullback-Leibler Divergence Measure for Multivariate Skew-Normal Distributions
    Contreras-Reyes, Javier E.
    Arellano-Valle, Reinaldo B.
    [J]. ENTROPY, 2012, 14 (09) : 1606 - 1626
  • [7] Abnormality detection based on the Kullback-Leibler divergence for generalized Gaussian data
    Xiong, Ying
    Jing, Yindi
    Chen, Tongwen
    [J]. CONTROL ENGINEERING PRACTICE, 2019, 85 : 257 - 270
  • [8] Kullback-Leibler Divergence Estimation of Continuous Distributions
    Perez-Cruz, Fernando
    [J]. 2008 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-6, 2008, : 1666 - 1670
  • [9] Exact Expressions for Kullback-Leibler Divergence for Multivariate and Matrix-Variate Distributions
    Nawa, Victor
    Nadarajah, Saralees
    [J]. ENTROPY, 2024, 26 (08)