Kullback-Leibler Distance in Linear Parametric Modeling

Cited by: 2
Author
Beheshti, Soosan [1 ]
Affiliation
[1] Ryerson Univ, Dept Elect & Comp Engn, Toronto, ON M5B 2K3, Canada
Keywords
DOI
10.1109/ISIT.2008.4595272
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
This paper addresses the estimation of the Kullback-Leibler (KL) distance in data-driven modeling of parametric probability distributions. Given a finite set of observations from a parametric probability density function (pdf), the goal is to provide the best representative of the true parameter, which is known to belong to a given parametric model set. The first step in this problem setting is to estimate the true parameter within available nested model sets of different orders. The proposed method considers the KL distance between each of these estimates and the unknown true parameter. Using only the observed data, probabilistic worst-case bounds on these KL distances are derived. The best candidate among the available estimates is then the solution of the resulting probabilistic min-max problem. A comparison of this approach with existing methods for estimating the KL distance is also provided.
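To make the quantities in the abstract concrete, the sketch below is a minimal toy illustration, not the paper's actual procedure: it fits nested linear (polynomial) models of increasing order by least squares and scores each estimate with the closed-form KL divergence between the Gaussian output densities induced by the estimate and by the true parameter. The true parameter, the noise level sigma, and the candidate orders are all assumptions made for this example; the paper instead bounds these KL distances using only the observed data and selects the order through a probabilistic min-max criterion, which this sketch does not implement.

    # Toy sketch (assumed setup, not the paper's bound): least-squares fits in
    # nested polynomial model sets, scored by the KL divergence between the
    # Gaussian output densities they induce.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data from a linear parametric model; the true order (3) and
    # the noise level are assumptions made only for this illustration.
    n, sigma, true_order = 200, 0.5, 3
    theta_true = np.array([1.0, -2.0, 0.5, 1.5])   # coefficients of 1, x, x^2, x^3
    x = rng.uniform(-1.0, 1.0, size=n)
    X_true = np.vander(x, true_order + 1, increasing=True)
    y = X_true @ theta_true + sigma * rng.normal(size=n)

    def kl_gaussian_means(mu_p, mu_q, noise_std):
        # KL( N(mu_p, s^2 I) || N(mu_q, s^2 I) ) = ||mu_p - mu_q||^2 / (2 s^2)
        return float(np.sum((mu_p - mu_q) ** 2) / (2.0 * noise_std ** 2))

    # Estimate the parameter in nested model sets of increasing order and report
    # the KL distance of each estimate from the true model.  The paper replaces
    # this oracle quantity with probabilistic worst-case bounds computed from
    # the observed data alone.
    for order in range(1, 8):
        X_m = np.vander(x, order + 1, increasing=True)
        theta_hat, *_ = np.linalg.lstsq(X_m, y, rcond=None)
        kl = kl_gaussian_means(X_true @ theta_true, X_m @ theta_hat, sigma)
        print(f"order {order}: KL distance to true model = {kl:.3f}")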
Pages: 1671-1675
Number of pages: 5
Related Papers
50 records in total
  • [41] Kullback-Leibler distance as a measure of the information filtered from multivariate data
    Tumminello, Michele
    Lillo, Fabrizio
    Mantegna, Rosario N.
    PHYSICAL REVIEW E, 2007, 76(3)
  • [42] Reducing the Plagiarism Detection Search Space on the Basis of the Kullback-Leibler Distance
    Barron-Cedeno, Alberto
    Rosso, Paolo
    Benedi, Jose-Miguel
    COMPUTATIONAL LINGUISTICS AND INTELLIGENT TEXT PROCESSING, 2009, 5449: 523+
  • [43] Gaussian Kullback-Leibler Approximate Inference
    Challis, Edward
    Barber, David
    JOURNAL OF MACHINE LEARNING RESEARCH, 2013, 14: 2239-2286
  • [44] Model Fusion with Kullback-Leibler Divergence
    Claici, Sebastian
    Yurochkin, Mikhail
    Ghosh, Soumya
    Solomon, Justin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, 2020, Vol. 119
  • [45] Use of Kullback-Leibler divergence for forgetting
    Karny, Miroslav
    Andrysek, Josef
    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2009, 23(10): 961-975
  • [46] Nonparametric Estimation of Kullback-Leibler Divergence
    Zhang, Zhiyi
    Grabchak, Michael
    NEURAL COMPUTATION, 2014, 26(11): 2570-2593
  • [47] Kullback-Leibler Divergence Metric Learning
    Ji, Shuyi
    Zhang, Zizhao
    Ying, Shihui
    Wang, Liejun
    Zhao, Xibin
    Gao, Yue
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52(4): 2047-2058
  • [48] Consistent estimator for basis selection based on a proxy of the Kullback-Leibler distance
    Dias, Ronaldo
    Garcia, Nancy L.
    JOURNAL OF ECONOMETRICS, 2007, 141(1): 167-178
  • [49] Kullback-Leibler distance-based enhanced detection of incipient anomalies
    Harrou, Fouzi
    Sun, Ying
    Madakyaru, Muddu
    JOURNAL OF LOSS PREVENTION IN THE PROCESS INDUSTRIES, 2016, 44: 73-87
  • [50] Kullback-Leibler life time testing
    Stehlik, M.
    Economou, P.
    Kiselak, J.
    Richter, W-D
    APPLIED MATHEMATICS AND COMPUTATION, 2014, 240: 122-139