Use of Kullback-Leibler divergence for forgetting

Cited: 12
Authors
Karny, Miroslav [1 ]
Andrysek, Josef [1 ]
Affiliation
[1] Acad Sci Czech Republic, Inst Informat Theory & Automat, Adapt Syst Dept, CR-18208 Prague, Czech Republic
Keywords
Bayesian estimation; Kullback-Leibler divergence; functional approximation of estimation; parameter tracking by stabilized forgetting; ARX model
DOI
10.1002/acs.1080
CLC number
TP [Automation and Computer Technology]
Subject classification
0812
Abstract
The non-symmetric Kullback-Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686-690) showed its unique role in the approximation of pdfs; the order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, which serve for tracking slowly varying parameters, use the reversed order. This choice has a pragmatic motivation: a recursive estimator often approximates the parametric model by a member of the exponential family (EF), since an EF model maps prior pdfs from the set of conjugate pdfs (CEF) back to the CEF. Approximations based on the KLD with the reversed order of arguments preserve this property. This paper advocates approximation performed within the CEF but with the proper order of the KLD arguments. The approach is applied to parameter tracking, and performance improvements are demonstrated. This practical result is of importance for adaptive systems and opens a way to improving the functional approximation. Copyright (C) 2008 John Wiley & Sons, Ltd.
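For orientation, a minimal LaTeX sketch of the two argument orders at issue; the notation (f, \hat f, \mathcal{F}, \lambda, \bar f) is introduced here for illustration and is not quoted from the paper:

% Non-symmetric KLD of pdf g from pdf f:
\[
  D\bigl(f \,\big\|\, g\bigr) \;=\; \int f(x)\,\ln\frac{f(x)}{g(x)}\,\mathrm{d}x,
  \qquad D(f\,\|\,g) \neq D(g\,\|\,f) \text{ in general.}
\]
% Proper order (Bernardo): project the true pdf f onto the approximating
% family \mathcal{F}:  \hat f = \arg\min_{g \in \mathcal{F}} D(f \,\|\, g).
% Reversed, computationally convenient order:
%   \hat f = \arg\min_{g \in \mathcal{F}} D(g \,\|\, f),
% which keeps conjugate (CEF) priors inside the CEF in recursive estimation.
% Stabilized (exponential) forgetting, in its commonly used form (stated here
% as context, not quoted from the paper), flattens the posterior toward an
% alternative pdf \bar f with forgetting factor \lambda \in (0,1]:
\[
  f_t(\theta) \;\propto\;
  f\bigl(\theta \mid d(1),\dots,d(t)\bigr)^{\lambda}\,
  \bar f(\theta)^{\,1-\lambda}.
\]

As context for the abstract's claim: for an exponential-family \mathcal{F}, the proper-order minimizer is characterized by matching the expected sufficient statistics under f (a standard property of KLD projections), which is what makes working with the proper order inside the CEF feasible.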
Pages: 961-975
Page count: 15