A Cautionary Note on Model Choice and the Kullback-Leibler Information

Cited by: 0
Authors
Heyde, C. C. [1 ,2 ]
Au, K. [2 ,3 ]
Affiliations
[1] Columbia Univ, Dept Stat, New York, NY 10027 USA
[2] Australian Natl Univ, Inst Math Sci, Canberra, ACT 0200, Australia
[3] Univ Melbourne, Dept Math, Melbourne, Vic 3010, Australia
Keywords
Discrimination; Entropy; Heavy-tailed; Statistical distance
DOI
10.1080/15598608.2008.10411872
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
The Kullback-Leibler information has found application in many areas of statistical science. It typically arises in model choice and model dimension questions in a way which suggests its use as a distance. Indeed, it has been widely described as a distance, although it comprehensively fails to be a metric. Some pitfalls in interpreting it as a distance are discussed, in particular in its application to discriminating between prospective distributions of risky asset returns.
Pages: 221-232
Number of pages: 12
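
The sketch below is not part of the original record; it is a minimal numerical illustration, under assumed distributions, of the point made in the abstract: the Kullback-Leibler information D(P || Q), the integral of p log(p/q), is not symmetric in its arguments (and also fails the triangle inequality), so it cannot be a metric. The Student t and normal densities, the truncation to [-10, 10], and the function name kl are illustrative choices, not taken from the paper.

import numpy as np
from scipy.stats import norm, t

# Riemann-sum approximation of the two continuous KL informations
# between a heavy-tailed and a light-tailed candidate density.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
p = t.pdf(x, df=3)   # heavy-tailed candidate: Student t with 3 d.f.
q = norm.pdf(x)      # light-tailed candidate: standard normal

def kl(f, g):
    # Approximate D(f || g) = sum of f * log(f / g) * dx over the grid.
    return float(np.sum(f * np.log(f / g)) * dx)

print("D(t_3 || N(0,1)) =", round(kl(p, q), 4))
print("D(N(0,1) || t_3) =", round(kl(q, p), 4))
# The two values differ, so treating the KL information as a distance
# between a heavy-tailed returns model and a Gaussian one depends on
# the direction in which it is computed.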