APPROXIMATION OF INFORMATION DIVERGENCES FOR STATISTICAL LEARNING WITH APPLICATIONS

Cited by: 3
Authors
Stehlik, Milan [1 ,2 ,3 ]
Somorcik, Jan [4 ]
Strelec, Lubos [5 ]
Antoch, Jaromir [6 ]
Affiliations
[1] Johannes Kepler Univ Linz, Inst Appl Stat, Altenberger Str 69, A-4040 Linz, Austria
[2] Johannes Kepler Univ Linz, Linz Inst Technol, Altenberger Str 69, A-4040 Linz, Austria
[3] Univ Valparaiso, Inst Stat, Gran Bretana 1111, Valparaiso, Chile
[4] Comenius Univ, Dept Appl Math & Stat, SK-84248 Bratislava, Slovakia
[5] Mendel Univ Brno, Dept Stat & Operat Anal, Zemedelska 1, CZ-61300 Brno, Czech Republic
[6] Charles Univ Prague, Dept Probabil & Math Stat, Sokolovska 83, CZ-18675 Prague, Czech Republic
Keywords
deconvolution; information divergence; likelihood; change in intensity of Poisson process; exact slopes; LR tests; distributions; optimality; parameter; sense
DOI
10.1515/ms-2017-0177
Chinese Library Classification: O1 [Mathematics]
Subject classification: 0701; 070101
Abstract
In this paper we give a partial response to one of the most important statistical questions, namely, what optimal statistical decisions are and how they relate to (statistical) information theory. We exemplify the necessity of understanding the structure of information divergences and their approximations, which may in particular be understood through deconvolution. Deconvolution of information divergences is illustrated in the exponential family of distributions, leading to optimal tests in the Bahadur sense. We provide a new approximation of I-divergences using the Fourier transform, a saddle-point approximation, and the uniform convergence of Euler polygons. Uniform approximation of the deconvoluted parts of I-divergences is also discussed. Our approach is illustrated on a real-data example. (C) 2018 Mathematical Institute, Slovak Academy of Sciences
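As background for the abstract (standard theory, not a result taken from the paper itself), the I-divergence in question is the Kullback-Leibler divergence; a minimal sketch of its definition follows, together with its well-known closed form for the Poisson-intensity setting named in the keywords. The notation below is ours, chosen for illustration.

% Standard Kullback--Leibler (I-)divergence between densities
% f(.; theta_0) and f(.; theta); a background sketch, not the paper's notation.
\[
  I(\theta_0 \,\|\, \theta)
    = \mathbb{E}_{\theta_0}\!\left[ \log \frac{f(X;\theta_0)}{f(X;\theta)} \right].
\]
% For a Poisson process with intensity \lambda observed on [0, t]
% (cf. the keyword "change in intensity of Poisson process"), the count
% N_t \sim \mathrm{Poisson}(\lambda t) yields the closed form
\[
  I(\lambda_0 \,\|\, \lambda)
    = t\,\lambda_0 \log\frac{\lambda_0}{\lambda} + t\,(\lambda - \lambda_0).
\]

Closed forms of this kind are available throughout the exponential family, which is what makes the deconvolution and saddle-point approximations described in the abstract tractable in that setting.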
Pages: 1149–1172 (24 pages)