APPROXIMATION OF INFORMATION DIVERGENCES FOR STATISTICAL LEARNING WITH APPLICATIONS

Cited by: 3
Authors
Stehlik, Milan [1,2,3]
Somorcik, Jan [4 ]
Strelec, Lubos [5 ]
Antoch, Jaromir [6 ]
Affiliations
[1] Johannes Kepler Univ Linz, Inst Appl Stat, Altenberger Str 69, A-4040 Linz, Austria
[2] Johannes Kepler Univ Linz, Linz Inst Technol, Altenberger Str 69, A-4040 Linz, Austria
[3] Univ Valparaiso, Inst Stat, Gran Bretana 1111, Valparaiso, Chile
[4] Comenius Univ, Dept Appl Math & Stat, SK-84248 Bratislava, Slovakia
[5] Mendel Univ Brno, Dept Stat & Operat Anal, Zemedelska 1, CZ-61300 Brno, Czech Republic
[6] Charles Univ Prague, Dept Probabil & Math Stat, Sokolovska 83, CZ-18675 Prague, Czech Republic
Keywords
deconvolution; information divergence; likelihood; change in intensity of Poisson process; exact slopes; LR tests; distributions; optimality; parameter; sense
DOI
10.1515/ms-2017-0177
Chinese Library Classification
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
In this paper we give a partial answer to one of the most important statistical questions, namely, what optimal statistical decisions are and how they relate to (statistical) information theory. We exemplify the necessity of understanding the structure of information divergences and their approximations, which may in particular be understood through deconvolution. Deconvolution of information divergences is illustrated in the exponential family of distributions, leading to tests that are optimal in the Bahadur sense. We provide a new approximation of I-divergences using the Fourier transformation, the saddle point approximation, and uniform convergence of the Euler polygons. Uniform approximation of the deconvoluted parts of I-divergences is also discussed. Our approach is illustrated on a real data example. (C) 2018 Mathematical Institute Slovak Academy of Sciences
Pages: 1149-1172
Number of pages: 24
Related Papers (50 records in total)
  • [1] Menendez, M. L.; Pardo, L.; Morales, D.; Salicru, M.: Some statistical applications of (R,S)-directed divergences. Utilitas Mathematica, 1992, 42: 115-127.
  • [2] Landaburu, E.; Pardo, L.: Asymptotic behaviour and statistical applications of weighted (h,φ)-divergences. Kybernetes, 2004, 33(9-10): 1518-1534.
  • [3] Stanitsas, Panagiotis; Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos: Clustering positive definite matrices by learning information divergences. 2017 IEEE International Conference on Computer Vision Workshops (ICCVW 2017), 2017: 1304-1312.
  • [5] Nielsen, Frank: Closed-form information-theoretic divergences for statistical mixtures. 2012 21st International Conference on Pattern Recognition (ICPR 2012), 2012: 1723-1726.
  • [6] Stehlik, M.: Decompositions of information divergences: recent development, open problems and applications. 9th International Conference on Mathematical Problems in Engineering, Aerospace and Sciences (ICNPAA 2012), 2012, 1493: 972-976.
  • [7] Cocolicchio, D.; Viggiano, M.: WKB approximation without divergences. International Journal of Theoretical Physics, 1997, 36(12): 3051-3064.
  • [9] Mori, Ryuhei; Tanaka, Toshiyuki: Central approximation in statistical physics and information theory. 2012 IEEE International Symposium on Information Theory Proceedings (ISIT), 2012.
  • [10] Gulfirat, Mustafa: A-statistical convergence with a rate and applications to approximation. Filomat, 2022, 36(15): 5323-5335.