APPROXIMATION OF INFORMATION DIVERGENCES FOR STATISTICAL LEARNING WITH APPLICATIONS

Cited: 3
Authors
Stehlik, Milan [1 ,2 ,3 ]
Somorcik, Jan [4 ]
Strelec, Lubos [5 ]
Antoch, Jaromir [6 ]
Affiliations
[1] Johannes Kepler Univ Linz, Inst Appl Stat, Altenberger Str 69, A-4040 Linz, Austria
[2] Johannes Kepler Univ Linz, Linz Inst Technol, Altenberger Str 69, A-4040 Linz, Austria
[3] Univ Valparaiso, Inst Stat, Gran Bretana 1111, Valparaiso, Chile
[4] Comenius Univ, Dept Appl Math & Stat, SK-84248 Bratislava, Slovakia
[5] Mendel Univ Brno, Dept Stat & Operat Anal, Zemedelska 1, CZ-61300 Brno, Czech Republic
[6] Charles Univ Prague, Dept Probabil & Math Stat, Sokolovska 83, CZ-18675 Prague, Czech Republic
Keywords
deconvolution; information divergence; likelihood; change in intensity of Poisson process; EXACT SLOPES; LR TESTS; DISTRIBUTIONS; OPTIMALITY; PARAMETER; SENSE
DOI
10.1515/ms-2017-0177
Chinese Library Classification
O1 [Mathematics]
Subject classification codes
0701; 070101
Abstract
In this paper we give a partial response to one of the most important statistical questions, namely, what optimal statistical decisions are and how they are related to (statistical) information theory. We exemplify the necessity of understanding the structure of information divergences and their approximations, which may in particular be understood through deconvolution. Deconvolution of information divergences is illustrated in the exponential family of distributions, leading to the optimal tests in the Bahadur sense. We provide a new approximation of I-divergences using the Fourier transformation, saddle point approximation, and uniform convergence of the Euler polygons. Uniform approximation of deconvoluted parts of I-divergences is also discussed. Our approach is illustrated on a real data example. (C) 2018 Mathematical Institute Slovak Academy of Sciences
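As a quick illustration of the I-divergence within the exponential family mentioned in the abstract (a minimal sketch, not the paper's deconvolution or saddle-point scheme), the Kullback-Leibler I-divergence between two exponential densities with rates lam1 and lam2 has the closed form ln(lam1/lam2) + lam2/lam1 - 1, which a crude numerical integration of the defining integral can confirm:

```python
import math

def kl_exponential(lam1, lam2):
    """Closed-form I-divergence (Kullback-Leibler divergence)
    KL(Exp(lam1) || Exp(lam2)) = ln(lam1/lam2) + lam2/lam1 - 1."""
    return math.log(lam1 / lam2) + lam2 / lam1 - 1.0

def kl_numeric(lam1, lam2, upper=50.0, n=100_000):
    """Midpoint-rule approximation of the defining integral
    int_0^inf p(x) log(p(x)/q(x)) dx, truncated at `upper`."""
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        p = lam1 * math.exp(-lam1 * x)
        q = lam2 * math.exp(-lam2 * x)
        total += p * math.log(p / q) * h
    return total

closed = kl_exponential(2.0, 0.5)
numeric = kl_numeric(2.0, 0.5)
print(closed, numeric)
```

The agreement between the closed form and the truncated integral reflects the general fact that divergences in exponential families reduce to expressions in the natural parameters, which is what makes the deconvolution and approximation arguments of the paper tractable.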
Pages: 1149 - 1172
Page count: 24
Related papers
50 total
  • [21] Applications of relative statistical convergence and associated approximation theorem
    Su, Lian-Ta
    Raj, Kuldip
    Sharma, Sonali
    Cai, Qing-Bo
    AIMS MATHEMATICS, 2022, 7 (12) : 20838 - 20849
  • [22] The Information Geometry of Bregman Divergences and Some Applications in Multi-Expert Reasoning
    Adamcik, Martin
    ENTROPY, 2014, 16 (12) : 6338 - 6381
  • [23] Computing Statistical Divergences with Sigma Points
    Nielsen, Frank
    Nock, Richard
    GEOMETRIC SCIENCE OF INFORMATION (GSI 2021), 2021, 12829 : 677 - 684
  • [24] An Information Geometry of Statistical Manifold Learning
    Sun, Ke
    Marchand-Maillet, Stephane
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 1 - 9
  • [25] On Informational Divergences for General Statistical Theories
    Zanzinger, S.
    INTERNATIONAL JOURNAL OF THEORETICAL PHYSICS, 1998, 37 (01) : 357 - 363
  • [26] Statistical inference based on bridge divergences
    Kuchibhotla, Arun Kumar
    Mukherjee, Somabha
    Basu, Ayanendranath
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2019, 71 (03) : 627 - 656
  • [29] Free energies divergences as statistical quantifiers
    Pennini, F.
    Plastino, A.
    Yanez, J.
    Ferri, G. L.
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2021, 564
  • [30] Transport information Bregman divergences
    Li W.
    Information Geometry, 2021, 4 (2) : 435 - 470