Empirical Lipschitz Constants for the Renyi Entropy Maximum Likelihood Estimator

Cited by: 0
Authors
Konstantinides, John M. [1 ]
Andreadis, Ioannis [1 ]
Affiliation
[1] Democritus Univ Thrace, Dept Elect & Comp Engn, Elect Lab, Xanthi 67100, Greece
Keywords
Renyi entropy; maximum likelihood estimation; empirical Lipschitz constants; distribution-free analysis; strong mixing conditions; eta-mixing; INEQUALITIES; PROBABILITY; BOUNDS;
DOI
10.1109/TIT.2019.2900048
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Mathematical tools that are widely employed for the distribution-free analysis of the finite-sample performance of empirical estimators, such as Efron-Stein's inequality or Azuma's martingale inequality and its derivatives, rely on the provision of tight bounds for the Lipschitz constants of the estimators. Lipschitz constants can be easily derived for simple functionals of the empirical measures, such as the Shannon or the Tsallis entropy. However, obtaining tight bounds for more general entropy functionals which cannot be decomposed into sums of identical terms is much more involved. The goal of this paper is the derivation of (empirical) Lipschitz constants for the maximum likelihood estimator of the family of Renyi entropies of order λ > 0, λ ≠ 1. Analytic solutions for the optimal constants are obtained for the most important special cases, namely for 0 < λ < 1 and 1 < λ < 2, as well as for λ = 2 (collision entropy) and λ = 3. For the remaining cases, where no analytic solution is obtained (i.e., λ > 2, λ ≠ 3), an efficient way to compute the optimal constants is derived by reducing the complexity of the underlying optimization problem from exponential Ω(n^(‖A‖−1)) to linear O(n), where n is the number of available samples and ‖A‖ ≥ 2 is the number of symbols of the source (under the assumption that ‖A‖ does not grow with n). The optimal constants are subsequently used for the distribution-free performance analysis of the maximum likelihood estimator of the Renyi entropy; this includes variance and concentration bounds, both under the assumption of independence and under strong mixing conditions.
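As a rough illustration of the quantities discussed in the abstract, the sketch below computes the plug-in (maximum likelihood) Renyi entropy estimate from the empirical distribution and shows how a Lipschitz (bounded-differences) constant would enter a generic McDiarmid-type concentration bound for i.i.d. samples. This is a minimal sketch under stated assumptions: the function names are hypothetical, the constant c_n is a caller-supplied placeholder, and the paper's optimal empirical Lipschitz constants and its mixing-condition results are not reproduced here.

```python
import numpy as np

def renyi_entropy_mle(samples, alphabet_size, lam):
    """Plug-in (maximum likelihood) estimate of the Renyi entropy of order lam.

    The empirical distribution p_hat is the maximum likelihood estimate of the
    source distribution; the Renyi entropy of order lam (> 0, != 1) is
    H_lam(p) = log(sum_i p_i ** lam) / (1 - lam).
    """
    counts = np.bincount(np.asarray(samples), minlength=alphabet_size)
    p_hat = counts / counts.sum()
    return float(np.log(np.sum(p_hat ** lam)) / (1.0 - lam))

def mcdiarmid_bound(n, c_n, t):
    """Illustrative bounded-differences (McDiarmid-type) deviation bound.

    If replacing any single one of the n i.i.d. samples changes the estimate
    by at most c_n (a Lipschitz constant in the sense of the abstract), then
    P(|H_hat - E[H_hat]| >= t) <= 2 * exp(-2 * t**2 / (n * c_n**2)).
    The optimal c_n derived in the paper is not reproduced here; c_n is a
    placeholder supplied by the caller.
    """
    return 2.0 * np.exp(-2.0 * t ** 2 / (n * c_n ** 2))

# Example: collision entropy (lam = 2) of a sample from a 4-symbol source.
rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=200)
print(renyi_entropy_mle(x, alphabet_size=4, lam=2.0))
```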
Pages: 3540 - 3554
Number of pages: 15
Related Papers
50 records in total
  • [22] Optimality of the maximum likelihood estimator in astrometry
    Espinosa, Sebastian
    Silva, Jorge F.
    Mendez, Rene A.
    Lobos, Rodrigo
    Orchard, Marcos
    ASTRONOMY & ASTROPHYSICS, 2018, 616
  • [23] Concentration inequality of maximum likelihood estimator
    Miao, Yu
    APPLIED MATHEMATICS LETTERS, 2010, 23 (10) : 1305 - 1309
  • [24] Maximum Likelihood Estimator for the α-η-μ Fading Environment
    Batista, Fernando Palma
    de Souza, Rausley A. A.
    Ribeiro, Antonio Marcelo O.
    2016 IEEE RADIO AND WIRELESS SYMPOSIUM (RWS), 2016, : 133 - 136
  • [25] Asymptotic Efficiency of Maximum Likelihood Estimator
    Kaufman, S.
    ANNALS OF MATHEMATICAL STATISTICS, 1965, 36 (03): 1084 - &
  • [26] Maximum Likelihood Estimator of Quantum Probabilities
    Navara, Mirko
    Sevic, Jan
    INTERNATIONAL JOURNAL OF THEORETICAL PHYSICS, 2023, 62 (10)
  • [27] Asymptotic Efficiency of Maximum Likelihood Estimator
    Wolfowitz, J.
    THEORY OF PROBABILITY AND ITS APPLICATIONS, USSR, 1965, 10 (02): 247 - +
  • [28] A geometric characterization of maximum Renyi entropy distributions
    Vignat, Christophe
    Hero, Alfred O.
    Costa, Jose A.
    2006 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, VOLS 1-6, PROCEEDINGS, 2006, : 1822 - +
  • [29] Developments in maximum entropy and likelihood
    Gilmore, CJ
    Nicholson, WV
    DIRECT METHODS FOR SOLVING MACROMOLECULAR STRUCTURES, 1998, 507 : 455 - 462
  • [30] A Robust Version of the Empirical Likelihood Estimator
    Keziou, Amor
    Toma, Aida
    MATHEMATICS, 2021, 9 (08)