Empirical Lipschitz Constants for the Renyi Entropy Maximum Likelihood Estimator

Cited by: 0
Authors
Konstantinides, John M. [1 ]
Andreadis, Ioannis [1 ]
Affiliations
[1] Democritus Univ Thrace, Dept Elect & Comp Engn, Elect Lab, Xanthi 67100, Greece
Keywords
Renyi entropy; maximum likelihood estimation; empirical Lipschitz constants; distribution-free analysis; strong mixing conditions; eta-mixing; INEQUALITIES; PROBABILITY; BOUNDS;
DOI
10.1109/TIT.2019.2900048
Chinese Library Classification
TP [automation technology, computer technology]
Discipline Code
0812
Abstract
Mathematical tools that are widely employed for the distribution-free analysis of the finite-sample performance of empirical estimators, such as Efron-Stein's inequality or Azuma's martingale inequality and its derivatives, rely on the provision of tight bounds for the Lipschitz constants of the estimators. Lipschitz constants can easily be derived for simple functionals of the empirical measures, such as the Shannon or the Tsallis entropy. However, obtaining tight bounds for more general entropy functionals that cannot be decomposed into sums of identical terms is much more involved. The goal of this paper is the derivation of (empirical) Lipschitz constants for the maximum likelihood estimator of the family of Renyi entropies of order λ > 0, λ ≠ 1. Analytic solutions for the optimal constants are obtained for the most important special cases, namely for 0 < λ < 1, for 1 < λ < 2, as well as for λ = 2 (collision entropy) and λ = 3. For the remaining cases, where no analytic solution is obtained (i.e., λ > 2, λ ≠ 3), an efficient way to compute the optimal constants is derived by reducing the complexity of the underlying optimization problem from exponential Ω(n^(‖A‖−1)) to linear O(n), where n is the number of available samples and ‖A‖ ≥ 2 is the number of symbols of the source (under the assumption that ‖A‖ does not grow with n). The optimal constants are subsequently used for the distribution-free performance analysis of the maximum likelihood estimator of the Renyi entropy; this includes variance and concentration bounds, both under the assumption of independence and under strong mixing conditions.
Pages: 3540-3554
Number of pages: 15
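The abstract centers on two ingredients: the plug-in (maximum likelihood) estimator of the order-λ Renyi entropy and a bounded-difference (Lipschitz) constant plugged into a concentration inequality of the Azuma/McDiarmid type. The sketch below is a minimal illustration of those two ingredients only, assuming integer-coded i.i.d. samples; the function names and the placeholder constant c are illustrative, and the paper's actual contribution, the tight computation of the empirical Lipschitz constants themselves, is not reproduced here.

import numpy as np

def renyi_entropy_mle(samples, lam, alphabet_size):
    # Plug-in (maximum likelihood) estimate of the order-lam Renyi entropy,
    # H_lam(p_hat) = log(sum_i p_hat_i**lam) / (1 - lam), lam > 0, lam != 1,
    # where p_hat is the empirical distribution of the integer-coded samples.
    counts = np.bincount(np.asarray(samples), minlength=alphabet_size)
    p_hat = counts / counts.sum()
    return float(np.log(np.sum(p_hat ** lam)) / (1.0 - lam))

def mcdiarmid_deviation_bound(n, c, t):
    # McDiarmid-type bound for i.i.d. samples: if replacing any single sample
    # changes the estimate by at most c (a Lipschitz / bounded-difference
    # constant), then P(|H_hat - E[H_hat]| >= t) <= 2 * exp(-2 t^2 / (n c^2)).
    return 2.0 * np.exp(-2.0 * t ** 2 / (n * c ** 2))

# Example: collision entropy (lam = 2) of 1000 samples from a 4-symbol source.
rng = np.random.default_rng(0)
samples = rng.integers(0, 4, size=1000)
print(renyi_entropy_mle(samples, lam=2.0, alphabet_size=4))
print(mcdiarmid_deviation_bound(n=1000, c=0.01, t=0.05))  # c = 0.01 is a placeholder

For the tight constants themselves (analytic for 0 < λ < 2, λ = 2, and λ = 3, computed in O(n) otherwise) and for the extension to strong mixing conditions, see the paper.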