Kullback-Leibler Information Consistent Estimation for Censored Data

Cited by: 0
Authors
Akio Suzukawa
Hideyuki Imai
Yoshiharu Sato
Affiliations
[1] Hokkaido University, Division of Systems and Information Engineering
Keywords
Approximate likelihood; information criterion; Kaplan-Meier estimator; maximum likelihood estimation
DOI: not available
Abstract
This paper investigates parametric estimation for randomly right-censored data. In parametric estimation, the Kullback-Leibler information is used as a measure of the divergence of the true distribution generating the data relative to a distribution in an assumed parametric model M. When the data are uncensored, the maximum likelihood estimator (MLE) is a consistent estimator of the parameter minimizing the Kullback-Leibler information, even if the assumed model M does not contain the true distribution. We call this property minimum Kullback-Leibler information consistency (MKLI-consistency). However, the MLE obtained by maximizing the likelihood function based on censored data is not MKLI-consistent. As an alternative to the MLE, Oakes (1986, Biometrics, 42, 177-182) proposed an estimator termed the approximate maximum likelihood estimator (AMLE) for its computational advantage and potential robustness. We show MKLI-consistency and asymptotic normality of the AMLE under misspecification of the parametric model. In a simulation study, we investigate the mean square errors of these two estimators and of an estimator obtained by treating a jackknife-corrected Kaplan-Meier integral as the log-likelihood. On the basis of the simulation results and the asymptotic results, we compare these estimators. We also derive information criteria for the MLE and the AMLE under censorship, which can be used not only for selecting models but also for selecting estimation procedures.
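For concreteness, here is a minimal sketch of the two quantities the abstract turns on; the notation (g for the true density, {f_theta} for the assumed model M) is ours, not taken from the paper:

```latex
% Kullback-Leibler information of the true density g relative to a
% model density f_theta (notation assumed for illustration):
D(g \,\|\, f_\theta) = \int g(x) \log \frac{g(x)}{f_\theta(x)} \, dx
% MKLI-consistency: the estimator converges in probability to the
% KL-minimizing parameter even when g lies outside the model M:
\hat{\theta}_n \;\xrightarrow{P}\; \theta^\ast
  = \operatorname*{arg\,min}_{\theta \in \Theta} D(g \,\|\, f_\theta)
```

And a runnable sketch of the Kaplan-Meier-integral idea the abstract mentions: treat the Kaplan-Meier integral ∫ log f_θ dF̂_n as an approximate log-likelihood and maximize it. This is an illustration under an exponential working model, without the jackknife correction studied in the paper; the function names and the optimizer choice are ours, not the authors'.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def km_weights(t, d):
    """Kaplan-Meier jumps at the observed times (continuous data, no ties).

    t : observed times min(X, C); d : event indicators (1 = failure, 0 = censored).
    Censored observations receive zero weight; if the largest observation is
    censored, the weights sum to less than one (the usual KM mass defect).
    """
    order = np.argsort(t)
    t, d = t[order], d[order]
    n = len(t)
    surv = 1.0                        # current value of the KM survival curve
    w = np.zeros(n)
    for i in range(n):
        jump = surv * d[i] / (n - i)  # S(t-) * d_i / (size of risk set)
        w[i] = jump
        surv -= jump
    return t, w

def km_loglik(theta, t, w):
    """Kaplan-Meier integral of log f_theta, used as an approximate log-likelihood.

    Exponential working model f_theta(x) = theta * exp(-theta * x); this model
    choice is illustrative, not the paper's.
    """
    return np.sum(w * (np.log(theta) - theta * t))

# Toy censored sample: exponential failures, uniform censoring.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=200)   # latent failure times (rate 1)
c = rng.uniform(0.0, 3.0, size=200)        # censoring times
t, d = np.minimum(x, c), (x <= c).astype(float)

ts, w = km_weights(t, d)
res = minimize_scalar(lambda th: -km_loglik(th, ts, w),
                      bounds=(1e-6, 50.0), method="bounded")
print("KM-integral estimate of the rate:", res.x)
```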
Pages: 262-276
Page count: 14