Bayes, minimax and nonnegative estimators of variance components under Kullback-Leibler loss

Cited by: 6
Authors
Kubokawa, T
Saleh, AKME
Konno, Y
Affiliations
[1] Univ Tokyo, Fac Econ, Bunkyo Ku, Tokyo 1130033, Japan
[2] Carleton Univ, Dept Math & Stat, Ottawa, ON K1S 5B6, Canada
[3] Chiba Univ, Dept Math, Chiba 2630022, Japan
Keywords
variance components model; balanced mixed linear model; point estimation; ANOVA estimator; empirical Bayes rule; generalized Bayes rule; restricted maximum likelihood estimator; Kullback-Leibler information loss;
DOI
10.1016/S0378-3758(99)00170-6
CLC classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Discipline classification codes
020208; 070103; 0714;
Abstract
In a balanced one-way model with random effects, the simultaneous estimation of the variance components is considered under the intrinsic Kullback-Leibler loss function. The uniformly minimum variance unbiased (UMVU) or ANOVA estimators are known to have the drawback of taking negative values. The paper shows the minimaxity of the ANOVA estimators of the variance components and obtains classes of minimax estimators. From these classes, two types of minimax, nonnegative estimators are singled out and characterized as empirical Bayes and generalized Bayes estimators. In addition, a restricted maximum likelihood (REML) estimator is interpreted as an empirical Bayes rule. The risk performance of the derived estimators is investigated through simulation experiments. (C) 2000 Elsevier Science B.V. All rights reserved.
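The drawback the abstract mentions can be seen directly by simulation. Below is a minimal sketch (not code from the paper; all names are illustrative) of the ANOVA estimators in a balanced one-way random-effects model y_ij = mu + a_i + e_ij: the unbiased between-group variance estimate (MSB - MSE)/n frequently goes negative when the true between-group variance is small, and a simple truncation at zero is one naive nonnegative fix (the paper instead derives empirical and generalized Bayes nonnegative estimators).

```python
# Illustrative sketch: ANOVA variance-component estimators in a balanced
# one-way random-effects model, showing that the unbiased estimator of
# the between-group variance sigma_a^2 can be negative.
import random

random.seed(0)

def simulate(k=5, n=4, sigma_a=0.1, sigma_e=1.0):
    """Draw k groups of n observations from y_ij = a_i + e_ij (mu = 0)."""
    data = []
    for _ in range(k):
        a = random.gauss(0, sigma_a)          # random group effect, shared within group
        data.append([a + random.gauss(0, sigma_e) for _ in range(n)])
    return data

def anova_estimates(y):
    """ANOVA (UMVU) estimates (sigma_a^2, sigma_e^2) from balanced data."""
    k, n = len(y), len(y[0])
    group_means = [sum(g) / n for g in y]
    grand_mean = sum(group_means) / k
    # Mean square between groups and mean square error (within groups)
    msb = n * sum((m - grand_mean) ** 2 for m in group_means) / (k - 1)
    mse = sum((x - m) ** 2 for g, m in zip(y, group_means) for x in g) / (k * (n - 1))
    sigma_e2 = mse                 # unbiased for the error variance, always >= 0
    sigma_a2 = (msb - mse) / n     # unbiased for sigma_a^2, but may be negative
    return sigma_a2, sigma_e2

# With sigma_a small relative to sigma_e, negative estimates are common.
neg = sum(anova_estimates(simulate())[0] < 0 for _ in range(1000))
print(f"negative sigma_a^2 estimates in 1000 runs: {neg}")

# Naive nonnegative repair: truncate at zero (loses unbiasedness).
sa2, se2 = anova_estimates(simulate())
print(max(sa2, 0.0), se2)
```

The truncated estimator max((MSB - MSE)/n, 0) is nonnegative but no longer unbiased; the paper's contribution is to construct nonnegative estimators that are also minimax under Kullback-Leibler loss.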
Pages: 201-214
Number of pages: 14
Related papers
50 records in total
  • [1] Improved minimax predictive densities under Kullback-Leibler loss
    George, Edward I.
    Liang, Feng
    Xu, Xinyi
    [J]. ANNALS OF STATISTICS, 2006, 34 (01): : 78 - 91
  • [2] The Kullback-Leibler divergence and nonnegative matrices
    Boche, Holger
    Stanczak, Slawomir
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2006, 52 (12) : 5539 - 5545
  • [3] Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss
    Eguchi, Shinto
    Yanagimoto, Takemi
    [J]. JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2008, 138 (11) : 3502 - 3511
  • [4] Optimal exponential bounds for aggregation of estimators for the Kullback-Leibler loss
    Butucea, Cristina
    Delmas, Jean-Francois
    Dutfoy, Anne
    Fischer, Richard
    [J]. ELECTRONIC JOURNAL OF STATISTICS, 2017, 11 (01): : 2258 - 2294
  • [5] Kullback-Leibler Divergence for Nonnegative Matrix Factorization
    Yang, Zhirong
    Zhang, He
    Yuan, Zhijian
    Oja, Erkki
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2011, PT I, 2011, 6791 : 250 - 257
  • [6] Choice of estimators based on Kullback-Leibler risk
    Liquet, Benoit
    [J]. JOURNAL OF THE SFDS, 2010, 151 (01): : 38 - 57
  • [7] Algorithms for Nonnegative Matrix Factorization with the Kullback-Leibler Divergence
    Hien, Le Thi Khanh
    Gillis, Nicolas
    [J]. JOURNAL OF SCIENTIFIC COMPUTING, 2021, 87 (03)
  • [8] On Kullback-Leibler loss and density estimation
    Hall, P.
    [J]. ANNALS OF STATISTICS, 1987, 15 (04): : 1491 - 1519
  • [9] Newton-based optimization for Kullback-Leibler nonnegative tensor factorizations
    Hansen, Samantha
    Plantenga, Todd
    Kolda, Tamara G.
    [J]. OPTIMIZATION METHODS & SOFTWARE, 2015, 30 (05): : 1002 - 1029
  • [10] Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
    Wu, Qiang
    Vos, Paul
    [J]. JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2012, 142 (06) : 1525 - 1536