Logistic regression is one of the most widely used regression models in practice, but alternatives to conventional maximum likelihood estimation methods may be more appropriate for small or sparse samples. Modification of the logistic regression score function to remove first-order bias is equivalent to penalizing the likelihood by the Jeffreys prior, and yields penalized maximum likelihood estimates (PLEs) that always exist, even in samples in which maximum likelihood estimates (MLEs) are infinite. PLEs are an attractive alternative in small-to-moderate-sized samples, and are preferred to exact conditional MLEs when there are continuous covariates. We present methods to construct confidence intervals (CIs) in the penalized multinomial logistic regression model, and compare CI coverage and length for the PLE-based methods to those of conventional MLE-based methods in trinomial logistic regressions with both binary and continuous covariates. Based on simulation studies in sparse data sets, we recommend profile CIs over asymptotic Wald-type intervals for the PLEs in all cases. Furthermore, when finite sample bias and data separation are likely to occur, we prefer PLE profile CIs over MLE methods. Copyright (c) 2006 John Wiley & Sons, Ltd.
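The Jeffreys-prior penalization the abstract describes (Firth-type bias reduction) can be sketched for the binary-logistic case: the score is modified by the hat-matrix diagonal, so the iterations converge to finite estimates even under complete separation, where ordinary MLEs diverge. This is a minimal illustrative sketch, not the paper's multinomial implementation; the function name and the toy data are invented for the example.

```python
import numpy as np

def firth_logistic(X, y, max_iter=100, tol=1e-8):
    """Firth-penalized (Jeffreys-prior) binary logistic regression.

    Solves the modified score equation X'(y - pi + h*(1/2 - pi)) = 0
    by Newton-Raphson, where h is the hat-matrix diagonal. The
    resulting penalized MLEs exist even under complete separation.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(max_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        W = pi * (1.0 - pi)                      # logistic variance weights
        XWX = X.T @ (W[:, None] * X)             # Fisher information
        XWX_inv = np.linalg.inv(XWX)
        # hat-matrix diagonal: h_i = w_i * x_i' (X'WX)^{-1} x_i
        h = W * np.einsum('ij,jk,ik->i', X, XWX_inv, X)
        # Firth-modified score (ordinary score plus bias-reduction term)
        score = X.T @ (y - pi + h * (0.5 - pi))
        delta = XWX_inv @ score
        beta = beta + delta
        if np.max(np.abs(delta)) < tol:
            break
    return beta

# Completely separated toy data: y = 0 for x <= 2, y = 1 for x >= 3,
# so the ordinary MLE of the slope is infinite, but the PLE is finite.
X = np.column_stack([np.ones(6), np.arange(6.0)])
y = np.array([0, 0, 0, 1, 1, 1], dtype=float)
beta = firth_logistic(X, y)
```

Under separation the slope estimate stays finite and positive, which is exactly the existence property the abstract attributes to the penalized likelihood.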
Affiliation:
Department of Mathematics and Statistics, University of Regina, Regina, S4S 0A2, SK
Deng D.
Paul S.R.
Affiliation:
Department of Mathematics and Statistics, University of Windsor, Windsor, N9B 3P4, ON; Department of Mathematics and Statistics, University of Regina, Regina, S4S 0A2, SK