This article studies the convergence rates of the adaptive elastic net and proposes a post-selection method. In the low-dimensional setting, Zou and Zhang proved the variable selection consistency of the adaptive elastic net. In this article, we show that, in the ultra-high-dimensional setting, the probability of the adaptive elastic net selecting the wrong variables decays at an exponential rate. The irrepresentable condition is not required, provided an appropriate initial estimator is available. Furthermore, we show that the MSE and bias of the adaptive elastic net may converge at an inferior rate, depending on the choice of the tuning parameters, and we prove that its asymptotic normality is affected by a bias term. Hence we propose a post-selection procedure, called SSLS (Separate Selection from Least Squares), to alleviate the bias problem of the adaptive elastic net. We show that the bias of the SSLS decays at an exponential rate and its MSE converges to zero. The variable selection consistency of the SSLS implies its asymptotic normality. Simulations and a financial modeling application show that the SSLS outperforms other oracle-like methods.
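The following is a minimal sketch, not the authors' exact SSLS implementation, of the two-stage idea summarized above: an adaptive elastic net is used only to select variables, and ordinary least squares is then refit on the selected variables to remove the shrinkage bias. The tuning parameters (alpha, l1_ratio, the weight exponent gamma) and the rescaled-design trick for the adaptive penalty are illustrative assumptions, not values or details from the paper.

```python
# Hedged sketch of a "separate selection from least squares" procedure:
# Stage 1 selects variables with an adaptive elastic net; Stage 2 refits OLS.
# All tuning choices below are illustrative, not taken from the paper.
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 200, 500, 5                     # n observations, p predictors, s true signals (p >> n)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + rng.standard_normal(n)

# Stage 1a: initial elastic net estimator, used only to build adaptive weights.
init = ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=10000).fit(X, y)
gamma, eps = 1.0, 1e-6
w = 1.0 / (np.abs(init.coef_) + eps) ** gamma   # adaptive weights

# Stage 1b: approximate the adaptive penalty by rescaling each column X_j by 1/w_j,
# so the uniform penalty on the rescaled design acts like a weighted penalty.
X_w = X / w
ada = ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=10000).fit(X_w, y)
selected = np.flatnonzero(ada.coef_ != 0)

# Stage 2: separate selection from estimation by refitting ordinary least squares
# on the selected variables only, which removes the shrinkage bias of Stage 1.
ols = LinearRegression().fit(X[:, selected], y)
print("selected variables:", selected)
print("refit coefficients:", ols.coef_.round(2))
```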
Honda, Toshio
Affiliation: Hitotsubashi Univ, Grad Sch Econ, 2-1 Naka, Kunitachi, Tokyo 1868601, Japan

Lin, Chien-Tong
Affiliation: Feng Chia Univ, Dept Stat, 100 Wenhua Rd, Taichung 407102, Taiwan; Hitotsubashi Univ, Grad Sch Econ, 2-1 Naka, Kunitachi, Tokyo 1868601, Japan