Square-Root LASSO for High-Dimensional Sparse Linear Systems with Weakly Dependent Errors

Cited by: 5
Authors
Xie, Fang [1 ,2 ]
Xiao, Zhijie [3 ,4 ]
Affiliations
[1] Univ Macau, Fac Sci & Technol, Dept Math, Taipa, Macao, Peoples R China
[2] UM Zhuhai Res Inst, Zhuhai, Peoples R China
[3] Boston Coll, Dept Econ, 140 Commonwealth Ave, Chestnut Hill, MA 02467 USA
[4] Shandong Univ, Ctr Econ Res, Jinan, Shandong, Peoples R China
Keywords
high-dimensional linear model; square-root LASSO; alpha-mixing; phi-mixing; m-dependent; estimation consistency; CENTRAL-LIMIT-THEOREM; REGRESSION; CONVERGENCE; PREDICTORS; SHRINKAGE; VARIABLES; SELECTION; RECOVERY; MODELS
DOI
10.1111/jtsa.12278
CLC Classification (Chinese Library Classification)
O1 [Mathematics];
Subject Classification Codes
0701 ; 070101 ;
Abstract
We study the square-root LASSO method for high-dimensional sparse linear models with weakly dependent errors. Asymptotic and non-asymptotic bounds for the estimation errors are derived. Our results cover a wide range of weakly dependent errors, including mixing conditions such as alpha-mixing and phi-mixing, as well as m-dependent errors. Numerical simulations are conducted to illustrate the consistency of the square-root LASSO estimator. An empirical application to financial data highlights the relevance of the results and the method.
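The square-root LASSO referred to in the abstract minimizes the root mean squared residual plus an l1 penalty, i.e., sqrt((1/n) * sum_i (y_i - x_i'beta)^2) + (lambda/n) * ||beta||_1. The following is a minimal, hypothetical sketch (not code from the paper) that solves this convex objective with cvxpy on simulated data whose errors follow an AR(1) process, a simple example of weak dependence; the sample sizes, AR coefficient, and penalty level below are illustrative assumptions, not values taken from the article.

import numpy as np
import cvxpy as cp

# Simulated sparse linear model with weakly dependent (AR(1)) errors.
rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                      # n observations, p >> n predictors, s nonzero coefficients
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 1.0
eps = np.zeros(n)
for t in range(1, n):                      # AR(1) noise: eps_t = 0.5 * eps_{t-1} + innovation
    eps[t] = 0.5 * eps[t - 1] + rng.standard_normal()
y = X @ beta_true + eps

# Square-root LASSO: minimize ||y - X beta||_2 / sqrt(n) + (lam / n) * ||beta||_1.
beta = cp.Variable(p)
lam = 1.1 * np.sqrt(2.0 * n * np.log(p))   # pivotal-type penalty level; an illustrative choice
objective = cp.Minimize(cp.norm(y - X @ beta, 2) / np.sqrt(n) + (lam / n) * cp.norm1(beta))
cp.Problem(objective).solve()

print("estimated support:", np.where(np.abs(beta.value) > 1e-3)[0])

A key practical feature of the square-root criterion is that the penalty level does not need to scale with the unknown noise standard deviation, which is why a pivotal choice like the one sketched above can be used.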
Pages: 212-238
Number of pages: 27