Estimation of covariance matrix via the sparse Cholesky factor with lasso

Cited by: 27
Authors
Chang, Changgee [2 ]
Tsay, Ruey S. [1 ]
Affiliations
[1] Univ Chicago, Booth Sch Business, Chicago, IL 60637 USA
[2] Univ Chicago, Dept Stat, Chicago, IL 60637 USA
Keywords
Adding and removing variables; Covariance matrix estimation; Equi-angular covariance estimate; Dynamic weighted lasso; L-1 penalty; Lasso; Updating; Modified Cholesky decomposition; Longitudinal data; Nonparametric estimation; Variable selection; Oracle properties; Models; Likelihood; Regression
DOI
10.1016/j.jspi.2010.04.048
Chinese Library Classification (CLC)
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
In this paper, we discuss a parsimonious approach to the estimation of high-dimensional covariance matrices via the modified Cholesky decomposition with lasso. Two different methods are proposed: the equi-angular and the equi-sparse method. We use simulation to compare the performance of the proposed methods with others available in the literature, including the sample covariance matrix, the banding method, and the L-1-penalized normal loglikelihood method. We then apply the proposed methods to a portfolio selection problem using 80 series of daily stock returns. To facilitate the use of lasso in high-dimensional time series analysis, we develop the dynamic weighted lasso (DWL) algorithm, which extends the LARS-lasso algorithm. In particular, the proposed algorithm can efficiently update the lasso solution as new data become available, and it can add or remove explanatory variables. The entire solution path of the L-1-penalized normal loglikelihood method is also constructed. (C) 2010 Elsevier B.V. All rights reserved.
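The modified Cholesky route the abstract describes can be illustrated with a minimal sketch: center the data, regress each variable on its predecessors with a lasso penalty, collect the negated coefficients into a unit lower-triangular matrix T and the residual variances into a diagonal D, and recover the covariance estimate from Sigma^{-1} = T' D^{-1} T. The code below is an assumption-laden illustration, not the paper's method: `lasso_cd` is a plain coordinate-descent lasso (the paper instead develops the DWL extension of LARS-lasso), and the function names, the penalty `lam`, and the iteration count are all hypothetical choices.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent lasso for 0.5*||y - X b||^2 + lam*||b||_1.
    Illustrative stand-in; the paper uses its DWL/LARS-lasso algorithm."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)          # per-column squared norms
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding coordinate j
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # soft-thresholding update
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def cholesky_lasso_cov(X, lam=1.0):
    """Covariance estimate via the modified Cholesky decomposition:
    lasso-regress each centered variable on its predecessors, so that
    Sigma^{-1} = T' D^{-1} T with T unit lower-triangular, D diagonal."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    T = np.eye(p)                          # row j holds -phi_j below the diagonal
    d = np.empty(p)                        # innovation (residual) variances
    d[0] = Xc[:, 0].var()
    for j in range(1, p):
        phi = lasso_cd(Xc[:, :j], Xc[:, j], lam)
        T[j, :j] = -phi
        d[j] = (Xc[:, j] - Xc[:, :j] @ phi).var()
    # Sigma = T^{-1} D T^{-T}
    Tinv = np.linalg.inv(T)
    return Tinv @ np.diag(d) @ Tinv.T
```

Because D has positive entries and T is unit triangular, the resulting estimate is symmetric positive definite by construction, regardless of how aggressively the lasso zeroes out entries of T; sparsity in T is what makes the estimator parsimonious.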
Pages: 3858-3873 (16 pages)