Sparse reduced-rank regression with covariance estimation

Cited by: 0
Authors
Lisha Chen
Jianhua Z. Huang
Affiliations
[1] GE Capital, Quantitative Methodologies
[2] Texas A&M University, Department of Statistics
[3] Renmin University of China, School of Statistics
Source
Statistics and Computing | 2016, Vol. 26
Keywords
Covariance estimation; Group lasso; Reduced-rank regression; Variable selection
DOI
Not available
Abstract
Improving the prediction performance of multiple-response regression over separate linear regressions is a challenging problem. On the one hand, it is desirable to seek model parsimony when facing a large number of parameters. On the other hand, for certain applications it is necessary to take into account the general covariance structure of the errors of the regression model. We assume a reduced-rank regression model and work with the likelihood function with a general error covariance to achieve both objectives. In addition, we propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty, and to estimate the error covariance matrix simultaneously by using a similar penalty on the precision matrix. We develop a numerical algorithm to solve the penalized regression problem. In a simulation study and a real data analysis, the new method is compared with two recent methods for multivariate regression and exhibits competitive performance in prediction and variable selection.
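As a rough illustration of the objective described above (the notation, scaling, and exact penalty forms here are assumed for exposition and are not taken from the paper), a penalized negative Gaussian log-likelihood that combines a rank-constrained coefficient matrix, a row-wise group-lasso penalty for variable selection, and an $\ell_1$ penalty on the precision matrix can be written as

$$
\min_{A,\,B,\,\Omega \succ 0}\;
\operatorname{tr}\!\bigl[(Y - XAB^{\top})\,\Omega\,(Y - XAB^{\top})^{\top}\bigr]
\;-\; n\log\det\Omega
\;+\; \lambda_1 \sum_{j=1}^{p} \lVert a_j \rVert_2
\;+\; \lambda_2 \sum_{k \neq l} \lvert \omega_{kl} \rvert ,
$$

where $Y \in \mathbb{R}^{n\times q}$ holds the responses, $X \in \mathbb{R}^{n\times p}$ the predictors, $C = AB^{\top}$ is the rank-$r$ coefficient matrix with $a_j$ the $j$-th row of $A$, and $\Omega = (\omega_{kl})$ is the $q\times q$ error precision matrix. The group penalty zeroes out entire rows of $A$, removing whole predictors, while the $\ell_1$ penalty sparsifies $\Omega$. Objectives of this type are commonly solved by alternating updates over $A$, $B$, and $\Omega$ (e.g., a group-thresholding step, an orthogonal Procrustes-type step, and a graphical-lasso-type step); the paper's actual algorithm may differ in its details.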
Pages: 461–470
Page count: 10
Related articles
50 records in total
  • [1] Sparse reduced-rank regression with covariance estimation
    Chen, Lisha
    Huang, Jianhua Z.
    [J]. STATISTICS AND COMPUTING, 2016, 26 (1-2) : 461 - 470
  • [2] Efficient Sparse Reduced-Rank Regression With Covariance Estimation
    Li, Fengpei
    Zhao, Ziping
    [C]. 2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 46 - 50
  • [3] Rank estimation in reduced-rank regression
    Bura, E.
    Cook, R. D.
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 2003, 87 (01) : 159 - 176
  • [4] Fast Algorithms for Sparse Reduced-Rank Regression
    Dubois, Benjamin
    Delmas, Jean-Francois
    Obozinski, Guillaume
    [C]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [5] Adaptive Estimation in Two-Way Sparse Reduced-Rank Regression
    Ma, Zhuang
    Ma, Zongming
    Sun, Tingni
    [J]. STATISTICA SINICA, 2020, 30 (04) : 2179 - 2201
  • [6] Alternating DCA for reduced-rank multitask linear regression with covariance matrix estimation
    Le Thi, Hoai An
    Ho, Vinh Thanh
    [J]. ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE, 2022, 90 (7-9) : 809 - 829
  • [7] Robust Sparse Reduced-Rank Regression with Response Dependency
    Liu, Wenchen
    Liu, Guanfu
    Tang, Yincai
    [J]. SYMMETRY-BASEL, 2022, 14 (08)
  • [8] Sparse reduced-rank regression for integrating omics data
    Hilafu, Haileab
    Safo, Sandra E.
    Haine, Lillian
    [J]. BMC BIOINFORMATICS, 2020, 21 (01)