Reduced-rank regularized multivariate model for high-dimensional data

Cited: 2
Authors
Kustra, Rafal [1]
Affiliation
[1] Univ Toronto, Toronto, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
discriminants; microarrays; multivariate methods; neuroimaging; p >> n statistics; penalized linear models; smoothing
DOI
10.1198/016214506X112416
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
This article proposes a modeling framework for high-dimensional experimental data, such as brain images or microarrays, that discovers statistically significant structures most relevant to the experimental covariates. To deal with the curse of dimensionality, three regularization schemes are used: a reduced-rank model, penalization of the covariance matrix, and regularization of the basis-expanded predictor set. The latter allows us to flexibly model associations while controlling for overfitting. The modeling framework is derived from a reduced-rank multiresponse linear model, which offers a familiar interface for researchers. The novel regularizations of both sides of the model make it applicable in high-dimensional settings, without a need for prior dimension reduction, and can model nonlinear relationships. An efficient, dual-space algorithm is proposed to estimate its components in low-dimensional space. It permits the use of the bootstrap, to provide pointwise standard error bands on association graphs, and other resampling techniques to optimize hyperparameters. We evaluate the model on a small neuroimaging dataset, and in a simulation study using simple images corrupted by additive i.i.d. Gaussian and random field noise components with signal-to-noise ratios below 0.1. Our model compares well with a general linear model (GLM) even when the nonlinear associations are specified explicitly in the GLM.
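The abstract's core idea can be illustrated with a generic reduced-rank ridge regression: fit a ridge-penalized multiresponse linear model, then constrain the coefficient matrix to low rank by projecting the fitted values onto their leading singular vectors. This is a minimal sketch of that standard technique, not the paper's dual-space algorithm; the function name and parameters are illustrative only.

```python
import numpy as np

def reduced_rank_ridge(X, Y, rank, alpha=1.0):
    """Sketch of reduced-rank ridge regression (illustrative, not the
    paper's dual-space algorithm).

    1. Ridge-penalized multiresponse fit: B = (X'X + alpha*I)^{-1} X'Y.
    2. Rank constraint: project fitted values X@B onto their top-`rank`
       right singular vectors and fold the projection into B.
    """
    n, p = X.shape
    # Step 1: ridge-penalized multiresponse least squares
    B_full = np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ Y)
    fitted = X @ B_full
    # Step 2: SVD of fitted values; keep the leading `rank` directions
    _, _, Vt = np.linalg.svd(fitted, full_matrices=False)
    V = Vt[:rank].T                 # (q x rank) response-space basis
    return B_full @ V @ V.T         # rank-constrained coefficient matrix

# Toy usage: recover a rank-2 coefficient matrix from noisy responses
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
B_true = rng.normal(size=(10, 2)) @ rng.normal(size=(2, 6))  # rank-2 truth
Y = X @ B_true + 0.1 * rng.normal(size=(100, 6))
B_hat = reduced_rank_ridge(X, Y, rank=2, alpha=0.5)
print(np.linalg.matrix_rank(B_hat))  # at most 2 by construction
```

In the p >> n regimes the paper targets, step 1 would additionally require the penalties on the covariance matrix and the basis-expanded predictors described in the abstract; this sketch only shows the reduced-rank component.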
Pages: 312-338 (27 pages)
Related articles (50 total)
  • [1] Reduced-rank modeling for high-dimensional model-based clustering
    Yang, Lei; Wang, Junhui; Ma, Shiqian
    JOURNAL OF COMPUTATIONAL MATHEMATICS, 2018, 36 (03): 426-440
  • [2] High-dimensional differential networks with sparsity and reduced-rank
    Wang, Yao; Wang, Cheng; Jiang, Binyan
    STAT, 2024, 13 (02)
  • [3] A multivariate reduced-rank growth curve model with unbalanced data
    Hwang, Heungsun; Takane, Yoshio
    PSYCHOMETRIKA, 2004, 69 (01): 65-79
  • [4] Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
    Guo, Wenxing; Balakrishnan, Narayanaswamy; Bian, Mengjie
    ELECTRONIC JOURNAL OF STATISTICS, 2021, 15 (02): 4167-4191
  • [5] Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data
    Qian, Fang; Chen, Yu; Zhang, Weiping
    JOURNAL OF MULTIVARIATE ANALYSIS, 2020, 176
  • [6] Model free multivariate reduced-rank regression with categorical predictors
    Setodji, Claude Messan; Li, Lexin
    STATISTICA SINICA, 2009, 19 (03): 1119-1136
  • [7] Discovering genetic associations with high-dimensional neuroimaging phenotypes: A sparse reduced-rank regression approach
    Vounou, Maria; Nichols, Thomas E.; Montana, Giovanni
    NEUROIMAGE, 2010, 53 (03): 1147-1159
  • [8] Integrative sparse reduced-rank regression via orthogonal rotation for analysis of high-dimensional multi-source data
    Kim, Kipoong; Jung, Sungkyu
    STATISTICS AND COMPUTING, 2024, 34 (01)