Dependent relevance determination for smooth and structured sparse regression

Cited by: 0
Authors
Wu, Anqi [1 ]
Koyejo, Oluwasanmi [2 ]
Pillow, Jonathan [1 ]
Affiliations
[1] Princeton Univ, Princeton Neurosci Inst, Princeton, NJ 08544 USA
[2] Univ Illinois, Dept Comp Sci, Beckman Inst Adv Sci & Technol, Urbana, IL 61801 USA
Keywords
Bayesian nonparametric; Sparsity; Structure learning; Gaussian Process; fMRI; VARIABLE SELECTION; ALZHEIMERS-DISEASE; SHRINKAGE; THALAMUS; AREA; FMRI;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
In many problem settings, parameter vectors are not merely sparse but dependent, such that non-zero coefficients tend to cluster together. We refer to this form of dependency as "region sparsity." Classical sparse regression methods, such as the lasso and automatic relevance determination (ARD), model parameters as independent a priori and therefore do not exploit such dependencies. Here we introduce a hierarchical model for smooth, region-sparse weight vectors and tensors in a linear regression setting. Our approach is a hierarchical extension of the relevance determination framework, in which we add a transformed Gaussian process to model dependencies between the prior variances of the regression weights. We combine this with a structured model of the prior variances of Fourier coefficients, which eliminates unnecessary high frequencies. The resulting prior encourages weights to be region-sparse in two different bases simultaneously. We develop Laplace approximation and Markov chain Monte Carlo (MCMC) sampling methods for efficient posterior inference, and we also provide a two-stage convex relaxation of the Laplace approximation approach to mitigate the non-convexity of the optimization. Finally, we show substantial improvements over comparable methods on both simulated and real datasets from brain imaging.
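To make the prior construction concrete, the following is a minimal generative sketch of a DRD-style prior in Python. It assumes a squared-exponential Gaussian process kernel, an exponential link from the latent process to the prior variances, and illustrative hyperparameters; it is not the authors' implementation, and it omits the Fourier-domain smoothness component and the posterior inference (Laplace approximation / MCMC) described in the abstract.

import numpy as np

rng = np.random.default_rng(0)

D = 200                                  # number of regression coefficients
x = np.arange(D, dtype=float)[:, None]   # coefficient locations (1-D index)

# Squared-exponential covariance for the latent GP u governing "relevance".
length_scale, marginal_var = 10.0, 4.0
K = marginal_var * np.exp(-0.5 * (x - x.T) ** 2 / length_scale ** 2)

# Latent GP draw; a negative mean pushes most prior variances toward zero,
# which is what produces sparsity in the weights.
u = rng.multivariate_normal(mean=-3.0 * np.ones(D), cov=K + 1e-8 * np.eye(D))

# The transformed GP gives smoothly varying prior variances, so nearby
# coefficients share similar variances and non-zero weights cluster into
# contiguous regions ("region sparsity").
prior_var = np.exp(u)

# Region-sparse weight vector, drawn coefficient-wise given the variances.
w = rng.normal(0.0, np.sqrt(prior_var))

print("fraction of effectively non-zero weights:", np.mean(prior_var > 1e-2))

Drawing w this way, simulating responses y = X @ w + noise, and comparing lasso or ARD estimates against an estimator that models the variance dependencies is one quick way to see the clustering behavior the abstract refers to.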
Pages: 43
Related papers
50 records total
  • [1] Dependent relevance determination for smooth and structured sparse regression
    Wu, Anqi
    Koyejo, Oluwasanmi
    Pillow, Jonathan
    [J]. Journal of Machine Learning Research, 2019, 20
  • [2] Sparse Bayesian structure learning with dependent relevance determination prior
    Wu, Anqi
    Park, Mijung
    Koyejo, Oluwasanmi
    Pillow, Jonathan W.
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [3] An efficient algorithm for structured sparse quantile regression
    Nassiri, Vahid
    Loris, Ignace
    [J]. COMPUTATIONAL STATISTICS, 2014, 29 (05) : 1321 - 1343
  • [4] Decoding brain activity with smooth sparse regression
    de Brecht, Matthew
    Yamagishi, Noriko
    [J]. NEUROSCIENCE RESEARCH, 2011, 71 : E201 - E202
  • [5] Sparse methods for automatic relevance determination
    Rudy, Samuel H.
    Sapsis, Themistoklis P.
    [J]. Physica D: Nonlinear Phenomena, 2021, 418
  • [6] Smooth sparse coding via marginal regression for learning sparse representations
    Balasubramanian, Krishnakumar
    Yu, Kai
    Lebanon, Guy
    [J]. ARTIFICIAL INTELLIGENCE, 2016, 238 : 83 - 95
  • [7] Sparse conditional copula models for structured output regression
    Kim, Minyoung
    [J]. PATTERN RECOGNITION, 2016, 60 : 761 - 769
  • [8] Sparse regression with structured priors: Application to audio denoising
    Fevotte, Cedric
    Daudet, Laurent
    Godsill, Simon J.
    Torresani, Bruno
    [J]. 2006 IEEE International Conference on Acoustics, Speech and Signal Processing, Vols 1-13, 2006, : 2508 - 2511