Sparse Bayesian infinite factor models

Times cited: 245
Authors
Bhattacharya, A. [1]
Dunson, D. B. [1]
Affiliation
[1] Duke Univ, Dept Stat Sci, Durham, NC 27708 USA
Funding
US National Institutes of Health;
Keywords
Adaptive Gibbs sampling; Factor analysis; High-dimensional data; Multiplicative gamma process; Parameter expansion; Regularization; Shrinkage; Prior distributions; Survival; Selection
DOI
10.1093/biomet/asr013
CLC number
Q [Biological Sciences];
Discipline codes
07; 0710; 09;
Abstract
We focus on sparse modelling of high-dimensional covariance matrices using Bayesian latent factor models. We propose a multiplicative gamma process shrinkage prior on the factor loadings which allows introduction of infinitely many factors, with the loadings increasingly shrunk towards zero as the column index increases. We use our prior on a parameter-expanded loading matrix to avoid the order dependence typical in factor analysis models and develop an efficient Gibbs sampler that scales well as data dimensionality increases. The gain in efficiency is achieved by the joint conjugacy property of the proposed prior, which allows block updating of the loadings matrix. We propose an adaptive Gibbs sampler for automatically truncating the infinite loading matrix through selection of the number of important factors. Theoretical results are provided on the support of the prior and truncation approximation bounds. A fast algorithm is proposed to produce approximate Bayes estimates. Latent factor regression methods are developed for prediction and variable selection in applications with high-dimensional correlated predictors. Operating characteristics are assessed through simulation studies, and the approach is applied to predict survival times from gene expression data.
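The column-wise shrinkage described in the abstract can be illustrated with a short simulation. Below is a minimal NumPy sketch (not the authors' code) of a multiplicative gamma process shrinkage prior of the kind outlined above: column precisions are cumulative products of gamma variables, so loadings in later columns are increasingly shrunk towards zero. The hyperparameter names a1, a2, nu and their values, and the truncation level H standing in for the infinite loading matrix, are illustrative assumptions.

```python
# Minimal sketch of a multiplicative gamma process shrinkage prior on a
# p x H factor loading matrix; hyperparameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

p, H = 100, 20              # variables, truncation level for the loading matrix
a1, a2, nu = 2.0, 3.0, 3.0  # assumed hyperparameters (a2 > 1 gives increasing shrinkage)

# Global column precisions: tau_h = prod_{l<=h} delta_l, delta_l ~ Gamma.
delta = np.concatenate([rng.gamma(a1, 1.0, size=1),
                        rng.gamma(a2, 1.0, size=H - 1)])
tau = np.cumprod(delta)     # stochastically increasing across columns

# Local precisions phi_{jh} ~ Gamma(nu/2, rate nu/2).
phi = rng.gamma(nu / 2.0, 2.0 / nu, size=(p, H))

# Loadings lambda_{jh} ~ N(0, 1 / (phi_{jh} * tau_h)).
Lambda = rng.normal(0.0, 1.0, size=(p, H)) / np.sqrt(phi * tau)

# Mean absolute loading per column: later columns are shrunk towards zero.
print(np.round(np.abs(Lambda).mean(axis=0), 3))
```

Printing the mean absolute loading per column shows magnitudes decaying with the column index, which is the behaviour the adaptive Gibbs sampler exploits when selecting how many factors to retain.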
Pages: 291-306
Number of pages: 16