gptools: Scalable Gaussian Process Inference with Stan

Citations: 0
Authors
Hoffmann, Till [1 ]
Onnela, Jukka-Pekka [1 ]
Affiliations
[1] Harvard TH Chan Sch Publ Hlth, 677 Huntington Ave, Boston, MA 02115 USA
Source
JOURNAL OF STATISTICAL SOFTWARE | 2025, Vol. 112, No. 02
Keywords
Gaussian process; Fourier transform; sparse approximation; Stan; Python; R; models
DOI
10.18637/jss.v112.i02
Chinese Library Classification
TP39 [computer applications]
Discipline codes
081203; 0835
Abstract
Gaussian processes (GPs) are sophisticated distributions to model functional data. Whilst theoretically appealing, they are computationally cumbersome except for small datasets. We implement two methods for scaling GP inference in Stan: First, a general sparse approximation using a directed acyclic dependency graph; second, a fast, exact method for regularly spaced data modeled by GPs with stationary kernels using the fast Fourier transform. Based on benchmark experiments, we offer guidance for practitioners to decide between different methods and parameterizations. We consider two real-world examples to illustrate the package. The implementation follows Stan's design and exposes performant inference through a familiar interface. Full posterior inference for ten thousand data points is feasible on a laptop in less than 20 seconds. Details on how to get started using the popular interfaces cmdstanpy for Python and cmdstanr for R are provided.
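The fast, exact method mentioned in the abstract rests on a standard observation: for a stationary kernel evaluated on a regular, periodically wrapped grid, the covariance matrix is circulant and is diagonalized by the discrete Fourier transform, so sampling and likelihood evaluations cost O(n log n) instead of O(n^3). The following NumPy sketch illustrates the idea only; it is not the gptools API, and the function name and squared-exponential kernel are illustrative choices.

```python
import numpy as np

def circulant_gp_sample(kernel, n, rng):
    """Draw an exact sample from a zero-mean stationary GP on a regular
    periodic grid of n points, using the FFT (circulant embedding)."""
    # First column of the circulant covariance: kernel at circular lags.
    lags = np.minimum(np.arange(n), n - np.arange(n))
    c = kernel(lags)
    # The DFT diagonalizes circulant matrices; eigenvalues are fft(c).
    lam = np.fft.fft(c).real
    lam = np.clip(lam, 0.0, None)  # guard against tiny negative round-off
    # Colour white noise in the frequency domain: exact, O(n log n).
    z = rng.standard_normal(n)
    return np.fft.ifft(np.sqrt(lam) * np.fft.fft(z)).real

# Illustrative squared-exponential kernel with length scale 5.
kernel = lambda d: np.exp(-0.5 * (d / 5.0) ** 2)
rng = np.random.default_rng(0)
y = circulant_gp_sample(kernel, 100, rng)
```

The same diagonalization lets the log-density of observed data be evaluated in the frequency domain, which is how FFT-based GP inference avoids forming or factorizing the dense covariance matrix.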
Pages: 1-31
Page count: 31