gptools: Scalable Gaussian Process Inference with Stan

Cited by: 0
Authors:
Hoffmann, Till [1 ]
Onnela, Jukka-Pekka [1 ]
Affiliations:
[1] Harvard TH Chan Sch Publ Hlth, 677 Huntington Ave, Boston, MA 02115 USA
Source:
JOURNAL OF STATISTICAL SOFTWARE, 2025, Vol. 112, No. 2
Keywords:
Gaussian process; Fourier transform; sparse approximation; Stan; Python; R; MODELS
DOI:
10.18637/jss.v112.i02
CLC Classification: TP39 (Computer Applications)
Discipline Codes: 081203; 0835
Abstract:
Gaussian processes (GPs) are flexible distributions for modeling functional data. Whilst theoretically appealing, they are computationally cumbersome except for small datasets. We implement two methods for scaling GP inference in Stan: first, a general sparse approximation using a directed acyclic dependency graph; second, a fast, exact method for regularly spaced data modeled by GPs with stationary kernels using the fast Fourier transform. Based on benchmark experiments, we offer guidance for practitioners to decide between different methods and parameterizations. We consider two real-world examples to illustrate the package. The implementation follows Stan's design and exposes performant inference through a familiar interface. Full posterior inference for ten thousand data points is feasible on a laptop in less than 20 seconds. Details on how to get started using the popular interfaces cmdstanpy for Python and cmdstanr for R are provided.
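To illustrate the FFT-based idea the abstract describes, the sketch below evaluates the log-density of a zero-mean stationary GP on a regularly spaced grid in O(n log n) via the discrete Fourier transform. It assumes a periodically wrapped kernel, so the covariance matrix is circulant and its eigenvalues are the DFT of the kernel's first row; the function name and kernel choice here are illustrative, not the gptools API.

```python
import numpy as np


def circulant_gp_logpdf(x, cov_row):
    """Log-density of x ~ N(0, C), where C is circulant with first row cov_row.

    Illustrative sketch of the FFT trick for stationary kernels on a regular
    grid (periodic boundary assumed); not the package's actual interface.
    """
    n = x.size
    # Eigenvalues of a circulant matrix are the DFT of its first row;
    # cov_row is symmetric, so the transform is real.
    lam = np.fft.fft(cov_row).real
    xf = np.fft.fft(x)
    # Quadratic form x' C^{-1} x and log-determinant in the Fourier domain.
    quad = np.sum(np.abs(xf) ** 2 / lam) / n
    logdet = np.sum(np.log(lam))
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)


# Periodically wrapped squared-exponential kernel on a regular grid.
n, sigma, ell = 16, 1.2, 1.0
d = np.minimum(np.arange(n), n - np.arange(n))  # circular distances
cov_row = sigma**2 * np.exp(-(d**2) / (2 * ell**2))

rng = np.random.default_rng(0)
x = rng.normal(size=n)
print(circulant_gp_logpdf(x, cov_row))
```

Because no n-by-n matrix is ever formed or factorized, the same computation scales to the tens of thousands of grid points mentioned in the abstract.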
Pages: 1-31
Related Papers (50 total)
  • [1] Scalable Gaussian process inference of neural responses to natural images
    Goldin, Matias A.
    Virgili, Samuele
    Chalk, Matthew
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2023, 120 (34)
  • [2] Scalable Training of Inference Networks for Gaussian-Process Models
    Shi, Jiaxin
    Khan, Mohammad Emtiyaz
    Zhu, Jun
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [3] Scalable Inference for Gaussian Process Models with Black-Box Likelihoods
    Dezfouli, Amir
    Bonilla, Edwin V.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [4] Stochastic variational inference for scalable non-stationary Gaussian process regression
    Paun, Ionut
    Husmeier, Dirk
    Torney, Colin J.
    STATISTICS AND COMPUTING, 2023, 33 (02)
  • [5] Scalable Gaussian Process Inference with Finite-data Mean and Variance Guarantees
    Huggins, Jonathan H.
    Campbell, Trevor
    Kasprzak, Mikolaj
    Broderick, Tamara
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 796 - 805
  • [6] Scalable Transformed Additive Signal Decomposition by Non-conjugate Gaussian Process Inference
    Adam, Vincent
    Hensman, James
    Sahani, Maneesh
    2016 IEEE 26TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2016
  • [7] Scalable Inference for Hybrid Bayesian Hidden Markov Model Using Gaussian Process Emission
    Jung, Yohan
    Park, Jinkyoo
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2022, 31 (03) : 666 - 683
  • [8] Scalable Gaussian Process Variational Autoencoders
    Jazbec, Metod
    Ashman, Matthew
    Fortuin, Vincent
    Pearce, Michael
    Mandt, Stephan
    Raetsch, Gunnar
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [9] Scalable Variational Gaussian Process Classification
    Hensman, James
    Matthews, Alex G. de G.
    Ghahramani, Zoubin
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 351 - 360