Fast Approximate Multioutput Gaussian Processes

Cited by: 4
Authors
Joukov, Vladimir [1 ]
Kulic, Dana [2 ]
Affiliations
[1] Univ Waterloo, Adapt Syst Lab, Waterloo, ON N2L 3G1, Canada
[2] Monash Univ, Fac Engn, Clayton, Vic 3800, Australia
DOI
10.1109/MIS.2022.3169036
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian process regression models are an appealing machine learning method: they learn expressive nonlinear models from exemplar data with minimal parameter tuning and estimate both the mean and covariance of unseen points. However, cubic growth of computational complexity with the number of samples has been a long-standing challenge. Training requires the inversion of an N x N kernel matrix at every iteration, whereas regression needs computation of an m x N kernel, where N and m are the numbers of training and test points, respectively. This work demonstrates how approximating the covariance kernel using its eigenvalues and eigenfunctions leads to an approximate Gaussian process with a significant reduction in training and regression complexity. Training now requires computing only an N x n eigenfunction matrix and an n x n inverse, where n is a selected number of eigenvalues. Furthermore, regression now requires only an m x n matrix. Finally, in a special case, the hyperparameter optimization is completely independent of the number of training samples. The proposed method can regress over multiple outputs, learn the correlations between them, and estimate their derivatives to any order. The computational complexity reduction, regression capabilities, multioutput correlation learning, and comparison to the state of the art are demonstrated in simulation examples. Finally, we show how the proposed approach can be utilized to model real human data.
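The complexity reduction the abstract describes can be illustrated with a reduced-rank GP of the same flavor: approximate the kernel with a truncated eigenfunction expansion so that training touches only an N x n basis matrix and an n x n inverse. The sketch below is not the authors' implementation; it uses the related Hilbert-space approximation with Laplacian eigenfunctions on [-L, L] (cf. related paper [34]), and the names, the domain half-width `L`, and the truncation `n` are illustrative assumptions.

```python
import numpy as np

def eigenfunctions(x, n, L):
    """N x n matrix of Laplacian eigenfunctions on [-L, L], plus eigenvalues."""
    j = np.arange(1, n + 1)
    lam = (np.pi * j / (2.0 * L)) ** 2                  # eigenvalues lambda_j
    phi = np.sin(np.sqrt(lam)[None, :] * (x[:, None] + L)) / np.sqrt(L)
    return phi, lam

def rbf_spectral_density(w2, sigma_f=1.0, ell=0.5):
    """Spectral density of the squared-exponential kernel at frequency sqrt(w2)."""
    return sigma_f**2 * np.sqrt(2.0 * np.pi) * ell * np.exp(-0.5 * ell**2 * w2)

def fit_predict(x_train, y_train, x_test, n=20, L=3.0, noise=0.1):
    Phi, lam = eigenfunctions(x_train, n, L)            # N x n, not N x N
    S = rbf_spectral_density(lam)                       # approximate kernel eigenvalues
    A = Phi.T @ Phi + noise**2 * np.diag(1.0 / S)       # only an n x n inverse
    A_inv = np.linalg.inv(A)
    alpha = A_inv @ (Phi.T @ y_train)
    Phi_s, _ = eigenfunctions(x_test, n, L)             # m x n, not m x N
    mean = Phi_s @ alpha
    var = noise**2 * np.einsum('ij,jk,ik->i', Phi_s, A_inv, Phi_s)
    return mean, var

rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 200)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(200)
xs = np.linspace(-2.0, 2.0, 50)
mu, var = fit_predict(x, y, xs)                         # cost scales with n, not N
```

Per-step cost is O(Nn^2 + n^3) for training and O(mn) for the predictive mean, matching the N x n / n x n / m x n matrix sizes the abstract lists; the full paper additionally handles multiple correlated outputs and derivatives, which this one-output sketch omits.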
Pages: 56-69
Page count: 14
Related Papers
50 records
  • [31] Fast methods for training Gaussian processes on large datasets
    Moore, C. J.
    Chua, A. J. K.
    Berry, C. P. L.
    Gair, J. R.
    ROYAL SOCIETY OPEN SCIENCE, 2016, 3 (05):
  • [32] Low-Precision Arithmetic for Fast Gaussian Processes
    Maddox, Wesley J.
    Potapczynski, Andres
    Wilson, Andrew Gordon
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 180, 2022, 180 : 1306 - 1316
  • [33] APPROXIMATE STATE-SPACE GAUSSIAN PROCESSES VIA SPECTRAL TRANSFORMATION
    Karvonen, Toni
    Sarkka, Simo
    2016 IEEE 26TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2016,
  • [34] Practical Hilbert space approximate Bayesian Gaussian processes for probabilistic programming
    Riutort-Mayol, Gabriel
    Buerkner, Paul-Christian
    Andersen, Michael R.
    Solin, Arno
    Vehtari, Aki
    STATISTICS AND COMPUTING, 2023, 33 (01)
  • [35] A scalable approximate Bayesian inference for high-dimensional Gaussian processes
    Fradi, Anis
    Samir, Chafik
    Bachoc, Francois
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2022, 51 (17) : 5937 - 5956
  • [37] Nonlinear Online Multioutput Gaussian Process for Multistream Data Informatics
    Hu, Zhiyong
    Wang, Chao
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (06) : 3885 - 3893
  • [38] Fast algorithm for non-Gaussian stochastic processes based on translation processes
    Liu, Jinming
    Tan, Xing
    Chen, Weiting
    He, Huan
    Hangkong Xuebao/Acta Aeronautica et Astronautica Sinica, 2024, 45 (18):
  • [39] Fast and robust Bayesian inference using Gaussian processes with GPry
    El Gammal, Jonas
    Schoeneberg, Nils
    Torrado, Jesus
    Fidler, Christian
    JOURNAL OF COSMOLOGY AND ASTROPARTICLE PHYSICS, 2023, (10):
  • [40] Adaptive Dimensionality Reduction for Fast Sequential Optimization With Gaussian Processes
    Ghoreishi, Seyede Fatemeh
    Friedman, Samuel
    Allaire, Douglas L.
    JOURNAL OF MECHANICAL DESIGN, 2019, 141 (07)