Fast Approximate Multioutput Gaussian Processes

Cited by: 4
Authors
Joukov, Vladimir [1 ]
Kulic, Dana [2 ]
Affiliations
[1] Univ Waterloo, Adapt Syst Lab, Waterloo, ON N2L 3G1, Canada
[2] Monash Univ, Fac Engn, Clayton, Vic 3800, Australia
Keywords
DOI
10.1109/MIS.2022.3169036
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian process regression models are an appealing machine learning method, as they learn expressive nonlinear models from exemplar data with minimal parameter tuning and estimate both the mean and covariance of unseen points. However, cubic growth of computational complexity with the number of samples has been a long-standing challenge. Training requires the inversion of an N x N kernel at every iteration, whereas regression needs computation of an m x N kernel, where N and m are the numbers of training and test points, respectively. This work demonstrates how approximating the covariance kernel using eigenvalues and eigenfunctions leads to an approximate Gaussian process with a significant reduction in training and regression complexity. Training now requires computing only an N x n eigenfunction matrix and an n x n inverse, where n is a selected number of eigenvalues. Furthermore, regression now requires only an m x n matrix. Finally, in a special case, the hyperparameter optimization is completely independent of the number of training samples. The proposed method can regress over multiple outputs, learn the correlations between them, and estimate their derivatives to any order. The computational complexity reduction, regression capabilities, multioutput correlation learning, and comparison to the state of the art are demonstrated in simulation examples. Finally, we show how the proposed approach can be utilized to model real human data.
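The complexity reduction the abstract describes can be sketched for a single output. The sketch below is a minimal illustration in the style of the Hilbert-space (reduced-rank) GP approximation of Solin and Särkkä, one standard eigenfunction-based scheme, and is not the authors' exact method; the Dirichlet Laplacian eigenfunctions on [-L, L], the squared-exponential spectral density, and all hyperparameter values are illustrative assumptions. Note that training solves only an n x n system and prediction uses an m x n matrix, matching the scaling claimed above.

```python
import numpy as np

def eigenfunctions(x, n_funcs, L):
    # Laplacian eigenfunctions with Dirichlet boundaries on [-L, L]
    # (an assumed basis choice; returns an |x| x n_funcs matrix).
    j = np.arange(1, n_funcs + 1)
    return np.sqrt(1.0 / L) * np.sin(np.pi * j * (x[:, None] + L) / (2 * L))

def se_spectral_density(w, sigma_f, ell):
    # Spectral density of the squared-exponential kernel, used to
    # approximate the kernel's eigenvalues at frequencies w.
    return sigma_f**2 * np.sqrt(2 * np.pi) * ell * np.exp(-0.5 * (ell * w)**2)

def fit_predict(x_train, y_train, x_test, n=32, L=5.0,
                sigma_f=1.0, ell=0.5, sigma_n=0.1):
    # Reduced-rank GP regression: training cost is dominated by an
    # N x n basis evaluation and an n x n solve, not an N x N inverse.
    j = np.arange(1, n + 1)
    sqrt_lam = np.pi * j / (2 * L)                      # sqrt of basis eigenvalues
    s = se_spectral_density(sqrt_lam, sigma_f, ell)     # approx. kernel eigenvalues
    Phi = eigenfunctions(x_train, n, L)                 # N x n
    A = Phi.T @ Phi + sigma_n**2 * np.diag(1.0 / s)     # n x n system matrix
    coef = np.linalg.solve(A, Phi.T @ y_train)
    Phi_test = eigenfunctions(x_test, n, L)             # m x n
    mean = Phi_test @ coef
    # Approximate posterior variance: sigma_n^2 * phi* A^{-1} phi*^T per test point.
    var = sigma_n**2 * np.einsum('ij,jk,ik->i', Phi_test, np.linalg.inv(A), Phi_test)
    return mean, var
```

Because the n x n matrix A replaces the N x N kernel, increasing the number of training points only enlarges the cheap Phi.T @ Phi product, which is the source of the training-complexity reduction claimed in the abstract.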
Pages: 56 - 69
Page count: 14
Related papers
(50 records in total)
  • [21] Minimal M68000 System Controllers for Fast-Acting Multiinput Multioutput Processes
    Bradshaw, A.
    Konnanov, P.
    Woodhead, M. A.
    MICROPROCESSORS AND MICROSYSTEMS, 1986, 10 (03) : 148 - 155
  • [22] Approximate Explicit Model Predictive Controller using Gaussian Processes
    Binder, Matthias
    Darivianakis, Georgios
    Eichler, Annika
    Lygeros, John
    2019 IEEE 58TH CONFERENCE ON DECISION AND CONTROL (CDC), 2019, : 841 - 846
  • [24] Deep Gaussian Processes for Regression using Approximate Expectation Propagation
    Bui, Thang D.
    Hernandez-Lobato, Jose Miguel
    Hernandez-Lobato, Daniel
    Li, Yingzhen
    Turner, Richard E.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [25] Fast Kronecker Inference in Gaussian Processes with non-Gaussian Likelihoods
    Flaxman, Seth
    Wilson, Andrew Gordon
    Neill, Daniel B.
    Nickisch, Hannes
    Smola, Alexander J.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 607 - 616
  • [26] Hierarchical Anomaly Detection Using a Multioutput Gaussian Process
    Cho, Woojin
    Kim, Youngrae
    Park, Jinkyoo
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2020, 17 (01) : 261 - 272
  • [27] Fast approximate likelihood evaluation for stable VARFIMA processes
    Pai, Jeffrey
    Ravishanker, Nalini
    STATISTICS & PROBABILITY LETTERS, 2015, 103 : 160 - 168
  • [28] Global and Local Gaussian Process for Multioutput and Treed Data
    Cuesta, Jhouben J.
    Alvarez, Mauricio A.
    Orozco, Alvaro A.
    IMAGE ANALYSIS AND PROCESSING - ICIAP 2015, PT I, 2015, 9279 : 161 - 171
  • [29] Fast increased fidelity samplers for approximate Bayesian Gaussian process regression
    Moran, Kelly R.
    Wheeler, Matthew W.
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2022, 84 (04) : 1198 - 1228
  • [30] Fast approximate learning-based multistage nonlinear model predictive control using Gaussian processes and deep neural networks
    Bonzanini, Angelo D.
    Paulson, Joel A.
    Makrygiorgos, Georgios
    Mesbah, Ali
    COMPUTERS & CHEMICAL ENGINEERING, 2021, 145