Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes

Cited: 0
Authors
Dai, Zhenwen [1 ,3 ]
Alvarez, Mauricio A. [2 ]
Lawrence, Neil D. [2 ,3 ]
Affiliations
[1] Inferentia Ltd, Chesterfield, England
[2] Univ Sheffield, Dept Comp Sci, Sheffield, S Yorkshire, England
[3] Amazon Com, Seattle, WA 98109 USA
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In machine learning, data are often collected under a combination of multiple conditions, e.g., voice recordings of multiple speakers, each labeled with an ID. How can we build a model that captures the latent information associated with these conditions and generalizes to a new condition from only a few data points? We present a new model, the Latent Variable Multiple Output Gaussian Process (LVMOGP), that jointly models multiple conditions for regression and generalizes to a new condition from a few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space that represents the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on a range of tasks with both synthetic and real data.
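The core construction the abstract describes, a kernel over inputs combined with a kernel over per-condition latent embeddings, can be sketched in a few lines of numpy. This is a minimal illustration of the Kronecker-structured prior, not the paper's implementation: the variational inference, inducing points, and learning of the latent embeddings `H` are all omitted, and every variable name here is illustrative.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

rng = np.random.default_rng(0)
n, c, q = 20, 3, 2                     # inputs, conditions, latent dimension
X = np.linspace(0.0, 1.0, n)[:, None]  # shared input locations
H = rng.normal(size=(c, q))            # one latent embedding per condition
                                       # (LVMOGP infers a posterior over these)

# Kronecker-structured covariance: correlation across conditions (from H)
# times correlation across inputs (from X).
K = np.kron(rbf(H, H), rbf(X, X))
noise = 1e-4
jitter = noise * np.eye(n * c)

# Draw one joint sample across all conditions from the prior ...
L = np.linalg.cholesky(K + jitter)
y = L @ rng.normal(size=n * c)

# ... and recover it with the standard GP regression posterior mean.
alpha = np.linalg.solve(K + jitter, y)
mean = K @ alpha
```

Because the full covariance factorizes as a Kronecker product, inference can exploit that structure rather than working with the dense `(n*c, n*c)` matrix built here, which is how the paper's variational method reaches the cost of sparse Gaussian processes.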
Pages: 9
Related Papers
50 in total
  • [1] Semi-supervised Learning with Gaussian Processes
    Li, Hongwei
    Li, Yakui
    Lu, Hanqing
    PROCEEDINGS OF THE 2008 CHINESE CONFERENCE ON PATTERN RECOGNITION (CCPR 2008), 2008, : 13 - 17
  • [2] Reorienting Latent Variable Modeling for Supervised Learning
    Jo, Booil
    Hastie, Trevor J. J.
    Li, Zetan
    Youngstrom, Eric A. A.
    Findling, Robert L. L.
    Horwitz, Sarah McCue
    MULTIVARIATE BEHAVIORAL RESEARCH, 2023, 58 (06) : 1057 - 1071
  • [3] Efficient Learning of Hyperrectangular Invariant Sets Using Gaussian Processes
    Cao, Michael Enqi
    Bloch, Matthieu
    Coogan, Samuel
    IEEE OPEN JOURNAL OF CONTROL SYSTEMS, 2022, 1 : 223 - 236
  • [4] Semi-supervised Prosody Modeling Using Deep Gaussian Process Latent Variable Model
    Koriyama, Tomoki
    Kobayashi, Takao
    INTERSPEECH 2019, 2019, : 4450 - 4454
  • [5] Meta Reinforcement Learning with Latent Variable Gaussian Processes
    Saemundsson, Steindor
    Hofmann, Katja
    Deisenroth, Marc Peter
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2018, : 642 - 652
  • [6] Modeling Dynamic Functional Connectivity with Latent Factor Gaussian Processes
    Li, Lingge
    Pluta, Dustin
    Shahbaba, Babak
    Fortin, Norbert
    Ombao, Hernando
    Baldi, Pierre
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [7] Multi-View Deep Gaussian Processes for Supervised Learning
    Dong, Wenbo
    Sun, Shiliang
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (12) : 15137 - 15153
  • [8] Bayesian learning with Gaussian processes for supervised classification of hyperspectral data
    Zhao, Kaiguang
    Popescu, Sorin
    Zhang, Xuesong
    PHOTOGRAMMETRIC ENGINEERING AND REMOTE SENSING, 2008, 74 (10) : 1223 - 1234
  • [9] Bayesian Semi-supervised Learning with Graph Gaussian Processes
    Ng, Yin Cheng
    Colombo, Nicolo
    Silva, Ricardo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [10] Sample Efficient Reinforcement Learning with Gaussian Processes
    Grande, Robert C.
    Walsh, Thomas J.
    How, Jonathan P.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 1332 - 1340