Multiview Variational Sparse Gaussian Processes

Cited: 15
Authors
Mao, Liang [1]
Sun, Shiliang [1]
Affiliations
[1] East China Normal Univ, Sch Comp Sci & Technol, Shanghai 200241, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Computational modeling; Task analysis; Kernel; Bayes methods; Gaussian processes; Supervised learning; Data models; Gaussian process (GP); multiview learning; probabilistic model; supervised learning; variational inference; CANONICAL CORRELATION-ANALYSIS; CLASSIFICATION
DOI
10.1109/TNNLS.2020.3008496
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Gaussian process (GP) models are flexible nonparametric models widely used in a variety of tasks. Variational sparse GP (VSGP) scales GP models to large data sets by summarizing the posterior process with a set of inducing points. In this article, we extend VSGP to handle multiview data. We model each view with a VSGP and augment it with an additional set of inducing points. These VSGPs are coupled together by enforcing the means of their posteriors to agree at the locations of these inducing points. To learn these shared inducing points, we introduce an additional GP model that is defined in the concatenated feature space. Experiments on real-world data sets show that our multiview VSGP (MVSGP) model outperforms single-view VSGP consistently and is superior to state-of-the-art kernel-based multiview baselines for classification tasks.
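The coupling idea described in the abstract — one VSGP per view, with the posterior means encouraged to agree at a set of shared inducing locations — can be sketched numerically. The sketch below is an illustration only, not the paper's MVSGP objective: it computes a standard Titsias-style sparse-GP posterior mean for each view and then measures the disagreement at the shared points as a penalty. All data, function names, and hyperparameters are hypothetical.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel between row sets A (n, d) and B (m, d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def sparse_gp_mean(X, y, Z, Xq, noise=0.1):
    # Standard sparse-GP (Titsias-style) posterior mean at query points
    # Xq, with the data (X, y) summarized by inducing inputs Z.
    Kmm = rbf(Z, Z) + 1e-6 * np.eye(len(Z))  # jitter for stability
    Kmn = rbf(Z, X)
    A = Kmm + Kmn @ Kmn.T / noise ** 2
    return rbf(Xq, Z) @ np.linalg.solve(A, Kmn @ y) / noise ** 2

rng = np.random.default_rng(0)
# Two toy "views" of the same 40 examples (hypothetical data).
X1 = rng.normal(size=(40, 2))
X2 = X1 @ rng.normal(size=(2, 2)) + 0.1 * rng.normal(size=(40, 2))
y = np.sin(X1[:, 0]) + 0.05 * rng.normal(size=40)

# Per-view inducing inputs; the first 5 rows of each view stand in for
# the shared locations where the two posteriors should agree.
Z1, Z2 = X1[:8], X2[:8]
m1 = sparse_gp_mean(X1, y, Z1, X1[:5])
m2 = sparse_gp_mean(X2, y, Z2, X2[:5])

# Agreement penalty: in MVSGP this coupling is enforced inside the
# variational objective; here it is only measured as a discrepancy.
penalty = np.mean((m1 - m2) ** 2)
print(penalty)
```

In the full model this discrepancy would be driven down jointly with each view's evidence lower bound, and the shared inducing points themselves would be learned via an additional GP on the concatenated feature space, as the abstract describes.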
Pages: 2875-2885
Page count: 11
Related Papers
50 records in total
  • [1] Multiview learning with variational mixtures of Gaussian processes
    Sun, Shiliang
    Wang, Jiachun
    [J]. KNOWLEDGE-BASED SYSTEMS, 2020, 200
  • [2] Doubly Sparse Variational Gaussian Processes
    Adam, Vincent
    Eleftheriadis, Stefanos
    Durrande, Nicolas
    Artemev, Artem
    Hensman, James
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2874 - 2883
  • [3] Sparse Orthogonal Variational Inference for Gaussian Processes
    Shi, Jiaxin
    Titsias, Michalis K.
    Mnih, Andriy
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [4] Dual Parameterization of Sparse Variational Gaussian Processes
    Adam, Vincent
    Chang, Paul E.
    Khan, Mohammad Emtiyaz
    Solin, Arno
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [5] Convergence of Sparse Variational Inference in Gaussian Processes Regression
    Burt, David R.
    Rasmussen, Carl Edward
    van der Wilk, Mark
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [6] Stochastic variational hierarchical mixture of sparse Gaussian processes for regression
    Thi Nhat Anh Nguyen
    Bouzerdoum, Abdesselam
    Son Lam Phung
    [J]. MACHINE LEARNING, 2018, 107 (12) : 1947 - 1986
  • [7] Knot selection in sparse Gaussian processes with a variational objective function
    Garton, Nathaniel
    Niemi, Jarad
    Carriquiry, Alicia
    [J]. STATISTICAL ANALYSIS AND DATA MINING, 2020, 13 (04) : 324 - 336
  • [8] Variational zero-inflated Gaussian processes with sparse kernels
    Hegde, Pashupati
    Heinonen, Markus
    Kaski, Samuel
    [J]. UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2018, : 361 - 371