Sparse Multivariate Gaussian Mixture Regression

Cited by: 7
Authors
Weruaga, Luis [1 ]
Via, Javier [2 ]
Affiliations
[1] Khalifa Univ Sci Technol & Res, Sharjah 127788, U Arab Emirates
[2] Univ Cantabria, Dept Commun Engn, E-39005 Santander, Spain
Keywords
Function approximation; Gaussian function mixture (GFM); logarithmic utility function; regression; sparsity; BASIS NEURAL-NETWORKS; ALGORITHM;
DOI
10.1109/TNNLS.2014.2334596
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Fitting a multivariate Gaussian mixture to data is an attractive as well as challenging problem, especially when sparsity in the solution is demanded. Achieving this objective requires the concurrent update of all parameters (weights, centers, and precisions) of all multivariate Gaussian functions during the learning process. Such is the focus of this paper, which presents a novel method founded on minimizing the error of the generalized logarithmic utility function (GLUF). This choice, which allows us to move smoothly from the mean square error (MSE) criterion to one based on the logarithmic error, yields an optimization problem that resembles a locally convex problem and can be solved with a quasi-Newton method. The GLUF framework also facilitates a comparative study between both extremes, which concludes that classical MSE optimization is not the most adequate for the task. The performance of the proposed technique is demonstrated on simulated as well as realistic scenarios.
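The abstract is the only technical description in this record, so the following is a minimal illustrative sketch of the general idea it outlines: a Gaussian function mixture whose weights, centers, and precisions are updated jointly by a quasi-Newton routine under a log-type error criterion. The isotropic precision parameterization, the surrogate loss in gluf_like_loss, and the helper names (gfm_predict, gluf_like_loss) are assumptions made here for illustration only; the exact generalized logarithmic utility function (GLUF), the anisotropic precision matrices, and the sparsity mechanism are defined in the paper itself (see the DOI above) and are not reproduced here.

```python
# Hedged sketch: Gaussian function mixture (GFM) regression fitted by a
# quasi-Newton method. The loss below is an ASSUMED smooth log-type surrogate
# standing in for the paper's GLUF criterion; it is not the authors' exact form.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def gfm_predict(params, X, K):
    """Evaluate a mixture of K isotropic Gaussian functions at the rows of X."""
    d = X.shape[1]
    w = params[:K]                               # mixture weights
    c = params[K:K + K * d].reshape(K, d)        # Gaussian centers
    prec = np.exp(params[K + K * d:])            # precisions, positive by construction
    # squared distances from each sample to each center, shape (n, K)
    sq = ((X[:, None, :] - c[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq * prec[None, :]) @ w

def gluf_like_loss(params, X, y, K, lam=1.0):
    """ASSUMED log-type error: mean of log(1 + lam * e^2) / lam; tends to MSE as lam -> 0."""
    e = gfm_predict(params, X, K) - y
    return np.mean(np.log1p(lam * e ** 2) / lam)

# toy 2-D regression data
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) * np.exp(-X[:, 1] ** 2)

K, d = 5, X.shape[1]
x0 = np.concatenate([rng.normal(0.0, 0.1, K),            # initial weights
                     rng.uniform(-2, 2, (K, d)).ravel(),  # initial centers
                     np.zeros(K)])                        # initial log precisions

# quasi-Newton update of all parameters (weights, centers, precisions) at once
res = minimize(gluf_like_loss, x0, args=(X, y, K), method="L-BFGS-B")
print("final loss:", res.fun)
```

SciPy's L-BFGS-B stands in here for the quasi-Newton step mentioned in the abstract, and the lam parameter mimics the smooth transition between the logarithmic-error and MSE regimes: as lam approaches zero the surrogate reduces to the ordinary mean square error.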
Pages: 1098-1108
Page count: 11
Related Papers (50 in total)
  • [41] Validation Based Sparse Gaussian Processes for Ordinal Regression
    Srijith, P. K.
    Shevade, Shirish
    Sundararajan, S.
    NEURAL INFORMATION PROCESSING, ICONIP 2012, PT II, 2012, 7664 : 409 - 416
  • [42] Correlated product of experts for sparse Gaussian process regression
    Schuerch, Manuel
    Azzimonti, Dario
    Benavoli, Alessio
    Zaffalon, Marco
    MACHINE LEARNING, 2023, 112 (05) : 1411 - 1432
  • [43] Online Sparse Gaussian Process Regression for Trajectory Modeling
    Tiger, Mattias
    Heintz, Fredrik
    2015 18TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2015, : 782 - 791
  • [44] Variational inference for sparse spectrum Gaussian process regression
    Tan, Linda S. L.
    Ong, Victor M. H.
    Nott, David J.
    Jasra, Ajay
    STATISTICS AND COMPUTING, 2016, 26 (06) : 1243 - 1261
  • [45] Differentially Private Regression and Classification with Sparse Gaussian Processes
    Smith, Michael Thomas
    Alvarez, Mauricio A.
    Lawrence, Neil D.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [46] Transfer learning based on sparse Gaussian process for regression
    Yang, Kai
    Lu, Jie
    Wan, Wanggen
    Zhang, Guangquan
    Hou, Li
    INFORMATION SCIENCES, 2022, 605 : 286 - 300
  • [47] A unifying view of sparse approximate Gaussian process regression
    Quiñonero-Candela, JQ
    Rasmussen, CE
    JOURNAL OF MACHINE LEARNING RESEARCH, 2005, 6 : 1939 - 1959
  • [48] Sparse Information Filter for Fast Gaussian Process Regression
    Kania, Lucas
    Schuerch, Manuel
    Azzimonti, Dario
    Benavoli, Alessio
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT III, 2021, 12977 : 527 - 542
  • [50] Online Sparse Gaussian Process Regression and Its Applications
    Ranganathan, Ananth
    Yang, Ming-Hsuan
    Ho, Jeffrey
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2011, 20 (02) : 391 - 404