Sparse Multivariate Gaussian Mixture Regression

Cited by: 7
Authors
Weruaga, Luis [1 ]
Via, Javier [2 ]
Affiliations
[1] Khalifa Univ Sci Technol & Res, Sharjah 127788, U Arab Emirates
[2] Univ Cantabria, Dept Commun Engn, E-39005 Santander, Spain
Keywords
Function approximation; Gaussian function mixture (GFM); logarithmic utility function; regression; sparsity; basis neural networks; algorithm
DOI
10.1109/TNNLS.2014.2334596
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Fitting a multivariate Gaussian mixture to data is an attractive as well as challenging problem, especially when sparsity in the solution is demanded. Achieving this objective requires the concurrent update of all parameters (weights, centers, and precisions) of all multivariate Gaussian functions during the learning process. Such is the focus of this paper, which presents a novel method founded on minimizing the error of the generalized logarithmic utility function (GLUF). This choice, which allows us to move smoothly from the mean square error (MSE) criterion to one based on the logarithmic error, yields an optimization problem that resembles a locally convex problem and can be solved with a quasi-Newton method. The GLUF framework also facilitates a comparative study of the two extremes, which concludes that classical MSE optimization is not the most adequate for this task. The performance of the proposed technique is demonstrated in simulated as well as realistic scenarios.
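The abstract describes the method only at a high level, and the paper's exact GLUF expression is not reproduced here. As a rough, non-authoritative sketch of the idea (fitting a Gaussian function mixture by quasi-Newton minimization of a log-type error, with weights, centers, and precisions updated concurrently), the snippet below uses an assumed surrogate loss mu * log(1 + e^2 / mu), chosen only because it interpolates between the MSE (large mu) and a logarithmic penalty (small mu). The names gfm and loss, the kernel count K, and the loss form are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a 1-D Gaussian function mixture (GFM) fitted by
# quasi-Newton minimization of a log-type error, loosely in the spirit of the
# abstract. The loss below is an assumption; the paper's actual GLUF criterion
# is defined in the article itself.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data: noisy samples of a smooth target function.
x = np.linspace(-3.0, 3.0, 200)
y = np.sinc(x) + 0.05 * rng.standard_normal(x.size)

K = 5  # number of Gaussian kernels in the mixture

def gfm(theta, x):
    """Evaluate the mixture sum_k w_k * exp(-p_k * (x - c_k)**2)."""
    w, c, log_p = np.split(theta, 3)
    p = np.exp(log_p)  # precisions kept positive via the log parameterization
    return np.exp(-p * (x[:, None] - c) ** 2) @ w

def loss(theta, x, y, mu=1.0):
    """Hypothetical log-utility-style error: approaches the MSE as mu -> inf
    and a scaled logarithmic error as mu -> 0 (not the paper's exact GLUF)."""
    e = y - gfm(theta, x)
    return np.mean(mu * np.log1p(e ** 2 / mu))

# Weights, centers, and (log-)precisions are all updated concurrently.
theta0 = np.concatenate([np.zeros(K),                # weights
                         np.linspace(-2.5, 2.5, K),  # centers
                         np.zeros(K)])               # log-precisions
res = minimize(loss, theta0, args=(x, y), method="BFGS")
print("final loss:", res.fun)
```

SciPy's BFGS stands in here for the quasi-Newton solver mentioned in the abstract; the paper derives its own update rules, and sparsity in the weights is part of its contribution rather than of this sketch.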
Pages: 1098-1108
Page count: 11
Related papers
50 records in total
  • [1] Multivariate Regression with Incremental Learning of Gaussian Mixture Models
    Acevedo-Valle, Juan M.
    Trejo, Karla
    Angulo, Cecilio
    RECENT ADVANCES IN ARTIFICIAL INTELLIGENCE RESEARCH AND DEVELOPMENT, 2017, 300 : 196 - 205
  • [2] Stochastic variational hierarchical mixture of sparse Gaussian processes for regression
    Nguyen, Thi Nhat Anh
    Bouzerdoum, Abdesselam
    Phung, Son Lam
    MACHINE LEARNING, 2018, 107 (12) : 1947 - 1986
  • [3] Context-GMM: Incremental Learning of Sparse Priors for Gaussian Mixture Regression
    Ribes, Arturo
    Cerquides Bueno, Jesus
    Demiris, Yiannis
    Lopez de Mantaras, Ramon
    2012 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO 2012), 2012,
  • [4] Gaussian Scale Mixture Models for Robust Linear Multivariate Regression with Missing Data
    Ala-Luhtala, Juha
    Piche, Robert
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2016, 45 (03) : 791 - 813
  • [5] Sparse Multivariate Factor Regression
    Kharratzadeh, Milad
    Coates, Mark
    2016 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2016,
  • [6] An l1-Oracle Inequality for the Lasso in Multivariate Finite Mixture of Multivariate Gaussian Regression Models
    Devijver, Emilie
    ESAIM-PROBABILITY AND STATISTICS, 2015, 19 : 649 - 670
  • [7] Sparse Structure Enabled Grid Spectral Mixture Kernel for Temporal Gaussian Process Regression
    Yin, Feng
    He, Xinwei
    Pan, Lishuo
    Chen, Tianshi
    Luo, Zhi-Quan
    Theodoridis, Sergios
    2018 21ST INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2018, : 47 - 54
  • [8] Sparse Multivariate Regression With Covariance Estimation
    Rothman, Adam J.
    Levina, Elizaveta
    Zhu, Ji
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2010, 19 (04) : 947 - 962
  • [9] Sparse Additive Gaussian Process Regression
    Luo, Hengrui
    Nattino, Giovanni
    Pratola, Matthew T.
    Journal of Machine Learning Research, 2022, 23