Gaussian Kernel Width Optimization for Sparse Bayesian Learning

Cited by: 28
Authors
Mohsenzadeh, Yalda [1 ]
Sheikhzadeh, Hamid [1 ]
Affiliations
[1] Amirkabir Univ Technol, Dept Elect Engn, Tehran 1415854546, Iran
Keywords
Adaptive kernel learning (AKL); expectation maximization (EM); kernel width optimization; regression; relevance vector machine (RVM); sparse Bayesian learning; supervised kernel methods; MACHINE; POSE;
DOI
10.1109/TNNLS.2014.2321134
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sparse kernel methods have been widely used in regression and classification applications. The performance and sparsity of these methods depend on an appropriate choice of the kernel functions and their parameters. Typically, the kernel parameters are selected by cross-validation. In this paper, a learning method that extends the relevance vector machine (RVM) is presented. The proposed method finds optimal values of the kernel parameters during the training procedure. The algorithm uses an expectation-maximization (EM) approach to update the kernel parameters along with the other model parameters; therefore, the convergence speed and computational complexity of the proposed method are the same as those of the standard RVM. To control the convergence of this fully parameterized model, the optimization with respect to the kernel parameters is performed under a constraint on these parameters. The proposed method is compared with the standard RVM and other competing methods to assess its performance. Experimental results on commonly used synthetic data, as well as benchmark data sets, demonstrate the effectiveness of the proposed method in reducing the performance dependency on the initial choice of the kernel parameters.
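The abstract describes an EM-style alternation between the usual RVM hyperparameter updates and a constrained update of the Gaussian kernel width. The sketch below is not the authors' algorithm; it is a minimal Python illustration of that general idea, in which the paper's constrained EM update of the width is replaced by a bounded grid search over the marginal likelihood. The grid bounds, function names, and toy data are assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: RVM-style sparse Bayesian regression with a Gaussian
# (RBF) kernel whose width is re-estimated during training. This is an
# illustration only, not the method proposed in the paper.

def rbf_design(X, centers, width):
    """Design matrix Phi[n, m] = exp(-||x_n - c_m||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def rvm_em_step(Phi, t, alpha, beta):
    """One EM-style update of the weight posterior and the hyperparameters."""
    A = np.diag(alpha)
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + A)   # posterior covariance
    mu = beta * Sigma @ Phi.T @ t                   # posterior mean
    gamma = 1.0 - alpha * np.diag(Sigma)            # well-determined parameters
    alpha_new = gamma / (mu ** 2 + 1e-12)           # weight precisions
    resid = t - Phi @ mu
    beta_new = (len(t) - gamma.sum()) / (resid @ resid + 1e-12)  # noise precision
    return mu, Sigma, alpha_new, beta_new

def marginal_loglik(Phi, t, alpha, beta):
    """Log evidence of the sparse Bayesian linear model (up to a constant)."""
    C = np.eye(len(t)) / beta + Phi @ np.diag(1.0 / alpha) @ Phi.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + t @ np.linalg.solve(C, t))

def train(X, t, widths=np.linspace(0.1, 3.0, 30), n_iter=50):
    """Alternate RVM hyperparameter updates with a bounded (constrained)
    search over the Gaussian kernel width that maximizes the evidence."""
    width = widths[len(widths) // 2]
    alpha = np.ones(X.shape[0])
    beta = 1.0
    for _ in range(n_iter):
        Phi = rbf_design(X, X, width)
        mu, Sigma, alpha, beta = rvm_em_step(Phi, t, alpha, beta)
        # Re-estimate the kernel width on a bounded grid; the bounds play the
        # role of the constraint mentioned in the abstract.
        scores = [marginal_loglik(rbf_design(X, X, w), t, alpha, beta)
                  for w in widths]
        width = widths[int(np.argmax(scores))]
    return mu, alpha, beta, width

if __name__ == "__main__":
    # Toy demo on the classic noisy sinc benchmark (assumed setup).
    rng = np.random.default_rng(0)
    X = np.linspace(-10, 10, 100)[:, None]
    t = np.sinc(X[:, 0] / np.pi) + 0.1 * rng.standard_normal(100)
    mu, alpha, beta, width = train(X, t)
    print("learned kernel width:", width,
          "| retained basis functions:", int((alpha < 1e3).sum()))
```

In this sketch the width search is a discrete stand-in for the paper's constrained EM update; its cost is dominated by evaluating the evidence on the grid, whereas the paper reports complexity on par with the standard RVM.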
Pages: 709-719
Number of pages: 11
Related Papers
50 records in total
  • [41] Bayesian generative kernel Gaussian process regression
    Kuok, Sin-Chi
    Yao, Shuang-Ao
    Yuen, Ka-Veng
    Yan, Wang-Ji
    Girolami, Mark
    MECHANICAL SYSTEMS AND SIGNAL PROCESSING, 2025, 227
  • [42] Integrating Bayesian and Discriminative Sparse Kernel Machines for Multi-class Active Learning
    Shi, Weishi
    Yu, Qi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [43] Probabilistic Wind Power Forecast Using Sparse Bayesian Learning of Unified Kernel Function
    Wei, Zhang
    Liu, San Ming
    Wei, Dan
    Wang, Zhi Jie
    Yang, Ming Li
    Li, Ying
    2014 IEEE TRANSPORTATION ELECTRIFICATION CONFERENCE AND EXPO (ITEC) ASIA-PACIFIC, 2014
  • [44] Unified estimate of Gaussian kernel width for surrogate models
    Wu, Zeping
    Wang, Donghui
    Okolo, Patrick N.
    Jiang, Zhenyu
    Zhang, Weihua
    NEUROCOMPUTING, 2016, 203 : 41 - 51
  • [45] Gaussian kernel in quantum learning
    Bishwas, Arit Kumar
    Mani, Ashish
    Palade, Vasile
    INTERNATIONAL JOURNAL OF QUANTUM INFORMATION, 2020, 18 (03)
  • [46] Sparse Bayesian inference with regularized Gaussian distributions
    Everink, Jasper M.
    Dong, Yiqiu
    Andersen, Martin S.
    INVERSE PROBLEMS, 2023, 39 (11)
  • [47] A Bayesian Approach to Sparse Learning-to-Rank for Search Engine Optimization
    Krasotkina, Olga
    Mottl, Vadim
    MACHINE LEARNING AND DATA MINING IN PATTERN RECOGNITION, MLDM 2015, 2015, 9166 : 382 - 394
  • [48] Robust sparse Bayesian learning based on the Bernoulli-Gaussian model of impulsive noise
    Rong, Jiarui
    Zhang, Jingshu
    Duan, Huiping
    DIGITAL SIGNAL PROCESSING, 2023, 136
  • [49] Perspectives on sparse Bayesian learning
    Wipf, D
    Palmer, J
    Rao, B
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 16, 2004, 16 : 249 - 256
  • [50] Clustered Sparse Bayesian Learning
    Wang, Yu
    Wipf, David
    Yun, Jeong-Min
    Chen, Wei
    Wassell, Ian
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2015, : 932 - 941