Gaussian Kernel Width Optimization for Sparse Bayesian Learning

Cited by: 28
Authors
Mohsenzadeh, Yalda [1 ]
Sheikhzadeh, Hamid [1 ]
Affiliations
[1] Amirkabir Univ Technol, Dept Elect Engn, Tehran 1415854546, Iran
Keywords
Adaptive kernel learning (AKL); expectation maximization (EM); kernel width optimization; regression; relevance vector machine (RVM); sparse Bayesian learning; supervised kernel methods; MACHINE; POSE;
DOI
10.1109/TNNLS.2014.2321134
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Sparse kernel methods have been widely used in regression and classification applications. The performance and sparsity of these methods depend on an appropriate choice of the corresponding kernel functions and their parameters. Typically, the kernel parameters are selected using a cross-validation approach. In this paper, a learning method that extends the relevance vector machine (RVM) is presented. The proposed method can find the optimal values of the kernel parameters during the training procedure. The algorithm uses an expectation-maximization approach for updating the kernel parameters as well as the other model parameters; therefore, the speed of convergence and the computational complexity of the proposed method are the same as those of the standard RVM. To control the convergence of this fully parameterized model, the optimization with respect to the kernel parameters is performed under a constraint on these parameters. The proposed method is compared with the standard RVM and other competing methods to analyze its performance. Experimental results on commonly used synthetic data, as well as on benchmark data sets, demonstrate the effectiveness of the proposed method in reducing the performance dependency on the initial choice of the kernel parameters.
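The abstract describes folding the Gaussian kernel-width update into the same EM iterations that re-estimate the other RVM hyperparameters. The sketch below is only a rough illustration of the idea, not the authors' constrained EM update: it fits a standard RVM for a handful of candidate widths and keeps the one with the highest marginal likelihood. All function names (gaussian_design, rvm_evidence_update, log_evidence, train_rvm_with_width_search), the candidate-width grid, and the noisy sinc test signal are assumptions made for this example.

```python
# Minimal sketch: RVM-style sparse Bayesian regression whose Gaussian kernel
# width is chosen by maximizing the marginal likelihood (evidence).
# NOTE: this is an illustrative stand-in, not the paper's constrained EM update.
import numpy as np

def gaussian_design(x, centers, width):
    """N x M design matrix of Gaussian basis functions."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

def rvm_evidence_update(Phi, t, alpha, beta, n_iter=50):
    """Standard RVM hyperparameter re-estimation (Tipping, 2001)."""
    for _ in range(n_iter):
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
        mu = beta * Sigma @ Phi.T @ t
        gamma = np.clip(1.0 - alpha * np.diag(Sigma), 1e-12, None)
        alpha = np.minimum(gamma / (mu ** 2 + 1e-12), 1e12)   # cap to avoid overflow
        beta = (len(t) - gamma.sum()) / (np.sum((t - Phi @ mu) ** 2) + 1e-12)
    return alpha, beta, mu

def log_evidence(Phi, t, alpha, beta):
    """Log marginal likelihood of the targets (up to a constant)."""
    C = np.eye(len(t)) / beta + Phi @ np.diag(1.0 / alpha) @ Phi.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + t @ np.linalg.solve(C, t))

def train_rvm_with_width_search(x, t, widths=np.linspace(0.2, 2.0, 10)):
    """Crude stand-in for an EM width update: keep the Gaussian width whose
    converged RVM attains the highest evidence."""
    best = None
    for w in widths:
        Phi = gaussian_design(x, x, w)          # one basis function per input
        alpha = np.ones(Phi.shape[1])
        beta = 1.0 / np.var(t)
        alpha, beta, mu = rvm_evidence_update(Phi, t, alpha, beta)
        ev = log_evidence(Phi, t, alpha, beta)
        if best is None or ev > best[0]:
            best = (ev, w, mu, alpha, beta)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-5.0, 5.0, 80)
    t = np.sinc(x) + 0.1 * rng.standard_normal(x.size)   # assumed toy data
    ev, w, mu, alpha, beta = train_rvm_with_width_search(x, t)
    n_rv = int((alpha < 1e3).sum())                       # rough relevance-vector count
    print(f"selected width={w:.2f}  log-evidence={ev:.1f}  relevance vectors={n_rv}")
```

Selecting the width by marginal likelihood already removes the separate cross-validation loop that the paper identifies as the usual practice; the proposed method goes further by updating the width inside the same EM iterations as the weight priors and noise precision, so no outer search over candidate widths is needed.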
Pages: 709-719 (11 pages)
Related Articles (50 total)
  • [11] Sparse Bayesian dictionary learning with a Gaussian hierarchical model
    Yang, Linxiao
    Fang, Jun
    Cheng, Hong
    Li, Hongbin
    SIGNAL PROCESSING, 2017, 130 : 93 - 104
  • [12] Sparse Bayesian Learning for non-Gaussian sources
    Porter, Richard
    Tadic, Vladislav
    Achim, Alin
    DIGITAL SIGNAL PROCESSING, 2015, 45 : 2 - 12
  • [13] A new surrogate modeling method combining polynomial chaos expansion and Gaussian kernel in a sparse Bayesian learning framework
    Zhou, Yicheng
    Lu, Zhenzhou
    Cheng, Kai
    INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, 2019, 120 (04) : 498 - 516
  • [14] Sparse Bayesian Learning-Based Kernel Poisson Regression
    Jia, Yuheng
    Kwong, Sam
    Wu, Wenhui
    Wang, Ran
    Gao, Wei
    IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (01) : 56 - 68
  • [15] On the Support Recovery of Jointly Sparse Gaussian Sources via Sparse Bayesian Learning
    Khanna, Saurabh
    Murthy, Chandra R.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (11) : 7361 - 7378
  • [16] Bayesian Optimization of the PC Algorithm for Learning Gaussian Bayesian Networks
    Cordoba, Irene
    Garrido-Merchan, Eduardo C.
    Hernandez-Lobato, Daniel
    Bielza, Concha
    Larranaga, Pedro
    ADVANCES IN ARTIFICIAL INTELLIGENCE, CAEPIA 2018, 2018, 11160 : 44 - 54
  • [17] Transfer Learning with Gaussian Processes for Bayesian Optimization
    Tighineanu, Petru
    Skubch, Kathrin
    Baireuther, Paul
    Reiss, Attila
    Berkenkamp, Felix
    Vinogradska, Julia
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151 : 6152 - 6181
  • [18] LEARNING GAUSSIAN PROCESSES WITH BAYESIAN POSTERIOR OPTIMIZATION
    Chamon, Luiz F. O.
    Paternain, Santiago
    Ribeiro, Alejandro
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 482 - 486
  • [19] Accelerating Bayesian Structure Learning in Sparse Gaussian Graphical Models
    Mohammadi, Reza
    Massam, Helene
    Letac, Gerard
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2023, 118 (542) : 1345 - 1358
  • [20] Learning Sparse Fixed-Structure Gaussian Bayesian Networks
    Bhattacharyya, Arnab
    Choo, Davin
    Gajjala, Rishikesh
    Gayen, Sutanu
    Wang, Yuhao
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151