Bayesian Methods for Support Vector Machines: Evidence and Predictive Class Probabilities

Cited by: 0
Author: Peter Sollich
Affiliation: King's College London, Department of Mathematics
Source: Machine Learning | 2002 / Vol. 46
Keywords: Support vector machines; Gaussian processes; Bayesian inference; evidence; hyperparameter tuning; probabilistic predictions
DOI: not available
Abstract
I describe a framework for interpreting Support Vector Machines (SVMs) as maximum a posteriori (MAP) solutions to inference problems with Gaussian Process priors. This probabilistic interpretation can provide intuitive guidelines for choosing a ‘good’ SVM kernel. Beyond this, it allows Bayesian methods to be used for tackling two of the outstanding challenges in SVM classification: how to tune hyperparameters—the misclassification penalty C, and any parameters specifying the kernel—and how to obtain predictive class probabilities rather than the conventional deterministic class label predictions. Hyperparameters can be set by maximizing the evidence; I explain how the latter can be defined and properly normalized. Both analytical approximations and numerical methods (Monte Carlo chaining) for estimating the evidence are discussed. I also compare different methods of estimating class probabilities, ranging from simple evaluation at the MAP or at the posterior average to full averaging over the posterior. A simple toy application illustrates the various concepts and techniques.
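The abstract contrasts plugging the MAP latent value into the class-probability map with full averaging over the posterior. A minimal sketch of that difference, assuming (purely for illustration — the paper's actual SVM posterior is non-Gaussian and the numbers below are invented) a Gaussian posterior over the latent function value at a test point and a logistic link:

```python
import math
import random

def sigmoid(t):
    """Logistic link mapping a latent value to a class probability."""
    return 1.0 / (1.0 + math.exp(-t))

random.seed(0)

# Hypothetical posterior over the latent value theta(x) at one test
# point: Gaussian around the MAP value (an illustrative assumption).
theta_map, post_std = 0.8, 1.5

# Method 1: evaluate the class probability at the MAP point only.
p_map = sigmoid(theta_map)

# Method 2: average the class probability over posterior samples
# (Monte Carlo), i.e. full averaging over the posterior.
samples = [random.gauss(theta_map, post_std) for _ in range(100_000)]
p_avg = sum(sigmoid(t) for t in samples) / len(samples)

print(f"MAP plug-in:        {p_map:.3f}")
print(f"Posterior-averaged: {p_avg:.3f}")
```

By Jensen's inequality the averaged probability is pulled toward 1/2 relative to the plug-in value, so the posterior-averaged prediction is less overconfident — the qualitative effect the comparison in the paper is about.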
Pages: 21-52
Page count: 31
Related Articles
50 records
  • [1] Bayesian methods for support vector machines: Evidence and predictive class probabilities
    Sollich, P
    MACHINE LEARNING, 2002, 46 (1-3) : 21 - 52
  • [2] Probabilistic interpretations and Bayesian methods for support vector machines
    Sollich, P
    NINTH INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS (ICANN99), VOLS 1 AND 2, 1999, (470): 91 - 96
  • [3] Estimating accurate multi-class probabilities with Support Vector Machines
    Milgram, J
    Cheriet, M
    Sabourin, R
    PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), VOLS 1-5, 2005, : 1906 - 1911
  • [4] On Bayesian inference, maximum entropy and support vector machines methods
    Costache, Mihai
    Lienou, Marie
    Datcu, Mihai
    BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, 2006, 872 : 43 - +
  • [5] Precision of Multi-class Classification Methods for Support Vector Machines
    Li, Honglian
    Jiao, Ruili
    Fan, Jing
    ICSP: 2008 9TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, VOLS 1-5, PROCEEDINGS, 2008, : 1517 - 1520
  • [6] Modeling method based on support vector machines within the Bayesian evidence framework
    Yan, Wei-Wu
    Chang, Jun-Lin
    Shao, Hui-He
    Kongzhi yu Juece/Control and Decision, 2004, 19 (05): 525 - 528
  • [7] Probabilistic methods for Support Vector Machines
    Sollich, P
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 12, 2000, 12 : 349 - 355
  • [8] Sparseness Methods of Support Vector Machines
    Li Junfei
    Zhang Yiqin
    SIXTH INTERNATIONAL CONFERENCE ON ELECTROMECHANICAL CONTROL TECHNOLOGY AND TRANSPORTATION (ICECTT 2021), 2022, 12081
  • [9] A comparison of model selection methods for multi-class support vector machines
    Li, HQ
    Qi, FH
    Wang, SY
    COMPUTATIONAL SCIENCE AND ITS APPLICATIONS - ICCSA 2005, VOL 4, PROCEEDINGS, 2005, 3483 : 1140 - 1148
  • [10] Minimum class variance support vector machines
    Zafeiriou, Stefanos
    Tefas, Anastasios
    Pitas, Ioannis
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2007, 16 (10) : 2551 - 2564