The Minimum Redundancy - Maximum Relevance Approach to Building Sparse Support Vector Machines

Cited by: 0
Authors
Yang, Xiaoxing [1]
Tang, Ke [1]
Yao, Xin [1]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, NICAL, Hefei 230027, Peoples R China
Keywords
Relevance; Redundancy; Sparse design; SVMs; Machine learning;
DOI
Not available
CLC (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Recently, building sparse SVMs has become an active research topic due to its potential applications in large-scale data mining tasks. One of the most popular approaches to building sparse SVMs is to select a small subset of training samples and employ them as the support vectors. In this paper, we show that selecting the support vectors is equivalent to selecting a number of columns from the kernel matrix, which in turn is equivalent to selecting a subset of features in the feature selection domain. Hence, we propose to use an effective feature selection algorithm, namely the Minimum Redundancy - Maximum Relevance (MRMR) algorithm, to solve the support vector selection problem. The MRMR algorithm was then compared to two existing methods, namely the back-fitting (BF) and pre-fitting (PF) algorithms. Preliminary results showed that MRMR generally outperformed the BF algorithm but was inferior to the PF algorithm in terms of generalization performance. However, the MRMR approach was extremely efficient and significantly faster than the two compared algorithms.
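The core idea in the abstract — treating each kernel-matrix column as a "feature" and greedily picking columns that are relevant to the labels but not redundant with columns already chosen — can be illustrated with a minimal sketch. This is not the paper's implementation: the RBF kernel, the correlation-based relevance/redundancy measures, and the MRMR "difference" criterion used below are illustrative stand-ins for whatever criteria the paper actually employs.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances -> RBF kernel matrix (illustrative choice)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def mrmr_select_columns(K, y, n_select):
    """Greedy MRMR over kernel-matrix columns.

    Relevance  = |corr(column, labels)|
    Redundancy = mean |corr(column, already-selected columns)|
    Score      = relevance - redundancy  (MRMR "difference" form)
    """
    def corr(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return 0.0 if denom == 0 else float(a @ b) / denom

    n = K.shape[1]
    relevance = np.array([abs(corr(K[:, j], y)) for j in range(n)])
    selected = [int(np.argmax(relevance))]  # start from the most relevant column
    while len(selected) < n_select:
        best_j, best_score = None, -np.inf
        for j in range(n):
            if j in selected:
                continue
            redundancy = np.mean([abs(corr(K[:, j], K[:, s])) for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# Toy demo: two Gaussian blobs, pick 5 candidate support vectors
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
K = rbf_kernel(X, gamma=0.5)
sv_idx = mrmr_select_columns(K, y, n_select=5)
print(sv_idx)
```

The selected indices identify training samples to keep as support vectors; a reduced SVM would then be fit using only those columns of K. The greedy loop is what makes the approach fast relative to fitting-based selection: each step costs only correlation evaluations, with no SVM retraining.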
Pages: 184-190
Page count: 7