Sparse additive support vector machines in bounded variation space

Times Cited: 0
Authors
Wang, Yue [1]
Lian, Heng [1,2]
Affiliations
[1] City Univ Hong Kong, Dept Math, Kowloon, Hong Kong, Peoples R China
[2] CityU Shenzhen Res Inst, Shenzhen 518057, Peoples R China
Keywords
additive models; empirical norm penalty; high dimensionality; SVM; total variation penalty; regression; rates; consistency; inference; models; risk
DOI
10.1093/imaiai/iaae003
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline code
070104
Abstract
We propose the total variation penalized sparse additive support vector machine (TVSAM) for classification in high-dimensional settings, using a mixed $l_{1}$-type functional regularization scheme to induce sparsity and smoothness simultaneously. We establish a representer theorem for TVSAM, which turns the infinite-dimensional problem into a finite-dimensional one and thereby provides computational feasibility. Even for the least squares loss, our result fills a gap in the literature relative to the existing representer theorem. Theoretically, we derive risk bounds for TVSAM under both exact sparsity and near sparsity, with arbitrarily specified internal knots. In the process, we develop an important interpolation inequality for the space of functions of bounded variation, relying on analytic techniques such as mollification and partition of unity. An efficient implementation based on the alternating direction method of multipliers is employed.
Pages: 29
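The abstract describes the TVSAM objective only in words. As a rough illustration, the following is a minimal sketch of a TVSAM-style convex program in CVXPY, assuming a piecewise-constant parameterization of each additive component on an equally spaced knot grid, a hinge loss, and a total-variation-plus-empirical-norm penalty per component. The function name `fit_tvsam_sketch`, the knot grid, and the weights `lam_tv`/`lam_norm` are hypothetical choices for illustration only; this is not the authors' implementation, which uses an ADMM solver.

```python
# Illustrative sketch only: a TVSAM-style objective solved with CVXPY.
# The parameterization and penalty weights are assumptions for exposition.
import numpy as np
import cvxpy as cp

def fit_tvsam_sketch(X, y, n_knots=10, lam_tv=0.1, lam_norm=0.1):
    """X: (n, p) features scaled to [0, 1]; y: (n,) labels in {-1, +1}."""
    n, p = X.shape
    # Equally spaced knots; each f_j is taken piecewise constant on the
    # resulting intervals (an assumption made only for this sketch).
    knots = np.linspace(0.0, 1.0, n_knots + 1)
    bins = np.clip(np.digitize(X, knots[1:-1]), 0, n_knots - 1)

    theta = cp.Variable((p, n_knots))  # values of f_j on each interval
    b = cp.Variable()                  # intercept

    # Indicator design matrices: D[j][i, k] = 1 iff x_ij lies in interval k,
    # so D[j] @ theta[j] evaluates f_j at the sample points.
    D = []
    for j in range(p):
        Dj = np.zeros((n, n_knots))
        Dj[np.arange(n), bins[:, j]] = 1.0
        D.append(Dj)

    # Additive model: f(x_i) = b + sum_j f_j(x_ij).
    f_vals = b + sum(D[j] @ theta[j] for j in range(p))

    hinge = cp.sum(cp.pos(1 - cp.multiply(y, f_vals))) / n
    # Mixed l1-type functional penalty: total variation of each component
    # (l1 norm of successive differences) plus its empirical norm, summed
    # over components to induce sparsity across features.
    tv = sum(cp.norm(cp.diff(theta[j]), 1) for j in range(p))
    emp = sum(cp.norm(D[j] @ theta[j], 2) / np.sqrt(n) for j in range(p))

    prob = cp.Problem(cp.Minimize(hinge + lam_tv * tv + lam_norm * emp))
    prob.solve()
    return theta.value, b.value
```

In this sketch the total-variation terms control the roughness of each bounded-variation component while the summed empirical-norm terms drive whole components to zero; the paper's ADMM implementation targets the same kind of objective at scale, whereas a generic conic solver as above is only practical for small examples.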