Data representations and generalization error in kernel based learning machines

Cited by: 21
Authors
Ancona, Nicola [1 ]
Maglietta, Rosalia [1 ]
Stella, Ettore [1 ]
Affiliations
[1] CNR, Ist Studi Sistemi Intelligenti Automaz, I-70126 Bari, Italy
Funding
US National Science Foundation;
Keywords
supervised learning; classification; support vector machines; generalization; leave-one-out error; sparse and dense data representation;
DOI
10.1016/j.patcog.2005.11.025
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper focuses on how data representation influences the generalization error of kernel-based learning machines such as support vector machines (SVM) for classification. Frame theory provides a well-founded mathematical framework for representing data in many different ways. We analyze the effects of sparse and dense data representations on the generalization error of such learning machines, measured by the leave-one-out error given a finite amount of training data. We show that, for sparse data representations, the generalization error of an SVM trained with polynomial or Gaussian kernel functions is equal to that of a linear SVM. Equivalently, the point-separating capacity of the functions in the hypothesis spaces induced by polynomial or Gaussian kernels reduces to the capacity of a separating hyperplane in the input space. Moreover, we show that, in general, sparse data representations increase or leave unchanged the generalization error of kernel-based methods. Dense data representations, on the contrary, reduce the generalization error in the case of very large frames. We use two different schemes for representing data in overcomplete systems of Haar and Gabor functions, and measure the SVM generalization error on benchmark data sets. (c) 2006 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
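To make the leave-one-out protocol in the abstract concrete, the sketch below compares the leave-one-out error of a linear SVM and a Gaussian-kernel SVM on a sparsified representation, using scikit-learn. This is only an illustrative assumption, not the authors' experimental setup: a synthetic dataset and a crude keep-top-k sparsification stand in for the benchmark data sets and the Haar/Gabor frame expansions used in the paper.

```python
# Illustrative sketch (assumed setup, not the paper's experiments): estimate the
# leave-one-out (LOO) error of a linear SVM versus a Gaussian-kernel SVM on a
# sparsified representation of synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

# Synthetic binary classification data (placeholder for a benchmark data set).
X, y = make_classification(n_samples=100, n_features=20, random_state=0)

# Crude sparse representation: keep the k largest-magnitude coefficients of each
# sample and zero out the rest (a stand-in for a sparse frame expansion).
k = 5
idx = np.argsort(-np.abs(X), axis=1)[:, :k]
X_sparse = np.zeros_like(X)
np.put_along_axis(X_sparse, idx, np.take_along_axis(X, idx, axis=1), axis=1)

loo = LeaveOneOut()
for name, clf in [("linear", SVC(kernel="linear", C=1.0)),
                  ("gaussian", SVC(kernel="rbf", gamma="scale", C=1.0))]:
    # Each LOO fold tests on a single held-out sample; the LOO error is
    # one minus the mean accuracy over all folds.
    acc = cross_val_score(clf, X_sparse, y, cv=loo).mean()
    print(f"{name:8s} kernel  LOO error = {1.0 - acc:.3f}")
```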
Pages: 1588-1603
Page count: 16
Related Papers (50 records)
  • [1] Ancona, N.; Maglietta, R.; Stella, E. Data representation in kernel based learning machines. Proceedings of the Eighth IASTED International Conference on Artificial Intelligence and Soft Computing, 2004: 243-248.
  • [2] Navarrete, P.; Ruiz del Solar, J. On the generalization of kernel machines. Pattern Recognition with Support Vector Machines, Proceedings, 2002, 2388: 24-39.
  • [3] Suzuki, Taiji. Fast generalization error bound of deep learning from a kernel perspective. International Conference on Artificial Intelligence and Statistics, Vol. 84, 2018.
  • [4] Zhu, Xiaobin; Li, Zhuangzi; Zhang, Xiao-Yu; Li, Peng; Xue, Ziyu; Wang, Lei. Deep convolutional representations and kernel extreme learning machines for image classification. Multimedia Tools and Applications, 2019, 78(20): 29271-29290.
  • [5] Tang, Yi; Chen, Hong. Learning the kernel based on error bounds. Proceedings of the 2008 International Conference on Wavelet Analysis and Pattern Recognition, Vols. 1-2, 2008: 805-809.
  • [6] Tonin, Francesco; Patrinos, Panagiotis; Suykens, Johan A. K. Unsupervised learning of disentangled representations in deep restricted kernel machines with orthogonality constraints. Neural Networks, 2021, 142: 661-679.
  • [7] Tian, Yingjie; Fu, Saiji; Tang, Jingjing. Incomplete-view oriented kernel learning method with generalization error bound. Information Sciences, 2021, 581: 951-977.
  • [8] Rhinelander, Jason P.; Liu, Xiaoping P. Truncation error compensation in kernel machines. 2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2013), 2013: 1889-1894.
  • [9] Peleg, Dori; Meir, Ron. A sparsity driven kernel machine based on minimizing a generalization error bound. Pattern Recognition, 2009, 42(11): 2607-2614.