Distribution-dependent Vapnik-Chervonenkis bounds

Cited by: 0
Authors
Vayatis, N
Azencott, R
Affiliations
[1] Ctr Math & Leurs Applicat, Ecole Normale Super Cachan, F-94235 Cachan, France
[2] Ecole Polytech, Ctr Rech Epistemol Appl, F-91128 Palaiseau, France
Source
Keywords
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Vapnik-Chervonenkis (VC) bounds play an important role in statistical learning theory, as they are the fundamental results explaining the generalization ability of learning machines. Over the years, substantial mathematical work has been devoted to improving the VC rates of convergence of empirical means to their expectations. The result obtained by Talagrand in 1994 seems to provide more or less the final word on this issue as far as universal bounds are concerned. For fixed distributions, however, this bound can be outperformed in practice. We show that it is indeed possible to replace the 2ε² in the exponent of the deviation term by the corresponding Cramér transform, as suggested by large deviations theorems. We then formulate rigorous distribution-sensitive VC bounds and explain why these theoretical results can lead to practical estimates of the effective VC dimension of learning structures.
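As an illustration of the substitution described in the abstract (the notation below is assumed for this sketch and is not taken from the paper): for a single event A with probability p = P(A) and empirical frequency P_n(A), the universal bound controls deviations through a factor exp(-2nε²), while the Cramér transform of a Bernoulli(p) variable gives the distribution-dependent rate

\[
  \Lambda_p^*(p+\varepsilon)
  = (p+\varepsilon)\log\frac{p+\varepsilon}{p}
  + (1-p-\varepsilon)\log\frac{1-p-\varepsilon}{1-p}
  \;\ge\; 2\varepsilon^2 ,
\]

where the inequality is Pinsker's. Hence exp(-n Λ_p^*(p+ε)) ≤ exp(-2nε²): the distribution-dependent deviation term is never worse than the universal one, and is substantially smaller when p is far from 1/2.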
Pages: 230-240
Number of pages: 11
Related Papers
50 records in total
  • [41] Adaptation, Performance and Vapnik-Chervonenkis Dimension of Straight Line Programs
    Montana, Jose L.
    Alonso, Cesar L.
    Borges, Cruz E.
    Crespo, Jose L.
    GENETIC PROGRAMMING, 2009, 5481 : 315 - +
  • [42] Vapnik-Chervonenkis dimension of neural networks with binary weights
    Mertens, S
    Engel, A
    PHYSICAL REVIEW E, 1997, 55 (04) : 4478 - 4488
  • [43] Vapnik-Chervonenkis dimension of axis-parallel cuts
    Gey, Servane
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2018, 47 (09) : 2291 - 2296
  • [44] Vapnik-Chervonenkis (VC) learning theory and its applications
    Cherkassky, V
    Mulier, F
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1999, 10 (05) : 985 - 987
  • [45] Adaptive Neural Topology Based on Vapnik-Chervonenkis Dimension
    Perez-Sanchez, Beatriz
    Fontenla-Romero, Oscar
    Guijarro-Berdinas, Bertha
    AGENTS AND ARTIFICIAL INTELLIGENCE, ICAART 2014, 2015, 8946 : 194 - 210
  • [46] BOUNDING SAMPLE-SIZE WITH THE VAPNIK-CHERVONENKIS DIMENSION
    SHAWE-TAYLOR, J
    ANTHONY, M
    BIGGS, NL
    DISCRETE APPLIED MATHEMATICS, 1993, 42 (01) : 65 - 73
  • [47] Pseudovaluation domains with Vapnik-Chervonenkis classes of definable sets
    Bélair, L
    COMMUNICATIONS IN ALGEBRA, 2000, 28 (08) : 3785 - 3793
  • [48] LEARNING FASTER THAN PROMISED BY THE VAPNIK-CHERVONENKIS DIMENSION
    BLUMER, A
    LITTLESTONE, N
    DISCRETE APPLIED MATHEMATICS, 1989, 24 (1-3) : 47 - 53
  • [49] INAPPROXIMABILITY OF TRUTHFUL MECHANISMS VIA GENERALIZATIONS OF THE VAPNIK-CHERVONENKIS DIMENSION
    Daniely, Amit
    Schapira, Michael
    Shahaf, Gal
    SIAM JOURNAL ON COMPUTING, 2018, 47 (01) : 96 - 120
  • [50] COMPLEXITY OF COMPUTING VAPNIK-CHERVONENKIS DIMENSION AND SOME GENERALIZED DIMENSIONS
    SHINOHARA, A
    THEORETICAL COMPUTER SCIENCE, 1995, 137 (01) : 129 - 144