Distribution-dependent Vapnik-Chervonenkis bounds

Cited by: 0
Authors
Vayatis, N
Azencott, R
Affiliations
[1] Ctr Math & Leurs Applicat, Ecole Normale Super Cachan, F-94235 Cachan, France
[2] Ecole Polytech, Ctr Rech Epistemol Appl, F-91128 Palaiseau, France
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Vapnik-Chervonenkis (VC) bounds play an important role in statistical learning theory as they are the fundamental results explaining the generalization ability of learning machines. Over the years, substantial mathematical work has been devoted to improving the VC rates of convergence of empirical means to their expectations. The result obtained by Talagrand in 1994 seems to provide more or less the final word on this issue as far as universal bounds are concerned. For fixed distributions, however, this bound can be outperformed in practice. We show that it is indeed possible to replace the 2ε² term in the exponent of the deviation bound by the corresponding Cramér transform, as indicated by large deviation theorems. We then formulate rigorous distribution-sensitive VC bounds, and we explain why these theoretical results can lead to practical estimates of the effective VC dimension of learning structures.
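As a rough illustration of the comparison sketched in the abstract (a schematic reading, not the authors' exact statement; the constant c and the capacity factor S_C(n) below are placeholders), the classical distribution-free deviation bound carries a Hoeffding-type exponent, while a distribution-sensitive version replaces 2ε² by the Cramér transform of a Bernoulli mean, which for a set of probability p is the corresponding Kullback-Leibler divergence:

\[
\Pr\Big(\sup_{C\in\mathcal{C}}\,\big|\nu_n(C)-P(C)\big| > \varepsilon\Big)
\;\le\; c\, S_{\mathcal{C}}(n)\, e^{-2n\varepsilon^{2}}
\quad\longrightarrow\quad
c\, S_{\mathcal{C}}(n)\, e^{-n\,\Lambda_p^{*}(\varepsilon)},
\]
\[
\Lambda_p^{*}(\varepsilon)
\;=\; (p+\varepsilon)\ln\frac{p+\varepsilon}{p}
\;+\; (1-p-\varepsilon)\ln\frac{1-p-\varepsilon}{1-p}
\;\ge\; 2\varepsilon^{2}.
\]

The last inequality is a Pinsker-type bound, so the distribution-dependent exponent is never worse than the universal 2ε² and becomes much larger when p is far from 1/2, which is why a fixed-distribution bound can beat the universal one.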
Pages: 230-240
Page count: 11