VC dimension bounds for higher-order neurons

Cited by: 0
Authors
Schmitt, M [1 ]
Affiliation
[1] Ruhr Univ Bochum, Fak Math, Lehrstuhl Math & Informat, D-44780 Bochum, Germany
Source
NINTH INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS (ICANN99), VOLS 1 AND 2 | 1999 / No. 470
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
We investigate the sample complexity of learning with higher-order neurons. We calculate upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo-dimension of higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degree of the variables plays only a minor role. Further, our results reveal that the crucial parameters affecting the VC dimension of higher-order neurons are the input dimension and the maximum number of occurrences of each variable. The lower bounds we establish are asymptotically almost tight; in particular, they show that the VC dimension is super-linear in the input dimension. Bounds for higher-order neurons with sigmoidal activation function are also derived.
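For context, a higher-order neuron computes a polynomial threshold function of its inputs. A minimal sketch of the standard formulation (the notation below is assumed for illustration, not quoted from the paper) is

\[
  f(x_1,\dots,x_n) \;=\; \operatorname{sgn}\Bigl( w_0 + \sum_{M \in \mathcal{M}} w_M \prod_{i \in M} x_i^{\,d_{M,i}} \Bigr),
\]

where \(\mathcal{M}\) is a set of monomials over the \(n\) input variables, the \(w_M\) are adjustable real weights, and the \(d_{M,i}\) are exponents. In this notation, the abstract's crucial parameters correspond to the input dimension \(n\) and the maximum number of monomials in \(\mathcal{M}\) in which any single variable occurs.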
Pages: 563-568 (6 pages)