Some theoretical results concerning the convergence of compositions of regularized linear functions

Author
Zhang, T [1 ]
Affiliation
[1] IBM Corp, Thomas J Watson Res Ctr, Dept Math Sci, Yorktown Heights, NY 10598 USA
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, sample complexity bounds have been derived for problems involving linear functions, such as neural networks and support vector machines. In this paper, we extend theoretical results in this area by deriving dimension-independent covering number bounds for regularized linear functions under certain regularization conditions. We show that such bounds lead to a class of new methods for training linear classifiers with theoretical advantages similar to those of the support vector machine. Furthermore, we present an asymptotic statistical analysis of these new methods, which provides a better description of their large-sample behavior.
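The abstract refers to a class of regularized linear classifiers. As a minimal illustrative sketch (not the paper's specific method), the following trains one common member of that family, an L2-regularized logistic-loss linear classifier, by gradient descent; all function names and parameter values here are hypothetical choices for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_regularized_linear(X, y, lam=0.1, lr=0.5, steps=500):
    """Minimize (1/n) * sum_i log(1 + exp(-y_i * w.x_i)) + (lam/2) * ||w||^2
    by full-batch gradient descent. Labels y are in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        margins = y * (X @ w)
        # Gradient of the average logistic loss plus the L2 penalty term.
        grad = -(X * (y * sigmoid(-margins))[:, None]).mean(axis=0) + lam * w
        w -= lr * grad
    return w

def predict(w, X):
    return np.sign(X @ w)

# Toy usage on linearly separable synthetic data.
rng = np.random.default_rng(0)
y = np.repeat([1.0, -1.0], 100)
X = np.c_[y * 2 + rng.normal(scale=0.5, size=200), rng.normal(size=200)]
w = train_regularized_linear(X, y)
accuracy = (predict(w, X) == y).mean()
```

The L2 penalty plays the role of the regularization condition in the abstract: it keeps the weight norm bounded, which is what makes dimension-independent covering number bounds possible.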
Pages: 370-376 (7 pages)