On learning vector-valued functions

Cited by: 308
Authors
Micchelli, CA [1]
Pontil, M
Affiliations
[1] SUNY Albany, Dept Math & Stat, Albany, NY 12222 USA
[2] UCL, Dept Comp Sci, London WC1E, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
DOI
10.1162/0899766052530802
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In this letter, we provide a study of learning in a Hilbert space of vector-valued functions. We motivate the need for extending learning theory of scalar-valued functions by practical considerations and establish some basic results for learning vector-valued functions that should prove useful in applications. Specifically, we allow an output space Y to be a Hilbert space, and we consider a reproducing kernel Hilbert space of functions whose values lie in Y. In this setting, we derive the form of the minimal norm interpolant to a finite set of data and apply it to study some regularization functionals that are important in learning theory. We consider specific examples of such functionals corresponding to multiple-output regularization networks and support vector machines, for both regression and classification. Finally, we provide classes of operator-valued kernels of the dot product and translation-invariant type.
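The abstract's minimal norm interpolant can be illustrated with a small numerical sketch. The snippet below (not the paper's code; all names are illustrative) uses a separable operator-valued kernel K(x, x') = k(x, x') B, where k is a scalar Gaussian kernel and B is a positive-definite matrix that couples the output components, and solves for the interpolant coefficients:

```python
import numpy as np

# Hypothetical sketch of vector-valued kernel interpolation with a
# separable operator-valued kernel K(x, x') = k(x, x') * B.

def gaussian_kernel(X1, X2, sigma=1.0):
    # Scalar Gaussian kernel matrix between rows of X1 and X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_vector_valued(X, Y, B, sigma=1.0, reg=1e-8):
    # Solve (k(X, X) kron B + reg*I) c = vec(Y) for the coefficients c_j,
    # one m-vector per training point; reg stabilizes the linear solve.
    n, m = Y.shape
    K = np.kron(gaussian_kernel(X, X, sigma), B)      # (n*m, n*m) Gram matrix
    c = np.linalg.solve(K + reg * np.eye(n * m), Y.reshape(-1))
    return c.reshape(n, m)

def predict(Xnew, X, C, B, sigma=1.0):
    # Interpolant: f(x) = sum_j k(x, x_j) * B @ c_j
    k = gaussian_kernel(Xnew, X, sigma)               # (n_new, n)
    return k @ C @ B.T                                # (n_new, m)

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
Y = np.stack([X.sum(1), X[:, 0] - X[:, 1]], axis=1)   # two coupled outputs
B = np.array([[1.0, 0.3], [0.3, 1.0]])                # output-coupling matrix
C = fit_vector_valued(X, Y, B)
print(np.allclose(predict(X, X, C, B), Y, atol=1e-4)) # reproduces the data
```

Adding a larger `reg` turns the interpolant into the ridge-regularized estimator discussed in the letter's treatment of multiple-output regularization networks.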
Pages: 177-204
Page count: 28