Kernel Mean Shrinkage Estimators

Cited: 0
Authors
Muandet, Krikamol [1 ]
Sriperumbudur, Bharath [2 ]
Fukumizu, Kenji [3 ]
Gretton, Arthur [4 ]
Schoelkopf, Bernhard [1 ]
Affiliations
[1] Max Planck Inst Intelligent Syst, Empir Inference Dept, Spemannstr 38, D-72076 Tubingen, Germany
[2] Penn State Univ, Dept Stat, University Pk, PA 16802 USA
[3] Inst Stat Math, 10-3 Midoricho, Tachikawa, Tokyo 1908562, Japan
[4] UCL, CSML, Gatsby Computat Neurosci Unit, Alexandra House,17 Queen Sq, London WC1N 3AR, England
Keywords
covariance operator; James-Stein estimators; kernel methods; kernel mean; shrinkage estimators; Stein effect; Tikhonov regularization;
DOI
Not available
CLC number (Chinese Library Classification)
TP [Automation technology, computer technology];
Subject classification code
0812;
Abstract
A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is central to kernel methods: it is used by many classical algorithms such as kernel principal component analysis, and it forms the core inference step of modern kernel methods that rely on embedding probability distributions in RKHSs. Given a finite sample, the empirical average is commonly used as the standard estimator of the true kernel mean. Despite the widespread use of this estimator, we show that it can be improved thanks to the well-known Stein phenomenon. We propose a new family of estimators called kernel mean shrinkage estimators (KMSEs), which benefit from both theoretical justification and good empirical performance. The results demonstrate that the proposed estimators outperform the standard one, especially in a "large d, small n" paradigm.
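The abstract contrasts the standard empirical estimator of the kernel mean with shrinkage alternatives. The following is a minimal sketch in Python (NumPy), assuming a Gaussian RBF kernel; the shrinkage rule shown simply scales the empirical estimate toward the zero function by a factor 1 - lam/(lam + 1), an illustrative Stein-type simplification rather than the paper's exact KMSE family or its data-driven choice of the shrinkage parameter.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def empirical_mean_weights(n):
    # Standard estimator: mu_hat = (1/n) * sum_i k(., x_i), i.e. uniform weights
    return np.full(n, 1.0 / n)

def shrinkage_mean_weights(n, lam):
    # Illustrative shrinkage estimator (assumption): shrink mu_hat toward the
    # zero function by a constant factor, trading a little bias for lower
    # variance in the spirit of the Stein effect.
    return (1.0 - lam / (lam + 1.0)) * np.full(n, 1.0 / n)

def embed(weights, X_train, X_eval, gamma=1.0):
    # Evaluate the estimated kernel mean function at the points X_eval
    return rbf_kernel(X_eval, X_train, gamma) @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n = 20, 15                      # toy "large d, small n" setting
    X = rng.normal(size=(n, d))
    K = rbf_kernel(X, X)

    w_emp = empirical_mean_weights(n)
    w_shr = shrinkage_mean_weights(n, lam=0.1)

    # RKHS norm of the difference between the two estimates:
    # ||mu_emp - mu_shr||_H^2 = (w_emp - w_shr)^T K (w_emp - w_shr)
    diff = w_emp - w_shr
    print("RKHS distance between estimators:", np.sqrt(diff @ K @ diff))

    X_eval = rng.normal(size=(3, d))
    print("mu_hat at 3 test points:     ", embed(w_emp, X, X_eval))
    print("shrunken mu at 3 test points:", embed(w_shr, X, X_eval))

Both estimators live in the span of {k(., x_i)}, so they are represented here by weight vectors over the sample, and distances between them can be computed with the Gram matrix alone.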
Pages: 1-41
Page count: 41