Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators

Cited: 0
Authors
Singh, Shashank [1 ,2 ]
Poczos, Barnabas [2 ]
Affiliations
[1] Carnegie Mellon Univ, Stat Dept, Pittsburgh, PA 15213 USA
[2] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15213 USA
Funding
National Science Foundation (US);
Keywords
ENTROPY;
DOI
None
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We provide finite-sample analysis of a general framework for using k-nearest neighbor statistics to estimate functionals of a nonparametric continuous probability density, including entropies and divergences. Rather than plugging a consistent density estimate (which requires k -> infinity as the sample size n -> infinity) into the functional of interest, the estimators we consider fix k and perform a bias correction. This is more efficient computationally and, as we show in certain cases, statistically, leading to faster convergence rates. Our framework unifies several previous estimators, and for most of these, ours are the first finite-sample guarantees.
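To make the fixed-k idea concrete: the best-known estimator in this family is the classical Kozachenko-Leonenko entropy estimator, which keeps k fixed and corrects the bias with digamma terms rather than letting k grow with n. A minimal sketch (an illustration of the general approach, not the paper's exact framework) using SciPy:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko fixed-k entropy estimate (in nats).

    x : (n, d) array of i.i.d. samples from an unknown continuous density.
    k : fixed number of nearest neighbors (does NOT grow with n).
    """
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each point to its k-th nearest neighbor,
    # querying k+1 neighbors to exclude the point itself.
    eps = tree.query(x, k=k + 1)[0][:, k]
    # log volume of the unit d-dimensional Euclidean ball
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # digamma(n) - digamma(k) is the fixed-k bias correction that
    # replaces the k -> infinity requirement of plug-in estimators.
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))
```

For a standard normal in one dimension the true entropy is (1/2) log(2*pi*e) ≈ 1.419 nats, and the estimate should be close for moderate n even with k as small as 3.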
Pages: 9