Dimensionality Reduction and Wasserstein Stability for Kernel Regression

Times Cited: 0
Authors
Eckstein, Stephan [1]
Iske, Armin [2]
Trabs, Mathias [3]
Affiliations
[1] Swiss Federal Institute of Technology, Department of Mathematics, Zurich, Switzerland
[2] University of Hamburg, Department of Mathematics, Hamburg, Germany
[3] Karlsruhe Institute of Technology, Department of Mathematics, Karlsruhe, Germany
Keywords
Kernel regression; Dimensionality reduction; Principal component analysis; Stability; Wasserstein distance; Optimal rates; Principal component; Learning rates; Regularization; Approximation; Robustness; Theorem
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In a high-dimensional regression framework, we study the consequences of the naive two-step procedure in which the dimension of the input variables is first reduced and the reduced input variables are then used to predict the output variable with kernel regression. To analyze the resulting regression errors, a novel stability result for kernel regression with respect to the Wasserstein distance is derived. It allows us to bound the errors that arise when perturbed input data are used to fit the regression function. We apply the general stability result to principal component analysis (PCA). Combining known estimates from the literature on principal component analysis and on kernel regression, we deduce convergence rates for the two-step procedure. The two-step procedure turns out to be particularly useful in a semi-supervised setting.
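The two-step procedure described in the abstract can be illustrated compactly. The following is a minimal sketch, not the authors' implementation: it uses scikit-learn's PCA and KernelRidge as stand-ins for the paper's kernel regression estimator, and the synthetic data, number of principal components, kernel choice, and regularization strength are all illustrative assumptions.

```python
# Sketch of the naive two-step procedure (assumed setup, not the paper's code):
# (1) reduce the input dimension with PCA,
# (2) fit a kernel (ridge) regression on the reduced inputs.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, d = 500, 50                                       # high-dimensional inputs
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)   # toy output variable

# Step 1 (dimensionality reduction) and Step 2 (kernel regression) chained together.
model = make_pipeline(
    PCA(n_components=5),                              # assumed number of components
    KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5), # assumed kernel and hyperparameters
)
model.fit(X, y)
print("Training MSE:", np.mean((model.predict(X) - y) ** 2))
```

In a semi-supervised variant, the PCA step could be fitted on a larger pool of unlabeled inputs before the kernel regression is trained on the labeled pairs; the pipeline above would then be split into its two stages.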
Pages: 35