Dimensionality Reduction and Wasserstein Stability for Kernel Regression

Cited by: 0
Authors:
Eckstein, Stephan [1]
Iske, Armin [2]
Trabs, Mathias [3]
Affiliations:
[1] Swiss Fed Inst Technol, Dept Math, Zurich, Switzerland
[2] Univ Hamburg, Dept Math, Hamburg, Germany
[3] Karlsruhe Inst Technol, Dept Math, Karlsruhe, Germany
Keywords:
Kernel regression; Dimensionality reduction; Principal component analysis; Stability; Wasserstein distance; Optimal rates; Principal component; Learning rates; Regularization; Approximation; Robustness; Theorem
DOI: not available
Chinese Library Classification: TP [Automation and Computer Technology]
Discipline code: 0812
Abstract:
In a high-dimensional regression framework, we study the consequences of the naive two-step procedure in which, first, the dimension of the input variables is reduced and, second, the reduced input variables are used to predict the output variable with kernel regression. In order to analyze the resulting regression errors, a novel stability result for kernel regression with respect to the Wasserstein distance is derived. This allows us to bound errors that occur when perturbed input data is used to fit the regression function. We apply the general stability result to principal component analysis (PCA). Exploiting known estimates from the literature on both principal component analysis and kernel regression, we deduce convergence rates for the two-step procedure. The latter turns out to be particularly useful in a semi-supervised setting.
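The two-step procedure described in the abstract can be illustrated with a minimal NumPy sketch: PCA via SVD for the dimensionality-reduction step, followed by kernel ridge regression on the reduced inputs. The RBF kernel, the bandwidth `gamma`, the ridge parameter `lam`, and the synthetic data are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def pca_reduce(X, k):
    # Step 1: center the inputs and project onto the top-k principal
    # components, obtained from the SVD of the centered data matrix.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T
    return Z / Z.std(axis=0)  # standardize scores so one bandwidth fits all axes

def rbf_kernel(A, B, gamma):
    # Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(Z, y, gamma, lam):
    # Step 2: kernel ridge regression -- solve (K + n*lam*I) alpha = y.
    n = len(Z)
    K = rbf_kernel(Z, Z, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

# Synthetic high-dimensional inputs whose signal lives near a 2-dim subspace.
rng = np.random.default_rng(0)
n, d, k = 200, 20, 2
latent = rng.normal(size=(n, k))
A = rng.normal(size=(k, d))
X = latent @ A + 0.01 * rng.normal(size=(n, d))   # inputs near a 2-dim subspace
y = np.sin(latent[:, 0]) + 0.1 * rng.normal(size=n)

Z = pca_reduce(X, k)                              # step 1: reduce dimension
alpha = kernel_ridge_fit(Z, y, gamma=0.5, lam=1e-3)  # step 2: kernel regression
y_hat = rbf_kernel(Z, Z, gamma=0.5) @ alpha       # in-sample predictions
mse = float(np.mean((y - y_hat) ** 2))
```

Since PCA recovers the latent subspace up to an invertible linear map, the target remains a smooth function of the reduced inputs, and the kernel fit on `Z` attains a small training error; the paper's stability result controls the additional error incurred by regressing on the perturbed (projected) inputs rather than the true ones.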
Pages: 35