Dimensionality Reduction and Wasserstein Stability for Kernel Regression

Cited: 0
Authors
Eckstein, Stephan [1 ]
Iske, Armin [2 ]
Trabs, Mathias [3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Math, Zurich, Switzerland
[2] Univ Hamburg, Dept Math, Hamburg, Germany
[3] Karlsruhe Inst Technol, Dept Math, Karlsruhe, Germany
Keywords
Kernel regression; Dimensionality reduction; Principal component analysis; Stability; Wasserstein distance; OPTIMAL RATES; PRINCIPAL COMPONENT; LEARNING RATES; REGULARIZATION; APPROXIMATION; ROBUSTNESS; THEOREM;
DOI
Not available
CLC number
TP [automation technology, computer technology];
Discipline code
0812
Abstract
In a high-dimensional regression framework, we study the consequences of the naive two-step procedure in which the dimension of the input variables is first reduced and the reduced input variables are then used to predict the output variable by kernel regression. To analyze the resulting regression errors, a novel stability result for kernel regression with respect to the Wasserstein distance is derived. This allows us to bound errors that occur when perturbed input data are used to fit the regression function. We apply the general stability result to principal component analysis (PCA). Exploiting known estimates from the literature on both principal component analysis and kernel regression, we deduce convergence rates for the two-step procedure. These rates turn out to be particularly useful in a semi-supervised setting.
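The two-step procedure described in the abstract can be sketched in a few lines: reduce the inputs with PCA, then run a kernel regression on the reduced inputs. The sketch below is a minimal illustration only, not the paper's method — the synthetic low-rank data, the Gaussian kernel, the bandwidth, and the use of kernel ridge regression as the kernel-regression estimator are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional inputs whose signal lives in a low-dimensional subspace
# (illustrative data, not from the paper).
n, d, k = 200, 20, 2
U = rng.normal(size=(n, k))                   # latent low-dimensional factors
W = rng.normal(size=(k, d))                   # embedding into d dimensions
X = U @ W + 0.05 * rng.normal(size=(n, d))    # observed inputs, slightly perturbed
y = np.sin(U[:, 0]) + 0.1 * rng.normal(size=n)

# Step 1: dimensionality reduction via PCA (top-k right singular vectors).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                             # reduced inputs

# Step 2: kernel (ridge) regression on the reduced inputs.
def gaussian_kernel(A, B, bandwidth=2.0):
    """Gaussian kernel matrix between row sets A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * bandwidth**2))

lam = 1e-2                                    # regularization strength (illustrative)
K = gaussian_kernel(Z, Z)
alpha = np.linalg.solve(K + lam * np.eye(n), y)
y_hat = K @ alpha                             # in-sample prediction

mse = float(np.mean((y_hat - y) ** 2))
```

Replacing `X` with a perturbed copy changes `Z` and hence the fitted regressor; the paper's Wasserstein stability result is what controls the size of that change, which this toy code does not attempt to quantify.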
Pages: 35