Dimensionality Reduction and Wasserstein Stability for Kernel Regression

Cited: 0
Authors
Eckstein, Stephan [1 ]
Iske, Armin [2 ]
Trabs, Mathias [3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Math, Zurich, Switzerland
[2] Univ Hamburg, Dept Math, Hamburg, Germany
[3] Karlsruhe Inst Technol, Dept Math, Karlsruhe, Germany
Keywords
Kernel regression; Dimensionality reduction; Principal component analysis; Stability; Wasserstein distance; OPTIMAL RATES; PRINCIPAL COMPONENT; LEARNING RATES; REGULARIZATION; APPROXIMATION; ROBUSTNESS; THEOREM;
DOI: not available
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
In a high-dimensional regression framework, we study the consequences of the naive two-step procedure in which the dimension of the input variables is first reduced and the reduced input variables are then used to predict the output variable with kernel regression. To analyze the resulting regression errors, we derive a novel stability result for kernel regression with respect to the Wasserstein distance. This allows us to bound the errors that arise when perturbed input data are used to fit the regression function. We apply the general stability result to principal component analysis (PCA). Exploiting known estimates from the literature on both principal component analysis and kernel regression, we deduce convergence rates for the two-step procedure, which turn out to be particularly useful in a semi-supervised setting.
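As a minimal illustration of the two-step procedure described in the abstract, the following Python sketch first reduces the inputs with empirical PCA and then fits a Gaussian-kernel ridge regression on the reduced inputs. This is not the paper's implementation; the toy data, bandwidth, and regularization choices are hypothetical.

```python
import numpy as np

def pca_reduce(X, k):
    """Step 1: project X onto its top-k empirical principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * bandwidth ** 2))

def kernel_ridge_fit_predict(Z_train, y_train, Z_test, lam=1e-3):
    """Step 2: kernel ridge regression, alpha = (K + n*lam*I)^{-1} y."""
    n = len(Z_train)
    K = gaussian_kernel(Z_train, Z_train)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return gaussian_kernel(Z_test, Z_train) @ alpha

# Toy data: the output depends only on the dominant direction of a
# 10-dimensional input, so a one-dimensional PCA reduction suffices.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X[:, 0] *= 5.0                 # make the first coordinate dominate the variance
y = np.sin(X[:, 0])

Z = pca_reduce(X, k=1)                                       # step 1
y_hat = kernel_ridge_fit_predict(Z[:150], y[:150], Z[150:])  # step 2
```

The stability question studied in the paper arises exactly here: the regression in step 2 is fitted to the perturbed inputs `Z` rather than to `X`, and the Wasserstein distance between the two input distributions controls the additional regression error.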
Pages: 35
Related Papers (showing items [41]-[50] of 50)
  • [41] Kernel based nonlinear dimensionality reduction and classification for genomic microarray
    Li, Xuehua
    Shu, Lan
    SENSORS, 2008, 8 (07) : 4186 - 4200
  • [42] Four algorithms to construct a sparse kriging kernel for dimensionality reduction
    Blanchet-Scalliet, Christophette
    Helbert, Celine
    Ribaud, Melina
    Vial, Celine
    COMPUTATIONAL STATISTICS, 2019, 34 (04) : 1889 - 1909
  • [43] Dimensionality Reduction and Bandwidth Selection for Spatial Kernel Discriminant Analysis
    Boumeddane, Soumia
    Hamdad, Leila
    Haddadou, Hamid
    Dabo-Niang, Sophie
    ICAART: PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 2, 2021, : 278 - 285
  • [44] EARLY FAILURE PREDICTION IN SOFTWARE PROGRAMS: DIMENSIONALITY REDUCTION KERNEL
    Naree, Somaye Arabi
    Parsa, Saeed
    COMPUTING AND INFORMATICS, 2016, 35 (05) : 1110 - 1140
  • [45] Visualization of Regression Models Using Discriminative Dimensionality Reduction
    Schulz, Alexander
    Hammer, Barbara
    COMPUTER ANALYSIS OF IMAGES AND PATTERNS, CAIP 2015, PT II, 2015, 9257 : 437 - 449
  • [46] Supervised Dimensionality Reduction Methods via Recursive Regression
    Liu, Yun
    Zhang, Rui
    Nie, Feiping
    Li, Xuelong
    Ding, Chris
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (09) : 3269 - 3279
  • [47] Compressed Spectral Regression for Efficient Nonlinear Dimensionality Reduction
    Cai, Deng
    PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 3359 - 3365
  • [48] Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces
    Fukumizu, K
    Bach, FR
    Jordan, MI
    JOURNAL OF MACHINE LEARNING RESEARCH, 2004, 5 : 73 - 99
  • [49] Kernel-based nonlinear dimensionality reduction for electrocardiogram recognition
    Li, Xuehua
    Shu, Lan
    Hu, Hongli
    NEURAL COMPUTING AND APPLICATIONS, 2009, 18 : 1013 - 1020
  • [50] Dimensionality reduction using kernel pooled local discriminant information
    Zhang, P
    Peng, J
    Domeniconi, C
    THIRD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2003, : 701 - 704