Local Dimensionality Reduction for Non-Parametric Regression

Cited: 24
Authors
Hoffmann, Heiko [1 ,2 ]
Schaal, Stefan [1 ]
Vijayakumar, Sethu [2 ]
Affiliations
[1] Univ So Calif, Los Angeles, CA 90089 USA
[2] Univ Edinburgh, IPAB, Sch Informat, Edinburgh EH9 3JZ, Midlothian, Scotland
Keywords
Correlation; Dimensionality reduction; Factor analysis; Incremental learning; Kernel function; Locally-weighted regression; Partial least squares; Principal component analysis; Principal component regression; Reduced-rank regression; Models
DOI
10.1007/s11063-009-9098-0
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Locally-weighted regression is a computationally efficient technique for non-linear regression. For high-dimensional data, however, it becomes numerically brittle and computationally too expensive if many local models must be maintained simultaneously. Local linear dimensionality reduction combined with locally-weighted regression therefore appears to be a promising solution. In this context, we review linear dimensionality-reduction methods, compare their performance on non-parametric locally-linear regression, and discuss their ability to extend to incremental learning. The considered methods fall into three groups: (1) reducing dimensionality only on the input data, (2) modeling the joint input-output data distribution, and (3) optimizing the correlation between projection directions and output data. Group 1 contains principal component regression (PCR); group 2 contains principal component analysis (PCA) in joint input and output space, factor analysis, and probabilistic PCA; and group 3 contains reduced-rank regression (RRR) and partial least squares (PLS) regression. Among the tested methods, only those in group 3 achieved robust performance even for a non-optimal number of components (factors or projection directions). In contrast, groups 1 and 2 failed when given fewer components than the true intrinsic dimensionality, since these methods rely on a correct estimate of that dimensionality. Within group 3, PLS is the only method for which a computationally efficient incremental implementation exists. Thus, PLS appears ideally suited as a building block for a locally-weighted regressor in which projection directions are added incrementally on the fly.
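To make the group-3 mechanism concrete, below is a minimal sketch of single-output partial least squares (PLS1) using the standard NIPALS-style deflation scheme. It illustrates the textbook algorithm, not the authors' implementation; the function names (pls1_fit, pls1_predict) and the assumption that inputs and output are already mean-centered are ours.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    # Minimal single-output PLS (PLS1) via NIPALS-style deflation.
    # Assumes X (n x d) and y (n,) are already mean-centered.
    Xk, yk = X.astype(float), y.astype(float)
    W, P, b = [], [], []           # weight vectors, input loadings, coefficients
    for _ in range(n_components):
        w = Xk.T @ yk              # direction maximizing input-output covariance
        w /= np.linalg.norm(w)
        t = Xk @ w                 # score: data projected onto w
        tt = float(t @ t)
        p = Xk.T @ t / tt          # input loading used for deflation
        c = float(yk @ t) / tt     # univariate regression of y on the score
        Xk = Xk - np.outer(t, p)   # deflate inputs ...
        yk = yk - c * t            # ... and output, then repeat
        W.append(w); P.append(p); b.append(c)
    return np.column_stack(W), np.column_stack(P), np.array(b)

def pls1_predict(Xq, W, P, b):
    # Predict for mean-centered queries by projecting and deflating
    # with the stored directions, mirroring the fit loop.
    Xk = Xq.astype(float)
    yhat = np.zeros(len(Xk))
    for k in range(W.shape[1]):
        t = Xk @ W[:, k]
        yhat += b[k] * t
        Xk = Xk - np.outer(t, P[:, k])
    return yhat
```

Note how each pass of the loop extracts one projection direction and then deflates the data, so further components can be appended without recomputing earlier ones; this is the property that makes PLS amenable to the incremental, on-the-fly use described in the abstract.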
Pages: 109-131
Page count: 23
Related Papers
50 records in total
  • [1] Local Dimensionality Reduction for Non-Parametric Regression
    Heiko Hoffmann
    Stefan Schaal
    Sethu Vijayakumar
    Neural Processing Letters, 2009, 29
  • [2] Dimensionality reduction based on non-parametric mutual information
    Faivishevsky, Lev
    Goldberger, Jacob
    Neurocomputing, 2012, 80: 31-37
  • [3] Bias and variance reduction procedures in non-parametric regression
    Cockeran, Marike
    Swanepoel, Cornelia J.
    South African Statistical Journal, 2016, 50 (01): 123-148
  • [4] Non-parametric Regression Tests Using Dimension Reduction Techniques
    Haag, Berthold R.
    Scandinavian Journal of Statistics, 2008, 35 (04): 719-738
  • [5] Non-parametric regression for networks
    Severn, Katie E.
    Dryden, Ian L.
    Preston, Simon P.
    Stat, 2021, 10 (01)
  • [6] Gradient-based explanation for non-linear non-parametric dimensionality reduction
    Corbugy, Sacha
    Marion, Rebecca
    Frenay, Benoit
    Data Mining and Knowledge Discovery, 2024
  • [7] Non-parametric regression methods
    Ince, H.
    Computational Management Science, 2006, 3 (2): 161-174
  • [8] A non-parametric dimensionality reduction technique using gradient descent of misclassification rate
    Redmond, S.
    Heneghan, C.
    Pattern Recognition and Image Analysis, Pt 2, Proceedings, 2005, 3687: 155-164
  • [9] A note on combining parametric and non-parametric regression
    Rahman, M.
    Gokhale, D. V.
    Ullah, A.
    Communications in Statistics - Simulation and Computation, 1997, 26 (02): 519-529
  • [10] Local Linear M-estimation in non-parametric spatial regression
    Lin, Zhengyan
    Li, Degui
    Gao, Jiti
    Journal of Time Series Analysis, 2009, 30 (03): 286-314