Orthogonal least squares regression for feature extraction

Cited by: 42
Authors:
Zhao, Haifeng [1,2]
Wang, Zheng [1,2]
Nie, Feiping [3,4]
Affiliations:
[1] Anhui Univ, MOE, Key Lab Intelligent Comp & Signal Proc, Hefei 230039, Peoples R China
[2] Anhui Univ, Sch Comp & Technol, Hefei 230039, Peoples R China
[3] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shanxi, Peoples R China
[4] Northwestern Polytech Univ, Ctr OPT IMagery Anal & Learning OPTIMAL, Xian 710072, Shanxi, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Feature extraction; Least squares regression; Orthogonal constraint; Unbalanced orthogonal procrustes problem; DIMENSIONALITY REDUCTION; EFFICIENT;
DOI:
10.1016/j.neucom.2016.07.037
CLC number:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
In many data mining applications, dimensionality reduction is a primary technique for mapping high-dimensional data to a lower-dimensional space. To preserve more local structure information, we propose a novel orthogonal least squares regression model for feature extraction. The main contributions of this paper are as follows. First, the new least squares regression method is formulated under an orthogonal constraint, which preserves more discriminant information in the subspace. Second, whereas the optimization problem of classical least squares regression is easy to solve, the proposed objective function is an unbalanced orthogonal Procrustes problem that is difficult to solve directly; we therefore present a novel iterative optimization algorithm to obtain the optimal solution. Third, we provide a proof of convergence for the iterative algorithm. In addition, experimental results show that the iterative algorithm reaches a global optimal solution even though the optimization problem is non-convex. Both theoretical analysis and empirical studies demonstrate that our method reduces data dimensionality more effectively than conventional methods. (C) 2016 Elsevier B.V. All rights reserved.
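To make the optimization concrete, here is a minimal NumPy sketch, assuming the data-matrix convention X ∈ R^{d×n}, targets Y ∈ R^{n×c}, and the model min_{W,b} ||X^T W + 1b^T - Y||_F^2 subject to W^T W = I described in the abstract. The optimal bias centers the data, and the remaining unbalanced orthogonal Procrustes problem is handled here by a generic majorize-minimize loop whose inner step is a balanced Procrustes problem solved by SVD. The function name olsr_fit and this particular MM update are illustrative assumptions, not necessarily the authors' exact iteration.

```python
import numpy as np

def olsr_fit(X, Y, n_iter=200, tol=1e-9, seed=None):
    """Hypothetical sketch of orthogonal least squares regression:
        min_{W, b} ||X.T @ W + 1 b.T - Y||_F^2   s.t.  W.T @ W = I,
    with X of shape (d, n), Y of shape (n, c), and d > c, so the
    W-subproblem is an unbalanced orthogonal Procrustes problem.
    Solved by a generic majorize-minimize loop (one standard scheme,
    not claimed to be the paper's algorithm)."""
    d, n = X.shape
    c = Y.shape[1]
    rng = np.random.default_rng(seed)
    # The optimal b makes both sides zero-mean, so center the data.
    Xc = X - X.mean(axis=1, keepdims=True)
    Yc = Y - Y.mean(axis=0, keepdims=True)
    S = Xc @ Xc.T                          # d x d scatter matrix
    lam = np.linalg.eigvalsh(S)[-1]        # largest eigenvalue, for the majorizer
    G = Xc @ Yc                            # d x c linear term
    # Random orthonormal initialization via thin QR.
    W, _ = np.linalg.qr(rng.standard_normal((d, c)))
    prev = np.inf
    for _ in range(n_iter):
        # MM step: since S - lam*I is negative semidefinite and
        # tr(W.T W) = c is fixed, the majorizer is minimized by
        # maximizing tr(W.T M), a balanced Procrustes problem.
        M = G - (S - lam * np.eye(d)) @ W
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W = U @ Vt                         # orthonormal columns by construction
        obj = np.linalg.norm(Xc.T @ W - Yc) ** 2
        if prev - obj < tol:               # monotone decrease; stop when flat
            break
        prev = obj
    b = (Y - X.T @ W).mean(axis=0)         # recover the bias
    return W, b
```

For example, with X of shape (d, n) and one-hot labels Y of shape (n, c), olsr_fit(X, Y) returns a d x c projection with orthonormal columns plus a bias vector; each MM step provably does not increase the objective, which matches the convergent-iteration behavior the abstract describes.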
Pages: 200 - 207
Page count: 8
Related papers (50 records in total):
  • [31] Efficient Sparse Kernel Feature Extraction Based on Partial Least Squares
    Dhanjal, Charanpal
    Gunn, Steve R.
    Shawe-Taylor, John
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2009, 31 (08) : 1347 - 1361
  • [32] Output relevant slow feature extraction using partial least squares
    Chiplunkar, Ranjith
    Huang, Biao
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2019, 191 : 148 - 157
  • [33] Novel Kernel Orthogonal Partial Least Squares for Dominant Sensor Data Extraction
    Chen, Bo-Wei
    IEEE ACCESS, 2020, 8 (08): 36131 - 36139
  • [34] Least Median of Squares Regression
    Rousseeuw, P. J.
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1984, 79 (388) : 871 - 880
  • [35] Least Squares Percentage Regression
    Tofallis, Chris
    JOURNAL OF MODERN APPLIED STATISTICAL METHODS, 2008, 7 (02) : 526 - 534
  • [36] Partial least squares regression
    de Jong, S.
    Phatak, A.
    RECENT ADVANCES IN TOTAL LEAST SQUARES TECHNIQUES AND ERRORS-IN-VARIABLES MODELING, 1997, : 25 - 36
  • [37] Linear Least Squares Regression
    Watson, G. S.
    ANNALS OF MATHEMATICAL STATISTICS, 1967, 38 (06): 1679
  • [38] Feature Selection using Partial Least Squares Regression and Optimal Experiment Design
    Nagaraja, Varun K.
    Abd-Almageed, Wael
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [39] Multiclass Classification and Feature Selection Based on Least Squares Regression with Large Margin
    Zhao, Haifeng
    Wang, Siqi
    Wang, Zheng
    NEURAL COMPUTATION, 2018, 30 (10) : 2781 - 2804
  • [40] Perturbation Analysis of Orthogonal Least Squares
    Geng, Pengbo
    Chen, Wengu
    Ge, Huanmin
    CANADIAN MATHEMATICAL BULLETIN-BULLETIN CANADIEN DE MATHEMATIQUES, 2019, 62 (04): : 780 - 797