A generalized least-squares approach regularized with graph embedding for dimensionality reduction

Times Cited: 44
Authors
Shen, Xiang-Jun [1 ]
Liu, Si-Xing [1 ]
Bao, Bing-Kun [2 ]
Pan, Chun-Hong [3 ]
Zha, Zheng-Jun [4 ]
Fan, Jianping [5 ]
Affiliations
[1] JiangSu Univ, Sch Comp Sci & Commun Engn, Nanjing 212013, Jiangsu, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Nanjing, Jiangsu, Peoples R China
[3] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
[4] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei, Anhui, Peoples R China
[5] Univ N Carolina, Dept Comp Sci, Charlotte, NC 28223 USA
Funding
National Natural Science Foundation of China;
Keywords
Dimensionality reduction; Graph embedding; Subspace learning; Least-squares; PRESERVING PROJECTIONS; EIGENMAPS; FRAMEWORK;
DOI
10.1016/j.patcog.2019.107023
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In current graph embedding methods, low-dimensional projections are obtained by preserving either the global or the local geometrical structure of the data. In this paper, the PCA (Principal Component Analysis) idea of minimizing least-squares reconstruction errors is regularized with graph embedding, unifying various local manifold embedding methods within a generalized framework that preserves both global and local low-dimensional subspaces. Unlike standard PCA, the proposed generalized least-squares approach models the data distribution together with an instance penalty on each data point; PCA is thus viewed as a special instance of the framework that preserves global projections. By adding a graph embedding regularizer, the framework yields projections that preserve both the intrinsic local geometrical structure and the global structure of the data. Experimental results on a variety of face and handwritten digit recognition tasks show that the proposed method attains lower-dimensional subspaces and higher classification accuracy than state-of-the-art graph embedding methods. (C) 2019 Elsevier Ltd. All rights reserved.
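To make the stated objective concrete, the following is a minimal sketch of one standard way a PCA-style least-squares reconstruction criterion can be combined with a graph-embedding (graph-Laplacian) regularizer. The function name graph_regularized_pca, the trade-off parameter lam, and the eigendecomposition-based solver are illustrative assumptions, not the paper's exact formulation; in particular, the sketch omits the per-instance penalty mentioned in the abstract.

```python
import numpy as np

def graph_regularized_pca(X, L, n_components, lam=1.0):
    """Hypothetical sketch of a graph-regularized least-squares projection.

    X            : (d, n) matrix of centered samples (one sample per column).
    L            : (n, n) graph Laplacian of a neighborhood graph on the samples.
    n_components : target dimensionality.
    lam          : trade-off between global (PCA) and local (graph) structure.

    Minimizing ||X - W W^T X||_F^2 + lam * tr(W^T X L X^T W) over orthonormal W
    is equivalent to maximizing tr(W^T (X X^T - lam * X L X^T) W), so W is given
    by the top eigenvectors of M = X X^T - lam * X L X^T.
    """
    M = X @ X.T - lam * (X @ L @ X.T)
    M = (M + M.T) / 2.0                      # symmetrize to guard against round-off
    eigvals, eigvecs = np.linalg.eigh(M)     # eigenvalues returned in ascending order
    W = eigvecs[:, ::-1][:, :n_components]   # top eigenvectors form the projection
    return W                                 # low-dimensional codes: W.T @ X
```

For example, with a k-nearest-neighbor graph and its Laplacian L, W = graph_regularized_pca(X, L, 30) would give a 30-dimensional projection; setting lam = 0 reduces the objective to ordinary PCA on centered data, consistent with the abstract's claim that PCA is a special instance of the framework.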
Pages: 10