Principal Component Analysis on Graph-Hessian

Cited: 0
Authors
Pan, Yichen [1 ]
Liu, Weifeng [1 ]
Zhou, Yicong [2 ]
Nie, Liqiang [3 ]
Affiliations
[1] China Univ Petr East China, Coll Control Sci & Engn, Qingdao, Peoples R China
[2] Univ Macau, Fac Sci & Technol, Macau, Peoples R China
[3] Shandong Univ, Sch Comp Sci & Technol, Qingdao, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
dimensionality reduction; principal component analysis; manifold learning; graph; Hessian regularization; PCA;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Principal Component Analysis (PCA) is a widely used linear dimensionality reduction method that assumes the data are drawn from a low-dimensional affine subspace of a high-dimensional space. However, it uses only the feature information of the samples. By exploiting the structural information of the data and embedding it into the PCA framework, the local positional relationships between samples in the original space can be preserved, improving the performance of downstream tasks built on PCA. In this paper, we introduce Hessian regularization into PCA and propose a new model called Graph-Hessian Principal Component Analysis (GHPCA). The Hessian correctly exploits the intrinsic local geometry of the data manifold and better maintains the neighborhood relationships between data points in high-dimensional space. Compared with other Laplacian-based models, our model retains richer structural information after dimensionality reduction and better recovers low-dimensional structure. K-means clustering experiments on the USPS handwritten digit dataset, the YALE face dataset, and the COIL20 object image dataset, in comparison with PCA, GLPCA, RPCA, and RPCAG, show that our model outperforms these principal component analysis models on clustering tasks.
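As context for the abstract, the sketch below shows the classical PCA baseline that GHPCA extends: project mean-centered data onto its top-k principal directions, recovered here via SVD. This is a minimal NumPy illustration on synthetic data (the data-generation setup and the function name `pca` are our own, not from the paper); GHPCA additionally penalizes the projection with a graph-based Hessian energy term, which is not reproduced here.

```python
import numpy as np

def pca(X, k):
    """Classical PCA: project X (n_samples x n_features) onto the
    top-k principal components of the mean-centered data."""
    Xc = X - X.mean(axis=0)                     # center the data
    # SVD of the centered data; rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                                # (n_features x k) projection matrix
    return Xc @ W, W

rng = np.random.default_rng(0)
# Synthetic data lying near a 2-D affine subspace of R^5, matching
# PCA's low-dimensional affine-subspace assumption
Z = rng.normal(size=(100, 2))
A = rng.normal(size=(2, 5))
X = Z @ A + 0.01 * rng.normal(size=(100, 5))    # small isotropic noise

Y, W = pca(X, 2)
# Fraction of total (centered) variance captured by the 2-D projection
total = ((X - X.mean(axis=0)) ** 2).sum()
captured = (Y ** 2).sum()
```

Because the data are nearly two-dimensional, the 2-D projection captures almost all of the variance; graph-regularized variants such as GHPCA modify the objective so that the projection also preserves local neighborhood structure.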
Pages: 1494-1501 (8 pages)