Sparse data-dependent kernel principal component analysis based on least squares support vector machine for feature extraction and recognition

Cited by: 11
Authors
Li, Jun-Bao [1 ]
Gao, Huijun [2 ]
Affiliations
[1] Harbin Inst Technol, Dept Automat Test & Control, Harbin 150001, Peoples R China
[2] Harbin Inst Technol, Dept Control Sci & Engn, Harbin 150001, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2012, Vol. 21, No. 8
Funding
U.S. National Science Foundation;
Keywords
Kernel method; Kernel principal component analysis; Sparse learning; Data-dependent kernel function; Feature extraction; Computation efficiency;
DOI
10.1007/s00521-011-0600-z
CLC (Chinese Library Classification) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Kernel learning is widely used in many areas, and many methods have been developed. Kernel principal component analysis (KPCA), a well-known kernel learning method, suffers from two problems in practical applications. First, all training samples must be stored in order to compute the kernel matrix during kernel learning. Second, the kernel function and its parameters heavily influence the performance of kernel learning. To address these problems, we present a novel kernel learning method, sparse data-dependent kernel principal component analysis, which reduces the training samples with a sparse learning-based least squares support vector machine and adaptively self-optimizes the kernel structure according to the input training samples. Experimental results on UCI datasets, the ORL and YALE face databases, and the Wisconsin Breast Cancer database show that the proposed method improves KPCA in terms of storage savings and kernel structure optimization.
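The abstract describes two ingredients: pruning the training set with a sparse least squares SVM and then running KPCA on the reduced sample set. The sketch below is not the authors' code; it is a minimal illustration of that pipeline, assuming an RBF base kernel, the common linear-system form of LS-SVM, and magnitude-based pruning of the LS-SVM coefficients as a stand-in for the paper's sparse learning and data-dependent kernel optimization steps. The helper names (select_sparse_subset, kpca_on_subset) and all parameter values are hypothetical.

# Minimal sketch (not the authors' code): reduced-set KPCA in the spirit of the abstract.
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """RBF kernel matrix between rows of A and rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def lssvm_coefficients(X, y, gamma_k=0.5, C=10.0):
    """Solve the standard LS-SVM linear system; returns one alpha per sample."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma_k)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:]

def select_sparse_subset(X, y, keep_ratio=0.3, gamma_k=0.5, C=10.0):
    """Keep the samples with the largest |alpha| (classical LS-SVM pruning)."""
    alpha = lssvm_coefficients(X, y, gamma_k, C)
    m = max(1, int(keep_ratio * X.shape[0]))
    idx = np.argsort(-np.abs(alpha))[:m]
    return X[idx]

def kpca_on_subset(X_all, X_subset, n_components=2, gamma_k=0.5):
    """KPCA whose kernel matrix is built only on the reduced subset."""
    K = rbf_kernel(X_subset, X_subset, gamma_k)
    m = K.shape[0]
    one = np.ones((m, m)) / m
    Kc = K - one @ K - K @ one + one @ K @ one        # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(-vals)[:n_components]
    vals, vecs = vals[order], vecs[:, order]
    vecs = vecs / np.sqrt(np.maximum(vals, 1e-12))    # normalize eigenvectors
    # Project arbitrary data using kernels against the retained subset only.
    K_new = rbf_kernel(X_all, X_subset, gamma_k)
    K_new_c = K_new - K_new.mean(1, keepdims=True) - K.mean(0) + K.mean()
    return K_new_c @ vecs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))
    X_sub = select_sparse_subset(X, y, keep_ratio=0.3)    # store ~30% of samples
    Z = kpca_on_subset(X, X_sub, n_components=2)
    print(X_sub.shape, Z.shape)                           # (60, 10) (200, 2)

In this sketch only the retained subset must be stored to project new data, which illustrates the storage saving the abstract refers to; the paper additionally optimizes a data-dependent kernel structure, which is not reproduced here.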
Pages: 1971-1980
Number of pages: 10
Related Papers
50 records in total
  • [1] Sparse data-dependent kernel principal component analysis based on least squares support vector machine for feature extraction and recognition
    Jun-Bao Li
    Huijun Gao
    [J]. Neural Computing and Applications, 2012, 21 : 1971 - 1980
  • [2] Face Recognition with Kernel Principal Component Analysis and Support Vector Machine
    Liliana, Dewi Yanti
    Setiawan, I. Made Agus
    [J]. 2019 INTERNATIONAL CONFERENCE ON INFORMATICS, MULTIMEDIA, CYBER AND INFORMATION SYSTEM (ICIMCIS), 2019, : 175 - 179
  • [3] Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters
    Yang, Chaoyu
    Yang, Jie
    Ma, Jun
    [J]. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2020, 13 (01) : 212 - 222
  • [4] Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters
    Chaoyu Yang
    Jie Yang
    Jun Ma
    [J]. International Journal of Computational Intelligence Systems, 2020, 13 : 212 - 222
  • [5] Principal Composite Kernel Feature Analysis: Data-Dependent Kernel Approach
    Motai, Yuichi
    Yoshida, Hiroyuki
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2013, 25 (08) : 1863 - 1875
  • [6] Speech Emotion Recognition Based on Kernel Principal Component Analysis and Optimized Support Vector Machine
    Chen, Chuang
    Chellali, Ryad
    Xing, Yin
    [J]. 2018 EIGHTH INTERNATIONAL CONFERENCE ON INSTRUMENTATION AND MEASUREMENT, COMPUTER, COMMUNICATION AND CONTROL (IMCCC 2018), 2018, : 751 - 755
  • [7] Tool wear prediction based on kernel principal component analysis and least square support vector machine
    Gao, Kangping
    Xu, Xinxin
    Jiao, Shengjie
    [J]. MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (10)
  • [8] Efficient Sparse Kernel Feature Extraction Based on Partial Least Squares
    Dhanjal, Charanpal
    Gunn, Steve R.
    Shawe-Taylor, John
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2009, 31 (08) : 1347 - 1361
  • [9] Least Squares Support Vector Machine Regression Based on Sparse Samples and Mixture Kernel Learning
    Ma, Wenlu
    Liu, Han
    [J]. INFORMATION TECHNOLOGY AND CONTROL, 2021, 50 (02): : 319 - 331
  • [10] Sparse Principal Component Analysis Based on Least Trimmed Squares
    Wang, Yixin
    Van Aelst, Stefan
    [J]. TECHNOMETRICS, 2020, 62 (04) : 473 - 485