An empirical study of dimensionality reduction in support vector machine

Cited by: 0
Authors
Cao, L. J.
Zhang, JingQing
Cai, Zongwu
Lim, Kian Guan
Affiliations
[1] Fudan Univ, Shanghai 200433, Peoples R China
[2] Univ N Carolina, Dept Math & Stat, Charlotte, NC 28223 USA
[3] Singapore Management Univ, Dept Business, Singapore 259756, Singapore
Keywords
support vector machines; principal component analysis; kernel principal component analysis; independent component analysis
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recently, the support vector machine (SVM) has become a popular tool in time series forecasting. In developing a successful SVM forecaster, the first step is feature extraction. This paper proposes applying principal component analysis (PCA), kernel principal component analysis (KPCA) and independent component analysis (ICA) to SVM for feature extraction. PCA linearly transforms the original inputs into new, uncorrelated features. KPCA is a nonlinear extension of PCA developed using the kernel method. In ICA, the original inputs are linearly transformed into features that are mutually statistically independent. Experiments on the sunspot data, the Santa Fe data set A and five real futures contracts show that an SVM with feature extraction by PCA, KPCA or ICA performs better than one without feature extraction. Furthermore, among the three methods, KPCA feature extraction gives the best performance, followed by ICA.
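As a rough illustration of the workflow the abstract describes, the sketch below compares an SVM regressor trained on raw lagged inputs against the same regressor preceded by PCA, KPCA or ICA feature extraction, using scikit-learn. The synthetic periodic series (a stand-in for data such as the sunspot numbers), the window length, the number of components and the SVR hyperparameters are all illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def make_windows(series, lag=12):
    """Turn a 1-D series into (lagged-input, next-value) pairs."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

# Synthetic stand-in for a real series such as the sunspot numbers.
rng = np.random.default_rng(0)
t = np.arange(400, dtype=float)
series = np.sin(2 * np.pi * t / 11) + 0.1 * rng.standard_normal(t.size)

X, y = make_windows(series, lag=12)
X_train, X_test = X[:300], X[300:]
y_train, y_test = y[:300], y[300:]

# One SVM regressor per feature-extraction scheme, plus a raw baseline.
extractors = {
    "none": None,                                            # no feature extraction
    "PCA":  PCA(n_components=5),                             # linear, uncorrelated features
    "KPCA": KernelPCA(n_components=5, kernel="rbf", gamma=0.1),  # nonlinear PCA via the kernel trick
    "ICA":  FastICA(n_components=5, random_state=0),         # statistically independent features
}
for name, ext in extractors.items():
    steps = [StandardScaler()] + ([ext] if ext is not None else []) + [SVR(kernel="rbf", C=10.0)]
    model = make_pipeline(*steps)
    model.fit(X_train, y_train)
    mse = np.mean((model.predict(X_test) - y_test) ** 2)
    print(f"{name:>4s}: test MSE = {mse:.4f}")
```

The same lagged-window construction applies to any of the univariate series mentioned in the abstract; only the extractor in the pipeline changes between the three methods being compared.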
Pages: 177-192
Page count: 16
Related Papers
50 records in total
  • [1] An empirical study of dimensionality reduction in support vector machine
    Financial Studies of Fudan University, Handan Road, Shanghai 200433, China
    [J]. Neural Network World, 2006, 3: 177-192
  • [2] Dimensionality Reduction by Soft-Margin Support Vector Machine
    Dong, Ruipeng
    Meng, Hua
    Long, Zhiguo
    Zhao, Hailiang
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON AGENTS (ICA), 2017, : 154 - 156
  • [3] A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine
    Cao, LJ
    Chua, KS
    Chong, WK
    Lee, HP
    Gu, QM
    [J]. NEUROCOMPUTING, 2003, 55 (1-2) : 321 - 336
  • [4] Support Vector Machines for Visualization and Dimensionality Reduction
    Maszczyk, Tomasz
    Duch, Wlodzislaw
    [J]. ARTIFICIAL NEURAL NETWORKS - ICANN 2008, PT I, 2008, 5163 : 346 - 356
  • [5] Cascade Support Vector Machines with Dimensionality Reduction
    Kramer, Oliver
    [J]. APPLIED COMPUTATIONAL INTELLIGENCE AND SOFT COMPUTING, 2015, 2015
  • [6] Recursive support vector machines for dimensionality reduction
    Tao, Qing
    Chu, Dejun
    Wang, Jue
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2008, 19 (01): 189 - 193
  • [7] Hybrid Dimensionality Reduction Method Based on Support Vector Machine and Independent Component Analysis
    Moon, Sangwoo
    Qi, Hairong
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (05) : 749 - 761
  • [8] Support vector machine based intrusion detection method combined with nonlinear dimensionality reduction algorithm
    Li, Xiaoping
    [J]. Sensors and Transducers, 2013, 159 (11): 226 - 229
  • [9] Bit reduction support vector machine
    Luo, T
    Hall, LO
    Goldgof, DB
    Remsen, A
    [J]. FIFTH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2005, : 733 - 736
  • [10] A study on imbalance support vector machine algorithms for sufficient dimension reduction
    Smallman, Luke
    Artemiou, Andreas
    [J]. COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2017, 46 (06) : 2751 - 2763