Least squares support vector machines with tuning based on chaotic differential evolution approach applied to the identification of a thermal process

Cited: 45
Authors
dos Santos, Glauber Souto [2 ]
Justi Luvizotto, Luiz Guilherme [3 ]
Mariani, Viviana Cocco [3 ]
Coelho, Leandro dos Santos [1 ]
Affiliations
[1] Pontificia Univ Catolica Parana, Ind & Syst Engn Grad Program, PPGEPS, PUCPR, BR-80215901 Curitiba, Parana, Brazil
[2] Pontificia Univ Catolica Parana, Mechatron Engn Undergrad Program, PUCPR, BR-80215901 Curitiba, Parana, Brazil
[3] Pontificia Univ Catolica Parana, Mech Engn Grad Program, PPGEM, PUCPR, BR-80215901 Curitiba, Parana, Brazil
Keywords
Least squares support vector machines; Chaotic differential evolution; Identification; PARTICLE SWARM OPTIMIZATION; SVM; ALGORITHM; REGRESSION; PATTERN; CLASSIFICATION; DIAGNOSIS;
D O I
10.1016/j.eswa.2011.09.137
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the past decade, support vector machines (SVMs) have gained the attention of many researchers. SVMs are non-parametric supervised learning schemes that rely on statistical learning theory, which enables learning machines to generalize well to unseen data. SVMs are kernel-based methods that were introduced as a robust approach to classification and regression problems and have lately been applied to nonlinear identification problems, in the form of so-called support vector regression. In SVM designs for nonlinear identification, a nonlinear model is represented by an expansion in terms of nonlinear mappings of the model input. These nonlinear mappings define a feature space, which may have infinite dimension. In this context, a relevant identification approach is least squares support vector machines (LS-SVMs). Compared with other identification methods, LS-SVMs possess prominent advantages: their generalization performance (i.e., error rates on test sets) either matches or significantly exceeds that of competing methods, and, more importantly, this performance does not depend on the dimensionality of the input data. Formulated as a constrained quadratic programming problem with a regularized cost function, the training of an LS-SVM involves the selection of the kernel parameters and the regularization parameter of the objective function. A good choice of these parameters is crucial for the performance of the estimator. In this paper, the proposed LS-SVM design combines the LS-SVM with a new chaotic differential evolution optimization approach based on the Ikeda map (CDEK). The CDEK is adopted to tune the regularization parameter and the radial basis function bandwidth. Simulations using LS-SVMs on a NARX (Nonlinear AutoRegressive with eXogenous inputs) structure for the identification of a thermal process show the effectiveness and practicality of the proposed CDEK algorithm when compared with the classical DE approach. (C) 2011 Published by Elsevier Ltd.
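The approach described in the abstract can be sketched in a few dozen lines: an LS-SVM regressor (training reduces to one linear system) whose regularization parameter `gamma` and RBF bandwidth `sigma` are tuned by differential evolution, with the fixed mutation factor replaced by a rescaled Ikeda-map sequence. This is a minimal illustrative sketch, not the paper's exact algorithm: the search bounds, the hold-out fitness criterion, and the way the Ikeda x-component is mapped into DE's mutation factor are all assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    """RBF (Gaussian) kernel matrix between row sets X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / sigma ** 2)

def lssvm_fit(X, y, gamma, sigma):
    """Train LS-SVM regression by solving the (n+1)x(n+1) system
    [0  1^T         ] [b    ]   [0]
    [1  K + I/gamma ] [alpha] = [y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]                       # alpha, bias b

def lssvm_predict(Xnew, Xtr, alpha, b, sigma):
    return rbf_kernel(Xnew, Xtr, sigma) @ alpha + b

def ikeda_sequence(n, u=0.9, x=0.1, y=0.1):
    """Iterate the Ikeda map and rescale its x-component into (0, 1);
    here it drives DE's mutation factor (an illustrative assumption)."""
    seq = np.empty(n)
    for i in range(n):
        t = 0.4 - 6.0 / (1.0 + x * x + y * y)
        x, y = (1.0 + u * (x * np.cos(t) - y * np.sin(t)),
                u * (x * np.sin(t) + y * np.cos(t)))
        seq[i] = x
    return (seq - seq.min()) / (seq.max() - seq.min() + 1e-12)

def cde_tune(X, y, pop=10, gens=20, cr=0.9, seed=0):
    """DE/rand/1/bin over (gamma, sigma); mutation factors come from
    the Ikeda-map sequence instead of a fixed F."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array([0.1, 0.1]), np.array([100.0, 10.0])  # assumed bounds
    P = lo + rng.random((pop, 2)) * (hi - lo)

    m = 2 * len(y) // 3                          # simple hold-out split
    def fitness(p):
        alpha, b = lssvm_fit(X[:m], y[:m], p[0], p[1])
        pred = lssvm_predict(X[m:], X[:m], alpha, b, p[1])
        return float(((pred - y[m:]) ** 2).mean())

    fit = np.array([fitness(p) for p in P])
    F = ikeda_sequence(gens * pop)
    k = 0
    for _ in range(gens):
        for i in range(pop):
            r = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            a, b_, c = P[r]
            v = np.clip(a + F[k] * (b_ - c), lo, hi)   # chaotic mutation
            k += 1
            mask = rng.random(2) < cr
            mask[rng.integers(2)] = True         # at least one gene crosses over
            trial = np.where(mask, v, P[i])
            f_t = fitness(trial)
            if f_t < fit[i]:                     # greedy selection
                P[i], fit[i] = trial, f_t
    best = int(fit.argmin())
    return P[best], float(fit[best])
```

For a NARX identification task, the rows of `X` would hold lagged inputs and outputs of the thermal process; the sketch works unchanged on any regression matrix.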
Pages: 4805 - 4812
Page count: 8
Related Papers
50 items in total
  • [21] Optimizing the tuning parameters of least squares support vector machines regression for NIR spectra
    Coen, T.
    Saeys, W.
    Ramon, H.
    De Baerdemaeker, J.
    JOURNAL OF CHEMOMETRICS, 2006, 20 (05) : 184 - 192
  • [22] Subspace identification of Hammerstein systems using least squares support vector machines
    Goethals, I
    Pelckmans, K
    Suykens, JAK
    De Moor, B
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2005, 50 (10) : 1509 - 1519
  • [23] Identification of MIMO Hammerstein models using least squares support vector machines
    Goethals, I
    Pelckmans, K
    Suykens, JAK
    De Moor, B
    AUTOMATICA, 2005, 41 (07) : 1263 - 1272
  • [24] Temperature prediction control based on least squares support vector machines
    Liu, Bin
    Su, Hongye
    Huang, Weihua
    Chu, Jian
    JOURNAL OF CONTROL THEORY AND APPLICATIONS, 2004, 2 (04) : 365 - 370
  • [25] Least Squares Support Vector Machines based on Fuzzy Rough Set
    Zhang, Zhi-Wei
    He, Qiang
    Chen, De-Gang
    Wang, Hui
    IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2010), 2010,
  • [26] Modeling and control of PEMFC based on least squares support vector machines
    Li, X
    Cao, GY
    Zhu, XJ
    ENERGY CONVERSION AND MANAGEMENT, 2006, 47 (7-8) : 1032 - 1050
  • [27] New least squares support vector machines based on matrix patterns
    Wang, Zhe
    Chen, Songcan
    NEURAL PROCESSING LETTERS, 2007, 26 (01) : 41 - 56
  • [28] Predictive control algorithm based on least squares support vector machines
    Liu, Bin
    Su, Hong-Ye
    Chu, Jian
    KONGZHI YU JUECE/CONTROL AND DECISION, 2004, 19 (12) : 1399 - 1402
  • [29] Subspace Based Least Squares Support Vector Machines for Pattern Classification
    Kitamura, Takuya
    Abe, Shigeo
    Fukui, Kazuhiro
    IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2009 : 1275 - +
  • [30] Sparseness of least squares support vector machines based on active learning
    Yu, Zheng-Tao
    Zou, Jun-Jie
    Zhao, Xing
    Su, Lei
    Mao, Cun-Li
    NANJING LI GONG DAXUE XUEBAO/JOURNAL OF NANJING UNIVERSITY OF SCIENCE AND TECHNOLOGY, 2012, 36 (01) : 12 - 17