On extreme learning machine for ε-insensitive regression in the primal by Newton method

Cited: 10
Authors
Balasundaram, S. [1 ]
Kapil [1 ]
Affiliations
[1] Jawaharlal Nehru Univ, Sch Comp & Syst Sci, New Delhi 110067, India
Source
NEURAL COMPUTING & APPLICATIONS | 2013, Vol. 22, No. 3-4
Keywords
Extreme learning machine; Generalized Hessian matrix; Newton method; Single hidden layer feedforward neural networks; Smoothing technique; Support vector regression; SUPPORT VECTOR MACHINE; TIME-SERIES; CLASSIFICATION;
DOI
10.1007/s00521-011-0798-9
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes an extreme learning machine (ELM) for the ε-insensitive error loss-based regression problem, formulated in the 2-norm as an unconstrained optimization problem in primal variables. Since the objective function of this unconstrained problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are considered, leading to optimization problems whose solutions are obtained with a fast Newton-Armijo algorithm. The main advantage of the algorithm is that at each iteration only a system of linear equations needs to be solved. Numerical experiments on a number of synthetic and real-world datasets compare the results of the proposed method with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. Similar or better generalization performance of the proposed method on the test data, obtained in comparable computational time to ELM and SVR, clearly illustrates its efficiency and applicability.
Pages: 559-567
Page count: 9
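
The record above carries no code, so the following is a minimal, self-contained sketch of the kind of primal ε-insensitive ELM regression the abstract describes: random additive sigmoid hidden nodes, a 2-norm ε-insensitive objective over the output weights, and a generalized-Hessian Newton iteration with Armijo backtracking in which each step solves one linear system. It is an illustration under stated assumptions, not the authors' implementation; the function name, hyper-parameter values (number of hidden nodes, C, ε), and the noisy-sinc toy data are assumptions, and the smoothing variant mentioned in the abstract (which replaces the plus function by a smooth approximation) is not shown.

# Sketch only: generalized-Hessian Newton-Armijo for primal epsilon-insensitive ELM.
# Names and hyper-parameters are illustrative, not taken from the paper.
import numpy as np


def elm_eps_newton(X, y, n_hidden=50, C=10.0, eps=0.1,
                   max_iter=50, tol=1e-6, rng=None):
    """Fit the output weights beta of a single-hidden-layer ELM by minimizing
    0.5*||beta||^2 + (C/2) * sum_i max(|y_i - h(x_i)^T beta| - eps, 0)^2."""
    rng = np.random.default_rng(rng)
    n, d = X.shape

    # Random, fixed input weights and biases of additive sigmoid hidden nodes.
    W = rng.uniform(-1.0, 1.0, size=(d, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))          # hidden-layer output matrix

    def objective(beta):
        r = y - H @ beta
        slack = np.maximum(np.abs(r) - eps, 0.0)     # epsilon-insensitive part
        return 0.5 * beta @ beta + 0.5 * C * slack @ slack

    beta = np.zeros(n_hidden)
    for _ in range(max_iter):
        r = y - H @ beta
        slack = np.maximum(np.abs(r) - eps, 0.0)
        grad = beta - C * H.T @ (np.sign(r) * slack)
        if np.linalg.norm(grad) < tol:
            break

        # Generalized Hessian: identity plus C * H^T D H, where D is diagonal
        # with ones for residuals outside the epsilon-tube.
        active = (np.abs(r) > eps).astype(float)
        Hess = np.eye(n_hidden) + C * (H.T * active) @ H
        direction = np.linalg.solve(Hess, -grad)     # one linear system per step

        # Armijo backtracking line search on the step length.
        step, f0, slope = 1.0, objective(beta), grad @ direction
        while objective(beta + step * direction) > f0 + 1e-4 * step * slope:
            step *= 0.5
            if step < 1e-8:
                break
        beta = beta + step * direction

    def predict(X_new):
        H_new = 1.0 / (1.0 + np.exp(-(X_new @ W + b)))
        return H_new @ beta

    return beta, predict


if __name__ == "__main__":
    # Toy usage: noisy sinc data, a common SVR/ELM regression benchmark.
    rng = np.random.default_rng(0)
    X = rng.uniform(-10, 10, size=(300, 1))
    y = np.sinc(X[:, 0] / np.pi) + 0.1 * rng.standard_normal(300)
    _, predict = elm_eps_newton(X, y, rng=0)
    print("train RMSE:", np.sqrt(np.mean((predict(X) - y) ** 2)))

Because the generalized Hessian is the identity plus a positive semi-definite matrix, each Newton step reduces to a single well-posed linear solve whose size equals the number of hidden nodes, which is what keeps the per-iteration cost low.
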
Related Papers
50 records in total
  • [21] Regularized extreme learning machine for regression with missing data
    Yu, Qi
    Miche, Yoan
    Eirola, Emil
    van Heeswijk, Mark
    Severin, Eric
    Lendasse, Amaury
    [J]. NEUROCOMPUTING, 2013, 102 : 45 - 51
  • [22] Mixture extreme learning machine algorithm for robust regression
    Zhao, Shangrui
    Chen, Xuan-Ang
    Wu, Jinran
    Wang, You-Gan
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 280
  • [23] Application of Extreme Learning Machine Algorithm in the Regression Fitting
    Li, Gu-Xiong
    [J]. 2016 INTERNATIONAL CONFERENCE ON INFORMATION SYSTEM AND ARTIFICIAL INTELLIGENCE (ISAI 2016), 2016, : 419 - 422
  • [24] Parallel extreme learning machine for regression based on MapReduce
    He, Qing
    Shang, Tianfeng
    Zhuang, Fuzhen
    Shi, Zhongzhi
    [J]. NEUROCOMPUTING, 2013, 102 : 52 - 58
  • [25] Mixture Regression Estimation based on Extreme Learning Machine
    Mao, Wentao
    Wang, Yali
    Cao, Xizheng
    Zheng, Yanbin
    [J]. JOURNAL OF COMPUTERS, 2013, 8 (11) : 2925 - 2933
  • [26] An Enhanced Extreme Learning Machine Based on Liu Regression
    Yıldırım, Hasan
    Özkale, M. Revan
    [J]. NEURAL PROCESSING LETTERS, 2020, 52 : 421 - 442
  • [27] Two-stage extreme learning machine for regression
    Lan, Yuan
    Soh, Yeng Chai
    Huang, Guang-Bin
    [J]. NEUROCOMPUTING, 2010, 73 (16-18) : 3028 - 3038
  • [28] Evolutionary selection extreme learning machine optimization for regression
    Feng, Guorui
    Qian, Zhenxing
    Zhang, Xinpeng
    [J]. SOFT COMPUTING, 2012, 16 (09) : 1485 - 1491
  • [29] Bidirectional Extreme Learning Machine for Regression Problem and Its Learning Effectiveness
    Yang, Yimin
    Wang, Yaonan
    Yuan, Xiaofang
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (09) : 1498 - 1505
  • [30] Recursive finite Newton algorithm for support vector regression in the primal
    Bo, Liefeng
    Wang, Ling
    Jiao, Licheng
    [J]. NEURAL COMPUTATION, 2007, 19 (04) : 1082 - 1096