Extreme Learning Machine With Affine Transformation Inputs in an Activation Function

Cited by: 47
Authors
Cao, Jiuwen [1 ,2 ]
Zhang, Kai [3 ]
Yong, Hongwei [4 ]
Lai, Xiaoping [1 ]
Chen, Badong [5 ]
Lin, Zhiping [6 ]
Institutions
[1] Hangzhou Dianzi Univ, Key Lab IOT & Informat Fus Technol Zhejiang, Hangzhou 310018, Zhejiang, Peoples R China
[2] Univ Wuppertal, Sch Elect Informat & Media Engn, D-42119 Wuppertal, Germany
[3] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150001, Heilongjiang, Peoples R China
[4] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
[5] Xi An Jiao Tong Univ, Sch Elect & Informat Engn, Xian 710049, Shaanxi, Peoples R China
[6] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Affine transformation (AT) activation function; classification; extreme learning machine (ELM); maximum entropy; regression; CLASSIFICATION; APPROXIMATION;
DOI
10.1109/TNNLS.2018.2877468
Chinese Library Classification (CLC) number
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The extreme learning machine (ELM) has attracted much attention over the past decade due to its fast learning speed and convincing generalization performance. However, there still remains a practical issue to be addressed when applying the ELM: the randomly generated hidden node parameters without tuning can lead to the hidden node outputs being nonuniformly distributed, thus giving rise to poor generalization performance. To address this deficiency, a novel activation function with an affine transformation (AT) on its input is introduced into the ELM, which leads to an improved ELM algorithm that is referred to as an AT-ELM in this paper. The scaling and translation parameters of the AT activation function are computed based on the maximum entropy principle in such a way that the hidden layer outputs approximately obey a uniform distribution. Application of the AT-ELM algorithm in nonlinear function regression shows its robustness to the range scaling of the network inputs. Experiments on nonlinear function regression, real-world data set classification, and benchmark image recognition demonstrate better performance for the AT-ELM compared with the original ELM, the regularized ELM, and the kernel ELM. Recognition results on benchmark image data sets also reveal that the AT-ELM outperforms several other state-of-the-art algorithms in general.
Pages: 2093-2107 (15 pages)
Related Papers
50 items in total
  • [1] Affine Transformation Based Hierarchical Extreme Learning Machine
    Ma, Rongzhi
    Cao, Jiuwen
    Wang, Tianlei
    Lai, Xiaoping
    [J]. 2020 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2020
  • [2] Regularization Activation Function for Extreme Learning Machine
    Ismail, Noraini
    Othman, Zulaiha Ali
    Samsudin, Noor Azah
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2019, 10 (03): 240-247
  • [3] The extreme learning machine learning algorithm with tunable activation function
    Li, Bin
    Li, Yibin
    Rong, Xuewen
    [J]. NEURAL COMPUTING & APPLICATIONS, 2013, 22 (3-4): 531-539
  • [4] A Kind of Extreme Learning Machine Based on Memristor Activation Function
    Li, Hanman
    Wang, Lidan
    Duan, ShuKai
    [J]. PROCEEDINGS OF ELM-2017, 2019, 10: 210-218
  • [5] Metaheuristic Algorithms in Extreme Learning Machine for Selection of Parameters in Activation Function
    Struniawski, Karol
    Konopka, Aleksandra
    Kozera, Ryszard
    [J]. 37TH ANNUAL EUROPEAN SIMULATION AND MODELLING CONFERENCE 2023, ESM 2023, 2023: 239-243
  • [6] Evolutionary Extreme Learning Machine with novel activation function for credit scoring
    Tripathi, Diwakar
    Edla, Damodar Reddy
    Kuppili, Venkatanareshbabu
    Bablani, Annushree
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2020, 96
  • [7] Variable activation function extreme learning machine based on residual prediction compensation
    Wang, Gai-tang
    Li, Ping
    Cao, Jiang-tao
    [J]. SOFT COMPUTING, 2012, 16 (09): 1477-1484
  • [8] An extreme learning machine approach for modeling evapotranspiration using extrinsic inputs
    Patil, Amit Prakash
    Deka, Paresh Chandra
    [J]. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2016, 121: 385-392