Online ADMM-Based Extreme Learning Machine for Sparse Supervised Learning

Cited by: 9
Authors
Song, Tianheng [1 ]
Li, Dazi [1 ]
Liu, Zhiyin [1 ]
Yang, Weimin [2 ]
Affiliations
[1] Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China
[2] Beijing Univ Chem Technol, Coll Mech & Elect Engn, Beijing 100029, Peoples R China
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Online learning; alternating direction method of multipliers (ADMM); l1-regularization; extreme learning machine (ELM); sparse output parameters; neural networks; classification; regularization; approximation; algorithm
DOI
10.1109/ACCESS.2019.2915970
CLC Classification Number
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
Sparse learning is an efficient technique for feature selection and for avoiding overfitting in machine learning. Considering real-world problems with online learning demands in neural networks, an online sparse supervised learning algorithm for the extreme learning machine (ELM), termed OAL1-ELM, is proposed based on the alternating direction method of multipliers (ADMM). In OAL1-ELM, an l1-regularization penalty is added to the loss function to generate a sparse solution and enhance generalization ability. The resulting convex composite loss function is solved with ADMM in a distributed manner. Furthermore, an improved ADMM is used to reduce computational complexity and enable online learning. The proposed algorithm can learn data one-by-one or batch-by-batch. A convergence analysis for the fixed point of the solution is given to show the efficiency and optimality of the proposed method. Experimental results show that the proposed method obtains sparse solutions and achieves strong generalization performance on a wide range of regression tasks, multiclass classification tasks, and a real-world industrial project.
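As a concrete point of reference, the sketch below shows the standard batch l1-regularized ELM problem that OAL1-ELM builds on, min over beta of ||H*beta - T||^2 + lambda*||beta||_1, solved with the usual ADMM splitting and soft-thresholding. This is an illustrative Python sketch under those assumptions, not the authors' OAL1-ELM implementation: names such as elm_admm_l1, hidden_dim, lam, and rho are made up here, and the paper's online (one-by-one or batch-by-batch) and improved-ADMM updates are omitted.

import numpy as np

def soft_threshold(v, kappa):
    # Element-wise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def elm_admm_l1(X, T, hidden_dim=100, lam=1e-2, rho=1.0, iters=100, seed=0):
    # X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], hidden_dim))  # random input weights (fixed, as in any ELM)
    b = rng.standard_normal(hidden_dim)                # random hidden biases
    H = np.tanh(X @ W + b)                             # hidden-layer output matrix

    beta = np.zeros((hidden_dim, T.shape[1]))          # output weights (primal variable)
    z = np.zeros_like(beta)                            # auxiliary variable carrying the sparsity
    u = np.zeros_like(beta)                            # scaled dual variable

    A = H.T @ H + rho * np.eye(hidden_dim)             # reused in every beta-update
    HtT = H.T @ T
    for _ in range(iters):
        beta = np.linalg.solve(A, HtT + rho * (z - u))  # ridge-like least-squares update
        z = soft_threshold(beta + u, lam / rho)         # l1 proximal step, yields the sparse iterate
        u = u + beta - z                                # dual ascent on the consensus constraint
    return W, b, z                                      # z holds the sparse output weights

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

In the online setting described in the abstract, the one-off factorization of H^T H would be replaced by recursive updates as new samples or batches arrive; the batch version above is only meant to make the loss function and the role of the soft-thresholding step explicit.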
Pages: 64533-64544
Number of pages: 12
Related Papers
50 records in total
  • [1] ADMM-Based Sparse Distributed Learning for Stochastic Configuration Networks
    Zhou, Yujun
    Ai, Wu
    2020 CHINESE AUTOMATION CONGRESS (CAC 2020), 2020: 4354-4358
  • [2] Extreme learning machine based supervised subspace learning
    Iosifidis, Alexandros
    NEUROCOMPUTING, 2015, 167: 158-164
  • [3] DP-ADMM: ADMM-Based Distributed Learning With Differential Privacy
    Huang, Zonghao
    Hu, Rui
    Guo, Yuanxiong
    Chan-Tin, Eric
    Gong, Yanmin
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2020, 15: 1002-1012
  • [4] Online semi-supervised extreme learning machine based on manifold regularization
    Wang, Ping
    Wang, Di
    Feng, Wei
    Shanghai Jiaotong Daxue Xuebao/Journal of Shanghai Jiaotong University, 2015, 49(8): 1153-1158
  • [5] Symmetric ADMM-Based Federated Learning with a Relaxed Step
    Lu, Jinglei
    Zhu, Ya
    Dang, Yazheng
    MATHEMATICS, 2024, 12 (17)
  • [6] Density-based semi-supervised online sequential extreme learning machine
    Xia, Min
    Wang, Jie
    Liu, Jia
    Weng, Liguo
    Xu, Yiqing
    NEURAL COMPUTING & APPLICATIONS, 2020, 32(12): 7747-7758
  • [7] Engine Condition Online Prediction Based on Incremental Sparse Kernel Extreme Learning Machine
    Liu M.
    Zhang Y.-T.
    Fan H.-B.
    Li Z.-N.
    Beijing Ligong Daxue Xuebao/Transaction of Beijing Institute of Technology, 2019, 39(1): 34-40
  • [8] A semi-supervised online sequential extreme learning machine method
    Jia, Xibin
    Wang, Runyuan
    Liu, Junfa
    Powers, David M. W.
    NEUROCOMPUTING, 2016, 174: 168-178
  • [9] A Novel Sparse Extreme Learning Machine based Classifier
    Xu, Li
    Zhang, Kezhong
    Zhang, Yueyan
    Zhang, Han
    Feng, Zhiyong
    2018 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA (ICCC), 2018: 283-287