Lagrangian supervised and semi-supervised extreme learning machine

Cited by: 0
Authors
Jun Ma
Yakun Wen
Liming Yang
Affiliations
[1] China Agricultural University,College of Information and Electrical Engineering
[2] China Agricultural University,College of Science
Source
Applied Intelligence | 2019, Vol. 49
Keywords
Optimality conditions; Lagrangian function; Extreme learning machine; Semi-supervised learning; Classification;
DOI
Not available
Abstract
Two extreme learning machine (ELM) frameworks are proposed to handle supervised and semi-supervised classification. The first, called the Lagrangian extreme learning machine (LELM), is based on optimality conditions and duality theory. LELM is then extended to the semi-supervised setting, yielding a semi-supervised extreme learning machine (Lap-LELM) that incorporates manifold regularization into LELM to improve performance when insufficient training information is available. To avoid the cost of direct matrix inversion, the Sherman-Morrison-Woodbury (SMW) identity is applied in both LELM and Lap-LELM, which leads to two smaller-sized unconstrained minimization problems. The proposed models are solvable in a space whose dimensionality equals the number of sample points. The resulting iterative algorithms converge globally and have low computational burden. To verify the feasibility and effectiveness of the proposed methods, we perform a series of experiments on a synthetic dataset, near-infrared (NIR) spectroscopy datasets, and benchmark datasets. Experimental results demonstrate that the proposed methods outperform traditional supervised and semi-supervised methods on most of the considered datasets.
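The computational device the abstract highlights is the Sherman-Morrison-Woodbury identity, which replaces the inversion of a large n x n matrix by the inversion of a much smaller one when the large matrix has the form "identity plus low-rank term". A minimal numerical sketch is given below; the sizes `n`, `L` and the regularization parameter `lam` are illustrative choices, not values from the paper, and `H` merely stands in for an ELM hidden-layer output matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n, L, lam = 200, 20, 10.0          # illustrative: n samples, L hidden nodes
H = rng.standard_normal((n, L))    # stand-in for the hidden-layer output matrix

# Direct route: invert the large n x n matrix (I/lam + H H^T).
direct = np.linalg.inv(np.eye(n) / lam + H @ H.T)

# SMW route: only an L x L matrix is inverted,
# (I/lam + H H^T)^{-1} = lam*I - lam^2 * H (I + lam * H^T H)^{-1} H^T.
small = np.linalg.inv(np.eye(L) + lam * H.T @ H)
smw = lam * np.eye(n) - lam**2 * H @ small @ H.T

print(np.allclose(direct, smw))    # prints True: the two inverses agree
```

When L is much smaller than n, the SMW route reduces the dominant inversion cost from O(n^3) to O(L^3) plus matrix products, which is the sense in which the paper's reformulated problems are "smaller sized".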
Pages: 303 - 318
Number of pages: 15
Related Articles
50 items in total
  • [31] Semi-Supervised Learning for Neural Machine Translation
    Cheng, Yong
    Xu, Wei
    He, Zhongjun
    He, Wei
    Wu, Hua
    Sun, Maosong
    Liu, Yang
    [J]. PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2016, : 1965 - 1974
  • [32] Semi-supervised low rank kernel learning algorithm via extreme learning machine
    Liu, Mingming
    Liu, Bing
    Zhang, Chen
    Wang, Weidong
    Sun, Wei
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2017, 8 (03) : 1039 - 1052
  • [33] A Semi-supervised Low Rank Kernel Learning Algorithm via Extreme Learning Machine
    Liu, Bing
    Liu, Mingming
    Zhang, Chen
    Wang, Weidong
    [J]. PROCEEDINGS OF ELM-2015, VOL 1: THEORY, ALGORITHMS AND APPLICATIONS (I), 2016, 6 : 279 - 292
  • [34] A Hybrid Regularization Semi-Supervised Extreme Learning Machine Method and Its Application
    Lei, Yongxiang
    Cen, Lihui
    Chen, Xiaofang
    Xie, Yongfang
    [J]. IEEE ACCESS, 2019, 7 : 30102 - 30111
  • [35] Incremental semi-supervised Extreme Learning Machine for Mixed data stream classification
    Li, Qiude
    Xiong, Qingyu
    Ji, Shengfen
    Yu, Yang
    Wu, Chao
    Gao, Min
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2021, 185
  • [36] Density-based semi-supervised online sequential extreme learning machine
    Xia, Min
    Wang, Jie
    Liu, Jia
    Weng, Liguo
    Xu, Yiqing
    [J]. NEURAL COMPUTING & APPLICATIONS, 2020, 32 (12) : 7747 - 7758
  • [38] A semi-supervised extreme learning machine method based on co-training
    Li, Kunlun
    Zhang, Juan
    Xu, Hongyu
    Luo, Shangzong
    Li, Hexin
    [J]. Journal of Computational Information Systems, 2013, 9 (01): : 207 - 214
  • [39] Semi-supervised extreme learning machine using L1-graph
    Zhao H.
    Liu Y.
    Liu S.
    Feng L.
    [J]. 2018, Totem Publishers Ltd, 14 : 603 - 610