Unconstrained convex minimization based implicit Lagrangian twin extreme learning machine for classification (ULTELMC)

Cited: 0
Authors
Parashjyoti Borah
Deepak Gupta
Affiliations
[1] National Institute of Technology, Department of Computer Science & Engineering
Source
Applied Intelligence | 2020 / Vol. 50
Keywords
Extreme learning machine; Unconstrained minimization; Smoothing approaches; Quadratic programming problem; Iterative schemes;
DOI
Not available
Abstract
The recently proposed twin extreme learning machine (TELM) requires solving two quadratic programming problems (QPPs) to find two non-parallel hypersurfaces in the feature space, which brings in the additional requirement of an external optimization toolbox such as MOSEK. In this paper, we propose an implicit Lagrangian TELM for classification via an unconstrained convex minimization problem (ULTELMC) and further suggest iterative convergent schemes that eliminate the need for the external optimization toolbox generally required to solve the QPPs of TELM. The solutions for the dual variables of the proposed ULTELMC are obtained using iterative schemes containing the 'plus' function, which is not differentiable. To overcome this shortcoming, a generalized derivative approach and smooth approximation approaches are suggested. Further, to test the performance of the proposed approaches, classification performance is compared with that of the support vector machine (SVM), twin support vector machine (TWSVM), extreme learning machine (ELM), twin extreme learning machine (TELM), and Lagrangian extreme learning machine (LELM). Moreover, since no QPPs need to be solved, the iterative schemes find the solution faster than the reported methods that solve in the dual space. Computational times required to find the solutions are also presented for comparison.
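The abstract's iterative schemes involve the non-differentiable 'plus' function (x)_+ = max(x, 0), which the authors handle via generalized derivatives or smooth approximation. As a minimal illustrative sketch (assuming the widely used Lee–Mangasarian smoothing p(x, α) = x + (1/α)·log(1 + e^{−αx}); the paper's exact smoothing functions are not reproduced here):

```python
import numpy as np

def plus(x):
    # the non-differentiable 'plus' function (x)_+ = max(x, 0)
    return np.maximum(x, 0.0)

def smooth_plus(x, alpha=10.0):
    # Lee-Mangasarian smooth approximation of the plus function:
    #   p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x))
    # written in a numerically stable form, max(x, 0) + log1p(exp(-alpha*|x|))/alpha,
    # which is algebraically identical but avoids exp overflow for large -x.
    # p(x, alpha) is smooth everywhere and converges to (x)_+ as alpha -> infinity.
    x = np.asarray(x, dtype=float)
    return np.maximum(x, 0.0) + np.log1p(np.exp(-alpha * np.abs(x))) / alpha

# The smooth version upper-bounds (x)_+ and tracks it closely away from 0:
x = np.linspace(-2.0, 2.0, 9)
print(plus(x))
print(smooth_plus(x, alpha=10.0))
```

The maximum gap between the two functions occurs at x = 0, where the smooth version equals log(2)/α, so larger α gives a tighter (but stiffer) approximation.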
Pages: 1327–1344
Page count: 17