1-Norm random vector functional link networks for classification problems

Cited: 12
|
Authors
Hazarika, Barenya Bikash [1 ,2 ]
Gupta, Deepak [1 ]
Affiliations
[1] Natl Inst Technol, Dept Comp Sci & Engn, Jote, Arunachal Pradesh, India
[2] Koneru Lakshmaiah Educ Fdn, Vaddeswaram, Andhra Pradesh, India
Keywords
1-Norm; Single-layer feed-forward neural network; Random vector functional link; Sparseness; Classification; EXTREME LEARNING-MACHINE; KERNEL RIDGE-REGRESSION; MULTILAYER FEEDFORWARD NETWORKS; APPROXIMATION; CLASSIFIERS; ALGORITHM; ENSEMBLE
DOI
10.1007/s40747-022-00668-y
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a novel random vector functional link (RVFL) formulation, the 1-norm RVFL (1N RVFL) network, for solving binary classification problems. The optimization problem of 1N RVFL is solved through its exterior dual penalty problem using a Newton technique. The 1-norm regularization makes the model robust and yields sparse outputs, which is the model's fundamental advantage. A sparse output means that most elements of the output weight matrix are zero; the decision function can therefore be computed with fewer hidden nodes than the conventional RVFL model requires. 1N RVFL also produces a classifier that depends on a smaller number of input features; in other words, the method suppresses redundant neurons in the hidden layer. Statistical analyses have been carried out on several real-world benchmark datasets, using 1N RVFL with two activation functions, ReLU and sine. The classification accuracies of 1N RVFL are compared with those of the extreme learning machine (ELM), kernel ridge regression (KRR), RVFL, kernel RVFL (K-RVFL) and generalized Lagrangian twin RVFL (GLTRVFL) networks. The comparable or better accuracies obtained indicate the effectiveness and usability of 1N RVFL for binary classification.
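The RVFL construction described in the abstract (a fixed random hidden layer plus direct input-to-output links, with 1-norm regularized output weights) can be sketched as follows. This is a minimal illustration, not the paper's method: the toy data, the random-weight setup, and the use of scikit-learn's Lasso as a 1-norm solver are all assumptions, whereas the paper solves the 1-norm problem via a Newton technique on the exterior dual penalty problem.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy binary classification data (hypothetical, for illustration only)
n, d, L = 200, 5, 50                # samples, input features, hidden nodes
X = rng.standard_normal((n, d))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)

# RVFL: random (fixed, untrained) hidden layer plus direct links
W = rng.standard_normal((d, L))     # random input weights, never updated
b = rng.standard_normal(L)          # random biases, never updated
H = np.maximum(X @ W + b, 0.0)      # ReLU activation (one of the two in the paper)
D = np.hstack([X, H])               # direct links concatenated with hidden features

# 1-norm regularized output weights. Lasso is only a stand-in solver that
# yields the same kind of sparse solution the paper obtains.
model = Lasso(alpha=0.01, max_iter=10000).fit(D, y)
beta = model.coef_

sparsity = np.mean(beta == 0.0)     # fraction of zeroed output weights
pred = np.sign(D @ beta + model.intercept_)
acc = np.mean(pred == y)
```

The sparsity of `beta` is the point of the 1-norm: zeroed weights correspond to hidden nodes (or input features) that can be dropped from the trained classifier, which is how 1N RVFL achieves a decision function with fewer effective hidden nodes than a conventional 2-norm RVFL.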
Pages: 3505-3521 (17 pages)