Generalization improvement for regularized least squares classification

Cited by: 6
Authors
Gan, Haitao [1 ]
She, Qingshan [1 ]
Ma, Yuliang [1 ]
Wu, Wei [1 ]
Meng, Ming [1 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Zhejiang, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2019, Vol. 31, Suppl 2
Funding
National Natural Science Foundation of China;
Keywords
Regularized least squares classification; Margin distribution; Second-order statistic; FRAMEWORK; SVM;
DOI
10.1007/s00521-017-3090-9
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Over the past decades, regularized least squares classification (RLSC) has been a commonly used supervised classification method in machine learning, because it can be solved by simple matrix analysis and admits a closed-form solution. Recently, several studies have conjectured that the margin distribution is more crucial to generalization performance. From the viewpoint of the margin distribution, RLSC considers only the first-order statistic (i.e., the margin mean) and ignores higher-order statistics. In this paper, we propose a novel RLSC that also takes the second-order statistic (i.e., the margin variance) into account. Intuitively, a small margin variance is expected to improve the generalization performance of RLSC from a geometric point of view. We incorporate the margin variance into the objective function of RLSC and obtain the optimal classifier by minimizing it. To evaluate our algorithm, we conduct a series of experiments on several benchmark datasets, comparing against RLSC, kernel minimum squared error, the support vector machine and the large margin distribution machine. The empirical results verify the effectiveness of our algorithm and indicate that exploiting the margin distribution helps improve classification performance.
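The construction the abstract describes, adding a margin-variance penalty to the RLSC objective while keeping a closed-form solution, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact formulation: the function name `rlsc_margin_variance`, the penalty weight `gamma`, and the particular way the variance term enters the objective are all assumptions made for the sketch.

```python
import numpy as np

def rlsc_margin_variance(X, y, lam=1.0, gamma=1.0):
    """Hypothetical sketch: RLSC with a margin-variance penalty.

    Minimizes  ||Xw - y||^2 + lam*||w||^2 + (gamma/n)*Var(y_i * x_i^T w)
    over w, with labels y_i in {-1, +1}.  The margin y_i * x_i^T w is
    linear in w, so the variance term is quadratic and the whole
    problem still has a closed-form solution, as plain RLSC does.
    """
    n, d = X.shape
    YX = y[:, None] * X                     # row i is y_i * x_i
    C = np.eye(n) - np.ones((n, n)) / n     # centering matrix for the variance
    A = X.T @ X + lam * np.eye(d) + (gamma / n) * (YX.T @ C @ YX)
    return np.linalg.solve(A, X.T @ y)

# Usage on two well-separated synthetic clusters (illustrative data only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
w = rlsc_margin_variance(X, y, lam=0.1, gamma=1.0)
acc = np.mean(np.sign(X @ w) == y)
```

Because the added term stays quadratic in `w`, the only change from standard RLSC is the extra matrix added to the normal equations, which is why the method retains its simple matrix-analysis solution.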
Pages: 1045-1051 (7 pages)