A regularized estimation framework for online sparse LSSVR models

Cited by: 11
Authors
Santos, Jose Daniel A. [1 ]
Barreto, Guilherme A. [2 ]
Affiliations
[1] Fed Inst Educ Sci & Technol Ceara, Dept Ind, Maracanau, Ceara, Brazil
[2] Univ Fed Ceara, Ctr Technol, Dept Teleinformat Engn, Campus Pici, Fortaleza, Ceara, Brazil
Keywords
LSSVR model; Kernel RLS algorithm; Kernel adaptive filtering; System identification; Time series prediction; KERNEL; IDENTIFICATION
DOI
10.1016/j.neucom.2017.01.042
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Aiming at machine learning applications in which fast online learning is required, we develop a variant of the Least Squares SVR (LSSVR) model that can learn incrementally from data and eventually provide a sparse solution vector. This is achieved by incorporating into the LSSVR model the sparsification mechanism used by the kernel RLS (KRLS) model introduced by Engel et al. (2004). The performance of the resulting model, henceforth referred to as the online sparse LSSVR (OS-LSSVR) model, is comprehensively evaluated through computer experiments on several benchmark datasets (including a large-scale one) covering a number of challenging tasks in nonlinear time series prediction and system identification. Convergence, efficiency and error bounds of the OS-LSSVR model are also addressed. The results indicate that the proposed approach consistently outperforms state-of-the-art kernel adaptive filtering algorithms, providing sparser solutions with smaller prediction errors and smaller norms for the solution vector. (C) 2017 Elsevier B.V. All rights reserved.
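To illustrate the mechanism the abstract describes, the short Python sketch below grows a sparse dictionary online using the approximate linear dependence (ALD) test from Engel et al.'s KRLS and then solves a regularized, LSSVR-style least-squares problem over that dictionary. It is a minimal sketch under stated assumptions: the Gaussian kernel, the class and parameter names (OnlineSparseKernelRegressor, sigma, nu, gam), and the batch refit at every step are illustrative choices, not the paper's actual OS-LSSVR recursions.

import numpy as np

def rbf(x, y, sigma=1.0):
    # Gaussian (RBF) kernel; the kernel choice here is an assumption.
    d = np.asarray(x, float) - np.asarray(y, float)
    return np.exp(-d @ d / (2.0 * sigma ** 2))

class OnlineSparseKernelRegressor:
    """Minimal ALD-sparsified online kernel regressor (illustrative only)."""
    def __init__(self, sigma=1.0, nu=1e-3, gam=1e-3):
        self.sigma, self.nu, self.gam = sigma, nu, gam
        self.dx, self.dy = [], []      # sparse dictionary of inputs and targets
        self.alpha = None              # solution vector over the dictionary

    def _kvec(self, x):
        return np.array([rbf(x, xj, self.sigma) for xj in self.dx])

    def partial_fit(self, x, y):
        if self.dx:
            K = np.array([[rbf(a, b, self.sigma) for b in self.dx] for a in self.dx])
            k = self._kvec(x)
            # ALD test (KRLS-style): add x only if it is not well approximated
            # by a linear combination of the current dictionary in feature space.
            a = np.linalg.solve(K + 1e-10 * np.eye(len(K)), k)
            novel = rbf(x, x, self.sigma) - k @ a > self.nu
        else:
            novel = True
        if novel:
            self.dx.append(np.asarray(x, float)); self.dy.append(float(y))
        # Regularized (LSSVR-style) least-squares fit over the dictionary.
        # Note: a batch refit, O(m^3) per step, unlike the paper's recursive update.
        K = np.array([[rbf(a, b, self.sigma) for b in self.dx] for a in self.dx])
        self.alpha = np.linalg.solve(K + self.gam * np.eye(len(K)), np.array(self.dy))
        return self

    def predict(self, x):
        return float(self._kvec(x) @ self.alpha)

# Usage sketch: one-step-ahead prediction of a noisy sine series.
rng = np.random.default_rng(0)
s = np.sin(np.linspace(0, 6 * np.pi, 300)) + 0.05 * rng.standard_normal(300)
model = OnlineSparseKernelRegressor(sigma=0.5)
for i in range(2, 299):
    model.partial_fit(s[i-2:i], s[i])          # embed two past samples as input
print(len(model.dx), model.predict(s[297:299]), s[299])

The final print reports the dictionary size (a proxy for the sparsity of the solution vector) and a one-step-ahead prediction next to its target, mirroring the kind of time series prediction task evaluated in the paper.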
Pages: 114-125
Number of pages: 12
Related Papers
50 records in total
  • [1] Regularized Estimation in Sparse High-Dimensional Time Series Models
    Basu, Sumanta
    Michailidis, George
    [J]. ANNALS OF STATISTICS, 2015, 43 (4): 1535-1567
  • [2] Survey on the regularized sparse models
    Liu, Jian-Wei
    Cui, Li-Peng
    Liu, Ze-Yu
    Luo, Xiong-Lin
    [J]. Jisuanji Xuebao/Chinese Journal of Computers, 2015, 38 (7): 1307-1325
  • [3] Unified Framework to Regularized Covariance Estimation in Scaled Gaussian Models
    Wiesel, Ami
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2012, 60 (1): 18-27
  • [4] A Regularized Framework for Sparse and Structured Neural Attention
    Niculae, Vlad
    Blondel, Mathieu
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [5] A Framework for Interpreting Regularized State Estimation
    Sugiura, Nozomi
    Masuda, Shuhei
    Fujii, Yosuke
    Kamachi, Masafumi
    Ishikawa, Yoichi
    Awaji, Toshiyuki
    [J]. MONTHLY WEATHER REVIEW, 2014, 142 (1): 386-400
  • [6] Regularized Sparse Modelling for Microarray Missing Value Estimation
    Wang, Aiguo
    Yang, Jing
    An, Ning
    [J]. IEEE ACCESS, 2021, 9: 16899-16913
  • [7] Regularized online tensor factorization for sparse knowledge graph embeddings
    Zulaika, Unai
    Almeida, Aitor
    Lopez-de-Ipina, Diego
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (1): 787-797
  • [8] A Unified Framework for Sparse Online Learning
    Zhao, Peilin
    Wang, Dayong
    Wu, Pengcheng
    Hoi, Steven C. H.
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2020, 14 (5)
  • [9] Regularized Estimation of Dynamic Panel Models
    Carrasco, Marine
    Nayihouba, Ada
    [J]. ECONOMETRIC THEORY, 2024, 40 (2): 360-418