When cannot regularization improve the least squares estimate in the kernel-based regularized system identification

Cited: 0
Authors
Mu, Biqiang [1 ]
Ljung, Lennart [2 ]
Chen, Tianshi [3 ,4 ]
Affiliations
[1] Chinese Acad Sci, Acad Math & Syst Sci, Key Lab Syst & Control, Beijing 100190, Peoples R China
[2] Linkoping Univ, Dept Elect Engn, Div Automat Control, S-58183 Linkoping, Sweden
[3] Chinese Univ Hong Kong, Sch Data Sci, Shenzhen 518172, Peoples R China
[4] Chinese Univ Hong Kong, Shenzhen Res Inst Big Data, Shenzhen 518172, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Regularized least squares; Least squares; Squared error criterion; REGRESSION; STABILITY; CONVEX;
DOI
10.1016/j.automatica.2023.111442
Chinese Library Classification
TP [Automation & Computer Technology];
Discipline Code
0812;
Abstract
In the last decade, kernel-based regularization methods (KRMs) have been widely used for stable impulse response estimation in system identification. Their favorable performance over classic maximum likelihood/prediction error methods (ML/PEM) has been verified by extensive simulations. Recently, we noticed a surprising phenomenon: for some data sets and kernels, no matter how the hyper-parameters are tuned, the regularized least squares estimate cannot achieve a higher model fit than the least squares (LS) estimate, which implies that in such cases regularization cannot improve the LS estimate. This paper therefore focuses on understanding this observation. To this end, we first introduce the squared error (SE) criterion and the corresponding oracle hyper-parameter estimator, defined as the minimizer of the SE criterion. We then derive necessary and sufficient conditions under which regularization cannot improve the LS estimate, and we show that the probability of this happening is greater than zero. The theoretical findings are demonstrated through numerical simulations; in addition, the anomalous simulation outcome in which this probability is nearly zero is explained, and is shown to be due to the ill-conditioned nature of the kernel matrix, the Gram matrix, or both. (c) 2023 Elsevier Ltd. All rights reserved.
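The setting described in the abstract can be illustrated with a minimal sketch (not part of the record): a plain LS estimate of an FIR impulse response is compared against regularized LS estimates built from the commonly used Tuned/Correlated (TC) kernel, with the SE of each estimate measured against the true impulse response. The system, noise level, dimensions, and hyper-parameter grid below are all made up for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated FIR system of order n, observed through N noisy samples.
n, N, sigma = 20, 50, 0.5
theta0 = 0.8 ** np.arange(1, n + 1)        # true (decaying) impulse response
u = rng.standard_normal(N + n)             # input signal
# Regressor matrix: row t collects the n past inputs u(t-1), ..., u(t-n).
Phi = np.column_stack([u[n - k - 1 : N + n - k - 1] for k in range(n)])
y = Phi @ theta0 + sigma * rng.standard_normal(N)

# Plain least squares estimate.
theta_ls = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

def tc_kernel(n, c, alpha):
    """TC kernel: K[i, j] = c * alpha**max(i, j), i, j = 1..n."""
    idx = np.arange(1, n + 1)
    return c * alpha ** np.maximum.outer(idx, idx)

def reg_ls(K):
    """Regularized LS estimate: K Phi^T (Phi K Phi^T + sigma^2 I)^{-1} y."""
    M = Phi @ K @ Phi.T + sigma**2 * np.eye(N)
    return K @ Phi.T @ np.linalg.solve(M, y)

# Squared error (SE) of each estimate against the true impulse response.
se_ls = np.sum((theta_ls - theta0) ** 2)
se_best = min(
    np.sum((reg_ls(tc_kernel(n, c, a)) - theta0) ** 2)
    for c in [0.1, 1.0, 10.0]
    for a in [0.5, 0.8, 0.95]
)
print(se_ls, se_best)
```

The paper's question, in these terms, is whether any hyper-parameter choice can push `se_best` below `se_ls`; the grid search above is only a crude stand-in for the oracle hyper-parameter estimator that minimizes the SE criterion exactly.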
Pages: 12
Related Papers
50 records in total
  • [1] Kernel-Based Regularized Least Squares in R (KRLS) and Stata (krls)
    Ferwerda, Jeremy
    Hainmueller, Jens
    Hazlett, Chad J.
    JOURNAL OF STATISTICAL SOFTWARE, 2017, 79(3): 1-26
  • [2] On Robustness of Kernel-Based Regularized System Identification
    Khosravi, Mohammad
    Smith, Roy S.
    IFAC PAPERSONLINE, 2021, 54(7): 749-754
  • [3] Towards Scalable Kernel-Based Regularized System Identification
    Chen, Lujing
    Chen, Tianshi
    Detha, Utkarsh
    Andersen, Martin S.
    2023 62ND IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2023: 1498-1504
  • [4] Kernel-based Volatility Generalised Least Squares
    Chronopoulos, Ilias
    Kapetanios, George
    Petrova, Katerina
    ECONOMETRICS AND STATISTICS, 2021, 20: 2-11
  • [5] A comparison of manifold regularization approaches for kernel-based system identification
    Mazzoleni, M.
    Scandella, M.
    Previdi, F.
    IFAC PAPERSONLINE, 2019, 52(29): 180-185
  • [6] Kernel-based system identification with manifold regularization: A Bayesian perspective
    Mazzoleni, Mirko
    Chiuso, Alessandro
    Scandella, Matteo
    Formentin, Simone
    Previdi, Fabio
    AUTOMATICA, 2022, 142
  • [7] How to apply the novel dynamic ARDL simulations (dynardl) and Kernel-based regularized least squares (krls)
    Sarkodie, Samuel Asumadu
    Owusu, Phebe Asantewaa
    METHODSX, 2020, 7
  • [8] KBERG: A MatLab toolbox for nonlinear kernel-based regularization and system identification
    Mazzoleni, M.
    Scandella, M.
    Previdi, F.
    IFAC PAPERSONLINE, 2020, 53(2): 1231-1236
  • [9] Kernel-Based Least Squares Temporal Difference With Gradient Correction
    Song, Tianheng
    Li, Dazi
    Cao, Liulin
    Hirasawa, Kotaro
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27(4): 771-782
  • [10] Kernel-based least squares policy iteration for reinforcement learning
    Xu, Xin
    Hu, Dewen
    Lu, Xicheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18(4): 973-992