Generalized Kernel Regularized Least Squares

Cited: 2
Authors
Chang, Qing [1 ]
Goplerud, Max [1 ]
Affiliations
[1] Univ Pittsburgh, Dept Polit Sci, Pittsburgh, PA 15260 USA
Keywords
kernel ridge regression; hierarchical modeling; machine learning; heterogeneous effects;
DOI
10.1017/pan.2023.27
CLC classification
O1 [Mathematics]; C [Social Sciences, General];
Discipline codes
03; 0303; 0701; 070101;
Abstract
Kernel regularized least squares (KRLS) is a popular method for flexibly estimating models that may have complex relationships between variables. However, its usefulness to many researchers is limited for two reasons. First, existing approaches are inflexible and do not allow KRLS to be combined with theoretically motivated extensions such as random effects, unregularized fixed effects, or non-Gaussian outcomes. Second, estimation is extremely computationally intensive for even modestly sized datasets. Our paper addresses both concerns by introducing generalized KRLS (gKRLS). We note that KRLS can be reformulated as a hierarchical model, thereby allowing easy inference and modular model construction where KRLS can be used alongside random effects, splines, and unregularized fixed effects. Computationally, we also implement random sketching to dramatically accelerate estimation while incurring a limited penalty in estimation quality. We demonstrate that gKRLS can be fit on datasets with tens of thousands of observations in under one minute. Further, state-of-the-art techniques that require fitting the model over a dozen times (e.g., meta-learners) can be estimated quickly.
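The random-sketching idea mentioned in the abstract can be illustrated numerically. The code below is not the authors' gKRLS implementation; it is a minimal Nyström-style sketched kernel ridge regression in Python, where the RBF kernel, bandwidth, sketch size `m`, and penalty `lam` are all illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, bandwidth):
    """RBF kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / bandwidth)

rng = np.random.default_rng(0)
n, p, m = 500, 3, 50                       # m << n is the sketch dimension
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Sub-sampled (Nystrom-style) sketch: kernel against m random landmark rows
idx = rng.choice(n, size=m, replace=False)
K_nm = rbf_kernel(X, X[idx], bandwidth=2.0 * p)      # n x m
K_mm = rbf_kernel(X[idx], X[idx], bandwidth=2.0 * p) # m x m

# Regularized least squares in the sketched basis:
#   min_a ||y - K_nm a||^2 + lam * a' K_mm a
lam = 1e-2
A = K_nm.T @ K_nm + lam * K_mm + 1e-8 * np.eye(m)
alpha = np.linalg.solve(A, K_nm.T @ y)
y_hat = K_nm @ alpha
mse = float(np.mean((y - y_hat) ** 2))
print(f"in-sample MSE: {mse:.3f}")
```

The point of the sketch is the computational trade-off: full KRLS requires factorizing an n x n kernel matrix, while the sketched version only solves an m x m linear system after one n x m kernel evaluation.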
Pages: 157-171
Page count: 15
Related Papers
50 items in total
  • [31] Tsivtsivadze, E.; Pahikkala, T.; Pyysalo, S.; Boberg, J.; Mylläri, A.; Salakoski, T. Regularized Least-Squares for parse ranking. Advances in Intelligent Data Analysis VI, Proceedings, 2005, 3646: 464-474.
  • [32] Peng, Jinn; Aved, Alex J. Approximate Regularized Least Squares Algorithm for Classification. Pattern Recognition and Tracking XXIX, 2018, 10649.
  • [33] Gan, Haitao; She, Qingshan; Ma, Yuliang; Wu, Wei; Meng, Ming. Generalization improvement for regularized least squares classification. Neural Computing and Applications, 2019, 31: 1045-1051.
  • [35] Antonio Quiroga, Juan; Cesar Estrada, Julio; Servin, Manuel; Vargas, Javier. Regularized least squares phase sampling interferometry. Optics Express, 2011, 19 (06): 5002-5013.
  • [36] Liu, Xiling; Zhou, Shuisheng. Approximate kernel partial least squares. Annals of Mathematics and Artificial Intelligence, 2020, 88: 973-986.
  • [38] Kuh, A. Least Squares kernel algorithms and applications. 2003 IEEE International Symposium on Information Theory - Proceedings, 2003: 222.
  • [39] Sun, P. Sparse kernel least squares classifier. Fourth IEEE International Conference on Data Mining, Proceedings, 2004: 539-542.
  • [40] Mu, Biqiang; Ljung, Lennart; Chen, Tianshi. When cannot regularization improve the least squares estimate in the kernel-based regularized system identification. Automatica, 2024, 160.