Generalized Kernel Regularized Least Squares

Cited by: 2
Authors
Chang, Qing [1 ]
Goplerud, Max [1 ]
Affiliations
[1] Univ Pittsburgh, Dept Polit Sci, Pittsburgh, PA 15260 USA
Keywords
kernel ridge regression; hierarchical modeling; machine learning; heterogeneous effects;
DOI
10.1017/pan.2023.27
CLC Classification
O1 [Mathematics]; C [Social Sciences, General];
Discipline Codes
03 ; 0303 ; 0701 ; 070101 ;
Abstract
Kernel regularized least squares (KRLS) is a popular method for flexibly estimating models that may have complex relationships between variables. However, its usefulness to many researchers is limited for two reasons. First, existing approaches are inflexible and do not allow KRLS to be combined with theoretically motivated extensions such as random effects, unregularized fixed effects, or non-Gaussian outcomes. Second, estimation is extremely computationally intensive for even modestly sized datasets. Our paper addresses both concerns by introducing generalized KRLS (gKRLS). We note that KRLS can be reformulated as a hierarchical model, thereby allowing easy inference and modular model construction where KRLS can be used alongside random effects, splines, and unregularized fixed effects. Computationally, we also implement random sketching to dramatically accelerate estimation while incurring a limited penalty in estimation quality. We demonstrate that gKRLS can be fit on datasets with tens of thousands of observations in under one minute. Further, state-of-the-art techniques that require fitting the model over a dozen times (e.g., meta-learners) can be estimated quickly.
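To illustrate the computational idea in the abstract, the sketch below implements kernel ridge regression accelerated by a sub-sampling (Nyström-style) sketch: instead of solving an n x n system, the kernel matrix is compressed through m randomly chosen landmark points and an m x m system is solved. This is a minimal NumPy illustration of the general technique, not the authors' gKRLS implementation; the function names, bandwidth choice, and regularization value are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    """Gaussian kernel matrix between the rows of X and the rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    return np.exp(-d2 / bandwidth)

def sketched_krls(X, y, bandwidth, lam, m, rng):
    """Fit kernel ridge regression with a sub-sampling (Nystrom-style) sketch.

    Solves the sketched normal equations
        (K_nm' K_nm + lam * K_mm) c = K_nm' y
    so only an m x m system is factored instead of n x n.
    """
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)           # m random landmark points
    K_nm = gaussian_kernel(X, X[idx], bandwidth)         # (n, m) cross-kernel
    K_mm = gaussian_kernel(X[idx], X[idx], bandwidth)    # (m, m) landmark kernel
    A = K_nm.T @ K_nm + lam * K_mm + 1e-8 * np.eye(m)    # jitter for stability
    c = np.linalg.solve(A, K_nm.T @ y)
    return idx, c

def krls_predict(X_train, idx, c, X_new, bandwidth):
    """Predict at new points using the fitted landmark coefficients."""
    return gaussian_kernel(X_new, X_train[idx], bandwidth) @ c

# Toy usage: recover a smooth signal from 300 noisy observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)
bw = 2.0 * X.shape[1]                                    # simple dimension-based bandwidth
idx, c = sketched_krls(X, y, bandwidth=bw, lam=0.1, m=60, rng=rng)
yhat = krls_predict(X, idx, c, X, bandwidth=bw)
```

The point of the sketch is the cost profile: the dominant solve is O(m^3) rather than O(n^3), which is why sketching makes KRLS-type estimators feasible on tens of thousands of observations.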
Pages: 157-171
Page count: 15
Related Papers
50 items total
  • [41] A generalized multi-dictionary least squares framework regularized with multi-graph embeddings
    Abeo, Timothy Apasiba
    Shen, Xiang-Jun
    Bao, Bing-Kun
    Zha, Zheng-Jun
    Fan, Jianping
    PATTERN RECOGNITION, 2019, 90 : 1 - 11
  • [42] Least squares and generalized least squares in models with orthogonal block structure
    Fonseca, Miguel
    Mexia, Joao Tiago
    Zmyslony, Roman
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2010, 140 (05) : 1346 - 1352
  • [43] On the equivalence of the weighted least squares and the generalised least squares estimators, with applications to kernel smoothing
    Luati, Alessandra
    Proietti, Tommaso
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2011, 63 (04) : 851 - 871
  • [45] DISTRIBUTED KERNEL LEARNING USING KERNEL RECURSIVE LEAST SQUARES
    Fraser, Nicholas J.
    Moss, Duncan J. M.
    Epain, Nicolas
    Leong, Philip H. W.
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 5500 - 5504
  • [46] How to apply the novel dynamic ARDL simulations (dynardl) and Kernel-based regularized least squares (krls)
    Sarkodie, Samuel Asumadu
    Owusu, Phebe Asantewaa
    METHODSX, 2020, 7
  • [47] Improved prediction of drug-target interactions using regularized least squares integrating with kernel fusion technique
    Hao, Ming
    Wang, Yanli
    Bryant, Stephen H.
    ANALYTICA CHIMICA ACTA, 2016, 909 : 41 - 50
  • [48] Generalization errors of Laplacian regularized least squares regression
    Cao, Ying
    Chen, DiRong
    SCIENCE CHINA MATHEMATICS, 2012, 55 (09) : 1859 - 1868
  • [49] Optimal Rates for the Regularized Least-Squares Algorithm
    Caponnetto, A.
    De Vito, E.
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2007, 7 : 331 - 368
  • [50] Quantum regularized least squares solver with parameter estimate
    Shao, Changpeng
    Xiang, Hua
    QUANTUM INFORMATION PROCESSING, 2020, 19