The Kernel Kalman Rule - Efficient Nonparametric Inference with Recursive Least Squares

Cited by: 0
Authors
Gebhardt, Gregor H. W. [1 ]
Kupcsik, Andras [2 ]
Neumann, Gerhard [3 ]
Affiliations
[1] Tech Univ Darmstadt, Hochschulstr 10, D-64289 Darmstadt, Germany
[2] Natl Univ Singapore, Sch Comp, 13 Comp Dr, Singapore 117417, Singapore
[3] Univ Lincoln, Sch Comp Sci, Lincoln LN6 7TS, England
Funding
EU Horizon 2020;
Keywords
TIME-SERIES; MODELS;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Nonparametric inference techniques provide promising tools for probabilistic reasoning in high-dimensional nonlinear systems. Most of these techniques embed distributions into reproducing kernel Hilbert spaces (RKHS) and rely on the kernel Bayes' rule (KBR) to manipulate the embeddings. However, the computational demands of the KBR scale poorly with the number of samples, and the KBR often suffers from numerical instabilities. In this paper, we present the kernel Kalman rule (KKR) as an alternative to the KBR. The derivation of the KKR is based on recursive least squares, inspired by the derivation of the Kalman innovation update. We apply the KKR to filtering tasks where we use RKHS embeddings to represent the belief state, resulting in the kernel Kalman filter (KKF). On a nonlinear state estimation task with high-dimensional observations, we show that our approach provides significantly improved estimation accuracy at substantially reduced computational cost.
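The filtering scheme summarized in the abstract can be sketched concretely: the belief is a weight vector over kernel features of training samples, it is propagated through a learned conditional embedding operator (prediction step), and then corrected with a Kalman-style gain computed from the observation Gram matrix (innovation step, the role of the KKR). The following is a minimal illustrative sketch only, not the authors' implementation; the toy 1-D system, bandwidth `bw`, regularizers `lam` and `kappa`, and the initial belief are all assumed for illustration.

```python
import numpy as np

def rbf(A, B, bw):
    """Gaussian RBF Gram matrix between 1-D sample vectors A and B."""
    return np.exp(-(A[:, None] - B[None, :]) ** 2 / (2.0 * bw ** 2))

rng = np.random.default_rng(0)

# Toy 1-D system (assumed): x_{t+1} = 0.9 x_t + process noise, y_t = x_t + obs noise.
T = 200
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + 0.1 * rng.normal()
y = x + 0.3 * rng.normal(size=T)

# Training data: consecutive state pairs and the observations of the successor states.
X_prev, X_next, Y_next = x[:-1], x[1:], y[1:]
n = X_prev.size
lam, kappa, bw = 1e-2, 1e-2, 0.3   # regularizers and bandwidth (assumed values)

Kp = rbf(X_prev, X_prev, bw)        # state Gram matrix
G = rbf(Y_next, Y_next, bw)         # observation Gram matrix
# Kernel-ridge estimate of the transition operator in weight space:
# belief weights over k(X_next, .) are mapped forward one step.
P = np.linalg.solve(Kp + lam * n * np.eye(n), rbf(X_prev, X_next, bw))

m = np.full(n, 1.0 / n)             # initial mean-embedding weights (uniform, assumed)
S = np.eye(n) / n                   # initial embedding covariance factor (assumed)
estimates = []
for t in range(1, T):
    # Prediction: propagate mean and covariance through the transition operator.
    m = P @ m
    S = P @ S @ P.T + 1e-6 * np.eye(n)
    # Innovation: Kalman-style gain and update in embedding space (the KKR idea).
    Q = S @ G @ np.linalg.inv(G @ S @ G + kappa * np.eye(n))
    g_t = rbf(Y_next, np.array([y[t]]), bw)[:, 0]   # feature of the new observation
    m = m + Q @ (g_t - G @ m)
    S = S - Q @ G @ S
    estimates.append(X_next @ m)    # crude pre-image: weighted mean of basis states

estimates = np.array(estimates)
```

Note the key structural difference from the KBR: the innovation update reuses a precomputable gain structure instead of re-inverting data-dependent matrices built from the posterior weights, which is where the recursive-least-squares derivation pays off numerically.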
Pages: 3754 - 3760
Page count: 7
Related Papers (50 total)
  • [1] The kernel Kalman rule: Efficient nonparametric inference by recursive least-squares and subspace projections
    Gebhardt, Gregor H. W.
    Kupcsik, Andras
    Neumann, Gerhard
    [J]. MACHINE LEARNING, 2019, 108 (12) : 2113 - 2157
  • [2] The kernel Kalman rule: Efficient nonparametric inference by recursive least-squares and subspace projections
    Gebhardt, Gregor H. W.
    Kupcsik, Andras
    Neumann, Gerhard
    [J]. Machine Learning, 2019, 108 : 2113 - 2157
  • [3] Extended Kalman Filter Using a Kernel Recursive Least Squares Observer
    Zhu, Pingping
    Chen, Badong
    Principe, Jose C.
    [J]. 2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 1402 - 1408
  • [4] Quantized Kernel Recursive Least Squares Algorithm
    Chen, Badong
    Zhao, Songlin
    Zhu, Pingping
    Principe, Jose C.
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (09) : 1484 - 1491
  • [5] Projected Kernel Recursive Least Squares Algorithm
    Zhao, Ji
    Zhang, Hongbin
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2017, PT I, 2017, 10634 : 356 - 365
  • [6] Extended Kernel Recursive Least Squares Algorithm
    Liu, Weifeng
    Park, Il
    Wang, Yiwen
    Principe, Jose C.
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2009, 57 (10) : 3801 - 3814
  • [7] Regularization for the Kernel Recursive Least Squares CMAC
    Laufer, C.
    Coghill, G.
    [J]. 2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2012,
  • [8] The kernel recursive least-squares algorithm
    Engel, Y
    Mannor, S
    Meir, R
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2004, 52 (08) : 2275 - 2285
  • [9] DISTRIBUTED KERNEL LEARNING USING KERNEL RECURSIVE LEAST SQUARES
    Fraser, Nicholas J.
    Moss, Duncan J. M.
    Epain, Nicolas
    Leong, Philip H. W.
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 5500 - 5504
  • [10] Kernel recursive least squares dictionary learning algorithm
    Alipoor, Ghasem
    Skretting, Karl
    [J]. DIGITAL SIGNAL PROCESSING, 2023, 141