Multi-Kernel Correntropy Regression: Robustness, Optimality, and Application on Magnetometer Calibration

Cited by: 1
Authors
Li, Shilei [1 ]
Chen, Yihan [2 ]
Lou, Yunjiang [2 ]
Shi, Dawei [1 ]
Li, Lijing [3 ]
Shi, Ling [4 ]
Affiliations
[1] Beijing Inst Technol, Sch Automat, Beijing 100081, Peoples R China
[2] Harbin Inst Technol Shenzhen, Sch Mech Engn & Automat, State Key Lab Robot & Syst, Shenzhen 518055, Peoples R China
[3] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221018, Peoples R China
[4] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
Keywords
Linear regression; multi-kernel correntropy; robustness and optimality; maximum likelihood estimation; expectation-maximization; magnetometer calibration; MAXIMUM; CONVERGENCE;
DOI
10.1109/TASE.2023.3332645
CLC Classification
TP [Automation & Computer Technology];
Subject Classification Code
0812 ;
Abstract
This paper investigates the robustness and optimality of the multi-kernel correntropy (MKC) in linear regression. We first derive an upper error bound for a scalar regression problem in the presence of arbitrarily large outliers. We then show that the proposed MKC is related to a specific heavy-tailed distribution whose head matches the Gaussian distribution while the heaviness of its tail is controlled by the kernel bandwidth. Interestingly, when the bandwidth is infinite, the MKC-induced distribution becomes Gaussian, enabling the MKC to address both Gaussian and non-Gaussian problems by appropriately selecting the correntropy parameters. To tune these parameters automatically, an expectation-maximization-like (EM) algorithm is developed to estimate the parameter vector and the correntropy parameters in an alternating manner. The results show that our algorithm achieves performance equivalent to traditional linear regression under Gaussian noise and significantly outperforms the conventional method under heavy-tailed noise. Both numerical simulations and experiments on a magnetometer calibration application verify the effectiveness of the proposed method.

Note to Practitioners: The goal of this paper is to enhance the accuracy of conventional linear regression in handling outliers while maintaining its optimality in Gaussian situations. Our algorithm is formulated under the maximum likelihood estimation (MLE) framework, assuming that the regression residuals follow a heavy-tailed noise distribution with the Gaussian as a limiting case. The degree of heavy-tailedness is estimated alternately using an expectation-maximization (EM) algorithm, which converges very quickly. The robustness and optimality of the proposed approach are investigated and compared with traditional approaches. Both theoretical analysis and experiments on magnetometer calibration demonstrate the superiority of the proposed method over conventional methods. In the future, we will extend the proposed method to more general cases (such as nonlinear regression and classification) and derive new algorithms to accommodate more complex applications (such as those with equality or inequality constraints, or with prior knowledge of the parameter vector).
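To make the abstract's idea concrete, the sketch below implements a correntropy-style robust linear regression with a Gaussian kernel and per-sample bandwidths, solved by iteratively reweighted least squares (IRLS). The loss form `sigma_i^2 * (1 - exp(-e_i^2 / (2 sigma_i^2)))` and the IRLS update are illustrative assumptions, not the paper's exact EM algorithm (which additionally updates the correntropy parameters themselves); note how the loss reduces to ordinary least squares as the bandwidths tend to infinity, mirroring the Gaussian limiting case described above.

```python
import numpy as np

def mkc_regression(X, y, sigma=None, n_iter=50, tol=1e-10):
    """Illustrative multi-kernel-correntropy-style linear regression.

    Minimizes  sum_i sigma_i^2 * (1 - exp(-e_i^2 / (2 sigma_i^2)))
    over theta, where e = y - X @ theta, via IRLS. As sigma_i -> inf,
    the per-sample weights -> 1 and the fit reduces to ordinary least
    squares, matching the Gaussian limit of the MKC-induced distribution.
    This is a hypothetical sketch, not the paper's EM-based algorithm.
    """
    n, _ = X.shape
    sigma = np.full(n, 1.0) if sigma is None else np.asarray(sigma, float)
    theta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS initialization
    for _ in range(n_iter):
        e = y - X @ theta
        w = np.exp(-e**2 / (2.0 * sigma**2))   # Gaussian-kernel weights; outliers ~ 0
        Xw = X * w[:, None]
        theta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y)  # weighted normal equations
        done = np.linalg.norm(theta_new - theta) < tol
        theta = theta_new
        if done:
            break
    return theta
```

On data with gross outliers, the exponential weights drive the outliers' influence toward zero, while on clean Gaussian data (or with very large `sigma`) the estimate coincides with least squares.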
Pages: 1-13
Page count: 13
Related Papers
50 items
  • [1] Multi-Kernel Maximum Correntropy Kalman Filter
    Li, Shilei
    Shi, Dawei
    Zou, Wulin
    Shi, Ling
    [J]. IEEE CONTROL SYSTEMS LETTERS, 2022, 6 : 1490 - 1495
  • [2] Broad learning system based on maximum multi-kernel correntropy criterion
    Zhao, Haiquan
    Lu, Xin
    [J]. NEURAL NETWORKS, 2024, 179
  • [3] MULTI-KERNEL REGRESSION FOR GRAPH SIGNAL PROCESSING
    Venkitaraman, Arun
    Chatterjee, Saikat
    Handel, Peter
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018, : 4644 - 4648
  • [4] FAST CONVERGENT ALGORITHMS FOR MULTI-KERNEL REGRESSION
    Zhang, Liang
    Romero, Daniel
    Giannakis, Georgios B.
    [J]. 2016 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2016,
  • [5] Gaussian process regression: Optimality, robustness, and relationship with kernel ridge regression
    Wang, Wenjia
    Jing, Bing-Yi
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23 : 1 - 67
  • [7] Multi-Kernel Based Feature Selection for Regression
    Lin, Chao-Zhe
    Chen, Xian-Kai
    [J]. INTELLIGENT COMPUTING THEORIES AND APPLICATIONS, ICIC 2012, 2012, 7390 : 308 - 316
  • [8] Learning rates of multi-kernel regularized regression
    Chen, Hong
    Li, Luoqing
    [J]. JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2010, 140 (09) : 2562 - 2568
  • [9] Fusion-based multi-kernel learning filter with maximum correntropy criterion
    Chu, Lin
    Li, Wenling
    [J]. PROCEEDINGS OF 2020 IEEE 9TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE (DDCLS'20), 2020, : 556 - 561
  • [10] Multi-kernel correntropy based extended Kalman filtering for state-of-charge estimation
    Dang, Lujuan
    Huang, Yulong
    Zhang, Yonggang
    Chen, Badong
    [J]. ISA TRANSACTIONS, 2022, 129 : 271 - 283