Mixture quantized error entropy for recursive least squares adaptive filtering

Cited by: 17
Authors
He, Jiacheng [1 ]
Wang, Gang [2 ]
Peng, Bei [1 ]
Sun, Qi [2 ]
Feng, Zhenyu [1 ]
Zhang, Kun [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Mech & Elect Engn, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Informat & Commun Engn, Chengdu 611731, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
UNSCENTED KALMAN; CORRENTROPY; ALGORITHM; CRITERION;
DOI
10.1016/j.jfranklin.2021.12.015
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Error entropy is a well-known learning criterion in information theoretic learning (ITL), and it has been successfully applied in robust signal processing and machine learning. To date, many robust learning algorithms have been devised based on the minimum error entropy (MEE) criterion, and the Gaussian kernel is typically used as the default kernel function in these algorithms, although it is not always the best option. To further improve learning performance, two concepts that use a mixture of two Gaussian functions as the kernel, called mixture error entropy and mixture quantized error entropy, are proposed in this paper. We further propose two new recursive least-squares algorithms based on the mixture minimum error entropy (MMEE) and mixture quantized minimum error entropy (MQMEE) optimization criteria. The convergence behavior, steady-state mean-square performance, and computational complexity of the two proposed algorithms are analyzed. In addition, the reason why the mixture mechanism (mixture correntropy and mixture error entropy) can improve the performance of adaptive filtering algorithms is explained. Simulation results show that the proposed recursive least-squares algorithms outperform other RLS-type algorithms, and the practicality of the proposed algorithms is verified in an electroencephalography application. (C) 2021 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
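The abstract describes the mixture (quantized) error entropy criterion only at a high level. As a minimal illustrative sketch, the Python snippet below estimates an information potential under a two-Gaussian mixture kernel over vector-quantized error samples; maximizing this quantity corresponds to minimizing the error entropy. The mixture weight `alpha`, the bandwidths `sigma1`/`sigma2`, the quantization threshold `eps`, and the simple online quantizer are illustrative assumptions, not the paper's actual parameters or its recursive (RLS-type) update.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    """Gaussian kernel G_sigma(x) = exp(-x^2 / (2 * sigma^2))."""
    return np.exp(-x**2 / (2.0 * sigma**2))

def mixture_kernel(x, alpha=0.5, sigma1=0.5, sigma2=2.0):
    """Mixture of two Gaussian kernels: alpha*G_sigma1 + (1-alpha)*G_sigma2.
    (alpha, sigma1, sigma2 are illustrative values, not taken from the paper.)"""
    return alpha * gaussian_kernel(x, sigma1) + (1.0 - alpha) * gaussian_kernel(x, sigma2)

def quantize_errors(errors, eps=0.1):
    """Simple online vector quantization of the error samples: each error is
    merged into the nearest codeword if within eps, otherwise it becomes a new
    codeword. Returns the codebook and the sample count per codeword."""
    codebook, counts = [], []
    for e in errors:
        if codebook:
            dists = np.abs(np.asarray(codebook) - e)
            j = int(np.argmin(dists))
            if dists[j] <= eps:
                counts[j] += 1
                continue
        codebook.append(e)
        counts.append(1)
    return np.asarray(codebook), np.asarray(counts)

def mixture_quantized_information_potential(errors, eps=0.1, **kernel_kwargs):
    """Quantized estimate of the information potential under the mixture kernel:
    V_Q(e) = (1/N^2) * sum_i sum_m count_m * kappa(e_i - c_m)."""
    errors = np.asarray(errors, dtype=float)
    codebook, counts = quantize_errors(errors, eps)
    n = len(errors)
    diffs = errors[:, None] - codebook[None, :]
    return np.sum(counts[None, :] * mixture_kernel(diffs, **kernel_kwargs)) / (n * n)

# Example: heavy-tailed (impulsive) error samples, where MEE-type criteria are attractive
rng = np.random.default_rng(0)
e = rng.standard_t(df=2, size=500)
print(mixture_quantized_information_potential(e, eps=0.2))
```

In an MEE-style adaptive filter, the filter weights would be adjusted to maximize such an information potential (equivalently, to minimize the error entropy); the paper's contribution is to do this recursively within an RLS framework, which this sketch does not attempt to reproduce.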
Pages: 1362-1381
Number of pages: 20
Related Papers
50 records in total
  • [1] Mitigation of GPS Multipath Error Using Recursive Least Squares Adaptive Filtering
    Yedukondalu, K.
    Sarma, A. D.
    Kumar, Ashwani
    [J]. PROCEEDINGS OF THE 2010 IEEE ASIA PACIFIC CONFERENCE ON CIRCUIT AND SYSTEM (APCCAS), 2010 : 104 - 107
  • [2] Sparsity regularised recursive least squares adaptive filtering
    Eksioglu, E. M.
    [J]. IET SIGNAL PROCESSING, 2011, 5 (05) : 480 - 487
  • [3] Diffusion Quantized Recursive Mixture Minimum Error Entropy Algorithm
    Cai, Peng
    Wang, Shiyuan
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2022, 69 (12) : 5189 - 5193
  • [4] Adaptive filtering with quantized minimum error entropy criterion
    Li, Zhuang
    Xing, Lei
    Chen, Badong
    [J]. SIGNAL PROCESSING, 2020, 172
  • [5] Analysis of fast recursive least squares algorithms for adaptive filtering
    Arezki, M.
    Beneallal, A.
    Meyrueis, P.
    Guessoum, A.
    Berkani, D.
    [J]. PROCEEDINGS OF THE 11TH WSEAS INTERNATIONAL CONFERENCE ON SYSTEMS, VOL 2: SYSTEMS THEORY AND APPLICATIONS, 2007 : 473+
  • [6] Quantized Kernel Recursive Least Squares Algorithm
    Chen, Badong
    Zhao, Songlin
    Zhu, Pingping
    Principe, Jose C.
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (09) : 1484 - 1491
  • [7] REGULARIZED FAST RECURSIVE LEAST-SQUARES ALGORITHMS FOR ADAPTIVE FILTERING
    HOUACINE, A
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 1991, 39 (04) : 860 - 871
  • [8] Fast recursive total least squares algorithm for adaptive FIR filtering
    Feng, DZ
    Zhang, XD
    Chang, DX
    Zheng, WX
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2004, 52 (10) : 2729 - 2737
  • [9] A fast recursive total least squares algorithm for adaptive IIR filtering
    Chang, DX
    Feng, DZ
    Zheng, WX
    Li, L
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2005, 53 (03) : 957 - 965
  • [10] FAST, RECURSIVE-LEAST-SQUARES TRANSVERSAL FILTERS FOR ADAPTIVE FILTERING
    CIOFFI, JM
    KAILATH, T
    [J]. IEEE TRANSACTIONS ON ACOUSTICS SPEECH AND SIGNAL PROCESSING, 1984, 32 (02) : 304 - 337