Sparse estimation using Bayesian hierarchical prior modeling for real and complex linear models

Cited by: 35
Authors
Pedersen, Niels Lovmand [1 ]
Manchon, Carles Navarro [1 ]
Badiu, Mihai-Alin [1 ]
Shutin, Dmitriy [2 ]
Fleury, Bernard Henri [1 ]
Affiliations
[1] Aalborg Univ, Dept Elect Syst, DK-9220 Aalborg, Denmark
[2] German Aerosp Ctr, Inst Commun & Nav, D-82234 Wessling, Germany
Keywords
Sparse Bayesian learning; Sparse signal representations; Underdetermined linear systems; Hierarchical Bayesian modeling; Sparsity-inducing priors; SCALE MIXTURES; REGRESSION;
DOI
10.1016/j.sigpro.2015.03.013
CLC Classification Numbers
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex-valued models, this paper proposes a GSM model - the Bessel K model - that induces concave penalty functions for the estimation of complex sparse signals. The properties of the Bessel K model are analyzed when it is applied to Type I and Type II estimation. This analysis reveals that, by tuning the parameters of the mixing pdf, different penalty functions are invoked depending on the estimation type used, the value of the noise variance, and whether real or complex signals are estimated. Using the Bessel K model, we derive sparse estimators based on a modification of the expectation-maximization algorithm formulated for Type II estimation. The estimators include as special instances the algorithms proposed by Tipping and Faul [1] and Babacan et al. [2]. Numerical results show the superiority of the proposed estimators over these state-of-the-art algorithms in terms of convergence speed, sparseness, reconstruction error, and robustness in low and medium signal-to-noise ratio regimes. (C) 2015 Elsevier B.V. All rights reserved.
Pages: 94 - 109
Number of pages: 16
Related Papers
50 records in total
  • [1] Application of Bayesian Hierarchical Prior Modeling to Sparse Channel Estimation
    Pedersen, Niels Lovmand
    Manchon, Carles Navarro
    Shutin, Dmitriy
    Fleury, Bernard Henri
    [J]. 2012 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2012,
  • [2] Polygenic Modeling with Bayesian Sparse Linear Mixed Models
    Zhou, Xiang
    Carbonetto, Peter
    Stephens, Matthew
    [J]. PLOS GENETICS, 2013, 9 (02):
  • [3] Bayesian parsimonious covariance estimation for hierarchical linear mixed models
    Frühwirth-Schnatter, Sylvia
    Tüchler, Regina
    [J]. STATISTICS AND COMPUTING, 2008, 18 (01) : 1 - 13
  • [4] Hierarchical sparse observation models and informative prior for Bayesian inference of spatially varying parameters
    Lee, Wonjung
    Sun, Yiqun
    Lu, Shuai
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2020, 422
  • [5] Bayesian Structured-Sparse Modeling Using a Bernoulli–Laplacian Prior
    Kouti, Mayadeh
    Ansari-Asl, Karim
    Namjoo, Ehsan
    [J]. CIRCUITS, SYSTEMS, AND SIGNAL PROCESSING, 2024, 43 : 1862 - 1888
  • [6] On the sparse Bayesian learning of linear models
    Yee, Chia Chye
    Atchade, Yves F.
    [J]. COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2017, 46 (15) : 7672 - 7691
  • [7] Hierarchical Modeling Using Generalized Linear Models
    Kumar, Naveen
    Mastrangelo, Christina
    Montgomery, Doug
    [J]. QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, 2011, 27 (06) : 835 - 842
  • [8] Hierarchical Bayesian Modeling with Elicited Prior Information
    Steffey, D.
    [J]. COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 1992, 21 (03) : 799 - 821
  • [9] Sparse Bayesian Learning Using Generalized Double Pareto Prior for DOA Estimation
    Wang, Qisen
    Yu, Hua
    Li, Jie
    Ji, Fei
    Chen, Fangjiong
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 1744 - 1748