Sparse Bayesian learning with automatic-weighting Laplace priors for sparse signal recovery

  • Zonglong Bai*
  • Jinwei Sun
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The least absolute shrinkage and selection operator (LASSO) and its variants are widely used for sparse signal recovery. However, determining the regularization factor requires a cross-validation strategy, which may yield a sub-optimal solution. Motivated by the self-regularizing nature of the sparse Bayesian learning (SBL) approach and the framework of the generalized LASSO, we propose a new hierarchical Bayesian model using automatic-weighting Laplace priors in this paper. In the proposed hierarchical Bayesian model, the posterior distributions of all the parameters can be approximated using variational Bayesian inference, resulting in closed-form updates for all parameters. Moreover, a space alternating variational estimation strategy is used to avoid matrix inversion, and a fast algorithm (SAVE-WLap-SBL) is proposed. Compared to existing SBL methods, the proposed method encourages the sparsity of signals more efficiently. Numerical experiments on synthetic and real data illustrate the benefit of these advances.
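The abstract contrasts the proposed self-regularizing SBL approach with the standard LASSO, whose regularization factor must be tuned externally. As background, the following is a minimal, self-contained sketch of LASSO-based sparse recovery via iterative soft-thresholding (ISTA); the problem sizes, the fixed regularization weight `lam`, and the helper name `ista` are illustrative assumptions, not the paper's SAVE-WLap-SBL algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: recover a k-sparse vector x_true from
# noisy linear measurements y = A @ x_true + noise.
n, m, k = 50, 200, 5                      # measurements, signal length, sparsity
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(n)

def ista(A, y, lam, n_iter=500):
    """Iterative soft-thresholding for the LASSO objective
    0.5 * ||y - A x||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the quadratic term
        z = x - grad / L                   # gradient step
        # Soft-thresholding (proximal step for the l1 penalty):
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# lam is fixed by hand here -- the tuning the paper's Bayesian
# formulation aims to avoid (it learns the weights automatically).
x_hat = ista(A, y, lam=0.05)
```

The soft-thresholding step sets small coefficients exactly to zero, which is what produces sparsity; the quality of the recovery hinges on the choice of `lam`, typically selected by cross-validation.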

Original language: English
Pages (from-to): 2053-2074
Number of pages: 22
Journal: Computational Statistics
Volume: 38
Issue number: 4
DOIs
State: Published - Dec 2023
Externally published: Yes

Keywords

  • Space alternating variational estimation
  • Sparse Bayesian learning
  • Sparse signal recovery

