Abstract
The least absolute shrinkage and selection operator (LASSO) and its variants are widely used for sparse signal recovery. However, selecting the regularization factor typically requires a cross-validation strategy, which may yield a sub-optimal solution. Motivated by the self-regularizing nature of the sparse Bayesian learning (SBL) approach and the framework of the generalized LASSO, we propose a new hierarchical Bayesian model with automatic-weighting Laplace priors. In the proposed hierarchical Bayesian model, the posterior distributions of all parameters can be approximated using variational Bayesian inference, resulting in closed-form updates for every parameter. Moreover, a space alternating variational estimation strategy is used to avoid matrix inversion, yielding a fast algorithm (SAVE-WLap-SBL). Compared with existing SBL methods, the proposed method promotes the sparsity of signals more efficiently. Numerical experiments on synthetic and real data illustrate the benefit of these advances.
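To make the self-regularization property of SBL mentioned above concrete, the following minimal sketch recovers a sparse signal with the classic EM-style SBL updates (a Gaussian prior with per-coefficient precisions). Note this is the standard SBL baseline, not the paper's automatic-weighting Laplace-prior variant or its inversion-free SAVE updates; all problem sizes and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse recovery problem: y = Phi @ w_true + noise
n, m, k = 50, 100, 4                      # measurements, dictionary atoms, nonzeros
Phi = rng.standard_normal((n, m)) / np.sqrt(n)
w_true = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
w_true[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)
y = Phi @ w_true + 0.01 * rng.standard_normal(n)

def sbl(Phi, y, sigma2=1e-4, n_iter=100):
    """Classic EM-style SBL: no regularization factor to tune by hand;
    the per-coefficient precisions alpha adapt automatically, and most
    of them grow large, pruning the corresponding coefficients to zero."""
    n, m = Phi.shape
    alpha = np.ones(m)
    for _ in range(n_iter):
        # Gaussian posterior over weights given current hyperparameters
        Sigma = np.linalg.inv(Phi.T @ Phi / sigma2 + np.diag(alpha))
        mu = Sigma @ Phi.T @ y / sigma2
        # EM hyperparameter update: alpha_i = 1 / (mu_i^2 + Sigma_ii)
        alpha = 1.0 / (mu**2 + np.diag(Sigma))
    return mu

w_hat = sbl(Phi, y)
```

The per-iteration matrix inversion here is exactly the cost that the space alternating variational estimation strategy of the paper is designed to avoid.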
| Original language | English |
|---|---|
| Pages (from-to) | 2053-2074 |
| Number of pages | 22 |
| Journal | Computational Statistics |
| Volume | 38 |
| Issue number | 4 |
| DOIs | |
| State | Published - Dec 2023 |
| Externally published | Yes |
Keywords
- Space alternating variational estimation
- Sparse Bayesian learning
- Sparse signal recovery