Unitary Approximate Message Passing for Sparse Bayesian Learning

Man Luo, Qinghua Guo, Ming Jin, Yonina C. Eldar, Defeng David Huang, Xiangming Meng

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)


Sparse Bayesian learning (SBL) can be implemented with low complexity based on the approximate message passing (AMP) algorithm. However, AMP does not work well for a generic measurement matrix, which may cause it to diverge. Damped AMP has been used for SBL to alleviate this problem, at the cost of slower convergence. In this work, we propose a new SBL algorithm based on structured variational inference, leveraging AMP with a unitary transformation (UAMP). Both single measurement vector and multiple measurement vector problems are investigated. It is shown that, compared to state-of-the-art AMP-based SBL algorithms, the proposed UAMP-SBL is more robust and efficient, leading to remarkably better performance.
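As a rough illustration of the unitary transformation underlying UAMP (a minimal sketch, not the paper's implementation): the model y = A x + n is pre-processed via the SVD A = U S Vᵀ, and AMP is then run on the transformed model Uᵀy = (S Vᵀ) x + Uᵀn. The names and dimensions below are illustrative assumptions.

```python
import numpy as np

# Illustrative setup (assumed sizes): generic measurement matrix A,
# sparse signal x, noisy measurements y = A x + n.
rng = np.random.default_rng(0)
M, N = 50, 100
A = rng.standard_normal((M, N))
x = np.zeros(N)
x[rng.choice(N, 5, replace=False)] = rng.standard_normal(5)
y = A @ x + 0.01 * rng.standard_normal(M)

# Unitary transformation via the SVD A = U diag(s) Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = U.T @ y                  # transformed measurements r = U^T y
Phi = np.diag(s) @ Vt        # transformed matrix Phi = U^T A

# The transformed model r = Phi x + w has the same sparse solution x,
# and w = U^T n retains the noise statistics because U is unitary.
assert np.allclose(Phi, U.T @ A)
```

AMP applied to the pair (r, Phi) instead of (y, A) is the key ingredient the abstract attributes to UAMP's improved robustness for generic measurement matrices.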

Original language: English
Pages (from-to): 6023-6039
Number of pages: 17
Journal: IEEE Transactions on Signal Processing
Early online date: 24 Sept 2021
Publication status: Published - 2021


