Learning and Free Energies for Vector Approximate Message Passing

02/26/2016
by Alyson K. Fletcher, et al.

Vector approximate message passing (VAMP) is a computationally simple approach to recovering a signal x from noisy linear measurements y = Ax + w. Like the AMP algorithm proposed by Donoho, Maleki, and Montanari in 2009, VAMP is characterized by a rigorous state evolution (SE) that holds for certain classes of large random matrices and that matches the replica prediction of optimality. But while AMP's SE holds only for large i.i.d. sub-Gaussian A, VAMP's SE holds for the much larger class of right-rotationally invariant A. To run VAMP, however, one must specify the statistical parameters of the signal and noise. This work combines VAMP with Expectation-Maximization (EM) to yield an algorithm, EM-VAMP, that jointly recovers x while learning those statistical parameters. The fixed points of the proposed EM-VAMP algorithm are shown to be stationary points of a certain constrained free energy, providing a variational interpretation of the algorithm. Numerical simulations show that EM-VAMP is robust to highly ill-conditioned A, with performance nearly matching that of oracle-parameter VAMP.
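To make the two-stage structure concrete, below is a minimal NumPy sketch of an EM-VAMP-style loop: a denoising stage, an LMMSE stage, and an EM update of the unknown noise precision. The soft-thresholding denoiser, the weight lam, the initialization, and all variable names here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(r, t):
    """Elementwise soft-thresholding (a placeholder sparsity denoiser)."""
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0)

def em_vamp_sketch(y, A, n_iter=50, lam=0.1):
    """Sketch of an EM-VAMP-style iteration; not the paper's reference code."""
    M, N = A.shape
    gamma1 = 1.0            # precision of the extrinsic Gaussian at the denoiser
    gamma_w = 1.0           # noise precision, learned by EM (assumed unknown)
    AtA, Aty = A.T @ A, A.T @ y
    r1 = Aty.copy()         # pragmatic initialization (an assumption)
    clip = lambda a: min(max(a, 1e-6), 1.0 - 1e-6)
    for _ in range(n_iter):
        # --- Denoising stage ---
        x1 = soft_threshold(r1, lam / gamma1)
        alpha1 = clip(np.mean(x1 != 0))         # divergence of soft-thresholding
        eta1 = gamma1 / alpha1
        gamma2 = eta1 - gamma1
        r2 = (eta1 * x1 - gamma1 * r1) / gamma2  # extrinsic message to LMMSE stage
        # --- LMMSE stage ---
        Qinv = np.linalg.inv(gamma_w * AtA + gamma2 * np.eye(N))
        x2 = Qinv @ (gamma_w * Aty + gamma2 * r2)
        alpha2 = clip(gamma2 * np.trace(Qinv) / N)
        eta2 = gamma2 / alpha2
        gamma1 = eta2 - gamma2
        r1 = (eta2 * x2 - gamma2 * r2) / gamma1  # extrinsic message back to denoiser
        # --- EM update of the noise precision gamma_w ---
        resid = y - A @ x2
        gamma_w = M / (resid @ resid + np.trace(A @ Qinv @ A.T))
    return x2, gamma_w

# Toy usage on synthetic sparse data:
rng = np.random.default_rng(0)
N, M = 200, 100
x0 = rng.normal(size=N) * (rng.random(N) < 0.1)
A = rng.normal(size=(M, N)) / np.sqrt(M)
y = A @ x0 + 0.01 * rng.normal(size=M)
xhat, gw = em_vamp_sketch(y, A)
```

The explicit matrix inverse in the LMMSE stage is only for readability; a practical implementation would precompute an SVD of A so that each iteration costs O(MN), and would typically add damping for robustness.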
