Beyond Independent Measurements: General Compressed Sensing with GNN Application
We consider the problem of recovering a structured signal 𝐱∈ℝ^n from noisy linear observations 𝐲 = 𝐌𝐱 + 𝐰. The measurement matrix is modeled as 𝐌 = 𝐁𝐀, where 𝐁∈ℝ^l × m is arbitrary and 𝐀∈ℝ^m × n has independent sub-gaussian rows. By varying 𝐁 and the sub-gaussian distribution of 𝐀, this model yields a family of measurement matrices which may have heavy tails, dependent rows and columns, and singular values with a large dynamic range. When the structure is given as a possibly non-convex cone T ⊂ ℝ^n, an approximate empirical risk minimizer is proven to be a robust estimator if the effective number of measurements is sufficient, even in the presence of a model mismatch. In classical compressed sensing with independent (sub-)gaussian measurements, one asks how many measurements are needed to recover 𝐱. In our setting, however, the effective number of measurements depends on the properties of 𝐁. We show that the effective rank of 𝐁 may be used as a surrogate for the number of measurements: if it exceeds the squared Gaussian mean width of (T−T) ∩ 𝕊^{n−1}, then accurate recovery is guaranteed. Furthermore, we examine the special case of generative priors in detail, that is, when 𝐱 lies close to T = ran(G) and G: ℝ^k → ℝ^n is a Generative Neural Network (GNN) with ReLU activation functions. Our work relies on a recent result in random matrix theory by Jeong, Li, Plan, and Yilmaz (arXiv:2001.10631).
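The measurement model above can be illustrated with a short NumPy sketch. It builds 𝐌 = 𝐁𝐀 with a Gaussian 𝐀 (one admissible sub-gaussian choice) and a 𝐁 with geometrically decaying singular values, computes the stable rank ‖𝐁‖_F² / ‖𝐁‖² as one common proxy for effective rank (the paper's exact definition may differ), and draws a signal from the range of a small random ReLU network. All dimensions, the spectrum of 𝐁, and the network weights are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, l, k = 200, 150, 100, 10  # ambient dim, inner dim, measurements, latent dim

# A: independent sub-gaussian rows (standard Gaussian, for simplicity)
A = rng.standard_normal((m, n))

# B: arbitrary l x m matrix, here with a geometrically decaying spectrum,
# so the singular values of M = B A have a large dynamic range
U, _ = np.linalg.qr(rng.standard_normal((l, l)))
V, _ = np.linalg.qr(rng.standard_normal((m, m)))
s = 2.0 ** -np.arange(l)
B = U @ np.diag(s) @ V[:l, :]

M = B @ A  # measurement matrix with dependent rows and columns

# Stable rank of B, used here as a surrogate for the effective number
# of measurements (an assumption; the paper's notion may be defined differently)
sv = np.linalg.svd(B, compute_uv=False)
eff_rank = (sv ** 2).sum() / sv[0] ** 2

# A small random ReLU generative network G: R^k -> R^n as the structured prior
W1 = rng.standard_normal((64, k))
W2 = rng.standard_normal((n, 64))

def G(z):
    return W2 @ np.maximum(W1 @ z, 0.0)

x = G(rng.standard_normal(k))      # signal in T = ran(G)
w = 0.01 * rng.standard_normal(l)  # observation noise
y = M @ x + w                      # noisy linear observations
```

With the decaying spectrum chosen here, the stable rank of 𝐁 is close to 1 even though 𝐁 has full row rank, which is exactly the regime where counting rows of 𝐌 overstates the effective number of measurements.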