Nearly Minimax-Optimal Rates for Noisy Sparse Phase Retrieval via Early-Stopped Mirror Descent

by Fan Wu, et al.

This paper studies early-stopped mirror descent applied to noisy sparse phase retrieval, which is the problem of recovering a k-sparse signal x⋆ ∈ ℝ^n from a set of quadratic Gaussian measurements corrupted by sub-exponential noise. We consider the (non-convex) unregularized empirical risk minimization problem and show that early-stopped mirror descent, when equipped with the hyperbolic entropy mirror map and proper initialization, achieves a nearly minimax-optimal rate of convergence, provided the sample size is at least of order k^2 (up to logarithmic factors) and the minimum (in modulus) non-zero entry of the signal is on the order of ‖x⋆‖_2/√k. Our theory leads to a simple algorithm that does not rely on explicit regularization or thresholding steps to promote sparsity. More generally, our results establish a connection between mirror descent and sparsity in the non-convex problem of noisy sparse phase retrieval, adding to the literature on early stopping that has mostly focused on non-sparse, Euclidean, and convex settings via gradient descent. Our proof combines a potential-based analysis of mirror descent with a quantitative control on a variational coherence property that we establish along the path of mirror descent, up to a prescribed stopping time.
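The algorithm described above admits a compact sketch: mirror descent with the hyperbolic entropy (hypentropy) mirror map updates a dual variable u = arcsinh(x/β) with plain gradient steps on the unregularized empirical risk, and the primal iterate x = β·sinh(u) stays implicitly sparse without any thresholding. The problem sizes, step size, iteration budget, and noiseless measurement model below are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

# Illustrative sketch: mirror descent with the hyperbolic entropy
# (hypentropy) mirror map for sparse phase retrieval. Dimensions,
# step size, and initialization are assumptions for demonstration.
rng = np.random.default_rng(0)
n, k, m = 20, 2, 200                       # ambient dim, sparsity, samples
x_star = np.zeros(n)
x_star[[0, 5]] = 1.0                       # a k-sparse signal
A = rng.standard_normal((m, n))            # Gaussian measurement vectors
y = (A @ x_star) ** 2                      # noiseless quadratic measurements

def risk(x):
    # Unregularized empirical risk: (1/4m) * sum_i (<a_i, x>^2 - y_i)^2
    return 0.25 * np.mean(((A @ x) ** 2 - y) ** 2)

beta, eta, T = 1e-3, 0.05, 6000            # mirror-map scale, step size, steps
risk0 = risk(beta * np.ones(n))            # risk at the initialization
u = np.arcsinh(np.ones(n))                 # dual point of x_0 = beta * ones
for _ in range(T):
    x = beta * np.sinh(u)                  # primal iterate: x = beta*sinh(u)
    Ax = A @ x
    grad = A.T @ ((Ax ** 2 - y) * Ax) / m  # gradient of the empirical risk
    u -= eta * grad                        # gradient step in the dual space

x_hat = beta * np.sinh(u)
# Phase retrieval recovers the signal only up to a global sign.
err = min(np.linalg.norm(x_hat - x_star), np.linalg.norm(x_hat + x_star))
```

In the noisy setting analyzed by the paper, the stopping time T must be chosen before the iterates start fitting the noise; in this noiseless toy run the iterates simply approach ±x⋆, with off-support coordinates remaining near the small scale β.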



Related research:

∙ A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval (10/20/2020)
∙ Optimal Rates of Convergence for Noisy Sparse Phase Retrieval via Thresholded Wirtinger Flow (06/10/2015)
∙ Fast and Minimax Optimal Estimation of Low-Rank Matrices via Non-Convex Gradient Descent (05/26/2023)
∙ Implicit Regularization for Optimal Sparse Recovery (09/11/2019)
∙ Boosting with Structural Sparsity: A Differential Inclusion Approach (04/16/2017)
∙ Sparse Recovery via Differential Inclusions (06/30/2014)
∙ Phase Retrieval with Background Information: Decreased References and Efficient Methods (08/16/2023)
