A Laplacian Approach to ℓ_1-Norm Minimization

01/25/2019

by Vincenzo Bonifaci, et al.

We propose a novel differentiable reformulation of the linearly constrained ℓ_1 minimization problem, also known as the basis pursuit problem. The reformulation is inspired by the Laplacian paradigm of network theory and leads to a new family of gradient-based, matrix-free methods for the solution of ℓ_1 minimization problems. We analyze the iteration complexity of a natural solution approach to the reformulation, based on a multiplicative weights update scheme, as well as the iteration complexity of an accelerated gradient scheme. The accelerated method, in particular, yields an improved worst-case bound on the complexity of matrix-free methods for basis pursuit.
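To make the problem setting concrete: basis pursuit asks for argmin ||x||_1 subject to Ax = b, where A is typically a wide matrix. The sketch below is not the paper's Laplacian reformulation or its multiplicative-weights and accelerated schemes (those are developed in the full text); it is only an illustrative iteratively reweighted least-squares baseline for the same problem, a scheme of the same weighted, Laplacian-like flavor. The function name `basis_pursuit_irls` and all parameter choices are hypothetical.

```python
import numpy as np

def basis_pursuit_irls(A, b, n_iter=100, eps=1e-8):
    """Approximate argmin ||x||_1 s.t. Ax = b via iteratively reweighted
    least squares (an illustrative baseline, not the paper's method)."""
    # Min-norm feasible starting point (assumes A has full row rank).
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        # Reweight by current magnitudes: D = diag(|x| + eps).
        d = np.abs(x) + eps
        # Solve the weighted least-squares step
        #   min_x  x^T D^{-1} x  s.t.  Ax = b,
        # whose closed form is x = D A^T (A D A^T)^{-1} b.
        ADAt = A @ (d[:, None] * A.T)
        x = d * (A.T @ np.linalg.solve(ADAt, b))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 200, 60, 8
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true
    x_hat = basis_pursuit_irls(A, b)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

Each iteration only requires solving a positive-definite linear system in A D A^T, which is the kind of reweighted (Laplacian-like) subproblem that the Laplacian paradigm exploits; the paper's contribution lies in the specific differentiable reformulation and the complexity analysis of its update schemes, not in this generic baseline.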
