A semismooth Newton-proximal method of multipliers for ℓ_1-regularized convex quadratic programming

01/25/2022
by Spyridon Pougkakiotis, et al.

In this paper we present a method for the solution of ℓ_1-regularized convex quadratic optimization problems. It is derived by suitably combining a proximal method of multipliers strategy with a semismooth Newton method. The resulting linear systems are solved using a Krylov-subspace method, accelerated by appropriate general-purpose preconditioners, which are shown to be optimal with respect to the proximal parameters. Practical efficiency is further improved by warm-starting the algorithm using a proximal alternating direction method of multipliers. We show that the method achieves global convergence under feasibility assumptions. Furthermore, under additional standard assumptions, the method can achieve global linear and local superlinear convergence. The effectiveness of the approach is numerically demonstrated on L^1-regularized PDE-constrained optimization problems.
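The abstract does not spell out the problem formulation or the warm-start routine. The sketch below is a generic scaled-form ADMM for the unconstrained model min_x ½ xᵀQx + cᵀx + τ‖x‖_1 (with Q symmetric positive semidefinite), intended only to illustrate the kind of ℓ_1-regularized convex QP and proximal-splitting warm start referred to above; it is not the authors' semismooth Newton proximal method of multipliers, and the function names, penalty ρ, and tolerances are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_l1_qp(Q, c, tau, rho=1.0, max_iter=500, tol=1e-8):
    """Illustrative ADMM for  min_x 0.5 x^T Q x + c^T x + tau * ||x||_1,
    with Q symmetric positive semidefinite. Not the paper's algorithm."""
    n = c.size
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # Factor (Q + rho I) once; reused at every x-update.
    L = np.linalg.cholesky(Q + rho * np.eye(n))
    for _ in range(max_iter):
        rhs = rho * (z - u) - c
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update: (Q + rho I)x = rhs
        z_old = z
        z = soft_threshold(x + u, tau / rho)               # z-update: prox of the l1 term
        u = u + x - z                                      # scaled dual update
        # Stop when primal and dual residuals are small.
        if np.linalg.norm(x - z) < tol and rho * np.linalg.norm(z - z_old) < tol:
            break
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 10))
    Q = A.T @ A + 0.1 * np.eye(10)   # convex quadratic term
    c = rng.standard_normal(10)
    print("sparse solution:", np.round(admm_l1_qp(Q, c, tau=0.5), 3))
```

In the paper's setting such a splitting method is used only to produce a cheap warm start; the main solver then handles the (possibly constrained) problem with semismooth Newton steps on the proximal subproblems.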
