Provably Convergent Working Set Algorithm for Non-Convex Regularized Regression

06/24/2020
by Alain Rakotomamonjy, et al.

Owing to their statistical properties, non-convex sparse regularizers have attracted much interest for estimating a sparse linear model from high-dimensional data. Since the solution is sparse, a working set strategy accelerates convergence by addressing the optimization problem iteratively, incrementing the number of variables to optimize until the support of the solution is identified. While such methods have been well studied and theoretically supported for convex regularizers, this paper proposes a working set algorithm for non-convex sparse regularizers with convergence guarantees. The algorithm, named FireWorks, is based on a non-convex reformulation of a recent primal-dual approach and leverages the geometry of the residuals. Our theoretical guarantees derive from a lower bound on the decrease of the objective function between two inner-solver iterations and show convergence to a stationary point of the full problem. More importantly, we also show that convergence is preserved even when the inner solver is inexact, provided the error decays sufficiently across iterations. Our experimental results demonstrate substantial computational gains when using our working set strategy compared to solving the full problem with either a block-coordinate descent or a proximal gradient solver.
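The general working set strategy the abstract describes can be sketched in a few lines. The sketch below is a generic illustration for the convex l1-regularized case with a proximal gradient (ISTA) inner solver, not the authors' FireWorks algorithm or its non-convex setting; the function names and the residual-correlation rule for growing the working set are assumptions for illustration only.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad_lasso(X, y, lam, w0, n_iter=500):
    # Inner solver: proximal gradient (ISTA) on the restricted problem
    # min_w 0.5 * ||y - X w||^2 + lam * ||w||_1, warm-started at w0.
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part
    w = w0.copy()
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - grad / L, lam / L)
    return w

def working_set_lasso(X, y, lam, n_outer=10, grow=5):
    # Outer loop: grow the working set until no feature outside it
    # violates the optimality condition |x_j^T r| <= lam.
    n_features = X.shape[1]
    w = np.zeros(n_features)
    ws = np.array([], dtype=int)
    for _ in range(n_outer):
        # Score features by correlation of their column with the residual.
        r = y - X @ w
        scores = np.abs(X.T @ r)
        scores[ws] = -np.inf  # features already in the working set
        # Add the 'grow' most correlated features to the working set.
        ws = np.union1d(ws, np.argsort(scores)[-grow:])
        # Solve the subproblem restricted to the working set (warm start).
        w_ws = prox_grad_lasso(X[:, ws], y, lam, w[ws])
        w = np.zeros(n_features)
        w[ws] = w_ws
        # Stop once all features outside the working set satisfy optimality.
        r = y - X @ w
        outside = np.ones(n_features, dtype=bool)
        outside[ws] = False
        if np.all(np.abs(X[:, outside].T @ r) <= lam + 1e-6):
            break
    return w
```

Because each subproblem only touches the columns currently in the working set, the per-iteration cost scales with the (small) support size rather than with the full feature dimension, which is where the computational gain reported in the abstract comes from.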


Related research

- 02/16/2019: Screening Rules for Lasso with Non-Convex Sparse Regularizers. Leveraging on the convexity of the Lasso problem, screening rules help ...
- 10/13/2021: The Convex Geometry of Backpropagation: Neural Network Gradient Flows Converge to Extreme Points of the Dual Convex Program. We study non-convex subgradient flows for training two-layer ReLU neural...
- 02/03/2023: Efficient Gradient Approximation Method for Constrained Bilevel Optimization. Bilevel optimization has been developed for many machine learning tasks ...
- 04/16/2022: Beyond L1: Faster and Better Sparse Models with skglm. We propose a new fast algorithm to estimate any sparse generalized linea...
- 07/20/2018: A Fast, Principled Working Set Algorithm for Exploiting Piecewise Linear Structure in Convex Problems. By reducing optimization to a sequence of smaller subproblems, working s...
- 05/01/2021: NuSPAN: A Proximal Average Network for Nonuniform Sparse Model – Application to Seismic Reflectivity Inversion. We solve the problem of sparse signal deconvolution in the context of se...
- 08/04/2021: Rapid Convex Optimization of Centroidal Dynamics using Block Coordinate Descent. In this paper we explore the use of block coordinate descent (BCD) to op...
