Expanding boundaries of Gap Safe screening

02/22/2021
by Cassio Dantas, et al.

Sparse optimization problems are ubiquitous in many fields such as statistics, signal/image processing and machine learning, and have spurred the development of many iterative algorithms to solve them. A powerful strategy to boost the performance of these algorithms is known as safe screening: it allows the early identification of zero coordinates in the solution, which can then be eliminated to reduce the problem's size and accelerate convergence. In this work, we extend the existing Gap Safe screening framework by relaxing the global strong-concavity assumption on the dual cost function. Instead, we exploit local regularity properties, that is, strong concavity on well-chosen subsets of the domain. The non-negativity constraint is also integrated into the existing framework. Besides making safe screening possible for a broader class of functions that includes beta-divergences (e.g., the Kullback-Leibler divergence), the proposed approach also improves upon the existing Gap Safe screening rules on previously applicable cases (e.g., logistic regression). The proposed general framework is exemplified on notable particular cases: the logistic function and the beta = 1.5 and Kullback-Leibler divergences. Finally, we showcase the effectiveness of the proposed screening rules with different solvers (coordinate descent, multiplicative-update and proximal gradient algorithms) and different data sets (binary classification, hyperspectral and count data).
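To illustrate the Gap Safe mechanism that this work builds on, here is a minimal sketch of the classical rule for the Lasso, the original setting where the dual is globally strongly concave (the paper itself extends the idea beyond this case, e.g. to beta-divergences). The function name and the problem setup are of our own choosing, not from the paper: a coordinate j is certified zero at the optimum whenever |x_j^T theta| plus the gap-derived safe radius stays below 1, for any dual-feasible theta.

```python
import numpy as np

def gap_safe_screen_lasso(X, y, w, lam):
    """Illustrative Gap Safe screening test for the Lasso
    min_w 0.5*||y - Xw||^2 + lam*||w||_1 (hypothetical helper, not the
    paper's generalized rule). Returns a boolean mask: True means the
    coordinate is guaranteed to be zero in the solution."""
    residual = y - X @ w
    # Dual-feasible point obtained by rescaling the residual so that
    # ||X^T theta||_inf <= 1 (standard residual-rescaling construction).
    theta = residual / max(lam, np.max(np.abs(X.T @ residual)))
    # Primal and dual objectives; their difference is the duality gap.
    primal = 0.5 * residual @ residual + lam * np.sum(np.abs(w))
    dual = 0.5 * y @ y - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    # Safe-sphere radius from strong concavity of the dual.
    radius = np.sqrt(2.0 * gap) / lam
    # Screen coordinate j if |x_j^T theta| + radius * ||x_j|| < 1.
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0
```

Because the radius shrinks with the duality gap, the test becomes more powerful as the solver converges, which is why such rules are typically re-applied every few iterations at negligible cost.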



Related research

06/11/2015
GAP Safe screening rules for sparse multi-task and multi-class models
High dimensional regression benefits from sparsity promoting regularizat...

02/15/2022
Accelerating Non-Negative and Bounded-Variable Linear Regression Algorithms with Safe Screening
Non-negative and bounded-variable linear regression problems arise in a ...

02/01/2022
Safe Screening for Logistic Regression with ℓ_0-ℓ_2 Regularization
In logistic regression, it is often desirable to utilize regularization ...

11/17/2016
Gap Safe screening rules for sparsity enforcing penalties
In high dimensional regression settings, sparsity enforcing penalties ha...

02/21/2018
Dual Extrapolation for Faster Lasso Solvers
Convex sparsity-inducing regularizations are ubiquitous in high-dimensio...

05/22/2018
Safe Element Screening for Submodular Function Minimization
Submodular functions are discrete analogs of convex functions, which hav...

02/22/2020
Safe Screening for the Generalized Conditional Gradient Method
The conditional gradient method (CGM) has been widely used for fast spar...
