Sparsest Univariate Learning Models Under Lipschitz Constraint

12/27/2021
by   Shayan Aziznejad, et al.

Besides the minimization of the prediction error, two of the most desirable properties of a regression scheme are stability and interpretability. Driven by these principles, we propose continuous-domain formulations for one-dimensional regression problems. In our first approach, we use the Lipschitz constant as a regularizer, which results in an implicit tuning of the overall robustness of the learned mapping. In our second approach, we control the Lipschitz constant explicitly via a user-defined upper bound and use a sparsity-promoting regularizer to favor simpler (and, hence, more interpretable) solutions. The theoretical study of the latter formulation is motivated in part by its equivalence, which we prove, with the training of a Lipschitz-constrained two-layer univariate neural network with rectified linear unit (ReLU) activations and weight decay. By proving representer theorems, we show that both problems admit global minimizers that are continuous and piecewise-linear (CPWL) functions. Moreover, we propose efficient algorithms that find the sparsest solution of each problem: the CPWL mapping with the fewest linear regions. Finally, we illustrate the outcome of our formulations numerically.
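To make the two formulations concrete, here is a schematic sketch in notation of our own choosing; the data-fidelity term E, the regularization weight λ, and the choice of sparsity-promoting functional (taken here to be the total variation of the second derivative, a standard choice that yields CPWL minimizers) are illustrative assumptions rather than the paper's exact statement. Given data pairs (x_m, y_m), m = 1, ..., M:

% Hedged sketch of the two continuous-domain problems (not the paper's exact notation).
\begin{align*}
  \text{(I)}  \quad & \min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(x_m)\bigr) \;+\; \lambda\, \mathrm{Lip}(f), \\
  \text{(II)} \quad & \min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(x_m)\bigr) \;+\; \lambda\, \mathrm{TV}^{(2)}(f)
                      \quad \text{s.t.} \quad \mathrm{Lip}(f) \le L,
\end{align*}

where $\mathrm{Lip}(f)$ is the Lipschitz constant of $f$, $\mathrm{TV}^{(2)}(f) = \|\mathrm{D}^2 f\|_{\mathcal{M}}$ is the total variation of the second (distributional) derivative, and $L > 0$ is the user-defined upper bound. For a two-layer univariate ReLU network $f(x) = \sum_k v_k\,\mathrm{ReLU}(w_k x - b_k)$ plus an affine term, the standard rescaling $(v_k, w_k, b_k) \mapsto (v_k/\alpha_k, \alpha_k w_k, \alpha_k b_k)$ with $\alpha_k > 0$ leaves $f$ unchanged and shows that weight decay on $(v_k, w_k)$ is minimized at $2\sum_k |v_k w_k|$; this is the usual mechanism behind equivalences of this kind.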
