Wasserstein regularization for sparse multi-task regression

05/20/2018
by Hicham Janati et al.

Two important elements have driven recent innovation in the field of regression: sparsity-inducing regularization, to cope with high-dimensional problems; and multi-task learning through joint parameter estimation, to augment the number of training samples. The two approaches complement each other: joint estimation yields more samples, which are needed to estimate sparse models accurately, while sparsity promotes models that act on subsets of related variables. This idea has driven the proposal of block regularizers such as L1/Lq norms, which, while effective, require that active regressors strictly overlap. In this paper, we propose a more flexible convex regularizer based on unbalanced optimal transport (OT) theory. This regularizer promotes parameters that are close in the OT geometry, which incorporates prior geometric knowledge about the regressor variables. We derive an efficient algorithm based on a regularized formulation of optimal transport, which alternates applications of Sinkhorn's algorithm with coordinate descent iterations. The performance of our model is demonstrated on regular grids and on complex triangulated geometries of the cortex, with an application in neuroimaging.
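The solver sketched in the abstract alternates Sinkhorn's algorithm with coordinate descent. As a point of reference, here is a minimal NumPy sketch of the balanced, entropy-regularized Sinkhorn iteration that underlies such solvers. This is not the authors' unbalanced variant; the function name, parameter values, and toy data are illustrative only.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT between histograms a and b with ground cost C.

    Returns the transport plan P whose marginals (approximately) match a and b.
    """
    K = np.exp(-C / eps)              # Gibbs kernel from the ground cost
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)             # scale columns to match marginal b
        u = a / (K @ v)               # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]   # diag(u) K diag(v)
    return P

# Toy example: two histograms on a 1-D grid with squared-distance cost.
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
a = np.full(5, 0.2)
b = np.array([0.1, 0.1, 0.2, 0.3, 0.3])
P = sinkhorn(a, b, C)
```

A larger `eps` blurs the plan but speeds up convergence; the unbalanced formulation used in the paper additionally relaxes the marginal constraints, which keeps the regularizer well defined when the tasks' parameter vectors have different masses.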

Related research

- 09/26/2013, High-dimensional Joint Sparsity Random Effects Model for Multi-task Learning: "Joint sparsity regularization in multi-task learning has attracted much ..."
- 02/14/2021, Sliced Multi-Marginal Optimal Transport: "We study multi-marginal optimal transport, a generalization of optimal t..."
- 06/11/2015, GAP Safe screening rules for sparse multi-task and multi-class models: "High dimensional regression benefits from sparsity promoting regularizat..."
- 02/08/2023, Monge, Bregman and Occam: Interpretable Optimal Transport in High-Dimensions with Feature-Sparse Maps: "Optimal transport (OT) theory focuses, among all maps T:ℝ^d→ℝ^d that can..."
- 10/22/2012, Multi-Stage Multi-Task Feature Learning: "Multi-task sparse feature learning aims to improve the generalization pe..."
- 09/22/2019, Volume Preserving Image Segmentation with Entropic Regularization Optimal Transport and Its Applications in Deep Learning: "Image segmentation with a volume constraint is an important prior for ma..."
- 08/27/2019, DeepHoyer: Learning Sparser Neural Network with Differentiable Scale-Invariant Sparsity Measures: "In seeking for sparse and efficient neural network models, many previous..."
