Asymmetric compressive learning guarantees with applications to quantized sketches

04/20/2021
by Vincent Schellekens, et al.

The compressive learning framework reduces the computational cost of training on large-scale datasets. In a sketching phase, the data is first compressed into a lightweight sketch vector, obtained by mapping the data samples through a well-chosen feature map and averaging those contributions. In a learning phase, the desired model parameters are then extracted from this sketch by solving an optimization problem, which also involves a feature map. When the feature map is identical in the sketching and learning phases, formal statistical guarantees (excess risk bounds) have been proven. However, the desirable properties of the feature map differ between sketching and learning (e.g., quantized outputs for the former, differentiability for the latter). We thus study the relaxation where this map is allowed to differ between the two phases. First, we prove that the existing guarantees carry over to this asymmetric scheme, up to a controlled error term, provided a Limited Projected Distortion (LPD) property holds. We then instantiate this framework in the setting of quantized sketches by proving that the LPD indeed holds for binary sketch contributions. Finally, we validate the approach with numerical simulations, including a large-scale application to audio event classification.
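To make the sketching phase concrete, the following is a minimal, illustrative sketch in NumPy. It is not the paper's implementation: the random-Fourier-feature map, the frequency distribution, and the sign-based binary quantizer are all assumptions chosen for illustration (the paper studies binary sketch contributions more generally), but the structure — map each sample through a feature map, then average the contributions — matches the scheme described above, with an unquantized map for learning and a quantized one for sketching.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: n samples in d dimensions; sketch dimension m.
n, d, m = 1000, 2, 64
X = rng.normal(size=(n, d))

# Random frequencies defining a random-Fourier-feature map
# (an illustrative choice of feature map).
W = rng.normal(size=(m, d))


def sketch(X, W):
    """Unquantized sketch: average of complex exponential features
    exp(i * <w_j, x>) over all samples."""
    return np.exp(1j * X @ W.T).mean(axis=0)


def quantized_sketch(X, W):
    """Asymmetric variant: each sample contributes only the signs of the
    real and imaginary parts of its feature vector (a stand-in for the
    binary sketch contributions studied in the paper)."""
    Z = np.exp(1j * X @ W.T)
    return (np.sign(Z.real) + 1j * np.sign(Z.imag)).mean(axis=0)


z = sketch(X, W)        # feature map usable in the learning phase
zq = quantized_sketch(X, W)  # lightweight binary-contribution sketch
print(z.shape, zq.shape)
```

Both sketches are m-dimensional vectors whose size is independent of n, which is what makes the subsequent learning phase cheap; the asymmetric setting of the paper asks when learning from `zq` with the unquantized map still enjoys excess-risk guarantees.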


Related research:

- Compressive Statistical Learning with Random Feature Moments (06/22/2017)
- Sketching Datasets for Large-Scale Learning (long version) (08/04/2020)
- Quantized Compressive K-Means (04/26/2018)
- When compressive learning fails: blame the decoder or the sketch? (09/14/2020)
- Mean Nyström Embeddings for Adaptive Compressive Learning (10/21/2021)
- Compressive Learning for Semi-Parametric Models (10/22/2019)
- Compressive Learning of Generative Networks (02/12/2020)
