Active Labeling: Streaming Stochastic Gradients

05/26/2022
by Vivien Cabannes, et al.

Stochastic gradient descent is the workhorse of machine learning. To access stochastic gradients, it is common to iterate over the input/output pairs of a training dataset. Interestingly, one does not need full supervision to access stochastic gradients, which is the main motivation of this paper. After formalizing the "active labeling" problem, which generalizes active learning based on partial supervision, we provide a streaming technique that provably minimizes the ratio of generalization error to number of samples. We illustrate our technique in depth for robust regression.
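The claim that partial supervision can yield full stochastic gradients can be made concrete for the robust regression setting the abstract mentions. The sketch below is a minimal illustration under assumed conventions, not the paper's code: for a linear model trained with the absolute loss |f(x) - y|, the stochastic gradient sign(f(x) - y) * x depends on the label through a single bit, so one yes/no query ("is the label below the current prediction?") suffices. The names `query_oracle` and `w_star`, and the noise model, are hypothetical choices for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
w_star = rng.normal(size=d)   # hidden ground-truth parameters (hypothetical)
w = np.zeros(d)               # learner's parameters
lr = 0.01

def query_oracle(x, threshold):
    """Partial supervision: answers whether the hidden label lies below
    `threshold`, without ever revealing the label itself."""
    y = x @ w_star + rng.normal(scale=0.1)  # label stays hidden from the learner
    return y < threshold

for _ in range(10_000):
    x = rng.normal(size=d)    # streaming input
    pred = w @ x
    # One binary query recovers sign(pred - y), and hence the full
    # stochastic gradient of the absolute loss at this sample.
    g = x if query_oracle(x, pred) else -x
    w -= lr * g

print("estimation error:", np.linalg.norm(w - w_star))
```

With symmetric noise the conditional median of the label is w_star @ x, so this one-bit-per-sample descent drives w toward w_star, illustrating how weak queries can stand in for full labels.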

