Active Learning with Neural Networks: Insights from Nonparametric Statistics

by Yinglun Zhu et al.

Deep neural networks have great representation power but typically require large numbers of training examples. This motivates deep active learning methods that can significantly reduce the amount of labeled training data. Empirical successes of deep active learning have recently been reported in the literature; however, rigorous label complexity guarantees for deep active learning have remained elusive, constituting a significant gap between theory and practice. This paper tackles that gap by providing the first near-optimal label complexity guarantees for deep active learning. The key insight is to study deep active learning from the nonparametric classification perspective. Under standard low noise conditions, we show that active learning with neural networks can provably achieve the minimax label complexity, up to disagreement coefficient and other logarithmic terms. When equipped with an abstention option, we further develop an efficient deep active learning algorithm that achieves polylog(1/Ξ΅) label complexity, without any low noise assumptions. We also provide extensions of our results beyond the commonly studied Sobolev/HΓΆlder spaces and develop label complexity guarantees for learning in Radon BV^2 spaces, which have recently been proposed as natural function spaces associated with neural networks.
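To make the two ingredients of the abstract concrete, the following is a minimal, illustrative sketch (not the paper's algorithm) of pool-based active learning via uncertainty sampling, combined with an abstention option at prediction time: the learner queries the pool point closest to the decision boundary, and the final classifier abstains wherever its confidence margin is small. The synthetic data, logistic model, query budget, and abstention threshold are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic pool: a 1-D threshold problem with noiseless labels.
X = rng.uniform(-1, 1, size=(500, 1))
y = (X[:, 0] > 0).astype(int)

def fit_logistic(Xl, yl, lr=0.5, steps=200):
    """Plain gradient-descent logistic regression; returns (weights, bias)."""
    w, b = np.zeros(Xl.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(Xl @ w + b)))
        g = p - yl
        w -= lr * (Xl.T @ g) / len(yl)
        b -= lr * g.mean()
    return w, b

# Uncertainty sampling: start from a small random labeled seed, then
# repeatedly query the unlabeled point whose prediction is closest to 1/2.
labeled = list(rng.choice(len(X), size=10, replace=False))
for _ in range(40):
    w, b = fit_logistic(X[labeled], y[labeled])
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    unc = np.abs(p - 0.5)
    unc[labeled] = np.inf          # never re-query a labeled point
    labeled.append(int(np.argmin(unc)))

# Final classifier with an abstention option: refuse to predict when the
# confidence margin |p - 1/2| falls below a (hypothetical) threshold.
w, b = fit_logistic(X[labeled], y[labeled])
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
abstain = np.abs(p - 0.5) < 0.1
predict = (p > 0.5).astype(int)
acc = (predict[~abstain] == y[~abstain]).mean()
```

Intuitively, the abstention region absorbs exactly the points near the decision boundary where label noise would otherwise force many queries, which is the mechanism behind dropping the low noise assumption in the abstract.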


