How transferable are the datasets collected by active learners?

by   David Lowell, et al.

Active learning is a widely-used training strategy for maximizing predictive performance subject to a fixed annotation budget. Between rounds of training, an active learner iteratively selects examples for annotation, typically based on some measure of the model's uncertainty, coupling the acquired dataset with the underlying model. However, owing to the high cost of annotation and the rapid pace of model development, labeled datasets may remain valuable long after a particular model is surpassed by new technology. In this paper, we investigate the transferability of datasets collected with an acquisition model A to a distinct successor model S. We seek to characterize whether the benefits of active learning persist when A and S are different models. To this end, we consider two standard NLP tasks and associated datasets: text classification and sequence tagging. We find that training S on a dataset actively acquired with a (different) model A typically yields worse performance than when S is trained with "native" data (i.e., acquired actively using S), and often performs worse than training on i.i.d. sampled data. These findings have implications for the use of active learning in practice, suggesting that it is better suited to cases where models are updated no more frequently than labeled data.
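The setup studied here can be made concrete with a small sketch: an acquisition model A selects examples by uncertainty sampling, and a distinct successor model S is then trained on the acquired set versus an i.i.d. sample of the same size. This is an illustrative toy, not the authors' code; the models, dataset, and budget sizes below are assumptions chosen only to show the experimental structure.

```python
# Illustrative sketch of cross-model dataset transfer in active learning.
# Acquisition model A = logistic regression; successor S = decision tree.
# All choices here are hypothetical, not those used in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_pool, y_pool = X[:1500], y[:1500]   # unlabeled pool (labels revealed on query)
X_test, y_test = X[1500:], y[1500:]

def acquire(model, X_pool, y_pool, seed_size=50, rounds=10, batch=20):
    """Uncertainty sampling: each round, label the pool points on which
    the acquisition model is least confident."""
    labeled = list(range(seed_size))              # small random seed set
    unlabeled = set(range(seed_size, len(X_pool)))
    for _ in range(rounds):
        model.fit(X_pool[labeled], y_pool[labeled])
        idx = np.array(sorted(unlabeled))
        conf = model.predict_proba(X_pool[idx]).max(axis=1)
        picks = idx[np.argsort(conf)[:batch]]     # least-confident examples
        labeled.extend(picks)
        unlabeled.difference_update(picks)
    return labeled

# Model A acquires the dataset...
acquired = acquire(LogisticRegression(max_iter=1000), X_pool, y_pool)

# ...then a different successor model S is trained on it, versus an
# i.i.d. sample of the same annotation budget.
S_active = DecisionTreeClassifier(random_state=0).fit(
    X_pool[acquired], y_pool[acquired])
iid = rng.choice(len(X_pool), size=len(acquired), replace=False)
S_iid = DecisionTreeClassifier(random_state=0).fit(X_pool[iid], y_pool[iid])

print("S on data acquired by A:", accuracy_score(y_test, S_active.predict(X_test)))
print("S on i.i.d. data:       ", accuracy_score(y_test, S_iid.predict(X_test)))
```

Comparing the two test accuracies (and a third run where S acquires its own "native" data) reproduces, in miniature, the comparison the paper carries out at scale.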



