Learning to Learn from Weak Supervision by Full Supervision

11/30/2017
by   Mostafa Dehghani, et al.

In this paper, we propose a method for training neural networks when we have a large set of data with weak labels and only a small amount of data with true labels. In our proposed model, we train two neural networks: a target network, the learner, and a confidence network, the meta-learner. The target network is optimized to perform the given task and is trained on the large set of data that carries only weak annotations. We propose to control the magnitude of the gradient updates to the target network using the scores provided by the confidence network, which is trained on the small amount of supervised data. In this way, we prevent weight updates computed from noisy labels from harming the quality of the target network.
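The following PyTorch sketch illustrates the core idea from the abstract: a confidence network scores each weakly labelled example, and that score scales the example's contribution to the loss, and therefore the magnitude of the gradient update to the target network. The architectures, names, and training-loop structure below are illustrative assumptions, not the authors' implementation, and the separate training of the confidence network on the strongly labelled data is omitted.

import torch
import torch.nn as nn

class TargetNetwork(nn.Module):
    """Learner: solves the actual task (here, binary classification)."""
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)  # logits

class ConfidenceNetwork(nn.Module):
    """Meta-learner: maps (example, weak label) to a confidence score in
    [0, 1]; assumed to be trained on the small supervised set (not shown)."""
    def __init__(self, in_dim: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + 1, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, weak_label):
        inp = torch.cat([x, weak_label.unsqueeze(-1)], dim=-1)
        return torch.sigmoid(self.net(inp)).squeeze(-1)

def weakly_supervised_step(target, confidence, opt, x, weak_label):
    """One update of the target network on a weakly labelled batch, with each
    example's gradient contribution scaled by its confidence score."""
    logits = target(x)
    per_example = nn.functional.binary_cross_entropy_with_logits(
        logits, weak_label, reduction="none")
    with torch.no_grad():                   # confidence network is fixed here
        scores = confidence(x, weak_label)  # scores in [0, 1]
    loss = (scores * per_example).mean()    # down-weights likely-noisy labels
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Hypothetical usage on a random weakly labelled batch:
target = TargetNetwork(in_dim=20)
confidence = ConfidenceNetwork(in_dim=20)
opt = torch.optim.Adam(target.parameters(), lr=1e-3)
x = torch.randn(128, 20)
weak_label = torch.randint(0, 2, (128,)).float()
weakly_supervised_step(target, confidence, opt, x, weak_label)

Because the confidence scores multiply the per-example losses, an example the meta-learner deems unreliable yields a proportionally smaller gradient step, which is the "controlled update magnitude" described above.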


