kNet: A Deep kNN Network To Handle Label Noise

by Itzik Mizrahi, et al.

Deep neural networks require large amounts of labeled data for training. Collecting this data at scale inevitably introduces label noise, hence the need for learning algorithms that are robust to it. In recent years, k Nearest Neighbors (kNN) has emerged as a viable solution to this problem. Despite its success, kNN is not without problems: it requires a huge memory footprint to store all the training samples, and it needs an advanced data structure for fast retrieval of the relevant examples given a query sample. We propose a neural network, termed kNet, that learns to perform kNN. Once trained, we no longer need to store the training data, and processing a query sample is a simple matter of inference. To use kNet, we first train a preliminary network on the data set, and then train kNet on the penultimate layer of the preliminary network. We find that kNet gives a smooth approximation of kNN and cannot handle the sharp label changes between samples that kNN can exhibit. This indicates that kNet is currently best suited to approximate kNN with a fairly large k. Experiments on two data sets show that this is the regime in which kNN works best, and it can therefore be replaced by kNet. In practice, kNet consistently improves the results of all preliminary networks, in all label noise regimes, by up to 3
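The two-stage recipe described in the abstract can be illustrated with a minimal, self-contained sketch. The code below is not the authors' implementation: it stands in for the preliminary network's penultimate-layer features with synthetic 2-D clusters, computes kNN soft labels (the fraction of each class among the k nearest neighbors), and then trains a toy linear-softmax "kNet" to reproduce those soft labels from the features alone, so that inference replaces the neighbor search. All names (`knn_soft_labels`, the cluster parameters, the learning rate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1 (stand-in): penultimate-layer features from a preliminary network.
# Here two Gaussian clusters serve as 2-D "features"; in the paper these
# would come from the trained preliminary network's penultimate layer.
n_per_class, d, num_classes, k = 100, 2, 2, 15
feats = np.vstack([rng.normal(-2.0, 1.0, (n_per_class, d)),
                   rng.normal(+2.0, 1.0, (n_per_class, d))])
labels = np.repeat(np.arange(num_classes), n_per_class)

def knn_soft_labels(X, y, k, num_classes):
    """kNN soft targets: per-class vote fraction among the k nearest neighbors."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)              # exclude the sample itself
    nn = np.argsort(d2, axis=1)[:, :k]        # indices of k nearest neighbors
    out = np.zeros((len(X), num_classes))
    for c in range(num_classes):
        out[:, c] = (y[nn] == c).mean(axis=1)
    return out

targets = knn_soft_labels(feats, labels, k, num_classes)

# Stage 2 (toy "kNet"): a linear softmax model trained by gradient descent
# to mimic the kNN soft labels, given only the features.
W, b = np.zeros((d, num_classes)), np.zeros(num_classes)
lr = 0.1
for _ in range(2000):
    logits = feats @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - targets) / len(feats)         # softmax cross-entropy gradient
    W -= lr * feats.T @ grad
    b -= lr * grad.sum(axis=0)

# Once trained, no training data needs to be stored: a query is answered by
# a forward pass. Measure agreement with the hard kNN vote on the same points.
pred_knet = (feats @ W + b).argmax(axis=1)
pred_knn = targets.argmax(axis=1)
agreement = (pred_knet == pred_knn).mean()
```

Consistent with the abstract's observation, a smooth model like this matches kNN easily when k is large and the vote fractions vary gradually, but it cannot track the sharp label flips that small-k kNN produces near noisy samples.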

