Information-theoretical label embeddings for large-scale image classification

07/19/2016
by Francois Chollet, et al.

We present a method for training multi-label, massively multi-class image classification models that is faster and more accurate than supervision via a sigmoid cross-entropy loss (logistic regression). Our method consists in embedding high-dimensional sparse labels onto a lower-dimensional dense sphere of unit-normed vectors, and treating the classification problem as a cosine proximity regression problem on this sphere. We test our method on a dataset of 300 million high-resolution images with 17,000 labels, where it yields considerably faster convergence, as well as a 7% improvement in mean average precision compared to logistic regression.
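To make the setup concrete, below is a minimal sketch of the training recipe the abstract describes: sparse multi-hot labels are mapped to dense unit-normed target vectors, and an image model is trained to regress onto those targets with a cosine proximity loss. The embedding dimension (EMBED_DIM), the random-projection stand-in for the paper's information-theoretical label embedding, and the 2048-dimensional image-feature input are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np
import tensorflow as tf

NUM_LABELS = 17000   # label vocabulary size, as in the abstract
EMBED_DIM = 256      # assumed embedding dimensionality (not specified above)

# Placeholder label embedding: a fixed random projection followed by L2
# normalization. The paper instead derives embeddings from label statistics;
# this projection only illustrates the sparse-to-dense-sphere mapping.
projection = np.random.randn(NUM_LABELS, EMBED_DIM).astype("float32")

def embed_labels(multi_hot):
    """Map multi-hot label vectors (batch, NUM_LABELS) to unit-normed targets."""
    dense = multi_hot @ projection
    return dense / np.linalg.norm(dense, axis=-1, keepdims=True)

# Regression head on top of precomputed image features (assumed 2048-d):
# the output is constrained to the unit sphere and trained to maximize
# cosine similarity with the label embedding.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2048,)),
    tf.keras.layers.Dense(1024, activation="relu"),
    tf.keras.layers.Dense(EMBED_DIM),
    tf.keras.layers.UnitNormalization(),  # outputs live on the unit sphere
])
model.compile(
    optimizer="adam",
    # Keras' CosineSimilarity loss returns the negative cosine similarity,
    # so minimizing it performs cosine proximity regression.
    loss=tf.keras.losses.CosineSimilarity(axis=-1),
)
```

At inference time, labels can be recovered by ranking the label embeddings by cosine similarity to the predicted vector; the details of that decoding step, like the embedding construction itself, are in the full paper.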
