Evaluating Bregman Divergences for Probability Learning from Crowd

01/30/2019
by   F. A. Mena, et al.

Crowdsourcing scenarios are a good example of settings in which, for each item, we have a probability distribution over categories that reflects what people collectively think. Learning a predictive model of this probability distribution can be far more valuable than learning only a discriminative model that outputs the most likely category for the data. Here we present different models adapted to use a probability distribution as the training target for a machine learning model. We focus on the Bregman divergence framework as a source of objective functions to minimize. The results show that special care must be taken when building an objective function and when carrying out an equivalent optimization of a neural network in the Keras framework.
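The abstract does not give implementation details, but as a rough illustration of the setup it describes, the sketch below trains a Keras model whose targets are crowd-derived probability distributions and whose losses are two members of the Bregman family (squared Euclidean distance and KL divergence). All names, dimensions, and architecture choices here are assumptions for illustration, not the authors' code.

```python
# Minimal sketch (assumed, not from the paper): Keras model trained against
# probability-distribution targets with two Bregman-divergence losses.
import numpy as np
import tensorflow as tf
from tensorflow import keras

NUM_CATEGORIES = 5   # assumed number of answer categories
NUM_FEATURES = 20    # assumed feature dimension

def squared_euclidean(y_true, y_pred):
    # Bregman divergence generated by F(p) = ||p||^2 / 2
    return tf.reduce_sum(tf.square(y_true - y_pred), axis=-1)

def kl_divergence(y_true, y_pred):
    # Bregman divergence generated by the negative entropy F(p) = sum_i p_i log p_i
    y_pred = tf.clip_by_value(y_pred, 1e-7, 1.0)
    y_true = tf.clip_by_value(y_true, 1e-7, 1.0)
    return tf.reduce_sum(y_true * tf.math.log(y_true / y_pred), axis=-1)

model = keras.Sequential([
    keras.layers.Input(shape=(NUM_FEATURES,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(NUM_CATEGORIES, activation="softmax"),  # outputs a distribution
])

# Swap kl_divergence for squared_euclidean to compare objective functions.
model.compile(optimizer="adam", loss=kl_divergence)

# Toy data: targets are empirical label distributions aggregated from crowd votes.
x = np.random.rand(128, NUM_FEATURES).astype("float32")
votes = np.random.randint(0, NUM_CATEGORIES, size=(128, 30))
y = np.stack(
    [np.bincount(v, minlength=NUM_CATEGORIES) / len(v) for v in votes]
).astype("float32")

model.fit(x, y, epochs=2, batch_size=32, verbose=0)
```

Because both losses take the full target distribution rather than a single hard label, the same training loop can be reused while only the choice of Bregman divergence changes, which mirrors the kind of comparison the abstract describes.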
