Structured Learning via Logistic Regression

07/03/2014
by Justin Domke

A successful approach to structured learning is to write the learning objective as a joint function of linear parameters and inference messages, and to alternate between updates to each. This paper observes that, if the inference problem is "smoothed" through the addition of entropy terms, then for fixed messages the learning objective reduces to a traditional (non-structured) logistic regression problem with respect to the parameters. In these logistic regression problems, each training example has a bias term determined by the current set of messages. Based on this insight, the structured energy function can be extended from linear factors to any function class for which an "oracle" exists to minimize a logistic loss.
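The key reduction can be illustrated with a minimal sketch: for fixed messages, each training example carries a fixed additive bias (offset), and the weights are then fit by ordinary logistic regression. The function names, toy data, and message-derived offsets below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg_with_offsets(X, y, offsets, lr=0.1, iters=500):
    """Gradient descent on the mean logistic loss, where each example
    has a fixed bias term (here standing in for the current messages)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ w + offsets)       # offsets are held fixed
        grad = X.T @ (p - y) / len(y)      # gradient of mean logistic loss
        w -= lr * grad
    return w

# Toy data: labels depend on the first feature plus an example-specific offset.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
offsets = rng.normal(scale=0.5, size=200)  # stand-in for message-derived biases
y = (2.0 * X[:, 0] + offsets > 0.0).astype(float)

w = fit_logreg_with_offsets(X, y, offsets)
```

In a full structured-learning loop, one would alternate this parameter fit with message updates; each round changes the offsets and re-solves a plain logistic regression.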
