Semistochastic Quadratic Bound Methods

09/05/2013
by Aleksandr Y. Aravkin et al.

Partition functions arise in a variety of settings, including conditional random fields, logistic regression, and latent Gaussian models. In this paper, we consider semistochastic quadratic bound (SQB) methods for maximum likelihood inference based on partition function optimization. Batch methods based on the quadratic bound were recently proposed for this class of problems and performed favorably in comparison to state-of-the-art techniques. Semistochastic methods fall between batch algorithms, which use all the data, and stochastic gradient-type methods, which use small random selections at each iteration. We develop semistochastic quadratic bound methods and prove both global convergence (to a stationary point) under very weak assumptions and a linear convergence rate under stronger assumptions on the objective. To make the proposed methods faster and more stable, we consider inexact subproblem minimization and batch-size selection schemes. The efficacy of SQB methods is demonstrated via comparison with several state-of-the-art techniques on commonly used datasets.
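
To make the idea concrete, here is a minimal sketch of a semistochastic quadratic-bound iteration applied to multinomial logistic regression, a standard partition-function model. The fixed curvature matrix uses the classical Böhning bound 0.5 (I - 11^T / k) as a stand-in for the paper's bound; the geometric batch-growth schedule, step size eta, ridge term lam, and the function name sqb_multinomial are illustrative assumptions, not details taken from the paper.

```python
# Illustrative semistochastic quadratic-bound sketch for multinomial
# logistic regression.  The curvature matrix B below is the classical
# Boehning bound 0.5 * (I_k - 1 1^T / k), used here as a stand-in for
# the paper's quadratic bound; batch growth, step size, and regularizer
# are illustrative choices.
import numpy as np

def sqb_multinomial(X, y, k, iters=100, eta=1.0, lam=1e-3,
                    batch0=32, growth=1.1, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Theta = np.zeros((d, k))                     # one weight column per class
    Y = np.eye(k)[y]                             # one-hot labels, y in {0,...,k-1}
    B = 0.5 * (np.eye(k) - np.ones((k, k)) / k)  # fixed curvature bound

    batch = batch0
    for t in range(iters):
        # Semistochastic: sample a mini-batch whose size grows toward n.
        batch = min(n, int(np.ceil(batch * growth)))
        idx = rng.choice(n, size=batch, replace=False)
        Xb, Yb = X[idx], Y[idx]

        # Mini-batch gradient of the averaged negative log-likelihood.
        Z = Xb @ Theta
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        G = Xb.T @ (P - Yb) / batch              # shape (d, k)

        # Quadratic-bound curvature on the mini-batch, damped by a small
        # ridge term for numerical stability.
        S = np.kron(B, Xb.T @ Xb / batch) + lam * np.eye(d * k)

        # Bound step: minimize the quadratic surrogate.  The paper also
        # allows inexact subproblem minimization (e.g. a few CG steps)
        # in place of this dense solve.
        step = np.linalg.solve(S, G.flatten(order='F'))
        Theta -= eta * step.reshape((d, k), order='F')
    return Theta
```

Because the curvature matrix is a fixed upper bound on the Hessian of the log-partition term, each iteration is a damped Newton-like step that is guaranteed to decrease the quadratic surrogate; growing the mini-batch trades the cheap, noisy early iterations of a stochastic method for the reliable late-stage progress of a batch method.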

Related research

11/03/2020 · SGB: Stochastic Gradient Bound Method for Optimizing Partition Functions
06/20/2020 · Asymptotically Optimal Exact Minibatch Metropolis-Hastings
09/23/2020 · A Unified Analysis of First-Order Methods for Smooth Games via Integral Quadratic Constraints
11/03/2017 · Analysis of Approximate Stochastic Gradient Using Quadratic Constraints and Sequential Semidefinite Programs
11/05/2015 · Stop Wasting My Gradients: Practical SVRG
12/02/2019 · Risk Bounds for Low Cost Bipartite Ranking
12/04/2022 · Convergence under Lipschitz smoothness of ease-controlled Random Reshuffling gradient Algorithms
