Convergence Rates of Gradient Descent and MM Algorithms for Generalized Bradley-Terry Models

01/01/2019
by   Milan Vojnovic, et al.

We show tight convergence rate bounds for gradient descent and MM algorithms for maximum likelihood (ML) estimation and maximum a posteriori (MAP) probability estimation, a popular Bayesian inference method, for generalized Bradley-Terry models. This class of models includes the Bradley-Terry model of paired comparisons, the Rao-Kupper model of paired comparisons with ties, the Luce choice model, and the Plackett-Luce ranking model. Our results show that MM algorithms have the same convergence rates as gradient descent algorithms, up to constant factors. For ML estimation, the convergence is linear, with the rate crucially determined by the algebraic connectivity of the matrix of item-pair co-occurrences in the observed comparison data. For Bayesian inference, the convergence rate is also linear, with the rate determined by a parameter of the prior distribution in a way that can make convergence arbitrarily slow for small values of this parameter. We propose a simple, first-order acceleration method that resolves this slow-convergence issue.
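The paper itself is about rate bounds rather than implementation, but for context the sketch below shows the standard Hunter-style MM update and a plain gradient ascent step for the ordinary Bradley-Terry model (no ties, no prior). It is a minimal illustration under our own assumptions, not the authors' code: the function names bt_mm and bt_gradient_ascent, the win-count matrix layout, the step size, and the toy data are all hypothetical.

```python
import numpy as np


def bt_mm(wins, iters=1000, tol=1e-8):
    """Hunter-style MM updates for the plain Bradley-Terry MLE (illustrative sketch).

    wins[i, j] = number of times item i beat item j (zero diagonal assumed).
    Assumes the comparison data satisfy the usual conditions for the MLE to
    exist and be unique up to scaling.
    """
    comparisons = wins + wins.T            # m_ij: how often i and j were compared
    total_wins = wins.sum(axis=1)          # W_i: total wins of item i
    w = np.ones(wins.shape[0])
    for _ in range(iters):
        pair_sums = w[:, None] + w[None, :]
        # MM update: w_i <- W_i / sum_j m_ij / (w_i + w_j)
        w_new = total_wins / (comparisons / pair_sums).sum(axis=1)
        w_new /= w_new.sum()               # fix the scale (w is identifiable only up to a constant)
        if np.max(np.abs(w_new - w)) < tol:
            return w_new
        w = w_new
    return w


def bt_gradient_ascent(wins, step=0.1, iters=5000):
    """Gradient ascent on the Bradley-Terry log-likelihood in theta = log w (illustrative sketch)."""
    comparisons = wins + wins.T
    total_wins = wins.sum(axis=1)
    theta = np.zeros(wins.shape[0])
    for _ in range(iters):
        w = np.exp(theta)
        pair_sums = w[:, None] + w[None, :]
        # Expected wins under the current weights: sum_j m_ij * w_i / (w_i + w_j)
        expected_wins = (comparisons * (w[:, None] / pair_sums)).sum(axis=1)
        theta += step * (total_wins - expected_wins)   # gradient of the log-likelihood
        theta -= theta.mean()                          # remove the translation invariance
    return np.exp(theta) / np.exp(theta).sum()


if __name__ == "__main__":
    # Toy data: item 0 beat item 1 three times, item 1 beat item 2 twice, etc.
    wins = np.array([[0, 3, 1],
                     [1, 0, 2],
                     [0, 1, 0]], dtype=float)
    print("MM estimate:      ", bt_mm(wins))
    print("Gradient estimate:", bt_gradient_ascent(wins))
```

Both iterations leave the overall scale of the weights free, which is why each sketch normalizes its estimate; the algebraic-connectivity condition mentioned in the abstract roughly corresponds to the comparison graph implied by wins + wins.T being connected.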

Related research

11/08/2010
Efficient Bayesian Inference for Generalized Bradley-Terry Models
The Bradley-Terry model is a popular approach to describe probabilities ...

02/17/2022
Refined Convergence Rates for Maximum Likelihood Estimation under Finite Mixture Models
We revisit convergence rates for maximum likelihood estimation (MLE) und...

09/16/2022
Maximum likelihood estimation and prediction error for a Matérn model on the circle
This work considers Gaussian process interpolation with a periodized ver...

12/06/2022
Further analysis of multilevel Stein variational gradient descent with an application to the Bayesian inference of glacier ice models
Multilevel Stein variational gradient descent is a method for particle-b...

12/16/2017
Hierarchical Bayesian Bradley-Terry for Applications in Major League Baseball
A common problem faced in statistical inference is drawing conclusions f...

03/13/2020
Boosting Frank-Wolfe by Chasing Gradients
The Frank-Wolfe algorithm has become a popular first-order optimization ...

06/23/2023
On tracking varying bounds when forecasting bounded time series
We consider a new framework where a continuous, though bounded, random v...
