Hessian barrier algorithms for linearly constrained optimization problems

09/25/2018
by   Immanuel M. Bomze, et al.

In this paper, we propose an interior-point method for linearly constrained optimization problems (possibly nonconvex). The method -- which we call the Hessian barrier algorithm (HBA) -- combines a forward Euler discretization of Hessian Riemannian gradient flows with an Armijo backtracking step-size policy. In this way, HBA can be seen as an explicit alternative to mirror descent (MD), and it contains as special cases the affine scaling algorithm, regularized Newton processes, and several other iterative solution methods. Our main result is that, modulo a non-degeneracy condition, the algorithm converges to the problem's set of critical points; hence, in the convex case, the algorithm converges globally to the problem's minimum set. In the case of linearly constrained quadratic programs (not necessarily convex), we also show that the method's convergence rate is O(1/k^ρ) for some ρ∈(0,1] that depends only on the choice of kernel function (i.e., not on the problem's primitives). These theoretical results are validated by numerical experiments on standard non-convex test functions and large-scale traffic assignment problems.
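To make the update rule concrete, the following is a minimal sketch of a Hessian-barrier-style iteration for the special case min f(x) subject to x ≥ 0, where the kernel's inverse Hessian reduces to a diagonal scaling of the gradient (θ = 2 recovers the log-barrier kernel, i.e. affine-scaling-type directions; θ = 1 the entropy kernel). The function names, parameter choices, and the simplified constraint set are illustrative assumptions, not the paper's full algorithm, which handles general linear constraints.

```python
import numpy as np

def hba(f, grad_f, x0, theta=2.0, alpha0=1.0, beta=0.5, sigma=1e-4,
        tol=1e-8, max_iter=500):
    """Sketch of a Hessian barrier iteration for min f(x) s.t. x >= 0.

    Direction: v = -x**theta * grad_f(x), i.e. the gradient preconditioned
    by the inverse Hessian of the barrier kernel (theta=2: log-barrier /
    affine scaling; theta=1: entropy kernel, mirror-descent-like).
    Step sizes are chosen by Armijo backtracking, shrinking further if the
    trial point would leave the strictly positive orthant.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        v = -(x ** theta) * g          # kernel-scaled descent direction
        if np.linalg.norm(v) < tol:    # stationarity of the scaled gradient
            break
        a = alpha0
        # Backtrack until (i) the iterate stays strictly feasible and
        # (ii) the Armijo sufficient-decrease condition holds.
        while np.any(x + a * v <= 0) or f(x + a * v) > f(x) + sigma * a * g.dot(v):
            a *= beta
            if a < 1e-16:
                break
        x = x + a * v
    return x

# Usage: a convex quadratic f(x) = 0.5 * ||x - c||^2 with c > 0,
# whose constrained minimum is simply x = c.
c = np.array([1.0, 2.0, 0.5])
x_star = hba(lambda x: 0.5 * np.sum((x - c) ** 2),
             lambda x: x - c,
             np.ones(3))
```

For nonconvex f the same loop only targets critical points, consistent with the convergence guarantee stated above.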

Related research

- Generalized Self-concordant Hessian-barrier algorithms (11/04/2019)
- Hessian informed mirror descent (06/25/2021)
- Partial minimization of strict convex functions and tensor scaling (05/26/2019)
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization (03/09/2018)
- Interior Point Methods with a Gradient Oracle (04/10/2023)
- The Dynamics of Sharpness-Aware Minimization: Bouncing Across Ravines and Drifting Towards Wide Minima (10/04/2022)
- Approximate Vanishing Ideal Computations at Scale (07/04/2022)
