Accelerating Inexact HyperGradient Descent for Bilevel Optimization

06/30/2023
by   Haikuo Yang, et al.

We present a method for solving general nonconvex-strongly-convex bilevel optimization problems. Our method, the Restarted Accelerated HyperGradient Descent method, finds an ϵ-first-order stationary point of the objective with 𝒪̃(κ^3.25 ϵ^-1.75) oracle complexity, where κ is the condition number of the lower-level objective and ϵ is the desired accuracy. We also propose a perturbed variant of the method that finds an (ϵ, 𝒪(κ^2.5 √ϵ))-second-order stationary point within the same order of oracle complexity. Our results achieve the best-known theoretical guarantees for finding stationary points in bilevel optimization, and they also improve upon the existing upper complexity bound for finding second-order stationary points in nonconvex-strongly-concave minimax optimization problems, setting a new state-of-the-art benchmark. Empirical studies are conducted to validate the theoretical results in this paper.
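For readers unfamiliar with the setting, the nonconvex-strongly-convex bilevel problem class referred to above can be written in the standard form below. The symbols (f, g, x, y, μ) are generic notation for this problem class, not identifiers taken from the paper itself:

```latex
\min_{x \in \mathbb{R}^{d_x}} \; \Phi(x) := f\bigl(x, y^*(x)\bigr)
\quad \text{s.t.} \quad
y^*(x) = \operatorname*{arg\,min}_{y \in \mathbb{R}^{d_y}} \; g(x, y),
```

where the upper-level objective f may be nonconvex in x, while the lower-level objective g(x, ·) is μ-strongly convex in y for every fixed x, so that y^*(x) is unique. The condition number κ in the complexity bound is the ratio of the smoothness constant of g to its strong-convexity parameter μ. An ϵ-first-order stationary point is a point x with ‖∇Φ(x)‖ ≤ ϵ.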
