Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization

12/19/2022
by   Zichong Li, et al.
IBM
Rensselaer Polytechnic Institute
Michigan State University

Many real-world problems not only have complicated nonconvex functional constraints but also involve a large number of data points. This motivates the design of efficient stochastic methods for finite-sum or expectation-constrained problems. In this paper, we design and analyze stochastic inexact augmented Lagrangian methods (Stoc-iALM) to solve problems with a nonconvex composite (i.e., smooth + nonsmooth) objective and nonconvex smooth functional constraints. We adopt the standard iALM framework and design a subroutine based on the momentum-based variance-reduced proximal stochastic gradient method (PStorm) together with a postprocessing step. Under certain regularity conditions (also assumed in existing works), we establish an oracle complexity of O(ε^-5) to reach an ε-KKT point in expectation, which improves on the best-known O(ε^-6) result. Numerical experiments on a fairness-constrained problem and the Neyman-Pearson classification problem with real data demonstrate that our proposed method outperforms an existing method that attains the previously best-known complexity.
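To make the algorithmic structure concrete, the sketch below shows a minimal stochastic inexact augmented Lagrangian loop whose inner subproblems are solved by a momentum-based variance-reduced (STORM-style) stochastic gradient method. The toy problem, the single equality constraint, the step sizes, and the penalty/multiplier schedule are illustrative assumptions, not the paper's exact Stoc-iALM/PStorm configuration; the toy objective is smooth, so no proximal step for a nonsmooth term appears.

# Minimal runnable sketch (assumed toy setup): stochastic iALM outer loop with a
# STORM-style variance-reduced inner solver for min E||x - xi||^2 s.t. sum(x) = 1.
import numpy as np

rng = np.random.default_rng(0)
d = 5
mu = rng.normal(size=d)                      # mean of the data distribution (toy)

def sample_xi():
    return mu + rng.normal(size=d)           # xi ~ N(mu, I)

def c(x):
    return np.array([x.sum() - 1.0])         # one smooth equality constraint

def jac_c(x):
    return np.ones((1, d))                   # Jacobian of c

def al_grad(x, xi, lam, beta):
    """Sampled gradient of the augmented Lagrangian
       L_beta(x, lam) = E||x - xi||^2 + lam^T c(x) + (beta/2) ||c(x)||^2."""
    grad_f = 2.0 * (x - xi)                  # stochastic gradient of the objective
    return grad_f + jac_c(x).T @ (lam + beta * c(x))

x, lam, beta = np.zeros(d), np.zeros(1), 1.0
momentum_a = 0.1                             # momentum weight of the estimator (assumed)

for k in range(10):                          # outer iALM iterations
    eta = 1.0 / (2.0 + beta * d)             # crude Lipschitz-based step size (assumed)
    g = al_grad(x, sample_xi(), lam, beta)
    for t in range(500):                     # inexact inner minimization of L_beta
        x_new = x - eta * g
        xi = sample_xi()
        # STORM-style recursive estimator: reuse the same sample at x_new and x
        g = al_grad(x_new, xi, lam, beta) \
            + (1.0 - momentum_a) * (g - al_grad(x, xi, lam, beta))
        x = x_new
    lam = lam + beta * c(x)                  # classical multiplier update
    beta *= 2.0                              # geometric penalty increase

print("constraint violation:", abs(c(x)[0]))

In the paper's method, the inner solver is PStorm (which additionally handles a nonsmooth regularizer via a proximal step), and the postprocessing step together with specific parameter schedules is what yields the O(ε^-5) oracle complexity; the sketch above omits those details.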

