Convergence rates for an inexact ADMM applied to separable convex optimization

01/06/2020
by William W. Hager, et al.

Convergence rates are established for an inexact accelerated alternating direction method of multipliers (I-ADMM) applied to general separable convex optimization with a linear constraint. Both ergodic and non-ergodic iterates are analyzed. Relative to the iteration number k, the convergence rate is O(1/k) in the convex setting and O(1/k^2) in the strongly convex setting. When an error bound condition holds, the algorithm is 2-step linearly convergent. The I-ADMM is designed so that the accuracy of the inexact iteration preserves the global convergence rates of the exact iteration, which leads to better performance on the test problems.
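To make the setting concrete, the following is a minimal sketch of the standard (exact, scaled-form) ADMM iteration on a toy separable problem, not the paper's inexact accelerated variant: minimize f(x) + g(z) subject to x - z = 0, with f(x) = 0.5||x - a||^2 and g(z) = 0.5||z - b||^2, whose optimum is x = z = (a + b)/2. The penalty parameter rho, the scaled dual variable u, and the update formulas are the common textbook convention and are assumptions here, not taken from the paper.

```python
import numpy as np

def admm_toy(a, b, rho=1.0, iters=200):
    """Scaled-form ADMM for: min 0.5||x-a||^2 + 0.5||z-b||^2  s.t.  x - z = 0."""
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)  # scaled dual variable (multiplier / rho)
    for _ in range(iters):
        # x-update: closed-form minimizer of f(x) + (rho/2)||x - z + u||^2
        x = (a + rho * (z - u)) / (1.0 + rho)
        # z-update: closed-form minimizer of g(z) + (rho/2)||x - z + u||^2
        z = (b + rho * (x + u)) / (1.0 + rho)
        # dual update: gradient ascent on the scaled multiplier
        u = u + x - z
    return x, z

a = np.array([0.0, 4.0])
b = np.array([2.0, 0.0])
x, z = admm_toy(a, b)
print(np.round(x, 4))  # approaches (a + b) / 2 = [1. 2.]
```

In an inexact variant such as the one the abstract describes, the x- and z-subproblems would be solved only approximately, with the error tolerance tightened across iterations so that the O(1/k) and O(1/k^2) rates of the exact method are preserved.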
