Convergence of block coordinate descent with diminishing radius for nonconvex optimization
Block coordinate descent (BCD), also known as nonlinear Gauss-Seidel, is a simple iterative algorithm for nonconvex optimization that sequentially minimizes the objective function in each block of coordinates while the other coordinates are held fixed. It is known that block-wise convexity of the objective is not enough to guarantee convergence of BCD to the stationary points, and some additional regularity condition is needed. In this work, we provide a simple modification of BCD that has guaranteed global convergence to the stationary points for block-wise convex objective functions without additional conditions. Our idea is to restrict the parameter search within a diminishing radius to promote stability of the iterates, and then to show that this auxiliary constraint vanishes in the limit. As an application, we provide a modified alternating least squares algorithm for nonnegative CP tensor factorization that is guaranteed to converge to the stationary points of the reconstruction error function. We also provide experimental validation of our result.
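To make the diminishing-radius idea concrete, here is a minimal Python sketch for the nonnegative CP application on a 3-way tensor. The function names (`_mode_grad`, `modified_als_ncpd`) and all parameter choices are hypothetical, and one simplification is assumed: each block subproblem is approximated by a single projected-gradient step clipped to the ball of radius c/t around the previous factor, rather than solved exactly as in the paper.

```python
import numpy as np

def _mode_grad(X, factors, k):
    """Gradient of f = 0.5 * ||X - CP reconstruction||_F^2 with respect to
    factors[k] for a 3-way tensor, via the MTTKRP identity:
    grad = A_k (Gram Hadamard product) - X_(k) (Khatri-Rao of the others)."""
    others = [factors[j] for j in range(3) if j != k]
    gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
    Xk = np.moveaxis(X, k, 0)  # mode-k axis first
    mttkrp = np.einsum('ijk,jr,kr->ir', Xk, others[0], others[1])
    return factors[k] @ gram - mttkrp

def modified_als_ncpd(X, rank, n_iter=500, c=1.0, step=1e-3, seed=0):
    """Hypothetical sketch of BCD with diminishing search radius for
    nonnegative CP factorization: at iteration t, each factor update is
    restricted to a ball of radius c/t around its previous value, so the
    auxiliary constraint vanishes as t grows."""
    rng = np.random.default_rng(seed)
    factors = [rng.random((d, rank)) for d in X.shape]
    for t in range(1, n_iter + 1):
        radius = c / t  # diminishing radius r_t = c/t
        for k in range(3):
            d = -step * _mode_grad(X, factors, k)
            nrm = np.linalg.norm(d)
            if nrm > radius:            # keep the update inside the radius
                d *= radius / nrm
            factors[k] = np.maximum(factors[k] + d, 0.0)  # nonnegativity
    return factors

# Usage on a synthetic nonnegative rank-3 tensor:
rng = np.random.default_rng(1)
A, B, C = (rng.random((n, 3)) for n in (10, 12, 14))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
factors = modified_als_ncpd(X, rank=3)
Xhat = np.einsum('ir,jr,kr->ijk', *factors)
print('relative error:', np.linalg.norm(X - Xhat) / np.linalg.norm(X))
```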