Accelerating Alternating Least Squares for Tensor Decomposition by Pairwise Perturbation
The alternating least squares algorithm for CP and Tucker decomposition is dominated in cost by the tensor contractions necessary to set up the quadratic optimization subproblems. We introduce a novel family of algorithms that uses perturbative corrections to the subproblems rather than recomputing the tensor contractions. This approximation is accurate when the factor matrices are changing little across iterations, which occurs when alternating least squares approaches convergence. We provide a theoretical analysis to bound the approximation error, leveraging a novel notion of the tensor condition number. Our numerical experiments demonstrate that the proposed pairwise perturbation algorithms are easy to control and converge to minima that are as good as those reached by alternating least squares. The new algorithms achieve speedups of 1.3-2.8x relative to state-of-the-art alternating least squares approaches for various model tensor problems and real datasets on 1, 16, and 256 Intel KNL nodes of the Stampede2 supercomputer.
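To make the core idea concrete, below is a minimal NumPy sketch (not the authors' implementation) of a pairwise-perturbation-style approximation for the mode-1 MTTKRP contraction in a third-order CP-ALS step. The tensor T, factor matrices B and C, their cached "setup" versions B_prev and C_prev, and the rank R are all illustrative assumptions: partial contractions of the tensor with single factors are cached and reused, and the approximation drops only the term that is second order in the factor perturbations, which is small near convergence.

```python
# Minimal sketch of the pairwise-perturbation idea for the mode-1 MTTKRP
# of a third-order CP-ALS step (illustrative, not the paper's exact algorithm).
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 40, 30, 20, 5
T = rng.standard_normal((I, J, K))

# Factors cached at the last "setup" step, and slightly perturbed current ones,
# mimicking the small changes seen as ALS approaches convergence.
B_prev = rng.standard_normal((J, R))
C_prev = rng.standard_normal((K, R))
B = B_prev + 1e-3 * rng.standard_normal((J, R))
C = C_prev + 1e-3 * rng.standard_normal((K, R))

# Exact mode-1 MTTKRP: the expensive contraction standard ALS recomputes every sweep.
M_exact = np.einsum('ijk,jr,kr->ir', T, B, C)

# Setup phase: cache partial contractions of T with single (previous) factors.
# These are reused across several subsequent sweeps, amortizing their cost.
P_TC = np.einsum('ijk,kr->ijr', T, C_prev)   # T contracted with C_prev only
P_TB = np.einsum('ijk,jr->ikr', T, B_prev)   # T contracted with B_prev only

# First-order approximation: writing C = C_prev + dC and B = B_prev + dB,
# keep the terms linear in the perturbations and drop the (dB, dC) cross term.
dC = C - C_prev
M_approx = (np.einsum('ijr,jr->ir', P_TC, B)
            + np.einsum('ikr,kr->ir', P_TB, dC))

rel_err = np.linalg.norm(M_approx - M_exact) / np.linalg.norm(M_exact)
print(f"relative error of approximated MTTKRP: {rel_err:.2e}")
```

In this toy setting the cheap per-sweep contractions cost O(IJR) instead of the O(IJKR) full MTTKRP; the paper's pairwise perturbation scheme generalizes this caching to all modes of higher-order tensors, where the savings are larger.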