Fast and Accurate Randomized Algorithms for Low-rank Tensor Decompositions

04/02/2021
by Linjian Ma, et al.

Low-rank Tucker and CP tensor decompositions are powerful tools in data analytics. The widely used alternating least squares (ALS) method, which solves a sequence of over-determined least squares subproblems, is inefficient for large and sparse tensors. We propose a fast and accurate sketched ALS algorithm for Tucker decomposition, which solves a sequence of sketched rank-constrained linear least squares subproblems. Theoretical sketch size upper bounds are provided to achieve O(ϵ)-relative error for each subproblem with two sketching techniques, TensorSketch and leverage score sampling. Experimental results show that this new ALS algorithm, combined with a new initialization scheme based on randomized range finder, yields up to 22.0% relative decomposition residual improvement compared to the state-of-the-art sketched randomized algorithm for Tucker decomposition of various synthetic datasets. This Tucker-ALS algorithm is further used to accelerate CP decomposition, by using randomized Tucker compression followed by CP decomposition of the Tucker core tensor. Experimental results show that this algorithm not only converges faster, but also yields more accurate CP decompositions.
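The core idea of solving each ALS least squares subproblem approximately via leverage score sampling can be illustrated in isolation. The sketch below is not the paper's implementation: the matrix sizes, sample count, and helper names are made up for illustration, and the leverage scores are computed exactly via a QR factorization (in the ALS setting the paper exploits tensor structure to obtain them more cheaply). It samples rows of an overdetermined system with probability proportional to their leverage scores, rescales, and solves the reduced problem:

```python
import numpy as np

def leverage_scores(A):
    """Exact leverage scores: squared row norms of an orthonormal basis Q."""
    Q, _ = np.linalg.qr(A)
    return np.sum(Q ** 2, axis=1)

def sketched_lstsq(A, b, m, rng):
    """Approximate argmin_x ||Ax - b|| by sampling m rows of (A, b)
    with probabilities proportional to leverage scores, rescaling,
    and solving the much smaller least squares problem."""
    p = leverage_scores(A)
    p = p / p.sum()
    idx = rng.choice(A.shape[0], size=m, replace=True, p=p)
    scale = 1.0 / np.sqrt(m * p[idx])       # importance-sampling rescaling
    x, *_ = np.linalg.lstsq(A[idx] * scale[:, None], b[idx] * scale,
                            rcond=None)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((10000, 20))        # tall, overdetermined system
b = A @ rng.standard_normal(20) + 0.01 * rng.standard_normal(10000)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sk = sketched_lstsq(A, b, 400, rng)       # solve using only 400 sampled rows
rel_err = np.linalg.norm(x_sk - x_exact) / np.linalg.norm(x_exact)
```

With a sample size well above the column dimension, the sketched solution is close to the exact one at a fraction of the cost; the paper's contribution includes bounds on how large this sample size must be to guarantee O(ϵ)-relative error for the rank-constrained Tucker subproblems.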


Related research

10/11/2022
Distributed-Memory Randomized Algorithms for Sparse Tensor CP Decomposition
Low-rank CANDECOMP/PARAFAC (CP) decomposition is a powerful tool for t...

06/30/2020
Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition
Conventional algorithms for finding low-rank canonical polyadic (CP) ten...

10/27/2019
Comparison of Accuracy and Scalability of Gauss-Newton and Alternating Least Squares for CP Decomposition
Alternating least squares is the most widely used algorithm for CP tenso...

07/03/2018
OCTen: Online Compression-based Tensor Decomposition
Tensor decompositions are powerful tools for large data analytics as the...

12/17/2019
Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
In this paper new general modewise Johnson-Lindenstrauss (JL) subspace e...

10/14/2021
More Efficient Sampling for Tensor Decomposition
Recent papers have developed alternating least squares (ALS) methods for...

10/09/2020
Concurrent Alternating Least Squares for multiple simultaneous Canonical Polyadic Decompositions
Tensor decompositions, such as CANDECOMP/PARAFAC (CP), are widely used i...
