Analysis of the Stochastic Alternating Least Squares Method for the Decomposition of Random Tensors

04/27/2020
by Yanzhao Cao, et al.

Stochastic Alternating Least Squares (SALS) is a method that approximates the canonical decomposition of averages of sampled random tensors. Its simplicity and efficient memory usage make SALS an ideal tool for decomposing tensors in an online setting. We show, under mild regularization and readily verifiable assumptions on the boundedness of the data, that the SALS algorithm is globally convergent. Numerical experiments validate our theoretical findings and demonstrate the algorithm's performance and complexity.
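The full analysis is in the paper, but the core iteration the abstract describes — a regularized alternating least-squares sweep applied to each newly sampled random tensor — can be sketched in NumPy. This is an illustrative reconstruction under stated assumptions, not the authors' code: the rank, the Tikhonov parameter `lam` (standing in for the paper's "mild regularization"), the sampling model (a fixed low-rank tensor plus Gaussian noise), and all variable names are our own choices.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product of U (I x r) and V (J x r): an (I*J) x r matrix."""
    r = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, r)

def sals_sweep(T, A, B, C, lam=1e-3):
    """One alternating least-squares sweep on a single sampled 3-way tensor T.
    Each factor solves a ridge-regularized least-squares problem; lam > 0
    plays the role of the mild regularization assumed in the analysis."""
    r = A.shape[1]
    I, J, K = T.shape
    # Mode-1 update: unfold T to I x (J*K), where T1[i, j*K + k] = T[i, j, k]
    M = khatri_rao(B, C)
    A = T.reshape(I, -1) @ M @ np.linalg.inv((B.T @ B) * (C.T @ C) + lam * np.eye(r))
    # Mode-2 update
    M = khatri_rao(A, C)
    B = T.transpose(1, 0, 2).reshape(J, -1) @ M @ np.linalg.inv((A.T @ A) * (C.T @ C) + lam * np.eye(r))
    # Mode-3 update
    M = khatri_rao(A, B)
    C = T.transpose(2, 0, 1).reshape(K, -1) @ M @ np.linalg.inv((A.T @ A) * (B.T @ B) + lam * np.eye(r))
    return A, B, C

# Online setting: each step draws one noisy sample of the underlying random tensor.
rng = np.random.default_rng(0)
r, shape = 3, (10, 10, 10)
At, Bt, Ct = (rng.standard_normal((n, r)) for n in shape)
T_true = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)

A, B, C = (rng.standard_normal((n, r)) for n in shape)
for _ in range(200):
    sample = T_true + 0.01 * rng.standard_normal(shape)  # one random-tensor draw
    A, B, C = sals_sweep(sample, A, B, C)

err = np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T_true) / np.linalg.norm(T_true)
```

Note the memory profile that makes the method attractive online: each sweep touches only the current sample and the three factor matrices, and each normal-equation matrix is only r x r thanks to the Khatri-Rao identity (B ⊙ C)ᵀ(B ⊙ C) = (BᵀB) ∗ (CᵀC).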


Related research

- A Fast Implementation for the Canonical Polyadic Decomposition (12/05/2019)
- Stochastic Gradients for Large-Scale Tensor Decomposition (06/04/2019)
- The Epsilon-Alternating Least Squares for Orthogonal Low-Rank Tensor Approximation and Its Global Convergence (11/25/2019)
- A rank-adaptive higher-order orthogonal iteration algorithm for truncated Tucker decomposition (10/25/2021)
- Practical Alternating Least Squares for Tensor Ring Decomposition (10/20/2022)
- A Coupled Random Projection Approach to Large-Scale Canonical Polyadic Decomposition (05/10/2021)
- Moment Estimation for Nonparametric Mixture Models Through Implicit Tensor Decomposition (10/25/2022)
