Low Rank Approximation for General Tensor Networks

07/15/2022
by   Arvind V. Mahankali, et al.

We study the problem of approximating a given tensor A ∈ ℝ^{n × … × n} with q modes by an arbitrary tensor network of rank k – that is, a graph G = (V, E), where |V| = q, together with a collection of tensors {U_v | v ∈ V} which are contracted in the manner specified by G to obtain a tensor T. For each mode of U_v corresponding to an edge incident to v, the dimension is k, and we wish to find U_v such that the Frobenius norm distance between T and A is minimized. This generalizes a number of well-known tensor network decompositions, such as the Tensor Train, Tensor Ring, Tucker, and PEPS decompositions. We approximate A by a binary tree network T' with O(q) cores, such that the dimension on each edge of this network is at most O(k^{O(dt)} · q/ε), where d is the maximum degree of G and t is its treewidth, and ‖A - T'‖_F^2 ≤ (1 + ε)‖A - T‖_F^2. The running time of our algorithm is O(q · nnz(A)) + n · poly(k^{dt}q/ε), where nnz(A) is the number of nonzero entries of A. Our algorithm is based on a new dimensionality reduction technique for tensor decomposition which may be of independent interest. We also develop fixed-parameter tractable (1 + ε)-approximation algorithms for Tensor Train and Tucker decompositions, improving the running time of Song, Woodruff and Zhong (SODA, 2019) and avoiding the use of generic polynomial system solvers. We show that our algorithms have a nearly optimal dependence on 1/ε assuming that there is no O(1)-approximation algorithm for the 2→4 norm with better running time than brute force. Finally, we give additional results for Tucker decomposition with robust loss functions, and fixed-parameter tractable CP decomposition.
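To make the setup concrete, here is a minimal NumPy sketch (not the paper's algorithm) of the objective for the simplest instance of the problem: a Tensor Train network, i.e. a graph G that is a path on q = 3 vertices. Each edge of G carries a rank-k dimension, each core U_v has one size-n mode plus one size-k mode per incident edge, and the goal is to minimize the Frobenius norm distance ‖A - T‖_F. The dimensions n and k below are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch of the objective for a q = 3 mode tensor A in
# R^{n x n x n}, with G a path graph (the Tensor Train special case).
rng = np.random.default_rng(0)
n, k = 4, 2

A = rng.standard_normal((n, n, n))  # target tensor

# Cores U_v: one mode of size n per vertex, one mode of size k per
# edge incident to that vertex.
U1 = rng.standard_normal((n, k))     # endpoint vertex, degree 1
U2 = rng.standard_normal((k, n, k))  # middle vertex, degree 2
U3 = rng.standard_normal((k, n))     # endpoint vertex, degree 1

# Contract the cores along the edges of G to obtain the tensor T.
T = np.einsum('ia,ajb,bk->ijk', U1, U2, U3)

# The objective the paper studies: minimize ||A - T||_F over the U_v.
err = np.linalg.norm(A - T)
print(T.shape)
```

Random cores give some (large) error here; the paper's contribution is computing cores whose error is within a (1 + ε) factor of the best achievable for the given network topology.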


