Machine learning with tree tensor networks, CP rank constraints, and tensor dropout

05/30/2023
by Hao Chen, et al.

Tensor networks approximate order-N tensors using a number of degrees of freedom that grows only polynomially in N, arranged as a network of partially contracted smaller tensors. As suggested in [arXiv:2205.15296] in the context of quantum many-body physics, computation costs can be reduced substantially further by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach outperforms other tensor-network based methods in Fashion-MNIST image classification: a low-rank TTN classifier with branching ratio b=4 reaches a test set accuracy of 90.3% at low computation cost. Consisting mostly of linear elements, tensor-network classifiers avoid the vanishing-gradient problem of deep neural networks. The CP rank constraints have additional advantages: the number of parameters can be decreased and tuned more freely to control overfitting, improve generalization, and reduce computation costs, and they allow us to employ trees with large branching ratios, which substantially improves the representation power.
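To make the CP rank constraint concrete, the following is a minimal NumPy sketch (not the authors' implementation) of a single TTN node whose order-(b+1) tensor is restricted to CP rank r. The node tensor is never formed explicitly: contracting it with the b feature vectors coming from the child subtrees reduces to b small matrix-vector products and an elementwise product over the CP rank index. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def cp_ttn_node(children, factors, out_factor):
    """Contract a CP-rank-constrained TTN node with its b child vectors.

    children:   list of b vectors (features from child subtrees), each shape (d,)
    factors:    list of b CP factor matrices, one per input leg, each shape (r, d)
    out_factor: CP factor matrix of the output leg, shape (r, D)

    The implied full node tensor is
        A[i1,...,ib,o] = sum_k factors[0][k,i1] * ... * factors[b-1][k,ib] * out_factor[k,o],
    but it is never materialized.
    """
    r = factors[0].shape[0]
    s = np.ones(r)
    for x, U in zip(children, factors):
        s = s * (U @ x)        # elementwise product over the CP rank index k
    return out_factor.T @ s    # output feature vector, shape (D,)
```

Contracting this way costs O(b r d + r D) per node instead of the O(d^b D) needed for a full (unconstrained) node tensor, which is what makes large branching ratios b affordable.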


Related research

- Triple Decomposition and Tensor Recovery of Third Order Tensors (02/06/2020)
  In this paper, we introduce a new tensor decomposition for third order t...

- Efficient construction of canonical polyadic approximations of tensor networks (08/22/2022)
  We consider the problem of constructing a canonical polyadic (CP) decomp...

- Parallel Nonnegative CP Decomposition of Dense Tensors (06/19/2018)
  The CP tensor decomposition is a low-rank approximation of a tensor. We ...

- On Recoverability of Randomly Compressed Tensors with Low CP Rank (01/08/2020)
  Our interest lies in the recoverability properties of compressed tensors...

- Tensor Estimation with Nearly Linear Samples (07/01/2020)
  There is a conjectured computational-statistical gap in terms of the num...

- Composite Binary Decomposition Networks (11/16/2018)
  Binary neural networks have great resource and computing efficiency, whi...

- Speeding-up Convolutional Neural Networks Using Fine-tuned CP-Decomposition (12/19/2014)
  We propose a simple two-step approach for speeding up convolution layers...
