Representation Theorem for Matrix Product States

03/15/2021
by Erdong Guo, et al.

In this work, we investigate the universal representation capacity of Matrix Product States (MPS) from the perspective of Boolean functions and continuous functions. We show that MPS can exactly realize arbitrary Boolean functions by giving an explicit construction of the corresponding MPS structure for any given Boolean gate. Moreover, we prove that the function space of MPS with the scale-invariant sigmoidal activation is dense in the space of continuous functions defined on a compact subset of the n-dimensional real coordinate space ℝ^n. We then study the relation between MPS and neural networks and show that an MPS with a scale-invariant sigmoidal function is equivalent to a one-hidden-layer neural network equipped with a kernel function. We construct the equivalent neural networks for several specific MPS models and find that non-linear kernels, such as the polynomial kernel, which introduce couplings between different components of the input, appear naturally in the equivalent networks. Finally, we discuss the realization of the Gaussian Process (GP) with infinitely wide MPS by studying their equivalent neural networks.
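To illustrate the kind of construction the abstract describes (the paper's own construction is not reproduced here; the cores below are a hypothetical hand-built example), a two-bit XOR gate can be written as a bond-dimension-2 MPS whose value on a bit string is obtained by contracting the cores along the bond indices:

```python
import numpy as np

# Hypothetical sketch: XOR(x1, x2) = x1*(1-x2) + (1-x1)*x2 encoded as an MPS
# (tensor train) with bond dimension 2. Core shape: (left_bond, physical, right_bond).

core1 = np.zeros((1, 2, 2))
core1[0, 0, :] = [0, 1]   # physical index s1 = 0 -> row (s1, 1-s1) = (0, 1)
core1[0, 1, :] = [1, 0]   # physical index s1 = 1 -> row (1, 0)

core2 = np.zeros((2, 2, 1))
core2[:, 0, 0] = [1, 0]   # physical index s2 = 0 -> column (1-s2, s2) = (1, 0)
core2[:, 1, 0] = [0, 1]   # physical index s2 = 1 -> column (0, 1)

def mps_eval(bits):
    """Evaluate the MPS on a tuple of bits by contracting left to right."""
    v = np.ones((1,))
    for core, s in zip((core1, core2), bits):
        v = v @ core[:, s, :]   # fix the physical index, contract the bond
    return float(v[0])

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mps_eval((x1, x2)))
```

XOR is a natural choice here because it is not a rank-1 (product) function of the two bits, so it genuinely requires bond dimension 2, mirroring how bond dimension controls the expressiveness of an MPS.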


