Deterministic equivalent of the Conjugate Kernel matrix associated to Artificial Neural Networks

06/09/2023
by Clément Chouard, et al.

We study the Conjugate Kernel associated with a multi-layer linear-width feed-forward neural network with random weights, biases and data. We show that the empirical spectral distribution of the Conjugate Kernel converges to a deterministic limit. More precisely, we obtain a deterministic equivalent for its Stieltjes transform and for its resolvent, with quantitative bounds involving both the dimension and the spectral parameter. The limiting equivalent objects are described by iterated free convolutions of measures and by classical matrix operations involving the parameters of the model.
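As a concrete illustration (not taken from the paper), the following minimal NumPy sketch builds the object under study: the Conjugate Kernel, understood here as the Gram matrix of the last-layer post-activation features of a random feed-forward network, together with its empirical eigenvalues and the empirical Stieltjes transform at a spectral parameter z off the real axis. The widths, the tanh activation and the Gaussian initialisation are illustrative choices, not the paper's exact assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000                      # number of data points
d0 = 800                      # input dimension
widths = [1200, 1500, 1000]   # layer widths proportional to n ("linear width")

X = rng.standard_normal((d0, n))   # random data
Y = X
d_prev = d0
for d in widths:
    W = rng.standard_normal((d, d_prev))        # random weights
    b = rng.standard_normal((d, 1))             # random biases
    Y = np.tanh(W @ Y / np.sqrt(d_prev) + b)    # post-activation features
    d_prev = d

# Conjugate Kernel: n x n Gram matrix of the last-layer features.
K = Y.T @ Y / d_prev

# Empirical spectral distribution is supported on these eigenvalues.
eigvals = np.linalg.eigvalsh(K)

# Empirical Stieltjes transform at z off the real axis:
# m_n(z) = (1/n) * trace((K - z I)^{-1}) = (1/n) * sum_i 1/(lambda_i - z).
z = 1.0 + 0.1j
m_n = np.mean(1.0 / (eigvals - z))
print(m_n)
```

For large n this empirical m_n(z) is the quantity that the paper's deterministic equivalent approximates, with quantitative error bounds in both n and z.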


Related research

06/20/2018
Neural Tangent Kernel: Convergence and Generalization in Neural Networks
At initialization, artificial neural networks (ANNs) are equivalent to G...

11/23/2022
Quantitative deterministic equivalent of sample covariance matrices with a general dependence structure
We study sample covariance matrices arising from rectangular random matr...

09/07/2021
On the space of coefficients of a Feed Forward Neural Network
We define and establish the conditions for `equivalent neural networks' ...

11/27/2018
Knots in random neural networks
The weights of a neural network are typically initialized at random, and...

06/17/2018
Exact information propagation through fully-connected feed forward neural networks
Neural network ensembles at initialisation give rise to the trainability...

08/28/2021
Limiting free energy of multi-layer generalized linear models
We compute the high-dimensional limit of the free energy associated with...

04/25/2022
Using the Projected Belief Network at High Dimensions
The projected belief network (PBN) is a layered generative network (LGN)...
