The world as a neural network

by Vitaly Vanchurin

We discuss the possibility that the entire universe, at its most fundamental level, is a neural network. We identify two different types of dynamical degrees of freedom: "trainable" variables (e.g. the bias vector or weight matrix) and "hidden" variables (e.g. the state vector of neurons). We first consider the stochastic evolution of the trainable variables and argue that near equilibrium their dynamics is well approximated by the Madelung equations (with the free energy representing the phase), and further away from equilibrium by the Hamilton-Jacobi equations (with the free energy representing Hamilton's principal function). This shows that the trainable variables can indeed exhibit classical and quantum behaviors, with the state vector of neurons playing the role of the hidden variables. We then study the stochastic evolution of the hidden variables by considering D non-interacting subsystems with average state vectors x̅^1, ..., x̅^D and an overall average state vector x̅^0. In the limit where the weight matrix is a permutation matrix, the dynamics of x̅^μ can be described in terms of relativistic strings in an emergent (D+1)-dimensional Minkowski space-time. If the subsystems are minimally interacting, with the interactions described by a metric tensor, then the emergent space-time becomes curved. We argue that the entropy production in such a system is a local function of the metric tensor, which should be determined by the symmetries of the Onsager tensor. It turns out that a very simple and highly symmetric Onsager tensor leads to an entropy production described by the Einstein-Hilbert term. This shows that the learning dynamics of a neural network can indeed exhibit approximate behaviors described by both quantum mechanics and general relativity. We also discuss the possibility that the two descriptions are holographic duals of each other.
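For reference, the Madelung equations mentioned in the abstract are the hydrodynamic rewriting of the Schrödinger equation obtained from the ansatz ψ = √ρ e^{iS/ħ}; in the picture sketched above, the free energy of the trainable variables plays the role of the phase S. The symbols below are the textbook ones, not notation taken from the paper itself:

```latex
% Madelung (hydrodynamic) form of the Schrodinger equation,
% from the ansatz \psi = \sqrt{\rho}\, e^{iS/\hbar}:
\begin{align}
  \frac{\partial \rho}{\partial t}
    + \nabla \cdot \!\left( \rho\, \frac{\nabla S}{m} \right) &= 0,
    && \text{(continuity)} \\
  \frac{\partial S}{\partial t}
    + \frac{(\nabla S)^2}{2m} + V
    - \frac{\hbar^2}{2m} \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} &= 0.
    && \text{(quantum Hamilton--Jacobi)}
\end{align}
```

Dropping the last term of the second equation (the quantum potential) leaves the classical Hamilton-Jacobi equation, which is consistent with the abstract's claim that further from equilibrium the free energy plays the role of Hamilton's principal function.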



