Learning in the Machine: the Symmetries of the Deep Learning Channel

12/22/2017
by Pierre Baldi, et al.

In a physical neural system, learning rules must be local in both space and time. For learning to occur, non-local information must be communicated to the deep synapses through a communication channel, the deep learning channel. We identify several possible architectures for this learning channel (Bidirectional, Conjoined, Twin, Distinct) and six symmetry challenges: 1) symmetry of architectures; 2) symmetry of weights; 3) symmetry of neurons; 4) symmetry of derivatives; 5) symmetry of processing; and 6) symmetry of learning rules. Random backpropagation (RBP) addresses the second and third symmetries, and some of its variations, such as skipped RBP (SRBP), address the first and fourth symmetries. Here we address the last two desirable symmetries, showing through simulations that they can be achieved and that the learning channel is particularly robust to symmetry variations. Specifically, random backpropagation and its variations can be performed with the same non-linear neurons used in the main input-output forward channel, and the connections in the learning channel can be adapted using the same algorithm used in the forward channel, removing the need for any specialized hardware in the learning channel. Finally, we provide mathematical results in simple cases showing that the learning equations in the forward and backward channels converge to fixed points for almost any initial conditions. In symmetric architectures, if the weights in both channels are small at initialization, adaptation in both channels leads to weights that are essentially symmetric during and after learning. Biological connections are discussed.
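
To make the second and third symmetries concrete, here is a minimal NumPy sketch of random backpropagation on a one-hidden-layer network. The network sizes, tanh nonlinearity, learning rate, and toy data are illustrative assumptions, not taken from the paper; the essential point is that the learning channel carries the error back through a fixed random matrix B instead of the transposed forward weights W2.T that standard backpropagation would require.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 16, 4                 # illustrative sizes
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))      # forward channel, layer 1
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))     # forward channel, layer 2
B  = rng.normal(0.0, 0.1, (n_hid, n_out))     # learning channel: fixed random
                                              # matrix (standard BP uses W2.T)
lr = 0.05

# Toy regression task, purely illustrative.
X = rng.normal(size=(100, n_in))
T = np.tanh(X[:, :n_out])                     # arbitrary smooth targets

for epoch in range(200):
    for x, t in zip(X, T):
        h = np.tanh(W1 @ x)                   # forward pass
        y = np.tanh(W2 @ h)
        e = y - t                             # output error
        d2 = e * (1.0 - y**2)                 # local derivative at the output
        d1 = (B @ d2) * (1.0 - h**2)          # learning channel: random matrix
                                              # replaces W2.T
        W2 -= lr * np.outer(d2, h)            # each update uses only locally
        W1 -= lr * np.outer(d1, x)            # available signals
```

The symmetries addressed in this paper can be read as variations on this skeleton: passing the backward signal B @ d2 through the same tanh units used in the forward channel (symmetry of processing), and adapting B with an outer-product rule of the same form as the forward updates (symmetry of learning rules). The exact update for B is not specified in the abstract; its convergence results indicate that, with small initial weights in both channels, such adaptation drives the two channels toward essentially symmetric weights.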
