Forward Signal Propagation Learning

04/04/2022
by   Adam Kohan, et al.

We propose a new learning algorithm that propagates a learning signal and updates neural network parameters via a forward pass, as an alternative to backpropagation. In forward signal propagation learning (sigprop), there is only the forward path for learning and inference, so there are none of the additional structural or computational constraints on learning that exist under backpropagation, such as feedback connectivity, weight transport, or a backward pass. Sigprop enables global supervised learning with only a forward path, which makes it well suited to parallel training of layers or modules. In biology, this explains how neurons without feedback connections can still receive a global learning signal. In hardware, this provides an approach to global supervised learning without backward connectivity. By design, sigprop is more compatible with models of learning in the brain and in hardware than backpropagation and alternative approaches to relaxing learning constraints, and we demonstrate that it is also more efficient in time and memory than these alternatives. To further explain the behavior of sigprop, we provide evidence that it produces useful learning signals in relation to those of backpropagation. To further support its relevance to biological and hardware learning, we use sigprop to train continuous-time neural networks with Hebbian updates and to train spiking neural networks without surrogate functions.
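The forward-only idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's exact objective: it assumes the learning signal is a per-class embedding fed through the same forward path as the data, and uses a toy attractive-only local loss at each layer (the paper's actual training signal also separates classes). Each layer computes its own gradient from locally available quantities, so no backward pass connects the layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# Toy data: 2-class inputs. The learning signal is a class embedding E[y]
# with the same shape as the input, so it can ride the same forward path.
# (E and this loss are assumptions for illustration, not the paper's design.)
d_in, d_h1, d_h2, n_cls = 8, 16, 16, 2
X = rng.normal(size=(64, d_in))
y = (X[:, 0] > 0).astype(int)
E = rng.normal(size=(n_cls, d_in))          # one embedding per class

W1 = rng.normal(scale=0.1, size=(d_h1, d_in))
W2 = rng.normal(scale=0.1, size=(d_h2, d_h1))

def layer_step(W, a_x, a_t, lr=0.01):
    """One forward-only local update: pull the input representation toward
    the representation of its class signal, using only quantities available
    at this layer (no gradient flows back from later layers)."""
    z_x, z_t = a_x @ W.T, a_t @ W.T
    h_x, h_t = relu(z_x), relu(z_t)
    diff = h_x - h_t                        # local error signal
    # Gradient of 0.5*||h_x - h_t||^2 w.r.t. W, through the ReLU masks.
    g = (diff * (z_x > 0)).T @ a_x - (diff * (z_t > 0)).T @ a_t
    W -= lr * g / len(a_x)
    return h_x, h_t, 0.5 * np.mean(np.sum(diff**2, axis=1))

losses = []
for _ in range(50):
    a_x, a_t = X, E[y]                      # signal travels forward with the data
    a_x, a_t, _ = layer_step(W1, a_x, a_t)
    _, _, loss = layer_step(W2, a_x, a_t)
    losses.append(loss)

print(losses[0], "->", losses[-1])          # local loss shrinks, forward passes only
```

Because each `layer_step` depends only on its own inputs and weights, the layers could in principle be updated in parallel or pipelined, which is the property the abstract highlights for parallel training of modules.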

Related research

06/07/2023 · Correlative Information Maximization: A Biologically Plausible Approach to Supervised Deep Neural Networks without Weight Symmetry
The backpropagation algorithm has experienced remarkable success in trai...

06/15/2017 · Hardware-efficient on-line learning through pipelined truncated-error backpropagation in binary-state networks
Artificial neural networks (ANNs) trained using backpropagation are powe...

05/24/2023 · Block-local learning with probabilistic latent representations
The ubiquitous backpropagation algorithm requires sequential updates acr...

01/04/2018 · An Implementation of Back-Propagation Learning on GF11, a Large SIMD Parallel Computer
Current connectionist simulations require huge computational resources. ...

02/10/2023 · Forward Learning with Top-Down Feedback: Empirical and Analytical Characterization
"Forward-only" algorithms, which train neural networks while avoiding a ...

01/04/2023 · The Predictive Forward-Forward Algorithm
We propose the predictive forward-forward (PFF) algorithm for conducting...

12/22/2017 · Benchmarking Decoupled Neural Interfaces with Synthetic Gradients
Artificial neural networks are a particular class of learning system mode...
