The HSIC Bottleneck: Deep Learning without Back-Propagation

08/05/2019
by Wan-Duo Kurt Ma, et al.

We introduce the HSIC (Hilbert-Schmidt independence criterion) bottleneck for training deep neural networks. The HSIC bottleneck is an alternative to conventional backpropagation that has a number of distinct advantages: it facilitates parallel processing, requires significantly fewer operations, and does not suffer from exploding or vanishing gradients. It is also more biologically plausible than backpropagation, as it requires no symmetric feedback. We find that the HSIC bottleneck achieves classification performance on MNIST/FashionMNIST/CIFAR10 comparable to backpropagation with a cross-entropy target, even when the system is not encouraged to make the output resemble the classification labels. Appending a single layer trained with SGD (without backpropagation) results in state-of-the-art performance.
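
As a minimal sketch of the dependence measure involved, the snippet below computes the standard biased empirical HSIC estimator with Gaussian kernels and combines two such terms into a layer-wise bottleneck-style objective (penalizing dependence of hidden activations on the input while rewarding dependence on the labels). The function names, kernel bandwidths (sigma_x, sigma_y), and the balance parameter beta are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def gaussian_kernel(Z, sigma):
    """Pairwise Gaussian (RBF) kernel matrix for a batch of row vectors Z (m x d)."""
    sq_norms = np.sum(Z ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * Z @ Z.T
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def hsic(X, Y, sigma_x=5.0, sigma_y=5.0):
    """Biased empirical HSIC estimator: (m-1)^-2 * tr(K H L H)."""
    m = X.shape[0]
    K = gaussian_kernel(X, sigma_x)
    L = gaussian_kernel(Y, sigma_y)
    H = np.eye(m) - np.ones((m, m)) / m        # centering matrix
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

def hsic_bottleneck_objective(Z, X, Y, beta=100.0):
    """Layer-wise objective: low dependence on the input X, high dependence on labels Y."""
    return hsic(Z, X) - beta * hsic(Z, Y)

# Illustrative usage with random data: inputs X, hidden activations Z, one-hot labels Y.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 784))
Z = rng.normal(size=(64, 128))
Y = np.eye(10)[rng.integers(0, 10, size=64)]
print(hsic_bottleneck_objective(Z, X, Y))
```

In this sketch, each hidden layer would be trained on its own objective of this form, so no error signal needs to be propagated backwards through the network.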

research · 06/12/2020
Kernelized information bottleneck leads to biologically plausible 3-factor Hebbian learning in deep networks
The state-of-the art machine learning approach to training deep neural n...

research · 06/06/2020
Scaling Equilibrium Propagation to Deep ConvNets by Drastically Reducing its Gradient Estimator Bias
Equilibrium Propagation (EP) is a biologically-inspired algorithm for co...

research · 07/13/2021
Tourbillon: a Physically Plausible Neural Architecture
In a physical neural system, backpropagation is faced with a number of o...

research · 06/13/2019
Associated Learning: Decomposing End-to-end Backpropagation based on Auto-encoders and Target Propagation
Backpropagation has been widely used in deep learning approaches, but it...

research · 06/15/2020
Layer-wise Learning of Kernel Dependence Networks
We propose a greedy strategy to train a deep network for multi-class cla...

research · 03/26/2023
Lazy learning: a biologically-inspired plasticity rule for fast and energy efficient synaptic plasticity
When training neural networks for classification tasks with backpropagat...

research · 07/12/2021
SoftHebb: Bayesian inference in unsupervised Hebbian soft winner-take-all networks
State-of-the-art artificial neural networks (ANNs) require labelled data...
