Tensor decomposition of higher-order correlations by nonlinear Hebbian plasticity

06/29/2021
by   Gabriel Koch Ocker, et al.

Biological synaptic plasticity exhibits nonlinearities that are not accounted for by classic Hebbian learning rules. Here, we introduce a simple family of generalized, nonlinear Hebbian learning rules. We study the computations implemented by their dynamics in the simple setting of a neuron receiving feedforward inputs. We show that these nonlinear Hebbian rules allow a neuron to learn tensor decompositions of its higher-order input correlations. The particular input correlation decomposed, and the form of the decomposition, depend on the location of nonlinearities in the plasticity rule. For simple, biologically motivated parameters, the neuron learns tensor eigenvectors of higher-order input correlations. We prove that each tensor eigenvector is an attractor and determine their basins of attraction. We calculate the volume of those basins, showing that the dominant eigenvector has the largest basin of attraction. We then study arbitrary learning rules, and find that any learning rule that admits a finite Taylor expansion into the neural input and output also has stable equilibria at tensor eigenvectors of its higher-order input correlations. Nonlinearities in synaptic plasticity thus allow a neuron to encode higher-order input correlations in a simple fashion.
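As an illustrative sketch (not the paper's exact formulation), one can see how such a rule works in simulation: a normalized Hebbian update whose Hebbian term raises the neuron's output to a power `a` performs, on average, tensor power iteration on the order-(a+1) input correlation tensor. For `a = 1` this reduces to stochastic power iteration on the input covariance, so the weights converge to its leading eigenvector; the exponent, learning rate, and normalization scheme below are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def nonlinear_hebb(X, a=1, lr=0.01, n_epochs=5):
    """Normalized Hebbian rule with the output raised to a power a.

    Averaged over inputs, the update direction is E[(w @ x)**a * x],
    i.e. the order-(a+1) input moment tensor contracted a times with w,
    so the dynamics perform tensor power iteration on that tensor
    (illustrative sketch; parameters are assumptions, not the paper's).
    """
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_epochs):
        for x in X:
            y = w @ x                 # linear neuron output
            w += lr * (y ** a) * x    # Hebbian update with nonlinear output
            w /= np.linalg.norm(w)    # multiplicative weight normalization
    return w

# Inputs whose covariance has a dominant direction along the first axis.
X = rng.normal(size=(2000, 3)) * np.array([3.0, 1.0, 0.5])
w = nonlinear_hebb(X, a=1)
```

For `a = 1` the learned weight vector aligns (up to sign) with the dominant input direction; larger odd exponents instead track eigenvectors of the corresponding higher-order moment tensors, consistent with the dependence on the location of the nonlinearity described above.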

Related research

- Neuronal architecture extracts statistical temporal patterns (01/24/2023)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning (02/21/2020)
- On higher order computations and synaptic meta-plasticity in the human brain: IT point of view (June, 2016) (03/07/2016)
- HOTCAKE: Higher Order Tucker Articulated Kernels for Deeper CNN Compression (02/28/2020)
- The ELM Neuron: an Efficient and Expressive Cortical Neuron Model Can Solve Long-Horizon Tasks (06/14/2023)
- A Higher Order Unscented Transform (06/24/2020)
- Residual Tensor Train: a Flexible and Efficient Approach for Learning Multiple Multilinear Correlations (08/19/2021)
