Advantages of biologically-inspired adaptive neural activation in RNNs during learning

06/22/2020
by Victor Geadah, et al.

Dynamic adaptation in single-neuron response plays a fundamental role in neural coding in biological neural networks. Yet, most neural activation functions used in artificial networks are fixed and are typically treated as an inconsequential architectural choice. In this paper, we investigate adaptation of the nonlinear activation function over the long time scale of learning, and outline its impact on sequential processing in recurrent neural networks. We introduce a novel parametric family of nonlinear activation functions, inspired by the input-frequency response curves of biological neurons, which allows interpolation between well-known activation functions such as ReLU and sigmoid. Using simple numerical experiments and tools from dynamical systems and information theory, we study the role of neural activation features in learning dynamics. We find that activation adaptation provides distinct, task-specific solutions and, in some cases, improves both learning speed and performance. Importantly, we find that the optimal activation features emerging from our parametric family differ considerably from the typical functions used in the literature, suggesting that exploiting the gap between these usual configurations can help learning. Finally, we outline situations where neural activation adaptation alone may help mitigate changes in input statistics for a given task, suggesting a mechanism for transfer learning optimization.
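To make the idea concrete, below is a minimal sketch of a two-parameter activation that interpolates between a saturating (sigmoid-like) shape and a non-saturating (ReLU/softplus-like) shape, with its shape parameters learned jointly with the recurrent weights. The names AdaptiveActivation and AdaptiveRNNCell, and the specific parametrisation with a gain n and a mixing weight s, are illustrative assumptions and not necessarily the exact family derived in the paper from input-frequency response curves.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveActivation(nn.Module):
    """Hypothetical two-parameter activation interpolating between a
    saturating (sigmoid-like) and a non-saturating (softplus/ReLU-like)
    shape. Both parameters are nn.Parameters, so they adapt over the
    time scale of learning alongside the network weights."""

    def __init__(self, n_init=1.0, s_init=0.5):
        super().__init__()
        self.n = nn.Parameter(torch.tensor(float(n_init)))  # gain / sharpness
        self.s = nn.Parameter(torch.tensor(float(s_init)))  # branch mixture

    def forward(self, x):
        n = F.softplus(self.n) + 1e-3        # keep the gain positive
        s = torch.sigmoid(self.s)            # keep the mixture in (0, 1)
        saturating = torch.sigmoid(n * x)    # sigmoid-like branch
        unbounded = F.softplus(n * x) / n    # smooth ReLU-like branch
        return (1.0 - s) * saturating + s * unbounded


class AdaptiveRNNCell(nn.Module):
    """Vanilla RNN cell whose pointwise nonlinearity is the adaptive
    activation above, so its shape can drift during training."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.i2h = nn.Linear(input_size, hidden_size)
        self.h2h = nn.Linear(hidden_size, hidden_size, bias=False)
        self.act = AdaptiveActivation()

    def forward(self, x, h):
        return self.act(self.i2h(x) + self.h2h(h))


if __name__ == "__main__":
    cell = AdaptiveRNNCell(input_size=8, hidden_size=32)
    h = torch.zeros(4, 32)                   # batch of 4 hidden states
    for t in range(10):                      # unroll over a short sequence
        h = cell(torch.randn(4, 8), h)
    print(h.shape, cell.act.n.item(), cell.act.s.item())
```

In this sketch, freezing the recurrent weights while leaving only the activation parameters trainable would correspond to the scenario where activation adaptation alone is asked to absorb a change in input statistics.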


Related research

11/07/2021 · Biologically Inspired Oscillating Activation Functions Can Bridge the Performance Gap between Biological and Artificial Neurons
Nonlinear activation functions endow neural networks with the ability to...

01/17/2019 · Activation Functions for Generalized Learning Vector Quantization - A Performance Comparison
An appropriate choice of the activation function (like ReLU, sigmoid or ...

01/17/2020 · Approximating Activation Functions
ReLU is widely seen as the default choice for activation functions in ne...

03/19/2018 · Deep learning improved by biological activation functions
`Biologically inspired' activation functions, such as the logistic sigmo...

10/17/2018 · Continual Learning of Recurrent Neural Networks by Locally Aligning Distributed Representations
Temporal models based on recurrent neural networks have proven to be qui...

05/18/2016 · Learning activation functions from data using cubic spline interpolation
Neural networks require a careful design in order to perform properly on...
