DeepSITH: Efficient Learning via Decomposition of What and When Across Time Scales

04/09/2021
by Brandon Jacques, et al.

Extracting temporal relationships over a range of scales is a hallmark of human perception and cognition, and thus a critical capability for machine learning applied to real-world problems. Recurrent neural networks (RNNs) are plagued by the exploding/vanishing gradient problem, while gated architectures such as LSTMs must adjust their parameters to learn the relevant time scales. This paper introduces DeepSITH, a network comprising biologically-inspired Scale-Invariant Temporal History (SITH) modules in series, with dense connections between layers. SITH modules respond to their inputs with a geometrically spaced set of time constants, enabling the DeepSITH network to learn problems along a continuum of time scales. We compare DeepSITH to LSTMs and other recent RNNs on several time-series prediction and decoding tasks, on which DeepSITH achieves state-of-the-art performance.
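To make the "geometrically spaced set of time constants" idea concrete, here is a minimal sketch of a bank of leaky integrators whose time constants span a geometric progression. This is an illustrative toy, not the authors' SITH implementation (SITH builds a scale-invariant compressed history via an approximate inverse Laplace transform); all names and parameter values below are assumptions chosen for the example.

```python
import numpy as np


def geometric_taus(tau_min=1.0, tau_max=100.0, n_taus=8):
    """Time constants geometrically spaced from tau_min to tau_max.

    Geometric (rather than linear) spacing gives roughly uniform
    coverage on a logarithmic axis of time scales.
    """
    exponents = np.arange(n_taus) / (n_taus - 1)
    return tau_min * (tau_max / tau_min) ** exponents


class LeakyIntegratorBank:
    """Toy stand-in for a SITH-like layer: one decaying trace per tau.

    Each trace relaxes toward the current input at a rate set by its
    time constant, so slow traces retain older history than fast ones.
    """

    def __init__(self, taus):
        self.taus = np.asarray(taus, dtype=float)
        self.state = np.zeros_like(self.taus)

    def step(self, x, dt=1.0):
        alpha = dt / self.taus  # per-trace update rate
        self.state += alpha * (x - self.state)
        return self.state.copy()


taus = geometric_taus()
bank = LeakyIntegratorBank(taus)
out = bank.step(1.0)      # drive with a single impulse
for _ in range(20):
    out = bank.step(0.0)  # then let the traces decay
# After the delay, fast traces have forgotten the impulse while
# slow traces still carry it: out[0] < out[-1].
```

In DeepSITH, layers of such multi-scale responses are stacked in series with dense connections, so later layers can decode "what happened when" across progressively longer horizons.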


Related research:

02/15/2019 - Learning to Adaptively Scale Recurrent Neural Networks
Recent advancements in recurrent neural network (RNN) research have demo...

07/09/2021 - SITHCon: A neural network robust to variations in input scaling on the time dimension
In machine learning, convolutional neural networks (CNNs) have been extr...

05/16/2020 - Achieving Online Regression Performance of LSTMs with Simple RNNs
Recurrent Neural Networks (RNNs) are widely used for online regression d...

06/07/2019 - Relaxed Weight Sharing: Effectively Modeling Time-Varying Relationships in Clinical Time-Series
Recurrent neural networks (RNNs) are commonly applied to clinical time-s...

12/15/2016 - Tunable Efficient Unitary Neural Networks (EUNN) and their application to RNNs
Using unitary (instead of general) matrices in artificial neural network...

10/27/2020 - Hybrid Backpropagation Parallel Reservoir Networks
In many real-world applications, fully-differentiable RNNs such as LSTMs...

08/17/2020 - HiPPO: Recurrent Memory with Optimal Polynomial Projections
A central problem in learning from sequential data is representing cumul...
