Adaptive Computation Time for Recurrent Neural Networks

03/29/2016
by Alex Graves, et al.

This paper introduces Adaptive Computation Time (ACT), an algorithm that allows recurrent neural networks to learn how many computational steps to take between receiving an input and emitting an output. ACT requires minimal changes to the network architecture, is deterministic and differentiable, and does not add any noise to the parameter gradients. Experimental results are provided for four synthetic problems: determining the parity of binary vectors, applying binary logic operations, adding integers, and sorting real numbers. Overall, performance is dramatically improved by the use of ACT, which successfully adapts the number of computational steps to the requirements of the problem. We also present character-level language modelling results on the Hutter Prize Wikipedia dataset. In this case ACT does not yield large gains in performance; however, it does provide intriguing insight into the structure of the data, with more computation allocated to harder-to-predict transitions, such as spaces between words and ends of sentences. This suggests that ACT or other adaptive computation methods could provide a generic method for inferring segment boundaries in sequence data.
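To make the mechanism concrete, below is a minimal PyTorch sketch of an ACT-style halting loop wrapped around a GRU cell. The class name ACTCell, the hyperparameters eps and max_steps, and the simplified ponder penalty are illustrative assumptions, not the paper's exact formulation. The idea it illustrates: the network ponders each input for a variable number of steps, a sigmoidal halting unit accumulates probability mass until it passes 1 - eps, and the final state is the probability-weighted mean of the intermediate states, keeping the whole computation deterministic and differentiable.

```python
import torch
import torch.nn as nn

class ACTCell(nn.Module):
    """Sketch of an Adaptive-Computation-Time loop around a GRU cell.

    eps, max_steps, and the simplified ponder penalty are illustrative
    assumptions; the paper defines the ponder cost as N(t) + R(t).
    """

    def __init__(self, input_size, hidden_size, eps=0.01, max_steps=10):
        super().__init__()
        # +1 input feature: a flag marking the first ponder step, so the
        # cell can distinguish repeated computation on the same input.
        self.cell = nn.GRUCell(input_size + 1, hidden_size)
        self.halt = nn.Linear(hidden_size, 1)  # halting unit
        self.eps = eps
        self.max_steps = max_steps

    def forward(self, x, state):
        # x: (batch, input_size), state: (batch, hidden_size)
        batch = x.size(0)
        budget = x.new_ones(batch, 1)          # remaining halting mass, starts at 1
        mean_state = torch.zeros_like(state)   # probability-weighted state
        ponder = x.new_zeros(batch, 1)         # simplified ponder penalty
        for n in range(self.max_steps):
            flag = x.new_full((batch, 1), 1.0 if n == 0 else 0.0)
            state = self.cell(torch.cat([x, flag], dim=1), state)
            h = torch.sigmoid(self.halt(state))  # halting probability for this step
            # Halt when cumulative probability would pass 1 - eps (or on the
            # last allowed step); spend the remaining budget as the weight.
            stop = budget - h < self.eps
            if n == self.max_steps - 1:
                stop = torch.ones_like(stop, dtype=torch.bool)
            p = torch.where(stop, budget, h)
            mean_state = mean_state + p * state
            ponder = ponder + p * (n + 1)
            budget = budget - p
            if (budget < self.eps).all():
                break
        # The weighted state doubles as the output for this sketch.
        return mean_state, ponder.mean()
```

A driver loop would call the cell once per sequence element, carrying `state` forward and adding the returned ponder term, scaled by a small time penalty, to the task loss; that penalty is what pressures the network to use fewer steps on easy inputs.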

Related research

Comparing Fixed and Adaptive Computation Time for Recurrent Neural Networks (03/21/2018)
Adaptive Computation Time for Recurrent Neural Networks (ACT) is one of ...

Layer Flexible Adaptive Computational Time for Recurrent Neural Networks (12/06/2018)
Deep recurrent neural networks show significant benefits in prediction t...

Differentiable Adaptive Computation Time for Visual Reasoning (04/27/2020)
This paper presents a novel attention-based algorithm for achieving adap...

Learning to Reason With Adaptive Computation (10/24/2016)
Multi-hop inference is necessary for machine learning systems to success...

Variable Computation in Recurrent Neural Networks (11/18/2016)
Recurrent neural networks (RNNs) have been used extensively and with inc...

Neural Network Architecture Optimization through Submodularity and Supermodularity (09/01/2016)
Deep learning models' architectures, including depth and width, are key ...

Multi-level Gated Recurrent Neural Network for Dialog Act Classification (10/04/2019)
In this paper we focus on the problem of dialog act (DA) labelling. This...
