Recurrent Independent Mechanisms

09/24/2019
by Anirudh Goyal, et al.

Learning modular structures which reflect the dynamics of the environment can lead to better generalization and robustness to changes which only affect a few of the underlying causes. We propose Recurrent Independent Mechanisms (RIMs), a new recurrent architecture in which multiple groups of recurrent cells operate with nearly independent transition dynamics, communicate only sparingly through the bottleneck of attention, and are only updated at time steps where they are most relevant. We show that this leads to specialization amongst the RIMs, which in turn allows for dramatically improved generalization on tasks where some factors of variation differ systematically between training and evaluation.
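To make the mechanism concrete, below is a minimal sketch of a single RIM update step in PyTorch. The class name RIMCell, the hyperparameters num_rims, top_k, and key_size, and the choice of GRU cells are illustrative assumptions for this sketch, not the authors' released implementation.

# Minimal RIM update-step sketch (assumes PyTorch; names are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F


class RIMCell(nn.Module):
    """Simplified Recurrent Independent Mechanisms cell (one time step)."""

    def __init__(self, input_size, hidden_size, num_rims=6, top_k=4, key_size=32):
        super().__init__()
        self.num_rims, self.top_k = num_rims, top_k
        # Each RIM has its own recurrent dynamics (independent parameters).
        self.rnns = nn.ModuleList(
            [nn.GRUCell(input_size, hidden_size) for _ in range(num_rims)]
        )
        # Input attention: RIM hidden states query the (input, null) pair.
        self.q_in = nn.Linear(hidden_size, key_size)
        self.k_in = nn.Linear(input_size, key_size)
        # Communication attention between RIMs.
        self.q_c = nn.Linear(hidden_size, key_size)
        self.k_c = nn.Linear(hidden_size, key_size)
        self.v_c = nn.Linear(hidden_size, hidden_size)

    def forward(self, x, h):
        # x: (batch, input_size), h: (batch, num_rims, hidden_size)
        batch = x.size(0)
        null = torch.zeros_like(x)                    # "null" input option
        cand = torch.stack([x, null], dim=1)          # (batch, 2, input_size)
        q = self.q_in(h)                              # (batch, num_rims, key)
        k = self.k_in(cand)                           # (batch, 2, key)
        att = F.softmax(q @ k.transpose(1, 2) / k.size(-1) ** 0.5, dim=-1)
        # RIMs attending most to the real input (index 0) are activated.
        _, active = att[..., 0].topk(self.top_k, dim=-1)
        mask = torch.zeros(batch, self.num_rims, device=x.device)
        mask.scatter_(1, active, 1.0)                 # 1 for active RIMs

        # Independent per-RIM recurrent updates on the attended input.
        attended_x = att[..., :1] * x.unsqueeze(1)    # (batch, num_rims, input)
        h_new = torch.stack(
            [self.rnns[i](attended_x[:, i], h[:, i]) for i in range(self.num_rims)],
            dim=1,
        )
        # Sparse communication: active RIMs read from the others via attention.
        qc, kc, vc = self.q_c(h_new), self.k_c(h_new), self.v_c(h_new)
        comm = F.softmax(qc @ kc.transpose(1, 2) / kc.size(-1) ** 0.5, dim=-1) @ vc
        h_new = h_new + mask.unsqueeze(-1) * comm
        # Inactive RIMs carry their previous hidden state forward unchanged.
        return mask.unsqueeze(-1) * h_new + (1 - mask.unsqueeze(-1)) * h

In this sketch, only the top_k RIMs that attend most strongly to the real input are updated and allowed to read from the others; the rest keep their hidden state untouched, which is the property that encourages individual mechanisms to specialize.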

Related research

PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning (03/17/2021)
The predictive learning of spatiotemporal sequences aims to generate fut...

Fast and Slow Learning of Recurrent Independent Mechanisms (05/18/2021)
Decomposing knowledge into interchangeable pieces promises a generalizat...

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks (03/13/2017)
In this work, we present a compact, modular framework for constructing n...

Learning spatiotemporal signals using a recurrent spiking network that discretizes time (07/20/2019)
Learning to produce spatiotemporal sequences is a common task the brain ...

Disentangling ODE parameters from dynamics in VAEs (08/26/2021)
Deep networks have become increasingly of interest in dynamical system p...

Reusable Slotwise Mechanisms (02/21/2023)
Agents that can understand and reason over the dynamics of objects can h...
