Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks

01/20/2017
by Rahul Dey, et al.

The paper evaluates three variants of the Gated Recurrent Unit (GRU) in recurrent neural networks (RNNs), obtained by reducing the number of parameters in the update and reset gates. We evaluate the three GRU variants on the MNIST and IMDB datasets and show that they perform as well as the original GRU model while reducing computational expense.
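
To make the parameter reduction concrete, below is a minimal NumPy sketch of one reduced-gate GRU cell, assuming the update and reset gates are computed from the previous hidden state and a bias alone, with the input-dependent gate weights dropped. The class name, initialization, and shapes are illustrative assumptions, not the paper's exact formulation; the candidate-state equation is the standard GRU one.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ReducedGateGRUCell:
    """Illustrative GRU cell whose update/reset gates omit the
    input-dependent weight matrices (an assumed reduced-parameter
    variant; names and initialization are ours, not the paper's)."""

    def __init__(self, input_size, hidden_size, rng=None):
        rng = rng or np.random.default_rng(0)
        s = lambda *shape: rng.normal(0.0, 0.1, shape)
        # Gates use only recurrent weights U and biases b (no W @ x_t term),
        # removing 2 * hidden_size * input_size parameters versus a full GRU.
        self.U_z, self.b_z = s(hidden_size, hidden_size), np.zeros(hidden_size)
        self.U_r, self.b_r = s(hidden_size, hidden_size), np.zeros(hidden_size)
        # The candidate state keeps the full GRU parameterization.
        self.W_h = s(hidden_size, input_size)
        self.U_h, self.b_h = s(hidden_size, hidden_size), np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        z = sigmoid(self.U_z @ h_prev + self.b_z)  # update gate (no input term)
        r = sigmoid(self.U_r @ h_prev + self.b_r)  # reset gate (no input term)
        h_tilde = np.tanh(self.W_h @ x_t + self.U_h @ (r * h_prev) + self.b_h)
        return (1.0 - z) * h_prev + z * h_tilde


# Usage: run the cell over a short random sequence.
cell = ReducedGateGRUCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x_t in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x_t, h)
print(h.shape)  # (16,)
```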
