Restricted Recurrent Neural Tensor Networks: Exploiting Word Frequency and Compositionality for Increased Model Capacity and Performance With No Computational Overhead

04/03/2017
by Alexandre Salle, et al.

Increasing the capacity of recurrent neural networks (RNNs) usually involves enlarging the hidden layer, which significantly increases computational cost. An alternative is the recurrent neural tensor network (RNTN), which increases capacity by employing distinct hidden layer weights for each vocabulary word. However, its memory usage scales linearly with vocabulary size, which can reach millions of words for word-level language models. In this paper, we introduce restricted recurrent neural tensor networks (r-RNTNs), which reserve distinct hidden layer weights for frequent vocabulary words while sharing a single set of weights for infrequent words. Perplexity evaluations show that r-RNTNs improve language model performance over standard RNNs while using only a small fraction of the parameters of unrestricted RNTNs.
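To make the idea concrete, the sketch below is a minimal, illustrative r-RNTN-style recurrent cell, not the authors' implementation: it assumes a frequency-sorted vocabulary (ids 0..K-1 are the K most frequent words), and the class name, parameter names, and the sigmoid nonlinearity are choices made here for illustration. Each frequent word selects its own hidden-to-hidden matrix; all infrequent words share a single extra matrix.

```python
import torch
import torch.nn as nn


class RestrictedRNTNCell(nn.Module):
    """Illustrative r-RNTN-style cell (hypothetical names, not the paper's code).

    The recurrent (hidden-to-hidden) matrix depends on the input word:
    the K most frequent words each get a dedicated matrix, and every
    other word falls back to one shared matrix at index K.
    """

    def __init__(self, vocab_size, embed_dim, hidden_dim, k_frequent):
        super().__init__()
        self.k = k_frequent
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # K + 1 recurrent matrices: one per frequent word plus one shared slot.
        self.W_hh = nn.Parameter(0.01 * torch.randn(k_frequent + 1, hidden_dim, hidden_dim))
        self.W_xh = nn.Linear(embed_dim, hidden_dim)

    def forward(self, word_ids, h_prev):
        # Assumes word ids are sorted by frequency: ids >= K map to the shared matrix.
        idx = torch.clamp(word_ids, max=self.k)
        W = self.W_hh[idx]                                   # (batch, hidden, hidden)
        rec = torch.bmm(W, h_prev.unsqueeze(-1)).squeeze(-1)  # word-dependent recurrence
        return torch.sigmoid(rec + self.W_xh(self.embed(word_ids)))


# Example usage with arbitrary sizes: only the 100 most frequent words
# get dedicated recurrent weights; the other 9,900 share one matrix.
cell = RestrictedRNTNCell(vocab_size=10_000, embed_dim=64, hidden_dim=128, k_frequent=100)
h = torch.zeros(4, 128)
words = torch.randint(0, 10_000, (4,))
h = cell(words, h)
```

Memory for the recurrent weights is O((K + 1) * hidden_dim^2) rather than O(vocab_size * hidden_dim^2), which is the saving the abstract refers to, while the per-step computation matches a plain RNN of the same hidden size.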
