Spell Once, Summon Anywhere: A Two-Level Open-Vocabulary Language Model

04/23/2018
by Sebastian J. Mielke, et al.

We show how to deploy recurrent neural networks within a hierarchical Bayesian language model. Our generative story combines a standard RNN language model (generating the word tokens in each sentence) with an RNN-based spelling model (generating the letters in each word type). These two RNNs respectively capture sentence structure and word structure, and are kept separate as in linguistics. The model can generate spellings for novel words in context and thus serves as an open-vocabulary language model. For known words, embeddings are naturally inferred by combining evidence from type spelling and token context. We compare to a number of baselines and previous work, establishing state-of-the-art results.
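To make the two-level generative story concrete, here is a minimal PyTorch sketch of the idea; it is not the authors' released code, and the class names, layer sizes, and choice of GRU cells are illustrative assumptions. The key structural point is that the word embedding table is shared between the levels: the spelling RNN spells each word type once, conditioned on that type's embedding, while the token RNN reuses the same embeddings to model sentences.

import torch
import torch.nn as nn

class SpellingRNN(nn.Module):
    """Character-level RNN: generates the spelling of a word TYPE,
    conditioned on that type's embedding."""
    def __init__(self, n_chars, char_emb=32, hidden=128, word_emb=64):
        super().__init__()
        self.char_embed = nn.Embedding(n_chars, char_emb)
        self.rnn = nn.GRU(char_emb + word_emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_chars)

    def forward(self, char_ids, word_vec):
        # char_ids: (batch, length); word_vec: (batch, word_emb)
        x = self.char_embed(char_ids)
        cond = word_vec.unsqueeze(1).expand(-1, x.size(1), -1)
        h, _ = self.rnn(torch.cat([x, cond], dim=-1))
        return self.out(h)  # logits over the next character

class TokenRNNLM(nn.Module):
    """Word-level RNN: generates the sequence of word TOKENS in a sentence."""
    def __init__(self, n_words, word_emb=64, hidden=256):
        super().__init__()
        self.word_embed = nn.Embedding(n_words, word_emb)
        self.rnn = nn.GRU(word_emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_words)

    def forward(self, word_ids):
        h, _ = self.rnn(self.word_embed(word_ids))
        return self.out(h)  # logits over the next word

# Joint training sketch (toy data): the token loss is paid once per TOKEN,
# the spelling loss once per TYPE, and both losses flow into the shared
# embedding table lm.word_embed.
vocab, chars = 10000, 100
lm, speller = TokenRNNLM(vocab), SpellingRNN(chars, word_emb=64)
ce = nn.CrossEntropyLoss()

sent = torch.randint(0, vocab, (2, 12))       # a toy batch of token ids
tok_logits = lm(sent[:, :-1])
token_loss = ce(tok_logits.reshape(-1, vocab), sent[:, 1:].reshape(-1))

types = torch.tensor([5, 42])                 # two word types, spelled once each
spellings = torch.randint(0, chars, (2, 8))   # their character sequences
spell_logits = speller(spellings[:, :-1], lm.word_embed(types))
spell_loss = ce(spell_logits.reshape(-1, chars), spellings[:, 1:].reshape(-1))

loss = token_loss + spell_loss

Because the embedding table is shared, a known word's embedding is inferred from both kinds of evidence, its spelling (type level) and its contexts (token level); a novel word outside the table can still be assigned probability by letting the spelling model generate its characters, which is what makes the model open-vocabulary.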

