Despite their unprecedented success, even the largest language models ma...
Neural sequence models, especially transformers, exhibit a remarkable ca...
Humans can reason compositionally when presented with new tasks. Previou...
Neural language models (LMs) have been shown to memorize a great deal of...
Language model (LM) pre-training has proven useful for a wide variety of...
Standard deep network models lack the inductive biases needed to general...
Few-shot class incremental learning – the problem of updating a trained ...
Sequence-to-sequence transduction is the core problem in language proces...
Flexible neural models outperform grammar- and automaton-based counterpa...
We introduce MorphNet, a single model that combines morphological analys...