Textless self-supervised speech models have grown in capabilities in recent years...
While natural languages differ widely in both canonical word order and w...
Listeners recognize and integrate words in rapid and noisy everyday speech...
Prompting is now a dominant method for evaluating the linguistic knowledge...
Scalar inferences (SI) are a signature example of how humans interpret language...
Targeted syntactic evaluations of language models ask whether models show...
Over the past two decades, numerous studies have demonstrated how less p...
Emergent communication research often focuses on optimizing task-specific...
Although approximately 50% ... female physicians tend to be underrepresented ...
Recent causal probing literature reveals when language models and syntactic...
Numerous analyses of reading time (RT) data have been implemented – all ...
The uniform information density (UID) hypothesis posits a preference among...
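For reference, UID is standardly stated in terms of surprisal, the information a word carries in context. A minimal sketch of one common operationalization (notation illustrative, not drawn from this abstract):

  s(w_t) = -\log p(w_t \mid w_1, \ldots, w_{t-1})

with an utterance preferred to the extent that surprisal is spread evenly across it, e.g. to the extent that \sum_t \big(s(w_t) - \bar{s}\big)^2 is small.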
Prior work has shown that structural supervision helps English language models ...
Models of context-sensitive communication often use the Rational Speech Act framework ...
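For reference, a standard statement of the Rational Speech Act recursion (following Frank & Goodman 2012; symbols illustrative): a literal listener, a pragmatic speaker, and a pragmatic listener,

  L_0(m \mid u) \propto [\![u]\!](m) \, P(m)
  S_1(u \mid m) \propto \exp\big(\alpha\,(\log L_0(m \mid u) - C(u))\big)
  L_1(m \mid u) \propto S_1(u \mid m) \, P(m)

where [\![u]\!] is the literal semantics of utterance u, C(u) an utterance cost, and \alpha a rationality parameter.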
Transformer-based language models pre-trained on large amounts of text d...
Neural language models exhibit impressive performance on a variety of tasks...
Humans have the ability to rapidly understand rich combinatorial concepts...
Previous studies investigating the syntactic abilities of deep learning models ...
Humans can learn structural properties about a word from minimal experience...
In this work, we analyze how human gaze during reading comprehension is ...
Human reading behavior is tuned to the statistics of natural language: t...
We present STARC (Structured Annotations for Reading Comprehension), a new annotation framework ...
What information from an act of sentence understanding is robustly represented...
Neural language models have achieved state-of-the-art performances on many...
Deep learning sequence models have led to a marked increase in performance...
Recurrent Neural Networks (RNNs) trained on a language modeling task have...
Speakers often face choices as to how to structure their intended message...
Word embeddings trained on large-scale historical corpora can illuminate...
We deploy the methods of controlled psycholinguistic experimentation to ...
State-of-the-art LSTM language models trained on large corpora learn sequential...
Simple reference games are of central theoretical and empirical importance...
Recurrent neural networks (RNNs) are the state of the art in sequence modeling...
RNN language models have achieved state-of-the-art perplexity results and...
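Perplexity here is the standard exponentiated average negative log-probability per token. A minimal sketch in Python (function name mine, illustrative only):

  import math

  def perplexity(log_probs):
      # Perplexity = exp of the negative mean per-token (natural) log-probability.
      return math.exp(-sum(log_probs) / len(log_probs))

  # Example: three tokens assigned probabilities 0.5, 0.25, and 0.1 by the model.
  print(perplexity([math.log(p) for p in (0.5, 0.25, 0.1)]))  # ~4.31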
Children learning their first language face multiple problems of induction...
We present a novel approach for determining learners' second language proficiency...
A frequent object of study in linguistic typology is the order of elements...