A Statistical Investigation of Long Memory in Language and Music

Representation and learning of long-range dependencies is a central challenge confronted in modern applications of machine learning to sequence data. Yet despite the prominence of this issue, the basic problem of measuring long-range dependence, either in a given data source or as represented in a trained deep model, remains largely limited to heuristic tools. We contribute a statistical framework for investigating long-range dependence in current applications of sequence modeling, drawing on the statistical theory of long memory stochastic processes. By analogy with their linear predecessors in the time series literature, we identify recurrent neural networks (RNNs) as nonlinear processes that simultaneously attempt to learn both a feature representation for and the long-range dependency structure of an input sequence. We derive testable implications concerning the relationship between long memory in real-world data and its learned representation in a deep network architecture, which are explored through a semiparametric framework adapted to the high-dimensional setting. We establish the validity of statistical inference for a simple estimator, which yields a decision rule for long memory in RNNs. Experiments illustrating this statistical framework confirm the presence of long memory in a diverse collection of natural language and music data, but show that a variety of RNN architectures fail to capture this property even after training to benchmark accuracy in a language model.
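The abstract's "simple estimator" within a semiparametric framework is not specified here, but a classic estimator of this kind is the Geweke–Porter-Hudak (GPH) log-periodogram regression for the long-memory parameter d. The sketch below is an illustration of that standard technique, not the paper's own procedure; the function name `gph_estimate` and the sqrt(n) bandwidth choice are assumptions for the example.

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the long-memory parameter d.

    Illustrative sketch of a standard semiparametric estimator; the
    paper's actual estimator and bandwidth rule may differ.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)  # common rule-of-thumb bandwidth: sqrt(n) frequencies
    # Fourier frequencies lambda_j = 2*pi*j/n for j = 1..m
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n
    # Periodogram of the demeaned series at the first m Fourier frequencies
    fft = np.fft.fft(x - x.mean())
    I = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    # Under long memory, f(lambda) ~ G * lambda^(-2d) near zero, so
    # log I(lambda_j) ≈ c - 2d * log(2 sin(lambda_j / 2)) + error.
    # Regress log I on X_j = -2*log(2 sin(lambda_j/2)); the slope estimates d.
    X = -2.0 * np.log(2.0 * np.sin(lam / 2.0))
    Xc = X - X.mean()
    y = np.log(I)
    d_hat = Xc @ (y - y.mean()) / (Xc @ Xc)
    return d_hat
```

For short-memory input such as white noise the estimate should be near d = 0, while a series with long-range dependence yields d > 0; the paper's decision rule for long memory in RNNs is built on inference for an estimator of this general type.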

