Causal inference in time series in terms of Rényi transfer entropy

03/22/2022
by Petr Jizba, et al.

Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of the information-theoretic concept known as the Rényi information measure. In particular, we tackle the directional information flow between bivariate time series in terms of Rényi transfer entropy. We show that by choosing the Rényi α parameter appropriately we can control the information that is transferred only between selected parts of the underlying distributions. This, in turn, provides a particularly potent tool for quantifying causal interdependencies in time series, where knowledge of "black swan" events such as spikes or sudden jumps is of key importance. In this connection, we first prove that for Gaussian variables, Granger causality and Rényi transfer entropy are entirely equivalent. Moreover, we partially extend this result to heavy-tailed α-Gaussian variables. These results allow us to establish a connection between autoregressive and Rényi-entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employ the Leonenko et al. entropy estimator and analyze the Rényi information flow between bivariate time series generated from two unidirectionally coupled Rössler systems. Notably, we find that Rényi transfer entropy not only allows us to detect the threshold of synchronization but also provides non-trivial insight into the structure of the transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from Rényi transfer entropy we could reliably infer the direction of coupling, and hence causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two Rössler systems were coupled but had not yet entered synchronization.
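As an illustration of the workflow summarized above, the following minimal Python sketch pairs a k-nearest-neighbour Rényi entropy estimator in the spirit of Leonenko et al. with two unidirectionally coupled Rössler oscillators. It is not the authors' implementation: the oscillator parameters, the coupling strength eps, the choices of α and k, and the simple entropy-combination form of the transfer entropy are assumptions made here for illustration; the paper itself works with the escort-distribution definition of Rényi transfer entropy.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.spatial import cKDTree
from scipy.special import gamma


def renyi_entropy_knn(data, alpha=0.8, k=5):
    """k-NN Renyi entropy estimate (Leonenko et al. style), valid for alpha != 1."""
    data = np.atleast_2d(np.asarray(data, dtype=float))
    if data.shape[0] < data.shape[1]:
        data = data.T                               # samples in rows
    n, d = data.shape
    # distance from each sample to its k-th nearest neighbour (self excluded)
    rho = cKDTree(data).query(data, k=k + 1)[0][:, k]
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)       # volume of the d-dim unit ball
    c_k = (gamma(k) / gamma(k + 1 - alpha)) ** (1.0 / (1.0 - alpha))
    zeta = (n - 1) * c_k * v_d * rho ** d
    return np.log(np.mean(zeta ** (1.0 - alpha))) / (1.0 - alpha)


def renyi_transfer_entropy(x, y, alpha=0.8, k=5):
    """Illustrative T_{Y->X} with embedding dimension 1, built from joint Renyi
    entropies (a simplification of the escort-distribution definition)."""
    xf, xp, yp = x[1:], x[:-1], y[:-1]              # future of X, past of X, past of Y
    h = lambda *cols: renyi_entropy_knn(np.column_stack(cols), alpha, k)
    return h(xf, xp) - h(xp) - h(xf, xp, yp) + h(xp, yp)


def coupled_rossler(t, s, eps, w1=1.015, w2=0.985, a=0.15, b=0.2, c=10.0):
    """Two Roessler oscillators; system 1 drives system 2 via the x-coordinate."""
    x1, y1, z1, x2, y2, z2 = s
    return [-w1 * y1 - z1, w1 * x1 + a * y1, b + z1 * (x1 - c),
            -w2 * y2 - z2 + eps * (x1 - x2), w2 * x2 + a * y2, b + z2 * (x2 - c)]


# Example: weak unidirectional coupling, system 1 -> system 2 (eps = 0.05 assumed)
sol = solve_ivp(coupled_rossler, (0, 2000), np.random.rand(6), args=(0.05,),
                t_eval=np.arange(500.0, 2000.0, 0.3), rtol=1e-8)
x1, x2 = sol.y[0], sol.y[3]
print("T(X1 -> X2):", renyi_transfer_entropy(x2, x1))   # driver to response
print("T(X2 -> X1):", renyi_transfer_entropy(x1, x2))   # response to driver

In this toy setting, a clearly larger value of T(X1 -> X2) than of T(X2 -> X1) would point to the correct direction of coupling, which, as discussed above, can be expected only for coupling strengths below the onset of the transient regime.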
