Link inference of noisy delay-coupled networks: Machine learning and opto-electronic experimental tests

by Amitava Banerjee et al.

We devise a machine learning technique to solve the general problem of inferring network links that have time delays, purely from time-series data of the network nodal states. This task has applications in fields ranging from applied physics and engineering to neuroscience and biology. To achieve this, we first train a type of machine learning system known as reservoir computing to mimic the dynamics of the unknown network. We then formulate and test a technique that uses the trained parameters of the reservoir system's output layer to deduce an estimate of the unknown network structure. Our technique is, by its nature, non-invasive, but it is motivated by the widely used invasive network inference method in which the responses to active perturbations applied to the network are observed and employed to infer network links (e.g., knocking down genes to infer gene regulatory networks). We test this technique on experimental and simulated data from delay-coupled opto-electronic oscillator networks. We show that the technique often yields very good results, particularly if the system does not exhibit synchrony. We also find that the presence of dynamical noise can strikingly enhance the accuracy and ability of our technique, especially in networks that exhibit synchrony.
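The pipeline the abstract describes — train a reservoir computer to mimic the network dynamics, then read link estimates out of the trained output layer — can be illustrated with a minimal sketch. Everything below is an assumption for illustration: a toy 3-node coupled logistic-map network (not the paper's opto-electronic oscillators, and without time delays), a standard echo state network trained by ridge regression, and a simple sensitivity score that measures how strongly the predicted state of node i depends on the current state of node j through the trained output weights. It is a sketch of the general idea, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical toy network: 3 nodes, directed links 0->1 and 1->2 ---
N, T = 3, 3000
A = np.zeros((N, N))
A[1, 0] = A[2, 1] = 1.0                     # ground-truth adjacency (unknown to the inferrer)
x = np.zeros((T, N))
x[0] = rng.random(N)
for t in range(T - 1):
    # Noisy coupled logistic maps stand in for the real nodal dynamics.
    x[t + 1] = 0.7 * 3.8 * x[t] * (1 - x[t]) + 0.3 * A @ x[t] \
               + 0.001 * rng.standard_normal(N)
    x[t + 1] = np.clip(x[t + 1], 0.0, 1.0)

# --- Echo state network trained to predict x(t+1) from x(t) ---
D = 200
W_in = rng.uniform(-0.5, 0.5, (D, N))
W = rng.standard_normal((D, D))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
r = np.zeros((T, D))
for t in range(T - 1):
    r[t + 1] = np.tanh(W @ r[t] + W_in @ x[t])
wash = 100                                   # discard reservoir transient
R, Y = r[wash:T - 1], x[wash + 1:T]
# Ridge regression for the output layer: x_hat(t+1) = W_out @ r(t+1)
W_out = np.linalg.solve(R.T @ R + 1e-6 * np.eye(D), R.T @ Y).T

# --- Link inference from the trained output layer ---
# One-step sensitivity |d x_hat_i(t+1) / d x_j(t)| = |W_out diag(tanh') W_in|,
# averaged over the trajectory (memory through r(t) is ignored in this sketch).
S = np.zeros((N, N))
for t in range(wash, T - 1):
    g = 1.0 - np.tanh(W @ r[t] + W_in @ x[t]) ** 2
    S += np.abs(W_out @ (g[:, None] * W_in))
S /= (T - 1 - wash)
np.fill_diagonal(S, 0.0)                     # ignore self-links
inferred = S > 0.5 * S.max()                 # crude threshold on sensitivity
```

The key design choice mirrors the abstract: inference is non-invasive (no perturbations are applied to the real system); instead, the trained surrogate is interrogated, here by differentiating its one-step prediction with respect to each nodal input. The threshold used above is arbitrary; in practice one would study the distribution of sensitivity scores.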




