Exploring Interpretable LSTM Neural Networks over Multi-Variable Data

05/28/2019
by   Tian Guo, et al.

For recurrent neural networks trained on time series with target and exogenous variables, it is desirable not only to predict accurately but also to provide interpretable insights into the data. In this paper, we explore the structure of LSTM recurrent neural networks to learn variable-wise hidden states, with the aim of capturing the different dynamics in multi-variable time series and distinguishing each variable's contribution to the prediction. Building on these variable-wise hidden states, we propose a mixture attention mechanism to model the generative process of the target. We then develop associated training methods to jointly learn the network parameters and the variable and temporal importance w.r.t. the prediction of the target variable. Extensive experiments on real datasets demonstrate improved prediction performance from capturing the dynamics of individual variables. We also evaluate the interpretation results both qualitatively and quantitatively. The approach shows promise as an end-to-end framework for both forecasting and knowledge extraction over multi-variable data.
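The mixture-attention idea sketched in the abstract can be illustrated with a small NumPy example. This is a hedged, simplified sketch with untrained random weights, not the paper's actual architecture: the shapes, weight names (`w_tmp`, `w_var`, `w_out`), and the way per-variable predictions are formed are illustrative assumptions. It shows the two levels of attention: temporal attention within each variable's hidden-state sequence, and variable-level mixture weights that combine per-variable predictions of the target.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical shapes: D input variables, T time steps, k hidden units per variable.
D, T, k = 3, 10, 4

# Variable-wise hidden states, standing in for the output of a tensorized LSTM:
# H[d, t] is the hidden state of variable d at time step t.
H = rng.standard_normal((D, T, k))

# Temporal attention per variable: score every step, then pool into a context c_d.
w_tmp = rng.standard_normal(k)
alpha = softmax(H @ w_tmp, axis=1)            # (D, T) temporal weights per variable
context = (alpha[..., None] * H).sum(axis=1)  # (D, k) one context vector per variable

# Variable attention (mixture weights) from the last hidden state plus context.
w_var = rng.standard_normal(2 * k)
feats = np.concatenate([H[:, -1, :], context], axis=1)  # (D, 2k)
beta = softmax(feats @ w_var, axis=0)                   # (D,) variable importance

# Each variable contributes its own prediction of the target; the mixture
# combines them, weighted by the variable-importance distribution beta.
w_out = rng.standard_normal(2 * k)
per_var_pred = feats @ w_out       # (D,) per-variable predictions
y_hat = float(beta @ per_var_pred)

print(alpha.shape, beta.shape, y_hat)
```

Because `alpha` and `beta` are probability distributions (each row of `alpha` and the whole of `beta` sum to one), they can be read off directly as temporal and variable importance, which is what makes this formulation interpretable.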


Related research

Multi-variable LSTM neural network for autoregressive exogenous model (06/17/2018)
In this paper, we propose multi-variable LSTM capable of accurate foreca...

An interpretable LSTM neural network for autoregressive exogenous model (04/14/2018)
In this paper, we propose an interpretable LSTM recurrent neural network...

Hierarchical Attention-Based Recurrent Highway Networks for Time Series Prediction (06/02/2018)
Time series prediction has been studied in a variety of domains. However...

Prediction of gene expression time series and structural analysis of gene regulatory networks using recurrent neural networks (09/13/2021)
Methods for time series prediction and classification of gene regulatory...

Differential Recurrent Neural Networks for Action Recognition (04/25/2015)
The long short-term memory (LSTM) neural network is capable of processin...

Graph Attention Recurrent Neural Networks for Correlated Time Series Forecasting – Full version (03/19/2021)
We consider a setting where multiple entities interact with each other ...

Temporal Weights (12/13/2022)
In artificial neural networks, weights are a static representation of sy...
