Learning Wave Propagation with Attention-Based Convolutional Recurrent Autoencoder Net

01/17/2022
by Indu Kant Deo, et al.

In this paper, we present an end-to-end attention-based convolutional recurrent autoencoder (AB-CRAN) network for data-driven modeling of wave propagation phenomena. The proposed architecture relies on an attention-based recurrent neural network (RNN) with long short-term memory (LSTM) cells. To construct the low-dimensional learning model, we employ a denoising convolutional autoencoder that extracts a compact latent representation from full-order snapshots of time-dependent hyperbolic partial differential equations governing wave propagation. We first address the difficulty of evolving this low-dimensional representation in time with a plain RNN-LSTM, and then build an attention-based sequence-to-sequence RNN-LSTM architecture to predict the solution over long time horizons. To demonstrate the effectiveness of the proposed learning model, we consider three benchmark problems, namely one-dimensional linear convection, the nonlinear viscous Burgers equation, and the two-dimensional Saint-Venant shallow water system. Using time-series datasets from these benchmarks, the AB-CRAN architecture accurately captures the wave amplitude and preserves the wave characteristics of the solution over long time horizons. The attention-based sequence-to-sequence network increases the prediction horizon by a factor of five compared to the plain RNN-LSTM. The denoising autoencoder further reduces the mean squared error of the predictions and improves generalization in the parameter space.

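The abstract describes a two-stage architecture: a denoising convolutional autoencoder that compresses full-order snapshots into a low-dimensional latent space, and an attention-based sequence-to-sequence RNN-LSTM that evolves the latent states in time. The sketch below illustrates that idea in PyTorch; all module names, layer sizes, the noise level, and hyperparameters (e.g., ConvAutoencoder, AttnSeq2Seq, latent_dim=32, hidden=64) are illustrative assumptions and not the authors' exact configuration.

# A minimal sketch of the AB-CRAN idea, assuming a PyTorch implementation.
# Names, layer sizes, and hyperparameters are illustrative, not from the paper.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Denoising convolutional autoencoder: maps 1-D snapshots to a low-dimensional latent space."""
    def __init__(self, n_grid=256, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * (n_grid // 4), latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * (n_grid // 4)), nn.ReLU(),
            nn.Unflatten(1, (32, n_grid // 4)),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x, noise_std=0.01):
        # Denoising: corrupt the input snapshot, reconstruct the clean one.
        z = self.encoder(x + noise_std * torch.randn_like(x))
        return self.decoder(z), z


class AttnSeq2Seq(nn.Module):
    """Attention-based sequence-to-sequence LSTM that evolves latent states in time."""
    def __init__(self, latent_dim=32, hidden=64):
        super().__init__()
        self.enc = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.dec = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.out = nn.Linear(2 * hidden, latent_dim)

    def forward(self, z_hist, n_future):
        enc_out, (h, c) = self.enc(z_hist)                 # encode the latent history
        z_t = z_hist[:, -1:, :]                            # last observed latent state
        preds = []
        for _ in range(n_future):
            dec_out, (h, c) = self.dec(z_t, (h, c))        # one decoder step
            ctx, _ = self.attn(dec_out, enc_out, enc_out)  # attend over the history
            z_t = self.out(torch.cat([dec_out, ctx], dim=-1))
            preds.append(z_t)
        return torch.cat(preds, dim=1)                     # (batch, n_future, latent_dim)


if __name__ == "__main__":
    cae, prop = ConvAutoencoder(), AttnSeq2Seq()
    snaps = torch.randn(8, 10, 1, 256)                     # batch of 10-step snapshot sequences
    z_hist = torch.stack([cae(snaps[:, t])[1] for t in range(10)], dim=1)
    z_future = prop(z_hist, n_future=20)                   # roll out 20 latent time steps
    u_future = cae.decoder(z_future.reshape(-1, 32))       # decode back to physical space
    print(u_future.shape)                                  # torch.Size([160, 1, 256])

In practice the autoencoder would be trained on PDE snapshots (with the denoising corruption applied to its inputs) and the attention-based propagator trained to advance the resulting latent trajectories; attending over the encoded history is what allows the decoder to stay accurate over long prediction horizons.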