Removing numerical dispersion from linear evolution equations

06/22/2019
by Jens Wittsten, et al.

In this paper we describe a method for removing the numerical errors that arise in the modeling of linear evolution equations when the time derivative is approximated by a finite difference operator. We prove that the method yields a solution with the correct evolution throughout the entire lifespan of the simulation, and we demonstrate it on a model equation as well as on simulations of elastic and viscoelastic wave propagation.
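
The dispersion the abstract refers to is straightforward to reproduce. The sketch below (Python with NumPy; all parameter values and names are illustrative, and it demonstrates the error itself, not the correction method of the paper) advects a smooth pulse under the transport equation u_t = -c u_x, using a second-order leapfrog step in time and a spectral derivative in space, so that essentially all of the error comes from the finite difference in time.

import numpy as np

c = 1.0                      # wave speed
L = 2 * np.pi                # periodic domain length
N = 256                      # grid points
dt = 5e-3                    # time step (c * k_max * dt < 1, so leapfrog is stable)
steps = 2000                 # total steps, final time t = 10

x = np.linspace(0.0, L, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)   # spectral wavenumbers

def dudx(u):
    # Spectral x-derivative: effectively exact for this smooth pulse, so
    # the remaining error is due to the time discretization alone.
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

u0 = np.exp(-40.0 * (x - np.pi) ** 2)        # smooth pulse, many wavenumbers
u_prev = u0
u_curr = u0 - dt * c * dudx(u0)              # one Euler step to start leapfrog

for _ in range(steps - 1):
    # Leapfrog: (u^{n+1} - u^{n-1}) / (2 dt) = -c u_x^n
    u_prev, u_curr = u_curr, u_prev - 2.0 * dt * c * dudx(u_curr)

t = steps * dt
u_exact = np.exp(-40.0 * (((x - c * t) % L) - np.pi) ** 2)  # translated pulse
print(f"max error after t = {t:.1f}: {np.max(np.abs(u_curr - u_exact)):.3e}")

For a Fourier mode exp(i(kx - wt)) the leapfrog step satisfies sin(w dt) = c k dt rather than the exact relation w = c k, so the numerical phase speed arcsin(c k dt) / (k dt) varies with k: each wavelength travels at a slightly wrong speed and the pulse trails a spurious oscillatory wake that grows with simulation time. This k-dependent, accumulating phase error is the kind of time-discretization artifact the paper's method is designed to remove.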


