VPNets: Volume-preserving neural networks for learning source-free dynamics

by Aiqing Zhu, et al.

We propose volume-preserving networks (VPNets) for learning unknown source-free dynamical systems from trajectory data. We introduce three modules and combine them into two network architectures, coined R-VPNet and LA-VPNet. The distinct feature of the proposed models is that they are intrinsically volume-preserving. In addition, the corresponding approximation theorems are proved, which theoretically guarantee the expressivity of the proposed VPNets for learning source-free dynamics. The effectiveness, generalization ability and structure-preserving property of the VPNets are demonstrated by numerical experiments.
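The abstract does not reproduce the paper's module definitions, but the general principle behind intrinsically volume-preserving layers can be illustrated with an additive coupling map: the state is split into two halves, and one half is shifted by a nonlinear function of the other. The Jacobian of such a map is unit-triangular, so its determinant is exactly 1 for any choice of weights. The sketch below, with the hypothetical names `vp_coupling_layer` and `numerical_jacobian_det`, demonstrates this property numerically; it is an illustration of the volume-preservation mechanism, not the paper's actual R-VPNet or LA-VPNet modules.

```python
import numpy as np

def vp_coupling_layer(x, W, b):
    """Additive coupling layer: a volume-preserving map on R^(2d).

    Splits x into halves (x1, x2) and applies
        y1 = x1
        y2 = x2 + tanh(x1 @ W + b)
    The Jacobian is unit lower-triangular, so det J = 1 and the map
    preserves phase-space volume exactly, for any weights W, b.
    (Illustrative sketch; not the paper's module definition.)
    """
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    y2 = x2 + np.tanh(x1 @ W + b)
    return np.concatenate([x1, y2], axis=-1)

def numerical_jacobian_det(f, x, eps=1e-6):
    """Central finite-difference Jacobian determinant of f at x."""
    n = x.size
    J = np.zeros((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return np.linalg.det(J)
```

Composing several such layers while alternating which half of the state is updated yields an invertible network whose Jacobian determinant is identically 1, i.e. a volume-preserving map by construction rather than by soft penalty.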




Locally-symplectic neural networks for learning volume-preserving dynamics

We propose locally-symplectic neural networks LocSympNets for learning v...

Volume-preserving Neural Networks: A Solution to the Vanishing Gradient Problem

We propose a novel approach to addressing the vanishing (or exploding) g...

Variational Integrator Networks for Physically Meaningful Embeddings

Learning workable representations of dynamical systems is becoming an in...

Structure-preserving Method for Reconstructing Unknown Hamiltonian Systems from Trajectory Data

We present a numerical approach for approximating unknown Hamiltonian sy...

The aromatic bicomplex for the description of divergence-free aromatic forms and volume-preserving integrators

Aromatic B-series were introduced as an extension of standard Butcher-se...

Approximation capabilities of measure-preserving neural networks

Measure-preserving neural networks are well-developed invertible models,...

Learning sparse linear dynamic networks in a hyper-parameter free setting

We address the issue of estimating the topology and dynamics of sparse l...
