GRAND: Graph Neural Diffusion

We present Graph Neural Diffusion (GRAND), which approaches deep learning on graphs as a continuous diffusion process and treats Graph Neural Networks (GNNs) as discretisations of an underlying PDE. In our model, the layer structure and topology correspond to the discretisation choices of temporal and spatial operators. Our approach allows a principled development of a broad new class of GNNs that are able to address common plights of graph learning models, such as limited depth, oversmoothing, and bottlenecks. Key to the success of our models is stability with respect to perturbations in the data, which we address for both implicit and explicit discretisation schemes. We develop linear and nonlinear versions of GRAND, which achieve competitive results on many standard graph benchmarks.
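To make the "GNN layers as PDE discretisation" view concrete, here is a minimal sketch of one explicit Euler discretisation of a graph diffusion equation dX/dt = (A − I)X, where A is a row-stochastic attention or adjacency matrix. The function names, step size `tau`, and toy graph are illustrative assumptions, not the paper's actual implementation (which learns A via attention):

```python
import numpy as np

def explicit_euler_step(X, A, tau=0.1):
    """One explicit Euler step of graph diffusion dX/dt = (A - I) X.

    X : (n, d) array of node features.
    A : (n, n) row-stochastic diffusion matrix (illustrative stand-in
        for a learned attention matrix).
    """
    return X + tau * (A @ X - X)

def diffuse(X, A, steps=10, tau=0.1):
    # Stacking Euler steps plays the role of network depth:
    # more steps = deeper network, same underlying PDE.
    for _ in range(steps):
        X = explicit_euler_step(X, A, tau)
    return X

# Toy example: 3-node path graph, random-walk (row-normalised) weights
# with self-loops. Diffusion smooths features toward a weighted mean.
A = np.array([[1/2, 1/2, 0.0],
              [1/3, 1/3, 1/3],
              [0.0, 1/2, 1/2]])
X0 = np.array([[1.0], [0.0], [0.0]])
Xt = diffuse(X0, A, steps=50, tau=0.5)
```

Running many steps on this toy graph drives the node features toward a common value, which is the oversmoothing behaviour the continuous-time view makes explicit; stability of the scheme depends on the step size `tau` relative to the spectrum of A − I.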
