What functions can Graph Neural Networks compute on random graphs? The role of Positional Encoding

05/24/2023
by Nicolas Keriven, et al.

We aim to deepen the theoretical understanding of Graph Neural Networks (GNNs) on large graphs, with a focus on their expressive power. Existing analyses relate this notion to the graph isomorphism problem, which is mostly relevant for graphs of small size, or study graph classification and regression tasks, while prediction tasks on nodes are far more relevant on large graphs. Recently, several works have shown that, on very general random graph models, GNNs converge to certain functions as the number of nodes grows. In this paper, we provide a more complete and intuitive description of the function space generated by equivariant GNNs for node tasks, through general notions of convergence that encompass several previous examples. We emphasize the role of input node features and study the impact of node Positional Encodings (PEs), a recent line of work that has been shown to yield state-of-the-art results in practice. By studying several examples of PEs on large random graphs, we extend previously known universality results to significantly more general models. Our theoretical results hint at some normalization tricks, which are shown numerically to have a positive impact on GNN generalization on synthetic and real data. Our proofs contain new concentration inequalities of independent interest.
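To make the setting concrete, here is a minimal sketch of the kind of object the abstract discusses: an equivariant message-passing GNN layer applied to a random graph, with Laplacian-eigenvector Positional Encodings appended to the input node features. This is an illustrative toy (using numpy; the dimensions, the Erdős–Rényi model, and the specific PE choice are assumptions for the example), not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Erdős–Rényi random graph: n nodes, edge probability p (illustrative choice).
n, p = 100, 0.1
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency, no self-loops

# Laplacian-eigenvector PE: eigenvectors of the normalized Laplacian with the
# smallest nonzero eigenvalues, a common Positional Encoding in practice.
deg = A.sum(axis=1)
d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
k = 4
pe = eigvecs[:, 1 : k + 1]  # skip the trivial constant eigenvector

# Input node features concatenated with the PE.
d_in = 3
X = np.concatenate([rng.standard_normal((n, d_in)), pe], axis=1)

# One mean-aggregation message-passing layer, equivariant to node relabeling:
#   h_v = ReLU( W1 x_v + W2 * mean_{u ~ v} x_u )
d = X.shape[1]
W1 = rng.standard_normal((d, d)) / np.sqrt(d)
W2 = rng.standard_normal((d, d)) / np.sqrt(d)
mean_neigh = (A @ X) / np.maximum(deg, 1.0)[:, None]
H = np.maximum(X @ W1 + mean_neigh @ W2, 0.0)

print(H.shape)  # one embedding per node: a node-level (equivariant) output
```

The output is a matrix with one row per node, which is the "node task" setting the paper focuses on; permuting the nodes of the graph permutes the rows of `H` in the same way, which is the equivariance property.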


