Fine-grained Expressivity of Graph Neural Networks

06/06/2023
by Jan Böker et al.

Numerous recent works have analyzed the expressive power of message-passing graph neural networks (MPNNs), primarily utilizing combinatorial techniques such as the 1-dimensional Weisfeiler-Leman test (1-WL) for the graph isomorphism problem. However, the graph isomorphism objective is inherently binary, offering no insight into the degree of similarity between two given graphs. This work resolves this issue by considering continuous extensions of both 1-WL and MPNNs to graphons. Concretely, we show that the continuous variant of 1-WL delivers an accurate topological characterization of the expressive power of MPNNs on graphons, revealing which graphs these networks can distinguish and how difficult it is to separate them. We identify the finest topology in which MPNNs separate points and prove a universal approximation theorem. Consequently, we provide a theoretical framework for graph and graphon similarity that combines various topological variants of classical characterizations of the 1-WL. In particular, we characterize the expressive power of MPNNs in terms of the tree distance, a graph distance based on the concept of fractional isomorphism, and in terms of substructure counts via tree homomorphisms, showing that these concepts have the same expressive power as the 1-WL and MPNNs on graphons. Empirically, we validate our theoretical findings by showing that randomly initialized MPNNs, without training, exhibit competitive performance compared to their trained counterparts. Moreover, we evaluate different MPNN architectures based on their ability to preserve graph distances, highlighting the significance of our continuous 1-WL test in understanding MPNNs' expressivity.
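
As a reference point for the combinatorial baseline that the paper lifts to graphons, the following is a minimal sketch of the classical (discrete) 1-WL colour refinement test on finite graphs. The adjacency-dictionary input format and the helper names wl_colours and wl_distinguishes are illustrative assumptions, not taken from the paper; refinement is run on the disjoint union of the two graphs so that colour ids are directly comparable.

from collections import Counter


def wl_colours(adj, rounds):
    """One colour-refinement run on a graph given as {node: [neighbours]}."""
    colours = {v: 0 for v in adj}  # uniform initial colouring
    for _ in range(rounds):
        # New signature = own colour plus the multiset of neighbour colours.
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures into small integer colour ids.
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colours = {v: palette[signatures[v]] for v in adj}
    return colours


def wl_distinguishes(adj_g, adj_h):
    """True iff 1-WL separates the two graphs.

    Refinement runs on their disjoint union so colour ids are shared;
    the graphs are separated iff their colour histograms differ.
    """
    union = {("g", v): [("g", u) for u in nbrs] for v, nbrs in adj_g.items()}
    union.update({("h", v): [("h", u) for u in nbrs] for v, nbrs in adj_h.items()})
    colours = wl_colours(union, rounds=len(union))  # n rounds suffice for stability
    hist_g = Counter(c for (side, _), c in colours.items() if side == "g")
    hist_h = Counter(c for (side, _), c in colours.items() if side == "h")
    return hist_g != hist_h


triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path3 = {0: [1], 1: [0, 2], 2: [1]}
print(wl_distinguishes(triangle, path3))  # True: 1-WL separates the triangle from the 3-path

Two graphs are 1-WL-indistinguishable exactly when these stable colour histograms coincide; the paper's continuous variant extends this refinement to graphons, which is what yields the metric, rather than binary, comparison of graphs described in the abstract.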
