The Descriptive Complexity of Graph Neural Networks

03/08/2023
by Martin Grohe, et al.

We analyse the power of graph neural networks (GNNs) in terms of Boolean circuit complexity and descriptive complexity. We prove that the graph queries that can be computed by a polynomial-size bounded-depth family of GNNs are exactly those definable in the guarded fragment GFO+C of first-order logic with counting and with built-in relations. This puts GNNs in the circuit complexity class TC^0. Remarkably, the GNN families may use arbitrary real weights and a wide class of activation functions that includes the standard ReLU, logistic "sigmoid", and hyperbolic tangent functions. If the GNNs are allowed to use random initialisation and global readout (both standard features of GNNs widely used in practice), they can compute exactly the same queries as bounded-depth Boolean circuits with threshold gates, that is, exactly the queries in TC^0. Moreover, we show that queries computable by a single GNN with piecewise linear activations and rational weights are definable in GFO+C without built-in relations. Therefore, they are contained in uniform TC^0.
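To make the object of study concrete, here is a minimal sketch (not taken from the paper) of one GNN message-passing layer with a piecewise linear (ReLU) activation, the kind of model the abstract's final result concerns. The function name `gnn_layer` and the sum-aggregation scheme are illustrative assumptions; real architectures vary in how they aggregate and combine.

```python
import numpy as np

def relu(x):
    # ReLU is piecewise linear, matching the activation class
    # mentioned in the abstract's last result.
    return np.maximum(0.0, x)

def gnn_layer(adj, features, w_self, w_neigh, bias):
    """One message-passing layer (illustrative sketch): each node
    combines its own state with the sum of its neighbours' states,
    then applies ReLU."""
    neigh_sum = adj @ features  # sum-aggregate neighbour features
    return relu(features @ w_self + neigh_sum @ w_neigh + bias)

# Toy graph: a triangle (3 nodes, all pairwise adjacent).
adj = np.array([[0, 1, 1],
                [1, 0, 1],
                [1, 1, 0]], dtype=float)
x = np.eye(3)  # one-hot initial node features

rng = np.random.default_rng(0)
w_self = rng.standard_normal((3, 4))
w_neigh = rng.standard_normal((3, 4))
bias = np.zeros(4)

h = gnn_layer(adj, x, w_self, w_neigh, bias)
print(h.shape)  # one 4-dimensional state per node: (3, 4)
```

A "bounded-depth family" in the abstract's sense corresponds to stacking a fixed number of such layers while the graph size grows, with the weights allowed to depend on the input size.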


