Towards Arbitrarily Expressive GNNs in O(n^2) Space by Rethinking Folklore Weisfeiler-Lehman

06/05/2023
by Jiarui Feng et al.

Message passing neural networks (MPNNs) have emerged as the most popular framework for graph neural networks (GNNs) in recent years. However, their expressive power is limited by the 1-dimensional Weisfeiler-Lehman (1-WL) test. Some works draw inspiration from k-WL/FWL (Folklore WL) and design corresponding neural versions. Despite their high expressive power, this line of research has serious limitations. In particular, (1) k-WL/FWL requires at least O(n^k) space, which is impractical for large graphs even when k = 3; (2) the design space of k-WL/FWL is rigid, with the only adjustable hyper-parameter being k. To tackle the first limitation, we propose an extension, (k, t)-FWL. We theoretically prove that even if we fix the space complexity to O(n^2) in (k, t)-FWL, we can construct an expressiveness hierarchy up to solving the graph isomorphism problem. To tackle the second limitation, we propose k-FWL+, which considers any equivariant set as neighbors instead of all nodes, thereby greatly expanding the design space of k-FWL. Combining these two modifications results in a flexible and powerful framework, (k, t)-FWL+. We demonstrate that (k, t)-FWL+ can implement most existing models with matching expressiveness. We then introduce an instance of (k, t)-FWL+ called Neighborhood^2-FWL (N^2-FWL), which is both practically and theoretically sound. We prove that N^2-FWL is no less powerful than 3-WL and can encode many substructures while requiring only O(n^2) space. Finally, we design its neural version, named N^2-GNN, and evaluate its performance on various tasks. N^2-GNN achieves superior performance on almost all tasks, with record-breaking results on ZINC-Subset (0.059) and ZINC-Full (0.013), outperforming the previous state-of-the-art results by 10.6% and 40.9%, respectively.
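Since the abstract builds on the Folklore WL update rule, a minimal sketch of classical 2-FWL color refinement may help fix ideas. This is not the paper's (k, t)-FWL+ algorithm; it is the baseline that (k, t)-FWL+ generalizes by restricting the aggregation from "all nodes" to an equivariant neighbor set. The function names (fwl2_colors, distinguishable) and the choice of networkx are illustrative assumptions, not part of the paper.

```python
# Sketch of 2-FWL (Folklore WL) color refinement. Colors are kept for every
# ordered node pair, so the state is O(n^2); each refinement step aggregates
# over all n nodes w, which is the "all nodes as neighbors" rule that the
# paper's k-FWL+ relaxes to arbitrary equivariant sets.
import networkx as nx  # hypothetical choice of graph library for illustration

def fwl2_colors(G: nx.Graph, rounds: int = 3):
    nodes = list(G.nodes)
    # Initial color of a pair (u, v): its isomorphism type
    # (is it a diagonal pair? is it an edge?).
    color = {(u, v): (u == v, G.has_edge(u, v)) for u in nodes for v in nodes}
    for _ in range(rounds):
        new_color = {}
        for u in nodes:
            for v in nodes:
                # Folklore aggregation: the multiset, over every node w,
                # of the color pair (c(u, w), c(w, v)).
                multiset = sorted((color[(u, w)], color[(w, v)]) for w in nodes)
                # hash() stands in for an injective relabeling; collisions are
                # theoretically possible but irrelevant for a sketch.
                new_color[(u, v)] = hash((color[(u, v)], tuple(multiset)))
        color = new_color
    return color

def distinguishable(G1: nx.Graph, G2: nx.Graph, rounds: int = 3) -> bool:
    # Two graphs are distinguished if their multisets of pair colors differ
    # (valid within a single run, since hash() is consistent per process).
    return sorted(fwl2_colors(G1, rounds).values()) != sorted(fwl2_colors(G2, rounds).values())
```

For k > 2 the same refinement keeps a color per k-tuple, which is the O(n^k) space cost the abstract highlights; the paper's (k, t)-FWL construction instead fixes the stored state at pairs (O(n^2)) while growing expressiveness through t.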


