Exphormer: Sparse Transformers for Graphs

03/10/2023
by Hamed Shirzad, et al.

Graph transformers have emerged as a promising architecture for a variety of graph learning and representation tasks. Despite their successes, though, it remains challenging to scale graph transformers to large graphs while maintaining accuracy competitive with message-passing networks. In this paper, we introduce Exphormer, a framework for building powerful and scalable graph transformers. Exphormer consists of a sparse attention mechanism built on two components: virtual global nodes and expander graphs, whose mathematical characteristics, such as spectral expansion, pseudorandomness, and sparsity, yield graph transformers with complexity only linear in the size of the graph, while allowing us to prove desirable theoretical properties of the resulting transformer models. We show that incorporating Exphormer into the recently proposed GraphGPS framework produces models with competitive empirical results on a wide variety of graph datasets, including state-of-the-art results on three datasets. We also show that Exphormer can scale to larger graphs than previous graph transformer architectures have handled. Code can be found at https://github.com/hamed1375/Exphormer.
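The abstract describes the sparse attention pattern only at a high level. As a hedged illustration (not the paper's actual implementation), the set of node pairs allowed to attend to one another can be sketched as the union of three parts: the input graph's own edges, the edges of a sparse expander-like graph, and all-to-all connections with a few virtual global nodes. The function names below and the random-matching construction of the expander are illustrative assumptions, not code from the Exphormer repository.

```python
import random


def expander_edges(n, degree=3, seed=0):
    """Approximate a sparse regular expander by unioning `degree`
    random perfect matchings on n nodes (an illustrative construction)."""
    rng = random.Random(seed)
    edges = set()
    for _ in range(degree):
        perm = list(range(n))
        rng.shuffle(perm)
        # pair up consecutive nodes of the shuffled order (a matching)
        for i in range(0, n - 1, 2):
            u, v = perm[i], perm[i + 1]
            edges.add((min(u, v), max(u, v)))
    return edges


def sparse_attention_pairs(graph_edges, n, degree=3, num_virtual=1):
    """Sparse attention pattern: original graph edges, plus expander
    edges, plus edges to virtual global nodes (indices n..n+num_virtual-1)."""
    pairs = {(min(u, v), max(u, v)) for u, v in graph_edges}
    pairs |= expander_edges(n, degree)
    for virt in range(n, n + num_virtual):
        for u in range(n):
            pairs.add((u, virt))
    return pairs


# Toy example: a path graph on 8 nodes.
n = 8
path = [(i, i + 1) for i in range(n - 1)]
pairs = sparse_attention_pairs(path, n)
# The number of attention pairs grows as O(n * (degree + num_virtual)),
# i.e. linearly in the number of nodes, rather than O(n^2) for full attention.
print(len(pairs))
```

Under this sketch, each attention layer only computes scores over `pairs`, which is why the overall cost stays linear in the graph size while the expander edges keep information flowing between distant nodes.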


Related research

- Attending to Graph Transformers (02/08/2023): Recently, transformer architectures for graphs emerged as an alternative...
- Transformers Generalize DeepSets and Can be Extended to Graphs and Hypergraphs (10/27/2021): We present a generalization of Transformers to any-order permutation inv...
- Graph Mixer Networks (01/29/2023): In recent years, the attention mechanism has demonstrated superior perfo...
- Pure Transformers are Powerful Graph Learners (07/06/2022): We show that standard Transformers without graph-specific modifications ...
- Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs (06/23/2022): 3D-related inductive biases like translational invariance and rotational...
- Gransformer: Transformer-based Graph Generation (03/25/2022): Transformers have become widely used in modern models for various tasks ...
- Graph Traversal with Tensor Functionals: A Meta-Algorithm for Scalable Learning (02/08/2021): Graph Representation Learning (GRL) methods have impacted fields from ch...
