SEA: Graph Shell Attention in Graph Neural Networks

10/20/2021
by Christian M. M. Frey, et al.

A common issue in Graph Neural Networks (GNNs) is over-smoothing: as the number of message-passing iterations grows, the nodes' representations of the input graph align with each other and become indiscernible. Recently, it has been shown that increasing a model's complexity by integrating an attention mechanism yields more expressive architectures. This is largely attributed to steering a node's representation towards neighbors that are more informative than others. Combining Transformer models with GNNs results in architectures built on Graph Transformer Layers (GTL), where layers rely entirely on the attention operation. However, the calculation of a node's representation is still restricted to the computational workflow of a GNN. In our work, we relax the GNN architecture by implementing a routing heuristic: the nodes' representations are routed to dedicated experts, and each expert calculates the representations according to its respective GNN workflow. The distinguishable GNNs are defined by k-localized views starting from the central node. We call this procedure Graph Shell Attention (SEA), where experts process different subgraphs in a transformer-motivated fashion. Intuitively, increasing the number of experts increases the model's expressiveness, as a node's representation is based solely on the nodes located within the receptive field of its expert. We evaluate our architecture on various benchmark datasets and show competitive results compared to state-of-the-art models.
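To make the routing idea concrete, the following is a minimal sketch in plain PyTorch with a dense adjacency matrix: each expert applies scaled dot-product attention restricted to a k-hop shell around the central node, and a learned router assigns every node to one expert. The names (ShellExpert, SEALayer, k_hop_mask), the hard argmax routing, and the per-expert hop sizes are illustrative assumptions, not the authors' reference implementation.

```python
# Hypothetical sketch of Graph Shell Attention (SEA) routing in plain PyTorch.
import torch
import torch.nn as nn


def k_hop_mask(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Boolean mask where entry (i, j) is True iff j lies within k hops of i."""
    n = adj.size(0)
    reach = torch.eye(n, dtype=torch.bool)
    for _ in range(k):
        reach = reach | (reach.float() @ adj.float() > 0)
    return reach


class ShellExpert(nn.Module):
    """One expert: scaled dot-product attention restricted to a k-hop shell."""
    def __init__(self, dim: int, hops: int):
        super().__init__()
        self.hops = hops
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        mask = k_hop_mask(adj, self.hops)                      # (n, n) receptive field
        scores = self.q(x) @ self.k(x).T / x.size(-1) ** 0.5   # attention logits
        scores = scores.masked_fill(~mask, float("-inf"))
        return torch.softmax(scores, dim=-1) @ self.v(x)


class SEALayer(nn.Module):
    """Routes every node to one expert; each expert has its own k-localized view."""
    def __init__(self, dim: int, hops_per_expert=(1, 2, 3)):
        super().__init__()
        self.router = nn.Linear(dim, len(hops_per_expert))
        self.experts = nn.ModuleList(ShellExpert(dim, k) for k in hops_per_expert)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        choice = self.router(x).argmax(dim=-1)                 # hard routing per node
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            nodes = (choice == e).nonzero(as_tuple=True)[0]
            if nodes.numel():
                out[nodes] = expert(x, adj)[nodes]             # keep only routed nodes
        return out


# Toy usage: 5 nodes on a path graph with 8-dimensional features.
if __name__ == "__main__":
    n, d = 5, 8
    adj = torch.zeros(n, n)
    for i in range(n - 1):
        adj[i, i + 1] = adj[i + 1, i] = 1.0
    layer = SEALayer(d)
    print(layer(torch.randn(n, d), adj).shape)  # torch.Size([5, 8])
```

In this sketch a larger hop count gives an expert a wider receptive field, so adding experts with different hop sizes is what lets routed nodes attend to differently sized shells; the hard argmax routing is the simplest possible choice and would need a differentiable or load-balanced variant in practice.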


