Graph Self-Attention for learning graph representation with Transformer

by Wonpyo Park, et al.

We propose a novel Graph Self-Attention module that enables Transformer models to learn graph representations. We aim to incorporate graph information into both the attention map and the hidden representations of the Transformer. To this end, we propose context-aware attention, which considers the interactions between the query, the key, and the graph information. Moreover, we propose a graph-embedded value to encode the graph information into the hidden representation. Our extensive experiments and ablation studies validate that our method successfully encodes graph representations within the Transformer architecture. Finally, our method achieves state-of-the-art performance on multiple graph representation learning benchmarks, ranging from graph classification on images and molecules to graph regression on quantum chemistry.
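To make the two ingredients concrete, below is a minimal PyTorch sketch of a self-attention layer that (a) adds a query-aware graph term to the attention scores and (b) mixes graph (relation) embeddings into the values. It assumes the graph information is supplied as an integer relation index per node pair (e.g., edge type or clipped shortest-path distance); names such as `num_relations`, `rel_bias`, and `rel_value` are hypothetical illustrations, not the authors' exact formulation.

```python
import math
import torch
import torch.nn as nn

class GraphSelfAttention(nn.Module):
    """Sketch of self-attention with graph-aware bias and graph-embedded values."""

    def __init__(self, dim: int, num_heads: int, num_relations: int):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)
        # Relation embeddings: one set interacts with the queries (attention bias),
        # another is aggregated into the values (graph-embedded value).
        self.rel_bias = nn.Embedding(num_relations, self.head_dim)
        self.rel_value = nn.Embedding(num_relations, self.head_dim)

    def forward(self, x: torch.Tensor, rel: torch.Tensor) -> torch.Tensor:
        # x:   (batch, nodes, dim) node features
        # rel: (batch, nodes, nodes) integer relation index per node pair
        b, n, _ = x.shape
        h, d = self.num_heads, self.head_dim
        q = self.q_proj(x).view(b, n, h, d).transpose(1, 2)   # (b, h, n, d)
        k = self.k_proj(x).view(b, n, h, d).transpose(1, 2)
        v = self.v_proj(x).view(b, n, h, d).transpose(1, 2)

        # Content term plus a query-aware graph term: each query also attends to the
        # relation embedding of its (i, j) pair, so the bias depends on context.
        e_bias = self.rel_bias(rel)                            # (b, n, n, d)
        content = torch.einsum("bhid,bhjd->bhij", q, k)
        graph = torch.einsum("bhid,bijd->bhij", q, e_bias)
        attn = torch.softmax((content + graph) / math.sqrt(d), dim=-1)

        # Graph-embedded value: aggregate relation embeddings with the same attention
        # weights so the graph structure also reaches the hidden representation.
        e_val = self.rel_value(rel)                            # (b, n, n, d)
        out = torch.einsum("bhij,bhjd->bhid", attn, v)
        out = out + torch.einsum("bhij,bijd->bhid", attn, e_val)
        out = out.transpose(1, 2).reshape(b, n, h * d)
        return self.out_proj(out)

# Usage sketch: two graphs with 10 nodes each and 8 relation buckets.
layer = GraphSelfAttention(dim=64, num_heads=4, num_relations=8)
x = torch.randn(2, 10, 64)
rel = torch.randint(0, 8, (2, 10, 10))   # e.g., clipped shortest-path distances
out = layer(x, rel)                      # (2, 10, 64)
```

The point of the sketch is that graph structure enters in two places, matching the abstract: once inside the softmax (context-aware attention) and once on the value path (graph-embedded value), rather than only as a static additive bias.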




Structure-Aware Transformer for Graph Representation Learning

The Transformer architecture has gained growing attention in graph repre...

Scalar Coupling Constant Prediction Using Graph Embedding Local Attention Encoder

Scalar coupling constant (SCC) plays a key role in the analysis of three...

Relative Molecule Self-Attention Transformer

Self-supervised learning holds promise to revolutionize molecule propert...

GATOR: Graph-Aware Transformer with Motion-Disentangled Regression for Human Mesh Recovery from a 2D Pose

3D human mesh recovery from a 2D pose plays an important role in various...

Circle Feature Graphormer: Can Circle Features Stimulate Graph Transformer?

In this paper, we introduce two local graph features for missing link pr...

Rethinking Batch Sample Relationships for Data Representation: A Batch-Graph Transformer based Approach

Exploring sample relationships within each mini-batch has shown great po...

DA-Transformer: Distance-aware Transformer

Transformer has achieved great success in the NLP field by composing var...
