Graph Transformer Networks: Learning Meta-path Graphs to Improve GNNs

06/11/2021
by   Seongjun Yun, et al.

Graph Neural Networks (GNNs) have been widely applied to various fields due to their powerful representations of graph-structured data. Despite this success, most existing GNNs are designed to learn node representations on fixed and homogeneous graphs. This limitation becomes especially problematic when learning representations on a misspecified graph or on a heterogeneous graph consisting of various types of nodes and edges. To address these limitations, we propose Graph Transformer Networks (GTNs), which are capable of generating new graph structures that exclude noisy connections and include useful connections (e.g., meta-paths) for the task, while learning effective node representations on the new graphs in an end-to-end fashion. We further propose an enhanced version of GTNs, Fast Graph Transformer Networks (FastGTNs), that improves the scalability of the graph transformations. Compared to GTNs, FastGTNs are 230x faster and use 100x less memory while allowing identical graph transformations. In addition, we extend graph transformations to the semantic proximity of nodes, allowing non-local operations beyond meta-paths. Extensive experiments on both homogeneous and heterogeneous graphs show that GTNs and FastGTNs with non-local operations achieve state-of-the-art performance on node classification tasks. The code is available at: https://github.com/seongjunyun/Graph_Transformer_Networks
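The "new graph structures" the abstract refers to are, in the GTN formulation, compositions of softly selected edge-type adjacency matrices: each Graph Transformer layer picks convex combinations of the input adjacencies (via a softmax over learnable weights) and multiplies them, so that products of adjacencies correspond to meta-path connectivity. Below is a minimal NumPy sketch of that soft-selection idea; the names (`gt_layer`, `adj_stack`) and the plain-NumPy setting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D weight vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def gt_layer(adj_stack, w1, w2):
    """One Graph Transformer layer (sketch).

    adj_stack: array of shape (K, N, N), one adjacency matrix per edge type.
    w1, w2:    learnable selection weights of shape (K,) (here just given).

    Softly select two graphs as convex combinations of the K edge-type
    adjacencies, then compose them by matrix multiplication: a nonzero
    entry (i, j) in the result means there is a length-2 meta-path
    from node i to node j through the two selected edge types.
    """
    q1 = np.tensordot(softmax(w1), adj_stack, axes=1)  # (N, N) convex combination
    q2 = np.tensordot(softmax(w2), adj_stack, axes=1)  # (N, N) convex combination
    return q1 @ q2  # candidate meta-path adjacency

# Toy example with two hypothetical edge types on 3 nodes,
# e.g. "paper-author" and "author-conference" relations.
A_pa = np.array([[0, 1, 0],
                 [0, 0, 0],
                 [0, 1, 0]], dtype=float)
A_ac = np.array([[0, 0, 0],
                 [0, 0, 1],
                 [0, 0, 0]], dtype=float)
adj_stack = np.stack([A_pa, A_ac])

# Weights strongly favoring edge type 0 then edge type 1 recover
# the "paper-author-conference" meta-path adjacency A_pa @ A_ac.
meta = gt_layer(adj_stack, np.array([10.0, -10.0]), np.array([-10.0, 10.0]))
```

Stacking several such layers yields longer meta-paths, and because the selection weights sit inside a softmax, which meta-paths matter is learned end-to-end from the downstream task loss rather than hand-designed.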

Related research

05/29/2020: Non-Local Graph Neural Networks
Modern graph neural networks (GNNs) learn node embeddings through multil...

03/03/2020: Heterogeneous Graph Transformer
Recent years have witnessed the emerging success of graph neural network...

05/31/2023: Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials
Heterogeneous Graph Neural Networks (HGNNs) have gained significant popu...

11/26/2022: PatchGT: Transformer over Non-trainable Clusters for Learning Graph Representations
Recently the Transformer structure has shown good performances in graph ...

06/10/2021: GraphiT: Encoding Graph Structure in Transformers
We show that viewing graphs as sets of node features and incorporating s...

04/28/2022: DOTIN: Dropping Task-Irrelevant Nodes for GNNs
Scalability is an important consideration for deep graph neural networks...

12/24/2019: Multi-Graph Transformer for Free-Hand Sketch Recognition
Learning meaningful representations of free-hand sketches remains a chal...
