TGL: A General Framework for Temporal GNN Training on Billion-Scale Graphs

03/28/2022
by Hongkuan Zhou, et al.

Many real-world graphs contain time-domain information. Temporal Graph Neural Networks (TGNNs) capture temporal information, as well as structural and contextual information, in the dynamic node embeddings they generate. Researchers have shown that these embeddings achieve state-of-the-art performance on many different tasks. In this work, we propose TGL, a unified framework for large-scale offline TGNN training in which users can compose a variety of TGNNs with simple configuration files. TGL comprises five main components: a temporal sampler, a mailbox, a node memory module, a memory updater, and a message passing engine. We design a Temporal-CSR data structure and a parallel sampler to efficiently sample temporal neighbors and form training mini-batches. We propose a novel random chunk scheduling technique that mitigates the problem of obsolete node memory when training with large batch sizes. To address the limitation that current TGNNs are evaluated only on small-scale datasets, we introduce two large-scale real-world datasets with 0.2 and 1.3 billion temporal edges. We evaluate TGL on four small-scale datasets with a single GPU, and on the two large datasets with multiple GPUs, for both link prediction and node classification tasks. We compare TGL with the open-source code of five methods and show that TGL achieves similar or better accuracy with an average 13x speedup. Our parallel temporal sampler achieves an average 173x speedup on a multi-core CPU compared with the baselines. On a 4-GPU machine, TGL trains one epoch over more than one billion temporal edges in 1-10 hours. To the best of our knowledge, this is the first work to propose a general framework for large-scale TGNN training on multiple GPUs.
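
The abstract names a Temporal-CSR (T-CSR) structure for temporal neighbor sampling. As a rough illustration of the idea, the sketch below stores each node's edges contiguously and sorted by timestamp, so that all events before a query time can be located with a single binary search before sampling. Class and method names here are illustrative, not TGL's actual API.

```python
import bisect
import numpy as np

class TemporalCSR:
    """Illustrative T-CSR: per-node edge lists sorted by timestamp."""

    def __init__(self, num_nodes, src, dst, ts):
        # Sort edges by (source node, timestamp); lexsort treats the
        # last key as the primary sort key.
        order = np.lexsort((ts, src))
        self.dst = dst[order]
        self.ts = ts[order]
        # CSR-style offsets: indptr[v]..indptr[v+1] spans node v's edges.
        self.indptr = np.zeros(num_nodes + 1, dtype=np.int64)
        np.add.at(self.indptr, src + 1, 1)
        self.indptr = np.cumsum(self.indptr)

    def sample_before(self, node, t, k, rng):
        # Edges of `node` are time-sorted, so one binary search finds
        # the cut point of all events strictly before time t.
        lo, hi = self.indptr[node], self.indptr[node + 1]
        cut = lo + bisect.bisect_left(self.ts[lo:hi], t)
        n = min(k, cut - lo)
        if n <= 0:
            return self.dst[:0], self.ts[:0]
        # Uniformly sample up to k temporal neighbors without replacement.
        idx = rng.choice(np.arange(lo, cut), size=n, replace=False)
        return self.dst[idx], self.ts[idx]

# Example: node 0 has edges at t=1.0 and t=5.0; a query at t=4.0
# can only see the earlier one.
rng = np.random.default_rng(0)
g = TemporalCSR(3,
                src=np.array([0, 0, 1]),
                dst=np.array([1, 2, 2]),
                ts=np.array([1.0, 5.0, 3.0]))
neighbors, times = g.sample_before(0, t=4.0, k=2, rng=rng)
```

The abstract also names a random chunk scheduling technique for large-batch training with node memory. The sketch below shows one plausible reading: the chronologically ordered training edges are cut into small chunks, and each epoch draws a random chunk offset so that large-batch boundaries, and hence the points at which node memory is refreshed, shift between epochs. This is a hypothetical illustration of the idea, not TGL's exact algorithm.

```python
import numpy as np

def random_chunk_batches(num_edges, chunk_size, chunks_per_batch, rng):
    """Yield index ranges of large batches whose boundaries are re-drawn
    each epoch by a random offset of up to one batch (hypothetical sketch)."""
    batch_size = chunk_size * chunks_per_batch
    offset = int(rng.integers(0, chunks_per_batch)) * chunk_size
    start = 0
    # The first (possibly short) batch absorbs the random offset, so
    # every later batch boundary moves relative to the previous epoch.
    if offset > 0:
        yield np.arange(0, offset)
        start = offset
    while start < num_edges:
        end = min(start + batch_size, num_edges)
        yield np.arange(start, end)
        start = end
```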
