Efficient Scaling of Dynamic Graph Neural Networks

We present distributed algorithms for training dynamic Graph Neural Networks (GNNs) on large-scale graphs spanning multi-node, multi-GPU systems. To the best of our knowledge, this is the first scaling study of dynamic GNNs. We devise mechanisms for reducing GPU memory usage and identify two execution-time bottlenecks: CPU-GPU data transfer and communication volume. Exploiting properties of dynamic graphs, we design a graph-difference-based strategy that significantly reduces the transfer time. We develop a simple but effective data distribution technique under which the communication volume remains fixed and linear in the input size for any number of GPUs. Our experiments using billion-size graphs on a system of 128 GPUs show that: (i) the distribution scheme achieves up to 30x speedup on 128 GPUs; (ii) the graph-difference technique reduces the transfer time by a factor of up to 4.1x and the overall execution time by up to 40%.
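Below is a minimal sketch of the graph-difference idea described above, assuming consecutive snapshots of the dynamic graph share most of their edges: rather than copying each snapshot's full edge list from CPU to GPU, only the edges added or removed since the previous snapshot cross the CPU-GPU boundary, and the current snapshot is rebuilt on the device. The helper names (`edge_delta`, `to_device_tensor`) and the PyTorch-based packing are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: delta-based CPU-to-GPU transfer of dynamic graph snapshots.
# All names below are illustrative; the paper's actual data structures differ.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

def edge_delta(prev_edges: set, curr_edges: set):
    """Edges to add and to remove when advancing to the next snapshot."""
    added = curr_edges - prev_edges
    removed = prev_edges - curr_edges
    return added, removed

def to_device_tensor(edges: set) -> torch.Tensor:
    """Pack a set of (src, dst) pairs into a 2 x |E| long tensor on the device."""
    if not edges:
        return torch.empty((2, 0), dtype=torch.long, device=device)
    return torch.tensor(sorted(edges), dtype=torch.long, device=device).t()

# Toy snapshot sequence; in practice these come from the dynamic graph loader.
snapshots = [
    {(0, 1), (1, 2), (2, 3)},
    {(0, 1), (1, 2), (2, 3), (3, 4)},      # one edge added
    {(0, 1), (2, 3), (3, 4), (4, 0)},      # one removed, one added
]

prev = set()
device_edges = set()   # host-side mirror of the edge set currently on the GPU
for t, snap in enumerate(snapshots):
    added, removed = edge_delta(prev, snap)
    # Only the delta is moved across the CPU-GPU boundary.
    delta_on_device = to_device_tensor(added)
    device_edges = (device_edges - removed) | added
    full_cost, delta_cost = len(snap), len(added) + len(removed)
    print(f"t={t}: full transfer {full_cost} edges vs delta {delta_cost} edges")
    prev = snap
```

The printed comparison illustrates why the approach pays off: when snapshots evolve slowly, the per-step delta is much smaller than the full edge list, so the CPU-GPU transfer volume shrinks accordingly.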
