GNNSampler: Bridging the Gap between Sampling Algorithms of GNN and Hardware

08/26/2021
by Xin Liu, et al.

Sampling is a critical operation in the training of Graph Neural Networks (GNNs) that helps reduce the training cost. Previous works have explored improving sampling algorithms through mathematical and statistical methods. However, a gap remains between sampling algorithms and hardware: without considering hardware, algorithm designers optimize sampling only at the algorithm level, missing the great potential of improving the efficiency of existing sampling algorithms by leveraging hardware features. In this paper, we first propose a unified programming model for mainstream sampling algorithms, termed GNNSampler, covering the key processes of sampling algorithms in various categories. Second, we explore the data locality among nodes and their neighbors (i.e., the hardware feature) in real-world datasets to alleviate irregular memory access in sampling. Third, we implement locality-aware optimizations in GNNSampler for diverse sampling algorithms to optimize the general sampling process in GNN training. Finally, we conduct extensive experiments on large graph datasets to analyze the relevance among training time, model accuracy, and hardware-level metrics, which helps achieve a good trade-off between time and accuracy in GNN training. Experimental results show that our method is universal to mainstream sampling algorithms and reduces GNN training time (ranging from 4.83% with layer-wise sampling to 44.92% with subgraph-based sampling) with comparable accuracy.
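To make the locality idea concrete, below is a minimal sketch of a locality-aware neighbor sampling step. It is not the authors' implementation: it assumes a graph in CSR form and uses node-ID distance as a stand-in for memory locality (a reasonable proxy after a locality-preserving node reordering); the function name and the `window` parameter are illustrative assumptions, not an API from the paper.

```python
import numpy as np

def locality_aware_sample(indptr, indices, seeds, fanout, window, rng):
    """Sample up to `fanout` neighbors per seed from a CSR graph,
    preferring neighbors whose IDs lie within `window` of the seed.

    After a locality-preserving node reordering, nearby IDs tend to sit
    in nearby memory, so favoring them reduces irregular memory access
    when gathering neighbor features. (Illustrative sketch only.)"""
    out = []
    for u in seeds:
        neigh = indices[indptr[u]:indptr[u + 1]]
        if neigh.size == 0:
            out.append(np.empty(0, dtype=indices.dtype))
            continue
        # Prefer "local" neighbors; fall back to the full neighbor set
        # when too few of them exist to satisfy the fanout.
        local = neigh[np.abs(neigh - u) <= window]
        pool = local if local.size >= fanout else neigh
        k = min(fanout, pool.size)
        out.append(rng.choice(pool, size=k, replace=False))
    return out

# Toy CSR graph with 4 nodes (illustrative data only).
indptr = np.array([0, 3, 5, 8, 9])
indices = np.array([1, 2, 3, 0, 2, 0, 1, 3, 2])
rng = np.random.default_rng(0)
print(locality_aware_sample(indptr, indices, seeds=[0, 2],
                            fanout=2, window=1, rng=rng))
```

Here locality is approximated by ID distance purely for illustration; the paper's locality-aware optimizations operate on hardware-level locality measured in real-world datasets, and plug into the general sampling stage of GNNSampler rather than replacing any particular sampling algorithm.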
