Scalable Consistency Training for Graph Neural Networks via Self-Ensemble Self-Distillation

10/12/2021
by Cole Hawkins, et al.

Consistency training is a popular method to improve deep learning models in computer vision and natural language processing. Graph neural networks (GNNs) have achieved remarkable performance in a variety of network science learning tasks, but to date no work has studied the effect of consistency training on large-scale graph problems. GNNs scale to large graphs through minibatch training and subsample node neighborhoods to handle high-degree nodes. We exploit the randomness inherent in neighbor subsampling and introduce a novel consistency training method to improve accuracy. For each target node we generate several different neighborhood expansions and distill the average of the resulting predictions back into the GNN. Our method approximates the expected prediction over all possible neighborhood samples and, in practice, requires only a few samples. We demonstrate that our training method outperforms standard GNN training in several different settings and yields the largest gains when label rates are low.
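The training step described in the abstract can be sketched in PyTorch-style pseudocode. The names gnn, sample_subgraph, num_samples, and distill_weight are illustrative assumptions, and the KL-based distillation term is one plausible instantiation of "distilling the average prediction"; this is a minimal sketch under those assumptions, not the authors' implementation.

# Minimal sketch of consistency training via self-ensemble self-distillation
# for one minibatch of target nodes. Assumed (not taken from the paper's code):
# `gnn` is a PyTorch module mapping a sampled subgraph to per-node logits, and
# `sample_subgraph` is a hypothetical helper that draws one random neighborhood
# expansion for the given target nodes.
import torch
import torch.nn.functional as F

def self_distillation_step(gnn, sample_subgraph, target_nodes, labels,
                           labeled_mask, num_samples=3, distill_weight=1.0):
    # Run the GNN on several independent neighborhood expansions of the batch.
    sample_logits = [gnn(sample_subgraph(target_nodes)) for _ in range(num_samples)]

    # Self-ensemble teacher: mean of the per-sample class probabilities,
    # detached so gradients flow only through the individual predictions.
    teacher = torch.stack([l.softmax(dim=-1) for l in sample_logits]).mean(0).detach()

    # Supervised cross-entropy on the labeled nodes, averaged over the samples.
    sup_loss = sum(F.cross_entropy(l[labeled_mask], labels[labeled_mask])
                   for l in sample_logits) / num_samples

    # Distillation/consistency term: pull each sampled prediction toward the
    # ensemble average (KL divergence from the teacher to each student sample).
    distill_loss = sum(F.kl_div(F.log_softmax(l, dim=-1), teacher,
                                reduction="batchmean")
                       for l in sample_logits) / num_samples

    return sup_loss + distill_weight * distill_loss

In a training loop this step would replace the standard supervised update, with num_samples kept small (e.g., 2 or 3) to limit the extra forward-pass cost, consistent with the abstract's claim that only a few samples are needed in practice.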


