Graph-Free Knowledge Distillation for Graph Neural Networks

05/16/2021
by Xiang Deng, et al.

Knowledge distillation (KD) transfers knowledge from a teacher network to a student by training the student to mimic the outputs of the pretrained teacher on the training data. In many cases, however, those data samples are not accessible, due to their sheer size, privacy, or confidentiality. Many efforts have been made to address this problem for convolutional neural networks (CNNs), whose inputs lie on a grid in a continuous space, such as images and videos, but they largely overlook graph neural networks (GNNs), which handle non-grid data with varying topology in a discrete space. These inherent differences in the inputs make CNN-based approaches inapplicable to GNNs. In this paper, we propose, to the best of our knowledge, the first dedicated approach to distilling knowledge from a GNN without graph data. The proposed graph-free KD (GFKD) learns graph topology structures for knowledge transfer by modeling them with a multinomial distribution. We then introduce a gradient estimator to optimize this framework. Crucially, the gradients w.r.t. the graph structures are obtained using only GNN forward passes, without back-propagation, which means that GFKD is compatible with modern GNN libraries such as DGL and PyTorch Geometric. Moreover, we provide strategies for handling different types of prior knowledge in the graph data or the GNNs. Extensive experiments demonstrate that GFKD achieves state-of-the-art performance for distilling knowledge from GNNs without training data.
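The central idea, recovering discrete graph structure using only forward passes through the frozen teacher, can be sketched in a few lines of PyTorch. The toy below is our own illustration, not the paper's implementation: names such as teacher_gnn and edge_logits are made up, independent Bernoulli edge variables stand in for the paper's multinomial modeling, and a REINFORCE-style score-function estimator stands in for its gradient estimator. What it shares with GFKD is the key property that the teacher is never back-propagated through; only the log-probability of the sampled adjacency is differentiated.

```python
# A hedged sketch, not the authors' code: learn a discrete graph that makes a
# frozen teacher GNN confident about a chosen class, using forward passes only.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
N, D, C = 8, 16, 4                # toy sizes: nodes, feature dim, classes
W_out = torch.randn(D, C)         # stand-in readout weights of the "teacher"
feats = torch.randn(N, D)         # node features (GFKD can also learn these)
target = torch.tensor([2])        # class the fake graph should evoke

def teacher_gnn(adj, x):
    """Stand-in for the pretrained teacher: one mean-aggregation layer plus a
    linear graph-level readout. Used strictly as a black box (forward only)."""
    with torch.no_grad():
        h = (adj @ x) / (adj.sum(dim=1, keepdim=True) + 1e-8)
        return (h.mean(dim=0) @ W_out).unsqueeze(0)   # logits, shape (1, C)

# Learnable structure parameters: one logit per potential edge.
edge_logits = torch.zeros(N, N, requires_grad=True)
opt = torch.optim.Adam([edge_logits], lr=0.1)

for step in range(200):
    dist = torch.distributions.Bernoulli(logits=edge_logits)
    raw = dist.sample()                      # discrete {0,1} sample
    adj = torch.triu(raw, diagonal=1)        # keep undirected, no self-loops
    adj = adj + adj.T

    loss = F.cross_entropy(teacher_gnn(adj, feats), target)  # forward only

    # Score-function (REINFORCE) estimator: grad ≈ loss * ∇ log p(raw).
    # Only the log-probability of the sample is back-propagated, never the GNN.
    surrogate = loss * dist.log_prob(raw).sum()
    opt.zero_grad()
    surrogate.backward()
    opt.step()
```

The actual GFKD estimator differs in its details, but the sketch illustrates why no back-propagation through the GNN is required, which is precisely what makes the method compatible with libraries like DGL and PyTorch Geometric.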


Related research

10/24/2022 · Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks
We study a new paradigm of knowledge transfer that aims at encoding grap...

01/03/2023 · RELIANT: Fair Knowledge Distillation for Graph Neural Networks
Graph Neural Networks (GNNs) have shown satisfying performance on variou...

03/23/2020 · Distilling Knowledge from Graph Convolutional Networks
Existing knowledge distillation methods focus on convolutional neural ne...

12/10/2020 · Overcoming Catastrophic Forgetting in Graph Neural Networks
Catastrophic forgetting refers to the tendency that a neural network "fo...

03/04/2021 · Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework
Semi-supervised learning on graphs is an important problem in the machin...

08/18/2023 · Bridged-GNN: Knowledge Bridge Learning for Effective Knowledge Transfer
The data-hungry problem, characterized by insufficiency and low-quality ...
