Incremental Training of Graph Neural Networks on Temporal Graphs under Distribution Shift

06/25/2020
by Lukas Galke, et al.

Current graph neural networks (GNNs) are promising, especially when the entire graph is known for training. However, it is not yet clear how to efficiently train GNNs on temporal graphs, in which new vertices, edges, and even classes appear over time. We face two challenges: first, shifts in the label distribution (including the appearance of new labels), which require adapting the model; second, the growth of the graph, which at some point makes it infeasible to train over all vertices and edges. We address these issues with a sliding-window technique, i.e., we incrementally train GNNs on windows of limited size and analyze their performance. For our experiments, we compiled three new temporal graph datasets based on scientific publications and evaluated isotropic and anisotropic GNN architectures. Our results show that both GNN types provide good results even for a window size of just 1 time step. With window sizes of 3 to 4 time steps, the GNNs achieve at least 95% of the accuracy obtained when training over the entire timeline of the graph; with window sizes of 6 or 8, at least 99% is retained. These results have direct consequences for training GNNs over temporal graphs. We provide the code (https://github.com/Incremental-GNNs) and the newly compiled datasets (https://zenodo.org/record/3764770) for reproducibility and reuse.
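To make the sliding-window scheme concrete, below is a minimal sketch in plain PyTorch. It is not the authors' implementation (their code is linked above): the toy random graph, the one-layer mean-aggregation GNN, and hyperparameters such as `window_size = 3` are assumptions for illustration only. At each time step t, the model is fine-tuned on the subgraph induced by vertices from the last `window_size` time steps and then evaluated on the vertices arriving at t+1.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy temporal graph: vertices arrive in batches, one batch per time step.
T, N_PER_STEP, F_DIM, N_CLASSES = 10, 50, 16, 4   # assumed toy sizes
N = T * N_PER_STEP
time_of = torch.arange(N) // N_PER_STEP            # time step of each vertex
x = torch.randn(N, F_DIM)                          # node features
y = torch.randint(0, N_CLASSES, (N,))              # node labels
# Random symmetric adjacency with self-loops (stand-in for a citation graph).
adj = (torch.rand(N, N) < 0.02).float()
adj = ((adj + adj.t() + torch.eye(N)) > 0).float()

class OneLayerGNN(torch.nn.Module):
    """Minimal isotropic GNN: mean over neighbours, then a linear layer."""
    def __init__(self, in_dim, n_classes):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, n_classes)

    def forward(self, x, adj):
        h = adj @ x / adj.sum(dim=1, keepdim=True)  # mean aggregation
        return self.lin(h)

def window_indices(t, window_size):
    """Vertices whose time step falls inside the sliding window (t-w, t]."""
    mask = (time_of > t - window_size) & (time_of <= t)
    return mask.nonzero(as_tuple=True)[0]

window_size = 3                                    # assumed hyperparameter
model = OneLayerGNN(F_DIM, N_CLASSES)
opt = torch.optim.Adam(model.parameters(), lr=0.01)

for t in range(window_size - 1, T - 1):
    # Incremental training: only the subgraph induced by the current window.
    tr = window_indices(t, window_size)
    for _ in range(20):
        opt.zero_grad()
        loss = F.cross_entropy(model(x[tr], adj[tr][:, tr]), y[tr])
        loss.backward()
        opt.step()
    # Evaluate on the vertices of the next time step, seen within the
    # shifted window (new vertices connect to recent ones).
    ev = window_indices(t + 1, window_size)
    out = model(x[ev], adj[ev][:, ev])
    new = time_of[ev] == t + 1
    acc = (out[new].argmax(1) == y[ev][new]).float().mean()
    print(f"t={t}: trained on {len(tr)} vertices, "
          f"accuracy on step {t + 1}: {acc:.2f}")
```

The sketch only captures the windowed training loop; in the paper's setting, the window slides over publication years, new classes can appear over time, and the evaluated models are full isotropic and anisotropic GNN architectures.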
