Self-Supervised Continual Graph Learning in Adaptive Riemannian Spaces

by Li Sun, et al.

Continual graph learning routinely finds its role in a variety of real-world applications where graph data with different tasks arrive sequentially. Despite the success of prior work, it still faces great challenges. On the one hand, existing methods work in the zero-curvature Euclidean space and largely ignore the fact that curvature varies over the incoming graph sequence. On the other hand, continual learners in the literature rely on abundant labels, but labeling graphs in practice is particularly hard, especially for graphs that continuously emerge on-the-fly. To address these challenges, we explore a challenging yet practical problem: self-supervised continual graph learning in adaptive Riemannian spaces. In this paper, we propose a novel self-supervised Riemannian Graph Continual Learner (RieGrace). In RieGrace, we first design an Adaptive Riemannian GCN (AdaRGCN), a unified GCN coupled with a neural curvature adapter, so that the Riemannian space is shaped by a learnt curvature adaptive to each graph. Then, we present a Label-free Lorentz Distillation approach, in which we create a teacher-student AdaRGCN pair for the graph sequence. The student successively performs intra-distillation from itself and inter-distillation from the teacher so as to consolidate knowledge without catastrophic forgetting. In particular, we propose a theoretically grounded Generalized Lorentz Projection for contrastive distillation in Riemannian space. Extensive experiments on benchmark datasets show the superiority of RieGrace, and additionally, we investigate how curvature changes over the graph sequence.
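To make the Lorentz-model setting concrete, the sketch below shows a minimal, generic hyperboloid helper: the Lorentzian inner product and the exponential map at the origin, parameterized by a curvature value `K` that could in principle be produced per-graph by a learnt curvature adapter. This is an illustrative assumption, not RieGrace's actual AdaRGCN or Generalized Lorentz Projection (the abstract does not give those formulas); the function names `lorentz_inner` and `exp_map_origin` are hypothetical.

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian inner product <x, y>_L = -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def exp_map_origin(v, K=1.0):
    """Map a tangent vector v (with v[0] == 0) at the hyperboloid origin
    o = (sqrt(K), 0, ..., 0) onto the Lorentz model of curvature -1/K,
    i.e. the set {x : <x, x>_L = -K, x0 > 0}."""
    norm = np.linalg.norm(v[1:])          # Lorentz norm of a tangent vector at o
    o = np.zeros_like(v)
    o[0] = np.sqrt(K)
    if norm == 0.0:
        return o                          # zero vector maps to the origin itself
    a = norm / np.sqrt(K)
    return np.cosh(a) * o + np.sqrt(K) * np.sinh(a) * (v / norm)
```

A quick sanity check: points returned by `exp_map_origin` satisfy the hyperboloid constraint `lorentz_inner(x, x) == -K`, so changing `K` literally reshapes the space the embeddings live in, which is the intuition behind curvature-adaptive Riemannian learning.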


