Towards Graph Self-Supervised Learning with Contrastive Adjusted Zooming

by Yizhen Zheng, et al.

Graph representation learning (GRL) is critical for analyzing graph-structured data. However, most existing graph neural networks (GNNs) rely heavily on labelling information, which is normally expensive to obtain in the real world. Existing unsupervised GRL methods suffer from certain limitations, such as a heavy reliance on monotone contrastiveness and limited scalability. To overcome these problems, in light of recent advances in graph contrastive learning, we introduce a novel self-supervised graph representation learning algorithm via Graph Contrastive Adjusted Zooming, namely G-Zoom, which learns node representations by leveraging the proposed adjusted zooming scheme. Specifically, this mechanism enables G-Zoom to explore and extract self-supervision signals from a graph at multiple scales: micro (node level), meso (neighbourhood level), and macro (subgraph level). First, we generate two augmented views of the input graph via two different graph augmentations. Then, we establish contrastiveness at the above three scales progressively, from the node level through the neighbourhood level to the subgraph level, maximizing the agreement between graph representations across scales. While the micro and macro perspectives already extract valuable clues from a given graph, the neighbourhood-level contrastiveness gives G-Zoom a customizable option, based on our adjusted zooming scheme, to manually choose an optimal viewpoint lying between the micro and macro perspectives for a better understanding of the graph data. Additionally, to make our model scalable to large graphs, we employ a parallel graph diffusion approach to decouple model training from the graph size. We have conducted extensive experiments on real-world datasets, and the results demonstrate that our proposed model consistently outperforms state-of-the-art methods.
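The abstract describes maximizing agreement between two augmented views at three scales (node, neighbourhood, subgraph). The paper's exact objective and architecture are not given here, so the following is only an illustrative sketch: an InfoNCE-style contrastive loss applied per scale and summed, assuming embeddings for each scale are already computed for both views. The function names (`infonce_loss`, `multi_scale_loss`) and the equal per-scale weighting are our own assumptions, not the authors' formulation.

```python
import numpy as np

def cosine_sim(a, b):
    # Pairwise cosine similarity between rows of a and rows of b.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def infonce_loss(z1, z2, tau=0.5):
    # InfoNCE-style loss: row i of z1 and row i of z2 are a positive pair;
    # all other rows of z2 serve as negatives. tau is a temperature.
    sim = np.exp(cosine_sim(z1, z2) / tau)
    pos = np.diag(sim)
    return float(np.mean(-np.log(pos / sim.sum(axis=1))))

def multi_scale_loss(view1, view2, weights=(1.0, 1.0, 1.0)):
    """Sum of per-scale contrastive losses between two augmented views.

    view1, view2: dicts mapping 'node', 'neighbourhood', 'subgraph' to
    embedding matrices (one row per instance at that scale). This sketch
    weights the three scales equally by default; the actual scheme in the
    paper may differ.
    """
    scales = ('node', 'neighbourhood', 'subgraph')
    return sum(w * infonce_loss(view1[s], view2[s])
               for w, s in zip(weights, scales))
```

As a sanity check on the design: embeddings from two mildly perturbed copies of the same graph should yield a lower loss than embeddings from two unrelated graphs, since the positives stay well aligned across views.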
