Graph Learning with 1D Convolutions on Random Walks

02/17/2021
by   Jan Toenshoff, et al.

We propose CRaWl (CNNs for Random Walks), a novel neural network architecture for graph learning. It processes sequences of small subgraphs induced by random walks with standard 1D CNNs. Thus, CRaWl is fundamentally different from typical message passing graph neural network architectures. It is inspired by techniques for counting small subgraphs, such as the graphlet kernel and motif counting, and combines them with random-walk-based techniques in a highly efficient and scalable neural architecture. We demonstrate empirically that CRaWl matches or outperforms state-of-the-art GNN architectures across a multitude of benchmark datasets for graph learning.
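As a rough illustration of the idea, the sketch below samples a random walk on a toy graph, stacks the walk's node features into a sequence, and runs a standard 1D CNN over that sequence. This is only a minimal approximation under assumed names and hyperparameters, not the authors' implementation; in particular, CRaWl additionally encodes structural information about the subgraphs induced by the walk, which is omitted here.

```python
# Minimal sketch (illustrative, not the official CRaWl code): random walk
# sampling + per-node feature sequence + standard 1D CNN + pooling.
import random
import torch
import torch.nn as nn

def random_walk(adj, start, length):
    """adj: dict mapping node -> list of neighbours; returns a list of node ids."""
    walk = [start]
    for _ in range(length - 1):
        neighbours = adj[walk[-1]]
        if not neighbours:          # dead end: stop the walk early
            break
        walk.append(random.choice(neighbours))
    return walk

# Toy graph (a 4-cycle) with 3-dimensional node features.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
node_feat = torch.randn(4, 3)

walk = random_walk(adj, start=0, length=8)
seq = node_feat[walk].t().unsqueeze(0)      # shape (1, 3, walk_length)

# A plain 1D CNN over the walk sequence, pooled into a graph-level vector.
cnn = nn.Sequential(
    nn.Conv1d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
)
graph_embedding = cnn(seq).squeeze(-1)      # shape (1, 16)
print(graph_embedding.shape)
```

In practice one would sample many such walks per graph and aggregate their CNN outputs into node- or graph-level representations, which is what makes the approach scalable: the convolution sees only fixed-length windows of each walk.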


