Beyond permutation equivariance in graph networks

03/25/2021
by Emma Slade, et al.

We introduce a novel architecture for graph networks which is equivariant to the Euclidean group in n dimensions and is additionally able to handle affine transformations. Our model is designed to work with graph networks in their most general form, and thus includes particular variants as special cases. Thanks to its equivariance properties, we expect the proposed model to be more data efficient than classical graph architectures and to be intrinsically equipped with a better inductive bias. As a preliminary example, we show that the architecture that is equivariant under both the Euclidean group and affine transformations performs best on a standard dataset for graph neural networks.
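To make the equivariance idea concrete, below is a minimal, hedged sketch of an E(n)-equivariant message-passing layer in the spirit of EGNN (Satorras et al., 2021). It is not the architecture proposed in this paper and does not cover the affine case; the class and function names (EquivariantLayer, phi_e, phi_x, phi_h) are illustrative assumptions. The sketch only shows the general mechanism: learned functions see only invariant quantities (features and squared distances), while coordinates are updated with weighted sums of relative position vectors, which commute with rotations and translations.

```python
# Illustrative sketch of an E(n)-equivariant message-passing layer
# (in the spirit of EGNN, Satorras et al., 2021). NOT the architecture
# proposed in the paper above.
import torch
import torch.nn as nn


class EquivariantLayer(nn.Module):
    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Edge MLP: acts only on invariant inputs (node features + squared distance).
        self.phi_e = nn.Sequential(
            nn.Linear(2 * feat_dim + 1, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.SiLU(),
        )
        # Scalar weight for the coordinate update; keeps the update equivariant.
        self.phi_x = nn.Linear(hidden_dim, 1, bias=False)
        # Node MLP: updates invariant features from aggregated messages.
        self.phi_h = nn.Sequential(
            nn.Linear(feat_dim + hidden_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, feat_dim),
        )

    def forward(self, h, x, edge_index):
        # h: [N, feat_dim] invariant node features
        # x: [N, n]        node coordinates in R^n
        # edge_index: [2, E] (source, target) node indices
        src, dst = edge_index
        diff = x[src] - x[dst]                     # relative positions (rotate with the frame)
        dist2 = (diff ** 2).sum(-1, keepdim=True)  # squared distances (invariant)

        m = self.phi_e(torch.cat([h[src], h[dst], dist2], dim=-1))

        # Coordinate update: a weighted sum of relative vectors commutes with
        # any rotation/translation applied to x, so x_new transforms equivariantly.
        x_new = x.index_add(0, dst, diff * self.phi_x(m))

        # Feature update: aggregate invariant messages per target node.
        agg = torch.zeros(h.size(0), m.size(-1), device=h.device, dtype=h.dtype)
        agg = agg.index_add(0, dst, m)
        h_new = h + self.phi_h(torch.cat([h, agg], dim=-1))
        return h_new, x_new


# Example usage on a toy graph: 4 nodes in R^3, 3 directed edges.
h = torch.randn(4, 8)
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
layer = EquivariantLayer(feat_dim=8)
h_out, x_out = layer(h, x, edge_index)
```

Rotating or translating x before calling the layer and applying the same transformation to x_out yields identical results up to numerical precision, which is the property that gives such layers their improved inductive bias and data efficiency.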
