P-tensors: a General Formalism for Constructing Higher Order Message Passing Networks

06/19/2023
by Tianyi Sun, et al.

Several recent papers have shown that higher order graph neural networks can achieve better accuracy than their standard message passing counterparts, especially on highly structured graphs such as molecules. These models typically work by considering higher order representations of subgraphs contained within a given graph and then performing linear maps between them. We formalize these structures as permutation equivariant tensors, or P-tensors, and derive a basis for all linear maps between arbitrary order equivariant P-tensors. Experimentally, we demonstrate that this paradigm achieves state of the art performance on several benchmark datasets.
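To make the central notion concrete: a first-order P-tensor over n nodes is a vector in R^n that transforms by permutation of its entries, and an equivariant linear map must commute with every such permutation. The sketch below (an illustration, not the paper's implementation; the coefficient names a and b are hypothetical) uses the well-known two-dimensional basis for first-order-to-first-order equivariant maps, identity plus the all-ones sum map, and numerically checks the equivariance property f(Px) = P f(x).

```python
import numpy as np

def equivariant_map(x, a=2.0, b=0.5):
    # Any S_n-equivariant linear map R^n -> R^n is a combination of
    # the identity and the "sum then broadcast" map:
    #   f(x) = a * x + b * (sum of x) * 1
    # (a, b are illustrative coefficients, not from the paper.)
    return a * x + b * x.sum() * np.ones_like(x)

rng = np.random.default_rng(0)
n = 5
x = rng.normal(size=n)
perm = rng.permutation(n)

# Equivariance check: permuting the input then applying the map
# must equal applying the map then permuting the output.
lhs = equivariant_map(x[perm])
rhs = equivariant_map(x)[perm]
assert np.allclose(lhs, rhs)
```

Higher-order P-tensors generalize this picture: the basis of equivariant maps between order-k and order-l tensors grows with the number of ways index positions can be partitioned, which is what the paper's general construction enumerates.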


Related research

- SpeqNets: Sparsity-aware Permutation-equivariant Graph Networks (03/25/2022)
  While (message-passing) graph neural networks have clear limitations in ...

- Higher-Order Attention Networks (06/01/2022)
  This paper introduces higher-order attention networks (HOANs), a novel c...

- Transformers Generalize DeepSets and Can be Extended to Graphs and Hypergraphs (10/27/2021)
  We present a generalization of Transformers to any-order permutation inv...

- Graph Representation Learning with Individualization and Refinement (03/17/2022)
  Graph Neural Networks (GNNs) have emerged as prominent models for repres...

- MACE: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields (06/15/2022)
  Creating fast and accurate force fields is a long-standing challenge in ...

- Topological Graph Signal Compression (08/21/2023)
  Recently emerged Topological Deep Learning (TDL) methods aim to extend c...

- Edge-based Tensor prediction via graph neural networks (01/15/2022)
  Message-passing neural networks (MPNN) have shown extremely high efficie...
