Training Stable Graph Neural Networks Through Constrained Learning

10/07/2021
by Juan Cervino, et al.

Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data. GNNs are stable to different types of perturbations of the underlying graph, a property they inherit from graph filters. In this paper we leverage the stability property of GNNs as a starting point to seek representations that are stable within a distribution. We propose a novel constrained learning approach that imposes a constraint on the stability of the GNN under a perturbation of choice. We showcase our framework on real-world data, corroborating that we obtain more stable representations without compromising the overall accuracy of the predictor.
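
The constrained formulation can be made concrete with a small training sketch. The following is a minimal, hypothetical example of the general idea rather than the authors' implementation: a two-layer graph convolutional model is trained on a node-classification loss subject to a stability constraint that bounds the average change in the output representation under a random edge-dropping perturbation, and the constraint is handled with a Lagrangian primal-dual (dual-ascent) update. The GNN architecture, the perturbation model, the margin eps, and the dual step size dual_lr are all illustrative assumptions.

```python
# Minimal sketch of constrained GNN training with a stability constraint,
# solved by primal-dual (dual ascent) updates. NOT the authors' exact
# algorithm: the GNN layers, the edge-dropping perturbation, the margin
# eps, and the dual step size are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphConv(nn.Module):
    """One graph convolution: X <- A_hat @ X @ W (A_hat: normalized adjacency)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, x):
        return self.lin(a_hat @ x)


class GNN(nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.conv1 = GraphConv(in_dim, hid_dim)
        self.conv2 = GraphConv(hid_dim, out_dim)

    def forward(self, a_hat, x):
        h = F.relu(self.conv1(a_hat, x))
        return self.conv2(a_hat, h)


def normalize(adj):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = torch.diag(a.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a @ d_inv_sqrt


def perturb(adj, p=0.05):
    """Hypothetical perturbation model: randomly drop a fraction p of edges."""
    mask = (torch.rand_like(adj) > p).float()
    mask = torch.triu(mask, 1)
    return adj * (mask + mask.t())


# Toy data: random symmetric graph, node features, and node labels (placeholders).
n, feat, classes = 100, 16, 4
adj = torch.triu((torch.rand(n, n) > 0.9).float(), 1)
adj = adj + adj.t()
x = torch.randn(n, feat)
y = torch.randint(0, classes, (n,))

model = GNN(feat, 32, classes)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = torch.tensor(0.0)          # dual variable for the stability constraint
eps, dual_lr = 0.1, 0.05         # constraint margin and dual step size (assumed)

for epoch in range(200):
    a_hat = normalize(adj)
    a_pert = normalize(perturb(adj))

    out = model(a_hat, x)
    out_pert = model(a_pert, x)

    task_loss = F.cross_entropy(out, y)
    # Stability slack: average deviation of the representation under the perturbation.
    stability = (out - out_pert).norm(p=2, dim=1).mean() - eps

    # Primal step on the Lagrangian, then dual ascent on the constraint violation.
    loss = task_loss + lam * stability
    opt.zero_grad()
    loss.backward()
    opt.step()
    lam = torch.clamp(lam + dual_lr * stability.detach(), min=0.0)
```

In this sketch the dual variable lam grows whenever the stability constraint is violated, increasing the weight of the stability term in the primal loss; when the constraint is satisfied, lam decays back toward zero, so task accuracy is not sacrificed more than the constraint requires.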


Related research

Stability of Graph Neural Networks to Relative Perturbations (10/21/2019)
Graph and graphon neural network stability (10/23/2020)
Stability of Aggregation Graph Neural Networks (07/08/2022)
Learning Stable Graph Neural Networks via Spectral Regularization (11/13/2022)
Stability Properties of Graph Neural Networks (05/11/2019)
Towards a Unified Framework for Fair and Stable Graph Representation Learning (02/25/2021)
Robust Graph Neural Networks via Probabilistic Lipschitz Constraints (12/14/2021)
