Factor Graph Neural Networks

by Zhen Zhang et al.

In recent years, we have witnessed a surge of Graph Neural Networks (GNNs), most of which can learn powerful representations in an end-to-end fashion, with great success in many real-world applications. They bear resemblance to Probabilistic Graphical Models (PGMs) but break free from some limitations of PGMs. By aiming to provide expressive methods for representation learning instead of computing marginals or most likely configurations, GNNs provide flexibility in the choice of information-flowing rules while maintaining good performance. Despite their success, however, they lack efficient ways to represent and learn higher-order relations among variables/nodes. More expressive higher-order GNNs that operate on k-tuples of nodes require increased computational resources to process higher-order tensors. We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning. To do so, we first derive an efficient approximate Sum-Product loopy belief propagation inference algorithm for discrete higher-order PGMs. We then neuralize the novel message passing scheme into a Factor Graph Neural Network (FGNN) module by allowing richer representations of the message update rules; this facilitates both efficient inference and powerful end-to-end learning. We further show that, with a suitable choice of message aggregation operators, our FGNN can also represent Max-Product belief propagation, providing a single family of architectures that can represent both Max- and Sum-Product loopy belief propagation. Our extensive experimental evaluation on synthetic as well as real datasets demonstrates the potential of the proposed model.


