
Fisher Information Embedding for Node and Graph Learning

by   Dexiong Chen, et al.

Attention-based graph neural networks (GNNs), such as graph attention networks (GATs), have become popular neural architectures for processing graph-structured data and learning node embeddings. Despite their empirical success, these models rely on labeled data, and their theoretical properties have yet to be fully understood. In this work, we propose a novel attention-based node embedding framework for graphs. Our framework builds upon a hierarchical kernel for multisets of subgraphs around nodes (e.g., neighborhoods), and each kernel leverages the geometry of a smooth statistical manifold to compare pairs of multisets by "projecting" the multisets onto the manifold. By explicitly computing node embeddings with a manifold of Gaussian mixtures, our method leads to a new attention mechanism for neighborhood aggregation. We provide theoretical insights into the generalizability and expressivity of our embeddings, contributing to a deeper understanding of attention-based GNNs. We propose efficient unsupervised and supervised methods for learning the embeddings, with the unsupervised method not requiring any labeled data. Through experiments on several node classification benchmarks, we demonstrate that our proposed method outperforms existing attention-based graph models such as GATs. Our code is available at
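The paper's exact embedding is not reproduced here, but the classical Fisher vector gives the flavor of "projecting" a multiset of node features onto a manifold of Gaussian mixtures: each neighborhood is summarized by the gradient of its average log-likelihood under a fitted mixture. The sketch below is a minimal, hedged illustration of that idea, assuming a pre-fitted diagonal-covariance GMM; the function names (`gmm_posteriors`, `fisher_vector`) and the restriction to mean-gradients are illustrative choices, not the authors' implementation.

```python
import numpy as np

def gmm_posteriors(X, weights, means, variances):
    """Component posteriors gamma_ik for points X under a diagonal-covariance GMM."""
    N, d = X.shape
    K = len(weights)
    log_probs = np.empty((N, K))
    for k in range(K):
        diff = X - means[k]
        log_probs[:, k] = (
            np.log(weights[k])
            - 0.5 * np.sum(np.log(2 * np.pi * variances[k]))
            - 0.5 * np.sum(diff ** 2 / variances[k], axis=1)
        )
    # Normalize in log space for numerical stability.
    log_probs -= log_probs.max(axis=1, keepdims=True)
    probs = np.exp(log_probs)
    return probs / probs.sum(axis=1, keepdims=True)

def fisher_vector(X, weights, means, variances):
    """Mean-gradient Fisher vector of the multiset X (e.g., a node's
    neighborhood features) under the given Gaussian mixture."""
    N, d = X.shape
    gamma = gmm_posteriors(X, weights, means, variances)  # shape (N, K)
    parts = []
    for k in range(len(weights)):
        # Gradient of the average log-likelihood w.r.t. the k-th mean,
        # whitened by the (approximate) Fisher information normalization.
        g = gamma[:, k:k + 1] * (X - means[k]) / np.sqrt(variances[k])
        parts.append(g.sum(axis=0) / (N * np.sqrt(weights[k])))
    return np.concatenate(parts)  # fixed-size embedding of a variable-size multiset
```

Two multisets of different sizes are mapped to vectors of the same dimension (K components times d features), so they can be compared with an ordinary inner product; the soft posteriors `gamma` play a role analogous to attention weights over mixture components.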



