Neural Embeddings of Graphs in Hyperbolic Space

Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant research into applications in domains other than language. One such domain is graph-structured data, where embeddings of vertices can be learned that encapsulate vertex similarity and improve performance on tasks including edge prediction and vertex labelling. For both NLP and graph-based tasks, embeddings have been learned in high-dimensional Euclidean spaces. However, recent work has shown that the appropriate isometric space for embedding complex networks is not flat Euclidean space but negatively curved hyperbolic space. We exploit these recent insights and propose learning neural embeddings of graphs in hyperbolic space. We provide experimental evidence that embedding graphs in their natural geometry significantly improves performance on downstream tasks for several real-world public datasets.
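To make the geometry concrete, the sketch below (an illustration only, not the method described in the paper) computes distances in the Poincaré ball, one standard model of hyperbolic space. The function name `poincare_distance` and the example points are assumptions introduced here for exposition.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Distance between two points strictly inside the unit Poincare ball.

    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_diff = np.sum((u - v) ** 2)
    denom = max((1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2)), eps)
    return float(np.arccosh(1.0 + 2.0 * sq_diff / denom))

# Distances grow without bound near the boundary of the ball, so a small
# Euclidean neighbourhood near the rim can hold exponentially many
# well-separated points, which is what makes hyperbolic space a natural
# fit for hierarchies and complex networks.
u = np.array([0.1, 0.0])
v = np.array([0.0, 0.95])
print(poincare_distance(u, v))
```

A distance of this kind can, for example, stand in for the Euclidean similarity score in a skip-gram-style embedding objective, with optimisation carried out on the curved space rather than by ordinary Euclidean gradient steps; this is a sketch of the general idea, not the paper's exact formulation.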

