LP-BERT: Multi-task Pre-training Knowledge Graph BERT for Link Prediction

by Da Li, et al.

Link prediction plays a significant role in knowledge graphs, which are important resources for many artificial intelligence tasks but are often limited by incompleteness. In this paper, we propose a knowledge graph BERT for link prediction, named LP-BERT, which involves two training stages: multi-task pre-training and knowledge graph fine-tuning. The pre-training strategy not only uses the Masked Language Model (MLM) to learn knowledge from the context corpus, but also introduces a Masked Entity Model (MEM) and a Masked Relation Model (MRM), which learn relational information from triples by predicting the masked entity and relation elements from their semantics. In this way, structured triple information can be transformed into unstructured semantic information and integrated into the pre-training model together with the context corpus. In the fine-tuning phase, inspired by contrastive learning, we carry out triple-style negative sampling within each sample batch, which greatly increases the proportion of negative samples while keeping the training time almost unchanged. Furthermore, we propose a data augmentation method based on the inverse relationship of triples to further increase sample diversity. We achieve state-of-the-art results on the WN18RR and UMLS datasets; in particular, the Hits@10 metric improves by 5% over the previous state-of-the-art result on WN18RR.
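The two fine-tuning ideas described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the `_inv` relation suffix and the all-pairs negative construction are assumptions made for the example, and a real pipeline would also filter out accidental negatives that happen to be true triples.

```python
def inverse_augment(triples):
    """Inverse-relation data augmentation: for each (head, relation, tail),
    also add (tail, relation^-1, head). The "_inv" suffix is illustrative."""
    augmented = list(triples)
    for h, r, t in triples:
        augmented.append((t, r + "_inv", h))
    return augmented

def in_batch_negatives(batch):
    """Triple-style in-batch negative sampling: pair each (head, relation)
    with every other tail in the same batch, yielding |batch|^2 - |batch|
    negatives with no additional encoding cost. A real implementation
    would filter negatives that are actually true triples."""
    negatives = []
    for i, (h, r, _) in enumerate(batch):
        for j, (_, _, t) in enumerate(batch):
            if i != j:
                negatives.append((h, r, t))
    return negatives

batch = [("cat", "hypernym", "animal"), ("paris", "capital_of", "france")]
print(inverse_augment(batch))   # 4 triples: 2 originals + 2 inverses
print(in_batch_negatives(batch))  # 2 corrupted triples
```

Because the negatives reuse tails that are already encoded for the positives in the batch, the number of negative samples grows quadratically with batch size while the encoder's forward passes stay the same, which is what keeps training time almost unchanged.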




KGLM: Integrating Knowledge Graph Structure in Language Models for Link Prediction

The ability of knowledge graphs to represent complex relationships at sc...

KG-BERT: BERT for Knowledge Graph Completion

Knowledge graphs are important resources for many artificial intelligenc...

TravelBERT: Pre-training Language Model Incorporating Domain-specific Heterogeneous Knowledge into A Unified Representation

Existing technologies expand BERT from different perspectives, e.g. desi...

Inductive Entity Representations from Text via Link Prediction

We present a method for learning representations of entities, that uses ...

SMiLE: Schema-augmented Multi-level Contrastive Learning for Knowledge Graph Link Prediction

Link prediction is the task of inferring missing links between entities ...

Improving Scholarly Knowledge Representation: Evaluating BERT-based Models for Scientific Relation Classification

With the rapid growth of research publications, there is a vast amount o...

Knowledge-driven Site Selection via Urban Knowledge Graph

Site selection determines optimal locations for new stores, which is of ...
