Unified Interpretation of Softmax Cross-Entropy and Negative Sampling: With Case Study for Knowledge Graph Embedding

06/14/2021
by Hidetaka Kamigaito, et al.

In knowledge graph embedding, the theoretical relationship between the softmax cross-entropy and negative sampling loss functions has not been investigated, which makes it difficult to fairly compare results obtained with the two losses. We address this problem by using the Bregman divergence to provide a unified interpretation of the softmax cross-entropy and negative sampling loss functions. Under this interpretation, we derive theoretical findings that enable a fair comparison. Experimental results on the FB15k-237 and WN18RR datasets show that these findings hold in practical settings.

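To make the two objectives concrete, below is a minimal NumPy sketch of the softmax cross-entropy (SCE) and negative sampling (NS) losses as they are typically used in knowledge graph embedding. The scores, the number of negatives k, the uniform negative sampler, and all names are illustrative assumptions; the sketch does not reproduce the paper's Bregman-divergence analysis, only the two loss functions it compares.

import numpy as np

rng = np.random.default_rng(0)


def _sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def softmax_cross_entropy(scores, true_idx):
    """SCE loss: negative log-probability of the observed tail entity
    under a softmax over the scores of all candidate entities."""
    shifted = scores - scores.max()                  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[true_idx]


def negative_sampling_loss(pos_score, neg_scores):
    """NS loss: sigmoid-based binary objective that pulls up the score of
    the observed triple and pushes down k sampled negative triples."""
    return -np.log(_sigmoid(pos_score)) - np.log(_sigmoid(-neg_scores)).sum()


# Hypothetical scores f(h, r, t') for every candidate tail entity t';
# in a real KGE model these would come from a scoring function such as
# TransE or RotatE rather than random numbers.
num_entities, k = 1000, 10
scores = rng.normal(size=num_entities)
true_idx = 0                                         # index of the observed tail
neg_idx = rng.integers(1, num_entities, size=k)      # uniform negative sampling

print("SCE loss:", softmax_cross_entropy(scores, true_idx))
print("NS  loss:", negative_sampling_loss(scores[true_idx], scores[neg_idx]))

SCE normalizes over the full entity vocabulary, whereas NS only evaluates the positive triple and k sampled negatives, which is the asymmetry the paper's unified interpretation is meant to reconcile.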

Related research

Optimized Hybrid Focal Margin Loss for Crack Segmentation (02/09/2023)
Many loss functions have been derived from cross-entropy loss functions ...

Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels (05/10/2021)
We propose two novel loss functions based on Jensen-Shannon divergence f...

BESS: Balanced Entity Sampling and Sharing for Large-Scale Knowledge Graph Completion (11/22/2022)
We present the award-winning submission to the WikiKG90Mv2 track of OGB-...

Enhancing Knowledge Graph Embedding Models with Semantic-driven Loss Functions (03/01/2023)
Knowledge graph embedding models (KGEMs) are used for various tasks rela...

Comprehensive Analysis of Negative Sampling in Knowledge Graph Representation Learning (06/21/2022)
Negative sampling (NS) loss plays an important role in learning knowledg...

Exploring Effects of Random Walk Based Minibatch Selection Policy on Knowledge Graph Completion (04/12/2020)
In this paper, we have explored the effects of different minibatch sampl...

Turning Dross Into Gold Loss: is BERT4Rec really better than SASRec? (09/14/2023)
Recently sequential recommendations and next-item prediction task has be...
