Empirical Evaluation of Pretraining Strategies for Supervised Entity Linking

05/28/2020
by Thibault Févry, et al.

In this work, we present an entity linking model which combines a Transformer architecture with large-scale pretraining from Wikipedia links. Our model achieves state-of-the-art results on two commonly used entity linking datasets, reaching 96.7 on one of them. Further, we present detailed analyses to understand what design choices are important for entity linking, including the choice of negative entity candidates, the Transformer architecture, and input perturbations. Lastly, we present promising results in more challenging settings such as end-to-end entity linking and entity linking without in-domain training data.
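The abstract mentions scoring candidate entities (including negative candidates) for a mention. As a rough illustration only, not the authors' implementation, entity linking over a candidate set can be framed as scoring a mention encoding against candidate entity encodings and picking the argmax; the vectors and IDs below are invented toy values:

```python
import numpy as np

def score_candidates(mention_vec, candidate_vecs):
    """Dot-product scores between one mention encoding and each candidate entity encoding."""
    return candidate_vecs @ mention_vec

def link_entity(mention_vec, candidate_vecs, candidate_ids):
    """Return the ID of the highest-scoring candidate entity."""
    scores = score_candidates(mention_vec, candidate_vecs)
    return candidate_ids[int(np.argmax(scores))]

# Toy example: one mention vector, three candidate entities (hypothetical IDs).
mention = np.array([1.0, 0.0, 0.0])
candidates = np.array([
    [0.9, 0.1, 0.0],   # close to the mention: should win
    [0.0, 1.0, 0.0],   # orthogonal "negative" candidate
    [-1.0, 0.0, 0.0],  # opposing "negative" candidate
])
ids = ["Q1", "Q2", "Q3"]
print(link_entity(mention, candidates, ids))  # → Q1
```

In training, the negative candidates would contribute to a softmax cross-entropy loss over the candidate scores; the paper's analyses concern how such negatives are chosen.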
