Learning Multilingual Word Embeddings in a Latent Metric Space: A Geometric Approach

08/27/2018
by Pratik Jawanpuria, et al.

We propose a novel geometric approach for learning bilingual mappings given monolingual embeddings and a bilingual dictionary. Our approach decouples the source-to-target transformation into (a) learning rotations that align the language-specific embeddings in a common space, and (b) learning a similarity metric in that space to model similarities between the embeddings. We formulate the bilingual mapping problem as an optimization problem on smooth Riemannian manifolds. We show that our approach outperforms previous approaches on the bilingual lexicon induction and cross-lingual word similarity tasks. Since the rotated embeddings live in a common latent space, our approach readily represents multiple languages in that space, and we show that such multilingual embeddings can be learned jointly given bilingual dictionaries for multiple language pairs. We demonstrate the effectiveness of the multilingual embeddings in a zero-shot word translation setting: when no source-target bilingual dictionary is available but source-pivot and pivot-target dictionaries are, word translation via these multilingual embeddings outperforms translation through a pivot language.
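To make the decoupling concrete, below is a minimal NumPy sketch of the idea: two orthogonal matrices rotate each language's embeddings into a shared latent space, and a positive semidefinite matrix B serves as the learned similarity metric in that space. The paper optimizes its objective with Riemannian solvers on the corresponding manifolds; this sketch instead uses a simplified alternating-maximization scheme with a Frobenius-norm constraint on B, and the function names and exact objective here are illustrative assumptions rather than the authors' formulation.

```python
import numpy as np

def max_trace_rotation(M):
    """argmax_{U in O(d)} trace(U @ M): for M = P @ diag(s) @ Qt, U = Qt.T @ P.T."""
    P, _, Qt = np.linalg.svd(M)
    return Qt.T @ P.T

def psd_metric_step(C):
    """argmax over PSD B with ||B||_F <= 1 of trace(B @ C):
    the positive part of sym(C), Frobenius-normalized."""
    S = 0.5 * (C + C.T)
    w, V = np.linalg.eigh(S)
    B = (V * np.clip(w, 0.0, None)) @ V.T  # reconstruct V diag(w_+) V^T
    norm = np.linalg.norm(B)
    return B / norm if norm > 0 else np.eye(C.shape[0]) / np.sqrt(C.shape[0])

def learn_bilingual_mapping(X, Z, n_iters=50):
    """X, Z: (n, d) arrays whose rows are unit-normalized source/target
    embeddings of dictionary pairs. Alternately maximizes
    sum_i (x_i @ U_s) @ B @ (z_i @ U_t).T = trace(U_s @ B @ U_t.T @ Z.T @ X)
    over rotations U_s, U_t and the PSD similarity metric B."""
    d = X.shape[1]
    U_s, U_t = np.eye(d), np.eye(d)
    B = np.eye(d) / np.sqrt(d)
    G = Z.T @ X  # (d, d) cross-correlation accumulated over the dictionary
    for _ in range(n_iters):
        U_s = max_trace_rotation(B @ U_t.T @ G)    # align the source side
        U_t = max_trace_rotation((G @ U_s @ B).T)  # align the target side
        B = psd_metric_step(U_t.T @ G @ U_s)       # refit the metric
    return U_s, U_t, B
```

After training, a source word x is scored against a target word z as (x @ U_s) @ B @ (U_t.T @ z), which is how bilingual lexicon induction would rank candidate translations under this sketch; since each language only needs its own rotation into the shared space, additional languages can be scored pairwise in the same way.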
