Learning Joint Multilingual Sentence Representations with Neural Machine Translation

04/13/2017
by Holger Schwenk, et al.

In this paper, we use the framework of neural machine translation to learn joint sentence representations across six very different languages. Our hypothesis is that a representation which is independent of the language is likely to capture the underlying semantics. We define a new cross-lingual similarity measure, compare up to 1.4M sentence representations, and study the characteristics of close sentences. We provide experimental evidence that sentences that are close in embedding space are indeed semantically highly related, but often have quite different structure and syntax. These relations also hold when comparing sentences in different languages.
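The abstract does not spell out the cross-lingual similarity measure, but the basic operation it builds on is nearest-neighbor retrieval in a shared embedding space. The sketch below is an illustrative assumption, not the paper's method: it uses plain cosine similarity to retrieve, for each sentence embedding in one language, the closest candidate embeddings in another language, assuming both sets come from the same joint encoder.

```python
import numpy as np

def cosine_nearest_neighbors(queries, candidates, k=5):
    """Return indices and scores of the k most similar candidate embeddings
    for each query embedding, using plain cosine similarity."""
    # L2-normalize so that the dot product equals cosine similarity
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    sims = q @ c.T                                  # (n_queries, n_candidates)
    topk = np.argsort(-sims, axis=1)[:, :k]         # indices of the k best matches
    scores = np.take_along_axis(sims, topk, axis=1)
    return topk, scores

# Toy usage: 3 hypothetical "French" query embeddings against 1000 hypothetical
# "English" candidates; random vectors stand in for real encoder outputs.
rng = np.random.default_rng(0)
fr = rng.normal(size=(3, 512))
en = rng.normal(size=(1000, 512))
idx, sim = cosine_nearest_neighbors(fr, en, k=5)
print(idx.shape, sim.shape)  # (3, 5) (3, 5)
```

At the scale reported in the abstract (up to 1.4M sentence representations), an exact pairwise comparison like this is still feasible but an approximate nearest-neighbor index would typically be used instead.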

