Scalable Cross-Lingual Transfer of Neural Sentence Embeddings

04/11/2019
by   Hanan Aldarmaki, et al.

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised inference classifier InferSent and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings: it outperforms joint models in both intrinsic and extrinsic evaluations, particularly with smaller sets of parallel data.
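The sentence-mapping framework mentioned above can be illustrated with a minimal least-squares sketch: given embeddings of parallel sentence pairs, learn a linear transform from the source embedding space into the target space. The toy embeddings, dimensions, and the use of ordinary least squares here are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sentence embeddings": n parallel sentence pairs, d dimensions each.
# X holds source-language embeddings, Y the corresponding target-language ones.
n, d = 200, 16
X = rng.normal(size=(n, d))
true_map = rng.normal(size=(d, d))          # hidden relation between spaces
Y = X @ true_map + 0.01 * rng.normal(size=(n, d))  # near-linear, plus noise

# Sentence mapping: fit a linear transform W minimizing ||XW - Y||_F
# by ordinary least squares over the parallel pairs.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# A held-out source-language embedding can now be mapped into the target space.
x_new = rng.normal(size=(1, d))
mapped = x_new @ W

residual = np.linalg.norm(X @ W - Y) / np.linalg.norm(Y)
print(f"relative fit residual: {residual:.4f}")
```

Because only the mapping is trained on parallel text, each monolingual encoder can be built independently, which is what makes mapping-style (and representation-transfer) alignment modular compared with jointly trained models.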
