Dual-Alignment Pre-training for Cross-lingual Sentence Embedding

by Ziheng Li, et al.

Recent studies have shown that dual encoder models trained with the sentence-level translation ranking task are effective methods for cross-lingual sentence embedding. However, our research indicates that token-level alignment is also crucial in multilingual scenarios, which has not been fully explored previously. Based on our findings, we propose a dual-alignment pre-training (DAP) framework for cross-lingual sentence embedding that incorporates both sentence-level and token-level alignment. To achieve this, we introduce a novel representation translation learning (RTL) task, where the model learns to use one-side contextualized token representation to reconstruct its translation counterpart. This reconstruction objective encourages the model to embed translation information into the token representation. Compared to other token-level alignment methods such as translation language modeling, RTL is more suitable for dual encoder architectures and is computationally efficient. Extensive experiments on three sentence-level cross-lingual benchmarks demonstrate that our approach can significantly improve sentence embedding. Our code is available at https://github.com/ChillingDream/DAP.
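The abstract describes two complementary objectives: a sentence-level translation ranking task (an in-batch contrastive loss over pooled sentence embeddings) and the token-level RTL task, where source-side contextualized token representations are used to reconstruct their translation counterparts. The following is a minimal NumPy sketch of both objectives under stated assumptions: the function names, the single linear map standing in for the reconstruction module, and the MSE regression target are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def translation_ranking_loss(src_emb, tgt_emb, temperature=0.05):
    """Sentence-level alignment: in-batch softmax ranking (InfoNCE-style).

    Each source sentence embedding should score highest against its own
    translation among all targets in the batch.
    """
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    logits = src @ tgt.T / temperature              # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))             # diagonal = matched pairs

def rtl_loss(src_tokens, tgt_tokens, W):
    """Token-level alignment: representation translation learning (sketch).

    Predicts the translation-side token representations from the source-side
    ones; here a single linear map W is an assumed stand-in for whatever
    reconstruction module the model uses, with an MSE objective.
    """
    pred = src_tokens @ W                           # (B, T, d) reconstructed reps
    return np.mean((pred - tgt_tokens) ** 2)        # reconstruction error
```

A dual-alignment training step would then minimize a weighted sum of the two losses; because RTL operates directly on the dual encoder's token representations (rather than requiring a joint cross-attention pass, as translation language modeling does), it fits the dual-encoder setup cheaply.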




Code Repositories


ACL 2023 Dual-Alignment Pre-training for Cross-lingual Sentence Embedding
