Enabling Multi-Source Neural Machine Translation By Concatenating Source Sentences In Multiple Languages
In this paper, we propose a novel and elegant solution to "Multi-Source Neural Machine Translation" (MSNMT) that relies only on preprocessing an N-way multilingual corpus, without modifying the Neural Machine Translation (NMT) architecture or training procedure. We simply concatenate the source sentences to form a single long multi-source input sentence, while keeping the target-side sentence unchanged, and train an NMT system on this preprocessed corpus. We evaluate our method in resource-poor as well as resource-rich settings and show its effectiveness (gains of up to 4 BLEU points using 2 source languages and up to 6 BLEU points using 5 source languages) by comparing against existing MSNMT methods. We also provide insights into how the NMT system leverages multilingual information in this scenario by visualizing attention.
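As a concrete illustration of the preprocessing step described above, the sketch below merges the source sides of an N-way corpus line by line into one long multi-source input, leaving the target side untouched. The function name, file names, fixed language order, and the plain-whitespace join (rather than a special separator token) are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the abstract's preprocessing: concatenate the
# source sentences of an N-way corpus into a single long input line per
# example, keeping the target sentence as-is. All names here are assumed.

def build_multi_source_corpus(source_paths, target_path, out_src, out_tgt):
    """Merge N parallel source files into one multi-source training file."""
    source_files = [open(p, encoding="utf-8") for p in source_paths]
    try:
        with open(target_path, encoding="utf-8") as tgt, \
             open(out_src, "w", encoding="utf-8") as fs, \
             open(out_tgt, "w", encoding="utf-8") as ft:
            # Each tuple holds one aligned N-way example plus its target.
            for lines in zip(*source_files, tgt):
                *src_lines, tgt_line = [l.strip() for l in lines]
                # One long input: source sentences joined in a fixed order.
                fs.write(" ".join(src_lines) + "\n")
                ft.write(tgt_line + "\n")
    finally:
        for f in source_files:
            f.close()

# Example usage: a French+German -> English multi-source corpus
# (hypothetical file names).
build_multi_source_corpus(
    ["train.fr", "train.de"], "train.en",
    "train.multisrc", "train.multisrc.en",
)
```

The resulting pair of files can then be fed to any standard NMT toolkit unchanged, which is the point of the method: all multi-source handling happens in the data, not in the model.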