Unsupervised Neural Machine Translation with Indirect Supervision
Neural machine translation (NMT) performs poorly on zero-resource language pairs. Recent work on unsupervised neural machine translation (UNMT), which relies only on monolingual data, has achieved promising results, but a large gap remains between UNMT and NMT trained with parallel supervision. In this work, we introduce a multilingual unsupervised NMT framework that leverages weakly supervised signals from high-resource language pairs for zero-resource translation directions. More specifically, for the unsupervised language pair En-De, we make full use of information from a parallel En-Fr dataset to jointly train the unsupervised translation directions, all in one model. The framework is based on multilingual models and requires no changes to standard unsupervised NMT. Empirical results demonstrate that it significantly improves translation quality, by more than 3 BLEU points, on six benchmark unsupervised translation directions.
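To make the joint-training idea concrete, below is a minimal, illustrative Python sketch of mixing a supervised objective on a parallel pair (En-Fr) with unsupervised objectives on a zero-resource pair (En-De) in a single shared model. The function names, the mixing weight `lam`, and the choice of unsupervised objectives (denoising autoencoding and on-the-fly back-translation, which are standard in UNMT) are assumptions for illustration, not the paper's exact formulation; the loss terms return simulated scalars so the loop runs standalone.

```python
import random

random.seed(0)

# Toy stand-ins for the loss terms. In a real system each would be a
# label-smoothed cross-entropy over a shared multilingual encoder-decoder;
# here they return simulated scalar losses so the loop is runnable.
def supervised_nll(parallel_batch):
    """Cross-entropy on a parallel (e.g. En-Fr) batch: the weak supervision."""
    return random.uniform(1.0, 2.0)

def denoising_nll(mono_batch):
    """Denoising autoencoding on a monolingual batch (standard UNMT term)."""
    return random.uniform(2.0, 3.0)

def back_translation_nll(mono_batch):
    """On-the-fly back-translation round trip on a monolingual batch."""
    return random.uniform(2.0, 4.0)

def train_step(parallel_batch, mono_en_batch, mono_de_batch, lam=1.0):
    """One joint update: unsupervised objectives on En/De monolingual data
    plus a weakly supervising parallel term from En-Fr, all in one model.
    `lam` (a hypothetical hyperparameter) balances the supervised term."""
    return (denoising_nll(mono_en_batch) + denoising_nll(mono_de_batch)
            + back_translation_nll(mono_en_batch)
            + back_translation_nll(mono_de_batch)
            + lam * supervised_nll(parallel_batch))

for step in range(3):
    loss = train_step("en-fr batch", "en batch", "de batch")
    print(f"step {step}: joint loss = {loss:.3f}")
```

Because all directions share one set of parameters, the gradient from the supervised En-Fr term shapes the same encoder-decoder used for the unsupervised En-De directions, which is how the indirect supervision transfers.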