Near or Far, Wide Range Zero-Shot Cross-Lingual Dependency Parsing

11/01/2018
by Wasi Uddin Ahmad, et al.

Cross-lingual transfer is the major means to leverage knowledge from high-resource languages to help low-resource languages. In this paper, we investigate cross-lingual transfer across a broad spectrum of language distances. We posit that Recurrent Neural Network (RNN)-based encoders, since they explicitly incorporate surface word order, are brittle for transferring across distant languages, while self-attentive models are more flexible in modeling word order information and would thus be more robust in the cross-lingual transfer setting. We test our hypothesis by training dependency parsers on an English corpus only and evaluating them on 31 other languages. With detailed analysis, we find interesting patterns showing that RNN-based architectures can transfer well to languages that are close to English, while self-attentive models have better cross-lingual transferability across a wide range of languages.
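
As a minimal illustration of the order-sensitivity contrast the abstract draws (this is not code from the paper), the PyTorch sketch below shows that a self-attention layer without positional encodings is permutation-equivariant over the input tokens, while a GRU's hidden states depend on the surface left-to-right order. The embedding size, head count, and toy "sentence" are arbitrary choices for the demo.

```python
# Sketch: order sensitivity of an RNN encoder vs. a self-attention encoder.
# Without positional encodings, self-attention is permutation-equivariant:
# shuffling the input tokens merely shuffles the outputs, whereas an RNN's
# hidden states change entirely.
import torch
import torch.nn as nn

torch.manual_seed(0)
d = 16                      # embedding/hidden size (arbitrary for the demo)
x = torch.randn(1, 5, d)    # a toy "sentence" of 5 token embeddings
perm = torch.randperm(5)
x_perm = x[:, perm, :]      # the same tokens in a different surface order

# Self-attention encoder (no positional encoding added).
attn = nn.MultiheadAttention(embed_dim=d, num_heads=4, batch_first=True)
y, _ = attn(x, x, x)
y_perm, _ = attn(x_perm, x_perm, x_perm)
# Outputs match up to the permutation: word order barely matters.
print(torch.allclose(y[:, perm, :], y_perm, atol=1e-5))   # expected: True

# RNN encoder: each state summarizes the prefix in left-to-right order.
rnn = nn.GRU(input_size=d, hidden_size=d, batch_first=True)
h, _ = rnn(x)
h_perm, _ = rnn(x_perm)
print(torch.allclose(h[:, perm, :], h_perm, atol=1e-5))   # expected: False
```

This is only the degenerate case (no positional information at all); the point it illustrates is that how order is injected into a self-attentive encoder is a design choice, whereas an RNN is tied to surface order by construction.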
