Adapting BigScience Multilingual Model to Unseen Languages

04/11/2022
by Zheng Xin Yong, et al.

We benchmark different strategies for adding new languages (German and Korean) to BigScience's pretrained multilingual language model with 1.3 billion parameters, which currently supports 13 languages. We investigate the factors that affect the model's language adaptability and the trade-offs between computational cost and expected performance.
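The abstract does not name the specific adaptation strategies benchmarked. As an illustrative sketch only, and not the paper's method, one common recipe for adding an unseen language to a pretrained model is to extend the tokenizer vocabulary, resize the embedding matrix, and train only the embeddings while freezing the rest of the network, which keeps computational cost low. The checkpoint name and token list below are placeholders, not the paper's setup.

```python
# A minimal sketch of one language-adaptation strategy, using the Hugging Face
# transformers API. Checkpoint name and tokens are hypothetical placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-1b3"  # placeholder checkpoint (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Add tokens covering the new language (e.g., German or Korean); in practice
# these would come from training a tokenizer on new-language text.
new_tokens = ["Beispieltoken", "예시토큰"]  # hypothetical new-language tokens
num_added = tokenizer.add_tokens(new_tokens)

# Grow the embedding matrix so the new tokens get trainable rows; existing
# rows for the original 13 languages are preserved.
model.resize_token_embeddings(len(tokenizer))

# A cheap adaptation variant: freeze the transformer body and train only the
# embeddings, trading some expected performance for lower compute.
for param in model.parameters():
    param.requires_grad = False
model.get_input_embeddings().weight.requires_grad = True
```

A more expensive alternative is continued pretraining of all 1.3 billion parameters on new-language text, which is the kind of cost-versus-performance trade-off the abstract refers to.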
