Adapting BigScience Multilingual Model to Unseen Languages

Zheng-Xin Yong, Vassilina Nikoulina

We benchmark different strategies for adding new languages (German and Korean) to BigScience's pretrained multilingual language model with 1.3 billion parameters, which currently supports 13 languages. We investigate the factors that affect the model's language adaptability and the trade-offs between computational cost and expected performance.
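One common strategy in this space is continued pretraining on monolingual text in the new language. Below is a minimal sketch of that approach using Hugging Face Transformers; it is not the paper's exact setup. The checkpoint identifier is a placeholder, and the OSCAR German subset stands in for whatever adaptation corpus one would actually use.

```python
# A hedged sketch of language adaptation via continued pretraining of a
# multilingual causal LM on monolingual data in the new language.
# MODEL_NAME below is a hypothetical placeholder, not the paper's checkpoint.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "bigscience/multilingual-1b3"  # placeholder checkpoint id

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Monolingual corpus in the target language (German here); a small slice
# keeps the sketch cheap to run.
dataset = load_dataset("oscar", "unshuffled_deduplicated_de", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# mlm=False configures the collator for causal (next-token) language modeling.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="adapted-de",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    learning_rate=1e-4,
    fp16=True,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

Continued pretraining updates all 1.3B parameters, which is the most expensive end of the cost/performance trade-off the paper studies; lighter-weight alternatives (e.g., training only embeddings or small adapter modules) trade some performance for much lower compute.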
