Research Scientist at Cohere For AI (@CohereForAI, @cohere). Formerly @GroNlp, @naverlabseurope.
Oct 22, 2021 • 8 tweets • 4 min read
📜 Excited to share our new work:
So you have a nice multilingual translation model? Congrats 🎉
...
...but what do you do if you want to add a new language (e.g., 🇳🇱) and don't have parallel data (🏴 – 🇳🇱)?
Bonus ✨: you can finally get rid of back-translation
🧵1/8
If you take a multilingual language model like mBART, add task adapters, and fine-tune them together with cross-attention for translation ➡️ this works well for your supervised pairs, but for your new language 🇳🇱, mBART forgets everything it learned before:
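For readers unfamiliar with task adapters: the idea is a small bottleneck layer inserted into a frozen pretrained model, so that only the adapter weights are trained. A minimal PyTorch sketch (the class name, dimensions, and activation here are illustrative assumptions, not the exact setup from the paper):

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Small bottleneck MLP inserted after a (frozen) transformer sublayer.

    Illustrative sketch: in practice one adapter is added per layer of a
    pretrained model (e.g. mBART), and only these parameters are trained.
    """
    def __init__(self, d_model: int, d_bottleneck: int):
        super().__init__()
        self.down = nn.Linear(d_model, d_bottleneck)  # project down
        self.up = nn.Linear(d_bottleneck, d_model)    # project back up
        self.act = nn.ReLU()

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # Residual connection: the pretrained representation passes through
        # untouched, and the adapter learns a small additive correction.
        return hidden + self.up(self.act(self.down(hidden)))

# Toy usage: hidden states shaped (batch, seq_len, d_model)
adapter = BottleneckAdapter(d_model=1024, d_bottleneck=64)
hidden_states = torch.randn(2, 10, 1024)
out = adapter(hidden_states)
print(out.shape)  # torch.Size([2, 10, 1024])
```

Because the backbone stays frozen, each new language or task only costs the (tiny) adapter parameters, which is what makes this setup attractive for extending a multilingual model.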