Multilingual pre-training is known to improve the performance of deep networks on low-resource languages (i.e., those that lack sufficient training data). But whether multilingual pre-training damages performance on high-resource languages is less clear. 🧵[1/5]
For BERT-style models like XLM-R (bit.ly/3Ww8qjY by @alex_conneau), models pre-trained over multilingual corpora (given proper tuning) can match the performance of monolingual models on GLUE for high-resource languages like English and French. [2/5]
Recent research on large language models (bit.ly/3zI5JSQ by @Fluke_Ellington), however, indicates that multilingual pre-training significantly damages the zero-shot generalization performance of LLMs in English. [3/5]
The basic takeaway from these results is that the curse of multilinguality (bit.ly/3NvW5bI) is real! Multilingual pre-training is sometimes damaging, but the damage can be mitigated by using larger models or tuning the sampling ratio between languages. [4/5]
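As an aside: "tuning the sampling ratio between languages" usually means exponentiated (temperature-based) sampling over per-language corpus sizes, as in XLM-R, where the probability of sampling from language i is proportional to (n_i/N)^α. A minimal sketch below, with made-up corpus sizes (the language set and counts are illustrative, not from any real corpus):

```python
def sampling_probs(sizes, alpha=0.3):
    """Exponentiated sampling: p_i ∝ (n_i / N)^alpha.

    alpha=1 samples proportionally to data size; alpha<1 upsamples
    low-resource languages (XLM-R uses alpha=0.3).
    """
    total = sum(sizes.values())
    weights = {lang: (n / total) ** alpha for lang, n in sizes.items()}
    z = sum(weights.values())
    return {lang: w / z for lang, w in weights.items()}

# Hypothetical per-language corpus sizes (in sentences).
corpus_sizes = {"en": 300_000_000, "fr": 60_000_000, "sw": 1_000_000}
probs = sampling_probs(corpus_sizes, alpha=0.3)
```

With alpha=0.3, Swahili's sampling probability rises from ~0.3% (its raw share of the data) to ~10%, trading a little high-resource performance for much better low-resource coverage — which is exactly the knob the curse-of-multilinguality results suggest tuning.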
Thanks to @davisblalock for pointing out this issue in his awesome newsletter (dblalock.substack.com). If you find these threads useful, feel free to subscribe to my newsletter as well (cameronrwolfe.substack.com)! [5/5]

#DeepLearning #NLP
