Transformers can read and write, but how well can they listen and speak 🗣️?

Find out by pitting your models against the SUPERB Challenge 📊!

SUPERB tests pretrained models on a wide range of speech processing tasks & datasets.

Submit here 👉: superbbenchmark.org

SUPERB aims to accelerate the development of *universal* speech representations 🤯

You can find all the details about the challenge here 👉: superbbenchmark.org/challenge
Participants of the SUPERB Challenge can submit their results to the AAAI 2022 Workshop: The 2nd Self-supervised Learning for Audio and Speech Processing 🤖

The winners will be invited to present their methods 🏅🏅🏅!

Workshop details here 👉: aaai-sas-2022.github.io
To help you get started, the SUPERB team has released the S3PRL framework 🙌

It provides all the tools you need to *benchmark* your speech representation models!

github.com/s3prl/s3prl
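
Curious what that looks like in practice? Here's a minimal sketch of extracting speech representations with S3PRL's upstream hub API, based on the repo's README ("wav2vec2" as the upstream name and the "hidden_states" output key are assumptions; check the repo for the current interface):

```python
import torch
import s3prl.hub as hub

# Load a pretrained upstream model registered in the S3PRL hub.
upstream = getattr(hub, "wav2vec2")()
upstream.eval()

# A batch of raw 16 kHz waveforms (dummy data here; one 1-D tensor per utterance).
wavs = [torch.randn(16000) for _ in range(4)]

with torch.no_grad():
    # Each entry in "hidden_states" is one layer of representations
    # that the downstream SUPERB tasks can probe.
    reps = upstream(wavs)["hidden_states"]

print(len(reps), reps[0].shape)  # number of layers, (batch, frames, dim)
```
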
Your pretrained models will be hosted on the Hugging Face Hub, so you can easily share results with your team 🥳!

We've provided a template repo to help you get started 👉: huggingface.co/superb/superb-…
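
Not sure what Hub hosting looks like? A minimal sketch, assuming you use 🤗 Transformers' push_to_hub after logging in with your Hub token (the repo id below is hypothetical):

```python
from transformers import Wav2Vec2Model

# Load (or train) your speech representation model...
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base")

# ...then push it to the Hugging Face Hub so your team can pull the same weights.
# "my-org/my-superb-upstream" is a hypothetical repo id; authenticate first,
# e.g. via `huggingface-cli login`.
model.push_to_hub("my-org/my-superb-upstream")
```
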
Hosting an ambitious benchmark like this involves the collaboration of many organizations, and we are super excited to see SUPERB help democratize the development of speech processing technologies!


