The results for the JAX/Flax sprint are in 🎉. Here are the top three projects picked by our awesome jury @wightmanr, @nikiparmar09, @ashVaswani, and @Thom_Wolf:
🥇 DALL.E mini hf.co/spaces/flax-co…
🥈 CLIP+NeRF hf.co/spaces/flax-co…
🥉 CLIP for Satellite images hf.co/spaces/sujitpa…
Starting today, all 🤗 Spaces are publicly viewable 🚀 You can find all the amazing demos created as part of the sprint here 👉 huggingface.co/spaces
This has been the largest Hugging Face event yet, and we're extremely excited by the results. Almost 800 members joined and produced almost 100 projects, 170 models, and 36 Spaces! 🤯 That is super impressive given the timeframe of the event!
We would like to thank @avitaloliver, @marcvanzee, @walkforhours, Skye Wanderman-Milne, Jonathan Heek, and the whole JAX, Flax, and Cloud TPU teams for making this event possible 🤗. And thank you to all the participants for making this an awesome event!

• • •

More from @huggingface

11 Feb
We've been getting lots of questions on how to contribute models to 🤗 Transformers.

Recently, we started publishing model-specific recipes on how to do so!

If you want to get better at open-source contributions and want to contribute to 🤗 Transformers, here is how it works: 👇
1. Watch out for open proposals to add a model here:
github.com/huggingface/tr…
2. Having read the explanation, if the project interests you and you think you can finish it within ~6 weeks (with the help of the Hugging Face team, of course), please send us a message at team@huggingface.co. A rough sketch of the kind of code a model contribution involves is shown below.
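For a flavour of what this looks like in practice, here is a hedged, minimal sketch of the skeleton a new model usually starts from. The names (BrandNewConfig, BrandNewModel) and sizes are made up purely for illustration, not an actual proposal; each open issue links to the real, model-specific recipe.

```python
# Illustrative only: BrandNewConfig / BrandNewModel are hypothetical names.
import torch
from torch import nn
from transformers import PretrainedConfig, PreTrainedModel


class BrandNewConfig(PretrainedConfig):
    model_type = "brand_new"

    def __init__(self, vocab_size=30522, hidden_size=768, num_hidden_layers=12, **kwargs):
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size
        self.num_hidden_layers = num_hidden_layers
        super().__init__(**kwargs)


class BrandNewModel(PreTrainedModel):
    config_class = BrandNewConfig

    def __init__(self, config):
        super().__init__(config)
        self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size)
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(config.hidden_size, nhead=12, batch_first=True)
            for _ in range(config.num_hidden_layers)
        )

    def forward(self, input_ids):
        # Embed the token ids, then run them through the encoder stack.
        hidden_states = self.embeddings(input_ids)
        for layer in self.layers:
            hidden_states = layer(hidden_states)
        return hidden_states


# Quick smoke test on random token ids.
config = BrandNewConfig(num_hidden_layers=2)
model = BrandNewModel(config)
print(model(torch.randint(0, config.vocab_size, (1, 8))).shape)  # torch.Size([1, 8, 768])
```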
10 Feb
Blog alert: check out the new guest post by Amog Kamsetty and the @raydistributed team on training a Retrieval-Augmented Generation (RAG) model with Hugging Face and Ray!

huggingface.co/blog/ray-rag
The RAG model by @olapiktus, @PSH_Lewis, and @facebookai colleagues leverages external knowledge sources like Wikipedia for direct, dynamic access to information at inference time.

Part of this process relies on training a retriever that learns how to find that information.

@raydistributed is a framework-agnostic, flexible implementation for ad-hoc concurrent programming, which makes it ideal for scaling up this training: it makes retrieval 2x faster and drastically improves the scalability of RAG distributed fine-tuning.

Go try it out for yourself!
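If you'd like a quick feel for RAG before setting up Ray, here is a minimal inference sketch (not taken from the blog post) using the pretrained model in 🤗 Transformers with the small dummy retrieval index, so it runs locally; it assumes datasets and faiss-cpu are installed.

```python
# Minimal RAG inference with the dummy retrieval index (requires transformers,
# torch, datasets, faiss-cpu). The full Wikipedia index and the Ray-based
# distributed retriever are what the blog post covers.
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-sequence-nq")
retriever = RagRetriever.from_pretrained(
    "facebook/rag-sequence-nq", index_name="exact", use_dummy_dataset=True
)
model = RagSequenceForGeneration.from_pretrained(
    "facebook/rag-sequence-nq", retriever=retriever
)

inputs = tokenizer("who wrote the origin of species", return_tensors="pt")
generated = model.generate(input_ids=inputs["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```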
6 Mar 20
1/4. Four NLP tutorials are now available on @kaggle! It's now easier than ever to leverage tokenizers and transformer models like BERT, GPT-2, RoBERTa, XLNet, DistilBERT, ... for your next competition! 💪💪💪 #NLProc #NLP #DataScience #kaggle
2/4. Tokenizers - Training your own tokenizer kaggle.com/funtowiczmo/hu…
3/4. Transformers - Getting started with transformers kaggle.com/funtowiczmo/hu…
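In the spirit of the tokenizers tutorial, here is a hedged sketch of training your own WordPiece tokenizer with the 🤗 tokenizers library; the corpus.txt path is illustrative, and the Kaggle notebook walks through the real setup.

```python
# Train a WordPiece tokenizer from scratch on a plain-text corpus
# (corpus.txt is a placeholder path).
from tokenizers import BertWordPieceTokenizer

tokenizer = BertWordPieceTokenizer(lowercase=True)
tokenizer.train(files=["corpus.txt"], vocab_size=30_000, min_frequency=2)
tokenizer.save_model(".")  # writes vocab.txt next to the script

encoding = tokenizer.encode("Hugging Face makes NLP easy!")
print(encoding.tokens)
```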
