
At #ReplitDevDay, we announced we’ve trained and are open-sourcing our first Complete Code model.

Introducing replit-code-v1-3b:

- 2.7B params
- 20 languages
- 525B tokens
- 40% better than comparable models
- Trained in 10 days

Take a look at the benchmarks yourself 🧵
replit-code-v1-3b and replit-finetuned-v1-3b were trained entirely on code and are designed for single-line code completion.
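Single-line completion means a caller keeps only the text up to the first newline the model emits. A minimal post-processing sketch (the function name and sample completion are ours, not Replit's):

```python
def first_line(generated: str) -> str:
    """Truncate a model completion at the first newline,
    since single-line code completion discards everything after it."""
    return generated.split("\n", 1)[0]

# Hypothetical raw completion spilling past one line:
raw = "return a + b\nprint('extra, unwanted line')"
print(first_line(raw))  # prints: return a + b
```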

We didn’t expect either to perform so well on HumanEval, but they did.

replit-finetuned-v1-3b outperformed all OSS code models, even those 5x its size.
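For context, HumanEval scores a model by whether its completion of a function signature passes the problem's hidden unit tests; pass@1 is the fraction of problems solved by the first sample. A toy sketch of that scoring loop (the sample problem is illustrative, not from the actual benchmark, and the official harness sandboxes execution):

```python
def passes(prompt: str, completion: str, test_code: str) -> bool:
    """A completion 'passes' if the assembled program runs the
    problem's unit tests without raising an error."""
    program = prompt + completion + "\n" + test_code
    try:
        exec(program, {})  # sandboxing omitted in this sketch
        return True
    except Exception:
        return False

def pass_at_1(results: list) -> float:
    """pass@1: fraction of problems solved on the first sample."""
    return sum(results) / len(results)

# Hypothetical problem: prompt, model completion, and hidden tests.
prompt = "def add(a, b):\n"
completion = "    return a + b\n"
tests = "assert add(2, 3) == 5\nassert add(-1, 1) == 0\n"

print(pass_at_1([passes(prompt, completion, tests)]))  # 1.0
```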
Both models also benchmark impressively well against commercial models.

replit-finetuned-v1-3b is by far the smallest model in the table, yet it outperformed Codex and LLaMA.

PaLM-Coder is 200x larger, and we're closing in on its performance with much better latency.