That website is generated automatically from the notebooks in this repo.
Take a look around -- all the things you'd hope to see in a high-quality project are there. It's all done for you by nbdev. For instance, see the nice README? Built from a notebook! github.com/fastai/nbdev/
nbdev v1 is already recommended by experts, and v2 is a big step up again.
"From my point of view it is close to a Pareto improvement over traditional Python library development." Thank you @erikgaas 😀
Here's an example of the beautiful and useful docs that are auto-generated by nbdev+Quarto. nbdev.fast.ai/merge.html
Here's an example of an exported function in a notebook cell. This is automatically added to the python module, and the documentation you see on the right is auto-generated. nbdev.fast.ai/merge.html#nbd…
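For a sense of what that looks like, here's a minimal sketch of an exported notebook cell (the `#| export` directive is nbdev's; the function itself is just an illustrative example, not one from the thread):

```python
#| export
def say_hello(to: str) -> str:
    "Say hello to somebody."
    # nbdev copies cells marked #| export into the generated Python module,
    # and renders the docstring and signature into the docs site.
    return f"Hello {to}!"
```

Running `nbdev_export` then writes this function into the library's `.py` module, while the notebook stays the single source of truth for code, docs, and tests.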
Every time we update a notebook to change the docs, library, or tests, everything is checked by @github Actions automatically
Here's the @pypi pip installer that's auto-generated. See the description? That's created for you from the notebook you use for your documentation homepage (just like the README, and the description for your conda package) pypi.org/project/nbdev/
I've barely scratched the surface in this brief tweet thread! For much more information, take a look at the blog post authored with @HamelHusain fast.ai/2022/07/28/nbd…
This launch wouldn't have been possible without some amazing people. I'd especially like to highlight Hamel & @wasimlorgat, who made nbdev2 a far better product than it would have been without them, JJ Allaire @fly_upside_down & the @quarto_pub team, & the @fastdotai community
• • •
Today, with @Tim_Dettmers, @huggingface, & @mobius_labs, we're releasing FSDP/QLoRA, a new project that lets you efficiently train very large (70b) models on a home computer with consumer gaming GPUs. 1/🧵 answer.ai/posts/2024-03-…
"With this capability we can take huge models to new heights locally, and gigantic, hundreds of billions of parameter models are now accessible by small labs", says legendary model builder @Teknium1
As the name suggests, this project combines two key pieces: @Tim_Dettmers' QLoRA, which lets us train models ~4x smaller by using quantization, with @AIatMeta's FSDP, which shards large models across multiple GPUs.
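The core idea can be sketched in a few lines of NumPy. This is only a toy illustration of the QLoRA half (not the real bitsandbytes or FSDP code, and the quantization scheme here is a crude stand-in for NF4): the big base weight sits in memory quantized and frozen, while two small low-rank adapter matrices are trained in full precision.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 2  # hidden size and LoRA rank -- tiny, illustrative numbers

# Frozen base weight, stored quantized (crude symmetric int8 for illustration;
# the real thing uses 4-bit NF4 with per-block scales)
W = rng.normal(size=(d, d)).astype(np.float32)
scale = np.abs(W).max() / 127
W_q = np.round(W / scale).astype(np.int8)   # this is what actually sits in memory

# Trainable low-rank adapters in full precision; B starts at zero so the
# adapted layer initially computes exactly the same thing as the base layer
A = rng.normal(size=(r, d)).astype(np.float32) * 0.01
B = np.zeros((d, r), dtype=np.float32)

def adapted_forward(x):
    W_deq = W_q.astype(np.float32) * scale  # dequantize on the fly
    return x @ W_deq.T + x @ A.T @ B.T      # frozen base path + trainable LoRA path

x = rng.normal(size=(4, d)).astype(np.float32)
y = adapted_forward(x)
print(y.shape)  # (4, 16)
```

FSDP's contribution is orthogonal to this sketch: it shards `W_q` (and everything else) across the available GPUs so that no single card has to hold the whole model.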
Currying and composition in a nutshell (with APL).
(This is easier for primary school children to learn than many things they are taught. At least according to the primary school kids I've taught it to.)
Hopefully it's obvious from the context what "∘" does, but if not, look at this link (and the link there to "beside"): aplwiki.com/wiki/Bind
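For readers who'd rather see the same two ideas in Python than APL, here's a minimal sketch (the names are mine, not from the thread): `partial` plays the role of APL's bind, and `compose` plays the role of "beside".

```python
from functools import partial

def compose(f, g):
    "Function composition: compose(f, g)(x) == f(g(x)), like APL's f∘g."
    return lambda x: f(g(x))

def add(a, b):
    return a + b

inc = partial(add, 1)          # currying/bind: fix the first argument of add
double = lambda x: x * 2

inc_then_double = compose(double, inc)
print(inc_then_double(5))      # double(inc(5)) = double(6) = 12
```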
There are few things more important to our civilization than understanding how to better do R&D. Thankfully, @eric_is_weird has dedicated himself to studying this question.
As a result, he's become the foremost scholar and historian of 19th and 20th century R&D labs.
1/🧵
We are incredibly lucky that @eric_is_weird has taken a strong interest and decided to do a deep dive into our organizational structure and R&D approach.
Examples are provided (but they need to be run directly against the GPT-4 API with temperature 0) so you can check the claims.
I think a key question (assuming that running the provided prompts reproduces the result) is whether the prompts encode too much problem-specific information
OK everyone's asking me for my take on the OpenAI stuff, so here it is. I have a strong feeling about what's going on, but no internal info so this is just me talking.
The first point to make is that the Dev Day was (IMO) an absolute embarrassment.
I could barely watch the keynote. It was just another bland corp-speak bunch of product updates.
For the researchers I know who were involved from the beginning, this must have felt nausea-inducing.
The plan was AGI, lifting society to a new level. We got Laundry Buddy.
When OAI was founded I felt like it was gonna be a rough ride. It was created by a bunch of brilliant researchers that I knew and respected, plus some huge names from outside the field: Elon, GDB, and sama, none of whom I'd ever come across at any AI/ML conference or meetup.
If you're like me and find it easier to read *code* than *math*, and you have access to @OpenAI GPT-4V (or use @bing or @google Bard), try pasting an image of an equation you wanna understand in there.
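As an example of why this works, take a formula like softmax, softmax(x)_i = exp(x_i) / Σ_j exp(x_j). The code version (my own sketch, not model output) spells out each step that the notation compresses:

```python
import math

def softmax(xs):
    "softmax(x)_i = exp(x_i) / sum_j exp(x_j)"
    m = max(xs)                              # subtract the max so exp() can't overflow
    exps = [math.exp(x - m) for x in xs]     # the exp(x_i) terms
    total = sum(exps)                        # the sum_j exp(x_j) normalizer
    return [e / total for e in exps]

print(softmax([1.0, 2.0, 3.0]))  # three probabilities summing to 1
```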