After 2 years, Practical Deep Learning for Coders v5 is finally ready! 🎊
This is a from-scratch rewrite of our most popular course. It has a focus on interactive explorations, & covers @PyTorch, @huggingface, DeBERTa, ConvNeXt, @Gradio & other goodies 🧵 course.fast.ai
For details on what's in this new course, check out the launch post: fast.ai/2022/07/21/dl-…
There are 9 lessons, each around 90 minutes long. It's based on our 5⭐-rated book, which is freely available online. No special hardware or software is needed: we show how to use free resources for everything. amazon.com/Deep-Learning-…
University math isn’t needed either: the necessary calculus and linear algebra are introduced as needed during the course. course.fast.ai/Lessons/lesson…
Since first using neural nets >25 years ago, I've led many companies and projects with ML at their core, including the first company to focus on deep learning and medicine, and the first to develop a fully optimised pricing algorithm for insurance
Many students have told us how they've won gold medals in ML competitions, received offers from top companies, and had research papers published.
Alums have gone on to jobs at organizations like Google Brain, OpenAI, Adobe, Amazon, and Tesla course.fast.ai/Resources/test…
If you're ready to start your deep learning journey, we're ready to support you! course.fast.ai
Big thanks to @quarto_pub, which I used to create the course website (along with #nbdev)
Also, a big hat-tip to @OpenAI for making DALL-E available, so that I could ensure there's an adorable bunny or teddy illustration for every lesson! My favorite is the "data ethics" illustration... course.fast.ai/Lessons/lesson…
Nearly all the materials for this course can be run directly on @kaggle GPUs, which means you can run them in the cloud for free, with nothing to install!
I led the team that studied mask efficacy in early 2020 and published our results in the Proceedings of the National Academy of Sciences.
I spent three months earlier this year revisiting this topic, and today I'm publishing my notes and links here: fast.ai/2022/07/04/upd…
An admission: these notes were meant to be the basis of another academic paper, but I gave up on it. When I finished this research in Jan 2022, I looked around, and it seemed like no one much cared about avoiding COVID any more.
So I figured it wasn't worth spending time on.
In the last couple of weeks, there are signs that folks might be more open to protecting themselves and others by wearing a mask.
But the vast majority of public health advice I see on mask use is scientifically inaccurate. So I'm digging out this research for you.
Things got slow if you imported `fastcore.xtras`, a module that wraps a bunch of Python stdlib functionality in convenient interfaces. It's used by `fastcore.net`, `fastcore.parallel`, and `fastcore.utils`, so it comes up a lot.
But to do that, `fastcore.xtras` had to import a *lot* of stuff from Python's stdlib!...
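One common remedy for this kind of startup cost is to defer each heavy import until it's actually used. Here's a minimal sketch of the general technique (the `LazyModule` class is hypothetical, not what fastcore actually does):

```python
import importlib

class LazyModule:
    "Proxy that defers the real import until an attribute is first accessed."
    def __init__(self, name):
        self._name = name
        self._mod = None

    def __getattr__(self, attr):
        # Only runs for attributes not set in __init__, so the real
        # module is imported exactly once, on first real use
        if self._mod is None:
            self._mod = importlib.import_module(self._name)
        return getattr(self._mod, attr)

# Importers of this module no longer pay for shutil at import time:
shutil = LazyModule("shutil")
```

The trade-off is that import errors surface at first use rather than at startup, which can make failures harder to trace.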
One of my fave chapters of "Practical Deep Learning for Coders", co-written with @GuggerSylvain, is chapter 8. I've just made the whole thing available as an executable notebook on Kaggle!
The chapter looks at the "matrix completion" problem at the heart of recommendation systems -- e.g. what would you guess are the missing values in this matrix showing what ratings users gave movies?
The key idea is to find the "latent factors" behind people's preferences
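The idea can be sketched in a few lines of NumPy: factor the ratings matrix into per-user and per-movie latent vectors, fitting only the observed entries (toy numbers below, not the book's actual example or code):

```python
import numpy as np

# Toy ratings matrix: 4 users x 3 movies, 0 marks a missing rating
R = np.array([[5, 3, 0],
              [4, 0, 4],
              [0, 2, 5],
              [1, 0, 4]], dtype=float)
mask = R > 0                    # fit only the observed entries

rng = np.random.default_rng(0)
k = 2                           # number of latent factors
U = rng.normal(scale=0.1, size=(4, k))   # user factors
M = rng.normal(scale=0.1, size=(3, k))   # movie factors

lr = 0.01
for _ in range(10_000):         # plain gradient descent on squared error
    err = (U @ M.T - R) * mask  # error on observed cells only
    U -= lr * err @ M
    M -= lr * err.T @ U

pred = U @ M.T                  # the zero cells now hold guessed ratings
```

The dot product of a user's factors with a movie's factors is the predicted rating, so the previously missing cells of `pred` are the model's guesses.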
Often we want to predict more than one dependent variable in a neural network. For instance, in the current @kaggle "Paddy Doctor" competition, both a paddy disease and a rice variety are provided for each image.
In the pic in the previous tweet, you can see that each image is associated with two outputs: disease, and variety.
Here's all the code needed to create DataLoaders which provide that data to a model (see the notebook for details on what every line does)
To train our model to predict these two outputs, we need to create the two parts of our loss function, and update the Learner to create enough outputs for our needs
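The combined loss boils down to splitting the model's single output vector into one chunk per target and summing a cross-entropy loss over each chunk. Here's a minimal NumPy sketch of that idea (the class counts are placeholders, and this isn't the notebook's exact code):

```python
import numpy as np

N_DISEASE, N_VARIETY = 10, 10   # placeholder class counts

def log_softmax(x):
    # Subtract the row max first for numerical stability
    x = x - x.max(axis=1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=1, keepdims=True))

def cross_entropy(logits, targets):
    # Mean negative log-probability of each row's target class
    return -log_softmax(logits)[np.arange(len(targets)), targets].mean()

def combined_loss(preds, disease, variety):
    # The model emits N_DISEASE + N_VARIETY activations per image;
    # split them into one chunk per target and add the two losses
    return (cross_entropy(preds[:, :N_DISEASE], disease)
            + cross_entropy(preds[:, N_DISEASE:], variety))
```

The same splitting trick gives you per-target metrics too: compute accuracy on each chunk separately.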
Big release of fastai today - @huggingface Accelerate is now supported for distributed training thanks to @TheZachMueller and @GuggerSylvain. That means you can now do distributed training in a notebook! 1/🧵
Here's all you need to train imagewoof with xresnet50 and mixup augmentation, on multiple GPUs. Run with `accelerate launch distrib.py` github.com/fastai/fastai/…
Are you ready to embark on a deep learning journey? I've just released over 6 hours of videos and the first in a series of notebooks showing the thought process of how I got to #1 in a current Kaggle comp.
The first thing you might notice if you click the link in the tweet above is that I've created `fastkaggle`, a little library of utilities to help Kagglers -- especially for running Kaggle stuff on your own machine fastai.github.io/fastkaggle/
You'll also see links there to the 6+ hours of video walkthroughs where I'll take you through every step of the process.
(This is a sneak peek at a new feature we're adding to the fast.ai course this year!)