I've written a notebook showing three additional functions for helping you navigate the @fastdotai source code and save you potentially hours of time: gist.github.com/muellerzr/3302…

1/
Why did I write this? Navigating the source code for fastai can be hard sometimes, especially when trying to consolidate all the patch and typedispatch functionality (not least because typedispatch doesn't show up in `__all__`!)

So, how does this work? 2/
The first function, `trace_class`, will grab all functions in a class, walking up its inheritance chain only until it hits a particular library. This is helpful when a class inherits from something external (like a torch tensor) but you only care about the subclass level (like TensorBase):

3/
This will also grab any patched functions, making it easy to see absolutely every function available, and its source code.

Next we have `trace_func`, which is similar to `trace_class`: it will just grab the source code for some function, along with its stem:
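A rough sketch of what such a helper might look like (I'm guessing "stem" means the stem of the file defining the function; the real `trace_func` may do something different):

```python
import inspect
from pathlib import Path

def trace_func(fn):
    # Sketch only: return a function's source text plus the stem of the
    # file it lives in, so you know where to go read more.
    file = inspect.getsourcefile(fn)
    return inspect.getsource(fn), Path(file).stem
```

Calling `trace_func(inspect.getsource)` would hand back the source of `getsource` itself along with the stem `"inspect"`.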

4/
Now, why this particular one? It's just a handy convenience function I've found helpful and use regularly.

Finally, we have `trace_dispatch`. TypeDispatch functions are a bit more of a headache, as there are multiple versions of the "same" source code. 5/
So what do we do? We grab all of them, letting you see the source code for every registered input type and its full behavior in one place:
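To show the shape of the problem without depending on fastcore, here's a toy dispatch registry and a trace over it — this is NOT fastcore's actual TypeDispatch API, just an illustration of "one name, many implementations":

```python
import inspect

class TypeDispatchSketch:
    """Toy stand-in for fastcore's TypeDispatch (not its real API):
    one name, several implementations keyed by argument type."""
    def __init__(self):
        self.funcs = {}

    def register(self, typ):
        def deco(fn):
            self.funcs[typ] = fn
            return fn
        return deco

def trace_dispatch(td):
    # Gather every registered implementation's source in one place,
    # keyed by the input type it handles.
    return {typ.__name__: inspect.getsource(fn)
            for typ, fn in td.funcs.items()}
```

With several implementations registered for different input types, `trace_dispatch` gives you all of their source side by side instead of making you hunt each one down.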
Hope this helps some folks when it comes to debugging and examining how fastai's source code looks :)

More from @TheZachMueller

20 May
I got asked a question at the end of my stream by @lukemshepherd that I didn't quite get to answer:

​Do you have any advice for someone who wants to get into contributing to the @fastdotai library who has experience with fastai but not software development?

Let's try to answer:
If you have absolutely zero software-development experience, I recommend watching the first few lectures in the latest part 2 (yes, from a few years ago!). Since it's building a library from scratch, Jeremy covers how he approaches software design, which can help you understand fastai's design.
From there, start small. Look at the simpler areas of the library, and try to add some documentation, etc. Then slowly build yourself towards understanding not just how fastai works, but how the DESIGN process works. Apply yourself towards seeing how classes interact, etc.
6 May
With school over, I'll reveal one of my secrets. I'm making a new course!

Running with @fastdotai! Times and dates are TBD, but I'm shooting for this Fall to hold the course. This will be a continuation of Walk with fastai, building on what we learned there and applying it 1/
The plan is to split it up into three sections: Debugging, Implementations, and Guest Speakers.

In the first section I want to cover debugging in fastai, bringing raw torch code over (direct 1:1 conversions), and exploring inference heavily
The second will be walking through a few implementations of other libraries that have used fastai (and writing one or two ourselves) in a much more complex manner rather than "set your data up so the DataBlock works". Situations will arise where the DataBlock doesn't exist yet!
5 May
It's a very big day for me today, as I'm officially releasing version 0.2.3 for the @AdaptNLP project. With it comes a slew of changes, but what exactly?

A preview:
#nbdev, @fastdotai, #fastcore, and something called a ModelHub for both #FlairNLP and @huggingface, let's dive in:
First up, #nbdev:

Thanks to the lib2nbdev package (novetta.github.io/lib2nbdev), we've completely restructured the library to become test-driven development with nbdev, with integration tests and everything else that comes with the workflow 2/9
Next @fastdotai and #fastcore, I'm restructuring the internal inference API to rely on fastai's Learner and Callback system, to decouple our code and make it more modularized. With the fastai_minima package as well, only the basic Learner and Callback classes are being used: 3/9
14 Apr
Deploying with @fastdotai isn't always learn = load_learner(), learn.predict. There are numerous scenarios where you might want only some, part, or none of the API and the library as a whole. In this thread we will explore your options, how they work, and what to do: 1/n
Ideally we have the following context:

DataBlock -> DataLoaders -> Model -> Learner -> Train

This can then stem off to a few things:

1. learn.export() -> Model and DataLoaders (which are now blank) ...
In this scenario, we need to ensure that ALL functions which were used in relation to the data are imported before loading in the learner. This can run into issues when using FastAPI and other platforms where loading in the Learner is done in a multi-process fashion 3/
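The reason is pickling: an exported Learner stores functions by reference, not by source, so every function the data pipeline used must be importable again at load time. A minimal stdlib sketch of that failure mode (no fastai involved; the module and function names here are invented for illustration):

```python
import pickle
import sys
import types

# Fake a "training-time" module holding a labelling function.
mod = types.ModuleType("my_transforms")
exec("def label_func(path): return path.split('_')[0]", mod.__dict__)
sys.modules["my_transforms"] = mod

# pickle stores only the reference "my_transforms.label_func", not the code:
payload = pickle.dumps(mod.label_func)

# In a fresh worker process that never imported the module,
# resolving that reference fails:
del sys.modules["my_transforms"]
err = None
try:
    pickle.loads(payload)
except ModuleNotFoundError as e:
    err = e  # "No module named 'my_transforms'"
```

This is exactly what happens when a multi-process server loads an exported Learner before the data functions have been imported.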
8 Apr
A list of the best (and partially biased) @fastdotai resources, in no particular order, with descriptions and how I utilize them 🙂

(A thread)
docs.fast.ai paired with their source notebooks in github.com/fastai/fastai/….

Go from documentation to examples right away, and let yourself get your hands dirty. The most common notebooks I go back to ...
nb50, DataBlock: docs.fast.ai/tutorial.datab…

Shows a variety of ways to create @fastdotai DataBlock and DataLoaders with the midlevel API

Callback (Core): docs.fast.ai/callback.core.…

Remembering the event trigger names, how they work, and quick links to other examples

...
7 Apr
Quiz Time!

Can anyone tell me the difference between these two @fastdotai snippets? How do they behave differently:
Answer: There isn't one!

One is more "verbose" in my opinion (get_x and get_y), but neither loses any flexibility as you use it.

2/
But what if I have three "blocks"? Don't I need to use `getters` for that?

Not necessarily. Since we can declare an `n_inp` (as seen below), we can simply pass `get_x` or `get_y` multiple functions to be used for those types instead:
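The idea behind `n_inp` can be sketched in plain Python: it just says how many leading elements of each sample tuple are model inputs, with the rest being targets. This is an illustration of the concept only, not fastai's implementation:

```python
def split_sample(values, n_inp=1):
    """Sketch of fastai's n_inp idea: given one tuple produced by all
    the getters, the first n_inp entries are the inputs (the get_x
    side) and the remainder are the targets (the get_y side)."""
    return tuple(values[:n_inp]), tuple(values[n_inp:])

# Three "blocks", two of them inputs:
x, y = split_sample(("image_a", "image_b", "label"), n_inp=2)
```

So with three blocks and `n_inp=2`, the first two getters feed the model and the third becomes the target — no `getters` argument required.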
