lib2nbdev has been on my mind for months now, and finally it exists! What's the biggest struggle with @fastdotai's #nbdev? Using it on existing projects. This tool aims to fix that.
You are then guided through the process of setting up nbdev's `settings.ini` file, and afterwards all of your existing code is converted directly into fully functional notebooks!
But wait, don't you just throw it all into one big cell?
NO! Instead lib2nbdev will determine what should be private or public, what's an import, and which particular cell flag it should be generated with, such as the example below, which has both private and public flags:
Each generated notebook is given a template heading for you to modify (as only YOU know what to title each notebook!) But from there it's as simple as adding in your tests, writing your documentation, and then fully working in the #nbdev pipeline!
You can also optionally include @fastdotai's fancy GitHub CI as well, so you won't miss out on the automated testing framework from your notebooks.
Currently there is a pip release, with conda coming soon (so `pip install lib2nbdev`)
A big thank you to everyone at @Novettasol for letting me go down and explore this rabbit hole, I am very appreciative of the opportunity to do this at work.
I hope folks will find this tool extremely useful for getting teams up to speed with utilizing nbdev!
And of course to @jeremyphoward and @HamelHusain for giving us nbdev in the first place and answering all my questions as I was developing this 😁
Can anyone tell me the difference between these two @fastdotai code snippets? How do they behave differently?
Answer: There isn't one!
One is more "verbose" in my opinion (`get_x` and `get_y`), but neither loses any flexibility as you use it.
2/
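As a minimal sketch of that equivalence in plain Python (not fastai itself — the item here is a made-up (filename, label) pair standing in for a DataBlock row):

```python
# Hypothetical item: a (filename, label) pair
item = ("images/cat_01.jpg", "cat")

def get_x(o): return o[0]  # input: the file path
def get_y(o): return o[1]  # target: the label

# Style 1: explicit get_x / get_y
xy_named = (get_x(item), get_y(item))

# Style 2: a getters list, applied in block order
getters = [get_x, get_y]
xy_getters = tuple(g(item) for g in getters)

print(xy_named == xy_getters)  # both styles yield the same (x, y)
```

Either way, each block ends up paired with one function that extracts its piece of the item.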
But what if I have three "blocks"? Don't I need to use `getters` for that?
Not necessarily. Since we can declare an `n_inp` (as seen below), we can simply pass multiple functions to `get_x` or `get_y` to be used for those block types:
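A plain-Python sketch of the `n_inp` idea (not fastai's implementation; the field names are invented): with three blocks, the first `n_inp` getter results become the model's inputs and the rest become the targets.

```python
def get_x1(o): return o["image_a"]
def get_x2(o): return o["image_b"]
def get_y(o):  return o["label"]

getters = [get_x1, get_x2, get_y]
n_inp = 2  # first two getters feed the model; the remainder are targets

item = {"image_a": "a.jpg", "image_b": "b.jpg", "label": "same"}
results = [g(item) for g in getters]
inputs, targets = tuple(results[:n_inp]), tuple(results[n_inp:])
print(inputs, targets)  # ('a.jpg', 'b.jpg') ('same',)
```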
Thanks everyone for joining me on this much briefer stream! In this thread I'll try to summarize everything we discussed, including:
- x is not implemented for y
- LossMetrics
- TensorBase and tensor metadata
And here is the stream:
1/
.@fastdotai had built their own tensor subclassing system in v2. Originally there weren't many issues, as fastai just "let" you do things with these classes. Then @PyTorch came along and introduced official tensor subclassing in 1.7. Suddenly, people were getting this! Why?
PyTorch became much more explicit about which subclasses can interact with one another. As a result, @fastdotai had to make the `TensorBase` class, which allows any tensor types to interact with each other. A quick function to convert, and an example, are below 3/
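A rough, PyTorch-free sketch of the rule described above (the real mechanism goes through `__torch_function__`; all class and function names here are illustrative, not fastai's actual API):

```python
class Tensor:
    def __init__(self, data): self.data = data
    def __add__(self, other):
        # PyTorch 1.7-style strictness: two *different* strict subclasses
        # refuse to mix, unless one side is the permissive base type.
        permissive = isinstance(self, TensorBase) or isinstance(other, TensorBase)
        if type(self) is not type(other) and not permissive:
            raise TypeError(
                f"add is not implemented for "
                f"{type(self).__name__} / {type(other).__name__}")
        return Tensor(self.data + other.data)

class TensorBase(Tensor):   # fastai-style: interoperates with any tensor type
    pass

class TensorImage(Tensor): pass
class TensorMask(Tensor): pass

def cast(t, typ=TensorBase):
    # quick converter: re-wrap the data as the permissive base type
    return typ(t.data)

# TensorImage + TensorMask would raise; casting one side fixes it
out = cast(TensorImage(1)) + TensorMask(2)
print(out.data)  # 3
```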
Thank you all SO much for joining me tonight, it was a ton of fun to be back in the streaming/teaching wheelhouse again! We had almost 300 people chime in! For those that missed out, the video link is here:
1/
The notebooks are also available here, and eventually I'll have them fully integrated into walkwithfastai.com, just need to figure out a good way first! 2/ github.com/walkwithfastai…
From what I gathered from the feedback, it seems this stream really did help open up everyone's eyes to the power of @fastdotai's Callback system, and how its training loop works. It's not just a marketing phrase when they say you can modify ~everything~ in the training loop! 3/
Have you wanted to know just what goes on in @fastdotai's training loop and just what those pesky Callbacks are up to? In the latest release of Walk with @fastdotai this is now possible!
What we've done here is extended fastai's existing `show_training_loop` functionality to include some information (from the provided doc string) about what exactly occurs during each event. This lets you get more familiar with fastai's training loop, ... 2/
understand when things are called, and what happens at each step. The old version (which is still there if you pass `verbose=False`) simply showed which Callbacks were triggered (pictured below)
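A toy sketch of the idea (not fastai's actual implementation — the events, callbacks, and docstrings here are invented): walk the events in order, list the callbacks that respond to each, and, when verbose, print each handler's doc string.

```python
class ProgressCB:
    def before_fit(self):  "Set up the progress bar."
    def after_batch(self): "Update the bar with the latest loss."

class Recorder:
    def after_batch(self): "Record loss and metric values."

events = ["before_fit", "before_batch", "after_batch", "after_fit"]
callbacks = [ProgressCB(), Recorder()]

def show_training_loop(verbose=True):
    lines = []
    for event in events:
        lines.append(f"Start {event}")
        for cb in callbacks:
            fn = getattr(cb, event, None)
            if fn is None:
                continue
            line = f"   - {type(cb).__name__}"
            if verbose and fn.__doc__:  # verbose adds the doc string
                line += f": {fn.__doc__}"
            lines.append(line)
    return "\n".join(lines)

print(show_training_loop())
```

With `verbose=False` you get just the callback names per event, matching the older behavior described above.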