Scott Condron
Machine Learning Engineer at @weights_biases. I post about machine learning, data visualisation, software tools ❤️ @pytorch @fastdotai 🇮🇪

May 28, 2021, 5 tweets

What actually happens when you call .backward() in @PyTorch?

Autograd goodness 🪄!

PyTorch keeps track of all of the computations you’ve done on each of your tensors, and .backward() triggers it to compute the gradients and store them in .grad.

1/3
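
A minimal sketch of this: after .backward(), the leaf tensor's .grad holds the gradient of the output with respect to it.

```python
# Minimal sketch: .backward() populates .grad on leaf tensors.
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # y = x1^2 + x2^2 + x3^2

y.backward()         # autograd walks the recorded graph backwards

print(x.grad)        # tensor([2., 4., 6.])  -> dy/dx = 2x
```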

You can see the gradient functions by looking at .grad_fn of your tensors after each computation.

You can see the entire graph by looking at .next_functions recursively.
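For example, a small sketch of inspecting .grad_fn and recursively walking .next_functions to print the graph (the walk helper is just an illustration, not a PyTorch API):

```python
# Sketch: inspecting the autograd graph via .grad_fn / .next_functions.
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = (x * 3).sum()

print(y.grad_fn)                 # <SumBackward0 object at ...>
print(y.grad_fn.next_functions)  # ((<MulBackward0 object at ...>, 0), ...)

def walk(fn, depth=0):
    """Recursively print each grad_fn node in the graph."""
    if fn is None:
        return
    print("  " * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        walk(next_fn, depth + 1)

walk(y.grad_fn)  # SumBackward0 -> MulBackward0 -> AccumulateGrad
```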

Or you can use github.com/szagoruyko/pyt… by @szagoruyko5

2/3

If you’d like to learn more, this video from the @PyTorch YouTube channel goes through the fundamentals of autograd.



3/3

Another autodiff explanation for those looking for more detail.

It has brilliant animations and covers some of the considerations around the implementation.

Found here:
