Thread by David Page, 4 tweets, 3 min read
New blog post: How does batch norm _really_ help optimisation?

We go on a tour of bad inits, degenerate networks and spiky Hessians - all in a Colab notebook:
colab.research.google.com/github/davidcp…

Summary 👇
1/ Early signs of trouble.

We learn that deep ReLU nets with He-init, but no batch norm, basically ignore their inputs! (Check out arxiv.org/abs/1902.04942 by Luther, @SebastianSeung for background.)

Easy to miss if you pool statistics across channels. A quick illustration:
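(Not code from the notebook, just a minimal PyTorch sketch of the claim, with arbitrary depth and width: a deep fully-connected ReLU net with He init and no batch norm maps two unrelated inputs to almost the same output direction.)

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical depth/width, chosen only for illustration.
width, depth = 100, 50
layers = []
for _ in range(depth):
    lin = nn.Linear(width, width)
    nn.init.kaiming_normal_(lin.weight, nonlinearity='relu')  # He init
    nn.init.zeros_(lin.bias)
    layers += [lin, nn.ReLU()]
net = nn.Sequential(*layers)

# Two unrelated inputs...
x1, x2 = torch.randn(1, width), torch.randn(1, width)
# ...end up nearly parallel in output space: the net is largely ignoring its inputs.
print(F.cosine_similarity(net(x1), net(x2)).item())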
2/ How to break your ResNet.

Next we go looking for trouble: we find degenerate networks near the training trajectory by (mis)using backprop. Batch norm comes to the rescue!
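(The thread doesn't spell out the procedure; one hedged reading of "(mis)using backprop" is to run gradient descent on the weights themselves, shrinking the output variance over a batch so the search walks from the current weights to a nearby constant-output, degenerate net. A sketch under that assumption, with a toy net standing in for the ResNet:)

import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small ReLU net standing in for the real network (hypothetical sizes).
width, depth = 100, 8
net = nn.Sequential(*[m for _ in range(depth) for m in (nn.Linear(width, width), nn.ReLU())])

degenerate = copy.deepcopy(net)
opt = torch.optim.SGD(degenerate.parameters(), lr=0.5)
x = torch.randn(256, width)

for _ in range(200):
    opt.zero_grad()
    out = degenerate(x)
    # Backprop on the *weights*: minimising the output variance over the batch
    # pushes the net towards a degenerate, input-ignoring configuration.
    out.var(dim=0).mean().backward()
    opt.step()

# Distance moved in weight space - how close the degenerate net sits to the start.
dist = sum((p - q).pow(2).sum() for p, q in zip(degenerate.parameters(), net.parameters())).sqrt()
print(dist.item())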
3/ Spikes.

Finally we connect to instability of SGD and outlying eigenvalues of the Hessian (found by @leventsagun, Bottou, @ylecun).
The mystery of the spiky Hessian is resolved along with the secrets of batch norm!
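(Those outlying eigenvalues can be probed without ever forming the Hessian. A generic sketch, not the notebook's code: power iteration with autograd Hessian-vector products estimates the largest eigenvalue of the loss Hessian, the kind of "spike" Sagun et al. report. The model and data here are toy placeholders.)

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy model and batch, standing in for the real network and data.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))

loss = F.cross_entropy(model(x), y)
params = list(model.parameters())
grads = torch.autograd.grad(loss, params, create_graph=True)  # keep graph for Hessian-vector products

def hvp(v):
    # Hessian-vector product: differentiate (grad . v) w.r.t. the parameters.
    gv = sum((g * u).sum() for g, u in zip(grads, v))
    return torch.autograd.grad(gv, params, retain_graph=True)

# Power iteration: repeatedly apply H and renormalise to find the top eigenvector.
v = [torch.randn_like(p) for p in params]
for _ in range(50):
    hv = hvp(v)
    norm = torch.sqrt(sum((h * h).sum() for h in hv))
    v = [h / norm for h in hv]

# Rayleigh quotient v.Hv approximates the largest eigenvalue: the outlying "spike".
top_eig = sum((h * u).sum() for h, u in zip(hvp(v), v))
print(top_eig.item())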