Excited to share our paper arxiv.org/abs/2105.12221 on neural net overparameterization, to appear at #ICML2021 💃🏻 We asked why training can't find a minimum in mildly overparameterized nets. Below, a 4-4-4 net can achieve zero loss, but none of the 5-5-5 nets trained with GD can 🤨
We investigated training failures under mild overparameterization vs. successful training under vast overparameterization from the simple perspective of permutation symmetries!
The catch is that all critical points of a small net turn into subspaces of critical points in bigger nets. We gave exact counts of these critical subspaces using combinatorics 😋
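The permutation symmetry behind this is easy to see directly: relabeling the hidden neurons of a net (permuting rows of the incoming weights together with columns of the outgoing weights) leaves the function unchanged, so every critical point of the loss has many symmetric copies. A minimal sketch with a one-hidden-layer tanh net (all names and sizes here are illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer net: f(x) = W2 @ tanh(W1 @ x)
d_in, d_hidden, d_out = 3, 4, 2
W1 = rng.normal(size=(d_hidden, d_in))
W2 = rng.normal(size=(d_out, d_hidden))

def net(x, W1, W2):
    return W2 @ np.tanh(W1 @ x)

# Relabel hidden neurons: permute rows of W1 and columns of W2 together.
perm = rng.permutation(d_hidden)
W1_p, W2_p = W1[perm, :], W2[:, perm]

x = rng.normal(size=d_in)
# Same function for every permutation, so any critical point of the loss
# comes with (up to) d_hidden! symmetric copies in parameter space.
assert np.allclose(net(x, W1, W2), net(x, W1_p, W2_p))
```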
As a byproduct of this expansion trick, we could give a precise geometrical description of the global minima manifold in overparameterized nets: it is a union of affine subspaces, connected as in the picture
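One way to see where such affine subspaces of minima come from is neuron splitting: duplicate a hidden neuron's incoming weights and share its outgoing weight between the two copies as α and 1−α. Every α gives the exact same function, tracing out an affine line of parameters in the larger net. A hedged toy sketch (names and sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, d_hidden, d_out = 3, 4, 2
W1 = rng.normal(size=(d_hidden, d_in))
W2 = rng.normal(size=(d_out, d_hidden))

def net(x, W1, W2):
    return W2 @ np.tanh(W1 @ x)

x = rng.normal(size=d_in)
base = net(x, W1, W2)

# Split neuron 0 into two copies: duplicate its input weights, share its
# output weight as alpha and (1 - alpha). Every alpha yields the same
# function -> an affine line in the larger (5-hidden-unit) net's parameters.
for alpha in [0.0, 0.3, 1.0]:
    W1_big = np.vstack([W1, W1[0:1, :]])            # 5 hidden units now
    W2_big = np.hstack([W2, np.zeros((d_out, 1))])
    W2_big[:, 0] = alpha * W2[:, 0]
    W2_big[:, -1] = (1 - alpha) * W2[:, 0]
    assert np.allclose(net(x, W1_big, W2_big), base)
```

So if the small net's parameters were a global minimum, this whole line of larger-net parameters is too; unions of such subspaces across neurons and permutations give the connected structure.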
Most surprisingly, in mildly overparameterized nets, the critical subspaces dominate the global minima! The landscape should look rough in this regime...
But in the vast overparameterization regime, the set of global minima is much larger than the critical subspaces, so the minima manifold is HUGE — it is easier for GD to find a global minimum, as expected from NTK theory!
It has been a great pleasure to work with @HonglerClement @ArthurJacot3 François Ged, Francesco Spadaro, Johanni Brea, Wulfram Gerstner @compneuro_epfl
...and as a personal note, this is my first first-author paper in ML😊 Very much looking forward to feedback and questions!
