New @ #ICML2021: When a trained model fits its clean training data well but fits randomly labeled data (added to the training set) poorly, its generalization to the population is guaranteed!

Paper: arxiv.org/abs/2105.00303

by ACMI PhD @saurabh_garg67, Siva B, @zicokolter, & @zacharylipton
This result draws deep connections between label noise, early learning, and generalization. Key takeaways: 1) the early-learning phenomenon can be leveraged to produce post-hoc generalization certificates; 2) when unlabeled data is available, you can randomly label it, add it to the training set, and get the certificate.
The work translates the early-learning phenomenon into a generalization guarantee *without ever explicitly invoking the complexity of the hypothesis class*, and we hope others will dig into this result and go deeper.
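For a concrete sense of the recipe, here is a minimal, self-contained sketch (not the authors' code): it trains a binary classifier on clean data mixed with a randomly labeled set, then reads off a RATT-style certificate of the form clean-train error + 1 − 2·(error on the random set) + a Hoeffding-style concentration term. The synthetic data, the choice of logistic regression, and the exact constant in the concentration term are all assumptions here; see the paper for the precise theorem and constants.

```python
# Sketch of the "randomly assign, train, test" recipe, under stated
# assumptions: binary classification, random labels mixed into training,
# and a certificate of the form
#   population error <= clean-train error + 1 - 2*(error on random set)
#                       + O(sqrt(log(1/delta) / m)).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "clean" labeled data and "unlabeled" data from the same inputs.
n, m, d = 2000, 500, 20
w_true = rng.normal(size=d)
X_clean = rng.normal(size=(n, d))
y_clean = (X_clean @ w_true > 0).astype(int)
X_unlab = rng.normal(size=(m, d))
y_rand = rng.integers(0, 2, size=m)  # assign labels uniformly at random

# Train ONE model on the union of clean and randomly labeled data.
X_train = np.vstack([X_clean, X_unlab])
y_train = np.concatenate([y_clean, y_rand])
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

clean_err = 1 - model.score(X_clean, y_clean)  # fits clean data well -> small
rand_err = 1 - model.score(X_unlab, y_rand)    # fits random labels poorly -> near 1/2

# Post-hoc certificate; the 1/(2m) Hoeffding constant is an assumption.
delta = 0.05
certificate = clean_err + 1 - 2 * rand_err + np.sqrt(np.log(1 / delta) / (2 * m))
print(f"clean train error {clean_err:.3f}, random-label error {rand_err:.3f}")
print(f"certified population error <= {certificate:.3f} (w.p. >= {1 - delta})")
```

Intuition for the sketch: on randomly labeled points any fixed classifier errs about half the time, so the gap between 1/2 and the observed random-label error measures how much the model memorized. If that error stays near 1/2 while the clean-train error is small, the certificate above is small too.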
This work represents roughly one year of constant work. We had initial results on ERM last summer but kept pushing to articulate the idea as fully as possible. We're happy about publication, but more excited to finally share this work with our community.
Also, if you find any typos and send them to Siva, @saurabh_garg67 owes him one coffee per typo, so help a statistician out...
