Rosanne Liu · 13 Dec · 20 tweets, 10 min read
Favorite #NeurIPS2020 presentations and posters this year

PS: heavily biased by what I happened to catch and whom I happened to talk to
PPS: still catching up on talks, so the list is rather incomplete; I hope to grow it
PPPS: with contributions from @ml_collective members
[Talk] No. 1 has to go to -- the keynote talk by @isbellHFh, @mlittmancs, et al. Simply brilliant 🎉🎉
slideslive.com/38935825/you-c…
[Talk] "Incentives for researchers" by Yoshua Bengio slideslive.com/38938274/incen…
Appeared at the same workshop
[Talk] "The importance of deconstruction" by Kilian Weinberger slideslive.com/38938218/the-i…
Appeared at @MLRetrospective
[Talk] "Through the Eyes of Birds and Frogs" by @shakir_za
slideslive.com/38938216/throu…
Appeared at @MLRetrospective
[Talk] "Pain and Machine Learning" by @shakir_za
slideslive.com/38938071/pain-…
Appeared at the Biological and Artificial Reinforcement Learning workshop
[Paper/Poster with tl;dr] neurips.cc/virtual/2020/p… Is normalization indispensable for training deep neural networks? Nope. If you remove BN from a ResNet, training fails; but with a few smart tricks, like adding a rescaling parameter to the residual connection, it works again.
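A minimal PyTorch sketch of the general idea (not the paper's exact recipe; the block and its name are mine): drop BN entirely and put a learnable scalar, initialized at zero, on the residual branch.

```python
import torch
from torch import nn

class NormFreeResBlock(nn.Module):
    """Residual block with no BatchNorm. A learnable scalar on the residual
    branch, initialized at 0, keeps deep stacks trainable: every block
    starts out as the identity and learns how much of its branch to mix in."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.alpha = nn.Parameter(torch.zeros(1))  # the rescaling parameter

    def forward(self, x):
        h = self.conv2(torch.relu(self.conv1(torch.relu(x))))
        return x + self.alpha * h
```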
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p…
Top-k training of GANs. Simple trick! For each batch of images generated by G, keep only the k samples with the highest D scores for backprop, zeroing out the gradients from the rest.
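The trick fits in a few lines. A sketch of one generator update, assuming a standard non-saturating GAN loss and a fixed k for simplicity (the paper schedules k over training):

```python
import torch
import torch.nn.functional as F

def topk_generator_step(G, D, opt_G, z, k):
    """Top-k GAN generator update (sketch): of the batch of fakes, only the
    k samples the discriminator scores highest contribute gradients; the
    rest never enter the loss, so their gradients are effectively zero."""
    fake = G(z)
    d_scores = D(fake).view(-1)
    top_scores, _ = torch.topk(d_scores, k)   # keep the k best fakes
    loss = F.softplus(-top_scores).mean()     # non-saturating G loss
    opt_G.zero_grad()
    loss.backward()
    opt_G.step()
    return loss.item()
```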
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p…
Instance Selection for GANs. Similar to the above, but instead of selecting the top generated samples, this one preprocesses the data (real images) so that only high-probability/density ones are used.
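A rough sketch of that preprocessing step, assuming embeddings from some pretrained feature extractor and a single-Gaussian density model (the paper's exact scoring choices may differ):

```python
import numpy as np

def select_instances(embeddings, keep_frac=0.5):
    """Score each real image by the density of its embedding under one
    Gaussian fit to the whole dataset, then keep only the top fraction.
    GAN training then runs on this pruned, higher-density dataset."""
    mu = embeddings.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(embeddings, rowvar=False))
    diff = embeddings - mu
    # negative squared Mahalanobis distance = Gaussian log-density (up to constants)
    scores = -np.einsum('nd,de,ne->n', diff, cov_inv, diff)
    n_keep = int(keep_frac * len(embeddings))
    return np.argsort(scores)[-n_keep:]  # indices of the images to keep
```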
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p…
Randomly dropping layers in a transformer, and moving layer norm out of the residual connection, helps.
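Both tweaks in one toy block (sizes and the drop probability below are placeholders, not the paper's settings):

```python
import torch
from torch import nn

class PreLNBlock(nn.Module):
    """Transformer block illustrating the two tricks: LayerNorm sits at the
    start of each sublayer, off the residual path ("pre-LN"), and at train
    time the whole block is randomly skipped with probability p (layer drop)."""
    def __init__(self, d=512, heads=8, p_drop=0.1):
        super().__init__()
        self.ln1 = nn.LayerNorm(d)
        self.attn = nn.MultiheadAttention(d, heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d)
        self.ff = nn.Sequential(nn.Linear(d, 4 * d), nn.GELU(), nn.Linear(4 * d, d))
        self.p = p_drop

    def forward(self, x):
        if self.training and torch.rand(()) < self.p:
            return x  # this layer is dropped for the current step
        h = self.ln1(x)
        x = x + self.attn(h, h, h)[0]
        return x + self.ff(self.ln2(x))
```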
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p…
Train-by-Reconnect. This is fun! Perhaps my favorite! Randomly initialize network weights; then all training does is shuffle their locations. Somehow it works!
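To make the search space concrete, here is a toy hill-climbing version; the paper's actual optimizer is much smarter, and `loss_fn` is a hypothetical closure that evaluates the network on a batch. Weight values never change, only their positions do.

```python
import torch

@torch.no_grad()
def reconnect_step(layer, loss_fn, tries=16):
    """One toy 'training' step: propose random swaps of two weight positions
    and keep a swap only if the loss improves. The multiset of weight values
    stays exactly as it was at random initialization."""
    flat = layer.weight.data.view(-1)
    best = loss_fn()
    for _ in range(tries):
        i, j = torch.randint(flat.numel(), (2,))
        flat[i], flat[j] = flat[j].clone(), flat[i].clone()      # swap
        cand = loss_fn()
        if cand < best:
            best = cand                                          # keep it
        else:
            flat[i], flat[j] = flat[j].clone(), flat[i].clone()  # revert
    return best
```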
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p…
What's being transferred in transfer learning? Turns out the downstream and upstream models live in the same local minimum, and you don't have to transfer from the last epoch.
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p… "Meta-Learning through Hebbian Plasticity in Random Networks" use Hebbian rule evolved through ES to train a network that ends up being robust to significant weight perturbations! No gradients! @risi1979 @enasmel
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p…
Learning convolutions from scratch -- improves generalization! @bneyshabur
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p… "Measuring Robustness to Natural Distribution Shifts in Image Classification" How robust are models to non-synthetic perturbations? TLDR: not very robust
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p… Training is very nonlinear in the first few epochs, but very linear after that. After the first two epochs you can linearize training while maintaining accuracy (on CIFAR).
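What "linearize" means here: replace the network with its first-order Taylor expansion around the current weights and train that instead. A minimal sketch using torch.func (the names `f` and `f_lin` are mine):

```python
import torch
from torch import nn
from torch.func import functional_call, jvp

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
# freeze the expansion point theta0 (e.g. the weights after two epochs)
theta0 = {k: v.detach().clone() for k, v in model.named_parameters()}

def f(params, x):
    return functional_call(model, params, (x,))

def f_lin(params, x):
    """First-order Taylor expansion of the net around theta0:
    f(x; theta0) + J_theta f(x; theta0) @ (theta - theta0)."""
    delta = {k: params[k] - theta0[k] for k in theta0}
    out, tangent = jvp(lambda p: f(p, x), (theta0,), (delta,))
    return out + tangent
```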
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p… SGD is greedy and gives non-diverse solutions. Instead, by branching off at saddle points and following eigenvectors of the Hessian with negative eigenvalues, you can find diverse solutions.
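The key primitive is finding those negative directions without ever materializing the Hessian, using Hessian-vector products. A sketch via shifted power iteration (`shift` is an assumed upper bound on the largest eigenvalue):

```python
import torch

def most_negative_eigvec(loss, params, iters=100, shift=10.0):
    """Power iteration on (shift*I - H): its dominant eigenvector is the
    Hessian eigenvector with the most negative eigenvalue, assuming shift
    exceeds H's largest eigenvalue. H is never formed; each step is one
    Hessian-vector product computed by double backprop."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat = torch.cat([g.reshape(-1) for g in grads])
    v = torch.randn_like(flat.detach())
    v = v / v.norm()
    for _ in range(iters):
        hv = torch.autograd.grad(flat @ v, params, retain_graph=True)
        hv = torch.cat([h.reshape(-1) for h in hv]).detach()
        v = shift * v - hv
        v = v / v.norm()
    return v  # a descent direction leading off the saddle point
```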
[Paper/Poster with tl;dr]
neurips.cc/virtual/2020/p… ❤️ We already know that a network with random weights contains subnetworks that perform well. But did you know they can solve thousands of tasks continually, without even knowing the task ID?
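The underlying mechanism, sketched below: weights are frozen at random init and only a score per weight is learned; the forward pass keeps the top-scoring fraction. Learning one such mask per task turns a single frozen network into many subnetworks. (Layer and hyperparameters here are illustrative, not the paper's.)

```python
import torch
from torch import nn

class SupermaskLinear(nn.Module):
    """Linear layer with frozen random weights; only per-weight scores are
    trained. The forward pass uses the top `keep` fraction of weights, and a
    straight-through trick routes gradients to the scores instead of the
    (never-updated) weights."""
    def __init__(self, in_f, out_f, keep=0.5):
        super().__init__()
        w = torch.randn(out_f, in_f) / in_f ** 0.5
        self.weight = nn.Parameter(w, requires_grad=False)  # never trained
        self.scores = nn.Parameter(torch.randn(out_f, in_f) * 0.01)
        self.keep = keep

    def forward(self, x):
        thr = self.scores.flatten().quantile(1 - self.keep)
        mask = (self.scores >= thr).float()
        # straight-through: forward uses the binary mask; backward sends the
        # gradient w.r.t. each masked weight into its score
        w = self.weight * (mask + self.scores - self.scores.detach())
        return nn.functional.linear(x, w)
```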
