Ian Goodfellow
Deep learner. Inventor of GANs. Lead author of https://t.co/GXx4YfpW7O
Aug 25, 2018 6 tweets 1 min read
I don’t like notebooks either. I agree with a lot of @joelgrus’s reasons and have a few others of my own.

Notebooks generally don’t play very friendly with source control tools such as git.
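Much of the git friction comes from the .ipynb format itself: a notebook is a JSON file that stores execution counters and cell outputs next to the code, so merely re-running a cell changes the file even when the code is identical. A minimal sketch (the notebook contents here are illustrative):

```python
import json

# A minimal notebook with one code cell, following the .ipynb JSON layout.
notebook = {
    "nbformat": 4, "nbformat_minor": 5,
    "metadata": {},
    "cells": [{
        "cell_type": "code",
        "source": ["print(1 + 1)\n"],
        "execution_count": 1,  # bumped every time the cell is re-run
        "outputs": [{"output_type": "stream", "name": "stdout", "text": ["2\n"]}],
        "metadata": {},
    }],
}
before = json.dumps(notebook, indent=1)

# Re-run the same cell: the code is unchanged, but the counter moves,
# so git sees a diff anyway.
notebook["cells"][0]["execution_count"] = 2
after = json.dumps(notebook, indent=1)
print(before != after)  # True: identical code, different file contents
```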
Aug 6, 2018 4 tweets 1 min read
One of the most anticipated GAN names is now taken: paGAN. fxguide.com/featured/a-i-a…

Also, someone DMed me to warn me these results could be fake: reddit.com/r/MachineLearn…
Jul 29, 2018 10 tweets 2 min read
I suspect that peer review *actually causes*, rather than mitigates, many of the “troubling trends” recently identified by @zacharylipton and Jacob Steinhardt: arxiv.org/abs/1807.03341

I frequently serve as an area chair and I manage a small research group, so overall I see a lot of reviews of both my group’s work and others’ work.
May 16, 2018 9 tweets 2 min read
A math trick I like a lot is the approach to taking derivatives using hyperreal numbers. Thread:

For this trick we introduce a new kind of number, called an infinitesimal hyperreal number. Imagine we have some number epsilon, such that epsilon > 0 but epsilon < x for all positive real numbers x. Imagine that we can use algebra to manipulate epsilon like any other variable.
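One concrete way to play with this trick in code is via dual numbers, where products of infinitesimals are dropped (epsilon² is treated as 0): evaluate f at x + epsilon, and the coefficient on epsilon is exactly f′(x). A minimal sketch, with all names hypothetical:

```python
class Dual:
    """Number of the form a + b*eps, where eps is infinitesimal (eps**2 = 0)."""
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, because eps**2 vanishes
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x + eps; the eps coefficient is f'(x)."""
    return f(Dual(x, 1.0)).eps

# f(x) = 3x^2 + 2x has f'(x) = 6x + 2, so f'(5) = 32
print(derivative(lambda x: 3 * x * x + 2 * x, 5.0))  # 32.0
```

This is forward-mode automatic differentiation in miniature: the algebra on epsilon does the bookkeeping that the chain rule would otherwise require by hand.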
May 15, 2018 10 tweets 2 min read
A quick thread on two of my favorite theory hacks for machine learning research.

A lot of the time, we want to analyze the optimal behavior of a neural net using algebra / calculus. Neural net models are usually too complicated to solve algebraically for the parameters that optimize most functions (unless it’s some trivial function like weight decay).
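As a sketch of the “trivial function” case: a quadratic loss with weight decay, L(w) = (w − a)² + λw², can be solved in closed form by setting dL/dw = 0, giving w* = a / (1 + λ). The names below are illustrative, and the grid search is only a numerical sanity check:

```python
def optimal_weight(a, lam):
    """Closed-form optimum of L(w) = (w - a)**2 + lam * w**2."""
    return a / (1.0 + lam)

def grid_argmin(a, lam, lo=-10.0, hi=10.0, steps=200001):
    """Brute-force check: densely evaluate L and return the best w found."""
    best_w, best_loss = None, float("inf")
    for i in range(steps):
        w = lo + (hi - lo) * i / (steps - 1)
        loss = (w - a) ** 2 + lam * w ** 2
        if loss < best_loss:
            best_w, best_loss = w, loss
    return best_w

print(optimal_weight(3.0, 0.5))             # 2.0
print(round(grid_argmin(3.0, 0.5), 3))      # 2.0
```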
Mar 26, 2018 12 tweets 2 min read
2nd thread on evaluating GAN papers (1st thread hit max thread length).

Many DL algorithms, but especially GANs and RL, get very different results each time you run them. Papers should show at least 3 runs with the same hyperparameters to get some idea of the stochasticity.
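The reporting pattern is simple: repeat the run under different seeds and summarize with a mean and spread. A minimal sketch, where the training function is a hypothetical stand-in (a noisy score), not a real GAN run:

```python
import random
import statistics

def train_once(seed):
    """Hypothetical stand-in for a full training run: returns a noisy metric."""
    rng = random.Random(seed)
    return 0.8 + rng.gauss(0.0, 0.05)  # pretend final evaluation score

def report_runs(seeds):
    """Run once per seed with identical hyperparameters; report mean and stdev."""
    scores = [train_once(s) for s in seeds]
    return statistics.mean(scores), statistics.stdev(scores)

mean, std = report_runs([0, 1, 2])
print(f"metric: {mean:.3f} +/- {std:.3f} over 3 seeds")
```

Reporting the spread across seeds, rather than a single cherry-picked run, is what lets a reader judge whether a claimed improvement exceeds the method’s own run-to-run variance.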
Mar 26, 2018 25 tweets 4 min read
Thread on how to review papers about generic improvements to GANs.

There are a lot of papers about theoretical or empirical studies of how GANs work, papers about how to do new strange and interesting things with GANs (e.g. the first papers on unsupervised translation), new metrics, etc. This thread isn’t about those.
Mar 26, 2018 11 tweets 2 min read
1/11) Thread on bidding to review conference papers

2/11) Lately I’ve seen a lot of people saying things like “clearly good papers get all the senior reviewers” or “remember to bid or you’ll only get low quality papers”