Thread by Mario Klingemann, 5 tweets, 3 min read
I've created an experimental GAN architecture I call #RecuResGAN or "Recursive-Residual GAN" and I am pretty astonished:
- that it works at all
- how well it works across a pretty wide range of scales
- that it is just 15% the size of a comparable #pix2pixHD model

[Image: portraits generated by RecuResGAN, replicating training examples]
Of course I did not google the concept of recursive neural networks before starting this experiment, so I enjoyed the illusion of being very innovative here for a whole day:
en.wikipedia.org/wiki/Recursive…
The principle is pretty simple: in a classic residual architecture you chain several residual blocks one after another (in #pix2pixHD the default is 9 blocks). What I do in #RecuResGAN is use a single block but loop over it 9 times, feeding its output back into its input.
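To make the looping idea concrete, here is a minimal numpy sketch (not the author's actual code; a plain matrix multiply stands in for the block's convolution, and all sizes are made up): one set of shared weights is applied 9 times, where a classic stack would hold 9 separate weight sets.

```python
import numpy as np

def residual_block(x, w):
    # stand-in for a conv residual block: shared weights w, ReLU, skip connection
    return x + np.maximum(w @ x, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(16, 16))  # ONE weight set, reused every pass
x = rng.normal(size=16)

out = x
for _ in range(9):                 # loop the same block 9 times instead of
    out = residual_block(out, w)   # chaining 9 differently-parameterised blocks
```

Since the 9 passes share one weight set, the residual trunk needs roughly 1/9 of the parameters of the equivalent chained version, which is where most of the size saving comes from.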
The same goes for the down- and up-convolution modules; only here you cannot reduce or increase the number of filters within a block, so you have to compromise a bit on how many you use in order not to run out of memory with the accumulated gradients.
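A hedged sketch of why the filter count gets locked in (again numpy standing in for real convolutions, with invented sizes): a recursively applied down-block must emit the same number of channels it consumes, since its output is fed straight back into it at the next scale.

```python
import numpy as np

C = 16  # channel count: fixed for EVERY recursion level, because the block's
        # output must be a valid input to the same block again

def down_block(x, w):
    """Shared channel-mixing weights, then halve the spatial length by averaging."""
    h = np.maximum(np.einsum('oc,cl->ol', w, x), 0.0)
    return (h[:, 0::2] + h[:, 1::2]) / 2

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(C, C))  # square: channels in == channels out
x = rng.normal(size=(C, 64))

for _ in range(3):       # apply the SAME block 3 times: length 64 -> 32 -> 16 -> 8
    x = down_block(x, w)
```

In a conventional encoder each downsampling stage would double the channels (16 -> 32 -> 64); here that is impossible, so you pick one channel count that is wide enough for the finest scale but small enough to fit the gradients accumulated across all recursion levels.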
So my theory for why it seems to be relatively scale-invariant is that it comes precisely from those recursive up- and down-convolutions: a single block has to handle the features at every scale and thus becomes kind of fractal.