Twitter thread by hardmaru · 5 tweets · 3 min read
Weight Agnostic Neural Networks 🦎

Inspired by precocial species in biology, we set out to search for neural net architectures that can already (sort of) perform various tasks even when they use random weight values.

Article: weightagnostic.github.io
PDF: arxiv.org/abs/1906.04358
Networks that already work with random weights are not only easy to train; they also offer other benefits. We can give the same network an ensemble of different random weights to boost performance, without any training.

An MNIST classifier evolved to work with random weights (figure in the article).
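A minimal sketch of the ensembling idea above, not the paper's code: instantiate one fixed architecture with several shared weight values and average the output distributions, with no training step. The topology and `forward` function here are hypothetical stand-ins.

```python
import numpy as np

def forward(x, w):
    # Toy fixed topology: every connection uses the single shared
    # weight value w; tanh activation, 2 output classes.
    h = np.tanh(w * x)                        # "hidden layer"
    logits = w * np.array([h.sum(), -h.sum()])
    e = np.exp(logits - logits.max())
    return e / e.sum()                        # softmax over 2 classes

def ensemble_predict(x, weight_values):
    # Average the predicted distributions over an ensemble of shared
    # weight values -- no gradient training involved.
    probs = np.mean([forward(x, w) for w in weight_values], axis=0)
    return int(np.argmax(probs))

x = np.array([0.5, -1.0, 2.0])
print(ensemble_predict(x, weight_values=[-2.0, -1.0, 1.0, 2.0]))
```

Each ensemble member is the same architecture with a different single weight, so the "ensemble" costs nothing beyond extra forward passes.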
The key idea is to search for architectures by de-emphasizing weights. During the search, each network is assigned a single shared weight value at each rollout and optimized to perform well over a wide range of weight values. As a bonus, we get to bypass the costly inner training loop!
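The evaluation step above can be sketched as follows. This is a hedged illustration under assumptions, not the actual search code: `run_episode` and the `architectures` list are hypothetical stand-ins, and real WANN search evolves topologies with NEAT-style operators rather than comparing two fixed candidates.

```python
import numpy as np

# Shared weight values each candidate must cope with at rollout time.
WEIGHT_SAMPLES = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]

def run_episode(arch, shared_w):
    # Stand-in for a task rollout: "reward" measures how close the
    # net's response is to a target when every connection equals shared_w.
    response = np.tanh(shared_w * arch["gain"])
    return -abs(response - arch["target"])

def evaluate(arch):
    # Fitness is the average return across the sampled shared weight
    # values -- there is no inner loop that trains individual weights.
    return np.mean([run_episode(arch, w) for w in WEIGHT_SAMPLES])

architectures = [{"gain": 1.0, "target": 0.0},
                 {"gain": 3.0, "target": 0.0}]
best = max(architectures, key=evaluate)
print(best)
```

Because fitness depends only on the architecture's behavior across many shared weight values, the costly per-candidate weight training drops out of the search entirely.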
This work was led by Adam Gaier, who did a 3-month internship in the Google Brain team in Tokyo. This idea came out after a few drinks in Roppongi. He has done some fantastic work in the Neuroevolution area in the past and is active in the GECCO community. bit.ly/2Kbx0EZ
Some failure cases look interesting. A WANN evolved to walk forward sometimes performs non-trivial actions, like balancing, even when using bad weights.