[🧠 collective intelligence 🧠] I've been intrigued by the cellular automata (CA) concept for a long time, and by the potential for mutually beneficial interaction between CAs and deep learning, so I decided to dig a bit deeper. Here are some interesting resources I found:
1/🧵
distill.pub/2020/growing-c… <- the one and only Distill. A nice introduction to how neural CAs work and why they may be a useful model of the morphogenesis and regeneration processes in developmental biology.
(think of how we humans are formed from a single egg cell - when and how does this multiplication of cells stop and stay stable?)
3/
arxiv.org/abs/2111.14377 "Collective Intelligence for Deep Learning" <- a recent paper that explores how ideas from collective intelligence (an umbrella term for CA, self-organization, swarm optimization, etc.) have influenced deep learning (and vice versa).
arxiv.org/abs/2111.13545 "μNCA: Texture Generation with Ultra-Compact Neural Cellular Automata" <- a recent paper that trains a neural CA (compare that to the fixed rules of Conway's Game of Life!) to generate interesting textures with as few as 60 bytes of learnable weights!
5/
Finally, check out this "upgraded" version of Conway's Game of Life: "Lenia", a CA that is continuous in both time and space:
1) <- some quick visualizations so that you understand what it's all about (hint: amazing artificial "lifeforms" emerge)
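Before looking at Lenia's continuous generalization, it helps to see how small the original, fixed rule really is. Here is a minimal NumPy sketch of one Game of Life step (wrap-around borders; the 5x5 grid and the "blinker" pattern are just illustrative choices):

```python
import numpy as np

def life_step(grid):
    # Sum the 8 neighbors of every cell (toroidal wrap-around borders).
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Conway's fixed rules: a live cell survives with 2 or 3 live
    # neighbors; a dead cell becomes alive with exactly 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(np.uint8)

# A "blinker" - three live cells in a row - oscillates with period 2.
grid = np.zeros((5, 5), dtype=np.uint8)
grid[2, 1:4] = 1
assert np.array_equal(life_step(life_step(grid)), grid)
```

Lenia replaces this binary grid, hard-coded stencil, and if/else rule with continuous cell states, a smooth convolution kernel, and a smooth growth function.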
Oh, and yeah - neural CA is basically a GNN (as all things should be haha).
In the discrete grid world, every cell (node) holds a multidimensional state (feature vector), communicates locally with its immediate "pixel neighbors", and outputs a new cell state.
8/
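That "neural CA as a GNN" view fits in a few lines. Here is a minimal NumPy sketch of one update step - the grid/channel sizes, the cross-shaped stencil, and the random/zero-initialized weights are all placeholder assumptions; a real neural CA learns the weights with gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, C = 16, 16, 8          # grid size and per-cell state channels (hypothetical)

def perceive(state):
    # Local "message passing": every cell sees its own state plus the
    # states of its 4 pixel neighbors, concatenated channel-wise.
    shifts = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
    return np.concatenate(
        [np.roll(state, s, axis=(0, 1)) for s in shifts], axis=-1
    )  # shape (H, W, 5 * C)

# Placeholder weights of the per-cell update MLP (untrained).
W1 = rng.normal(0.0, 0.1, (5 * C, 32))
W2 = np.zeros((32, C))       # zero-init, so the initial update is a no-op

def ca_step(state):
    h = np.maximum(perceive(state) @ W1, 0.0)   # same tiny MLP at every cell
    return state + h @ W2                       # residual state update

state = rng.normal(size=(H, W, C))
state = ca_step(state)
print(state.shape)           # (16, 16, 8)
```

Every cell runs the same shared MLP on its own state plus messages from its neighbors - exactly the local, weight-shared update that makes this a graph neural network on a grid graph.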
When successful people say they're not the best at anything, what they really mean is that they're not the best along any of the dimensions that we humans have a name for and can easily quantify (running 100m/chess/competitive programming/traveled the most/h-index...)
1/
But if we were to quantify "bestness" as the length of the resultant vector across all of the relevant dimensions - then the scoreboard changes.
Taken as a whole, they're usually the most well-rounded professionals.
2/
That's why we don't have a global scoreboard for "the best entrepreneur", "the best leader", etc.
These roles have an immense breadth and it's impossible to quantify and rank those people.
3/
In this video, I build an MLP (multi-layer perceptron) and train it as a classifier on MNIST (although it's trivial to swap in a more complex dataset) - all of this in pure JAX (no Flax/Haiku/Optax).
2/
I then add cool visualizations such as:
* Visualizing the MLP's learned weights
* Visualizing embeddings of a batch of images with t-SNE
* Analyzing dead neurons
3/
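The dead-neuron analysis from the last bullet boils down to a single reduction. A sketch in plain NumPy for brevity (the video uses pure JAX); the batch/layer sizes and the "kill the first 5 neurons" setup are purely illustrative:

```python
import numpy as np

def dead_neuron_mask(post_relu_acts, eps=1e-8):
    # A ReLU neuron is "dead" if it outputs (near-)zero for EVERY input
    # in the batch: its gradient is then zero and it stops learning.
    return np.max(np.abs(post_relu_acts), axis=0) < eps

rng = np.random.default_rng(0)
# Hypothetical post-ReLU activations: batch of 64 inputs, 128 neurons.
acts = np.maximum(rng.normal(size=(64, 128)), 0.0)
acts[:, :5] = 0.0                # artificially kill the first 5 neurons
mask = dead_neuron_mask(acts)
print(mask.sum())                # prints 5
```

In JAX the same mask can be computed from the activations returned by the forward pass, e.g. `jnp.max(jnp.abs(acts), axis=0) < eps`.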
[🔥 Learn ML for beginners 🥳] I recently said I'd be binge-watching fast.ai's Practical Deep Learning for Coders, and I did - here are my final thoughts!
I'm mainly going to contrast it with @coursera's course as that's the course I took back in late 2018.
1/
Verdict:
If you're in high school or university - or, more precisely, if you still find it difficult to create your own learning program (no experience with self-education) - I'd recommend @coursera's course: it's more streamlined.
2/
You'll know exactly when to read, watch, or code.
On the other hand, if you already have some experience (you've had tech internships/jobs), you're considering switching careers (again, you're experienced), or you simply want to integrate deep learning into your own domain...
3/
Again, thanks to @PetarV_93, @relja_work, Cameron Anderson, and Saima Hussain for being supportive throughout this journey!
2/
In this blog you'll find:
* The details on how @DeepMind's hiring pipeline is structured.
* Many tips on how to prepare for the world's top-tier AI labs (like DeepMind, OpenAI, etc.) - written for research engineering roles, but I guess many of the tips apply to research scientists as well.
3/
I'll be binge-watching @jeremyphoward and @GuggerSylvain's @fastdotai "Practical Deep Learning for Coders" course today and tomorrow! 8 lectures, ~2h each. It's going to be fun! 😂 Why?
Well:
1/
* I want to update my 2019 blog on getting started with ML, where I only recommended @coursera (and I realized just how bad my writing was a mere 2.5 years ago!).
I recommend you bookmark it but don't read it just yet, it should be ready by the end of this week!
* I want to be able to give better advice to "younger folks" in general. I get a lot of questions on my Discord as well (join it if you haven't: discord.gg/peBrCpheKE).
3/