Isaak
Working on something new. Prev comp neuro @ MIT, applied math @ Berkeley.
Apr 17 · 7 tweets · 3 min read
I’m leaving MIT and not continuing into my PhD. AI is coming too fast for humans to keep up.

But there might be a way: I realized digital humans are more feasible than most think. With capable AI researchers helping, maybe for $10B, maybe in under 10 years, on ~50,000 H100s.

Running a human brain might need only ~50,000 H100 GPUs. xAI already has 200,000+ H100s or better.

To anchor the discussion, I did some very rough napkin math: under fairly pessimistic assumptions using current high-resolution neuron models (e.g., Hodgkin-Huxley) and multi-state synapses, a human brain might require ~600 exaFLOP/s of compute, 700 GB of memory per GPU, and 24 GB/s of interconnect bandwidth. That's already within reach for today's clusters!
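One way this order of magnitude can come out of napkin math (the neuron counts, synapse counts, per-synapse FLOP cost, and timestep below are my own illustrative assumptions, not figures from the thread):

```python
# Rough dense-simulation estimate for high-resolution (Hodgkin-Huxley-style)
# neurons with multi-state synapses. Every constant here is an assumption
# chosen for illustration, not a measured value.
NEURONS = 86e9                  # ~86 billion neurons in a human brain
SYNAPSES_PER_NEURON = 7_000     # average synapse count (assumed)
FLOPS_PER_SYNAPSE_STEP = 100    # multi-state synapse update cost (assumed)
STEPS_PER_SECOND = 10_000       # 0.1 ms integration timestep (assumed)

synapses = NEURONS * SYNAPSES_PER_NEURON          # ~6e14 synapses
flops = synapses * FLOPS_PER_SYNAPSE_STEP * STEPS_PER_SECOND
print(f"~{flops / 1e18:.0f} exaFLOP/s")           # ~600 exaFLOP/s
```

With these assumptions the dense estimate lands around 6e20 FLOP/s, i.e. the ~600 exaFLOP/s ballpark the thread cites; shifting any one assumption by an order of magnitude shifts the answer accordingly.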

If much simpler neuron models (e.g., leaky integrate-and-fire) turn out to be enough, which still needs to be determined empirically, then a human brain might cost as little as ~2-3 petaFLOP/s. That's roughly a single H100 at FP16. (Memory and interconnect are likely the tighter constraints.)
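The much lower LIF figure can be reproduced with an event-driven estimate: instead of updating every synapse every timestep, you pay only per spike. Again, the firing rate and per-event cost below are my own assumptions for illustration:

```python
# Event-driven leaky integrate-and-fire estimate. Cost scales with spike
# events, not timesteps. All constants are illustrative assumptions.
NEURONS = 86e9                   # ~86 billion neurons
SYNAPSES_PER_NEURON = 7_000      # average fan-out per spike (assumed)
AVG_FIRING_RATE_HZ = 1.0         # ~1 spike/s average rate (assumed)
FLOPS_PER_SYNAPTIC_EVENT = 4     # weighted add plus bookkeeping (assumed)

events_per_s = NEURONS * AVG_FIRING_RATE_HZ * SYNAPSES_PER_NEURON
flops = events_per_s * FLOPS_PER_SYNAPTIC_EVENT
print(f"~{flops / 1e15:.1f} petaFLOP/s")          # ~2.4 petaFLOP/s
```

Under these assumptions the compute drops to ~2.4e15 FLOP/s, consistent with the ~2-3 petaFLOP/s range above, and makes clear why memory capacity and interconnect, not FLOPs, would become the binding constraints.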

But what neurons to run? What parameters? What connectivity?
Oct 3, 2024 · 7 tweets · 3 min read
Learning Chinese to intermediate fluency usually takes ~3000-4000h over years.

I did it in <1 year and <1500h, self-taught as a fun side project. If you've been wanting to learn a language, I recommend trying!

A quick thread on my strategies for effective language learning👇🧵
isaak.net/mandarin

水滴石穿 ("dripping water pierces the stone"): consistency is key.
Most people do a little Duolingo and then fall off. There's no way around consistency; find whatever works for you.
I like doing my reviews early in the morning combined with exercise: