Twitter-sized history of neuroscience (biased by my interests).
1760: Bayes: You should interpret what you see in the light of what you know.
1780: Galvani: Nerves have something to do with Electricity.
1850: Phineas Gage et al: Different parts of the brain do different things.
1850: du Bois-Reymond/von Helmholtz: Cells fire electrical impulses down axons.
1880: Golgi: There are beautiful things in the brain but they are all meshed together in a gloop.
1890: Ramon y Cajal: The beautiful things are all separate things called neurons.
1900: Sherrington: They connect to each other with synapses.
1900: Pavlov: Brains can learn predictions.
1910: Thorndike: Predictions of reward control behaviour.
1920: Helmholtz: The Brain is a Bayesian inference machine.
1930: Loewi/Dale: Synapses are chemical.
1940: Skinner: All behaviour can be accounted for by reward prediction.
1940: Tolman: No it can’t. You need a “map”.
1940: McCulloch & Pitts: Networks of neurons can perform computations.
1950: Hodgkin and Huxley: Those nerve impulses are caused by a dance of ionic currents. There are probably some ion channels.
1950: Hebb: Synapses can store memories.
1950: Lashley: Memories are represented across the whole brain equally.
1950: Penfield: Different parts of the sensorimotor cortex do different things.
1960: Milner: There are at least two forms of memory and hippocampus is only important for one of them.
1960: Sperry: Woah spooky!! The two hemispheres are actually different people.
1960: Hubel and Wiesel: Individual neurons represent real-world things in their activity.
1960: Barlow: Neurons transmit information and should do so efficiently.
1970: Marr: Cerebellar, Hippocampal and Neocortical circuits instantiate different computations.
1970-1980: Several, including Neher and Sakmann: Yes, there are ion channels.
1980: O’Keefe and Nadel: Hippocampal neurons form a map of space.
1980: Hinton/Sejnowski/McClelland/Rumelhart: Neural networks can optimise some cool shit.
1990: Van Essen: Blimey, the visual system looks complicated.
2000: Rao/Ballard/Olshausen/Field: Those real-world things that neurons represented are actually features of statistical learning.
2000: Schultz, Dayan, Montague: Dopamine neurons allow reward predictions to be learnt.
2000: Kanwisher: Different parts of the higher visual cortex do different things.
2000: Haxby: No they don’t. It’s all distributed (are we still having this same argument?)
2010: Mosers: Entorhinal cells form a better map of space.
2000-2010: Deisseroth, Miesenböck, Boyden et al: Fuck me, light.
2015: Hinton/LeCun/Silver/Hassabis and many others: Neural networks can optimise some REALLY cool shit.
2015-2020: This neural network looks just like my neurons.
2020: Jesus that’s a lot of neurons.
A couple I have since thought of:
1910: Brodmann: Cells look different in different bits of the brain.
2010: Ramirez & Tonegawa: Hebb's cell assemblies exist (Engrams)
Thread by Tim Behrens.