It's ironic that von Neumann described the circuits of his new invention (i.e., the computer) in terms of McCulloch-Pitts logic for neurons.
Walter Pitts was homeless at 13 and hid in the local library, where he read all three volumes of Russell and Whitehead's Principia Mathematica. nautil.us/issue/21/infor…
Gödel, however, later proved that Russell's ambition to frame all knowledge in mathematical logic ran up against fundamental limits of completeness. A similar sentiment was expressed by Norbert Wiener (while a student of Russell).
Wiener is the father of cybernetics and was an active collaborator with McCulloch and Pitts. McCulloch was a neuroscientist and Pitts was the mathematician who formulated the first model of an artificial neural network.
Frank Rosenblatt further enhanced the model by introducing adjustable weights and continuous-valued inputs.
Rosenblatt's perceptron was published in 1957; however, nine years earlier Alan Turing had been working on similar neural network ideas: alanturing.net/turing_archive…
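A minimal sketch of the difference between the two models (illustrative code, not from the thread; the function names and the toy AND-gate data are my own): a McCulloch-Pitts unit fires when enough of its binary inputs are active relative to a fixed, hand-chosen threshold, while a Rosenblatt-style perceptron computes a weighted sum and learns its weights from labeled examples.

```python
# Illustrative sketch: a McCulloch-Pitts logic unit vs. a perceptron
# that learns its weights with the classic perceptron update rule.

def mcculloch_pitts_unit(inputs, threshold):
    """Binary inputs, fixed hand-set threshold, no learning."""
    return 1 if sum(inputs) >= threshold else 0

def train_perceptron(samples, labels, epochs=10, lr=1.0):
    """Nudge weights toward any example the unit misclassifies."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# AND gate: the McCulloch-Pitts unit encodes it by hand (threshold=2),
# while the perceptron recovers an equivalent rule from the data.
data = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
print([mcculloch_pitts_unit(x, threshold=2) for x in data])  # [0, 0, 0, 1]
w, b = train_perceptron(data, labels)
print([1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x in data])  # [0, 0, 0, 1]
```

The point of the sketch is the shift Rosenblatt introduced: the McCulloch-Pitts unit has its behavior wired in by hand, whereas the perceptron arrives at its weights from feedback on its own mistakes.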
Minsky and Papert threw cold water on the perceptron in 1969. Coincidentally, Turing's work on neural networks went unpublished until 1968.
This created a long winter for neural networks. The ideas of connectionism were later revived in 1986 with the publication of the Parallel Distributed Processing (PDP) book mitpress.mit.edu/books/parallel… , which introduced back-propagation.
In PDP, restricted Boltzmann machines were introduced, and 3 years later LeCun formulated convolutional networks. Networks that were proven to 'learn' from data thus arrived 43 years after McCulloch and Pitts formulated the first model of a neuron.
However, all this focus on a model of a neuron is completely misplaced. The critical capability belongs to mechanisms that act and learn from feedback. Coincidentally, also in 1943, "Behavior, Purpose and Teleology" was published. medium.com/intuitionmachi…
"Behavior, Purpose and Teleology" emphasized the importance of feedback loops. This idea was rediscovered in modern deep learning in the form of GANs: medium.com/intuitionmachi…
Jürgen Schmidhuber, however, argues (in a new paper) that what he calls 'adversarial curiosity' was already formulated back in 1990 (4 years after PDP). arxiv.org/abs/1906.04493 .
Which brings us back to Kurt Gödel and his incompleteness theorem. It turns out that formal systems, Turing computability, and cellular automata all have self-referentiality in common: medium.com/intuitionmachi…
This problem of the openness of knowledge was recognized by Norbert Wiener when he wrote "Theory of Ignorance" (1906) at the tender age of 10! He wrote: "It is the battle for learning which is significant, and not victory."
The full text can be found in McCulloch's "Recollections of the Many Sources of Cybernetics": pdfs.semanticscholar.org/252a/d7cd2f224…
The history of computing has always been intertwined with the fascination of building machines that think. You can go all the way back to Aristotle for some of that history: theatlantic.com/technology/arc…
It is easy for people who have never worked with a digital computer to believe that logic is all it takes to create a thinking machine. This was the mistake of almost everyone before the invention of von Neumann's computing architecture.
However, the Gestalt psychologist Heinrich Klüver asked in 1946, "Could a “universal” computing machine compute a form—a perception—using logic alone?" The failure of GOFAI answers that question with a definite no.