In the pursuit of more advanced AI (or even AGI), one must be able to articulate a model of intelligence. That model doesn't have to be absolutely correct, but working without one is like navigating unfamiliar territory without a map. You will miss important landmarks. #ai #deeplearning
The Achilles' heel of younger AI researchers is their lack of exposure to alternative models of cognition. One has tunnel vision when optimization is one's only framework for cognition. Cognition is very messy, and acknowledging this should be the first step.
Behaviorism in psychology and reductionism in neuroscience are examples of pursuits that purposely ignored the messiness of the brain. Fortunately, we are moving toward more holistic approaches in these areas. The same should happen in AI research.
Enactivism is the model of cognition that best approximates discoveries in neuroscience, evolution, and psychology. That is why models inspired by this approach are more likely to serve as a basis for cognition than other approaches.
Enactivism has a history that parallels the history of connectionism. Its origins lie in Norbert Wiener's explorations of cybernetics, followed by Maturana and Varela's ideas on autopoiesis. One should not be blind to the richness of ideas in this space. #ai #autopoiesis
Ideas from the past that have not led to fruition shouldn't be ignored. Artificial Neural Networks were ignored for decades, but as technology advanced, ANNs and connectionism were shown to be (surprisingly to many) extremely valuable.
Many times, old ideas aren't wrong; they are just ahead of their time. But how does one explain the gap in exposure to many of the old ideas of cybernetics? These were bleeding-edge ideas in their time. The problem is that they are absent from undergraduate curricula.
The same can be said about research in Complexity Science. Few are exposed to the unique perspective that these approaches lend to understanding intractable domains.
We live in a society populated with people who hold the mistaken belief that everything can be understood from a mechanistic perspective. That is because our educational system focuses only on solving narrow problems.
You can actually see this cognitive dissonance in software development. Computer science graduates are mostly unfamiliar with the holistic methods found in agile development for handling the intrinsic complexity of software projects. Everyone has a waterfall mindset.
This waterfall mindset is very strong in the sciences and engineering. It's the same mechanistic viewpoint. The appeal of DL is that its methods appear mechanistic. But DL is a holistic approach. What can be more holistic than the requirement to grow your solution?
Growing solutions is more like gardening than bricklaying. It's more like spiral development than waterfall development. Thus, to crack the AGI problem one needs to adopt a holistic viewpoint. The reductionist approach promoted by GOFAI is a total dead end.
The historic appeal of GOFAI comes from fascination with the power of artificial logic (i.e., computers) and the reductionist perspective of linguists. It's further fed by our human cognitive bias toward compact and tractable explanations.
Connectionism, cybernetics, and enactivism all look very distasteful to the reductionist mindset. How can a method of growing a solution be a systematic approach? That is the point, though: rational thinking is just a subset of systematic thinking, not its entirety.
Agile processes look haphazard, but they are in fact systematic. They are systematic in that they apply the right kind of discovery process to the right kind of complex problem. Different knowledge processes are required for different kinds of problems.
So when mechanistic thinkers encounter holistic thinkers, it's not very different from a waterfall developer encountering agile methods for the first time. One cannot understand something if one does not have an internalized mental image onto which an outside experience can be mapped.
Well, that's a kind of "ignorance". I guess someone else might have a taxonomy of ignorance. The really pernicious kind is the inability to understand something because one doesn't have the internal knowledge tree to hang an exterior idea onto.
It's like symbols without any grounding. It's like marketing folks needing to talk about technology but all they know is how the jargon is put together in a sentence! It's like HR or VCs judging applications based on the existence of buzzwords.
Connectionism, Cybernetics, Enactivism, and Complexity are just ungrounded symbols to the reductionist and mechanistic thinkers who encounter them. Even worse, they associate them with ideas that come from mystics or hippies who are high on something.
The saddest thing, though, is that our political and economic leaders are completely clueless about complexity. They treat the complex domains they are responsible for in the same way a behaviorist would frame cognition.
The Gordian knot is that we are endowed with paleolithic brains, medieval institutions, and god-like technologies. It's time to upgrade the first two with god-like thinking.