[ Cybernetics, Enaction & the End of Functionalism ] 🧵🪸
To understand enactive modeling (since any scientific and rational representation is an evolutionary model, experimentally correlated), one needs to understand its socio-historical roots in the cybernetic, functionalist style,
and how this functionalist, cognitivist style has been "metabolized and digested" comprehensively into the embodied, enactive style of thinking:
Another way to say it: enaction is not (or only superficially and intersubjectively) anti-cognitivist; it is also, more deeply and phenomenologically, meta-cognitivist, at least for me. Enaction is "experimental" in the deeper sense that it is also experiential.
Isn't it the dual (1P-3P) sense of "empirical"?! Welcome to the "Not one, not two" logic².
For me this modern paragon of functionalism, i.e. cognitivism (computo-representational functionalism), rooted in cybernetics, is outdated (hence the current "perturbations" at the planetary level),
signaling the end of functionalism's domination as the orthodoxy of technoscientific rational thinking.
Enacting collectively an extended and more comprehensive second order rationality² is needed.
(My scybernethics was always, ethically, only meant to be an inspirational example)
[ Understanding the Connectionist ("genAI") Epistemological Disruption ] 🧵 🪡
I will quickly articulate some key rationales in this thread to explain the disruptive role of "generative AI" and its unexpected success, which is also at the foundation of my "scybernethics".
In a nutshell:
connectionist modeling is a de-construction of one pillar of traditional natural & cognitive science: functionalism.
"Connectionist" style modeling (Machine learning, Artificial neural networks and associated "learning rules"), which I prefer to more rigorously call generically PDP modeling (for Parallel and Distributed Processing, Rumelhat & McClelland, 1984, about micro-cognition),
Thanks to Hui's "Machine and Sovereignty", I am beginning to gain a deeper understanding of Hegel's dialectic and phenomenology, and to see many resonances with my scybernethics journey (beyond naïve projections).
E.g., in §5. "Individuation of the Spirit as historical Process"
1/3
1. Technologies as sensorimotor externalizations.
2. Computers as quasi second-order machines (machine², i.e. machines simulating machines and machining us); see the small sketch below.
2/3
3. The ambijective gesture as an embodied & phenomenological internalization (leading to an anamnesia) of the externalized, namely the sciences and technologies of cognition.
Thus, seeing scybernethics as the expression of a (philo-techno-scientific) *methodology* of individuation.
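The sketch mentioned in point 2 above, assuming nothing more than a host language interpreting a made-up miniature instruction set: the host machine (here, the Python runtime) simulates a simpler machine that executes its own little program.

```python
# Toy illustration of machine²: a host machine (the Python runtime)
# simulating a miniature machine that runs its own small program.
def run_toy_machine(program, x):
    """Interpret a list of (opcode, operand) instructions on one register."""
    reg = x
    for op, arg in program:
        if op == "ADD":
            reg += arg
        elif op == "MUL":
            reg *= arg
        elif op == "NEG":
            reg = -reg
    return reg

# A small "program" for the simulated machine: computes -(3 * (x + 2))
program = [("ADD", 2), ("MUL", 3), ("NEG", None)]
print(run_toy_machine(program, 5))   # -> -21
```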
[ Politics of Modeling: Why I don't like PP-FEP ] 🧵
As an enactive thinker (more than an "enactivist"), my problem with PP-FEP is political: the politics of modeling / the style of thinking in the science of cognition.
For me it sits too much on the empirical side, a pure product of the current dominant utilitarian scientific trend. Its probabilistic foundation ("Bayesian inference") and its locally global conception, while "efficient",
thereby tend to mask the underlying emergence of coherence from (biological) micro-processes and put a screen over their intelligibility by reducing them to statistical dynamics.
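For concreteness, a minimal sketch of the kind of "Bayesian inference" step such probabilistic models rest on, assuming a made-up two-state example with arbitrary numbers: a prior belief is combined with a likelihood to yield a posterior, and the "surprise" that FEP-style accounts speak of is the negative log evidence.

```python
# Minimal Bayesian-update sketch (made-up states and numbers, only to show
# the statistical step that PP/FEP-style models build on).
import numpy as np

states = ["light", "dark"]            # hypothetical hidden states
prior = np.array([0.5, 0.5])          # prior belief over the states

# Hypothetical likelihoods P(observation | state) for one sensory observation
likelihood = np.array([0.9, 0.2])

# Bayes' rule: posterior ∝ likelihood × prior
posterior = likelihood * prior
evidence = posterior.sum()            # P(observation)
posterior /= evidence

# "Surprise" = negative log evidence, the quantity FEP-style accounts
# describe organisms as implicitly minimizing.
surprise = -np.log(evidence)

print(dict(zip(states, np.round(posterior, 3))), round(float(surprise), 3))
```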
[ Relating Tekhne and Episteme (Enacted Biocognition) through Implicit Processual Memory ] 🧵🪡
Understanding technicity/Tekhne implies understanding modern technologies as the production of machines *externalizing and concretizing our normative gestures, first bodily and then mental*.
This understanding allows us to shed light on the nature of technical artifacts, notably automatic computing machines, and their power to shape our embodied minds, but also to highlight certain crucial keys to better understanding ourselves and extending our self-knowledge.
Aristotle, in his Nicomachean Ethics, describes how Tekhne (skill & craft) embodies a practical approach to knowledge, acquired through concrete experience and skillful practice.
When I began my spiritual journey at the age of seventeen (1982), having been confronted with the idea of death, the first books I read were "The Reign of Quantity and the Signs of the Times" by René Guénon and "Zen Mind, Beginner's Mind" by Suzuki.
Then I began to be interested in the symbolism of numbers, that is, numbers not as quantities but as *qualities*: one as unity, two as duality, three as trinity, etc. (I recently discovered that Peirce also followed this path), and also in heraldry and symbolism in general.
Through suspension, meditation and reflection, I discovered for myself some principles of this elementary way of thinking hermeneutically, of thinking thinking formally and processually:
[ Mechanical Computation & Abstract Weaving: Old Story ] 🧵🪡
Walking in the south of France (Pézenas), I was surprised, while visiting an old private mansion with medieval architecture intended to promote the arts and crafts, to find myself in front of a Jacquard loom (1801).
The correspondence with the "computer" is obvious:
1. Use of "binary" punch cards to automate a complex task, like the first electronic computers: hole/no hole = 1/0 (Boolean binary logic).
2. "Weaving": Ada Lovelace, the first algorithmic programmer, spoke of programming as "weaving algebraic patterns".