Sometimes you just never know who reads your blogs and tweets. One of the most innovative developers I followed was James Strachan @jstrachan. I was completely surprised when, the one time we met, he told me he had read my blog about software engineering.
In my former life, I used to really enjoy writing about software development: things like extreme programming, agile development, refactoring, etc. So I actually know something about this stuff and am quite opinionated about it. Of course, it's different when stuff is new.
The commonality, however, between software development and artificial intelligence is that both fields deal with extreme complexity. The agile methods invented two decades ago were motivated by the need to control that complexity.
Complexity had to be managed in a way that accounted for the weak cognitive capabilities of human beings. It was previously assumed that complexity could be planned away. Agile methods are a rope-a-dope tactic for handling complexity.
So the edge effect in the interaction between complexity and cognitively limited beings is something that has always fascinated me. Unlike many analytic people, I always found it clear that rigidity of thought had its limits.
Only very few of my current audience have read my old blog, Manageability. I suspect most of its readers have retired by now. It was self-hosted and I got tired of trying to keep it up. You can find it archived: web.archive.org/web/2020*/mana…
If you recall back when agile methods were new, there was a ton of pushback from the waterfall analytic types. Our educational system teaches us nothing about complexity, so when complexity actually rears its face we can't recognize it and we don't have the tools to fight it.
I don't know, though, what is more complex than using a complex system (i.e., a deep learning network) to solve a complex problem. Are we going to address these complexities the way we did with the waterfall guys in the past, or are we going to adopt some 'woo' language?
I'm going with the 'woo' language and I don't give a damn if rationalists don't like it. Because when faced with complexity, we've seen how well rationalists have performed (see: current pandemic).
It's often the case that the methods of a master look extremely sloppy to the bookish, educated type. It's as if the master breaks so many rules. In today's world, we live in a sea of bookish types: the kind who can recite verbatim what they've read.
People don't learn by reading, they learn by practice. You can't become fluent in a language without practice. You can't play an instrument well without practice. You can't be good at math without practice. You can't be good at programming without practice.
The masters are people who are so advanced in their practice that they haven't had time to put it down in a book yet. In fact, a lot of what's written is in unfinished form. C.S. Peirce never finished anything because he knew it couldn't be finished.
The ultimate 'woo' language is what C.S. Peirce wrote. What made his writing so difficult was that he kept inventing new words for concepts he couldn't find an English word for. So his writing is just excruciating to read.
The madness of Peirce is that everything is evolutionary; nothing is ever static. So his writing is itself an evolutionary process. One interesting insight is what drives innovation. Hoffmeyer called this 'Semiotic Freedom'. researchgate.net/publication/28…
What I find very weird is that people are surprised by the notion that someone taught himself a skill. As if a teacher were absolutely necessary to learn anything. As if you can't learn anything by reading and experimenting on your own.
"Wow, he's self-taught, he must be extremely gifted!" Are people, in general, incapable of learning anything without a teacher?
But to be perfectly fair, I'm in awe of kids who teach themselves how to play music. I don't think I have the passion to do that! So I suspect this self-taught thing has everything to do with passion and talent.
I'm coming to the realization that the GOP isn't a conservative party but rather an incoherent group of disenfranchised parties. Any group that has trouble pushing its agenda sees an opening by joining the GOP. It's also a business model to milk the disenfranchised.
Trump perhaps saw this, so he reached out to any and every fringe group for its support. Any group, no matter how abhorrent its views, is welcome in the GOP. But how do they handle conflicts between the parties in the group?
They actually don't need to, because incoherence is the mode of operation. Trump has consistently been logically inconsistent, and the same goes for the parties within the GOP. They band together not because of commonality but because of shared disenfranchisement.
I lived in Manhattan in the last half of the 1990s and left in 2001 before the twin towers fell. So downtown NYC seems eerily strange to me without the towers. But when I hear the 9/11 slogan "Never Forget," I am unable to understand what that even means.
Remembering history should mean that you don't repeat the mistakes of the past. But has the nation learned from its mistakes? Did we go to war under false pretenses? Did we overextend our presence in Afghanistan?
What happened after 9/11 was a cascade of mistakes, compounded one after another. In my opinion, the swift victory against the Taliban via the CIA was one of the few things done right. But it has been downhill ever since.
Science is being eaten up by deep learning, a fact that nobody can ignore.
But what's unfortunate is that nobody understands deep learning well enough to set up experiments and interpret the results correctly! It's damn good at making predictions but damn terrible at explaining anything!
The uncertainty principle of deep learning: the more generalized one's network, the less likely it is to be interpretable! medium.com/intuitionmachi…
Neuroscientists don't understand cognition; rather, they understand how parts of the biological brain function. These subjects only partially overlap.
Is there any siloed field that can claim an understanding of general intelligence? I highly doubt it. It's an interdisciplinary problem where most of its practitioners are without an academic home.
Do deep learning practitioners understand general cognition? I doubt it. They may partially understand their connectionist architectures, but these are just a sliver of the capabilities available to a general intelligence.