The notion of differences between individuals and differences across classes of individuals isn't precisely quantifiable for brains. I think neuroscience is doing civilization a disservice by insisting on reductionist theories to quantify people.
A natural human tendency is to assume that other people think the same way we do. Some people are surprised to learn that others don't vocalize their thoughts. Others are surprised to find that some people can't visualize their thoughts.
But the brain is constructed by the accumulation of a multitude of mental habits. We grow by favoring one kind of habit over another. Many habits are not necessary, but we favor them because they are what we are used to.
Neuroscience has an implicit bias that humans are all wired in the same way. There is of course an innateness in humans that distinguishes us from other primates. However, this should not imply that we are as standardized as cookie-cutter machines like computers.
The reason you can take software from one computer and have it run on another computer is that computers are standardized. This is not the case for biological brains.
So the notion that you can upload programs into brains is a seriously flawed idea. The neural language of a person's brain is unique to that brain.
The reason you can understand what I am writing, however, is that our brains have developed an interpreter to parse English text. But how we interpret it differs from individual to individual, because our neural languages differ.
We have such poor models of sameness and difference between brains that we build all kinds of laws and moralities resting on false assumptions about how reality and society should work.
We discriminate against whole classes of people because we somehow believe in a fuzzy notion of innateness and a fuzzy notion of the natural order of things. It's a lot of BS that many scientists are not even consciously aware of.
The root problem is that modern language favors nouns over verbs. We have pre-Darwinian thinking that biological things must be fixed, that all human brains must think the same. Yet we ignore the obvious reality that children's thinking about the world grows and changes.
Buckminster Fuller said, "I know that I am not a category. I am not a thing--a noun . . . I seem to be a verb, an evolutionary process--an integral function of the universe."
It's going to take time, a whole lot of precious time, before scientists accept process thinking. Until then, we have an entire civilization that thinks in a completely backward manner.
Wolfram explains why his theory of physics emphasizes causation (computational irreducibility) and causality (observer reference frame of computational reducibility). It's a fascinating model of reality that I also subscribe to.
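The computational irreducibility Wolfram refers to can be illustrated with a minimal sketch (my own example, not Wolfram's code): the Rule 30 cellular automaton is a system widely cited as computationally irreducible, meaning there is no known shortcut to its state at step t other than simulating all t steps.

```python
# Rule 30 cellular automaton: a standard example of a system believed to be
# computationally irreducible. Each new cell is left XOR (center OR right).
def rule30_step(cells):
    n = len(cells)
    # Periodic (wrap-around) boundary for simplicity.
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def run(width=31, steps=15):
    cells = [0] * width
    cells[width // 2] = 1  # start from a single seed cell
    history = [cells]
    for _ in range(steps):
        cells = rule30_step(cells)  # no shortcut: every step must be computed
        history.append(cells)
    return history

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Running it prints the familiar chaotic triangle; the point is that, unlike a closed-form formula, the only way to learn row 15 is to compute rows 1 through 14 first.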
Wolfram writes "Consciousness is not about the general computation that brains—or, for that matter, many other things—can do. It’s about the particular feature of our brains that causes us to have a coherent thread of experience."
Wolfram is unique in that he identifies the possibility of kinds of consciousness that are alien to human consciousness. There is not just one kind of consciousness, but many kinds, each creating an empathy with reality in distinct ways.
Yes, this is an inconvenience that I (for different reasons) experience. But it is an inconvenience born of the rich uniqueness of our origins. Let's not demand that everyone understand our unique circumstances. Still, I can relate.
I've got a personal dislike for my name, but there's too much inertia for me to change it. My name is the 'John Smith' equivalent in Spanish: completely generic, without identity. Worse, it projects a false identity. I don't even speak Spanish!
Anyway, minor rant. It's the inconveniences and misunderstandings that make life human and interesting. A life without inconvenience is a life devoid of meaning.
There is a massive asymmetric information gap between knowing a theory is wrong and discovering the correct theory. Becoming aware of flaws is just the first step in a very long journey. But if you never see the flaws, you never take the journey and thus never get anywhere.
This is a double-edged sword. Sometimes we see flaws that are simply not there and set off on a journey of discovery along a deceptive path: the path one sticks to because it has no forks, the one that continually confirms one's own biases.
Persistence requires a level of naivety; that naivety is what keeps us motivated. If we knew how long the journey was before we began, we might never have started it at all.
Lazy Twitter: What was the name of that hypothesis that the technologies that were not the best but were most widely distributed would become the ones that take over the world? Do you know what I'm referring to?
I seem to recall that it was also used as an argument why the Apple M1 was so fast. I don't recall though the name of the theory or who came up with it. debugger.medium.com/why-is-apples-…
It's also related to how the successful programming languages are not the most elegant or powerful ones, but the ones that have the best fit at the time of their adoption. I also forget what they called this observation!
Lazy Twitter: What is a good metaphor for biology?
I'm asking this because the usual bias is that, since biology is made of wet stuff, we think of it as a massively scaled chemical engineering process.
We don't think of biology like the dry stuff we find in semiconductor technology, where the scope of control is the movement of a few elementary particles (i.e., electrons and holes).