Having looked pretty deeply at various blockchain tech stacks over the years, this thread seems dead on.
People are holding on to *the dream*, and the fact that the tech will impoverish millions, help destroy the one world we share, and fail to avoid aggregation is immaterial.
I can't stress enough just how transparently wrong the "decentralised" claims are should you care to look.
Not the dream of decentralisation (whatever that is), but the lived reality of all these systems. Bitcoin? Mostly traded through exchanges now. And they will be regulated.
"web3"? Well, you can't find anything...so you get alt-stack, slower, less capable versions of systems we already have:
Can't store NFT data on-chain because it's too expensive? Welp, there's gonna be a middle-person for that...and it will centralise and aggregate.
It's a very strange version of "being early is the same thing as being wrong"; in this case they're wrong *because* they're early.
As this plays out, the forces that created aggregation and centralisation (DNS, e.g.) in the web's distributed underpinnings will take hold just as effectively here. One "solution" at a time.
Why? Because humans are the users.
If you want a "decentralised web" (whatever that is), like, hardcode IP addresses? Add new trust roots into your browser keystore? It's all there.
I can't say it enough: "vdom" is not fast. It is slow.
The only *defensible* claim is that it isn't as slow as one might think given how much overhead it adds...but that doesn't make it competitive or good.
Computing diffs through something like React's reconciliation algorithm is a clever way to avoid needing to have knowledge of the potential changes that can occur on either the JS or DOM side.
But it's correspondingly very expensive.
Reactive systems that constrain *either* how you update the DOM (e.g. Lit's template system) *or* cabin the scope of JS side effects (Svelte's "reactive declarations", FAST's Observables) deliver superior performance because they don't need to model + compare the whole world.
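To make the cost difference concrete, here's a toy sketch (all names and structures invented for illustration; "DOM nodes" are plain objects so it runs anywhere). The point is the shape of the work: a diff walks the whole tree to find one change, while a reactive binding already knows exactly which node a value feeds.

```javascript
let comparisons = 0;

// Virtual-DOM style: compare the ENTIRE old/new trees to discover
// what changed -- work proportional to tree size, not change size.
function diffUpdate(oldNode, newNode, dom) {
  comparisons++;
  if (oldNode.text !== newNode.text) dom.text = newNode.text;
  for (let i = 0; i < newNode.children.length; i++) {
    diffUpdate(oldNode.children[i], newNode.children[i], dom.children[i]);
  }
}

// Reactive style: a binding subscribes a specific node to a specific
// value, so an update is work proportional to what actually changed.
function makeSignal(value) {
  const subs = [];
  return {
    set(v) { value = v; subs.forEach(fn => fn(v)); },
    subscribe(fn) { subs.push(fn); fn(value); },
  };
}

// A 1000-node tree in which exactly one leaf changes.
const leaf = i => ({ text: "item " + i, children: [] });
const oldTree = { text: "root", children: Array.from({ length: 999 }, (_, i) => leaf(i)) };
const newTree = {
  text: "root",
  children: oldTree.children.map((c, i) => (i === 42 ? { ...c, text: "changed" } : c)),
};
const dom = JSON.parse(JSON.stringify(oldTree));

diffUpdate(oldTree, newTree, dom);
console.log(comparisons); // → 1000: visited every node to find one change

const title = makeSignal("item 42");
const node = { text: "" };
title.subscribe(v => { node.text = v; });
title.set("changed"); // touched exactly one node
console.log(node.text); // → "changed"
```

Real frameworks add heuristics (keys, bailouts, memoization) on the diff side and fine-grained dependency tracking on the reactive side, but the asymmetry above is the underlying argument.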
Browser teams (the folks who work on UI) don't think of content as "their problem". For historical reasons, they care about TLS and that has helped them make common cause with security interests.
But no such enlightenment has occurred around performance...and in particular, perf so bad that it endangers accessibility.
Platform teams, meanwhile, focus on making the runtime faster, rather than building common cause between users on high-end and low-end devices.
Stop trying to sculpt David with a JS chainsaw and get yourself an HTML/CSS chisel.
Like, it *could* be an SPA, in the same sense that one *could* use a solid rocket booster to power one's car.
How do I know it's ridiculous to apply this much JS to the problem?
Because I helped build e-commerce sites with similar features (filtering, carts, etc.) that had to work on 4.0 browsers over 33.6 modems to WebTV boxes in 1999.