Browser teams (the folks who work on UI) don't think of content as "their problem". For historical reasons, they care about TLS and that has helped them make common cause with security interests.
But no such enlightenment has occurred around performance...and in particular, perf so bad that it endangers accessibility.
Platform teams, meanwhile, focus on making the runtime faster, rather than building common cause between users on high-end and low-end devices.
At the extreme, platform managers can be convinced to invest in dealing with some of the squeakiest wheels, but not in effective ways.
Using browser UI and platform/devtools pressure to push back on behavior that will predictably suck for users is ~~controversial~~
Nevermind that this is the *EXACT* playbook that has worked so effectively in the security domain.
Interlocking platform, browser, and devtools changes have migrated a RIDICULOUS amount of traffic to be encrypted on the wire.
What browser makers put in ever-present UI is what they *actually* care about.
Browser makers, as a class, DO NOT CARE about sites that are so slow as to be functionally inaccessible by a huge majority of users.
BUT -- they will say -- look at all the features we added! Lite modes! Reader modes! Slow-link detection!
The common thread between these features is that, like the platform performance optimisation work that toils on endlessly, they try to effect change without ever confronting the wealthy and the enfranchised with the reality that these websites are failures.
So decision makers -- CEOs, CTOs, marketing leads, etc. -- are never perturbed with the reality that their own websites are failing everyone.
But they'll sure as hell know if there is stray mixed content.
Into this void a disturbing, now-perennial denial has emerged.
Web developers do not *have* to fix performance for the least of these, our users, and so they do not.
Instead, they lean into the most "modern" tools.
Are they good?
No. Not in terms of results.
Let me say that again: modern web development is a failure.
This is not even the dozenth piece of public-sector, critical infrastructure I've seen sunk by Modern Webdev.
Are there lots of things going on here?
Yes! And that's the failure.
The systems developers are encouraged to use are so complex they *TEND* to fail.
*CAN* they be good? Of course.
But the reduction to a single example is, like all discussions of performance, an epistemic failure.
You have to look at the distributions.
...and when you compare pages that are primarily HTML and CSS against those that are mostly JS, the share that does OK by most users most of the time is shockingly different.
RIP my notifications as everyone piles into "ok, but doesn't Core Web Vitals cause people to care?"
And the answer is: at the margin.
Do we need that margin of improvement? Yes, but we need others too.
Stop trying to sculpt David with a JS chainsaw and get yourself an HTML/CSS chisel.
Like, it *could* be an SPA, in the same sense that one *could* use a solid rocket booster to power one's car.
How do I know it's ridiculous to apply this much JS to the problem?
Because I helped build e-commerce sites with similar features (filtering, carts, etc.) that had to work on 4.0 browsers over 33.6 modems to WebTV boxes in 1999.
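To make the "chisel" concrete, here is a minimal sketch of filtering done the HTML-first way: a plain GET form over a server-rendered list, so it works with zero client-side JS. The URL and field names are hypothetical, not the markup from those 1999 sites; script can be layered on top later without anything breaking when it fails to load.

```html
<!-- Hypothetical filter UI: a plain GET form, no JS required. -->
<form method="GET" action="/products">
  <fieldset>
    <legend>Filter by category</legend>
    <label><input type="checkbox" name="cat" value="shoes"> Shoes</label>
    <label><input type="checkbox" name="cat" value="hats"> Hats</label>
  </fieldset>
  <label>Max price <input type="number" name="max_price" min="0"></label>
  <button type="submit">Apply filters</button>
</form>

<!-- The server renders the filtered results straight into the response. -->
<ul>
  <li><a href="/products/42">Waterproof boots, $79</a></li>
  <li><a href="/products/57">Trail runners, $65</a></li>
</ul>
```

The point of the sketch: the filter state lives in the URL, the work happens on the server, and the page is usable on any browser and any connection; enhancement is optional rather than load-bearing.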
Combined with now-rampant NIMBY-ism from the last generation to enjoy tax-funded higher ed, spiraling property costs mean the dream of owning a reasonable home and starting a family is a receding vision.
The "way up" is "supposed to be" tech -- one of the few industries often paying enough to get you a slice of California. And for the lucky few, it absolutely is.
My contention for something like a decade has been that if your tree is closed for half the year, you're "kept source", regardless of the license the code eventually drops with.