It's hard to overstate the technical achievement by teams across Adobe and the Chromium community. Bringing Photoshop to the web has been a massive undertaking:

Particularly and specifically proud of all my Project Fugu 🐡 friends and colleagues.
As @fractorious and Nabeel allude to in the post, this has been a long, long journey. Getting the web platform into a place where folks could even *consider* projects this ambitious has been a huge lift.

Thankfully, Adjacency Theory provided a roadmap:

Photoshop on the web is huge. But beyond that, what it signifies is a Big Freaking Deal.

When the web gains features to support high-end productivity, those same capabilities can be combined to unlock whole new classes of apps that suddenly don't require heavyweight installs.
This is why we started Project Fugu 🐡: to bring the impossible (on the web) within reach.
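A concrete pattern Fugu-era apps lean on is capability detection: probe for an API before enabling the feature it powers. A minimal sketch — the API names below are real web APIs, but the helper and its name are illustrative, and it takes the global object as a parameter so it can run outside a browser:

```javascript
// Probe a global-like object for a few Fugu-era capabilities.
// In a browser you'd pass `window`; here it's a parameter so the
// sketch stays testable anywhere.
function detectCapabilities(globalObj) {
  const nav = globalObj.navigator || {};
  return {
    // File System Access API (powers "open/save local files")
    fileSystemAccess: 'showOpenFilePicker' in globalObj,
    // Web Share API
    webShare: 'share' in nav,
    // Screen Wake Lock API
    wakeLock: 'wakeLock' in nav,
  };
}

// Usage in a browser: const caps = detectCapabilities(window);
// then gate UI, e.g. only show an "Open file" button when
// caps.fileSystemAccess is true.
```

Apps like Photoshop on the web can ship one codebase and progressively light up features as engines catch up, rather than blocking on universal support.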

If this milestone is anything to go by, it's working.

Congrats to everyone who has poured so much professional effort into this over a few tough years.

• • •

More from @slightlylate

27 Oct
Look, I get that a lot of y'all want to dismiss the incredible scale of Web Components deployment these days because you're bought into framework(s) that are bad at DOM and don't play well with others.

But what if that wasn't the future?

Or even the new normal?
What if you didn't need to rely on ad hoc forks of HTML and JS, or at least got decent performance and interop for your trouble?

Dare to dream of a world that's already here.
It isn't helpful to point out how badly the failed promises of vdom and "concurrent mode" played out at scale, so consider instead what can be gained from *actually* writing components once... with minuscule runtimes... without global coordination.
25 Oct
You can tell so many stories from the freeform responses to the State of CSS survey, but none of them support the idea that the #applebrowserban on competing engines is pro-developer.

Back in 2012 when @stshank wrote that piece, WebKit was near the front of the pack, so lack of competition was less pressing.

That was a long, long, long time ago:

20 Oct
Having looked pretty deeply at various blockchain tech stacks over the years, this thread seems dead on.

People are holding on to *the dream*, and the fact that the tech will impoverish millions, help destroy the one world we share, and fail to avoid aggregation is immaterial.
I can't stress enough just how transparently wrong the "decentralised" claims are, should you care to look.

Not the dream of decentralisation (whatever that is), but the lived reality of all these systems. Bitcoin? Mostly traded through exchanges now. And they will be regulated.
"web3"? Well, you can't find anything...so you get alt-stack, slower, less capable versions of systems we already have:

18 Oct
I can't say it enough: "vdom" is not fast. It is slow.

The only *defensible* claim is that it isn't as slow as one might think given how much overhead it adds...but that doesn't make it competitive or good.
Why is vdom slow?

It does too much work.

Computing diffs through something like React's reconciliation algorithm is a clever way to avoid needing to have knowledge of the potential changes that can occur on either the JS or DOM side.

But it's correspondingly very expensive.
Reactive systems that constrain *either* how you update DOM (e.g. Lit's template system) *or* cabin JS side effects (Svelte's "reactive declarations", FAST's Observables) deliver superior performance because they don't need to model + compare the whole world.
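The cost gap is easy to see in a toy model. This is not React's actual reconciliation algorithm, just an illustrative sketch: a vdom-style update has to walk and compare the whole tree even when a single leaf changed, while a reactive binding that already knows its target does constant work.

```javascript
// Toy tree node: { id, text, children }. Both trees are assumed to
// share the same shape, so only text changes are modeled.
function diffTree(oldNode, newNode, patches = []) {
  // A vdom-style diff visits *every* node, changed or not.
  if (oldNode.text !== newNode.text) {
    patches.push({ id: newNode.id, text: newNode.text });
  }
  for (let i = 0; i < oldNode.children.length; i++) {
    diffTree(oldNode.children[i], newNode.children[i], patches);
  }
  return patches;
}

// A reactive binding (à la Lit/Svelte/FAST) already knows which node a
// piece of state maps to, so an update is O(1): no tree walk at all.
function targetedUpdate(id, text) {
  return [{ id, text }];
}
```

Both produce the same single patch for a one-leaf change; the diff pays a visit to the entire tree to discover it.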
8 Oct
So @maxlynch hit me right in the feels with this one:

I have deep, deep regrets that I have not been able to convince browser makers to refuse to load 2.7MB of JS, critical path, served uncompressed.
Browser teams (the folks who work on UI) don't think of content as "their problem". For historical reasons, they care about TLS and that has helped them make common cause with security interests.
But no such enlightenment has occurred around performance...and in particular, perf so bad that it endangers accessibility.

Platform teams, meanwhile, focus on making the runtime faster, rather than building common cause between users on high-end and low-end devices.
7 Oct

Might need to be off Twitter while @fugueish and @justinschuh *ahem* digest this press release.
Big reveal: it's Chromium!

But secure?
/me checks their website

*surely* there must be a description of how this thing improves sandboxing, allocators, control-flow hardening... something?

