JavaScript WeakRefs and finalizers are on the way to becoming a standard. This is terrible, terrible news and if you use JS you should know some stuff. github.com/tc39/proposal-…
PART THE FIRST: WeakRefs are toxic
WeakRefs are like cigarettes. You might really want one, you might convince yourself you need one, but it’s never good. The proposal itself says things like “they are best avoided if possible”. Not making it up.
If your code counts on WeakRefs behaving predictably, “it’s likely to be disappointed”; they may work “much later than expected, or not at all”.
What about finalizers? “Important logic should not be placed in the code path of a finalizer.” If you do, code changes that seem unrelated “could lead to data loss.” Browser auto-updates to a new version? Could cause data loss. Yeah.
“They don’t help the garbage collector do its job; rather, they are a hindrance. Furthermore, ...”
If you use WeakRefs or finalizers, you are relying on a feature that will not work predictably. Your code will be tricky to get right and impossible to test reliably. It may behave differently in different browsers and even different browser versions. Excited yet?
“For this reason, the W3C TAG Design Principles recommend against creating APIs that expose garbage collection.” That’s what I’m screaming.
The folks who’ve worked on this standard get high marks for being up front about the drawbacks. The only point of disagreement is that to me, drawbacks are a reason not to do something.
PART THE SECOND: All WeakRef examples are wrong
The classic WeakRef use case (and the only one mentioned in the rationale document) is creating a cache. It’s great to want to cache things, and you are great. Nobody loves writing resource management code by hand.
But typically we cache large objects (small ones aren’t worth caching), and because reuse is so valuable, cached objects should live as long as reasonably possible.
So who should manage that kind of resource? Obviously the GC, right? A system that is carefully tuned for *small* objects, and to recycle them *quickly*? No. That is a terrible idea.
Look, every cache has a policy that says what objects get cached and when cache entries are expired. The problem to be solved, I guess, is that people dislike implementing policies. It's boring code. They want an easy default. But GC is the *worst* default.
Instead, a cache should estimate the cost of keeping stuff cached, and the likelihood of ever using it again, and periodically drop the lowest-value stuff. The estimate doesn’t have to be good; almost anything will be better than WeakRef.
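To make that concrete, here's a tiny sketch of what I mean by an explicit policy. The class, the size budget, and using recency as the "value estimate" are all mine, not from the proposal; the point is that almost any crude heuristic like this will do.

    // A tiny cache that evicts on its own schedule. No WeakRefs involved.
    // The "value estimate" is just recency plus a size budget; almost any
    // explicit policy like this beats hoping the GC does the right thing.
    class BoundedCache {
      constructor(maxEntries = 100) {
        this.maxEntries = maxEntries;
        this.map = new Map();  // Map iteration order doubles as recency order
      }
      get(key) {
        if (!this.map.has(key)) return undefined;
        const value = this.map.get(key);
        this.map.delete(key);      // move to the back: most recently used
        this.map.set(key, value);
        return value;
      }
      set(key, value) {
        this.map.delete(key);
        this.map.set(key, value);
        while (this.map.size > this.maxEntries) {
          const oldestKey = this.map.keys().next().value;
          this.map.delete(oldestKey);  // drop the lowest-value entry, eagerly
        }
      }
    }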
Now: you can combine estimating functions with weak refs, producing a hybrid cache that expires entries mainly based on estimated value, but can bring them back if you happen to expire an entry but then need it before GC runs.
This is the actual theoretically valuable use case for WeakRefs. Raising cache entries from the dead when you thought they weren’t worth keeping, but you were wrong. This feature is supposedly SO important that it’s worth swallowing any amount of poison to get it.
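If you really want the resurrection trick, the hybrid might look roughly like this. Again, a sketch; everything here is made up. Notice how much extra machinery it takes just to maybe save one recomputation.

    // Strong cache with an explicit policy, plus a WeakRef "graveyard" that can
    // resurrect an evicted entry if we need it again before GC actually runs.
    class HybridCache {
      constructor(maxEntries = 100) {
        this.maxEntries = maxEntries;
        this.live = new Map();       // strong refs, kept in recency order
        this.graveyard = new Map();  // key -> WeakRef to an evicted value
      }
      get(key) {
        if (this.live.has(key)) {
          const value = this.live.get(key);
          this.live.delete(key);
          this.live.set(key, value);  // refresh recency
          return value;
        }
        // Evicted? Maybe the GC hasn't gotten around to collecting it yet.
        const ref = this.graveyard.get(key);
        const ghost = ref && ref.deref();  // undefined if it's really gone
        if (ghost !== undefined) {
          this.graveyard.delete(key);
          this.set(key, ghost);            // raise it from the dead
        }
        return ghost;
      }
      set(key, value) {
        this.live.delete(key);
        this.live.set(key, value);
        while (this.live.size > this.maxEntries) {
          const [oldKey, oldValue] = this.live.entries().next().value;
          this.live.delete(oldKey);
          this.graveyard.set(oldKey, new WeakRef(oldValue));  // expire, keep a weak handle
        }
        // You still need a FinalizationRegistry or a periodic sweep to drop
        // graveyard entries whose values are really gone, or the keys leak.
      }
    }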
Incidentally, the first example of WeakRefs in action in the proposal rationale has a deliberate bug, included to illustrate a common WeakRef pitfall: a memory leak. (Again, 💯 for honesty and teaching technique.)
The fixed version has to sidestep two more pitfalls. (It still isn’t as good as a native weak-valued map: it keeps keys around for an extra GC cycle after the values are GC’d. This is inherent in the design of finalizer groups. I don’t think it can be fixed.)
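For the curious, the general shape of that fixed version, as I understand it, is a weak-valued map built from WeakRef plus FinalizationRegistry. This is my paraphrase, not the proposal's actual code:

    // A map with strongly held keys and weakly held values. The finalizer's
    // job is to remove dead entries; leaving that out is the deliberate bug,
    // because the Map then fills up with keys pointing at empty WeakRefs.
    class WeakValueMap {
      constructor() {
        this.map = new Map();  // key -> WeakRef(value)
        this.registry = new FinalizationRegistry(key => {
          // Some value registered under `key` has been collected. Drop the
          // entry, unless the key has since been re-set to a new, live value.
          const ref = this.map.get(key);
          if (ref !== undefined && ref.deref() === undefined) {
            this.map.delete(key);
          }
        });
      }
      set(key, value) {
        this.map.set(key, new WeakRef(value));
        this.registry.register(value, key);
      }
      get(key) {
        const ref = this.map.get(key);
        return ref && ref.deref();  // undefined if missing or already collected
      }
    }

The re-check inside the finalizer matters: finalizers can run arbitrarily late, after the key has already been re-set to a live value, and you don't want to delete that. Exactly the kind of subtlety you won't catch in a test.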
Are there other use cases? People sometimes ask for weak listeners. These are also error prone: chain them, and they can literally fall apart, because nothing holds the link in the middle strongly enough to keep it alive. Building blocks that can’t even stack.
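To sketch the failure mode: imagine an emitter that holds its listeners only through WeakRefs. (WeakEmitter and addWeakListener are invented here for illustration; they're not real APIs.)

    // A hypothetical event source that holds its listeners only via WeakRef.
    class WeakEmitter {
      constructor() { this.refs = new Set(); }
      addWeakListener(fn) { this.refs.add(new WeakRef(fn)); }
      emit(event) {
        for (const ref of this.refs) {
          const fn = ref.deref();
          if (fn) fn(event);
          else this.refs.delete(ref);  // listener was collected; prune it
        }
      }
    }

    const source = new WeakEmitter();
    const sink = new WeakEmitter();

    // Chain them: forward every event from source into sink.
    source.addWeakListener(event => sink.emit(event));

    // Nothing else holds that arrow function, so the GC is free to collect it.
    // When it does, events silently stop flowing from source to sink. The link
    // in the middle of the chain just evaporates.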
WeakRefs are not just hard to debug. They’re genuinely hard to reason about, to avoid creating unfixable bugs in the first place. You have to understand the heap graph, something that’s not even mentioned in the language spec.
This is also why some of the best minds working in web standards today have had a hard time writing a specification that says what a WeakRef does.
People struggle to come up with use cases for WeakRefs! The @v8js blog post announcing the feature just used the example cache code from the proposal, touched up a bit for readability. I don’t want to drag these guys too hard, they’re awesome, but lol that’s kinda lame :D :D
Most web developers have no use for WeakRefs. When people do tell me about real use cases, it usually turns out that they are thinking of something that won’t work anyway, or would let the nondeterminism shine right through abstraction boundaries (bad idea).
(...Or else it’s a job for WeakMap, which is great! WeakMap doesn’t solve most WeakRef use cases, but it’s deterministic and available today.)
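For contrast, here's the deterministic tool in action: a WeakMap entry lives exactly as long as its key object is otherwise reachable, and you never get to observe when collection happens.

    // Attach data to objects without keeping them alive, and without ever
    // observing garbage collection. The entry goes away with the key, full stop.
    const metadata = new WeakMap();

    function tag(element, info) {
      metadata.set(element, info);
    }

    function infoFor(element) {
      return metadata.get(element);  // no deref(), no finalizers, no surprises
    }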
(Also... this rant seems to be drawing better responses than it maybe deserves, so I reserve the right to be super wrong—there are many of you I haven’t met)
PART THE THIRD: Wasm
Easily the most compelling use case for finalizers is Wasm.

Rust and C++ use explicit memory management. There’s no GC. We want to write components in these languages for speed, and hand out nice JS objects to JS users, without forcing them to free memory explicitly.
The solution is to hand JS an object with a finalizer that calls back into Wasm to run destructors and clean up memory. Bottom line, when JS drops the object, cleanup happens.*
*Yes, obviously this is a bit at odds with the fact that finalizers do not necessarily ever run. This is I guess intended as kind of a fallback feature. I’m doubtful they’ll really be used that way, but let’s take it at face value.
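Concretely, the pattern looks something like this. It's a sketch: wasm stands for the exports of an instantiated module, and thing_new / thing_free are invented names, but I believe real binding generators emit code in roughly this shape, an explicit free() plus a FinalizationRegistry backstop.

    // Assume `wasm` is the exports object of an instantiated module, and that
    // thing_new / thing_free are (made-up) exports that allocate and destroy a
    // resource in Wasm linear memory.
    const registry = new FinalizationRegistry(ptr => {
      wasm.thing_free(ptr);  // call back into Wasm: run the destructor, free the memory
    });

    class Thing {
      constructor() {
        this.ptr = wasm.thing_new();
        registry.register(this, this.ptr, this);  // `this` doubles as the unregister token
      }
      free() {
        // The explicit, deterministic path. Don't count on the finalizer;
        // it may run late or never.
        registry.unregister(this);
        wasm.thing_free(this.ptr);
        this.ptr = 0;
      }
    }

    // Prefer the eager API; treat the finalizer as a best-effort backstop.
    const t = new Thing();
    try {
      // ... use t ...
    } finally {
      t.free();
    }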
This still isn’t really adequate. If you have any edges from that stuff back into JS-land, there can be garbage cycles across heaps that won’t get collected. So next I guess we’ll be implementing cycle collectors in userspace—auxiliary garbage collectors for Wasm+JS.
And that can’t be done without walking the JS heap graph. Currently there is no way to do that (you can’t get from a Function object to the values it closes over, for example) so I guess that’ll be the next proposal.
Or instead, maybe we’ll get to use a ton of WeakRefs in increasingly complicated efforts to avoid making cycles that involve Wasm objects.
Both of those ideas are obviously bonkers, right? But that is where finalizers are taking us, unless you imagine a future where complexity doesn’t increase and the JS and Wasm object graphs don’t get entangled whether we're ready for it or not. I dunno, folks.
The right answer is real integration of Wasm with GC... which is coming in a few years but isn’t ready yet. So finalizers are not just inadequate for this purpose—they’re also a stopgap.
It's not clear yet how to extend Rust to integrate these upcoming Wasm features, but whatever—it’s coming. AFAICT no one disputes that GC allocation from Wasm is coming or that it’ll be *much* better than finalizers for memory management.
~~~~

Now is a good time for a break.

Don’t forget to go outside sometimes, they have flowers and stuff out there
So, I hesitate to mention this last thing. I could be wrong here. It’s a judgment call about the future.

What the heck, you read this far
PART THE FOURTH: WeakRefs are a threat to the Open Web
There are three remaining browser engines today, and that number is under pressure. When it goes to one, Google will control the Web platform and your browser will do what Google wants it to do. The days of an open, interoperable Web will be over.
It’s amazing, but not hard to see how WeakRefs and finalizers will hasten the end. Remember how putting important code in finalizers “could lead to data loss”? (I know, it was like nine million tweets ago. But it was real. You didn’t dream that.)
You shouldn’t rely on fast finalization—in fact you shouldn’t rely on finalization at all. Neither should Facebook, or Twitter, or YouTube. Now YOU, of course, actually WON’T, because as a QUALITY programmer, you’re immune to the pressures that lead to bad code...
But someone will. Their code will mostly work in the browser they care about. In other browsers, we’ll get data loss. Users will conclude those other browsers are buggy, and switch. Minority browsers will lose users.
I know, maybe this won’t happen. But it’s no joke. Back in the days of Microsoft hegemony, when Internet Explorer had 95%+ market share, Firefox had to push hard on bug-for-bug compatibility. You never know what Web content will rely on.
The features’ champions can point to good-faith efforts they’ve made to “mitigate” the nondeterminism, like making WeakRefs only null out between event loop turns, not during. Realistically I doubt the mitigations help much—we’re all just hoping.
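For what it's worth, here's what that particular mitigation buys you, as I understand the guarantee:

    // What the mitigation guarantees, roughly: deref() results are stable
    // within one synchronous turn, but between turns anything goes.
    let big = { data: new Array(1e6).fill(0) };
    const ref = new WeakRef(big);
    big = null;  // drop the only strong reference

    const a = ref.deref();
    const b = ref.deref();
    // Within this turn, the target can't be collected out from under us, so
    // a and b are the same object. That much is deterministic.
    console.assert(a === b);

    setTimeout(() => {
      // Across turns, no promises: this may print the object or undefined,
      // depending on the engine, the version, and whether GC happened to run.
      console.log(ref.deref());
    }, 10000);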
In closing, two things—
1. Please remember that nondeterminism is bad for fixing bugs, and don’t use WeakRefs (not that you’ll necessarily know if your dependencies use them, but we can all try)
(If you feel like you have to, for some Wasm thing, remember finalization may never happen, so document an explicit eager cleanup API and make it as easy to use as you can.)
2. If you’re involved in Web standards, please talk to your W3C TAG or ECMA TC39 representative and politely ask them to reconsider their support. It’s worth a shot.