Writing for @wired, @pomeranian99 (Clive Thompson) gets a first-of-its-kind behind-the-scenes look at Youtube's algorithm development team, documenting the company's attempt to reduce the service's role in spreading and reinforcing conspiracy theories.
1/
Thompson traces the origin of the crisis to the company's drive for more "engagement," which led it to tune its recommendation system to identify and propose ever more specialized, esoteric versions of the video you'd just watched.
2/
The idea was to lead you down a rabbit hole of ever-more-specific versions of your interests, helping you discover niches you never knew existed.
3/
This dynamic in recommendation systems has gotten a lot of attention lately, and most of it is negative, but let's pause for a moment and talk about what this means for non-conspiratorial beliefs.
4/
Say you happen upon a woodworking video, maybe due to a friend's post on social media. You watch a few of them and you find yourself interested in the subject and tuning in more often.
5/
The recommendation system presents an array of possible next-views, but tilted away from general-interest woodworking videos, instead offering you a menu of specialized woodworking styles, like Japanese woodworking.
6/
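Thompson doesn't publish Youtube's ranking code, but the dynamic is easy to sketch. Here's a toy, purely hypothetical re-ranker - the `topic` embeddings and `specificity` scores are invented, not anything Youtube has disclosed - that boosts candidates similar to your last watch but more specialized than it:

```python
# Hypothetical sketch of the rabbit-hole dynamic -- NOT Youtube's code.
# Assumes each video carries a toy "topic" embedding and a "specificity"
# score (0 = general interest, 1 = esoteric). Candidates that match your
# last watch but are MORE specialized get a bonus.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Video:
    title: str
    topic: tuple[float, ...]  # toy topic embedding
    specificity: float        # 0.0 = general, 1.0 = esoteric

def similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def rank_next(last_watched, candidates):
    def score(c):
        # Only *extra* specificity relative to the last watch earns a bonus,
        # tilting the menu away from general-interest videos.
        tilt = max(0.0, c.specificity - last_watched.specificity)
        return similarity(last_watched.topic, c.topic) + 0.5 * tilt
    return sorted(candidates, key=score, reverse=True)

watched = Video("Intro to woodworking", (1.0, 0.1), specificity=0.2)
menu = [
    Video("General DIY tips", (0.9, 0.0), specificity=0.1),
    Video("Japanese woodworking", (0.95, 0.2), specificity=0.6),
    Video("Nail-free joinery deep dive", (0.9, 0.3), specificity=0.9),
]
for v in rank_next(watched, menu):
    print(v.title)
```

Run it and the nail-free joinery deep dive outranks the general DIY video: the rabbit hole in miniature.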
You sample one of these and find it fascinating, so you start watching more. The recommendation system clues you in to Japanese nail-free joinery:
7/
And from there, you discover the frankly mesmerizing "Niju-mizu-kumi-tsugi" style of joinery, and you start to seek out more. You have found this narrow, weird, self-reinforcing community.
8/
This community could not exist without the internet and its signature power to locate and connect people with shared, widely dispersed, uncommon interests.
This power isn't just used to push conspiracies and woodworking techniques, either.
9/
It's how people who know that their gender identity doesn't correspond to the gender they were assigned at birth find each other, and acquire a vocabulary for describing their views, and foment change.
10/
It's also how the people who wanted to cosplay Civil War soldiers in Charlottesville, waving tiki torches and chanting "Jews will not replace us," found each other.
11/
And that is the conundrum of the recommendation engine. Helping people find others who share their views, passions and concerns is not, in and of itself, bad. It is vital. It's the thing that made the internet delightfully weird. It's also what made the world terribly weird.
12/
Thompson takes us inside Youtube's algorithm team as they try to balance three priorities:
I. Increasing their traffic and profits
II. Helping people find others with common interests
III. Stopping conspiracies from spreading
13/
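Thompson doesn't reveal an actual objective function, but the three-way tension is easiest to see if you imagine the priorities collapsed into one weighted score. Everything below - the weights, the conspiracy-risk classifier, the penalty - is invented for illustration, not Youtube's method:

```python
# Invented illustration of the three-way tension -- NOT Youtube's method.
# One score rewards engagement (priority I) and niche interest-matching
# (priority II) while penalizing a hypothetical conspiracy-risk classifier's
# output (priority III). The weights are the whole fight: crank W_ENGAGE up
# far enough and the conspiratorial-but-compelling video climbs back up.
from dataclasses import dataclass

W_ENGAGE, W_COMMUNITY, W_RISK = 1.0, 0.6, 20.0

@dataclass
class Candidate:
    title: str
    expected_watch_minutes: float  # proxy for traffic and profit
    interest_match: float          # 0..1, fit to the viewer's niche
    conspiracy_risk: float         # 0..1, assumed classifier output

def score(c):
    return (W_ENGAGE * c.expected_watch_minutes
            + W_COMMUNITY * c.interest_match
            - W_RISK * c.conspiracy_risk)

candidates = [
    Candidate("Niche joinery tutorial", 8.0, 0.9, 0.0),
    Candidate("Flat-earth 'documentary'", 14.0, 0.8, 0.9),
]
for c in sorted(candidates, key=score, reverse=True):
    print(f"{c.title}: {score(c):.2f}")
```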
And he traces how they try, with limited success, to manage these competing goals by creating extremely fine-grained rules that define what is banned on the platform.
14/
But naturally, this just gives rise to a new kind of content: stuff that is ALMOST bad enough for blocking, but not quite. The problem is that this stuff is indistinguishable (in all but the narrowest, technical way) from banned content.
15/
So then Youtube has to create a new set of moderation guidelines: "What is so close to prohibited content that it, too, is prohibited?"
Naturally, this is creating a new kind of content: "Stuff that is not close-to-bannable, but is close-to-close-to-bannable."
16/
This dynamic should be familiar to anyone who's watched the moderation policies of Big Tech platforms evolve: "What is 'hate speech'?" "What is 'almost-hate-speech'?" "What is 'almost-almost-hate-speech'?"
17/
Ultimately, this ends up creating thick binders of pseudo-law that deliver advantages to the worst people: they can study the companies' policies and figure out how to skate RIGHT UP to the cliff's edge (no matter how it is defined).
18/
And at the same time, they can goad their adversaries - the people they torment - into crossing these fractally complex lines and then nark them out, so that over time, these speech policies preferentially block good speech and leave bad speech untouched.
19/
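One way to see why this regress never terminates: model each round of rule-making as a threshold on some "badness" score. The worst people skate just under whatever line you publish, so the next round has to draw a new, lower line, and eventually ordinary heated speech lands on the wrong side of it. This is a toy model of that dynamic, not anyone's real policy engine:

```python
# Toy model of the close-to-close-to-bannable regress -- not a real policy
# engine. Each round bans everything at or above a threshold; the worst
# people re-tune to sit just under whatever line gets published, forcing a
# new, lower line next round. Meanwhile ordinary heated speech stays put,
# and eventually the line comes down to meet it.
BAN_THRESHOLD = 0.90    # "badness" score at or above which content is removed
SKATE_MARGIN = 0.01     # how closely bad actors skate up to any published line
ORDINARY_SPEECH = 0.72  # heated-but-legitimate speech that never moves

threshold = BAN_THRESHOLD
for round_no in range(1, 6):
    evader = threshold - SKATE_MARGIN  # just under the line, untouched
    note = "  <- ordinary speech now banned" if ORDINARY_SPEECH >= threshold else ""
    print(f"round {round_no}: ban >= {threshold:.2f}, "
          f"evaders publish at {evader:.2f}{note}")
    # Next round's rule: "so close to prohibited that it, too, is prohibited."
    threshold -= 0.05
```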
I am increasingly convinced that the problem isn't that Youtube is unsuited to moderating the video choices of a billion users - it's that no one is suited to this challenge.
20/
Remedies that put moderation choices closer to the user - breaking up monopolies, allowing interoperable recommendation systems - solve the problem of scaling up AND covering edge cases by eliminating scale altogether, and letting the edge cases make their own calls.
21/
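What would "interoperable recommendation systems" even look like? At minimum: the platform supplies the candidates, and the viewer (or a community they trust) supplies the ranking policy. The API below is invented for illustration - nothing like it exists today:

```python
# Invented sketch of an interoperable recommender -- no such API exists.
# The platform's only job is to hand over candidates and honor whatever
# ranking policy the viewer (or a community they trust) plugs in, so the
# edge cases really do make their own calls.
from typing import Callable, NamedTuple

class Candidate(NamedTuple):
    title: str
    channel: str
    tags: frozenset

Ranker = Callable[[list], list]

def platform_feed(candidates: list, ranker: Ranker) -> list:
    # The platform ranks nothing itself; it defers to the chosen policy.
    return ranker(candidates)

def block_tags(blocked: set) -> Ranker:
    # One possible viewer-side policy: drop anything carrying a blocked tag.
    def rank(candidates):
        return [c for c in candidates if not (c.tags & blocked)]
    return rank

candidates = [
    Candidate("Kumiko panel build", "woodshop", frozenset({"woodworking"})),
    Candidate("The REAL shape of the Earth", "truther", frozenset({"conspiracy"})),
]
for c in platform_feed(candidates, block_tags({"conspiracy"})):
    print(c.title)
```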
[Image: Mark Sargent]
eof/