YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found.

YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation
nytimes.com/2019/06/03/wor…
Each video might appear innocent on its own, a home movie of a kid in a two-piece swimsuit or a nightie. But each has three common traits:
• the girl is mostly unclothed or briefly nude
• she is no older than age 8
• her video is being heavily promoted by YouTube’s algorithm
Any user who watched one kiddie video would be directed by YouTube's algorithm to dozens more — each selected out of millions of otherwise-obscure home movies by an incredibly sophisticated piece of software that YouTube calls an artificial intelligence. The families had no idea.
We talked to one mother, in Brazil, whose daughter had posted a video of herself and a friend playing in swimsuits. YouTube’s algorithm found the video and promoted it to users who had watched videos of other partly clothed prepubescent children.

Within a few days of posting, it had 400,000 views
We talked to child psychologists, sexual trauma specialists, psychologists who work with pedophiles, academic experts on pedophilia, network analysts. They all said YouTube has built a vast audience — maybe unprecedented — for child sexual exploitation, with grave risks for kids.
YouTube, to its credit, said it has been working nonstop on this problem since a similar one was first reported in February.

YT also removed some of the videos immediately after we alerted the company, though it did not remove others that we had not specifically flagged.
YouTube’s algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together.

Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn’t say it was.
I asked YouTube: why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically.

The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe.
Initially, YouTube gave me a comment saying it was trending in that direction. Experts were thrilled, calling it a potentially huge positive step.

Then YouTube “clarified” its comment. Creators rely on recommendations to drive traffic, the company said, so recommendations would stay on. 📈
On a personal note, I found reporting this emotionally straining, far more so than I'd anticipated. Watching the videos made me physically ill and I've been having regular nightmares. I only mention it because I cannot fathom what this is like for parents whose kids are swept up.
As I reported last year with @kbennhold, YouTube’s algorithm does something similar with politics. We found it directing large numbers of German news consumers toward far-right extremist videos, with real-world implications. nytimes.com/2018/09/07/wor…
For more on what happens when YouTube and other social networks route an ever-growing global share of human social relations through engagement-maximizing algorithms, read our essay on “the Algorithmification of the Human Experience”: static.nytimes.com/email-content/…
Wow: Senator Hawley has introduced legislation, prompted by our story, that would prohibit video-hosting services like YouTube from recommending videos that feature minors. hawley.senate.gov/sites/default/…
When I asked YouTube about doing exactly this last week, the company did not deny that it was technically capable, and even hinted it might do so voluntarily.

Later, YT said it would not do this because it would hurt traffic. Sen. Hawley’s bill would force it to.
More momentum building: Senators Blumenthal and Blackburn send an open letter to YouTube’s CEO about our story, demanding to know, among other things, why YouTube will not turn off recommendations on videos of children. blumenthal.senate.gov/imo/media/doc/…
Samantha Bee did a very good segment on YouTube’s troubles, leading with our story on algorithm-assisted pedophilia: youtube.com/channel/UC18vz…