YouTube announced they will stop recommending some conspiracy theories such as flat earth.

I worked on the AI that promoted them by the *billions*.

Here is why it’s a historic victory. Thread. 1/

bit.ly/2MMXNGn
Brian is my best friend’s in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co. Now he does not trust anyone. He stopped working, seeing friends, and wanting kids. 2/
Brian spends most of his time watching YouTube, supported by his wife.

For his parents, family and friends, his story is heartbreaking.
But from the point of view of YouTube’s AI, he’s a jackpot.

3/
We designed YT’s AI to increase the time people spend online, because it leads to more ads. The AI considers Brian a model that *should be reproduced*. It takes note of every single video he watches & uses that signal to recommend those videos to more people. 4/
youtube-creators.googleblog.com/2012/08/youtub…
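To make "increase time spent" concrete, here is a minimal sketch of a watch-time-driven ranking objective in Python. It is purely illustrative and uses made-up names (Candidate, predicted_watch_minutes, rank_for_user); the real system is a far more complex, non-public model.

```python
# Illustrative only: rank candidate videos purely by predicted watch time,
# so whatever a hyper-engaged viewer like Brian binges scores highest.
from dataclasses import dataclass


@dataclass
class Candidate:
    video_id: str
    predicted_watch_minutes: float  # output of some engagement model


def rank_for_user(candidates: list[Candidate]) -> list[Candidate]:
    """Order recommendations by expected watch time, nothing else."""
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)


feed = rank_for_user([
    Candidate("round_earth_documentary", 4.0),
    Candidate("flat_earth_marathon", 42.0),  # binged by a few heavy users
])
print([c.video_id for c in feed])  # the binge-friendly video comes first
```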
How many people like Brian are lured down such rabbit holes every day?

By design, the AI will try to get as many as possible.

5/
Brian's hyper-engagement slowly biases YouTube:

1/ People who spend their lives on YT affect recommendations more
2/ So the content they watch gets more views
3/ Then youtubers notice and create more of it
4/ And people spend even more time on that content. And we're back at step 1 (see the sketch after this tweet)

6/
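The loop in tweet 6 can be simulated with a toy model (all numbers are invented for illustration): if the content favored by hyper-engaged viewers earns ~16x more watch time per recommended slot, re-weighting recommendations by watch time lets it take over the feed in a handful of iterations.

```python
# Toy model of steps 1-4 above, with made-up parameters.
def simulate(rounds: int = 5) -> None:
    rec_share = 0.01  # the niche content starts with 1% of recommendations
    for step in range(1, rounds + 1):
        niche_hours = rec_share * 8.0        # heavy users binge it: ~8 h per slot
        other_hours = (1 - rec_share) * 0.5  # casual viewers: ~30 min per slot
        # watch time feeds straight back into the next round's recommendations
        rec_share = niche_hours / (niche_hours + other_hours)
        print(f"round {step}: niche share of recommendations = {rec_share:.1%}")

simulate()  # 1% -> ~14% -> ~72% -> ~98% -> ...
```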
This vicious circle was also observed with tay.ai, and it explains why the bot became racist in less than 24 hours.

=> Platforms that use AIs often get biased by tiny groups of hyper-active users.

7/
theverge.com/2016/3/24/1129…
Example of the YT vicious circle: two years ago I found that many conspiracy theories were promoted by the AI much more than the truth. For instance, flat earth videos were promoted ~10x more than round earth ones 🌎🤯

8/
medium.com/@guillaumechas…
I was not the only one to notice AI harms. @tristanharris talked about addiction. @zeynep talked about radicalization. @noUpside, political abuse and conspiracies. @jamesbridle, disgusting kids videos. @google's @fchollet, the danger of AI propaganda:

medium.com/@francois.chol…

9/
Since then, many outlets have covered AI harms, for instance: @wsj @guardian @nytimes @BuzzFeed @washingtonpost @bloomberg @huffpost @dailybeast @vox @NBCNews @VICE @cjr @techreview

Journalism matters

10/
There are 2 ways to fix a vicious circle like the "flat earth" one:

1) make people spend more time on round earth videos
2) change the AI

YouTube’s economic incentive favors solution 1).
After 13 years, YouTube made the historic choice to move towards 2).

Will this fix work? 11/
The AI change will have a huge impact because the affected channels have billions of views, overwhelmingly coming from recommendations. For instance, the channel secureteam10 got *half a billion* views with deceptive claims promoted by the AI, such as:

bit.ly/2Bgkii5 12/
Note that #secureteam10 was the most liked channel of Buckey Wolfe, who came to believe his brother was a “lizard” and killed him with a sword.
To understand how he fell down the rabbit hole, see his 1312 public likes here:

youtube.com/user/Buckeywol…
clickondetroit.com/news/national/…
13/
This AI change will save thousands from falling into such rabbit holes

(If it decreases views on such content by between 1B and 10B, and if we assume one person falls for it per 100,000 views, it will prevent 10,000 to 100,000 "falls") 14/
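The arithmetic behind that estimate, using the thread's own assumed numbers:

```python
# Back-of-the-envelope check (inputs are assumptions, not measured data).
views_removed_low, views_removed_high = 1_000_000_000, 10_000_000_000
views_per_fall = 100_000  # assumed: one person "falls" per 100,000 views
print(views_removed_low // views_per_fall,    # 10,000
      views_removed_high // views_per_fall)   # 100,000
```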
A concern remains that other rabbit holes are arising. I created algotransparency.org to identify and monitor harmful content recommended by the AI.

15/
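For a sense of how such monitoring can work, here is a minimal sketch of the crawling idea (not the actual algotransparency.org code): follow "up next" recommendations outward from a set of seed videos and count which videos the AI surfaces most often. get_recommendations is a placeholder you would have to implement yourself, e.g. by scraping the watch page.

```python
# Sketch only, not the real AlgoTransparency implementation.
from collections import Counter, deque


def get_recommendations(video_id: str) -> list[str]:
    """Placeholder: return the 'up next' video IDs for a given video."""
    raise NotImplementedError("implement via scraping or an official API")


def crawl(seeds: list[str], max_videos: int = 1000) -> Counter:
    seen, queue, counts = set(seeds), deque(seeds), Counter()
    while queue and len(seen) < max_videos:
        for rec in get_recommendations(queue.popleft()):
            counts[rec] += 1              # how often the AI points at this video
            if rec not in seen:
                seen.add(rec)
                queue.append(rec)
    return counts

# counts.most_common(20) then shows what the recommender pushes hardest
```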
Conclusion: YouTube's announcement is a great victory which will save thousands. It's only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.

If you see something, say something.

16/