Guillaume Chaslot
Trustworthy AI / Mozilla Fellow / Advisor at Center for Humane Technology / Ex-Google
4 Nov 19
Is YouTube's algo still taking people down the rabbit hole?


Right now, the algorithm is promoting to millions of teens this conspiracy that literally says:

"We ask you to suspend your disbelief and take a journey down the rabbit hole" (1:37)

It's on a channel called "After Skool", so clearly targeting kids/teens.

It was so massively recommended that it reached 1 million views in a few days

One of the top comments is revealing:

"Hey YouTube? What's this doing in my recommendations? If I didn't have a basic understanding of science and the ability to recognize logical fallacies this could've sent me down the path to conspiracy paranoia"

Read 5 tweets
1 Nov 19

I discovered that a Chinese anti-American conspiracy theory was promoted millions of times by YouTube's AI

@OliviaGoldhill reported on it, and @YouTube reacted to our findings

Right after publication of the article, @YouTube added a warning on the Chinese video that it's "inappropriate for some users"

For who?

NBA officials😂?

Why hide this video? If anything, people should be able to discover that this is the official Chinese propaganda line.

Why did this video get millions and millions of free ads? Was it truthful, or just efficient at gaming the up-next algorithm? 🧐

Read 13 tweets
4 Oct 19
In the current context, which YouTube channel is particularly promoted by YouTube's Artificial Intelligence?

A Ukrainian channel that copy-pastes Fox News videos

Read that again, but slowly. 1/…
That channel got >2M views in the last few days, a 3,080.6% month-over-month increase.

It's now called "BREAKING NEWS"

Full stats:…

It was called "Florans UA" before the 30th of September.

Note that the channel describes itself as Ukrainian, but we can't be certain it's Ukrainian. In any case, it's performing exceptionally well.

Read 7 tweets
14 Jul 19

My first op-ed in @WIRED: how the AI feedback loops I helped build at YouTube can amplify our worst inclinations, and what to do about it.…

Earlier this year a YouTuber showed how YouTube's recommendation algorithm was pushing thousands of users towards sexually suggestive videos of children, used by a network of pedophiles.

YouTube bans sexual videos. What happened?

At YouTube, we designed the AI for engagement. Hence, if pedophiles spend more time on YouTube than other users, the job of the AI will become to try to *increase* their numbers.
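The dynamic described above can be sketched in a few lines. This is a toy model with invented numbers, not YouTube's actual ranking code: a recommender that ranks purely by expected watch time will favor a niche that a small cohort watches obsessively over content with broad but moderate appeal.

```python
# Illustrative sketch (hypothetical numbers, not YouTube's system):
# catalog maps each video to (avg. minutes watched per impression,
# fraction of the audience that engages with it).
catalog = {
    "science_video":   (2.0, 0.90),   # broad audience, modest watch time
    "niche_obsession": (40.0, 0.05),  # tiny cohort, extreme watch time
}

def expected_watch_time(stats):
    """Expected minutes of watch time per random impression."""
    minutes, audience_share = stats
    return minutes * audience_share

# Ranking by expected watch time alone:
ranked = sorted(catalog, key=lambda v: expected_watch_time(catalog[v]),
                reverse=True)
print(ranked[0])  # the niche video wins: 40.0 * 0.05 = 2.0 vs 2.0 * 0.90 = 1.8
```

The point of the sketch: nothing in the objective asks whether the niche cohort's behavior is healthy; heavy watchers simply weigh more.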

Read 24 tweets
30 Jun 19
I helped Aude @WTFake_ with this YouTube video showing that YouTube is recommending to kids that @realDonaldTrump, the Pope, soccer, Notre Dame, @EmmanuelMacron, many brands, etc... are hidden satanists. 1/3
I showed how the recommendation engine I worked on is actively promoting them, and Aude showed the impact it had on kids and their relationships with teachers. 2/3
To be clear, I'm not calling on YouTube to ban these videos, but we need to know what YouTube shows to kids. 3/3
Read 3 tweets
22 May 19
.@coffeebreak_YT shows how the trending tab is heavily biased towards TV channels, at the expense of native YouTube creators

Let's try to investigate why this happens. 1/
Having worked at YouTube, I know they don't intentionally bias algorithms, but they can pick a metric and not investigate enough to see if it creates bias.

Here's my theory about it. 2/
In order to show something in trending, they probably check that the video doesn't offend people. This seems reasonable. 3/
Read 7 tweets
21 May 19

One of my favorite YouTubers, @veritasium (5M subscribers), explains 3 problems creators face:

1/ They are constantly chasing the algorithm
2/ Clickbait is the key to virality
3/ Users don't see videos of channels they subscribed to
1/ Creators having to chase the algorithm is a disaster for independent ones, because it gives bigger companies a huge advantage (e.g., TheSoul publishing company creates 1,500 videos per month)

BTW my site helps small creators hack the AI
2/ Clickbait being the key for virality creates a race to the bottom where YouTubers have to design their videos for clickbait

It doesn't have to be that way. I'll demonstrate it by creating an alternative by the end of the year that provides *provably better recommendations*
Read 5 tweets
15 May 19
Is it really "hard to argue with promotion unrelated to content"? Good question!

Let's imagine an automated food distribution system with a lever for each food. The harder you pull, the more energy the system gets. Which food would it choose to promote? 1/5
Heroin. Until everybody dies.

Promotion unrelated to content can go wrong, really fast.

So why doesn't YouTube have the same problem? Because they ban *millions* of terrorist videos that would otherwise beat every engagement stat.
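The lever analogy can be turned into a toy simulation (purely illustrative, all numbers invented): if each round's promotion is redistributed in proportion to watch time, and one item holds attention better than the rest, the loop concentrates essentially all promotion on the stickiest item within a few iterations.

```python
# Toy engagement feedback loop (invented numbers, not YouTube's system):
# each round, watch time = promotion share x stickiness, and next round's
# promotion share is proportional to watch time.
shares = {"heroin": 0.01, "broccoli": 0.99}    # initial promotion share
stickiness = {"heroin": 3.0, "broccoli": 1.0}  # attention held per unit of promotion

for step in range(20):
    watch_time = {k: shares[k] * stickiness[k] for k in shares}
    total = sum(watch_time.values())
    shares = {k: watch_time[k] / total for k in shares}  # renormalize

print(shares["heroin"])  # after 20 rounds: essentially 1.0
```

Starting from a 1% share, the stickier item ends up with virtually all promotion, which is the "until everybody dies" failure mode in miniature.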

Read 5 tweets
14 May 19
This girl talking without fear of political consequences is "right to speech".

This channel being promoted tens of millions of times by Google’s algos is “right to monetize hatred”.

Google plays an active role in the promotion of this channel.
Note that YouTube similarly encourages and monetizes hatred of Christians, but it's often not in English, so it goes largely unnoticed.
It’s not just hatred of Muslims and Christians that YouTube monetizes: it’s all types of hatred, monetized equally.
Read 6 tweets
10 May 19
Following Aude @WTFake_'s great reporting on Lama Fâché, a YouTube channel that lies constantly, that channel deleted videos totaling *half a billion* views. #HidingEvidence
According to @SocialBlade, the deleted views could have brought Lama Faché up to $2.2 M
Read 3 tweets
6 May 19
Notre Dame burning was a historic disaster.

But for conspiracy theorists and @Google, it was an opportunity to gain market share. @YouTube's AI understood it and recommended this dumb conspiracy hundreds of thousands of times from 520+ channels. 1/2

The AI also recommends videos claiming that Notre Dame is Satanic, because it's good at making people watch ads: 2/2

4 months ago @Google promised they would recommend conspiracy theories less. I'll release a full report soon, but it's hard for me to take them seriously.

Also in France, YouTube's algorithm is massively promoting "bloody civil war" narratives millions of times. "bloody civil war" is engaging, and performs extremely well for ads revenue 💸…
Read 3 tweets
6 May 19
Breathtaking investigation by @WTFake_ on one of YouTube's biggest French channels, and how it's deceiving millions of teenagers with lies, fake prizes, copyright infringement, and fake promises.

All that, promoted by the AI I helped build.

This video shows that many of YouTube's rules can be broken in practice, and are just there to give YouTube leverage. Small creators have no idea which rules can be broken, creating unfair competition with huge, cheating channels like Lama Faché
The video shows that artificially inflating subscriber numbers (e.g. with fake contests) draws hardly any sanction from YouTube.
Read 3 tweets
2 May 19
Great thread by @gadyepstein from @TheEconomist on YouTube: after meeting with top execs, he describes how they are trying to fix many problems with tweaks

➡️ Key takeaway 1: "60% of young subscribers said YouTubers had changed their lives or worldview"
➡️ Key takeaway 2: "When pressed on the subject, executives insist that the site is not meant for children under 13 years old without adult supervision." 🤦‍♂️
➡️ Key takeaway 3: "Given the complexities, wise governments will proceed deliberately. They should seek data from platforms to help researchers identify potential harms to users" 🙏
Read 3 tweets
26 Apr 19

One week after the release of the Mueller report, which analysis of it did YouTube recommend from the most channels among the 1000+ channels that I monitor daily?

Russia Today's !!!

This video funded by the Russian government was recommended more than half a million times from more than 236 different channels.

So YouTube's algorithm massively recommends Russia's take on the investigation into Russia's interference in the 2016 election.

Read 16 tweets
13 Apr 19
Thanks @cadale from YouTube for correcting Howell's misquote and giving facts. They confirm my point: that conspiracy was promoted ~1 million times in 3 days🤯

And the science video you mention has 10x more impressions, but it also has 26x more subscribers to its channel🤷‍♂️
Let's look at the owner of that YouTube video, the conspiracy channel "Matrix Wisdom"

It has nearly the same monthly views as the top science channel @SmarterEveryDay, despite having ... 31 times fewer subscribers.

How can "Matrix Wisdom" get nearly as many views as a channel that has 31 times more subscribers?

1/ it can make more videos as it doesn't care about facts
2/ it's recommended by the AI much, much more
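The arithmetic behind that comparison is simple. With placeholder figures (the thread doesn't publish the exact counts), equal monthly views at 31 times fewer subscribers implies a ~31x higher views-per-subscriber rate, and that surplus has to come from somewhere other than the subscriber base:

```python
# Back-of-the-envelope check (all figures are hypothetical placeholders):
# if two channels get roughly equal monthly views but one has 31x fewer
# subscribers, its views-per-subscriber rate is ~31x higher.
monthly_views = 10_000_000           # assume both channels get ~10M/month
subs_science = 7_000_000             # placeholder subscriber count
subs_conspiracy = subs_science / 31  # 31x fewer subscribers

rate_science = monthly_views / subs_science
rate_conspiracy = monthly_views / subs_conspiracy

print(round(rate_conspiracy / rate_science))  # -> 31
```

A 31x views-per-subscriber gap is exactly what you would expect if most of the smaller channel's traffic comes from recommendations rather than its own audience.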

Read 8 tweets
10 Apr 19
While the first picture of a black hole was being released, @YouTube's AI massively recommended the video "They Found Something In Outer Space", claiming that humans are the result of genetic engineering by aliens who came from planet 9 to extract gold:


The AI recommended this conspiracy millions of times from more than 169 different channels, including European Space Agency and Northrop Grumman

Today was an historic high for astronomy, and an historic low for AI.

I was very enthusiastic about YouTube's announcement, but let's be honest: after two months, things have barely changed. (cf:

I'll do an overview of what changed with recommendations next month.
Read 3 tweets
30 Mar 19
If this thread was a YouTube video, I would have to call it:

"YouTube's CPO calls @zeynep's work *myths*, ex-YouTube engineer proves him wrong with *facts*"

Context: @zeynep, @beccalew and many others have shown how YouTube often recommends more and more extreme content, radicalizing people:

YouTube's CPO calls many of the complaints "myths" in an interview with the NYT:

"The first [myth] is this notion that it’s somehow in our interests for the recommendations to shift people in this direction [more extreme]"

Read 13 tweets
9 Feb 19
YouTube announced they will stop recommending some conspiracy theories such as flat earth.

I worked on the AI that promoted them by the *billions*.

Here is why it’s a historic victory. Thread. 1/
Brian is my best friend’s in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co. Now he does not trust anyone. He stopped working, seeing friends, and wanting kids. 2/
Brian spends most of his time watching YouTube, supported by his wife.

For his parents, family and friends, his story is heartbreaking.
But from the point of view of YouTube’s AI, he’s a jackpot.

Read 16 tweets
24 Jan 19

There were really good answers to the tweet suggesting that "algorithms can't be racist because they are math", so I made a top 10.

Let's jump in.

1/ My personal favorite is Dr. @mathbabedotorg's, the world's expert on the topic, and yet her answer is benevolent and humble:

2/ Google's deep learning expert @fchollet's answer generalizes the argument of @RealSaavedra into the concept of "bias laundering", which I bet will come back often in the coming years:

Read 12 tweets
19 Nov 18
The YouTube algorithm that I helped build in 2011 still recommends the flat earth theory by the *hundreds of millions*. This investigation by @RawStory shows some of the real-life consequences of this badly designed AI.
This spring at SxSW, @SusanWojcicki promised "Wikipedia snippets" on debated videos. But they didn't put them on flat earth videos, and instead @YouTube is promoting merchandise such as "NASA lies - Never Trust a Snake". 2/
A few examples of flat earth videos that were promoted by YouTube #today:
Read 22 tweets