Guillaume Chaslot
Expert in AI and data science at https://t.co/QpIXDy0xRG / Founder at https://t.co/Q9wpElUUH9 / Advisor at Center for Humane Technology / Ex-Google. Opinions are my own.
Jul 2, 2021 5 tweets 1 min read
WOW: Facebook is issuing a cease and desist to my favorite Chrome extension: Unfollow Everything

It unfollows all your FB friends, so you can re-follow people who matter and take back control of your newsfeed

It saved me 1 hour per day.

Here's how Facebook tries to kill it ⬇️

Unfollowing all your friends and pages is hard on Facebook. It's not a bug, it's a feature.

2/
Jan 12, 2021 8 tweets 2 min read
Update: Following my tweet on Thursday, Google fixed its autocomplete suggestion promoting that "civil war is inevitable"

Here's what Google autocomplete for "civil war is " looked like before Thursday:

THREAD

Today, there is only one suggestion saying that civil war is coming:

2/
Dec 7, 2020 10 tweets 3 min read
YouTube's deep learning AI massively boosted two news channels in the US last month:

➡️Newsmax TV went from 3M to 110M views/month (x33!)
➡️New Tang Dynasty TV (NTD): 6M to 80M views/m (x13)

They both made wild claims of voter fraud. Why does the AI like them?

Thread ⬇️

I already made a thread about Newsmax TV last week (see below), so let's focus on NTD

2/
Nov 4, 2019 5 tweets 2 min read
Is YouTube's algo still taking people down the rabbit hole?

Yes.

Right now, the algorithm is promoting this conspiracy to millions of teens. It literally says:

"We ask you to suspend your disbelief and take a journey down the rabbit hole" (1:37)


1/5
It's on a channel called "After Skool", so it's clearly targeting kids and teens.

It was so massively recommended that it reached 1 million views in a few days

2/5
Nov 1, 2019 13 tweets 5 min read
Thread

I discovered that a Chinese anti-American conspiracy theory was promoted millions of times by YouTube's AI

@OliviaGoldhill reported on it, and @YouTube reacted to our findings

1/

Right after publication of the article, @YouTube added a warning on the Chinese video that it's "inappropriate for some users"

For who?

NBA officials😂?



2/
Oct 4, 2019 7 tweets 3 min read
In the current context, which YouTube channel is particularly promoted by YouTube's Artificial Intelligence?

A Ukrainian channel that copy-pastes Fox News videos

Read that again, but slowly.

1/

algotransparency.org/?date=03-10-20…

That channel did >2M views in the last few days, an increase of 3,080.6% month over month.

It's now called "BREAKING NEWS"

Full stats:
socialblade.com/youtube/channe…

2/
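The month-over-month figure above follows the standard percentage-growth formula. A minimal sketch with purely hypothetical view counts (a 3,080.6% increase corresponds to roughly a 32x multiplier):

```python
def growth_pct(old_views: float, new_views: float) -> float:
    """Month-over-month percentage increase in views."""
    return (new_views - old_views) / old_views * 100

# Hypothetical numbers chosen to illustrate the magnitude in the tweet.
print(round(growth_pct(100_000, 3_180_600), 1))  # → 3080.6
```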
Jul 14, 2019 24 tweets 6 min read
Thread

My first op-ed in @WIRED: how the AI feedback loops I helped build at YouTube can amplify our worst inclinations, and what to do about it.

wired.com/story/the-toxi…

1/
Earlier this year a YouTuber showed how YouTube's recommendation algorithm was pushing thousands of users towards sexually suggestive videos of children, used by a network of pedophiles.

YouTube bans sexual videos. What happened?

2/
May 22, 2019 7 tweets 1 min read
.@coffeebreak_YT shows how the Trending tab is heavily biased towards TV channels, crowding out native YouTube creators

Let's try to investigate why this happens.

1/

Having worked at YouTube, I know they don't intentionally bias the algorithms, but they can pick a metric and not investigate enough to see whether it creates bias.

Here's my theory about it. 2/
May 21, 2019 5 tweets 3 min read
Thread

One of my favorite YouTubers, @veritasium (5M subscribers), explains 3 problems creators face:

1/ They are constantly chasing the algorithm
2/ Clickbait is the key to virality
3/ Users don't see videos of channels they subscribed to

1/

Creators having to chase the algorithm is a disaster for independent ones, because it gives bigger companies a huge advantage (e.g., TheSoul Publishing creates 1,500 videos per month)

BTW my site algotransparency.org helps small creators hack the AI
May 15, 2019 5 tweets 1 min read
Is it really "hard to argue with promotion unrelated to content"? Good question!

Let's imagine an automated food distribution system with a lever for each food. The harder you pull a lever, the more of that food the system makes. Which food would it choose to promote? 1/5

Heroin. Until everybody dies.

Promotion unrelated to content can go wrong, really fast.

2/5
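The lever analogy above is essentially a reward-maximizing loop with no notion of content. A minimal sketch (all item names and numbers hypothetical): a system that only measures pull strength will converge on whatever is pulled hardest, regardless of what it is.

```python
def pick_promotion(pull_strength: dict[str, float]) -> str:
    """Content-blind promoter: returns the item with the strongest
    observed engagement signal, knowing nothing else about it."""
    return max(pull_strength, key=pull_strength.get)

# Hypothetical lever readings for the food-distribution analogy.
levers = {"vegetables": 0.3, "chocolate": 0.7, "heroin": 0.99}
print(pick_promotion(levers))  # → heroin
```

The point of the sketch: nothing in the optimization loop asks whether the promoted item is good for the user, only whether the metric goes up.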
May 14, 2019 6 tweets 2 min read
This girl talking without fear of political consequences is the right to free speech.

This channel being promoted tens of millions of times by Google's algos is a "right to monetize hatred".

Google plays an active role in the promotion of this channel.

Note that YouTube similarly encourages and monetizes hatred of Christians, but since it's often not in English, we have no idea about it.
Apr 26, 2019 16 tweets 4 min read
THREAD

One week after the release of the Mueller report, which analysis of it was recommended by the most channels, among the 1,000+ channels that I monitor daily?

Russia Today's !!!

1/
This video, funded by the Russian government, was recommended more than half a million times from more than 236 different channels.



2/
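The "recommended from the most channels" metric above can be sketched like this (channel and video names hypothetical): given, for each monitored channel, the set of videos YouTube recommends alongside it, count how many distinct channels surface each video.

```python
from collections import Counter

# Hypothetical crawl result: monitored channel -> videos recommended next to it.
recommendations = {
    "channel_a": {"video_x", "video_y"},
    "channel_b": {"video_x"},
    "channel_c": {"video_x", "video_z"},
}

# For each video, count the distinct channels that recommend it.
channel_counts = Counter(
    video for recs in recommendations.values() for video in recs
)
most_recommended, n_channels = channel_counts.most_common(1)[0]
print(most_recommended, n_channels)  # → video_x 3
```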
Apr 13, 2019 8 tweets 4 min read
Thanks @cadale from YouTube for correcting Howell's misquote and giving facts. They confirm my point: that conspiracy was promoted ~1 million times in 3 days🤯

And the science video you mention has 10x more impressions, but its channel also has 26x more subscribers🤷‍♂️
1/

Let's look at the owner of that YouTube video, the conspiracy channel "Matrix Wisdom"

It has nearly the same monthly views as the top science channel @SmarterEveryDay, despite having ... 31 times fewer subscribers.

2/
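The comparison in the tweets above boils down to a views-per-subscriber ratio. A sketch with hypothetical numbers (equal monthly views, a 31x subscriber gap):

```python
def views_per_subscriber(monthly_views: int, subscribers: int) -> float:
    """How many monthly views each subscriber 'accounts for' on average.
    A high ratio suggests views come from recommendations, not subscribers."""
    return monthly_views / subscribers

# Hypothetical: two channels with the same monthly views,
# one with 31x fewer subscribers than the other.
small_channel = views_per_subscriber(10_000_000, 250_000)
big_channel = views_per_subscriber(10_000_000, 7_750_000)
print(round(small_channel / big_channel, 1))  # → 31.0
```

With equal views, the ratio of ratios is exactly the subscriber gap, which is why the subscriber comparison alone makes the recommendation boost visible.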
Mar 30, 2019 13 tweets 5 min read
If this thread was a YouTube video, I would have to call it:

"YouTube's CPO calls @zeynep's work *myths*, ex-YouTube engineer proves him wrong with *facts*"

1/

Context: @zeynep, @beccalew and many others have shown how YouTube often recommends more and more extreme content, radicalizing people:

2/
Feb 9, 2019 16 tweets 9 min read
YouTube announced they will stop recommending some conspiracy theories such as flat earth.

I worked on the AI that promoted them by the *billions*.

Here is why it’s a historic victory. Thread. 1/

bit.ly/2MMXNGn

Brian is my best friend’s in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories: flat earth, aliens & co. Now he doesn’t trust anyone. He stopped working, seeing friends, and wanting kids. 2/
Jan 24, 2019 12 tweets 7 min read
Thread.

There were really good answers to the tweet suggesting that "algorithms can't be racist because they are math", so I made a top 10.

Let's jump in.

1/

My personal favorite is Dr. @mathbabedotorg's. She is the world's leading expert on the topic, and yet her answer is benevolent and humble:

Nov 19, 2018 22 tweets 9 min read
The YouTube algorithm that I helped build in 2011 still recommends the flat earth theory by the *hundreds of millions*. This investigation by @RawStory shows some of the real-life consequences of this badly designed AI.

This spring at SXSW, @SusanWojcicki promised "Wikipedia snippets" on debated videos. But they didn't put them on flat earth videos, and instead @YouTube is promoting merchandise such as "NASA lies - Never Trust a Snake". 2/