Expert in AI and Datascience at https://t.co/QpIXDy0xRG / Founder at https://t.co/Q9wpElUUH9 / Advisor at Center for Humane Technology / Ex-Google
Opinions are my own
Jul 2, 2021 • 5 tweets • 1 min read
WOW: Facebook is issuing a cease and desist to my favorite Chrome extension: Unfollow Everything
It unfollows all your FB friends, so you can re-follow people who matter and take back control of your newsfeed
It saved me 1 hour per day.
Here's how Facebook tries to kill it ⬇️
Unfollowing all your friends and pages is hard on Facebook. It's not a bug, it's a feature.
2/
Jan 12, 2021 • 8 tweets • 2 min read
Update: Following my tweet on Thursday, Google fixed its autocomplete, which had been suggesting that "civil war is inevitable"
Here's what Google autocomplete for "civil war is " looked like before Thursday:
THREAD
Today, there is only one suggestion saying that civil war is coming: 2/
Dec 7, 2020 • 10 tweets • 3 min read
YouTube's deep learning AI massively boosted two news channels in the US last month:
➡️Newsmax TV went from 3M to 110M views/month (x33!)
➡️New Tang Dynasty TV (NTD): 6M to 80M views/m (x13)
They both made wild claims of voter fraud. Why does the AI like them?
Thread ⬇️
I already made a thread about Newsmax TV last week (see below), so let's focus on NTD
My first op-ed in @WIRED: how the AI feedback loops I helped build at YouTube can amplify our worst inclinations, and what to do about it.
wired.com/story/the-toxi…
1/
Earlier this year a YouTuber showed how YouTube's recommendation algorithm was pushing thousands of users towards sexually suggestive videos of children, used by a network of pedophiles.
YouTube bans sexual videos. What happened?
2/
May 22, 2019 • 7 tweets • 1 min read
.@coffeebreak_YT shows how the trending tab is heavily biased towards TV channels, at the expense of native YouTube creators
Having worked at YouTube, I know they don't intentionally bias algorithms, but they can pick a metric and not investigate enough to see if it creates bias.
Here's my theory about it. 2/
May 21, 2019 • 5 tweets • 3 min read
Thread
One of my favorite YouTubers, @veritasium (5M subscribers), explains 3 problems creators face:
1/ They are constantly chasing the algorithm
2/ Clickbait is the key to virality
3/ Users don't see videos from channels they subscribed to
1/ Creators having to chase the algorithm is a disaster for independent ones, because it gives bigger companies a huge advantage (e.g., the publishing company TheSoul produces 1,500 videos per month)
Is it really "hard to argue with promotion unrelated to content"? Good question!
Let's imagine an automated food distribution system that has a lever for each food. The harder you pull, the more energy the system gets. Which food would it choose to promote? 1/5
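The lever analogy above can be sketched in a few lines of Python. This is a toy illustration of an engagement-maximizing selector, not YouTube's actual code; the function name and data are hypothetical:

```python
# Toy sketch of an engagement-maximizing selector: it promotes whichever
# item historically produced the hardest "lever pulls" (engagement),
# regardless of whether that item is good for the user.

def pick_promotion(engagement_log):
    """Return the item with the highest total engagement.

    engagement_log: list of (item, pull_strength) pairs.
    """
    totals = {}
    for item, pull_strength in engagement_log:
        totals[item] = totals.get(item, 0) + pull_strength
    # The system "chooses" the item that generated the most pulling.
    return max(totals, key=totals.get)

# Hypothetical data: junk food tends to get the hardest pulls.
log = [("salad", 1), ("fries", 5), ("salad", 2), ("fries", 6), ("candy", 4)]
print(pick_promotion(log))  # → fries
```

The point of the analogy: a system optimized only for how hard the levers are pulled will converge on promoting the most addictive option, with no notion of what the content actually is.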
Note that YouTube similarly encourages and monetizes hatred of Christians, but it’s often not in English so we have no idea about it.
Apr 26, 2019 • 16 tweets • 4 min read
THREAD
One week after the release of the Mueller report, which analysis of it did YouTube recommend from the most channels among the 1000+ channels that I monitor daily?
Russia Today's !!!
1/
This video funded by the Russian government was recommended more than half a million times from more than 236 different channels.
2/
Apr 13, 2019 • 8 tweets • 4 min read
Thanks @cadale from YouTube for correcting Howell's misquote and giving facts. They confirm my point: that conspiracy was promoted ~1 million times in 3 days🤯
And the science video you mention has 10x more impressions, but it also has 26x more subscribers to its channel🤷‍♂️ 1/
Context: @zeynep, @beccalew and many others have shown how YouTube often recommends more and more extreme content, which is radicalizing people:
2/
Feb 9, 2019 • 16 tweets • 9 min read
YouTube announced they will stop recommending some conspiracy theories such as flat earth.
I worked on the AI that promoted them by the *billions*.
Here is why it’s a historic victory. Thread. 1/
bit.ly/2MMXNGn
Brian is my best friend’s in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co. Now he does not trust anyone. He stopped working, seeing friends, and wanting kids. 2/
Jan 24, 2019 • 12 tweets • 7 min read
Thread.
There were really good answers to the tweet suggesting that "algorithms can't be racist because they are math", so I made a top 10.
The YouTube algorithm that I helped build in 2011 still recommends the flat earth theory by the *hundreds of millions*. This investigation by @RawStory shows some of the real-life consequences of this badly designed AI.
This spring at SxSW, @SusanWojcicki promised "Wikipedia snippets" on debated videos. But they didn't put them on flat earth videos, and instead @YouTube is promoting merchandising such as "NASA lies - Never Trust a Snake". 2/