Is YouTube's algo still taking people down the rabbit hole?
Right now, the algorithm is promoting to millions of teens a conspiracy video that literally says:
"We ask you to suspend your disbelief and take a journey down the rabbit hole" (1:37)
It's on a channel called "After Skool", so clearly targeting kids/teens.
It was so massively recommended that it reached 1 million views in a few days.
One of the top comments is revealing:
"Hey YouTube? What's this doing in my recommendations? If I didn't have a basic understanding of science and the ability to recognize logical fallacies this could've sent me down the path to conspiracy paranoia"
I helped Aude @WTFake_ with this YouTube video showing that YouTube is recommending to kids that @realDonaldTrump, the Pope, soccer, Notre Dame, @EmmanuelMacron, many brands, etc... are hidden satanists. 1/3
But for conspiracy theorists and @Google, it was an opportunity to gain market share. @YouTube's AI understood it and recommended this dumb conspiracy hundreds of thousands of times from 520+ channels. 1/2
The AI also recommends videos claiming Notre Dame is Satanic, because it's good at making people watch ads: 2/2
4 months ago @Google promised they would recommend conspiracy theories less. I'll release a full report soon, but it's hard for me to take them seriously.
Also in France, YouTube's algorithm is massively promoting "bloody civil war" narratives, millions of times. "Bloody civil war" is engaging, and performs extremely well for ad revenue 💸
This video shows that many of YouTube's rules can be broken in practice, and are just there to give YouTube leverage. Small creators have no idea which rules can be broken, creating unfair competition with huge, cheating channels like Lama Faché.
The video shows that artificially inflating subscriber counts (e.g. with fake contests) draws hardly any sanction from YouTube.
While the first picture of a black hole was being released, @YouTube's AI massively recommended the video "They Found Something In Outer Space", which claims that humans are the result of genetic engineering by aliens who came from Planet 9 to extract gold:
The AI recommended this conspiracy millions of times from more than 169 different channels, including European Space Agency and Northrop Grumman
Today was a historic high for astronomy, and a historic low for AI.
I was very enthusiastic about YouTube's announcement, but let's be honest: after two months, things have barely changed. (cf: algotransparency.org)
I'll do an overview of what changed with recommendations next month.
Brian is my best friend’s in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co. Now he does not trust anyone. He stopped working, seeing friends, and wanting kids. 2/
Brian spends most of his time watching YouTube, supported by his wife.
For his parents, family and friends, his story is heartbreaking.
But from the point of view of YouTube’s AI, he’s a jackpot.
The YouTube algorithm that I helped build in 2011 still recommends the flat earth theory by the *hundreds of millions*. This investigation by @RawStory shows some of the real-life consequences of this badly designed AI.
This spring at SxSW, @SusanWojcicki promised "Wikipedia snippets" on debated videos. But they didn't put them on flat earth videos, and instead @YouTube is promoting merchandising such as "NASA lies - Never Trust a Snake". 2/
A few examples of flat earth videos that were promoted by YouTube #today: 3/