Tens of thousands of words, hundreds of lines of code and countless data points later, ya girl submitted her (first) master's thesis!

In it, I investigated how YouTube's recommendation algorithm is shaping access to information and democratic discourse around the world.
"A thesis about YouTube?" you might ask. "Isn't that quite frivolous and inconsequential?"

Let me convince you otherwise.
YouTube is one of the most-used websites in the world. It draws roughly two billion users per month - close to half of everyone with internet access.

In places like India, YouTube has replaced Google as the place people turn to ask questions & learn about the world.

wsj.com/articles/india…
A billion hours of content is viewed on the site EVERY DAY, across more than 100 countries.

70% of that -- 700,000,000 hours -- is driven by the recommendation algorithm. But no one really knows how it works.
The algorithm aims to show users more of what they want so they'll stay on the site and drive up ad revenue.

Which is great when what you want is, say, endless @bonappetit cooking videos.

Not so great when it delivers videos of children to pedophiles. nytimes.com/2019/06/03/wor…
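To make that incentive concrete, here's a minimal sketch of engagement-driven ranking (a toy model with hypothetical names and numbers; YouTube's real system is proprietary and far more complex):

```python
# Toy illustration of engagement-driven ranking. This is NOT YouTube's
# actual system; every name and number here is hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # a model's guess at how long this user will watch

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # Rank purely by predicted engagement: whatever is expected to keep
    # the user watching longest rises to the top. Nothing in the
    # objective asks whether a video is accurate or safe.
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]

candidates = [
    Video("Calm cooking tutorial", 4.0),
    Video("Outrage-bait conspiracy video", 11.0),  # controversy often holds attention longer
    Video("Balanced news explainer", 6.0),
]

for video in recommend(candidates):
    print(video.title)  # the conspiracy video ranks first
```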
What's dangerous is that the type of content that keeps users most engaged tends to be controversial, extremist, or straight-up untrue.

@gchaslot, a former YouTube engineer who worked on the recommendation algorithm, shared an example in this thread:
Brazil is an excellent case study.

Parents searching for info about Zika are being served anti-vax conspiracies. Young conservative provocateurs are gaining followings on YT, getting elected to federal office, and using their platform to attack rivals.
nytimes.com/2019/08/11/wor…
YouTube has long been a haven for extremist ideologies.

Reporters were warning about the rise of white supremacist content on the site AS EARLY AS 2006 -- a year after YouTube launched.
13 years later -- in March of this year -- a self-described white supremacist told viewers to "subscribe to PewDiePie" while he livestreamed himself opening fire in a New Zealand mosque.
Researchers like @beccalew have long been warning about the dangers of YouTube's unchecked algorithm.

Last year, she published a @datasociety report mapping how the "Alternative Influence Network" uses the platform to spread extremist political ideology.

datasociety.net/output/alterna…
Malicious actors intent on pushing a certain agenda understand how to forge connections between videos. YouTube is also frequently cited by members of the alt-right on Twitter and in other online forums.
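For a sense of how those connections might arise mechanically, here's a toy sketch of an item-to-item "co-watch" recommender, assuming a simple co-occurrence heuristic -- the data and channel names are invented, and YouTube's actual pipeline is unknown:

```python
# Toy sketch of how cross-promotion can steer an item-to-item recommender.
# Purely hypothetical data and logic -- not YouTube's actual pipeline.
from collections import defaultdict
from itertools import combinations

# Each session lists videos one (simulated) user watched back to back.
# Guest appearances push audiences across channels, so mainstream and
# fringe videos start co-occurring in the same sessions.
sessions = [
    ["mainstream_politics_vlog", "fringe_guest_interview"],
    ["fringe_guest_interview", "extremist_channel_upload"],
    ["mainstream_politics_vlog", "fringe_guest_interview", "extremist_channel_upload"],
]

# Count how often each pair of videos is watched together.
co_watch = defaultdict(int)
for session in sessions:
    for a, b in combinations(sorted(set(session)), 2):
        co_watch[(a, b)] += 1

def related(video: str) -> list[str]:
    # A naive "watch next" list: videos that most often co-occur with this one.
    scores = defaultdict(int)
    for (a, b), count in co_watch.items():
        if video == a:
            scores[b] += count
        elif video == b:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)

# Starting from a mainstream video, fringe content is now one hop away.
print(related("mainstream_politics_vlog"))
```

Once fringe guests appear alongside mainstream hosts, their content sits just one recommendation hop from the mainstream audience.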
There's a fascinating interactive @nytimes story mapping the watch history of a user named Caleb Cain who spent several years in an alt-right YouTube "rabbit hole":

nytimes.com/interactive/20…
It's frightening that we know so little about the algorithms that curate our online experiences.

Due to their proprietary nature, we don't have access to the inputs or design. And because they're driven by machine learning, even their engineers don't fully know how they work.
YouTube has lagged behind other major social media platforms in taking down dangerous or hateful content.

Some governments are pushing for greater accountability, but regulation of a multinational private tech company is an incredibly complicated matter.
YouTube's vulnerabilities are being exploited by malicious actors, which can have serious real-world implications.

This should concern you!
In sum, YouTube's profit model and unchecked algorithm have contributed to a culture of disinformation and radicalization on- and offline. While the company claims to have made changes in recent years, its lack of transparency makes those claims hard to verify and their impact hard to measure.
Anyways. There's LOADS more where that came from. This issue is fascinating and troubling and important.

I hope to publish parts of my thesis soon. Let me know if you're interested in reading it.
(PS my bibliography is six pages long and contains gems like this)
And if you work in media, hit me up if you're interested in publishing 😉
From @techreview two days ago:

YouTube is experimenting with ways to make its algorithm even more addictive technologyreview.com/s/614432/youtu…