Tarleton Gillespie
I'm an independent-minded academic, critical of the tech industry, working for Microsoft. Perplexing. My latest book is Custodians of the Internet (Yale, 2018)
Nov 2, 2022 24 tweets 6 min read
Some free advice to @elonmusk, from someone who has studied content moderation and the public impact of social media platforms for more than a decade. Buckle in. Because you clearly do not understand how this works.

Forget about content moderation for a sec, we’ll get back to it. You’re an expert in designing complex systems, so they say? Technical systems. But Twitter is a technical system and a social system, and social systems don’t work like technical systems do.
Nov 21, 2021 10 tweets 2 min read
We can criticize algorithms for being biased, and moderation policies for failing marginalized users. But this will never change until we can win the bigger societal argument: it’s only "hate speech" when it’s punching downward. [thread]

Despite glimmers of becoming more progressive, social media companies will falter. They start from a normative point of view - because of who makes decisions, who they think they’re for, and which audiences they think are most valuable to their advertisers.
Sep 14, 2021 26 tweets 14 min read
This is infuriating, and totally predictable. "In its struggle to accurately moderate a torrent of content and avoid negative attention, Facebook created invisible elite tiers within the social network." wsj.com/articles/faceb…

We're not the only ones to say it, but I'm proud that @robyncaplan and I noted this: that platforms create "tiered governance" systems inside their content moderation efforts, treating categories of users differently, in ways that aren't clear from outside. journals.sagepub.com/doi/10.1177/20…
Jul 20, 2021 31 tweets 6 min read
Today, Facebook/Instagram announced “sensitive content control” for Instagram, giving users the ability to modulate how much “sensitive content” they’re shown in the “explore” recommendation page. Some things to notice: about.fb.com/news/2021/07/i…

Though the accompanying graphic implies that this will be a user-friendly slider, a graphic farther down in the post makes clear that it requires going two pages deep into the settings and choosing one of three options: Allow, Limit, or Limit More.
Jul 20, 2021 4 tweets 3 min read
Pretty excited to share this article, written with a pile of friends and colleagues. If you're interested in a nuanced look at how metrification shapes work in the culture industries, this is for you. "Making Sense of Metrics in the Music Industries" ijoc.org/index.php/ijoc…

We surveyed and interviewed music professionals, to see how metrics shaped their work. We did not find blind faith in numbers, or flat-out rejection. Instead, numbers had to be made sense of - narrated into something persuasive to justify making an investment or taking a risk.
Apr 27, 2021 25 tweets 7 min read
I’ve been quietly writing about the “borderline content” policies at YouTube and Facebook for a while, or failing to - it’s taking me more time than I want to get all the words in the right order. But let me drop a few takeaway thoughts:

Both YouTube and Facebook instituted policies in 2018 under which they reduce the circulation of content they judge to be not quite bad enough to remove. The content remains on the platform, but it is recommended less, or not at all.
wired.com/story/youtube-…
Aug 5, 2019 16 tweets 3 min read
Thread, to a reporter, about #Cloudflare dropping 8chan: what effect would it likely have, and in which layers of the Net should accountability live? Short version: Decisions matter even if they don’t have a simple effect, and our ideas about responsibility are changing. 1/16

I think in the short term, both guesses are probably right: Cloudflare’s decision certainly doesn’t end 8chan, which will probably rematerialize in some form elsewhere; AND there will probably be some attrition of users, who either don’t find the new site or don’t want to. 2/16
Nov 6, 2018 4 tweets 1 min read
I think the Gab story is one of the most important new issues in content moderation + the power of intermediaries. How will growing demands for platform responsibility extend downward into other, more 'infrastructural' services? wired.com/story/how-righ…

We can see these infrastructural intermediaries, which have traditionally positioned themselves as impartial, struggling to justify supporting Gab: web and domain hosting, payment services, cloud services. Remember, social media platforms positioned themselves as neutral too.
Apr 9, 2018 10 tweets 3 min read
I applaud the @SSRC_org for this initiative, and Facebook for providing their data in a responsible way. Way to leverage current controversies for progressive ends! But... 1/10 ssrc.org/programs/view/…

That said, and along with the lucid comments from @natematias , here are a few things that come to mind that I hope the SSRC is thinking about as this project develops: 2/10