Pretty excited to share this article, written with a pile of friends and colleagues. If you're interested in a nuanced look at how metrification shapes work in the culture industries, this is for you. "Making Sense of Metrics in the Music Industries" ijoc.org/index.php/ijoc…
We surveyed and interviewed music professionals to see how metrics shaped their work. We did not find blind faith in numbers, or flat-out rejection. Instead, numbers had to be made sense of - narrated into something persuasive to justify making an investment or taking a risk.
Metrics are powerful, and those who have more access to data enjoy more of that power. But numbers are not by themselves enough. They are approached with skepticism, remain open to interpretation, and must be transformed into something convincing.
Today, Facebook/Instagram announced “sensitive content control” for Instagram, giving users the ability to modulate how much “sensitive content” they’re shown in the “explore” recommendation page. Some things to notice: about.fb.com/news/2021/07/i…
Though the accompanying graphic implies that this will be a user-friendly slider, a graphic farther down in the post makes clear that it requires going two pages deep into the settings and choosing one of three options: Allow, Limit, or Limit More.
Notice that “limit” is the default. So, despite this being presented as a tool to manage sensitive content, it in fact gives Instagram users one additional position on either side of the current offering: a stricter standard, and a looser one.
I’ve been quietly writing about the “borderline content” policies at YouTube and Facebook for a while, or failing to - it’s taking me more time than I want to get all the words in the right order. But let me drop a few takeaway thoughts:
Both YouTube and Facebook instituted policies in 2018 under which they reduce the circulation of content they judge to be not quite bad enough to remove. The content remains on the platform, but it is recommended less, or not at all. wired.com/story/youtube-…
They're not the only ones. Tumblr blocks hashtags; Twitter reduces content to protect “conversational health”; Spotify keeps select artists off of their playlists; Reddit quarantines subreddits, keeping them off the front page. And other platforms do it without saying so.
Thread, to a reporter, about #Cloudflare dropping 8chan: what effect would it likely have, and in which layers of the Net should accountability live? Short version: Decisions matter even if they don’t have a simple effect, and our ideas about responsibility are changing. 1/16
I think in the short term, both guesses are probably right: Cloudflare’s decision certainly doesn’t end 8chan, which will probably rematerialize in some form elsewhere; AND there will probably be some attrition of users, who either don’t find the new site or don’t want to. 2/16
But I do think we can get too focused on whether a single decision will or will not have a definitive effect, and we overlook the cumulative and the symbolic value of a decision like Cloudflare’s. 3/16
I think the Gab story is one of the most important new issues in content moderation + the power of intermediaries. How will growing demands for platform responsibility extend downward into other more 'infrastructural' services? wired.com/story/how-righ…
We can see these infrastructural intermediaries, that have traditionally positioned themselves as impartial, struggling to justify supporting Gab: web and domain hosting, payment services, cloud services. Remember, social media platforms positioned themselves as neutral too.
Even those forcefully arguing to keep Gab aren't just proclaiming neutrality: they're protecting speech, defending Gab, accusing others of censorship, etc. Value judgments, cloaked as the absence of value judgments. Even here, the veneer of neutrality is flimsier than we thought.
I applaud the @SSRC_org for this initiative, and Facebook for providing their data in a responsible way. Way to leverage current controversies for progressive ends! But... 1/10 ssrc.org/programs/view/…
That said, and along with the lucid comments from @natematias , a few things come to mind that I hope the SSRC is thinking about as this project develops: 2/10
Right now this only includes Facebook data. Makes sense, as a start, given their size and impact. But if the @SSRC_org initiative aims to understand "social media’s impact on society," then this must include more platforms than just Facebook. 3/10