Letter: We stand in solidarity with @LauraEdelson2 & @cyber4democracy, who were punished by Facebook for studying the company's impact on democracy

We call for changes by Facebook, tech firms, & regulators in support of accountability research. Pls sign!

edelson-solidarity.neocities.org
We see Facebook’s actions against NYU as part of a long-standing pattern among large technology firms, all of which have systematically undermined accountability and independent, public-interest research. Here are some of those stories:
In 2019 Facebook shut down a similar transparency tool created by the news organization ProPublica

propublica.org/article/facebo…
Google has recently fired or pushed out multiple researchers who published work on the risks of artificial intelligence

washingtonpost.com/technology/202…

techcrunch.com/2021/02/19/goo…
In 2018 and 2019, Twitter delayed and shut down multiple research projects designed to reduce online hate and harassment

wsj.com/articles/jack-…
Amazon attempted to discredit and undermine research about discrimination resulting from its facial recognition algorithms

technologyreview.com/2020/06/12/100…
Facebook has effectively dismantled the team running CrowdTangle, a tool used by journalists and researchers to examine public pages and groups

firstdraftnews.org/articles/%E2%8…
As we stand with Edelson & NYU, we also believe tech firms have shown they cannot be trusted to decide how their own businesses should be held to account.

That's why we appeal to regulators to compel companies to cooperate with accountability research

edelson-solidarity.neocities.org
Gizmodo has covered our letter here: gizmodo.com/facebook-kille…


