What can we learn from social/computational science about policies to govern coordinated actors in a world of overlapping platforms and media?

Yesterday, I summarized a few points on how to understand those actors. Tonight, let's take a closer look at the ecosystem.
Most content/behavior policy debates focus on individual platforms, because that's where governance happens. But we live in a *transmedia* world, where civic life spans many media forms.

This term comes from @henryjenkins. You can read the history here: tandfonline.com/doi/full/10.10…
An excellent case study in transmedia is @schock's (open access) Out of the Shadows, Into the Streets, which looks at the immigrant rights movement. The book illustrates a media ecology approach to understanding media practices linked to civic action mitpress.mit.edu/books/out-shad…
In any movement, a network of *people* spans overlapping media ecosystems, as Internet scholars often observe. Most notably, @gilgul & co's "The Revolutions Were Tweeted" details how social media, TV, & online news had a *symbiotic* relationship in the Arab Spring

ijoc.org/index.php/ijoc…
If we see digital policy as an ecosystem issue, we see flows of power that are resilient to single-platform policies. @wphillips49's *2015* book "This is Why We Can't Have Nice Things" reveals the symbiotic relationship between mainstream media & online hate

mitpress.mit.edu/books/why-we-c…
Here's how it works: people/groups who organize harm/violence take advantage of the democratic mission & revenue goals of media orgs. They say and do things that media feel obligated to cover, which generates attention & revenue for those firms and helps trolls thrive

In the social sciences, you win awards for stating a problem clearly (and @wphillips49 won awards for that book). Whitney has gone a step further to publish guidelines for how media can cover extremists, antagonists, and manipulators: datasociety.net/library/oxygen…
In the story told by Gilad, Whitney, Sasha & others, transmedia enabled low-power actors to game the incentives of the media & gain outsized visibility/influence.

How do you make sense of a scenario where people who are already running things play a similar game? That's the US.
Sometimes, traditional institutions contribute to an ecosystem that causes harm. That's the debate over hate radio in the 1994 Rwandan genocide, where broadcasts encouraged violence. What were the effects? Straus explains why the answer is complicated

journals.sagepub.com/doi/abs/10.117…
If you want to read a cross-country analysis of how genocide comes to happen, beyond simplistic techno-determinist ideas about radio, see Straus's 2015 book "Making and Unmaking Nations: War, Leadership, and Genocide in Modern Africa."

cornellpress.cornell.edu/book/978080147…
Back in the 2010s US:
1) people with institutional power have encouraged & amplified networks of hateful speech/behavior
2) media continued to be gamed

@YBenkler, Faris, & @cyberhalroberts offered early documentation in their 2018 book Network Propaganda

global.oup.com/academic/produ…
If you want to know the state of knowledge about media ecosystems on the left & right in the US, @dfreelon, @alicetiara, & @kreissdaniel have a great summary in the Reviews section of Science that describes what we know *and* shares unresolved Qs

science.sciencemag.org/content/369/65…
This article explains the move to Telegram, Gab, Voat (and now Parler) in connection with organizing approaches in the American right going back to the 1930s. It also explains how ecosystems on the right contrast with the left.
What does this media ecosystems stuff have to do with tech policy?

It helps us understand:
- Why some are skeptical about the impact of a single ban
- How harmful groups can keep gaming the media even without social media
- How circumventable policies might still save lives
If no single platform's policies can successfully restrain a harmful ecosystem for long, then what do we do? One option is more coordination across platforms. @evelyndouek has written about this phenomenon in "The Rise of Content Cartels"

knightcolumbia.org/content/the-ri…
While I agree with Evelyn that secret, unaccountable policymaking by more companies does not do any favors for democracy or human rights, I think we need ecosystem approaches to solving ecosystem problems.

That was my argument in The Atlantic in 2015: theatlantic.com/technology/arc…
My favorite example of a cross-context, cooperative system of coordinated content moderation is The Block Bot (now Block Together), which has transparency, due process, and the beautiful idea of "propagating forgiveness." See @staeiou's paper here:

tandfonline.com/doi/full/10.10…
My research has documented cases of coordinated content moderation, where communities form networks like the ancient Delian League. To join & get protections from shared moderation, you agree to shared policies across the league and get a say in setting them

journals.sagepub.com/doi/10.1177/20…
Many people complain that tech firms waited to see what other firms did before taking action.

To a social scientist, that's not surprising. It's how people/institutions usually behave, and it's a powerful way for behaviors to spread across any network. sociology.stanford.edu/publications/t…
So far, I've answered Qs that focused on powerful actors, networks, and ecosystems that would use any tools available to organize harm.

"But you're a behavioral scientist!" you object. "Surely some policies could work on average reliably over time?" Tune in tomorrow.
Oh yes, for those of you who want to see where TV fits, @alexleavitt shared this paper by @_jenallen and co-authors that merges Nielsen & Comscore data to study how false/misleading information on TV relates to online media consumption

advances.sciencemag.org/content/6/14/e…
Update from @schock: Marsha Kinder originated the term "transmedia" cinema.usc.edu/directories/pr…

Thread by J. Nathan Matias (@natematias)
