Daniel Cook, 16 tweets, 3 min read
Read the Spirit AI interview on RPS with interest. It shows how hard it is to have a conversation about community moderation.

Three perspectives:
- Wronged user
- Moderator
- Social designer
1. User perspective: "I have been wronged/abused/etc. Mods should make it stop immediately. They are clearly in the wrong, from an unarguable moral standpoint, if they do not." Black and white, zero wiggle room.
2. Human mod perspective: "OMG, torrents of comments. Cold reality: It is very difficult to track all conversations, reports, etc. We do our best, but often even minor games would need lots of human bodies just to keep up with players 24/7. Budget is a thing too."
3. Social designer: "If your community is toxic, you need to reestablish positive social norms. This means shutting down toxicity, yes. But it also means a lot of small pushes that teach people how to behave in this community."
To the user, it often looks like the mods aren't doing anything. There's always some truth to this: portions of the conversation get missed because mods don't have the tools, the reach, or the resources to do anything meaningful.
And users really don't get the social designer's perspective. They want immediate, clear responses to their immediate, clear problems. Fuzzy talk about social norms and systemic issues is gibberish. "Who cares! When are you going to actually *do* something!"
The social designer, however, sees the toxicity problem as one of toxicity *generation*. You can whack all the toxic reports you want (and we should!), but if the community is trained to be toxic to one another, or the core gameplay sparks drama, then there's no improvement.
And with large communities, it is quite hard to banhammer your way to positive norms. While it is critical to make examples of bad behavior, you only see real progress if you also make clear examples of good behavior. You can't dictatorship your way to a healthy society.
The moderators are caught in the middle of all this. They are often overworked, and dealing with an infinite, generative fountain of toxicity. There are some wonderful techniques great community managers use to help a community gain positive standards, but it is *such* hard work.
Tools like Spirit AI (which I haven't used) that focus on helping community managers encourage a community to be more positive seem like a smart direction. It starts to systemically address the root of the toxicity vs just the individual toxic moments.
(^ This right here is a hard conversation to have with almost anyone. 'Fixing a concrete event' vs 'Tuning *rates* of events' breaks most discussions. Sadly, modern rhetoric was built to win shouting matches between angry apes, not fix subtle systems.)
I'm definitely on the social design side of all this. I tend to start even earlier and ask "what game mechanics breed toxicity?" There are a ton of broken social systems that inevitably induce rage (ex: strangers playing high-trust, collaborative activities.)
What if we fixed busted social systems at the *start* of game development, instead of asking community managers to stem the bleeding later on?
Now, in all this, I don't blame the players. The abuse is real. The toxic communities are real. The pain is real. We need to triage what we can. Ban away, ban away!
In the end, however, I guarantee the solution will not be purely based on banhammers or blunt authoritarianism. Righteous people insisting on 'more punishment' is ultimately naive.

We need to foster positive communities. Where the proper way to act is to be kind to one another.
The original interview, for folks who are interested