#THREAD
@TaylorLorenz asks the $1m question: Who's "responsible" for online mob harms?

So, responsibility: there's legal responsibility and there's moral responsibility.

And not just the creator & the platform. There's the responsibility of the audience/users too!
1/
Let's look at legal responsibility first.

1. Audience members:
People who harass and threaten are always legally responsible for their own conduct. They can be sued for defamation or harassment. Or even arrested. BUT racism isn't illegal and is often protected speech. 2/
Plus, harassment requires a course of conduct directed at another person. So if you have 100k people each saying or doing one hostile act, none of them individually is harassing, even though the victim's experience is one of harassment.

2. The Creator
If they are inciting their audience to harass 3/
the Creator could be legally responsible for the overall harassment of the gestalt of users. Prob not criminally, but it'd be an interesting lawsuit.

3. The Platform
The platform is in the best position to be legally liable because they're in the exclusive position to 4/
help the harmed individual. BUT, in the US, the platforms are immune from legal responsibility because of Section 230 of the 1996 Communications Decency Act, which courts have interpreted broadly to say platforms aren't responsible for users' words or even conduct. 5/
Section 230, though, doesn't apply to federal crimes. So the platform CAN be held responsible for federal crimes like cyberstalking and interstate threats. The DOJ has only once held a platform criminally liable for user conduct. 6/
Now let's look at moral responsibility.

1. Audience
No question somebody who says racist and bigoted and threatening things is morally responsible for their own bad acts.

2. Creator
If Creator produces content they know is causing fans to injure people, they owe a moral 7/
duty to not cause injury. All the more so if they are monetizing the content that's causing harassment. And tenfold more if the main purpose of the content is to embarrass and harass folks.

3. Platform
The biggest moral responsibility falls on the entity 8/
best positioned to stop it. IMO, that's the platform. They are minting money off their creators and audience members, mining their data, advertising at them. If their business model involves injuring people, they're selling a dangerous product and should be held liable as makers of all consumer 9/
products are. The platforms have the resources and have or should have the staff to prevent and resolve user harms. So if they don't, it's morally reprehensible. They may claim they don't want to overcensor. In which case, they're making the conscious decision to 10/
choose hateful speech and harassment over user safety. (mind you, the users they are minting money off of). It's even more morally wretched if they are choosing not to moderate their content simply because Section 230 says they legally aren't responsible. 11/
Until we start holding the platforms (and their founders, investors, VCs, shareholders, employees) morally responsible for the harms to their users and the public, we will never reach a point where we can change the laws so they'll be held legally responsible. 12/12
Also, some folks might say that if TikTok creators are the ones being trolled, perhaps they're assuming that risk. To a certain extent, it's true they should expect to be the subject of disparagement and criticism as public figures. BUT celebs and influencers don't sign up 13/12
for threats and harassment and racism. It's not part of the job. And we need to stop thinking of abuse and privacy violations as the price of fame. 14/12