random facts girl. (@soychicka), 11 tweets, 3 min read
Point blank: this is fucked up.

Clearly what FB needs to do is not sit someone down in front of a screen for hours evaluating whether something is child pornography; they need to allow users to be trained and certified as "citizen moderators."
businessinsider.com/sarah-katz-wha…
And in the instance described above, there is ZERO reason why an image that has already been marked as a violation isn't detected automatically the next time it is posted or reported...
And once an image is validated by a certified moderator, all images on the platform should be periodically scanned to identify other instances of it, recording the last scan date to avoid duplication and starting with images posted by untrusted/suspect/new accounts.
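The re-upload detection described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual system: production tools (e.g. PhotoDNA, PDQ) use perceptual hashes that survive resizing and re-encoding, whereas an exact SHA-256 match is used here only to show the lookup flow.

```python
import hashlib


class ViolationIndex:
    """Sketch: detect re-uploads of images already marked as violations.

    Assumes exact byte-for-byte matches; real systems use perceptual
    hashing so crops and re-encodes of the same image still match.
    """

    def __init__(self):
        self._known = {}  # image hash -> case id

    def mark_violation(self, image_bytes: bytes, case_id: str) -> None:
        """Record a moderator-validated violation so future uploads match."""
        self._known[hashlib.sha256(image_bytes).hexdigest()] = case_id

    def check(self, image_bytes: bytes):
        """Return the original case id if this image was already flagged."""
        return self._known.get(hashlib.sha256(image_bytes).hexdigest())


index = ViolationIndex()
index.mark_violation(b"...offending image bytes...", case_id="case-001")
print(index.check(b"...offending image bytes..."))  # case-001
print(index.check(b"some brand-new image"))         # None
```

With an index like this in place, a repeat upload never needs a human to look at it a second time; the match routes straight to removal.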
Once some users are promoted to certified moderator status, the content those users view but do NOT report can serve as a positive baseline against which bad actors and content can be compared.
If even 1 of every 10k, or 1 in 100k, users is allowed to trigger expedited content removal, content that violates the rules, especially content going viral, will be suspended much earlier, reducing every other user's exposure to harmful material.
And I'm sure you can calculate probabilities that reveal, within given subnetworks of 1-10M users, clusters of inauthentic accounts based on how frequently known moderators follow or interact with those accounts.
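One simple way to operationalize that idea is to score each account cluster by the share of its interactions that come from certified moderators and flag clusters far below the network-wide norm. The threshold and the scoring rule here are illustrative assumptions, not a real platform's algorithm.

```python
from collections import defaultdict


def flag_suspect_clusters(interactions, moderators, threshold=0.01):
    """Illustrative sketch: flag clusters with unusually few
    moderator interactions (a possible sign of inauthenticity).

    interactions: iterable of (user_id, cluster_id) pairs.
    moderators:   set of user_ids certified as moderators.
    threshold:    assumed cutoff on the moderator-interaction share.
    """
    total = defaultdict(int)      # interactions per cluster
    from_mods = defaultdict(int)  # moderator interactions per cluster
    for user, cluster in interactions:
        total[cluster] += 1
        if user in moderators:
            from_mods[cluster] += 1
    # Return suspect clusters with their moderator-interaction share.
    return {
        cluster: from_mods[cluster] / n
        for cluster, n in total.items()
        if from_mods[cluster] / n < threshold
    }


mods = {"mod1", "mod2"}
data = [("mod1", "A"), ("u1", "A"),
        ("u2", "B"), ("u3", "B"), ("u4", "B")]
print(flag_suspect_clusters(data, mods))  # {'B': 0.0}
```

Cluster "A" has moderator engagement and passes; cluster "B" has none and gets flagged for review. In practice the baseline would need to account for cluster size and topic, since small or niche communities naturally see fewer moderators.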
Using perceived celebrity as the sole basis for defining who can request rapid response is unacceptable, and is biased towards handling only some types of bad content.
Social media companies need to be held accountable in a way where each minute following the initial report that content is in violation accrues an additional penalty;
every account that is exposed to that content after the initial report and subsequently reports it before removal should be compensated (or allowed to contribute its compensation to a victims' fund)...
And every account exposed to said content that did NOT report content deemed inappropriate by a citizen or contract/employee moderator should be penalized in some way - certainly a reduction in trust, e.g. decreasing exposure in search results, limiting exposure to non-followers.
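The per-minute penalty above can be made concrete with a tiny formula. The linear rate is an assumption for illustration; a real scheme might escalate superlinearly as viral content reaches more users.

```python
def accrued_penalty(minutes_since_first_report: float,
                    rate_per_minute: float = 100.0) -> float:
    """Sketch of the proposed accountability penalty: a flat charge
    for every minute violating content stays up after its first report.
    The rate (100.0/minute) is a hypothetical placeholder.
    """
    return max(0.0, minutes_since_first_report) * rate_per_minute


# Content left up for 90 minutes after the first report:
print(accrued_penalty(90))  # 9000.0
```

Tying the penalty clock to the *first* report, rather than to eventual moderator review, is what creates the incentive to act fast.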
We know @facebook + @twitter collect the data that can make this work; if you really want to improve adherence to your platform's guidelines, allowing, encouraging, and demanding community enforcement and participation in moderation is one way to start. @yoyoel @delbius