The day after Mastercard unveils extensive consent verification for all content, @NCOSE says that verification doesn't actually matter *after all* — because it could be coerced.
Their answer: Criminal prosecution of those who sell adult content.
After all, according to NCOSE it could be ... Artificial Intelligence, and AI can't consent!
Don't know how many times we have to say this, but @NCOSE and @ExodusCry are not serious about #Traffickinghub. It's a means to an end to get adult content off the internet, and they'll move the goalposts every time. You cannot negotiate with the American Taliban.
How fun! Just got a letter from NCOSE asking me to join them in getting Congress to prosecute pornographers.
Because "monetized commercial sex" is human trafficking.
Here we go. @NickKristof calls for Paypal to stop working with adult companies — with no mention of the real ways CSAM is shared online. Because private social media channels and the Dark Web aren't of interest to the evangelicals he works with. nytimes.com/2021/04/16/opi…
His source again is Laila Micklewait, who — again — is not with Exodus Cry, an Evangelical organization that wants to stop sex work, but with "The Justice Defense Fund."
And just for good I-don't-care-about-actual-data measure, he throws in the terrible study from the British Journal of Criminology and *drumroll* entirely misrepresents it. Now it shows *actual* sexual violence.
If you're a journalist who listens solely to evangelicals, this sort of patrolling of adult content seems reasonable and easy. If you listen to sex workers — the ones most likely to be harmed — you would understand why it's not.
There's nothing in these new rules that limits this to porn sites. Unless Mastercard clarifies, we should assume this applies to every site that is home to adult content on the internet — from Wikipedia to Twitter.
Pornhub, which evangelicals used @NickKristof to take down, complies with all the regulations. (They're still going after it.) The Mastercard rules will take care of the rest — the ones that are responsible, that don't take user-generated content, the small performer platforms.
Mastercard has announced it will require documentation of "clear, unambiguous, and documented consent" for all adult images on any site that processes payments using its credit cards. It's hard to see this as anything but a disaster for sex workers. 1/
While everyone — especially sex workers — wants content on these networks to be legal and consensual, regulations such as these will either mean social media sites like Twitter and Reddit will now have to store performer IDs and model releases or... 2/
...more likely, just stop dealing with adult content entirely. If not, they risk not being able to process payments on their network. It's hard to explain how short-sighted this is, and how devastating it will be to independent performers 3/
Remember that @NCOSE doesn't believe in sex workers' right to speech. They want it deplatformed and prosecuted. When they say #shutitdown, they're not just talking about Pornhub — they want sex and sex workers off the internet entirely.
The next stated target is @OnlyFans. They say they want to save the creators from the "psychological, emotional and physical harm" they are inflicting on themselves.
They are also targeting Twitch, Twitter, TikTok, Discord, and Reddit with a pressure campaign to remove sex and sex worker–related content.
The #NCOSE "Congressional" hearing is now suggesting that the entire porn industry is illegal user-uploaded content and needs to be shut down.
The #NCOSE lawyer is suggesting that anyone who works with or partners with sex work platforms — including Visa/Mastercard — should bear criminal liability for anything on that platform. It's like Section 230 in reverse.
It's hard to express how bad their information is. Laila is now using comments from users who say "she looks like she's underage" on a video as evidence of actual CSAM.
The problem with bad studies is they get replicated as headlines with no context. Yesterday, the British Journal of Criminology (@CrimeandJustice) published a study that purports to detail the frequency of sexual violence in video descriptions. 1/ academic.oup.com/bjc/advance-ar…
According to the study, 12% of the videos on the homepage of the major tube sites show some form of "sexual violence" — a headline that's since been replicated globally. 2/