Here we go. @NickKristof calls for PayPal to stop working with adult companies — with no mention of the real ways CSAM is shared online. Because private social media channels and the Dark Web aren't of interest to the evangelicals he works with. nytimes.com/2021/04/16/opi…
His source again is Laila Mickelwait, who — again — is identified not with Exodus Cry, an evangelical organization that wants to stop sex work, but with "The Justice Defense Fund."
And just for good I-don't-care-about-actual-data measure, he throws in the terrible study from the British Journal of Criminology and *drumroll* entirely misrepresents it. In his telling, it now shows *actual* sexual violence.
But perhaps my favorite @NickKristof-ism is the phrase "rapists, real or fake."
This is a man who has a history of misrepresenting sex trafficking and sex crimes. He's a true believer in what evangelicals are selling (at least when it comes to sex), and he leaves tremendous damage in his wake. washingtonpost.com/blogs/erik-wem…
This is the man who, in recklessly going after Backpage with claims similar to those he's now making about adult sites, helped push sex workers offline and onto the streets. theguardian.com/commentisfree/…
It is NOT that we shouldn't go after CSAM and revenge porn. They are a scourge. But there are ways to do it that don't strip sex workers of things like banking and online ads that actually protect them!
But that would take someone — either Kristof or the evangelicals who feed him information — believing that #sexworkisrealwork. It would take looking at the actual data. This column is gasoline on a fire. It's irresponsible and dangerous. Shame on @nytimes for platforming this.
I can't stress enough how unserious @NickKristof is. I just googled "rape unconscious girl," which he has called on Google to remove. It's CLEARLY fictitious. I don't know who made it — most payment processors ban it — but he's demanding Google police fantasy.
This is no longer about CSAM or non-consensual content (if it ever was). If it were, he and Exodus Cry wouldn't constantly go down these wormholes where they pretend not to know the difference between fake and real. They want it all down.


