Mastercard has announced it will require documentation of "clear, unambiguous, and documented consent" for all adult images on any site that processes payments using its credit cards. It's hard to see this as anything but a disaster for sex workers. 1/

mastercard.com/news/perspecti…
While everyone — especially sex workers — wants to ensure that content on these networks is legal and consensual, regulations such as these will mean that social media sites like Twitter and Reddit will either have to store performer IDs and model releases or... 2/
...more likely, just stop dealing with adult content entirely. If they don't, they risk losing the ability to process payments on Mastercard's network. It's hard to overstate how short-sighted this is, and how devastating it will be to independent performers. 3/
Mastercard's plan *sounds* so good in theory — just document consent — but they have no idea what that takes, or what the real effect will be. 4/
They are also asking for "Content Review Prior to Publication" — this is something adult sites do, but social media does not. You want to upload something to Twitter? You just do it and it appears. Now, someone will have to review it, and have multiple avenues to block it. 5/
Mastercard is also asking that anyone who is depicted in such a video be able to petition to remove it through an appeals process. Does this mean that someone who was of legal age and signed a model release can now get a video taken down? Unclear! 6/
They're also asking for additional certification from banks that provide financial services to adult sites. "Banks that connect merchants to our network will need to certify that the seller of adult content has effective controls in place ... 7/
"... to monitor, block and, where necessary, take down all illegal content." My read on this is that it will become even harder for sex workers and adult businesses to access banking. 8/
And that we'll see additional restrictions from Dropbox, Google Drive, Paypal ... just about everyone. 9/
In fact, adult sites will have this covered with little problem. We already know how to handle model releases and IDs and have systems in place. Most large companies don't ... and likely won't bother. 10/
Visa/Mastercard are BY FAR the largest censors of legal adult content online, and have been for years. The government actually needs proof, credit card processors just need to whisper *liability,* and a whole legal industry is crushed. 11/
Hopefully I'm wrong. But the situation seems dire. Crypto couldn't have come at a better time. #MasterCensors #VisaVictims
And as @itslittlepbigd just pointed out, how do you pre-approve live content like cams?
