Remember that @NCOSE doesn't believe in the right to sex worker speech. They want it deplatformed and prosecuted. When they say #shutitdown, they're not just talking about Pornhub — they want sex and sex workers off the internet entirely.
The next stated target is @OnlyFans. They say they want to save creators from the "psychological, emotional and physical harm" they are inflicting on themselves.
They are also targeting Twitch, Twitter, TikTok, Discord, and Reddit with a pressure campaign to remove sex- and sex-worker-related content.
They're also going after @Netflix, for "gratuitous" nudity. 👉"Researchers at NCOSE recently looked into 10 of the top original Netflix titles and found that 9 out of 10 featured graphic on-screen sex scenes." endsexualexploitation.org/netflix/
They want the company to remove books from photographer Sally Mann, who, you know, has only been awarded a Guggenheim and three grants from the National Endowment for the Arts. But to NCOSE, she's just a child pornographer.
The thing is, none of this is particularly well-hidden. So they can say "oh, we're not anti-porn, we're just anti-exploitation" but to them it's the same thing. It's like saying "I'm not against gay people, I'm against homosexual acts."
They always have been "Morality in Media" — the group that's gone after sex toys, sex education, and librarians. Even if they don't want you to know it. #shutitalldown en.wikipedia.org/wiki/National_…
The #NCOSE "Congressional" hearing is now suggesting that the entire porn industry is illegal user-uploaded content and needs to be shut down.
The #NCOSE lawyer is suggesting that anyone who works with or partners with sex work platforms — including Visa/Mastercard — should face criminal liability for anything on that platform. It's like Section 230 in reverse.
It's hard to express how bad their information is. Laila is now using comments from users who say "she looks like she's underage" on a video as evidence of actual CSAM.
The problem with bad studies is they get replicated as headlines with no context. Yesterday, the British Journal of Criminology (@CrimeandJustice) published a study that purports to detail the frequency of sexual violence in video descriptions. 1/ academic.oup.com/bjc/advance-ar…
According to the study, 12% of the videos on the homepage of the major tube sites show some form of "sexual violence" — a headline that's since been replicated globally. 2/