Anti-adult-work policies, often driven by American morality (for OF: the morality of its payment processors and the people lobbying them), put marginalized people - often women, people of color, and LGBTQ+ folks - at serious risk.
But, if you're seeing this, here's a short summary:
These rule changes can mean a huge and unexpected loss of income.
As far as we can tell from our still-in-progress research, many OF creators are new to S*x W*rk. While experienced S*x W*rkers anticipated the realities of deplatforming, less experienced folks may not have.
While you may think of OF as a side gig, for many people, it's not.
As a security researcher, I'm sometimes shocked that the assumptions we make - that every system can be broken, that no system is truly private - are rarely held by the broader CS community.
A short thread of recent examples.
In preparing a COVID-19 paper, I'm reminded of our efforts to submit our initial work on people's willingness to adopt based on privacy vs. accuracy to @NeurIPSConf.
Reviewers claimed it was unnecessary to consider privacy in a decentralized system because it was perfectly private.
@airbnb uses AI to detect whether a user is a sex worker, mentally ill, or otherwise "undesirable".
Based on this algorithm, #airbnb bans these folks, even when they were *not* looking to use the property for, e.g., #sexwork.
Multiple sex workers we interviewed in Europe (Germany/Switzerland) reported this problem extensively, even though sex work is legal there.
This is a classic case, as many workers discussed, of Americanized tech damagingly applying our "ethics" to the rest of the world.
Our interview data further confirms that this practice is prevalent internationally. It denies legal sex workers - among other groups deemed "undesirable" by AI - the ability to participate in a large and fast-growing part of the gig economy, even though they are gig workers themselves!