15 tweets, 4 min read
I feel conflicted about this. Given the vast number of posts per second on social media platforms, the answer will be algorithmic in the vast majority of situations. I want to see #mentalhealth organisations make the case for why social media as a public space is vital for folks
At present there are no 'unlimited fines' for media companies that publish news, comment, or dramatic presentations in contravention of guidelines on reporting suicide.
At present I don't trust the combination of legislation and social media companies to avoid creating a solution that drives people experiencing #mentalhealth difficulties further into the shadows by making individuals' discussion of certain experiences taboo.
There's a difficult line of nuance to walk with the issue of social media and self harm, given social media may be the only place where individuals can openly discuss their experiences, and given many folks carry on their body 'evidence' of those experiences. An algorithmic nightmare
The greatest test case I can think of is media coverage of Richey Edwards from Manic Street Preachers in the early 90s, who famously self-harmed in the presence of music journalists. Would anyone sharing any reference to that story now be at risk of 'unlimited fines'?
I also worry that the current discussion about self harm 'content' and social media treats it as just that, 'content', rather than making a distinction between malicious and unintentional harm. Because those 4channy troll types know very well how to use self harm material maliciously
I think there might be a number of categories of intention in social media posts about self harm. First might be triggering but unintentional (discussion of own experience). Second might be socially endorsing (own experience plus endorsement). Third might be malicious triggering
Some self harm material is passive (it exists in a social media space and can be found). Some self harm material is targeted (sent as a form of harassment or online violence, passive material sent purposely to a vulnerable individual). Some is discursive or shared conversation
The horrible perverse possibility is that people with lived experience of self harm trying to help others minimise damage and risk will likely be seen as sharing the most risky self harm content as this is easy to see as promoting or normalising. This worries me greatly
I think the difference is that for people with long term experience of self harm, self harm is already a reality in their lives. I rarely see people talk about this with anything other than responsibility. But those conversations would be hotspots for triggering algorithmic intervention
But the challenge is that informed, respectful conversations about self harm that take place in social media spaces still generate material that could be seen as dangerous, and will create content with a concentration of self harm related material, even if the intent is benign and positive
Like I said at the top of the thread, I'm conflicted. The constituency I belong to and work within (people with lived experience of mental health difficulty) requires us to be able to talk freely and openly with each other. Social media has made that possible centreformentalhealth.org.uk/blog/writer-re…
I suppose what I'm worried about is that the concerns of people who don't live with #mentalhealth difficulty around self harm content will lead to social media companies deciding the best approach is to algorithmically nuke all of our discussions from orbit, pushing us out of public space