Today the Science & Technology Select Committee published its report about regulating social media. Here it is: parliament.uk/business/commi… Couple of thoughts. #Thread
It’s not bad. It says that platforms should have a ‘duty of care’ for kids using their services - such as default high privacy settings & filtering certain harmful content. It also wants Ofcom to regulate that duty, so it can check whether platforms are up to scratch & fine them if not. Reasonable, imho
Other positives: mandatory PSHE classes, more data for researchers, and tech companies investing in AI responses to ‘deep fakes’ (the first time a Select Committee has brought this up, perhaps?). I can get behind all this. (Check this deep fake: )
However, it also repeats the canard that social media companies aren’t doing much about harmful images of children. This is unfair, as @akrasodomski & I explain here: demos.co.uk/project/tech-e… The UK is a world leader in this.
Also - and this is a red flag - it suggests that platforms should verify their users’ ages. How? Presumably through some identity verification system they control. Do we want these companies holding still more information about us? Beware of building this sort of infrastructure.
Plus, we need to be careful about exactly what we are regulating. All companies are becoming tech platforms. Gaming sites - barely mentioned - are also social media in a sense. I’m against a single regulator for that reason: it will be outdated very quickly.
As I explain in this thread, algorithms can’t fix the problem of bad content. Facebook et al should take the job of content moderators far more seriously - they must be well-paid, trained professionals with subject-matter expertise.
Politicians increasingly just say ‘algorithms can surely fix it!’ in response to any problem, without understanding what algorithms are or how they work. They can help filter a lot, but they cannot fix the problem of harmful content. We can’t outsource our problems to a machine.
I don’t go in for the ‘publisher’ vs ‘platform’ debate. I think platforms are somewhere in between & need regulation for expeditious content removal & regulation to check on their behaviour. This report seems to be roughly in that ballpark.
No point complaining all the time and then dismissing any & all policy suggestions as useless. There are a few things I think are wrong, but some good things in there too. Next stop: the forthcoming white paper on internet harms - we’ll see what they’ve taken on. /end
Thread by Jamie Bartlett.