⏰BREAKING: HMG quietly rolled out a scheme to seize - & sell access to - the health data of every man, woman, and child in England. Patients weren’t asked.
Legal issues? We think so. So we helped @JustTreatment send a legal letter. @madhumita29 in @FT: ft.com/content/9fee81…
In many ways, this fight is about the future of the NHS.
The NHS sits atop the most valuable trove of health data in the world. Why? For years, your GP record has been stored not in doctors' scrawled notes but as GP codes, which a computer can 'read'. (h/t @marcus_baw)
This makes NHS health data of *massive* interest to researchers. So far, so good - we all want the NHS to come out of the pandemic stronger.
But there are issues: who gets access? On what terms? Who can patients trust? And who benefits – us, the NHS, or private companies?
We’ve been here before.
Groups like @medConfidential – who sounded the alarm here – beat back the last flawed effort to centralise GP data.
Look up ‘care.data’. HMG didn’t assure patients that their data wouldn’t be accessed for profit - so millions opted out. It collapsed.
We’re worried the government’s taken the wrong lesson from care.data – not to go carefully, but to rush this through and hope people don’t notice. That's undemocratic - and it betrays the trust people put in the NHS.
We think hiding from this debate is silly and may be unlawful.
Patients are open to the use of their health data – to benefit the NHS, in ways they can trust (ie without toxic firms like Palantir). They're less keen on Big Tech and Big Pharma profiting off the NHS.
We truly hope it won’t be necessary to litigate. The gov't could see sense and talk to folks. But it’s not OK to slip this by without a full debate about what we want done with our health data – and what we don’t.
Special mention to our friends at @TBIJ who obtained key emails at the start of this case, showing Palantir wooing NHS execs and UK officials over Davos chats and watermelon cocktails.
2/ We love @signalapp. (We used to use @WhatsApp lots until Facebook took our data!)
Disappearing messages are great for us, the citizens. They're not appropriate for officials.
Why, you ask?
3/ Simple. The Public Records Act 1958 requires officials to review every message about the formulation of government policy to perform a legal check – in case it needs to be archived for public release.
If the message explodes, the check can never happen.
Last year, the government signed the largest data deal in history between the NHS and giant tech firms – like Palantir and Faculty, a British AI start-up involved with the Vote Leave campaign.
We took legal action and, with your support, forced them to publish the contracts.
Breaking: More than 200 @Facebook content moderators (and supporting employees) from across the US and Europe have published a demand for better protections - and full employment rights - during the pandemic.
This is the biggest joint international effort by Facebook content moderators yet. We are proud to have worked with them to do it. Many more moderators at other sites wanted to sign, but were too intimidated by Facebook - these people are risking their livelihoods to speak out.
They've taken that risk because Facebook has risked their lives by forcing content moderators back to the office. Facebook's own full-time staff are allowed to work from home - many mods have to come in, even with live Covid cases on the floor, as @samfbiddle reported: theintercept.com/2020/10/20/fac…
NEW: last night, @Ofqual pulled from its website its guidance about the algorithmic grading appeals process. We’re posting it and a letter we've sent this AM because we think it opens up a new ground of challenge. (tl;dr – it basically works like this) foxglove.org.uk/news/grading-a…
First, the position was that the algorithm was sacrosanct and more reliable than mock exams or teachers’ assessed grades.
Ofqual then changed tack, saying mock exams could be more reliable than the algorithm.
THREAD: hundreds of you emailed in anguish about the unfair grading algorithm. This case affects you all, and we are asking for crowdfunding support (gofundme.com/f/fairgrades20…), so we wanted to post our letter as soon as possible. It’s at foxglove.org.uk/news/grading-a…. Key points:
Ofqual just hasn’t got the power to mark students based on an algorithm which actively ignores individual student performance for cohorts over 15. In law-speak, this is called ‘ultra vires’.
The algorithm is irrational – it treats teachers’ rankings as sacrosanct, yet teachers’ assessed grades count for everything in small cohorts (under 5) and for nothing in large ones (over 15).
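The cohort-size thresholds described above can be sketched roughly as follows. This is an illustrative sketch only, not Ofqual's actual model: the source states only that teachers' assessed grades count fully under 5 and not at all over 15, so the weighted blend for the in-between range is a hypothetical assumption for illustration.

```python
def final_grade(cohort_size: int, teacher_grade: int, algorithm_grade: int) -> int:
    """Illustrative sketch of the thresholds described in the thread.

    NOT Ofqual's real standardisation model. The 5-15 blending rule
    below is an assumption added for illustration.
    """
    if cohort_size < 5:
        # Small cohort: teacher's assessed grade counts for everything.
        return teacher_grade
    if cohort_size > 15:
        # Large cohort: individual performance is ignored entirely;
        # the algorithm's grade stands.
        return algorithm_grade
    # In between: a hypothetical weighted blend of the two.
    weight = (cohort_size - 5) / 10
    return round((1 - weight) * teacher_grade + weight * algorithm_grade)
```

The point of the legal challenge is visible in the branches: for any cohort over 15, the function's output does not depend on the individual student at all.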