📢NEW📢: we've won the Palantir case!

At last, officials admit YOU have a right to a say in long-term deals between the NHS and US tech firms like Palantir.

It only took us a year, thousands of you speaking up, and two court cases with @openDemocracy to get there!
Here’s our write-up by @martyftz and @cori_crider.

They’ve hit pause – they won’t extend Palantir’s NHS role beyond Covid without stopping to consider our rights and ask us first.

This is a major step. The NHS now says it’ll consult the public.

opendemocracy.net/en/ournhs/weve…
Special mention to our friends at @TBIJ who obtained key emails at the start of this case, showing Palantir wooing NHS execs and UK officials over Davos chats and watermelon cocktails.

thebureauinvestigates.com/stories/2021-0…
And hats off to our amazing barristers, who worked so, so hard on this - Victoria Wakefield QC of @brickcourt, Julianne Kerr Morrison of @moncktonlaw, and Jennifer MacLeod, also at Brick. 🥰
The future of the NHS matters. The NHS is sitting atop a massive £10bn-a-year trove of health data. US tech firms like Palantir hope to access or manage it for the long term.

Hancock’s plans for a revamp could build a future-fit NHS, or clear the path for US tech cos to profit.
We need an NHS we can trust. That means no access to data people don’t know about and agree to (the datasets fed into the datastore are STILL redacted).

It means no shady business partners (Palantir’s main clients are the CIA, cops, US border forces, and now the Home Office).
And it means no deals like this without getting our permission.

Palantir snuck in via emergency COVID-related contracts - now they’re trying to weasel their way into the future of our nation’s healthcare system.
Palantir fails the trust test. Their founder and chair is far-right Trump donor Peter Thiel, whose stated agenda is to extend his company’s reach across government.

nymag.com/intelligencer/…
We still have work to do. By standing together, we’ve slowed Palantir’s roll – but they’ll be back for more.

There are good, trustworthy ways to unlock health data for the NHS. Or we could let Google & Palantir in. Which will it be? Have your say.

foxglove.org.uk/get-involved

More from @Foxglovelegal

29 Mar
🔥NEW: You can’t run a government by disappearing messages.

1/ When @BorisJohnson texts @MattHancock or @pritipatel over WhatsApp or Signal, he may be breaking UK law.

We've teamed up with @allthecitizens for a legal action. @jimwaterson has the story:

theguardian.com/law/2021/mar/2…
2/ We love @signalapp. (We used to use @WhatsApp lots until Facebook took our data!)

Disappearing messages are great for us, the citizens. They're not appropriate for officials.

Why, you ask?
3/ Simple. The Public Records Act 1958 requires officials to review every message about the formulation of government policy to perform a legal check – in case it needs to be archived for public release.

If the message explodes, the check can never happen.
24 Feb
🚨 NEW: We’ve filed a lawsuit with @openDemocracy defending YOUR right to a say about data deals between the NHS and Big Tech.

@Rowlsmanthorpe has the story, for @SkyNews.

Saddle up, people! 🏇 [THREAD] -

news.sky.com/story/nhs-faci…
Throughout the pandemic, we’ve seen private firms creep into our public institutions at a shocking rate – through chummy, opaque backdoor deals.

One such deal was with @PalantirTech - a shady data firm that mainly works with the CIA, US cops, and ICE.

opendemocracy.net/en/ournhs/cont…
Last year, the government signed the largest-ever data deal between the NHS and giant tech firms – like Palantir and Faculty, a British AI start-up involved with the Vote Leave campaign.

We took legal action and, with your support, forced them to publish the contracts.
18 Nov 20
Breaking: More than 200 @Facebook content moderators (and supporting employees) from across the US and Europe have published a demand for better protections - and full employment rights - during the pandemic.

foxglove.org.uk/news/open-lett…
This is the biggest joint international effort of Facebook content moderators yet. We are proud to have worked with them to do it. Many more moderators in other sites wanted to sign, but were too intimidated by Facebook - these people are risking their livelihood to speak out.
They've taken that risk because Facebook has risked their lives by forcing content moderators back to the offices. Full-time Facebook staff are allowed to work from home – many mods have to come in, even with live Covid cases on the floor, as @samfbiddle reported: theintercept.com/2020/10/20/fac…
16 Aug 20
NEW: last night, @Ofqual pulled from its website its guidance about the algorithmic grading appeals process. We’re posting it and a letter we've sent this AM because we think it opens up a new ground of challenge. (tl;dr – it basically works like this) foxglove.org.uk/news/grading-a…
First, the position was that the algorithm was sacrosanct and more reliable than mock exams or teachers’ assessed grades.
Ofqual then changed tack, saying mock exams could be more reliable than the algorithm.
14 Aug 20
THREAD: hundreds of you emailed in anguish about the unfair grading algorithm. This case affects you all, and we are asking for crowdfunding support (gofundme.com/f/fairgrades20…), so we wanted to post our letter as soon as possible. It’s at foxglove.org.uk/news/grading-a…. Key points:
Ofqual just hasn’t got the power to mark students based on an algorithm which actively ignores individual student performance for cohorts over 15. In law-speak, this is called ‘ultra vires’.
The algorithm is irrational – it treats teachers’ rankings as sacrosanct, yet teachers’ assessed grades are treated as either everything (for cohorts under 5) or nothing (for cohorts over 15).
12 Aug 20
📢Potential new case Klaxon!📢

Thousands of A-level students get their “results” tomorrow. Covid-19 meant exams were cancelled. So, instead, their grades will in many cases have been generated by a hastily-built government algorithm.

We’ve got concerns about this algorithm.
Firstly, as is so often the case, there’s a lack of transparency about what this algorithm does and how it works.

This is unacceptable. Students have a right to understand how decisions which affect them are made. And algorithmic decision-making needs to be open to scrutiny.
Secondly, it appears the algorithm grades schools rather than students: “Where a subject has more than 15 entries in a school, teachers’ predicted grades will not be used”.

So an individual student’s life chances hang on an estimate based on their school’s historic performance.
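To make those thresholds concrete, here is a minimal sketch of the cohort-size logic as summarised in these tweets – an illustration of our reading only, not Ofqual’s actual model; the function name, thresholds and labels are assumptions drawn from the quotes above.

```python
# Illustrative sketch of the cohort-size rules described in this thread.
# NOT Ofqual's code: names and return labels are assumptions for illustration.

def grade_source(cohort_size: int) -> str:
    """Which input dominates a student's grade, per the thread's summary."""
    if cohort_size < 5:
        # Small cohorts: teachers' assessed grades count for everything.
        return "teacher assessed grade"
    elif cohort_size > 15:
        # Large cohorts: teachers' predicted grades are not used at all;
        # the grade rests on the school's historic results plus teacher rankings.
        return "school's historic results (individual performance ignored)"
    else:
        # In between: a blend of the statistical model and teacher input.
        return "blend of model and teacher input"

for size in (4, 10, 30):
    print(size, "->", grade_source(size))
```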