Some thoughts on the NHSX Datastore DPIA that was published, disappeared, and now reappeared at england.nhs.uk/publication/da… (Follow @owenboswarva to keep track of where it was last seen.)
In an earlier thread I talked about three data streams:
- app,
- track & trace,
- dashboard;
two purposes:
- tracking ill & suspected & contacts
- planning overall response.

This DPIA is about the dashboard for planning.

App data is NOT mentioned. Odd: it serves planning too!
Data that goes into the dashboard: listed in DPIA "but not exhaustive", also link to england.nhs.uk/contact-us/pri….

Includes individual-level health data, some but not all pseudonymised, no app data, and various non-personal data.
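(For the record: "pseudonymised" here typically means direct identifiers are replaced by keyed hashes, so records stay linkable across datasets without carrying e.g. the NHS number in the clear. A minimal sketch of that idea, with an invented key and record:)

```python
import hmac
import hashlib

# Hypothetical secret key, held only by the data controller.
SECRET_KEY = b"kept-in-a-vault-by-the-controller"

def pseudonymise(nhs_number: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    The same input always maps to the same token, so records can still
    be linked, but the identifier itself is not in the dataset."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# Invented record: swap the NHS number for its token before storage.
record = {"nhs_number": "9434765919", "postcode_district": "CT2"}
record["patient_token"] = pseudonymise(record.pop("nhs_number"))
```

Note this is pseudonymisation, not anonymisation: whoever holds the key (or an outside dataset to join on) can still get back to the individual, which is exactly why the re-identification risk comes up later in this thread.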
Cover page england.nhs.uk/contact-us/pri… lists more companies besides Palantir, MS, Google and Faculty, whose contracts have been published but came with promises of "more soon", so maybe they were already expecting the McKinsey, Deloittes (sic) and ANS Group contracts.
Screening questions:

Is this new tech, e.g. AI? "No but it is expected that new algorithms will be created to support targeted analysis." Hmmm. Algorithm creation = ???

Difficulties in ensuring data rights? "No". But informing data subjects was impossible due to haste.
But anyway: this is large-scale processing of special category data, so a DPIA is compulsory, and they continue.

Can it be done without personal data? "No" - I agree, but the answer is tautological. We want to do this, it involves personal data, so we can't do without it.
Is NHSE under a legal obligation to carry out the processing? "Yes". So brings in 6(1)(c) just in case 6(1)(e) wasn't enough.
This explains why the COPI notice gov.uk/government/pub… doesn't just allow but *requires* processing.
Will the processing involve data about racial origin? "No".

That's the safe answer in terms of data protection but it also seems to preclude planning to address the disproportionate effect of COVID on BAME people, as per the PHE report that was finally published partially.
This bit gets fishy.

"Would it be helpful to seek advice from independent experts (clinicians, security experts, ethicists etc.) where their specialist knowledge would be useful in understanding and managing privacy risks?"

(Answer always "yes", but ...)
Answer: "Subject matter experts are involved in ensuring that the processing meets safe, efficient and effective standards."

Missing the word "independent". Crucially bad.

NHSX have an Ethics Advisory Board. Not involved.
"Will any other stakeholder(s) (whether internal or external) need to be consulted about the proposed processing (e.g. NHSE Central team, Public Health England, NHS Digital, the Office for National Statistics)?"

Answer: NHSX will monitor everything carefully.

That's 3x bad.
(at *least*)
1. Not answering the question.
2. Not taking consultation seriously, once again.
3. Outrageous for NHSX to do a huge data project like this without involving existing expertise within the NHS, including the guardians of the input datasets.
It's "innovate and break things".
See the link below to the security risk analysis. It works as well in Twitter as in the pdf.
As a security bod, I'd say there's an argument for eliding the specific controls, but much less so for the risks. [image]
App data isn't listed as a data source, but there's some odd text under the Access heading.

Pivotal (app developer) will get temporary read access and "Following this, an SQL account will be created which the application will use to read/write the database for certain task(s)."
Odd text because "the application" is something that is intended to run in millions of instances on phones. Are they all between them going to share an SQL account to write directly into the Palantir NHSX dashboard? Not very sensible or likely.
Misdescribed information flow.
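The conventional flow, and presumably what is actually meant, is that the phone apps only ever speak HTTP to a single backend service, and only that service holds an SQL credential. A toy sketch of that pattern (credential, endpoint logic and the "sonar_id" field are all invented for illustration):

```python
import json

# Hypothetical: the one SQL service account lives only on the backend,
# never on any of the millions of phone clients.
SQL_ACCOUNT = {"user": "app_service", "password": "kept-in-a-vault"}

def handle_submission(body: str) -> tuple[int, str]:
    """What a backend endpoint would do with each phone's POST:
    parse, validate, and (stubbed here) write to the database using
    the single server-side credential. Phones see HTTP, never SQL."""
    try:
        payload = json.loads(body)
        payload["sonar_id"]  # hypothetical required field
    except (ValueError, KeyError):
        return 400, "rejected"
    # write_to_store(SQL_ACCOUNT, payload)  # stub: single server-side write
    return 202, "accepted"
```

On that reading, "the application" in the DPIA would mean this one backend service, not the millions of app instances, which is why the text as written misdescribes the flow.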
Summarising thoughts then.

1. This is a DPIA for the data store, not for the full dashboard. Faculty only gets a casual mention.
This limits the data processing to pseudonymisation of some input streams and putting them all together.
Palantir is the only processor mentioned.
The "algorithms created" is the main thing that escapes this artificially limited scope.

But overall it is deceptive. We don't have a DPIA for what is going to be done with the data. Will it follow? Is it being hidden, like this one from "April 2020"?
I wondered whether the "No" to the screening question on "making decisions or taking actions against individuals in ways which can have a significant impact on them?" was right. But it is: databases don't take decisions.
2. Consultation has been broadly ignored. Not with independent external experts, not with data subjects, not even with organisations within NHS that have been processing large medical data sets for a lot longer than this year's new baby NHSX.
Leaving it at that for now.
Correction on what I said on "data rights" above: haste precluded consultation, not notification. I'm sure @RDBinns @mikarv @Jausl00s could tell you access rights to pseudonymised datasets aren't easy anyway ...
See below for more on the odd text re: app developers, the "admin account" with read-only access, who the pentesters might be, and "app" data.

Also being told that the list of input databases, now no longer visible, was much longer than the one at england.nhs.uk/contact-us/pri…
Revisiting this thread after talking to @journoandrea and others, and finally reading the Palantir contract (see my short thread on that). More summative comments.
3. Differences between DPIA and contract with Palantir (only data processor in DPIA) matter...
The contract is for much wider processing than just setting up a data store after some more pseudonymisation. So it validates my view that the DPIA is artificially narrow in scope. "Data analytics", "Support tracking, surveillance, and reporting" not in DPIA but in contract.
The "contradiction" between contract enabling data export outside EEA, and processing other special category data besides health, may be apparent only. The DPIA should be more recent and may reflect a decision to limit some of the riskier aspects of contract.

But it may be real.
4(th summative remark on DPIA). Coming back to the picture below: my comment above was far too generous. It's not just the security risks that are elided behind an unclickable picture, it's ALL of them. [image]
As a consequence: this DPIA as published is a cheat.

The heart of a DPIA,
"What could possibly go wrong?",
including attacks, failing design assumptions, and function creep, is not being shared.

I dedicate this tweet to @tim2040 who first taught me & my students on (D)PIA.
See this by @journoandrea digitalhealth.net/2020/06/contac… on the data feed from app to dashboard.

Don't let the news about app in the long grass detract from ongoing plans with the data dashboard. With dubious partners.
Update: the risk analysis spreadsheet for COVID Data Store DPIA is now available (last site update 15-6). england.nhs.uk/wp-content/upl…
7 risk areas, worst G/A: data outside EEA, misuse by users, processor agreements, security, encryption, testing, re-identification.
So, does this answer "What could possibly go wrong?" adequately? Not really: it covers 1st-order effects like misuse and disclosure, but not the consequences of those for rights and freedoms; not function creep; not the risks to individuals as listed in the template xls.
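Re-identification, the last risk area in that list, deserves a concrete illustration: pseudonymised records often keep quasi-identifiers (postcode district, year of birth) in the clear, and those can be joined against an outside dataset. A toy example, with entirely invented data:

```python
# Invented pseudonymised health records: the token hides the identity,
# but postcode district and year of birth remain as quasi-identifiers.
health_records = [
    {"token": "a91f", "postcode": "CT2", "yob": 1960, "condition": "asthma"},
    {"token": "b77c", "postcode": "ME4", "yob": 1985, "condition": "diabetes"},
]

# Invented outside dataset (think electoral roll) with the same fields.
voter_roll = [
    {"name": "A. Smith", "postcode": "CT2", "yob": 1960},
    {"name": "B. Jones", "postcode": "ME4", "yob": 1985},
]

def reidentify(records, outside):
    """Join on (postcode, yob); any unique match defeats the pseudonym."""
    matches = {}
    for r in records:
        hits = [o["name"] for o in outside
                if (o["postcode"], o["yob"]) == (r["postcode"], r["yob"])]
        if len(hits) == 1:  # unique quasi-identifier combination
            matches[hits[0]] = r["condition"]
    return matches
```

The sparser the quasi-identifier combinations, the more of the dataset re-identifies this way, which is why "re-identification" sits in the risk list even though direct identifiers were removed.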
Thread by Eerke Boiten.