Big Brother Watch now out. Looking at the dissents, it does not look good for anti-surveillance campaigners: 'with the present judgment the Strasbourg Court has just opened the gates for an electronic “Big Brother” in Europe' hudoc.echr.coe.int/eng?i=001-2100…
and we go live to Strasbourg
Going to post some interesting pieces (not a judgment summary!) here. Firstly, that Contracting States can transfer Convention-compliant bulk intercept material to non-Contracting states that only have minimal protections (e.g. on keeping it secure/confidential). AKA the USA.
Court also flummoxed by the clunkiness of Microsoft Word's cross-reference function.
The Court in general throughout this judgment adopts a mode of 'if it's technically difficult, we have to adapt our tests'. They do this on notification and on bearer selection/external communication/foreseeability, for example.
The GC does not waver from the Chamber majority's view that pre-authorisation/oversight need not be done by judges acting as such, focussing instead on the possibility of sufficiently independent bodies with some legal power to stop a process (the current situation w/ IPCO).
The Court makes a bigger specific point than the Chamber of ensuring the 'categories' of selectors that can be used form part of the initial authorising warrant. But it is not specific on how granular these 'categories' are or should be (names? names of types of people?).
In effect, w a result v similar to the Chamber judgment, the Court looks at bulk surveillance in 'stages', and does not believe that obtaining mass communications & metadata is intrinsically a problem. That was ultimately what they had the chance to revisit; they did not.
We see 'stages' pop up all over in data law. In EU data protection too, a 'stages' approach is appearing regarding tracking, ignoring the effect of the entire system. Data might look like it flows, but information's journey in practice is more like building architecture than a river.
We see a failure in information law across the board of dealing with practices which construct systems. They get looked at atomistically, in isolation, and courts end up missing the forest for the trees.
Indeed, when systems have emergent properties harming human rights, but individual pieces seem defensible, courts are in a tricky situation. Strike down the whole system? What alternative comes in its place? Courts aren't policymakers; they can't design new architectures. So they're stuck.
This happens all over, not just in data law. For example, @vmantouvalou has been writing about structural injustice forcing and trapping workers in exploitation. Courts have trouble providing oversight over complex sociotechnical systems. papers.ssrn.com/sol3/papers.cf…
This confusion can work both ways. In the BBW case, Judges Lemmens, Vehabović and Bošnjak point out that the Court takes a 'global assessment' only when convenient, which undermines individual safeguards too. Unhappy medium.
They further hit back, saying that the Court allows independent oversight of only vague criteria ('grounds', 'circumstances') for selection but then leaves the rest to internal oversight. Again, falling between the cracks in an unhappy global-local medium.
Judge Pinto de Albuquerque in his own dissent has further harsh words for the Court, stating that it relied on 'educated guesses' about the way the UK regime operated; that it did not insist on the kind of detailed evidence the IPT obtained; and that the judgment was likely not 'factually sound'.
He really does not let up in his dissent. The Court's attitude of ignoring evidence is 'incomprehensible', particularly its ignorance of how bulk powers are aimed at individuals within the territorial jurisdiction of the State, and of how they are used for much broader purposes than crime.
What I am personally concerned about is that all this came before the very widespread use of machine learning in these systems (although it was used). Could a warrant specify the 'results of a risk scoring algorithm based on metadata' as its subjects for comms examination?
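To make that concrete, here is a purely hypothetical sketch, my own invention rather than anything from the judgment or any known agency system, of what a metadata risk-scoring 'selector' could look like. The features, weights and threshold are all made up:

```python
from dataclasses import dataclass

# Hypothetical illustration only: features, weights and threshold are invented.
@dataclass
class CommsMetadata:
    foreign_contacts: int   # distinct overseas contacts in the window
    night_activity: float   # share of traffic between 00:00 and 05:00
    handset_churn: float    # heuristic for frequent SIM/device changes

def risk_score(m: CommsMetadata) -> float:
    # The warrant's 'category' would have to describe this opaque function,
    # not a name or a fixed search term.
    return (0.5 * min(m.foreign_contacts / 20, 1.0)
            + 0.3 * m.night_activity
            + 0.2 * m.handset_churn)

def select_for_examination(population, threshold=0.7):
    # Everyone the model scores above the threshold becomes a 'subject'.
    return [m for m in population if risk_score(m) >= threshold]
```

Such a 'category' is only as scrutable as the model behind it, which is exactly the part that warrant-level descriptions never reach.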
Twitter gets a mention too in the dissents, as an eg of the 'absurd' situation the majority vote leaves a Twitter message in — GCHQ simply grab a copy of a UK resident's DMs via the NSA and are exempt from the need for independent programme authorisation allowing them to do so.
(Ofc you can also say all this is irrelevant bc if the ECtHR ruled against the UK’s intelligence system, the UK would leave the ECHR rather than stop bulk powers. But the ECtHR never even gets that far; they say bulk interception is NEEDED in the 21st C, despite only a tiny handful of Euro states operating such regimes)
Also important to state: this whole legal process w the many applicants (Privacy International, Liberty, Open Rights Group, BBW, etc) did lead to change and oversight, but of the legitimising type. The Snowden leaks did not blow cross-border transfers open enough. They survive.
I am also regularly reminded of this sentiment about one of the main actors in tirelessly bringing and supporting these cases, @benjaffey, who (despite this disappointing judgment) has achieved a huge amount in so many areas of UK surveillance law.
thank you for all the nice comments about the @BBCNewsnight interview! I tried to communicate infrastructure's importance. if new to you, here is a 🧵 of some (not all!) academic work by others which highlights the power of technical infrastructure (rather than eg data).
The Luca QR code Covid app (a for-profit system flogged to 🇩🇪 Länder) has been compromised (in a way that the official CoronaWarnApp’s QR system can’t be), through a website that lets you check in any phone number to wherever you want, even regional prime ministers! 🧵 on the saga (rough sketch of the flaw class at the end):
While hard to believe, Luca was adopted by Länder after huge lobbying from hospitality who convinced them that a hasty app w a 6 mo free trial for venues & big cost for health authorities would i) allow reopening, ii) help Länder win upcoming 🗳 by making national gov look slow
Luca’s slick PR campaign, through which they became known to health authorities mostly via aggressive celebrity marketing, meant that no-one discussed or scrutinised the technical details. Politicians have even admitted this, and DPAs accepted bare claims of ‘encryption’ as proof of security.
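For those asking how a check-in system can be abused like this, here is a deliberately schematic sketch of the class of flaw: an endpoint that binds a phone number to a venue without proving you control that number. The URL and field names are invented, not Luca’s actual API:

```python
import requests

# Invented endpoint and field names, illustrating the flaw class only.
CHECKIN_URL = "https://example.invalid/api/v1/checkin"

def checkin_any_number(phone: str, venue_id: str) -> int:
    # Nothing in this request proves possession of `phone`:
    # no SMS challenge, no signed token from the victim's device.
    resp = requests.post(CHECKIN_URL, json={
        "phoneNumber": phone,   # attacker-chosen victim
        "venueId": venue_id,    # attacker-chosen location
    })
    return resp.status_code
```

The CoronaWarnApp’s QR flow avoids this class of problem by design: check-ins stay on the device, so there is no server-side record to forge on someone else’s behalf.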
Lots of selected thoughts on the draft leaked EU AI regulation follow. Not a summary but hopefully useful. 🧵
Exemptions from the blacklisted art 4 AI practices (except general scoring) include state use for public security, including by contractors. Tech designed to ‘manipulate’ ppl ‘to their detriment’, to ‘target their vulnerabilities’, or to profile comms metadata in an indiscriminate way remains v possible for states.
This is clearly designed in part not to, eg, further upset France after the La Quadrature du Net case, where black-box algorithmic systems inside telcos were limited. The same language the CJEU used appears in Art 4(c). Clear exemptions for orgs ‘on behalf’ of the state to avoid CJEU scope creep.
Updates announced to England and Wales presence tracing (QR checkin) app functionality. (short thread translating what these things mean)
1st point isn't actually a change to the app or regulations. The regulations have always required everyone individually to scan in if they used the app, but allowed a 'lead member' to represent a group of up to six. This would abolish the latter. More: michae.lv/law-of-qr/
Venue history upload is controversial. The DPIA has not yet been updated to show the privacy-preserving method that the press release claims is used. It may also make people wary of uploading venue data. Cannot analyse without further information.
We outline current approaches to accessing enclosed data, and argue that GDPR transparency, access and portability rights can be a powerful bottom-up, adversarial data access tool, if used well.
We outline the nature of those transparency provisions for those unfamiliar, and show how they can be used, elaborating on legal, ethical and methodological challenges — a bit like a mini-manual. A lot more could be said — but we hope this helps researchers make a good start.
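To give a flavour of the ‘mini-manual’ side: once a machine-readable export arrives (via access or portability rights), a first pass can be as simple as the sketch below. The file name and structure are invented for illustration; real exports differ wildly between controllers, which is itself one of the methodological challenges we discuss:

```python
import json
from collections import Counter

# Assumed (invented) layout: a JSON list of {"timestamp": ..., "type": ...}
with open("export.json") as f:
    events = json.load(f)

# Tally what kinds of events the controller actually holds about you.
by_type = Counter(e["type"] for e in events)
for event_type, n in by_type.most_common(10):
    print(f"{event_type}: {n}")
```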