The Act (new trendy EU name for a Regulation) is structured by risk: from prohibitions to 'high risk' systems to 'transparency risks'. So far so good. Let's look at the prohibitions first.
The Act prohibits some types of manipulative systems. The EC itself admits these have to be pretty extreme — a magic AI Black Mirror sound that makes workers work far beyond the Working Time Directive, and an artificially intelligent Chucky doll. Would it affect anything real?
Concerned with platforms' power to map & reconfigure the world w/ ambient sensing? I'm *hiring* a 2-year Research Fellow (postdoc) @UCLLaws. Think regulating Apple AirTags (UWB); Amazon Sidewalk (LoRa), and—yes—Bluetooth contact tracing. (please RT!) 1/ atsv7.wcn.co.uk/search_engine/…
Just as platforms wanted to be the only ones who could sell access to populations based on how they use devices, they want to determine and extract value from how physical space is used and configured. There is huge public value from this knowledge, and huge public risk. 3/
Hey Microsoft Research people who think that constant facial emotion analysis might not be a great thing (among others), what do you think of this proposed Teams feature published at CHI to spotlight videos of audience members with high affective ‘scores’? microsoft.com/en-us/research…
Requires constantly pouring all face data on Teams through Azure APIs. Especially identifies head gestures and confusion to pull audience members out to the front, just in case you weren’t policing your face enough during meetings already.
Also note that Microsoft announced on Tuesday that it is opening up its Teams APIs to try to become a much wider platform to eat all remote work, so even if Teams didn’t decide to implement this directly, employers could through third party integration! protocol.com/newsletters/so…
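As described in the paper, the feature boils down to ranking audience feeds by an affect score from a face-analysis model and spotlighting the maximum. A hypothetical sketch of that logic (names and scores invented; this is not Microsoft's code):

```python
# Hypothetical sketch, not Microsoft's implementation: rank participants'
# video feeds by an affective score (e.g. a "confusion" confidence from a
# face-analysis API) and pick the one to spotlight.
def spotlight(scores: dict[str, float]) -> str:
    # scores: participant name -> affect score in [0, 1]
    return max(scores, key=scores.get)

assert spotlight({"alice": 0.2, "bob": 0.9, "carol": 0.4}) == "bob"
```

The point is how little logic it takes: once per-face scores flow through an API, "pull the most confused face to the front" is one line for any third-party integrator.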
Big UK GDPR case: Court of Appeal rules in favour of @OpenRightsGroup & @the3million: the Immigration Exemption to SARs is incompatible with Art 23 GDPR. This is a new exemption from 2018 which the Home Office uses to withhold data rights info in 59% of cases. bailii.org/ew/cases/EWCA/…
Warby LJ is sympathetic to CJEU jurisprudence that 'the legal basis which permits the interference with those rights must itself define the scope of the limitation', noting that the Immigration Exemption is highly discretionary, and the DPA18 does not contain limits on its scope.
However, Warby LJ judges the case more narrowly on a reading of Article 23(2), which permits Member States to restrict a GDPR right for public interest only if a 'legislative measure' contains limiting provisions.
Big Brother Watch now out. Looking at the dissents, it does not look good for anti-surveillance campaigners: 'with the present judgment the Strasbourg Court has just opened the gates for an electronic “Big Brother” in Europe' hudoc.echr.coe.int/eng?i=001-2100…
and we go live to Strasbourg
Going to post some interesting pieces (not a judgment summary!) here. Firstly, that Contracting States can transfer Convention-compliant bulk intercept material to non-Contracting states that only have minimal protections (e.g. on keeping it secure/confidential). AKA the USA.
thank you for all the nice comments about the @BBCNewsnight interview! I tried to communicate infrastructure's importance. if new to you, here is a 🧵 of some (not all!) academic work by others which highlights the power of technical infrastructure (rather than eg data).
The Luca QR code Covid app (a for-profit system flogged to 🇩🇪 Länder) has been compromised (in a way that the official CoronaWarnApp’s QR system can’t be), through a website that lets you check in any phone number to wherever you want—even regional prime ministers! 🧵 on the saga:
Hard to believe, but Luca was adopted by Länder after huge lobbying from hospitality, which convinced them that a hasty app with a 6-month free trial for venues & big costs for health authorities would i) allow reopening, ii) help Länder win upcoming 🗳 by making the national gov look slow
Luca’s slick PR campaign, where they became mostly known to health authorities by aggressive marketing w celebrities, meant that no-one discussed or scrutinised the technical details. Politicians have even admitted this, and DPAs accepted statements of ‘encryption’ as secure.
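The reported compromise belongs to a simple class of flaw: a check-in endpoint that trusts a caller-supplied phone number without any proof of possession (such as an SMS challenge). A minimal sketch of that flaw — illustrative only, not Luca’s actual API:

```python
# Illustrative sketch only, NOT Luca's real API: a check-in endpoint that
# accepts a caller-supplied phone number with no verification that the
# caller controls that number.

checkins: dict[str, list[str]] = {}  # venue_id -> phone numbers "present"

def check_in(venue_id: str, phone_number: str) -> None:
    # Flaw: nothing here (e.g. an SMS challenge) proves the caller owns
    # phone_number, so anyone can place any number at any venue.
    checkins.setdefault(venue_id, []).append(phone_number)

# An attacker can "place" a politician's (hypothetical) number at a venue:
check_in("bar-123", "+49 30 0000000")
assert "+49 30 0000000" in checkins["bar-123"]
```

That a web form could do this is exactly the kind of detail that scrutiny of the technical design — rather than acceptance of ‘encryption’ claims — would have caught.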
Lots of selected thoughts on the draft leaked EU AI regulation follow. Not a summary but hopefully useful. 🧵
Exemptions from the Art 4 AI blacklist (except general scoring) include state use for public security, including by contractors. Tech designed to ‘manipulate’ people ‘to their detriment’, to ‘target their vulnerabilities’, or to profile comms metadata in an indiscriminate way remains very possible for states.
This is clearly designed in part to avoid eg further upsetting France after the La Quadrature du Net case, where black-box algorithmic systems inside telcos were limited. Same language as the CJEU used appears in Art 4(c). Clear exemptions for orgs ‘on behalf’ of the state to avoid CJEU scope creep.
Updates announced to England and Wales presence tracing (QR checkin) app functionality. (short thread translating what these things mean)
1st point isn't actually a change to the app or regulations. The regulations have always required everyone individually to scan in if they used the app, but allowed a 'lead member' to represent a group of up to six. This would abolish the latter. More: michae.lv/law-of-qr/
Venue history upload is controversial. The DPIA has not yet been updated to show the privacy-preserving method that the press release claims to use. May also make people wary to upload venue data. Cannot analyse without further information.
We outline current approaches to accessing enclosed data, and argue that GDPR transparency, access, and portability rights can be a powerful bottom-up, adversarial data access tool, if used well.
We outline the nature of those transparency provisions for those unfamiliar, and show how they can be used, elaborating on legal, ethical and methodological challenges — a bit like a mini-manual. A lot more could be said — but we hope this helps researchers make a good start.
Core to the DMA is the idea of "core platform services" and providers thereof, listed here and defined either within the reg or in previous regs. Big and powerful providers of these are in scope, basically.
The juicy parts of the DMA are Articles 5 and 6. These contain obligations for gatekeepers in relation to core services. Art 6 obligations can be further specified by the EC through implementing acts.
Today's Online Harms consultation response is perhaps the first major UK divergence from a big principle of EU law not tied to Brexit directly: it explicitly proposes a measure ignoring the prohibition on requiring intermediaries like platforms to generally monitor content.
the e-Commerce Directive art 15 prohibits member states from requiring internet intermediaries to actively look for illegal content; actively looking would give them awareness of it, and that awareness would make them liable.
The Online Harms White Paper roughly kept with this, indicating that automatic detection systems were an approach platforms could use, but they would not be required to. Consultation responses (unsurprisingly) agreed.
The original was a triple whammy of hubris: it wouldn’t work abroad, it wouldn’t work technologically on platforms, and its centralisation was open to abuse and function creep.
This version has much better foundations.
I understand mistrust that may linger — but please do try this new one.
We’ve also learned plenty about platforms. If governments want citizens to be able to run arbitrary code on mobile devices, making use of all sensors, they’ll need the law to crack open walled gardens. theguardian.com/commentisfree/…
I suspect students in England will make a very large number of subject access requests under the GDPR to schools from tomorrow for their teacher-estimated grade as well as rank-order in the class — information which will likely have determined their university entrance. 1/
There is a relevant exemption/delay provision in the Data Protection Act 2018 sch 2 para 25 for exam scripts, but this only pushes the deadline to a minimum of 22 September 2020. The ICO has confirmed this. ico.org.uk/global/data-pr…
The only time I can see a plausible ground for this grade to be refused is where the rank order reveals data about others, such as in classes of 2 or 3 (wow). Even then, no presumption against disclosure (see DB v General Medical Council [2018] EWCA Civ 1497).
Looks like the Court agrees with @maxschrems - it is for DPAs to strike down SCCs with certain countries, rather than throwing the mechanism itself out, and the Court decides to answer the Privacy Shield questions (the AG said they did not need to), and strikes it down.
SCCs are now haunted by the question of how an underfunded DPA examines all of a third country’s laws and assesses whether SCCs remain valid, when they can’t even take complaints effectively in their own legal system.
national parliamentary committee can be public authority & data controller says CJEU.
clearly some v strange bg to this case though, as DE admin court referred a 2nd q doubting its own ability to refer under TFEU 267 due to general lack of independence curia.europa.eu/juris/document…
in headache-inducing logic typical of art 267, the CJEU says they are independent so can refer the DP question, but that the independence question itself is technically inadmissible because it isn’t necessary to resolve the main proceedings—so the Court answers it while saying it actually never did
anyway besides the strange act of self-doubt which appears to be about the appointment of temporary judges & the ministry of justice’s IT support of the ct computers, the case is generally unremarkable other than to say the definition of public authority is wide & eu law applies
the tracetogether token, and the new version of the app, is the *opposite of anonymous*. users input their identity card numbers before they can use it. every ping resolves to this. this risks becoming an infrastructure for quarantine control. bbc.co.uk/news/business-…
it doesn’t matter that they’re not internet connected. centralised systems rely on others, including those potentially operating hidden beacons in supermarkets, public transit, uploading and using data you have emitted. your data is in other devices, not (just) your own.
if someone tries to sell you a corona badge or dongle that doesn’t connect to the internet at all, the chances are it is basically a system pinging out a decryptable version of your passport number. function creep follows.
Turns out app doesn't work on iPhones! Korski says you would have only found this out after extensive testing. Does this stand up? No ...
... not only was it widely known and discussed (I flagged it by email on 11 April), but *before the Isle of Wight study*, @TheRegister ran an article detailing the problem theregister.com/2020/05/05/uk_…. And not only that...
Online/semi-online coronatimes legal educators! Angela Daly @nidhalaigh & colleagues put up a *really useful* pre-print on legal teaching activities using wikis based on their experience at CUHK. Making students produce together. Inspiring & practical. papers.ssrn.com/sol3/papers.cf…