Concerned with platforms' power to map & reconfigure the world w/ ambient sensing? I'm *hiring* a 2-year Research Fellow (postdoc) @UCLLaws. Think regulating Apple AirTags (UWB), Amazon Sidewalk (LoRa), and, yes, Bluetooth contact tracing. (please RT!) 1/ atsv7.wcn.co.uk/search_engine/…
Just as platforms wanted to be the only ones who could sell access to populations based on how they use devices, they want to determine and extract value from how physical space is used and configured. There is huge public value from this knowledge, and huge public risk. 3/
This isn't about collecting all the data. This is about controlling crucial infrastructures. Thanks to vertical integration, firms have the power to coerce the world by withholding or extending other crucial infrastructures. Using cryptography, they can do this confidentially. 4/
How do we regulate this? Try bluntly, and fiefdoms like Apple scream in pain that you'll unleash a world of insecure malware, and NGOs scream that you're killing e2e encryption. Try it softly, and power outmanoeuvres all best efforts. Either way, legitimate decision-making is hard. 5/
Yet in this pandemic (and not just now) there has been huge demand to manage or learn about the physical and social world in digital ways. That demand will not go away. This postdoc is about how we live with and manage this tension through law, policy and technology. 6/
Deadline: 29 June. It's not just for lawyers but for tech policy people more broadly. Not even just for ppl w doctorates (if you have sig. writing & research experience elsewhere), although PhDs are generally expected. PDF of job descrip and person spec: filesv7.wcn.co.uk/admin/fairs/ap…
.@UCLLaws is an absolutely fantastic place full of really great, smart, friendly people. It's a *suspiciously* good part of academia and I still look around warily in case it's a trap (no evidence yet). The project is funded by @FondationBotnar. 8/
Please get in touch if you have any questions. And please do share this job broadly! There is a complementary PhD position that is closing in a few weeks too 9/
Hey Microsoft Research people who think that constant facial emotion analysis might not be a great thing (among others), what do you think of this proposed Teams feature published at CHI to spotlight videos of audience members with high affective ‘scores’? microsoft.com/en-us/research…
Requires constantly pouring all face data on Teams through Azure APIs. Especially identifies head gestures and confusion to pull audience members out to the front, just in case you weren’t policing your face enough during meetings already.
Also note that Microsoft announced on Tuesday that it is opening up its Teams APIs to try to become a much wider platform to eat all remote work, so even if Teams didn’t decide to implement this directly, employers could through third party integration! protocol.com/newsletters/so…
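To make the mechanism concrete, here is a minimal, purely hypothetical sketch of the class of 'affective spotlighting' logic the paper describes (not Microsoft's code; the participant names, score fields and ranking rule are all assumptions): an upstream face-analysis API emits per-participant emotion scores, and the meeting client promotes whoever scores highest.

```python
# Hypothetical sketch only, NOT Microsoft's implementation. Illustrates the
# class of "affective spotlighting" logic, assuming an upstream facial-analysis
# service returns per-participant emotion scores for each video frame.

from dataclasses import dataclass

@dataclass
class ParticipantAffect:
    participant_id: str
    confusion: float   # 0.0-1.0, assumed per-frame score from a face-analysis API
    engagement: float  # 0.0-1.0, assumed

def spotlight_candidates(frame: list[ParticipantAffect], k: int = 2) -> list[str]:
    """Return the k participants with the highest combined affect score.

    This is the structurally worrying step: whoever's face 'scores' highest
    gets pulled to the front of the meeting, so everyone has an incentive to
    police their expression.
    """
    ranked = sorted(frame, key=lambda p: p.confusion + p.engagement, reverse=True)
    return [p.participant_id for p in ranked[:k]]

if __name__ == "__main__":
    frame = [
        ParticipantAffect("alice", confusion=0.8, engagement=0.3),
        ParticipantAffect("bob", confusion=0.1, engagement=0.2),
        ParticipantAffect("carol", confusion=0.4, engagement=0.9),
    ]
    print(spotlight_candidates(frame))  # ['carol', 'alice'] under these toy scores
```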
Big UK GDPR case: Court of Appeal rules in favour of @OpenRightsGroup & @the3million: the Immigration Exemption to SARs is incompatible with Art 23 GDPR. This is a new exemption from 2018 which the Home Office uses to withhold data rights info in 59% of cases. bailii.org/ew/cases/EWCA/…
Warby LJ is sympathetic to CJEU jurisprudence that 'the legal basis which permits the interference with those rights must itself define the scope of the limitation', noting that the Immigration Exemption is highly discretionary, and the DPA18 does not contain limits on its scope.
However, Warby LJ judges the case more narrowly on a reading of Article 23(2), which permits Member States to restrict a GDPR right for public interest only if a 'legislative measure' contains limiting provisions.
Big Brother Watch now out. Looking at the dissents, it does not look good for anti-surveillance campaigners: 'with the present judgment the Strasbourg Court has just opened the gates for an electronic “Big Brother” in Europe' hudoc.echr.coe.int/eng?i=001-2100…
and we go live to Strasbourg
Going to post some interesting pieces (not a judgment summary!) here. Firstly, that Contracting States can transfer Convention-compliant bulk intercept material to non-Contracting states that only have minimal protections (e.g. on keeping it secure/confidential). AKA the USA.
thank you for all the nice comments about the @BBCNewsnight interview! I tried to communicate infrastructure's importance. if new to you, here is a 🧵 of some (not all!) academic work by others which highlights the power of technical infrastructure (rather than eg data).
The Luca QR code Covid app (a for-profit system flogged to 🇩🇪 Länder) has been compromised (in a way that the official CoronaWarnApp’s QR system can’t be), through a website that lets you check in any phone number to wherever you want, even regional prime ministers! 🧵 on the saga:
While hard to believe, Luca was adopted by Länder after huge lobbying from the hospitality sector, which convinced them that a hasty app w a 6-month free trial for venues & big costs for health authorities would i) allow reopening, ii) help Länder win upcoming 🗳 by making the national gov look slow
Luca’s slick PR campaign, in which it became known to health authorities mostly through aggressive marketing w celebrities, meant that no-one discussed or scrutinised the technical details. Politicians have even admitted this, and DPAs accepted bare claims of ‘encryption’ as proof of security.
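For readers wondering what 'compromised' means concretely, here is a minimal, illustrative sketch of the class of flaw reported (assumed server-side code, not Luca's actual API): a check-in endpoint that simply trusts whatever phone number the client sends, with no proof of possession or presence.

```python
# Hypothetical sketch only, not Luca's real API. Illustrates the class of flaw
# reported: a check-in endpoint that trusts a client-supplied phone number,
# with no proof that the caller controls that number or is at the venue.

from flask import Flask, request, jsonify

app = Flask(__name__)
CHECKINS: list[dict] = []  # stand-in for the venue's check-in log

@app.post("/checkin/<venue_id>")
def checkin(venue_id: str):
    data = request.get_json(force=True)
    phone = data.get("phone")
    # FLAW: no verification step (no SMS challenge, no possession proof, no
    # binding to a QR code actually scanned at the venue), so anyone can check
    # in any number at any venue, including a politician's publicly known one.
    CHECKINS.append({"venue": venue_id, "phone": phone})
    return jsonify(status="checked_in", venue=venue_id)

if __name__ == "__main__":
    app.run(port=5000)
```

The CoronaWarnApp's QR flow resists this class of attack roughly because check-ins stay on the user's device rather than being pushed to a central server keyed by phone numbers.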
Lots of selected thoughts on the draft leaked EU AI regulation follow. Not a summary but hopefully useful. 🧵
Exemptions from the Art 4 blacklist of AI (except general scoring) include state use for public security, including by contractors. Tech designed to ‘manipulate’ ppl ‘to their detriment’, to ‘target their vulnerabilities’, or to profile comms metadata in an indiscriminate way remains very possible for states.
This is clearly designed in part to avoid, eg, further upsetting France after the La Quadrature du Net case, where black-box algorithmic systems inside telcos were limited. The same language the CJEU used appears in Art 4(c). Clear exemptions for orgs acting ‘on behalf’ of the state to avoid CJEU scope creep.