Michael Veale
Feb 4, 13 tweets, 5 min read
Significant news for the AI Act from the Commission as it proposes its new Standardisation Strategy, involving amending the 2012 Regulation. Remember: private bodies making standards (CEN/CENELEC/ETSI) are the key entities in the AI Act that determine the final rules. 🧵
Firstly, the Commission acknowledges that standards increasingly touch not only on technical issues but also on European fundamental rights (although it doesn’t highlight the AI Act here). This has long been an elephant in the room: the EC stands accused of delegating rule-making to private bodies.
They point to the CJEU’s James Elliott case law in that respect (see 🖼), where the Court brought the interpretation of harmonised standards (created by private bodies!) within the scope of preliminary references. They could also have talked about Fra.bo and Commission v Germany.
The EC notes that governance in these European Standardisation Bodies is outrageous. It points out that in ETSI, which deals with telecoms and more, industry capture is built in: societal stakeholders’ votes are “barely countable” and Member States have “circa 2%” of the vote; the rest is industry.
The Commission has not officially confirmed who will be mandated to make the standards for the AI Act (I’ve heard them lean more towards CEN/CENELEC in slides), but ETSI certainly want to be in on the technology.
Next, there’s a big geopolitical change. The Commission proposes an amendment, in a short revision to the 2012 Regulation, that has the effect of excluding non-EU/EEA standards bodies from voting on any standards relating to the AI Act. Sorry, @BSI_UK — no more formal influence post-Brexit.
This also pushes out the limited formal influence of European civil society organisations, even the three that are mandated and paid by the Commission to be involved in standardisation, from the consumer, social/trade union, and environmental fields: @anectweet, @ETUI_org, and ECOS.
For societal interests to have a say on the AI Act, the Commission thus now relies on *Member State standardisation bodies* being sufficiently representative of those interests, which must in turn be sufficiently attuned to highly technical and European policy processes. This seems a stretch.
The EC does hang a Sword of Damocles over the European standards bodies: sort your house out and fix your wonky, broken governance, or we’ll regulate you more directly. But I doubt this will have a significant effect, and this isn’t the first time they’ve done so.
In New Legislative Framework regulations like the AI Act (where standards can be used to demonstrate compliance), the EC can usually replace harmonised standards with implementing acts called Common Specifications, but rarely does.
While EC executive action wouldn’t itself be democratic, and a serious co-regulatory process would be needed, the common specification route seems better suited to the fundamental-rights-charged areas of the AI Act than delegation to industry-captured standards bodies.
The EC proposes to develop a framework for when it will and won’t use common specifications. This seems an opportunity for the Parliament to push for an inclusive process for building them where standards touch on fundamental rights and freedoms, rather than delegating to the ESOs.

More from @mikarv

Feb 1
Very detailed and wide-ranging decision of the Belgian DPA regarding cookie tracking in relation to (from inference, as it's badly anonymised...) @EDAATweets, the service that runs Your Online Choices (ht @PrivacyMatters) autoriteprotectiondonnees.be/publications/d…
Admittedly, the Chamber at the end says it wasn't really trying to anonymise.
So, the EDAA runs a site called "Your Online Choices", an incredibly little-used, awkward & archaic self-regulatory initiative of the ad industry to try and claim that people have online choices in the absence of them. This website is linked to by ads, and itself places cookies.
Nov 30, 2021
B3. The proposal does little to stop the huge pre-emption of any national rules on use of AI, besides the reduction in scope of the AI definition which reduces the pre-empted scope slightly because not absolutely everything can be claimed to be ‘use of software’.
B4. A big removal from the high-risk list: systems for modelling and searching through giant crime databases are taken out. Likely because, unlike many Annex III technologies, these are commonly used in MSs… In theory the EC could propose their return one day, but I wouldn’t hold my breath.
B5. The presidency thinks it is solving a great value chain problem by addressing general purpose systems, like APIs sold by Google, Microsoft, OpenAI etc. But it fails hugely here, and these companies will shriek with joy.
Nov 30, 2021
The Council presidency compromise text on the draft EU AI Act has some improvements, some big steps back, ignores some huge residual problems and gives a *giant* handout to Google, Amazon, IBM, Microsoft and similar. Thread follows. 🧵
The Good:
G1. The manipulation provisions are slightly strengthened by a weakening of the intent requirement and a consideration of reasonable likelihood. The recital also has several changes which suggest the drafters have actually read our AIA paper, on sociotechnical systems and accumulated harms…
Jul 6, 2021
New 📰: There's more to the EU AI regulation than meets the eye: big loopholes, private rulemaking, powerful deregulatory effects. Analysis needs connection to broad—sometimes pretty arcane—EU law

@fborgesius & I have done it so you don't have to: long 🧵
osf.io/preprints/soca… (Demystifying the Draft EU Artificial Intelligence Act)
The Act (new trendy EU name for a Regulation) is structured by risk: from prohibitions to 'high risk' systems to 'transparency risks'. So far so good. Let's look at the prohibitions first.
The Act prohibits some types of manipulative systems. The EC itself admits these have to be pretty extreme — a magic AI Black Mirror sound that makes workers work far beyond the Working Time Directive, and an artificially intelligent Chucky doll. Would it affect anything real?
Jun 1, 2021
Concerned with platforms’ power to map & reconfigure the world w/ ambient sensing? I’m *hiring* a 2-year Research Fellow (postdoc) @UCLLaws. Think regulating Apple AirTags (UWB), Amazon Sidewalk (LoRa), and—yes—Bluetooth contact tracing. (please RT!) 1/ atsv7.wcn.co.uk/search_engine/…
You'll join a deeply interdisciplinary team of critical privacy engineers (@carmelatroncoso @sedyst); sensor experts (@SrdjanCapkun); epidemiologists and medical devices experts (@marcelsalathe @klausscho); and systems and security whizzes (@gannimo @JamesLarus @ebugnion) 2/
Just as platforms wanted to be the only ones who could sell access to populations based on how they use devices, they want to determine and extract value from how physical space is used and configured. There is huge public value from this knowledge, and huge public risk. 3/
May 27, 2021
Hey Microsoft Research people who think that constant facial emotion analysis might not be a great thing (among others), what do you think of this proposed Teams feature published at CHI to spotlight videos of audience members with high affective ‘scores’? microsoft.com/en-us/research…
Requires constantly pouring all face data on Teams through Azure APIs. Especially identifies head gestures and confusion to pull audience members out to the front, just in case you weren’t policing your face enough during meetings already.
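As a rough illustration of what "pouring all face data through Azure APIs" involves, here is a minimal, hypothetical sketch of sending one captured video frame to a cloud face-analysis endpoint. The endpoint path and the emotion/headPose attributes follow the classic Azure Face "detect" REST API; the resource name and key are placeholders, and none of this is the actual Teams implementation.

# Hypothetical sketch: one video frame sent to a cloud face-analysis endpoint.
# Endpoint path and attribute names follow the classic Azure Face "detect"
# REST API; this is not the actual Teams/Azure integration.
import requests

FACE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
FACE_KEY = "<your-key>"  # placeholder

def score_frame(jpeg_bytes: bytes) -> list:
    """Return per-face attributes (emotion scores, head pose) for one frame."""
    resp = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion,headPose"},
        headers={
            "Ocp-Apim-Subscription-Key": FACE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=jpeg_bytes,
    )
    resp.raise_for_status()
    return resp.json()  # one entry per detected face

An "affective spotlight" for a whole meeting would have to run something like this on every participant's stream, continuously, which is exactly the constant data flow being criticised here.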
Also note that Microsoft announced on Tuesday that it is opening up its Teams APIs to try to become a much wider platform to eat all remote work, so even if Teams didn’t decide to implement this directly, employers could through third party integration! protocol.com/newsletters/so…
