Significant news for the AI Act from the Commission, which has proposed its new Standardisation Strategy, including an amendment to the 2012 Standardisation Regulation (1025/2012). Remember: the private bodies that make standards (CEN/CENELEC/ETSI) are the key entities that will determine the AI Act's final rules. 🧵
Firstly, the Commission acknowledges that standards increasingly touch not just on technical issues but on European fundamental rights (although it doesn't highlight the AI Act here). This has long been an elephant in the room: the accusation that the EC has privately delegated rule-making.
They point to the CJEU's James Elliott case law in that respect, where the Court brought the interpretation of harmonised standards (created by private bodies!) within the scope of preliminary references. They could also have mentioned Fra.Bo and Commission v Germany.
The EC notes that governance in these European standardisation bodies is outrageous. They point out that in ETSI, which deals with telecoms and more, industry capture is built in: civil society votes are "barely countable" and Member States hold "circa 2%" of the vote; the rest is industry.
The Commission has not officially confirmed which body will be mandated to make the standards for the AI Act (in slides I've seen them lean towards CEN/CENELEC), but ETSI certainly wants in on the technology.
Next, there's a big geopolitical change. The Commission proposes, in a short revision to the Regulation, an amendment that has the effect of excluding non-EU/EEA standards bodies from voting on any standards relating to the AI Act. Sorry, @BSI_UK: no more formal influence post-Brexit.
This also pushes out the limited formal influence of European civil society organisations, even the three that are mandated by the Commission and paid to be involved in standardisation, from the consumer, social/trade union, and environmental fields: @anectweet, @ETUI_org, and ECOS.
To have a say on the AI Act, the Commission thus now relies on *Member State standardisation bodies* being sufficiently representative of societal interests that are, in turn, sufficiently attuned to highly technical and European policy processes. This seems a stretch.
The EC does hang a Sword of Damocles over the European standards bodies: sort your house out and fix your wonky, broken governance, or we'll regulate you more directly. But this isn't the first time they've made this threat, and I doubt it will have a significant effect.
In New Legislative Framework regulations like the AI Act (where standards can be used to demonstrate compliance), the EC can usually replace harmonised standards with implementing acts called common specifications, but it rarely does.
While EC executive action wouldn't in itself be democratic, and a serious co-regulatory process would be needed, the common specification route seems better suited to the fundamental-rights-charged areas of the AI Act than delegation to industry-captured standards bodies.
The EC proposes to develop a framework for when it will and won't use common specifications. This seems an opportunity for the Parliament to push for an inclusive process for building them where standards touch on fundamental rights and freedoms, rather than delegating to ESOs.

Thread by Michael Veale (@mikarv@someone.elses.computer)

