Jennifer Cobbe
Apr 28 · 18 tweets · 4 min read
FAccT paper with @mikarv and @jatinternet - on accountability in algorithmic supply chains

We explore the dynamics of AI supply chains, their implications for accountability, and why we need to understand their legal and political economic context

Link: papers.ssrn.com/sol3/papers.cf…
A lot of algorithmic accountability work is based (implicitly or otherwise) on trying to understand the actions of a single organisation that makes and deploys an AI system

These tools are useful but this focus is a mistake
Digital technologies are now often built around cloud-based services, using a client-server model where compute is centralised but deployment of tech is distributed, producing complex supply chains driven by data flows between multiple actors
These supply chains are dynamic, transient, and unstable - they may come into existence when executed and disappear when they complete, the actors involved might differ each time a chain is instantiated, and they change over longer periods as services and applications develop
Data-driven supply chains are now key infrastructures in the political economy of informational capitalism - producing and shaping data flows, enabling functionality, and centralising power with core players who can extend it over others - and increasingly underpin AI technologies
Technological and economic barriers to entry for advanced AI tech mean many general-purpose and generative systems will in future be produced or controlled by a few key players - primarily Amazon, Microsoft, and Google - and distributed as services running on their infrastructure
Even where those companies don't build systems themselves, other providers of AI services like OpenAI use their cloud infrastructure for production and distribution (e.g. OpenAI uses Microsoft Azure for its cloud infrastructure and for distribution of its API-based services)
The big players are now highly integrated both horizontally (across markets and sectors) and vertically (across production and distribution of AI systems), with significant technical expertise and resources, and benefitting from economies of scale and data from many customers
So interdependencies and balances of power in algorithmic supply chains are often highly asymmetric - allowing big providers to leverage APIs and standard form terms of service to structure their relations with customers and extend control over deployment and use of their tech
AI providers also extend their own supply chains across borders for regulatory arbitrage - outsourcing things like dataset production to companies in jurisdictions with weaker data protection and privacy laws, lower wages, and fewer workers' rights and employment protections
Compute is centralising around big providers, but algorithmic supply chains exhibit *distributed responsibility* - a split between production activities (by providers and their suppliers), deployment (by customers), and potentially use (by end-users) of AI systems as services
Existing and proposed laws like GDPR and even the AI Act don't properly reflect this distribution of practical responsibilities and will struggle to contend with what's happening here. This is not good! We need to fix these laws to rein in power and protect people and societies
Algorithmic supply chains also have an *accountability horizon* - it's very difficult for any one actor to 'see' more than one or two steps upstream or downstream from their position in the chain, and all but impossible for others to understand what's happening within chains
This makes it hard for providers to properly frame problems when developing systems - they're less able to account for the many contexts and use-cases of customers' applications

It's also difficult for customers to know which systems and services are appropriate to their needs
It also makes risk management requirements in laws like GDPR and the AI Act hard for anyone to meet - providers will struggle to foresee risks in the many and varied applications and contexts where customers deploy their systems - even one step down the chain, let alone further away
These problems are exacerbated by the scale at which services operate - even if it were easy to see across chains, providers have so many customers, and general-purpose AI systems have so many potential applications and contexts of use, that managing risks is practically impossible
All of these things challenge existing governance and accountability mechanisms and make reliably allocating accountability to the appropriate actor difficult

Yet we have few legal, technical, or other ways of understanding algorithmic supply chains and addressing these problems
If algorithmic accountability is to be a mechanism for challenging power - as it should be - then we need much more work to understand the dynamics of supply chains, to build ways of investigating and contesting them, and to reliably allocate accountability across them
