Cybermatron has left the building 🕷
This account will now be inactive. Thanks for all the fun new things. Please find me on The Good Place now https://t.co/Ba2gxAp5ge

Sep 12, 2020, 12 tweets

In data protection (DP) terms, I think loss of control is most closely linked to violations of the purpose limitation principle.

Like @mireillemoret said, this is then also connected to a lack of transparency and, I would argue, fairness (in the Art. 5 sense). But as far as algorithmic decision-making is concerned, purpose limitation is clearly where it's at.

Having said that, I’m starting to get very suspicious of the concept of *control* (never mind *property*) as our lodestar, given its current link solely to the individual data subject, who is mostly not equipped to exercise that control responsibly.

I agree with the BVerfG’s communitarian rationale for the (individual) right to informational self-determination, namely that it facilitates individuals’ resilience in the context of “wehrhafte Demokratie” (a democracy able to defend itself). It’s a nice idea.

But I can’t help thinking that, these days, individuals need a bit more help in that from the law. We are faced with a very different power balance compared to 1984, where ...

... individuals can easily be seduced to exercise their control in a way that brings them short-term benefit (but often long-term detriment) while causing both short-term and long-term detriment to societal interests.

And AI is capable of tipping the scales even further to one side by seemingly automating (perceived) individual control.

So if we want to prevent a *loss of control* not just in the individualist sense but in the sense described by Simitis as “always also leading to a loss of democratic resilience”,...

... then we need to start talking not just about how we can protect or strengthen individual control, but about how certain data uses should be taken out of that control altogether and outlawed regardless of what the individual has to say about it.

If we can have a system where data uses are authorised by legal grounds for reasons of protecting other (public and commercial) interests even if the individual does not consent (which is arguably what the GDPR is all about),...

...then we should be able to have a system where certain data uses are prohibited on the basis of legal rules for reasons of protecting other (mostly public) interests *even if* the individual consents.

We have to start talking - using our standard democratic processes - about taking things off the table when they are neither in the individual’s nor society’s interest and/or are likely to cause detriment to either or both. Control be damned!
