"We think it is necessary and overdue to rethink the way technology gets designed and implemented, because contact tracing apps, if implemented, will be scripting the way we will live our lives and not just for a short period."
"The history of critical debate & radical interventions in technological hegemony offers an archive of how technological solutionism w/out resistance, inevitably will be complacent with racialised capitalism which amounts to unequal intersections of race, class, ability & gender"
"Epidemiology and public health have served racial, class, sexual, ableist divides and colonial practices."
"We are currently being presented only two options, a paradigm of either implementing a contact tracing app or continuing to social distance at "home". However this appears to be a false dichotomy. Is tracing contacts the preferred response? What are other options? "
"Will communities be given the decision making power? We need to ask for what and for whom is this exposure tracing app, and how does it fulfill or not the ways in which communities want to protect themselves and manage exposure."
"The urgent challenge is how to facilitate a public discussion so that communities can design, test and shape the methods for protection, care and managing their own exposure."
"How to facilitate a public discussion so that communities can design, test and shape methods for protection not from viruses, but from extractive modes of existence."
This and much more from "The Institute for Technology in the Public Interest" with Miriyam Aouragh, @Helen_Pritchard and Femke Snelting
One year into the pandemic, in Germany we are encountering debate around apps (see #LucaApp or #ImmunityPassports) that promise to solve the Corona crisis. The current debate overlooks important questions of public interest 1/x
Currently the debate is all about "data privacy". But is that all that is at stake?
Are there other questions that journalists, politicians and civil society could be asking to ensure these apps serve the public interest? 2/x
Serving the public interest does not begin and end with good data security and privacy but needs to start with whether the apps serve the purpose they are designed for and do so effectively. 3/x
One year after the start of the pandemic, we are witnessing an extremely flattened debate in Germany about apps (e.g. #LucaApp or #ImmunityPassports) that are supposed to help fight the Corona crisis. The problem: this form of debate is not helpful.
Currently the debate is focused on "data security".
But is that the only relevant question?
What are the questions that journalists, politicians and civil society should be asking if the public interest is not to be lost from view?
Whether something serves the public interest does not depend only on good data security; it must first address the question of whether these apps actually serve the purpose for which they are being deployed.
"part of what we in an American university have to consider now, what it is for us to have been made custodians of those principles [of free speech] even as we are made to watch when they are dissolved in an infernal public private partnership." Fred Moten
"it is that this ought not to drive us to defend an abstract principle of free speech, which is only ever concretized, and usually at the same moment dishonestly and disgustingly sacralized, in exclusion."
"instead and in refusal of that, lets claim and eruptively assert our fugitive speech, which is fueled by the realization of the conditions we live in"
For those interested in the political economy of AI, this report has a lot of teasers. Many of them align with recent papers on the concentration of research in the hands of a few (corporations and their research collaborators).
The report claims that OpenAI and DeepMind, as well as other big players in the industry, are important players in research but do not or cannot publish their code (I hope all our colleagues who now do ethics at these companies consider these structural issues!)
Tools are an important part of the expanding infrastructural power of these companies into research institutions, and the report claims Facebook is outpacing Google.
When Alex Irpan of Google writes about compute as the way forward for AI, you wonder how much of this is AI pulling on compute vs. compute (and the investment into chips) pulling on AI.
The whole article is as much about economics as about AI; in fact, it conflates the two 1/x
It starts with artificial general intelligence being equated with "economically valuable work":
"artificial general intelligence (AGI) [is] an AI system that matches or exceeds humans at almost all (95%+) economically valuable work
2/x
According to Irpan, economics also determines how AI will spread:
"We also don’t have to build AI systems that learn like humans do. If they’re capable of doing most human-level tasks, economics is going to do the rest, whether or not those systems are made in our own image."
3/x
This WP article has highlighted important problems with Gapple's contact tracing efforts & quotes some of my mentors in tech policy. But, from my vantage point, it creates (unintentionally) a false dichotomy between national sovereignty and Gapple (1/x) washingtonpost.com/technology/202…
My gut response to the first Gapple announcement was concern not only about the lack of sovereignty but also about the lack of democratic process. Sovereignty is not sufficient given the complexity of the relationship between governments & Gapple (2/x)
To this day most app initiatives are techno-centric and top-down and have side-stepped health authorities as well as civil society. Governments across Europe got pushback for this, and some changed course, e.g., Germany, Belgium and the Netherlands. (3/x)