For those interested in the political economy of AI, this report has a lot of teasers. Many of them align with recent papers on the concentration of research in the hands of a few (corporations and their research collaborators).
The report claims that OpenAI and DeepMind, as well as other big players in the industry, are important players in research but do not/cannot publish their code (I hope all our colleagues who now do ethics at these companies consider these structural issues!)
Tools are an important part of how these companies expand their infrastructural power into research institutions, and the report claims Facebook is outpacing Google.
The brain drain from universities is happening at the same time as, for example, European public institutions are pumping money into universities for AI research. This has implications for what happens with public funds and for the quality of education.
The report suggests the endowment model does not make up for the loss. Note that, aside from the tiny sums involved (especially when companies finance the social sciences, humanities (ethics) and law), it also does not solve the lack of public-interest technology research. #fundingmatters
The brain drain can also be seen from a global perspective: a process that further concentrates research power in the hands of tech-dominant countries while draining students from the Global South. Maybe better to call this brain extractivism?
This impacts immigration policy, as companies (and universities) fight to keep the "talent" flowing in. This can be contrasted with the silence of tech companies vis-à-vis the steadily shrinking number of refugees accepted into the very countries that benefit from these inflows.
And, not that NeurIPS as a conference should be taken as a social indicator, but still, this graph raises questions about who will benefit from the current political economy of AI if "AI" turns out to be as successful as promised. Or, who will have wasted the most money?
This slide is especially for @mikarv and @carmelatroncoso and points to how the path paved by differential privacy, to unblock the privacy obstacle facing current computational infrastructures, is being further mainstreamed in federated ML solutions.
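A minimal sketch of what this looks like in practice, assuming the common DP federated averaging recipe (none of this code is from the report or the slide; the function names and parameters are my own illustrative assumptions): each client's model update is clipped to a norm bound and Gaussian noise is added before aggregation, so privacy is delivered as noise on updates rather than as a change to the infrastructure itself.

# Illustrative sketch of DP-style federated averaging (assumed recipe, not from the report).
import numpy as np

def clip_update(update, clip_norm):
    # Scale a client's model update so its L2 norm is at most clip_norm.
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))

def dp_federated_average(client_updates, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    # Aggregate clipped client updates, then add Gaussian noise to the sum.
    # The noise scale (clip_norm * noise_multiplier) bounds each client's influence
    # on the released average, which is what the differential privacy guarantee rests on.
    rng = rng or np.random.default_rng(0)
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, clip_norm * noise_multiplier, size=total.shape)
    return (total + noise) / len(client_updates)

# Toy usage: three "clients" each send a gradient-like update.
updates = [np.array([0.5, -1.2, 0.3]),
           np.array([2.0, 0.1, -0.4]),
           np.array([-0.3, 0.9, 1.5])]
print(dp_federated_average(updates))

The point, for this thread: the privacy problem gets reframed as calibrating noise on model updates, while the underlying data flows and compute infrastructures stay in place.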
I haven't finished reading the report (well, really a slide deck), but I would be very curious to hear whether people catch other threads that could lead to interesting research questions, as well as questions about education and research policy.
One year into the pandemic, we in Germany are seeing a debate around apps (see #LucaApp or #ImmunityPassports) that promise to solve the Corona crisis. The current debate overlooks important questions for the public interest 1/x
Currently the debate is all about "data privacy". But is that all that is at stake?
Are there other questions that journalists, politicians and civil society could be asking that could ensure these apps serve the public interest? 2/x
Serving the public interest does not begin and end with good data security and privacy but needs to start with whether the apps serve the purpose they are designed for and do so effectively. 3/x
One year after the start of the pandemic, we are witnessing an extremely flattened debate in Germany about apps (e.g. #LucaApp or #ImmunityPassports) that are supposed to help fight the Corona crisis. The problem: this form of debate is not helpful.
Currently the debate concentrates on "data security".
But is that the only relevant question?
What are the questions that journalists, politicians and civil society should be asking so as not to lose sight of the public interest?
Whether something serves the public interest does not depend only on good data security; it must first engage with the question of whether these apps actually serve the purpose for which they are deployed.
"part of what we in an American university have to consider now, what it is for us to have been made custodians of those principles [of free speech] even as we are made to watch when they are dissolved in an infernal public private partnership." Fred Moten
"it is that this ought not to drive us to defend an abstract principle of free speech, which is only ever concretized, and usually at the same moment dishonestly and disgustingly sacralized, in exclusion."
"instead and in refusal of that, lets claim and eruptively assert our fugitive speech, which is fueled by the realization of the conditions we live in"
When Alex Irpan of Google writes about compute as the way forward for AI, you wonder how much of this is AI pulling on compute vs. compute (and the investment into chips) pulling on AI.
The whole article is as much about economics as about AI; in fact, it conflates the two 1/x
It starts with artificial general intelligence being equated with "economically valuable work":
"artificial general intelligence (AGI) [is] an AI system that matches or exceeds humans at almost all (95%+) economically valuable work
2/x
According to Irpan, economics also determines how AI will spread:
"We also don’t have to build AI systems that learn like humans do. If they’re capable of doing most human-level tasks, economics is going to do the rest, whether or not those systems are made in our own image."
3/x
This WP article highlights important problems with Gapple's contact tracing efforts & quotes some of my mentors in tech policy. But, from my vantage point, it (unintentionally) creates a false dichotomy between national sovereignty and Gapple (1/x) washingtonpost.com/technology/202…
My gut response to the first Gapple announcement concerned not only the lack of sovereignty but also the lack of democratic process. Sovereignty is not sufficient given the complexity of the relationship between governments & Gapple (2/x)
To this day most app initiatives are techno-centric and top-down, and have side-stepped health authorities as well as civil society. Governments across Europe got pushback for this, and some changed course, e.g., Germany, Belgium and the Netherlands. (3/x)
"Watched and still dying"
"I never imagined I would experience more loved ones dying in similar isolation and uncertainty [as AIDS] in my lifetime."
From the wise @Combsthepoet odbproject.org/2020/04/26/wat…
"I have spoken with several social justice organizers and survivors who confirmed the stigma that accompanied not only the patients, but family members, friends and anyone who came into contact with someone suspected of having HIV or AIDS."
"We have since discovered much more about HIV and AIDS, but not without failed experiments, constant tracking of people living with the virus, and lots of government mistakes, most prominently from the White House."