Before this goes too far - *extorting. The crime I meant to refer to was extortion.
This was always ethically questionable (buying it to the exclusion of others), but there was a nationalist argument before it became leverage for the data grab.
This was always where we were headed: emergency research ethics. This is such an incredibly delicate thing to do well - and it should apply to more of our emergency interventions than just vaccine testing.
"Without functioning infrastructure, institutions, or systems to coordinate communication, technology fails... often without the tools to measure, monitor, or correct the failures that result... endured by populations already under tremendous hardship."
"The practice of human subjects research... are essential for protecting populations from mistreatment by experimenters who undervalue their well-being. But they come from the medical industry, which relies on a lot of established infrastructure that less-defined industries lack"
Really enjoying @EdFelten's webinar on technology and COVID-19.
A few quick thoughts!
We agree - immunity passports have huge, dangerous issues.
@EdFelten My primary thoughts remain wanting to see/understand more science on the digital footprint of "transmission events" - because I haven't seen proximity tracking rise to the level I would want for alerting people to elevated risk in a system w/limited treatment capacity.
@EdFelten The obvious and 100% correct caveat offered is - "in contexts with lots of human contact tracers and testing" - but what places have done this successfully?
Even in Singapore, arguably the best environment in the world to do this, containment failed.
For as long as I’ve been involved in digital rights advocacy, I’ve struggled with bridging operational pragmatism and principled, inclusive governance.
Polarization over legal theories (OSS, data as X, etc.) has made that harder. Data governance is next. [thread]
Governance in transition is, by design, deliberative. In many societies, it aspires to be representative. Digitization is a scale change in power balances, which affects most of the ways we design all kinds of rights protection systems, from human to property to political.
Digital and data rights policy decisions are made primarily in commercial, regulatory, and procurement processes, instead of legislative processes.
We’re deciding some of the most complex issues of our time without the participatory tools we use to govern.
I've read this new piece from @lawgeek and @katecrawford - and it's good! Legal arguments are best thought of as “yes, and!” debates. I agree that the problem is super important, but disagree with the proposed solution.
The paper is well-reasoned + legally inventive - but it starts from the problem (weak liability for public use of unreasonably or unknowably risky ‘artificial intelligence’ technologies in public services) without acknowledging the process that creates that problem. (2/13)
It goes on to indirectly ask whether the provision of state-like services by artificial intelligence vendors, by virtue of including decision-making, should import the kinds of liability extension that courts have created in applying state action doctrine. (3/13)