Looking forward to reading this (recommendation of @gileslane), by the late Mike Cooley, engineer, academic, shop steward, activist behind the Lucas plan en.m.wikipedia.org/wiki/Mike_Cool…
NB: this book (from 1980) actually coined '*human-centred* systems', as an explicitly socialist and socio-technical political movement centering the needs of the people who make and use technology. A far cry from the kind of human-centred design critiqued by Don Norman (2005)
Some highlights:
Some like to think computers should do the calculating while people do the creativity and value judgements. But Cooley argues the two can't simply be combined "like chemical compounds" — splitting work this way degrades both.
Oh look, an online duty of care! Modeled even more closely on health and safety regs
It's the robot rights debate (cc @Abebab@theblub) but with an industrial relations twist
Quantification = fascism (a brilliant French mathematician says so)
Why are we building image recognition models when the best image recognisers are unemployed humans? (Or will be, once they've finished labeling all the training data on mTurk)
Thread on possible implications of #SchremsII for end-to-end crypto approaches to protecting personal data. Background: last week the Court of Justice of the EU (CJEU) issued its judgment in Case C-311/18, “Schrems II”. Amongst other things, it invalidates Privacy Shield, one of the mechanisms
enabling data transfers from the EU to the US. This was in part because US law lacks sufficient limitations on law enforcement access to data, so the protection of data in the US is not 'essentially equivalent' to that in the EU. Similar arguments could apply elsewhere (e.g. the UK).
The main alternative mechanism enabling transfers outside the EEA is the use of 'standard contractual clauses' (SCCs) under Article 46(2)(c) GDPR. But the Court affirmed that SCCs also need to ensure 'essentially equivalent' protection.