Today's Online Harms consultation response is perhaps the first major UK divergence from a big principle of EU law not tied directly to Brexit: it explicitly proposes a measure that ignores the prohibition on requiring intermediaries, such as platforms, to generally monitor content.
The e-Commerce Directive, art 15, prohibits member states from requiring internet intermediaries to actively look for illegal content, not least because the awareness gained would expose them to liability.
The Online Harms White Paper broadly kept to this, indicating that automatic detection systems were an approach platforms could use, but that they would not be required to. Consultation responses (unsurprisingly) agreed.
The consultation response states that, for child protection and terrorist content, the regulator will be given a power to require the use of automated detection tools, if it can show they are necessary, proportionate and accurate.
In contrast, the EU's Digital Services Act maintains the general monitoring prohibition (and indeed moves it into a Regulation).
The general monitoring prohibition is a weird beast in retained law, because it sits in a directive: member states would have been in breach of EU law for introducing such an obligation through legislation, courts, etc., but the prohibition itself never needed to be transposed directly into UK law legislation.gov.uk/uksi/2002/2013….
While UK courts have referred to it in case law, this puts it in a strange position: Parliament can always make a law ignoring a prohibition, even one that was transposed. In practice it just could not have done so while the prohibition sat in a regulation or directive binding on the UK.
The problem here is that it is magical thinking to believe automated tools can accurately deal with this problem. They can certainly find a subset of already-seen material that is fairly objectively CSAM or terrorist content. They can't deal with the grey areas journals.sagepub.com/doi/full/10.11…
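To illustrate the "already-seen" limitation, here is a minimal sketch of hash-based matching, the family of techniques these tools rely on. The database contents and function names are invented, and a plain cryptographic hash stands in for the perceptual hashes (PhotoDNA and similar) real systems use:

```python
import hashlib

# Hypothetical database of hashes of previously catalogued material.
# Real systems use perceptual hashes that survive re-encoding and
# cropping; SHA-256 is used here purely for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"already seen file").hexdigest(),
}

def flag_if_known(upload: bytes) -> bool:
    """Flag an upload only if it matches already-catalogued material."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

print(flag_if_known(b"already seen file"))  # True: previously catalogued
print(flag_if_known(b"novel material"))     # False: never seen, never flagged
```

Whatever the hash function, the matching step is binary and backward-looking: novel material, and anything whose illegality turns on context, falls outside the database entirely.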
Notably missing is any requirement to share these technologies. Only the largest firms will be able to obtain, maintain and train good automated systems: a huge barrier to entry for new entrants in social media or other communications technologies. Online Harms hasn't thought about the competition issues.
Core to the DMA is the idea of "core platform services" and providers thereof, listed here and defined either within the regulation or in previous regulations. Big and powerful providers of these services are in scope, basically.
The juicy parts of the DMA are Articles 5 and 6. These contain obligations for gatekeepers in relation to core platform services. Art 6 obligations can be further specified by the Commission through implementing acts.
After a long, unnecessary saga, England and Wales launch a decentralised contact tracing app based on the DP-3T work led by @carmelatroncoso, following other regions of the UK.
The original was a triple whammy of hubris: it wouldn't work abroad, it wouldn't work technologically on the mobile platforms, and its centralisation was open to abuse and function creep.
This version has much better foundations.
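For a sense of why the decentralised foundations matter, here is a heavily simplified sketch of the DP-3T idea (key sizes, rotation schedule and helper names are illustrative, not the published spec):

```python
import hashlib
import hmac
import os

def next_day_key(sk: bytes) -> bytes:
    """Each day's secret key is derived by hashing the previous day's."""
    return hashlib.sha256(sk).digest()

def ephemeral_ids(day_key: bytes, n: int = 96) -> list[bytes]:
    """Short-lived identifiers a phone broadcasts over Bluetooth; observers
    cannot link them to each other or to the phone's owner."""
    return [hmac.new(day_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(n)]

# On diagnosis, a user uploads only their day keys. Every other phone
# re-derives the corresponding ephemeral IDs locally and checks them
# against identifiers it overheard nearby; no central server ever
# learns who met whom.
sk_today = os.urandom(32)
overheard = set(ephemeral_ids(sk_today)[:5])  # IDs my phone recorded

published = sk_today                          # uploaded after a positive test
matches = overheard & set(ephemeral_ids(published))
print(f"exposure detected: {bool(matches)}")  # matching happens on-device
```

The centralised design inverted this: phones uploaded their contact data to a government server, which is exactly the abuse and function-creep surface the decentralised version removes.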
I understand the mistrust that may linger, but please do try this new one.
We've also learned plenty about platforms. If governments want citizens to be able to run arbitrary code on mobile devices, making use of all their sensors, they'll need the law to crack open the walled gardens. theguardian.com/commentisfree/…
I suspect students in England will make a very large number of subject access requests under the GDPR to schools from tomorrow, for their teacher-estimated grade as well as their rank order in the class: information which will likely have determined their university entrance.
There is a relevant exemption/delay provision in the Data Protection Act 2018, sch 2 para 25, for exam scripts, but this only pushes the deadline back, to a minimum of 22 September 2020. The ICO has confirmed this. ico.org.uk/global/data-pr…
The only situation in which I can see a plausible ground for refusing this grade is where the rank order reveals data about others, such as in classes of 2 or 3 (wow). Even then, there is no presumption against disclosure (see DB v General Medical Council [2018] EWCA Civ 1497).
Looks like the Court agrees with @maxschrems: it is for DPAs to strike down SCC transfers to particular countries, rather than for the Court to throw the mechanism itself out. The Court also decides to answer the Privacy Shield questions (the AG said it did not need to), and strikes Privacy Shield down.
SCCs are now haunted by the question of how an underfunded DPA examines all of a third country's laws and assesses whether SCCs remain valid there, when it can't even take complaints effectively in its own legal system.
A national parliamentary committee can be a public authority and a data controller, says the CJEU.
Clearly some very strange background to this case, though: the German administrative court referred a second question doubting its own ability to refer under TFEU art 267, due to a general lack of independence curia.europa.eu/juris/document…
In headache-inducing logic typical of art 267, the CJEU says the referring court is independent and so can refer the data protection question, but that the independence question itself is technically inadmissible, because it is not necessary to resolve the main proceedings; so the Court says it never actually answered it.
Anyway, beyond the strange act of self-doubt (which appears to concern the appointment of temporary judges and the ministry of justice's IT support for the court's computers), the case is generally unremarkable, other than confirming that the definition of a public authority is wide and that EU law applies.