Our latest paper on technology standardisation is out in @PolicyR! Thanks @teirdes for the fabulous collaboration. Some people doubt that values-inspired technology design is possible. We show that not only is it possible, but that values already influence technology. policyreview.info/pdf/policyrevi…
"Values" not only guide the building of technologies in aspects such as privacy or cybersecurity, accessibility, freedom of expression, or censorship. There are past examples of political-technology clashes/interventions, too. On-demand decryption or OS changes are examples.
"Human, moral, and European values are clearly linked to technology ... We stress that the presence of politics in the technology sphere is already a reality.". True story! With examples from the U.S. and France.
There are also examples of failures. For example, the Do Not Track/Tracking Preferences Expression cross-pollinated tech policy debates, but ultimately did not get backing and is now facing a crisis. But there are also success stories, for example Privacy by Design.
But the European Union is struggling with technology standards. Its approach is maladapted to today's world. GDPR and 2G are the well-known examples, but what else is there? There is a need for change, and a need for smart tech policy people to engage.
Europe can do better, though. We tackle this puzzle: the European Union must simplify its current policy, needs a modern strategy for involvement in technology standards, and must work out how to practically structure its influence over those standards.
I must also thank again probably my favourite former MEP, Amelia @teirdes. We both know a lot about technology standards, and about technology policy. This work is a direct outcome of our combined experience and knowledge.
Apple's AI announcement is interesting. In this thread, I analyse the security and privacy of Private Cloud Compute (PCC). Preliminary summary: trust is necessary, it is not purely on-device processing, and transparency is unclear for now.
They put loads of effort into this design. Technically, it probably can’t be made much better. Cloud servers perform computations in a way that does not retain any user data after processing, enhancing security.
Data must never be available to anyone other than the user. PCC is not meant to store any user data; the data is used only to serve the request and produce the answer.
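As a toy illustration of what "stateless" processing means here (a hypothetical sketch, not Apple's implementation, which leans on custom hardware, attestation, and verifiable server images):

```python
# Hypothetical illustration of "stateless" request handling in the spirit of
# PCC's claim: user data is used only to compute the response and is not
# persisted anywhere. This is NOT Apple's implementation.
def handle_request(user_data: str) -> str:
    # Compute the answer from the request alone...
    answer = f"processed: {len(user_data)} characters"
    # ...and return it without writing user_data to logs, disk, or any store.
    return answer

print(handle_request("example prompt"))
# In PCC the "no retention" property is meant to be enforced and verifiable
# at the platform level, not just by application code behaving nicely.
```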
For example, entirely hypothetically, Russia could stage a false-flag cyber operation "as Ukraine". On itself. An operation that would evidently cross the war threshold, which is very simple to do with cyber. Then "legally" respond.
It may then appear completely "legal" to act in self-defence, including by sending a tank division for an invasion, possibly with airborne support, why not.
It would also then be "understandable" to many Western states, because they have already said that a significant cyberattack may warrant a conventional military-kinetic response. Russia could even argue that it is simply doing what the West had previously devised...
It turns out that wireless charging leaks private data. It leaks information about websites visited by the user: "allows accurate website fingerprinting on a charging smartphone". The information leaked depends on the battery level. Cool work! #GDPR #ePrivacy arxiv.org/pdf/2105.12266…
"Below approximately 80% state of charge, both wired and wireless charging side-channels observed in this experiment do not leak information. ... consistently classify traces with a battery state 90%". Privacy-preserving advice: have less than 80% battery charge? :-)
Google is doubling down on its new (hopefully, as claimed) privacy-improved proposal for ads systems, Turtledove. What is it? It lets the ad to display be chosen on the user's device, with no data supposed to leave the user's browser. So no tracking? groups.google.com/a/chromium.org…
The testing environment ('FLEDGE') has somewhat relaxed privacy properties, so let's hope the final solution is tighter with respect to privacy protection. It's quite a complex proposal. github.com/WICG/turtledov… chromestatus.com/feature/573358…
The solution is apparently based on (at least it seems so) and builds on 10+ years of academic research into privacy-preserving ads systems. Initially neglected, it is fascinating to see this niche field suddenly emerge as a multi-billion one. blog.lukaszolejnik.com/are-we-reachin…
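To make the core idea concrete, here is a hypothetical sketch of an on-device ad auction; the real Turtledove/FLEDGE proposal is a JavaScript browser API with buyer/seller logic and is far more involved:

```python
# Conceptual sketch, in the spirit of Turtledove: interest-group data stays in
# the browser and the ad is selected locally (hypothetical data structures).
from dataclasses import dataclass

@dataclass
class InterestGroup:
    owner: str   # advertiser/ad-tech that added the user to the group
    name: str    # e.g. "running-shoes"
    ads: list    # candidate ads cached in the browser

@dataclass
class Bid:
    ad: str
    price: float

def run_local_auction(interest_groups, contextual_signals):
    """Pick an ad entirely on the device: bids are computed locally and
    no interest-group membership is sent to the publisher or ad servers."""
    bids = []
    for group in interest_groups:
        for ad in group.ads:
            # Real bidding logic would come from buyer-provided code;
            # a trivial placeholder score stands in for it here.
            price = contextual_signals.get("base_cpm", 1.0)
            bids.append(Bid(ad=ad, price=price))
    return max(bids, key=lambda b: b.price) if bids else None

groups = [InterestGroup("shoes.example", "running-shoes", ["ad-123", "ad-456"])]
winner = run_local_auction(groups, {"base_cpm": 2.5})
print("winning ad rendered locally:", winner.ad if winner else None)
```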
Ticketmaster fined £1.25 million for a security compromise (they were hacked by the Magecart group; their website code was altered to steal data during payments), a #GDPR breach. ~9.4m customers affected. Payment data was stolen, too. ico.org.uk/media/action-w…
A third party (a chatbot provider) was breached, and this spilled over to Ticketmaster. Had this functionality not been included on the payment page, this breach would not have happened (this way, at least). Fun fact: the ICO decided to enforce PCI-DSS requirements. #GDPR #ePrivacy
Ticketmaster says they were unable to use standard subresource integrity (blog.lukaszolejnik.com/making-third-p…) to protect their site because the software changed too often (but they did not know how often). "ICO views this measure as an appropriate measure to implement" #GDPR
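For context, subresource integrity pins a cryptographic hash of the third-party script in the page, so a modified script is refused by the browser. A minimal sketch of computing the integrity value (the script content below is a made-up stand-in):

```python
# Compute a Subresource Integrity (SRI) value for a third-party script.
import base64
import hashlib

# Stand-in for the exact bytes of the third-party script as served
# (in practice you would hash the real file contents, e.g. the chatbot code).
script_bytes = b'console.log("chat widget");'

digest = hashlib.sha384(script_bytes).digest()
integrity = "sha384-" + base64.b64encode(digest).decode()
print(integrity)

# The value goes into the script tag, e.g.:
#   <script src="https://third-party.example/widget.js"
#           integrity="sha384-..." crossorigin="anonymous"></script>
# If the hosted script changes, the hash no longer matches and the browser
# refuses to execute it -- which is why frequently changing third-party code
# makes SRI harder to deploy.
```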
The Netherlands government published its position on the rules applying to security in cyberspace (cyberattacks/cyberwarfare). My short take (the document is very good): government.nl/binaries/gover…
Sovereignty as a rule applies to cyberspace, but its extent is not clear. Some investigations may (or may not) breach the sovereignty of other countries.
It apparently links 'election interference on social media' with 'intervention'. Is a bunch of trolls an intervention in a country's affairs? No, because it does not result in a behaviour change in the 'targeted state' (who?)? But you can imagine a state leader issuing threats on social media?