Matthew Green
Jan 25, 2022 · 13 tweets
I read the new location tracking complaint against Google filed by three state AGs and DC. It shouldn’t be surprising to anyone who is familiar with Google, but it’s pretty detailed. Thread. 1/
The basic allegation is that Google (mainly via Android) made it extremely difficult to turn off location data collection, and when people *did* try to turn this off, Google still collected and used location data for advertising.
As described in the complaint, there are basically three ways Google can get your location. (1) via GPS, (2) by monitoring nearby WiFi networks, (3) through IP address. Even if you turn GPS off, Google uses some of these. 2/
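To see why turning off GPS isn't enough, here's a rough sketch of how Wi-Fi-based positioning works in general. This is not Google's actual implementation (which is proprietary); the database, BSSIDs, coordinates, and weighting scheme below are all made up for illustration. The core idea is just that a provider with a big map of access-point locations can locate any device that reports the networks it can see.

```python
# Toy sketch of Wi-Fi-based positioning: a provider keeps a database
# mapping access-point BSSIDs to known coordinates, so a device that
# reports nearby networks can be located even with GPS turned off.
# All BSSIDs, coordinates, and the weighting heuristic are invented
# for illustration; real systems are far more sophisticated.

AP_DATABASE = {
    "aa:bb:cc:00:00:01": (38.8977, -77.0365),  # hypothetical AP locations
    "aa:bb:cc:00:00:02": (38.8980, -77.0370),
    "aa:bb:cc:00:00:03": (38.8971, -77.0360),
}

def estimate_location(scan_results):
    """Weighted centroid of known APs; stronger signal = higher weight.

    scan_results: list of (bssid, rssi_dbm) pairs from a Wi-Fi scan.
    """
    total_w = lat_acc = lon_acc = 0.0
    for bssid, rssi in scan_results:
        if bssid not in AP_DATABASE:
            continue  # AP not in the provider's map; ignore it
        # RSSI is negative dBm; closer to 0 means stronger, hence nearer.
        weight = 1.0 / max(1.0, -rssi - 30)
        lat, lon = AP_DATABASE[bssid]
        lat_acc += weight * lat
        lon_acc += weight * lon
        total_w += weight
    if total_w == 0:
        return None
    return (lat_acc / total_w, lon_acc / total_w)

fix = estimate_location([
    ("aa:bb:cc:00:00:01", -45),   # strong signal: device is close to this AP
    ("aa:bb:cc:00:00:02", -70),   # weak signal
    ("aa:bb:cc:00:00:99", -50),   # unknown AP, ignored
])
print(fix)
```

The estimate lands between the two known APs, pulled toward the one with the stronger signal. The point of the sketch is that the device never needs GPS at all: the scan results plus the provider's database are enough.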
Once Google has your location information, the question is whether the user can stop them from recording it. As of 2018, Google seemed to make this possible through a Location History account setting. 3/
The Location History setting was described as “let[ting] Google save your location.” Presumably to ordinary non-technical users this language was about as clear as things get. According to the complaint, however, Google saved your location regardless of the setting. 4/
Specifically, Google has another “Web & App Activity” setting that also lets Google save your location. Because why have one setting when you can have many confusing ones? 5/
A brief interlude here to see what Google employees thought of these options. “[F]eels like it is designed to make things possible, but difficult enough that people won’t figure it out” is a solid quote. 6/
The complaint has a long section on “dark patterns” and this reads like a syllabus in a course on Silicon Valley privacy invasion. 7/
All the typical stuff: (1) presenting users with complicated opt-ins once at setup; (2) repeatedly “nudging” people who opt-out; (3) rewording dialog boxes to be less specific and maximize engagement; (4) hinting that apps “need” location history to work. It goes on. 8/
The one area where I felt it needed more detail was the scanning of Wi-Fi networks. Even if you turn off GPS, companies like Google can determine your location by seeing which Wi-Fi networks are nearby. The complaint hints that Google does this even when you disable location. 9/
In fact, from context it feels like a lot of the redacted text in this document is about Wi-Fi geolocation. I hope future amended complaints get into the details. 10/
Final note: how did Google management feel about all of this? Was it all a big misunderstanding caused by good people trying hard not to be evil? Judge for yourself. 11/11 fin.
Here is the complaint so you can read for yourself. It’s only about 20 pages long. cdn.vox-cdn.com/uploads/chorus…
More from @matthew_d_green

Jul 3
I really do think context is important here. Some of these age verification laws are based on good-faith concerns. But a lot of them are really designed to censor big chunks of the Internet, making them less accessible to both kids and adults.
If you’re thinking that some kind of privacy-preserving age verification system is the answer, that’s great! But you need to make sure your goals (easy access for adults, real privacy, no risk of credentials being stolen) actually overlap with the legislators’ goals.
These systems have loads of sharp edges, and even if you do a perfect job you’re already going to chill access to sites that require age verification. But of course *nobody* comes close to getting it right. For example: 404media.co/id-verificatio…
Jun 21
I want to agree with the idea that mass scanning “breaks encryption” but I think the entire question is a category error. Any law that installs surveillance software directly on your phone isn’t “breaking” or “not breaking” encryption, it’s doing exactly what it promises to do.
For decades we (in the west) had no mass surveillance of any communications. Starting in the 2010s some folks came up with the idea of scanning for illicit content like CSAM uploaded in plaintext on servers. (With apparently relatively little effect on the overall problem.)
I don’t think many people realize how new and unproven this scanning tech is: they just assume it’s always been there and that it works. It really hasn’t: it’s only a few years old, and it doesn’t seem to have had any noticeable impact on the sharing of CSAM.
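For a sense of how server-side scanning of this kind generally works: uploads are reduced to a "perceptual hash" that changes little under small edits, then compared against hashes of known illicit material. The sketch below uses a classic average-hash plus Hamming-distance matching; it is emphatically not any real provider's algorithm (systems like PhotoDNA are proprietary and far more elaborate), just the basic shape of the technique.

```python
# Toy perceptual-hash matcher: reduce an image to a 64-bit fingerprint
# that tolerates small edits, then flag uploads whose fingerprint is
# within a Hamming-distance threshold of a known-bad hash. This is a
# textbook "average hash" for illustration only, not a real deployed
# scanning algorithm.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the image's average?
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_match(upload_hash, known_bad_hashes, threshold=5):
    return any(hamming(upload_hash, h) <= threshold for h in known_bad_hashes)

# Demo with synthetic "images": an original, a lightly edited copy
# (small brightness shift), and an unrelated image.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
edited = [[min(255, p + 3) for p in row] for row in original]
unrelated = [[(255 - (r * c * 7)) % 256 for c in range(8)] for r in range(8)]

known_bad = {average_hash(original)}
print(is_match(average_hash(edited), known_bad))     # near-duplicate flagged
print(is_match(average_hash(unrelated), known_bad))  # unrelated image not flagged
```

The brightness-shifted copy hashes to (nearly) the same fingerprint as the original and is flagged, while the unrelated image is far away in Hamming distance. Real systems face the much harder problems of adversarial evasion and false positives, which is part of why the technique's effectiveness is contested.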
Jun 10
So Apple has introduced a new system called “Private Cloud Compute” that allows your phone to offload complex (typically AI) tasks to specialized secure devices in the cloud. I’m still trying to work out what I think about this. So here’s a thread. 1/
Apple, unlike most other mobile providers, has traditionally done a lot of processing on-device. For example, all of the machine learning and OCR text recognition on Photos is done right on your device. 2/
The problem is that while modern phone “neural” hardware is improving, it’s not improving fast enough to take advantage of all the crazy features Silicon Valley wants from modern AI, including generative AI and its ilk. This fundamentally requires servers. 3/
May 28
Some folks are discussing what it means to be a “secure encrypted messaging app.” I think a lot of this discussion is shallow and in bad faith, but let’s talk about it a bit. Here’s a thread. 1/
First: the most critical element that (good) secure messengers protect is the content of your conversations in flight. This is usually done with end-to-end encryption. Messengers like Signal, WhatsApp, Matrix etc. encrypt this data using keys that only the end-devices know. 2/
Encrypting the content of your conversations, preferably by default, is “table stakes.” It isn’t perfect, but it’s required for a messenger even to flirt with the word “secure.” But security and privacy are hard, deep problems. Solving encrypted messaging is just the start. 3/
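A quick illustration of what "keys that only the end-devices know" means in practice. Signal's actual protocol (X3DH key agreement plus the Double Ratchet, over elliptic curves) is far more involved; the toy finite-field Diffie–Hellman exchange below, over a deliberately small prime, only shows the core property: the server relays both public values yet cannot feasibly derive the shared key.

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement illustrating "keys that only the
# end-devices know": the relay server sees both public values but cannot
# feasibly compute the shared secret from them. Real messengers use
# elliptic-curve DH inside much richer protocols (Signal's X3DH plus the
# Double Ratchet); the small Mersenne prime here is for illustration
# only and offers no real security.

P = 2**127 - 1  # a known Mersenne prime; real DH groups are ~2048 bits or EC
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1  # private key: never leaves the device
    pub = pow(G, priv, P)                # public value: relayed via the server
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each side combines its own private key with the other's public value.
# Since (G^a)^b == (G^b)^a mod P, both arrive at the same secret, while
# the server, holding only G^a and G^b, cannot.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)

# Hash the shared secret down to a symmetric key for message encryption.
key_a = hashlib.sha256(alice_secret.to_bytes(16, "big")).digest()
key_b = hashlib.sha256(bob_secret.to_bytes(16, "big")).digest()
print(key_a == key_b)
```

Everything else a "secure messenger" claims (metadata protection, forward secrecy, safety numbers) is built on top of this basic asymmetry between the endpoints and the server.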
May 23
Several people have suggested that the EU’s mandatory chat scanning proposal was dead. In fact it seems that Belgium has resurrected it in a “compromise” and many EU member states are positive. There’s a real chance this becomes law. dropbox.com/scl/fi/9w611f2…
The basic idea of this proposal is to scan private (and encrypted) messages for child sexual abuse material. The scanning now covers only images and videos. Previous versions also included text and audio, but the new proposal has for the moment set that aside, because it was too creepy.
Previous versions of this idea ran into opposition from some EU member states. Apparently these modest changes have been enough to bring France and Poland around. Because “compromise”.
May 12
Telegram has launched a pretty intense campaign to malign Signal as insecure, with assistance from Elon Musk. The goal seems to be to get activists to switch away from encrypted Signal to mostly-unencrypted Telegram. I want to talk about this a bit. 1/
First things first, Signal Protocol, the cryptography behind Signal (also used in WhatsApp and several other messengers) is open source and has been intensively reviewed by cryptographers. When it comes to cryptography, this is pretty much the gold standard. 2/
Telegram by contrast does not end-to-end encrypt conversations by default. Unless you manually start an encrypted “Secret Chat”, all of your data is visible on the Telegram server. Given who uses Telegram, this server is probably a magnet for intelligence services. 3/