Matthew Green
Dec 23, 2020 · 26 tweets
My students @maxzks and Tushar Jois spent most of the summer going through every piece of public documentation, forensics report, and legal document we could find to figure out how police were “breaking phone encryption”. 1/
This was prompted by a claim from someone knowledgeable that forensics companies could no longer break the Apple Secure Enclave Processor, which would make it very hard to crack the passcode of a locked, recent iPhone. 2/
We wrote an enormous report about what we found, which we’ll release after the holidays. The TL;DR is kind of depressing:

Authorities don’t need to break phone encryption in most cases, because modern phone encryption sort of sucks. 3/
I’ll focus on Apple here but Android is very similar. The top-level is that, to break encryption on an Apple phone you need to get the encryption keys. Since these are derived from the user’s passcode, you either need to guess that — or you need the user to have entered it. 4/
Guessing the passcode is hard on recent iPhones because there’s (at most) a 10-guess limit enforced by the Secure Enclave Processor (SEP). There’s good evidence that at one point in 2018 a company called GrayKey had a SEP exploit that bypassed this limit on the iPhone X. See photo. 5/
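To see why that guess cap matters, here’s some back-of-the-envelope arithmetic (an illustration only; real forensics tools reportedly try dictionaries of common passcodes rather than guessing uniformly at random):

```swift
// With a hard 10-attempt budget, guessing a uniformly random
// six-digit passcode succeeds with probability 10 / 1,000,000.
let passcodeSpace = 1_000_000.0   // all six-digit PINs
let attemptBudget = 10.0          // SEP guess cap before lockout/wipe
print(attemptBudget / passcodeSpace)   // 1e-05, i.e. a 0.001% chance
```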
There’s really no solid evidence that this exploit still works on iPhone models released after 2018. If anything, the evidence is against it.

So if they can’t crack the passcode, how is law enforcement still breaking into iPhones (because they definitely are)? 6/
The boring answer very likely is that police *aren’t* guessing suspects’ passcodes. They’re relying on the fact that the owner probably typed it in. Not *after* the phone is seized, in most cases. Beforehand. 7/
You see, iPhones can be in one of two states, which are respectively known as “Before First Unlock” (BFU) and “After First Unlock” (AFU). This is pretty self-explanatory.

When you turn your phone on and enter the passcode in the morning, you switch your phone from BFU->AFU. 8/
When you first unlock your iPhone after power-on, it uses your passcode to derive several sets of cryptographic keys. These stay in memory inside your phone, and are used to encrypt the file system. 9/
When you lock your iPhone (press the side button, or leave it alone until the screen goes blank), exactly *one* set of keys gets “evicted”, i.e., erased from memory. Those keys are gone until you enter your passcode or use FaceID.

All of the other keys stay in memory. 10/
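Here’s a loose conceptual sketch of that lifecycle in Swift. This is emphatically *not* Apple’s real SEP code: the actual derivation tangles your passcode with a fused-in hardware UID key inside the Secure Enclave, and the class labels and HKDF stand-in below are my own illustrative assumptions.

```swift
import CryptoKit
import Foundation

// Conceptual sketch only. One key per protection class:
// "A" = Complete (evicted on lock), "C" = AFU default (stays in
// memory until power-off). HKDF stands in for the real SEP KDF.
func deriveClassKeys(passcode: String, hardwareUID: Data) -> [String: SymmetricKey] {
    var keys: [String: SymmetricKey] = [:]
    for cls in ["A", "B", "C"] {
        keys[cls] = HKDF<SHA256>.deriveKey(
            inputKeyMaterial: SymmetricKey(data: Data(passcode.utf8)),
            salt: hardwareUID,
            info: Data("class-\(cls)".utf8),
            outputByteCount: 32)
    }
    return keys
}

// First unlock (BFU -> AFU): all class keys come into memory.
var inMemoryKeys = deriveClassKeys(passcode: "123456",
                                   hardwareUID: Data(repeating: 0xA5, count: 32))

// Locking the phone evicts ONLY the class-A key; everything else stays.
inMemoryKeys["A"] = nil
```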
The key that gets evicted on lock is used to decrypt a subset of the files on the filesystem, namely the ones marked with a specific protection class (NSFileProtectionComplete). The keys that don’t get evicted can be used to decrypt all the other files.

(This is all well-known so far BTW.) 11/
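For the curious, this is what the documented Data Protection API surface looks like from an app’s side. A minimal sketch; file names and contents are made up:

```swift
import Foundation

let secret = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("secret.txt")
let ordinary = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("ordinary.txt")

// Class A ("Complete Protection"): encrypted under the key that gets
// evicted on lock, so it's unreadable whenever the device is locked.
try Data("sensitive".utf8).write(to: secret, options: .completeFileProtection)

// The AFU class this thread is about — the default for most app data;
// readable any time after first unlock, even with the screen locked.
try Data("mundane".utf8).write(
    to: ordinary, options: .completeFileProtectionUntilFirstUserAuthentication)
```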
So the upshot of this is that, if police can capture your phone in the AFU state (yours is almost certainly in that state for 99% of its existence) *and* they have a software exploit that allows them to bypass the OS security measures, they can get most of the files. 12/
The real question is: what exactly does “most of the files” mean? And the corollary: why not protect *more* than just a few of the files with that special key (the one that gets evicted)? That’s where things get depressing. 13/
Apple *sort of* vaguely offers a list of the apps whose files get this special protection even in the AFU state. But notice how vague the language is; I have to actually decode it. 14/
Notice how this text simply reports that some app data is “protected through encryption” (vague and meaningless, since it doesn’t say whether that means AFU or only BFU) while other app data is explicitly protected only in the BFU state (before you first unlock). Why so vague? 15/
Here is a version of the same text from back in 2012. Notice how it explicitly states that “Mail, App Launch images, and Location Data” are protected using the strongest type of encryption.

So it seems that Apple is actually protecting *less* data now than in 2012. Yikes. 16/
(Our best guess: Apple weakened the protections on location data to enable fancy features like “location-based reminders”, and so had to weaken the language in the security guide. This isn’t great.) 17/
But whether you look at the 2012 or the 2020 documentation, the situation sucks. The built-in apps that definitely use strong AFU protection are:

Mail (which probably already exists on a server that police can subpoena, so who cares.)

App launch data (🤷‍♂️)

That’s not great. 18/
3rd party apps can opt in to protect data using the strongest type of encryption (see the sketch after this list), so this isn’t necessarily the whole story. But let’s list some data that *doesn’t* get AFU protection:

Photos
Texts
Notes
Possibly some location data

Most of what cops want. 19/
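The opt-in itself is a one-liner. A minimal sketch using the documented FileManager API (the path is illustrative):

```swift
import Foundation

// Retroactively raising an existing file's protection class to
// "Complete", so its key gets evicted whenever the device locks:
let dbPath = NSTemporaryDirectory() + "notes.db"
try FileManager.default.setAttributes(
    [.protectionKey: FileProtectionType.complete],
    ofItemAtPath: dbPath)
```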
So this answers the great mystery of “how are police breaking Apple’s encryption in 2020”. The answer is they probably aren’t. They’re seizing unlocked phones and using jailbreaks to dump the filesystem, most of which can be accessed easily since keys are in memory. 20/
Oh my god my thumbs. 21/
Anyway, this leaves basically only one remaining question:

Why is so little of this data encrypted when your phone is AFU and locked? And the answer to that is probably obvious to anyone who develops software, but it still sucks. 22/
Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things.

When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff. It gets an error. 23/
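Concretely, the failed read typically surfaces as NSFileReadNoPermissionError (Cocoa error 257), and UIKit exposes the lock state so apps can defer work instead of crashing. A small sketch of the documented API:

```swift
import UIKit

// Reading a Complete-protected file while the device is locked fails,
// because the class key has been evicted. Apps can check first:
func readIfAvailable(_ url: URL) -> Data? {
    guard UIApplication.shared.isProtectedDataAvailable else {
        // Locked, keys evicted: wait for
        // UIApplication.protectedDataDidBecomeAvailableNotification
        // and retry then.
        return nil
    }
    return try? Data(contentsOf: url)
}
```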
Apple provides some tools to make this less painful: for example, they have a “write-only” protection class (NSFileProtectionCompleteUnlessOpen).

But for the most part it’s annoying for software devs, so they lower protections. And if Apple *isn’t* using strong protection for its in-house apps, who will? 24/
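Here’s what using that “write-only” class looks like: an app can *create* files while the device is locked (under the hood this uses an ephemeral key agreement against a class public key), but can’t reopen them until the next unlock. A minimal sketch with made-up file names:

```swift
import Foundation

// CompleteUnlessOpen: writable while locked, unreadable until unlock.
let inbox = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("queued-message.bin")
let payload = Data("received while locked".utf8)

try payload.write(to: inbox, options: .completeFileProtectionUnlessOpen)
```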
If I could tell Apple to do one thing, I would tell them to figure this problem out. Because without protection for the AFU state, phone encryption is basically a no-op against motivated attackers.

Maybe Apple’s lawyers prefer it this way, but it’s courting disaster. 25/
For those who would prefer to read this thread in the form of a 65-page PDF that also discusses cloud backup systems and Android, here is our current paper draft: securephones.io/main.pdf

This will be on a pretty website soon. Thanks for not blocking me after this thread. // fin
