Matthew Green
Dec 23, 2020 · 26 tweets
My students @maxzks and Tushar Jois spent most of the summer going through every piece of public documentation, forensics report, and legal document we could find to figure out how police were “breaking phone encryption”. 1/
This was prompted by a claim from someone knowledgeable, who claimed that forensics companies no longer had the ability to break the Apple Secure Enclave Processor, which would make it very hard to crack the password of a locked, recent iPhone. 2/
We wrote an enormous report about what we found, which we’ll release after the holidays. The TL;DR is kind of depressing:

Authorities don’t need to break phone encryption in most cases, because modern phone encryption sort of sucks. 3/
I’ll focus on Apple here, but Android is very similar. The top-level point is that, to break encryption on an Apple phone, you need to get the encryption keys. Since these are derived from the user’s passcode, you either need to guess that — or you need the user to have entered it. 4/
Guessing the passcode is hard on recent iPhones because there’s (at most) a 10-guess limit enforced by the Secure Enclave Processor (SEP). There’s good evidence that at one point in 2018 a company called Grayshift had a SEP exploit (sold as the GrayKey device) that bypassed this limit on the iPhone X. See photo. 5/
There is really no solid evidence that this exploit still works on recent-model iPhones, after 2018. If anything, the evidence is against it.

So if they can’t crack the passcode, how is law enforcement still breaking into iPhones (because they definitely are)? 6/
The boring answer very likely is that police *aren’t* guessing suspects’ passcodes. They’re relying on the fact that the owner probably typed it in. Not *after* the phone is seized, in most cases. Beforehand. 7/
You see, iPhones can be in one of two states, which are respectively known as “Before First Unlock” (BFU) and “After First Unlock” (AFU). This is pretty self-explanatory.

When you turn your phone on and enter the passcode in the morning, you switch your phone from BFU->AFU. 8/
When you first unlock your iPhone after power-on, it uses your passcode to derive several sets of cryptographic keys. These stay in memory inside your phone, and are used to encrypt the file system. 9/
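The derivation step above can be sketched as a toy model. This is *not* Apple’s actual scheme — the real one runs inside the SEP — but it captures the two properties that matter: the passcode is tangled with a device-unique secret (so guessing must happen on the device), and several per-class keys come out of one slow derivation. All names and parameters here are illustrative.

```python
import hashlib

# Illustrative stand-in for the device-unique secret fused into the SEP.
SEP_DEVICE_UID = b"device-unique-secret-fused-into-SEP"

def derive_class_keys(passcode: str) -> dict:
    # A slow KDF makes each passcode guess expensive.
    master = hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), SEP_DEVICE_UID, iterations=100_000
    )
    # Several per-class keys are derived from the result; each one
    # protects a different "protection class" of files on disk.
    return {
        cls: hashlib.sha256(master + cls.encode()).digest()
        for cls in ("complete", "unless_open", "until_first_auth")
    }

keys = derive_class_keys("123456")
```

The point of the device-bound salt is that an attacker who extracts the encrypted filesystem can’t brute-force the passcode offline on a GPU farm; every guess has to go through the SEP.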
When you lock your iPhone (or press the button on the side, or leave it alone until the screen goes blank), exactly *one* set of keys gets “evicted”, i.e., erased from memory. Those keys are gone until you enter your passcode or use Face ID.

All of the other keys stay in memory. 10/
The key that gets evicted on lock is used to decrypt a subset of the files on the filesystem — namely, the ones assigned a specific protection class (NSFileProtectionComplete, aka “Complete Protection”). The keys that don’t get evicted can be used to decrypt all the other files.

(This is all well-known so far BTW.) 11/
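The BFU/AFU key-eviction behavior described above can be sketched as a toy state machine. This is a simulation of the *behavior*, not Apple’s implementation; `"complete"` stands in for the Complete Protection class, and `"until_first_auth"` for the weaker default class that stays decryptable after first unlock.

```python
class PhoneKeyState:
    """Toy model of which class keys are resident in RAM."""

    def __init__(self):
        self.keys = {}  # protection class -> key material in memory (BFU: empty)

    def first_unlock(self, passcode: str):
        # BFU -> AFU: derive every class key and keep them all in RAM.
        self.keys = {c: f"key({c},{passcode})"
                     for c in ("complete", "until_first_auth")}

    def lock(self):
        # Screen lock evicts ONLY the Complete Protection key.
        self.keys.pop("complete", None)

    def can_decrypt(self, protection_class: str) -> bool:
        return protection_class in self.keys

phone = PhoneKeyState()
phone.first_unlock("123456")
phone.lock()
# Locked-but-AFU: Complete-class files are sealed; everything else is not.
```

This is exactly why the AFU state matters so much: after `lock()`, most class keys are still sitting in memory, waiting for anyone with a filesystem-level exploit.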
So the upshot of this is that, if police can capture your phone in the AFU state (yours is almost certainly in that state for 99% of its existence) *and* they have a software exploit that allows them to bypass the OS security measures, they can get most of the files. 12/
The real question is: what exactly does “most of the files” mean, and the corollary is “why not protect *more* than just a few of the files with that special key (the one that gets evicted)”. That’s where things get depressing. 13/
Apple *sort of* vaguely offers a list of the apps whose files get this special protection even in the AFU state. But notice how vague this language is. I have to actually decode it. 14/
Notice how this text simply reports that some app data is “protected through encryption” (this is vague and meaningless, since it doesn’t say whether it’s AFU or BFU) and other app data is explicitly only protected in the BFU state (before you first unlock). Why so vague? 15/
Here is a version of the same text from back in 2012. Notice how it explicitly states that “Mail, App Launch images, and Location Data” are protected using the strongest type of encryption.

So it seems that Apple is actually protecting *less* data now than in 2012. Yikes. 16/
(Our most likely guess: Apple has weakened the protections on location data in order to enable fancy features like “location based reminders”. So they had to weaken the language in the security guide. This isn’t great.) 17/
But whether you look at the 2012 or 2020 data, the situation sucks. The built-in apps that definitely use strong AFU protection are:

Mail (which probably already exists on a server that police can subpoena, so who cares.)

App launch data (🤷‍♂️)

That’s not great. 18/
3rd party apps can opt-in to protect data using the strongest type of encryption, so this isn’t necessarily the whole story. But let’s list some data that *doesn’t* get AFU protection:

Photos
Texts
Notes
Possibly some location data

Most of what cops want. 19/
So this answers the great mystery of “how are police breaking Apple’s encryption in 2020”. The answer is they probably aren’t. They’re seizing unlocked phones and using jailbreaks to dump the filesystem, most of which can be accessed easily since keys are in memory. 20/
Oh my god my thumbs. 21/
Anyway, this leaves basically only one remaining question:

Why is so little of this data encrypted when your phone is AFU and locked? And the answer to that is probably obvious to anyone who develops software, but it still sucks. 22/
Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things.

When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff. It gets an error. 23/
Apple provides some tools to make this less painful: for example, they have a “write only” protection class (presumably NSFileProtectionCompleteUnlessOpen, which lets an app create new protected files while the phone is locked).

But for the most part it’s annoying for software devs, so they lower protections. And if Apple *isn’t* using strong protection for its in-house apps, who will? 24/
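The “write only” idea can be sketched as a toy model too. The real class (assuming it is NSFileProtectionCompleteUnlessOpen) uses asymmetric crypto: a public “wrapping” key stays in memory while locked, so background writes succeed, but the private “unwrapping” key is evicted, so reads fail until the next unlock. The code below only simulates that split; it is not Apple’s implementation and uses no real cryptography.

```python
class WriteOnlyClass:
    """Toy model of a write-only protection class (lock evicts the read half)."""

    def __init__(self):
        self.public_key = "pub"    # wrapping key: never evicted
        self.private_key = "priv"  # unwrapping key: evicted on lock
        self.files = {}

    def lock(self):
        self.private_key = None

    def unlock(self):
        self.private_key = "priv"

    def write(self, name: str, data: bytes):
        # Wrapping a new per-file key only needs the public half,
        # so background writes work even while the phone is locked.
        self.files[name] = (f"wrapped-with-{self.public_key}", data)

    def read(self, name: str) -> bytes:
        if self.private_key is None:
            raise PermissionError("file is protected; unlock required")
        return self.files[name][1]
```

This shows why the class exists: it lets a background task append data (logs, downloads, sensor readings) without leaving that data readable in the AFU-locked state.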
If I could tell Apple to do one thing, I would tell them to figure this problem out. Because without protection for the AFU state, phone encryption is basically a no-op against motivated attackers.

Maybe Apple’s lawyers prefer it this way, but it’s courting disaster. 25/
For those who would prefer to read this thread in the form of a 65-page PDF that also discusses cloud backup systems and Android, here is our current paper draft: securephones.io/main.pdf

This will be on a pretty website soon. Thanks for not blocking me after this thread. // fin
