Matthew Green
Mar 20, 2018
Good morning Twitter. This post about Ledger cryptocurrency hardware wallet vulnerabilities is extremely cool, and not just for cryptocurrency people. Let me talk a bit about it. 1/ saleemrashid.com/2018/03/20/bre…
There is a common architectural theme in certain embedded devices: they incorporate a secure processor (or processor component) to protect critical secrets or ensure correct behavior. I’ve seen this in all kinds of devices, not just cryptocurrency wallets. 2/
(For an obvious example, every recent iPhone has a Secure Enclave processor that stores your fingerprint data and cryptographic keys. But these devices are used elsewhere as well. theiphonewiki.com/wiki/Secure_En…) 3/
Secure co-processors typically incorporate some kind of tamper-resistant physical casing as well as a limited interface to protect secret data. They often have some crypto functions on board, and can “attest” (prove to remote parties) that they’re running the right software. 4/
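The attestation idea can be sketched in a few lines. This is a toy illustration of the concept only, not any real device's protocol: here a pre-shared HMAC key stands in for what would really be an asymmetric device key plus a certificate chain, and the key name and message layout are my own assumptions.

```python
import hashlib
import hmac
import os

# Toy remote attestation sketch (NOT a real protocol). The secure element
# holds a device key; for this sketch the verifier shares it. In practice
# this would be an asymmetric signing key with a manufacturer certificate.
DEVICE_KEY = b"device-secret-key"  # assumption: pre-shared for the sketch

def attest(firmware: bytes, nonce: bytes) -> bytes:
    """Secure element: bind a hash of its firmware to the verifier's fresh nonce."""
    measurement = hashlib.sha256(firmware).digest()
    return hmac.new(DEVICE_KEY, nonce + measurement, hashlib.sha256).digest()

def verify(expected_fw: bytes, nonce: bytes, quote: bytes) -> bool:
    """Remote verifier: recompute the quote over the firmware it expects."""
    expected = hmac.new(DEVICE_KEY,
                        nonce + hashlib.sha256(expected_fw).digest(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, quote)

nonce = os.urandom(16)
quote = attest(b"official firmware v1.0", nonce)
assert verify(b"official firmware v1.0", nonce, quote)
assert not verify(b"malicious firmware", nonce, quote)
```

The fresh nonce is what stops a device from replaying an old quote produced before it was compromised.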
None of these processors can withstand all attacks. But let’s ignore that part and assume they can, for the moment. This still leaves a huge gaping hole in many devices. 5/
You see, the typical “secure element” isn’t powerful enough to drive your entire device (including the pretty GUI and peripherals and network communication if that’s available). So most devices have a second “insecure” processor to do all that stuff. 6/
(A very few devices make exceptions to this. For example, the iPhone SEP has a direct connection to the fingerprint reader, because the application processor isn’t trusted with that data. Weirdly FaceID departs from this but I digress.) 7/
Anyway, the upshot of this design is that even if the secure processor works correctly, it’s entirely dependent on the (relatively) insecure general processor for nearly all functions, including driving a user interface. Basically it’s a hostage. 8/
In some cases this is ok. For example, a good SEP can use crypto to establish secure communications with a trusted outside device (like a remote server). If this is done right, even a compromised app processor can only block communications, not tamper with them. 9/
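A minimal sketch of why a compromised relay can only block, not tamper. This is my own illustration, not Apple's design: a symmetric session key (assumed here to come from some prior key exchange) authenticates each message, so the untrusted app processor in the middle can drop frames but any modification is detected.

```python
import hashlib
import hmac

# Assumption for the sketch: SEP and server already share a session key
# (in reality established via an authenticated key exchange).
SESSION_KEY = b"sep-to-server-session-key"

def sep_send(msg: bytes) -> bytes:
    """Secure element: append a MAC so the relay can't modify the message."""
    tag = hmac.new(SESSION_KEY, msg, hashlib.sha256).digest()
    return msg + tag

def server_recv(frame: bytes) -> bytes:
    """Server: reject any frame whose MAC doesn't verify."""
    msg, tag = frame[:-32], frame[-32:]
    expected = hmac.new(SESSION_KEY, msg, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("tampered frame")
    return msg

frame = sep_send(b"unlock request")
assert server_recv(frame) == b"unlock request"

# A compromised app processor can drop the frame entirely, but if it
# flips even one bit, the server notices:
tampered = bytes([frame[0] ^ 1]) + frame[1:]
try:
    server_recv(tampered)
    raise AssertionError("tampering went undetected")
except ValueError:
    pass
```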
In others it’s super bad news: the security of the device may rest on the user being able to trust what they see on the display. But they can’t, if the app processor controls that display and becomes compromised. 10/
Solving this problem is incredibly hard. Systems like TPMs try to do it by giving the secure chip access to the same RAM as the app processor, which allows it to check which code the app processor is running. 11/
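The TPM-style approach amounts to "measured boot": each piece of code is hashed into a running register before it executes, so the final value commits to the exact code sequence. A toy sketch of the extend operation (the stage names are illustrative, not any real boot chain):

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """TPM-style PCR extend: fold a hash of the next boot stage into the register."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

# Registers start at zero; each stage is measured before it runs.
pcr = bytes(32)
for stage in [b"bootloader", b"kernel", b"app"]:
    pcr = extend(pcr, stage)
golden = pcr  # the value a verifier expects for known-good code

# Swap in a compromised kernel and the final value no longer matches:
pcr2 = bytes(32)
for stage in [b"bootloader", b"evil kernel", b"app"]:
    pcr2 = extend(pcr2, stage)
assert pcr2 != golden
```

Because extend is one-way and order-sensitive, compromised code can't "un-measure" itself after the fact.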
But most secure processors don’t have even this capability. They have no way of knowing whether the app processor is running good or compromised code. 12/
Which (finally!) brings us to the brave, ambitious, scary thing Ledger did. In Ledger wallets, the secure processor *asks* the app processor (nicely) to send it a copy of the firmware that it’s running. 13/
(When I mentioned this to my grad student Gabe, he got a look on his face like I had just handed him Durian candy. Then he started muttering “no, no, that can’t possibly work”) 14/
The reason to be concerned about this approach is because *if* the app processor is compromised, then why would you trust it to send over the actual (malicious) code it’s running? It could just lie and send the legit code. 15/
Ledger tries to square this circle in a novel way. Their idea is that the device has a fixed amount of NVRAM storage. If you install compromised firmware on it, you’d need room to store that. But you’d also need to store the original firmware to satisfy the checks. 16/
If you make it hard for the attacker to find the room to do this, you win!
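A toy model of that space argument (my own illustration, not Ledger's actual mechanism): if the legitimate firmware is made to fill nearly all of the available flash, a compromised device has no room to keep a pristine copy around to replay when the secure element asks for it.

```python
# Toy model of the "no spare room" argument (my illustration, not
# Ledger's real check). Flash is small and fixed-size.
FLASH_SIZE = 256

legit_fw = b"L" * 240      # legitimate firmware deliberately fills most of flash
malicious_fw = b"M" * 240  # the attacker's replacement is about the same size

# An honest device stores just the legitimate image:
assert len(legit_fw) <= FLASH_SIZE

# To pass the secure element's check, a compromised device must produce the
# legitimate image on demand -- but its own code must also be resident.
# Both copies can't fit:
assert len(legit_fw) + len(malicious_fw) > FLASH_SIZE

# The weak point is therefore compression or regeneration tricks: if an
# attacker can reconstruct the legitimate image from fewer than
# FLASH_SIZE - len(malicious_fw) stored bytes, the argument collapses.
```

That last comment is exactly the kind of gap an attacker goes hunting for, which brings us to what actually happened.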

This time around, Ledger did not win. Saleem writes about why that is in his post, which I linked to 9,000 tweets up this thread. 17/
But, as the fedora-wearing treasure hunter says to young Indiana Jones in “The Last Crusade”: You lost today, kid, but that doesn't mean you have to like it. 18/
And since Ledger can’t update the hardware on their devices presumably they’re going to have to try to harden their approach even further. I’m really interested to see whether they win this! 19/
Because if someone can make this approach work, it would have huge implications for a large class of devices beyond wallets. I’m deeply skeptical. But I’m always skeptical. Excited to see how it goes. 20/20 fin
And by the way, nothing in the post or thread above means you should freak out about these vulns, or that you should assume other wallets are better. Just be safe.
Also, if you don’t like massive 20-tweet rants, here’s the above Ledger thread in one page. threadreaderapp.com/thread/9760774…
