The story here, for those who may have forgotten 2015 (it was a long time ago!), is that the NSA inserted a backdoor into a major encryption standard and then leaned on manufacturers to install it. Thread. 1/
The backdoor was in a pseudorandom number generator called Dual EC. It wasn’t terribly subtle but it was *deniable*. You could say to yourself “well, that could be horribly exploitable but nobody would do that.” Lots of serious people said that, in fact. But they did. 2/
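The mechanics of the backdoor are easy to sketch. Below is a toy, purely illustrative Dual-EC-style generator: it uses the multiplicative group mod a prime as a stand-in for the real elliptic-curve group, skips the output truncation the real DRBG performs, and every parameter is a made-up value. The point is the algebra: anyone who knows the trapdoor `d` relating the two public constants `P` and `Q` can recover the generator's internal state from a single raw output.

```python
# Toy Dual-EC-style PRG (multiplicative group mod a prime standing in
# for the real elliptic curve; all parameters are hypothetical).
p = 2**61 - 1            # toy prime modulus
d = 123456789            # the trapdoor: P = Q^d mod p (NSA-only secret)
Q = 3
P = pow(Q, d, p)

def step(s):
    """One PRG step: next state derived from P, output derived from Q."""
    s_next = pow(P, s, p)
    out = pow(Q, s, p)   # the real Dual EC truncates this; omitted here
    return s_next, out

# An honest user generates some output:
s0 = 987654321
s1, r1 = step(s0)
s2, r2 = step(s1)

# An attacker who knows d recovers the internal state from one raw output,
# because r1^d = Q^(s0*d) = (Q^d)^s0 = P^s0 = s1:
recovered_s1 = pow(r1, d, p)
assert recovered_s1 == s1

# ...and can then predict every future output:
_, predicted_r2 = step(recovered_s1)
assert predicted_r2 == r2
```

Without `d`, recovering the state from an output is a discrete-log-hard problem; with it, it's one exponentiation. That is the whole "deniable" design.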
Not only did the NSA insert this backdoor into encryption standards, but they allegedly paid and pressured firms to implement it in their products. This includes major US security firms like RSA Security and Juniper. (That we know of!) 3/
In 2013, compelling evidence confirming the existence of this backdoor leaked out in the Snowden documents. We didn’t know quite how widely it had been implemented yet, but even then it was shocking. 4/
It would be such a terribly embarrassing story if it ended there. But it gets even worse. 5/
One of the companies that the US intel agencies allegedly convinced to use the backdoor was Juniper, whose NetScreen line of firewalls is widely deployed globally and in the US government. We didn’t know about this because the company omitted it from their certification documents. 6/
Even if we’d known about this, I’m sure “serious” folks would have vociferously argued that it’s no big deal because only the NSA could possibly exploit this vulnerability (it used a special secret only they could know), so (from a very US-centric PoV) why be a big downer? 7/
But the field is called computer security; not computer optimism. We think about worst case outcomes because if we don’t do that, our opponents absolutely will. 8/
In fact, they already had. What nobody had considered was that *even if the backdoor required a special secret key* only the NSA knows, a system with such a backdoor could be easily “rekeyed.” 9/
In practice this would simply mean hacking into a major firewall manufacturer’s poorly-secured source code repository, changing 32 bytes of data, and then waiting for the windfall when a huge number of VPN connections suddenly became easy to decrypt. And that’s what happened. 10/
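Here is a toy sketch of why the rekeying is so easy, again using the multiplicative group mod a prime as a stand-in for the real elliptic curve, with made-up parameters. The intruder never needs the NSA's secret: they pick their *own* trapdoor `e`, derive a matching constant `Q'`, and patch `Q'` over the old `Q` in the source code (the "32 bytes"). The generator runs exactly as before, but the intruder now holds the key.

```python
# Rekeying a Dual-EC-style backdoor (toy multiplicative-group analogue;
# all parameters hypothetical).
import math
import secrets

p = 2**61 - 1            # toy prime modulus
P = 5                    # the fixed point P from the standard (toy value)

# The intruder picks their OWN trapdoor e and derives a matching Q'
# satisfying Q'^e = P, i.e. Q' = P^(e^-1 mod p-1):
while True:
    e = secrets.randbelow(p - 2) + 2
    if math.gcd(e, p - 1) == 1:      # e must be invertible mod p-1
        break
Q_new = pow(P, pow(e, -1, p - 1), p)
assert pow(Q_new, e, p) == P

# Swapping Q for Q' in the firmware is the "32-byte change". The PRG's
# behavior is unchanged, but its raw output now leaks state to the intruder:
s = 42                               # some internal PRG state
r = pow(Q_new, s, p)                 # raw output visible on the wire
next_state = pow(P, s, p)            # what the PRG computes internally
assert pow(r, e, p) == next_state    # intruder recovers the state
```

Note there is no way to distinguish the original `Q` from the rekeyed `Q'` by inspection: both are just opaque constants, which is exactly what made the swap so hard to detect.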
The company was Juniper, the hack was in 2012. It is alleged (in this new reporting) to have been a Chinese group called APT 5. Untold numbers of corporate firewalls received the new backdoor, making both US and overseas systems vulnerable. 11/
The new, rekeyed backdoor remained in the NetScreen code for over *three years*, which is a shockingly long time. Eventually it was revealed around Christmas 2015. 12/
Fortunately we learned a lot from this. Everyone involved was fired and no longer works in the field of consumer-facing cryptography.

I’m kidding! Nobody was fired, it was hushed up, and everyone involved got a big promotion or lateral transfer to lucrative jobs in industry. 13/
The outcome of the Juniper hack remains hushed-up today. We still don’t know who the target was. (My pet theory based on timelines is that it was OPM, but I’m just throwing darts.) Presumably the FBI has an idea, and it’s bad enough that they’re keeping it quiet. 14/
The lesson for current events is simple: bad things happen. Don’t put backdoors in your system, no matter how cryptographically clever they look and how smart you think you are. They are vulnerabilities waiting for exploitation, and if the NSA wasn’t ready for it, you aren’t. 15/
The second lesson is that “serious” people are always inclined away from worst-case predictions. In bridge building and politics you can listen to those people. But computer security is adversarial: the conscious goal of attackers is to bring about worst-case outcomes. 16/
It is very hard for people to learn this lesson, by the way. We humans aren’t equipped for it. 17/
I want to say only two more slightly “inside baseball” things about Juniper and this reporting.

First, the inclusion of Dual EC into Juniper-NetScreen wasn’t as simple as the NSA calling the company up and asking them to implement “a NIST standard.” 18/
Juniper’s public certification documents don’t mention that Dual EC was even used in NetScreen products; they list another algorithm. The NetScreen Dual EC implementation is included *in addition* to the certified one, and without documentation. That stinks like cheese. 19/
And of course there is a very coincidental “oops” software vulnerability in the NetScreen code that allows the raw output of Dual EC to ooze out onto the wire, bypassing their official, documented algorithm. For more see: dl.acm.org/doi/pdf/10.114… 20/
I’ve told this story eight million times and it never ceases to amaze me that all this really happened, and all we’ve done about it is try to build more encryption backdoors. It makes me very, very tired. 21/21 fin
Addendum: the White House Press Secretary was asked about this story, and their answer is “please stop asking about this story.” h/t @jonathanmayer
