[HN Gopher] NSA, NIST, and post-quantum crypto: my second lawsui...
       ___________________________________________________________________
        
       NSA, NIST, and post-quantum crypto: my second lawsuit against the
       US government
        
       Author : trulyrandom
       Score  : 905 points
       Date   : 2022-08-05 19:07 UTC (1 day ago)
        
 (HTM) web link (blog.cr.yp.to)
 (TXT) w3m dump (blog.cr.yp.to)
        
       | er4hn wrote:
       | > The same people tend to have trouble grasping that most of the
       | vulnerabilities exploited and encouraged by NSA are also
       | exploitable by the Chinese government. These people start with
       | the assumption that Americans are the best at everything; ergo,
       | we're also the best at espionage. If the Chinese government stole
       | millions of personnel records from the U.S. government, records
       | easily usable as a springboard for further attacks, this can't
       | possibly be because the U.S. government made a policy decision to
       | keep our computer systems "weak enough to still permit an attack
       | of some nature using very sophisticated (and expensive)
       | techniques".
       | 
       | I'm not sure if I understand this part. I was under the
       | impression that the OPM hack was a result of poor authn and authz
       | controls, unrelated to cryptography. Was there a cryptography
       | component sourced somewhere?
        
         | danielheath wrote:
         | If, rather than hoarding offensive tools & spying, the NSA had
         | interpreted its mission as being to harden the security of
         | government infrastructure (surely even more firmly within the
         | remit of national security) and spent its considerable budget
         | in that direction, would authn and authz controls have been
         | used at the OPM?
        
         | woodruffw wrote:
         | This is my understanding as well. I asked this very same
         | question less than a week ago[1], and now it's the first Google
         | result when you search "OPM Dual_EC_DRBG."
         | 
         | The response to my comment covers some circumstantial evidence.
         | But I'm not personally convinced; human factors are a _much_
         | more parsimonious explanation.
         | 
         | [1]: https://news.ycombinator.com/item?id=32286528
        
       | [deleted]
        
       | throwaway654329 wrote:
        | The history of NSA and NIST cryptographic sabotage in this blog
        | post is excellently researched. It presents some hard-won truths
        | that many are uncomfortable discussing, let alone actively
        | resisting.
       | 
       | The author of the blog post is also well known for designing and
       | releasing many cryptographic systems as free software. There is a
       | good chance that your TLS connections are secured by some of
       | these designs.
       | 
       | One of his previous lawsuits was critical to practically
       | protecting free speech during the First Crypto War:
       | https://en.m.wikipedia.org/wiki/Bernstein_v._United_States
       | 
       | I hope he wins.
        
         | nimbius wrote:
          | the author was also part of the pushback against the NSA's
          | SPECK cipher; the ISO standardisation talks broke down in 2018
          | after the NSA stonewalled and hand-waved requests for
          | technical data and explanations.
          | 
          | Speck was briefly merged into the Linux kernel, then removed
          | after Google dropped its plans to use it.
         | 
         | https://en.m.wikipedia.org/wiki/Speck_(cipher)
        
           | ddingus wrote:
           | Interesting read!
        
         | aliqot wrote:
         | Given his track record, and the actual meat of this suit, I
         | think he has a good chance.
         | 
         | - He is an expert in the domain
         | 
         | - He made a lawful request
         | 
         | - He believes he's experiencing an obstruction of his rights
         | 
          | I don't see anything egregious here. Being critical of your
          | government is a protected right in the USA. Everyone gets a
          | moment to state their case if they'd like to make an
          | accusation.
         | 
         | Suing sounds offensive, but that is the official process for
         | submitting an issue that a government can understand and
         | address. I'm seeing some comments here that seem aghast at the
         | audacity to accuse the government at your own peril, and it
         | shows an ignorance of history.
        
           | trasz wrote:
           | >Being critical of your government is a protected right for
           | USA. Everyone gets a moment to state their case if they'd
           | like to make an accusation.
           | 
           | Unless a kangaroo "FISA court" says you can't - in which case
           | you're screwed, and can't even tell anyone about the
           | "sentence" if it included a gag order. Still better than
           | getting droned I suppose.
        
           | newsclues wrote:
           | Trump Card: National Security
        
             | CaliforniaKarl wrote:
             | That's a valid reason (specifically, 1.4(g) listed at
             | https://www.archives.gov/declassification/iscap/redaction-
              | co...). And while NIST returning such a response is
              | possible, it goes against their commitment to transparency.
             | 
             | But still, that requires a response, and there hasn't been
             | one.
        
               | [deleted]
        
             | Kubuxu wrote:
             | "National Security" response implies cooperation with NSA
             | and destroys NIST's credibility.
        
           | maerF0x0 wrote:
           | I'd add
           | 
            | * and it's been 20 yrs since the 9/11 attacks, which
            | precipitated a lot of the more recent dragnets
        
             | kevin_thibedeau wrote:
             | The dragnets existed before 9/11. That just gave
             | justification for even more funding.
        
               | throwaway654329 wrote:
               | Which programs do you mean specifically?
               | 
                | We know the nature of mass surveillance changed and
                | expanded immensely after 9/11, especially domestically.
        
               | KennyBlanken wrote:
               | Every piece of mail that passes through a high-speed
               | sorting machine is scanned, front and back, OCR'd, and
               | stored - as far as we know, indefinitely. That's how they
               | deliver the "what's coming in your mailbox" images you
               | can sign up to receive via email.
               | 
                | Those images very often show the contents of the envelope
                | clearly enough to recognize and even read them, which I'm
                | quite positive isn't an accident.
               | 
               | The USPS is literally reading and storing at least part
               | of nearly every letter mailed in the United States.
               | 
               | The USPS inspectors have a long history of being used as
               | a morality enforcement agency, so yes, this should be of
               | concern.
        
               | greyface- wrote:
               | Some more details: https://en.wikipedia.org/wiki/Mail_Iso
               | lation_Control_and_Tra...
        
               | throwaway654329 wrote:
               | Agreed. It's even worse: they also have the capability
               | with the "mail covers" program to divert and tamper with
               | mail. This happens to Americans on U.S. soil and I'm not
               | just talking about suspects of terrorism.
        
               | UpstandingUser wrote:
                | I've heard rumors that this was going on for a long time
                | before it was publicly acknowledged -- before OCR should
                | have been able to handle that sort of variety of
                | handwriting reliably, let alone at scale. Like a
                | snail-mail version of the NSA metadata collection
                | program.
        
               | nuclearnice1 wrote:
               | Wikipedia gives the impression the modern incarnation of
               | photographing the US mail at scale began in 2001:
               | "created in the aftermath of the 2001 anthrax attacks
               | that killed five people, including two postal workers"
               | [0]
               | 
               | However research on photographs of mail was already
               | taking place as far back as 1986 [1]
               | 
               | [0] https://en.m.wikipedia.org/wiki/Mail_Isolation_Contro
               | l_and_T...
               | 
               | [1] https://cedar.buffalo.edu/papers/articles/Framework_O
               | bject_1...
        
               | nuclearnice1 wrote:
                | Apparently not a pre-9/11 program, if Wikipedia is
               | correct.
               | 
               | https://en.m.wikipedia.org/wiki/Mail_Isolation_Control_an
               | d_T...
        
               | fanf2 wrote:
               | TFA says: _<<The European Parliament already issued a
               | 194-page "Report on the existence of a global system for
               | the interception of private and commercial communications
               | (ECHELON interception system)" in 2001>>_ (July 2001,
               | that is)
        
               | throwaway654329 wrote:
               | Yes, Duncan Campbell's report is legendary ( https://www.
               | duncancampbell.org/menu/surveillance/echelon/IC2... ).
               | This is the same guy who revealed the existence of GCHQ,
               | and was arrested for this gift to the public.
               | 
               | To clarify, I was asking them for their specific favorite
               | programs as they didn't indicate they only meant the ones
               | in the blog post.
        
               | michaelt wrote:
               | There was the Clipper Chip [2] and the super-weak 40-bit
               | 'export strength' cryptography [3] and the investigation
               | of PGP author Phil Zimmerman for 'munitions export
               | without a license' [4].
               | 
               | So there was a substantial effort to weaken cryptography,
               | decades before 9/11.
               | 
               | On the dragnet surveillance front, there have long been
               | rumours of things like ECHELON [1] being used for mass
               | surveillance and industrial espionage. And the simple
               | fact US spies were interested in weakening export SSL
               | rather implied, to a lot of people, they had easy access
               | to the ciphertext.
               | 
               | Of course, this was before so much stuff had moved
               | online, so it was a different world.
               | 
               | [1] https://en.wikipedia.org/wiki/ECHELON [2]
               | https://en.wikipedia.org/wiki/Clipper_chip [3] https://en
               | .wikipedia.org/wiki/Export_of_cryptography_from_th... [4]
               | https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Crimina
               | l_i...
        
               | throwaway654329 wrote:
               | > there was a substantial effort to weaken cryptography,
               | decades before 9/11.
               | 
               | Yes, agreed. Everyone should dig up their favorite
               | programs and discuss them openly.
        
             | feet wrote:
             | I'll also add
             | 
             | Which have not prevented anything and instead are used in
             | parallel construction to go after Americans
        
               | gene91 wrote:
                | I don't like the collateral damage of many policies. But
                | it's not fair to say that the policies "have not
                | prevented anything", because we simply don't know. The
                | policies could have stopped in-progress evil acts (but
                | they were never revealed to the public for intel reasons)
                | or prevented attempted evil acts (well, nothing happened,
                | nothing to report).
        
               | Quekid5 wrote:
               | One cannot prove a negative, but given how much public
               | recording of _everything_ there is these days (and in the
                | last decade+), I'd say it's safe to err on the side of
               | them not having prevented much of consequence. ("Absence
               | of evidence..." doesn't really apply when evidence
               | _should_ be ample for the phenomenon to be explained.)
        
               | colonwqbang wrote:
               | The bar for public policy should be set quite a bit
               | higher than "it could have done some good at some point,
               | maybe".
               | 
               | In comic books, we read fanciful stories about the good
               | guys saving the world in secret. But the real world
               | doesn't really work like that.
               | 
               | When the police seize some illegal drugs, what is the
               | first thing they do? They snap a picture and publish it
               | for society to see:
               | 
               | https://www.google.com/search?q=police+seize+drugs&tbm=is
               | ch
               | 
               | because citizens want to see that their tax money is
               | being used successfully. The same would likely be done by
               | the surveillance authorities if they saw significant
               | success in their mission.
        
               | feet wrote:
               | I find it rather funny that we know about the parallel
               | construction which they attempt to keep hidden, yet don't
               | know about any successful preventions. I would assume
               | they would at least want people to know if a program was
               | a success. To me, the lack of information speaks volumes
               | 
               | This is on top of all the entrapment that we also know
               | about, performed by the FBI and associated informants on
               | Islamic/Muslim communities
               | 
               | The purpose of a system is what it does
        
               | sweetbitter wrote:
               | Considering that they do not obey the law, if they had
               | actually stopped any terrorists we would be hearing all
               | about it from "anonymous leakers" by now.
        
               | [deleted]
        
               | maerF0x0 wrote:
               | It also could have stopped the Gods from smiting us all,
               | but there's no evidence that it has.
               | 
               | This article[1] is a good start at realizing the costs
               | outweigh the benefits. There's little or no evidence of
               | good caused, but plenty of evidence of harms caused.
               | 
               | [1]: https://www.eff.org/deeplinks/2014/06/top-5-claims-
               | defenders...
        
               | daniel-cussen wrote:
                | There is evidence of that, in fact. There were many
                | serious terrorist attacks in Europe in the aftermath of
                | 9/11, like the Madrid train bombings (191 dead) and
                | Frankfurt, and other... uh, how am I gonna say this...
                | other stuff; the Madrid attacks were initially blamed on
                | Basque nationalists, though not actually carried out by
                | them.
               | 
               | So there's your control group, Europe.
        
         | fossuser wrote:
          | I remember reading about this in Steven Levy's _Crypto_ and
          | elsewhere; there was a lot of internal arguing about this
          | stuff at the time and people had different opinions. I
          | remember that some of the changes NSA suggested to IBM
          | actually made DES stronger against a cryptanalysis attack
          | that was not yet publicly known (though at the time people
          | suspected the suggestions were meant to weaken it; the
          | attack only became publicly known later). I tried
         | to find the specific info about this, but can't remember the
         | details well enough. _Edit: I think it was this:_
         | https://en.wikipedia.org/wiki/Differential_cryptanalysis
         | 
         | They also did intentionally weaken a standard separately from
         | that and all the arguing about 'munitions export' intentionally
         | requiring weak keys etc. - all the 90s cryptowar stuff that
         | mostly ended after the clipper chip failure. They also worked
         | with IBM on DES, but some people internally at NSA were upset
         | that they shared this after the fact. The history is a lot more
         | mixed with a lot of people arguing about what the right thing
         | to do is and no general consensus on a lot of this stuff.
        
           | api wrote:
           | > I remember that some of the suggested changes from NSA
           | shared with IBM were actually stronger against a
           | cryptanalysis attack on DES that was not yet publicly known
           | 
           | So we have that and other examples of NSA apparently
           | strengthening crypto, then we have the dual-EC debacle and
           | some of the info in the Snowden leaks showing that they've
           | tried to weaken it.
           | 
           | I feel like any talk about NSA influence on NIST PQ or other
           | current algorithm development is just speculation unless
           | someone can turn up actual evidence one way or another. I can
           | think of reasons the NSA would try to strengthen it and
           | reasons they might try to weaken it, and they've done both in
           | the past. You can drive yourself nuts constructing infinitely
           | recursive what-if theories.
        
             | kmeisthax wrote:
             | The NSA wants "NOBUS" (NObody-But-US) backdoors. It is in
             | their interest to make a good show of fixing easily-
             | detected vulnerabilities while keeping their own
             | intentional ones a secret. The fantasy they are trying to
             | sell to politicians is that people can keep secrets from
             | other people but not from the government; that they can
             | make uncrackable safes that still open when presented with
             | a court warrant.
             | 
             | This isn't speculation either; Dual_EC_DRBG and its role as
             | a NOBUS backdoor was part of the Snowden document dump.
        
               | api wrote:
               | Here's the counter-argument that I've seen in
               | cryptography circles:
               | 
               | Dual EC, a PRNG built on an asymmetric crypto template,
                | was kind of a ham-fisted and obvious NOBUS back door. The
               | math behind it made such a backdoor entirely plausible.
               | 
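                | To make the trapdoor structure concrete, here is a toy
                | analogue in Python. It is only a sketch: it swaps the
                | elliptic-curve points of real Dual_EC_DRBG for plain
                | modular exponentiation and skips the output truncation,
                | but the shape is the same. The output is a group
                | element tied to a second generator, and whoever chose
                | that generator holds a value d that turns one output
                | into the next internal state.
                | 
                |   p = 2**31 - 1          # toy prime modulus
                |   g = 7                  # public "P" analogue
                |   e = 65537              # designer's secret
                |   d = pow(e, -1, p - 1)  # backdoor (Python 3.8+)
                |   h = pow(g, e, p)       # public "Q" analogue
                | 
                |   def toy_drbg(state, n):
                |       out = []
                |       for _ in range(n):
                |           state = pow(g, state, p)      # next state
                |           out.append(pow(h, state, p))  # output
                |       return out
                | 
                |   r1, r2 = toy_drbg(123456789, 2)
                | 
                |   # Attacker sees only r1 but knows d:
                |   # r1^d = g^(e*d*s1) = g^s1, i.e. the next state.
                |   s2 = pow(r1, d, p)
                |   assert pow(h, s2, p) == r2  # future output known
                | 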
               | That's less obvious in other cases.
               | 
               | Take the NIST ECC curves. If they're backdoored it means
               | the NSA knows something about ECC we don't know and
               | haven't discovered in the 20+ years since those curves
               | were developed. It also means the NSA was able to search
               | all ECC curves to find vulnerable curves using 1990s
               | technology. Multiple cryptographers have argued that if
               | this is true we should really consider leaving ECC
               | altogether. It means a significant proportion of ECC
               | curves may be problematic. It means for all we know
               | Curve25519 is a vulnerable curve given the fact that this
               | hypothetical vulnerability is based on math we don't
               | understand.
               | 
               | The same argument could apply to Speck:
               | 
               | https://en.wikipedia.org/wiki/Speck_(cipher)
               | 
               | Speck is incredibly simple with very few places a
               | "mystery constant" or other back door could be hidden. If
               | Speck is backdoored it means the NSA knows something
               | about ARX constructions that we don't know, and we have
               | no idea whether this mystery math also applies to ChaCha
                | or Blake or any of the other popular ARX constructions
               | gaining so much usage right now. That means if we
               | (hypothetically) knew for a fact that Speck was
                | backdoored _but not how it's backdoored_ it might make
               | sense to move away from ARX ciphers entirely. It might
               | mean many or all of them are not as secure as we think.
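                | 
                | To illustrate just how little room there is to hide
                | anything, here is a sketch of Speck128/128 in Python,
                | written from the published spec (64-bit words,
                | rotation constants 8 and 3, 32 rounds). Treat it as
                | illustrative; I haven't checked it against the
                | official test vectors here.
                | 
                |   M = (1 << 64) - 1
                |   ror = lambda x, r: ((x >> r) | (x << (64 - r))) & M
                |   rol = lambda x, r: ((x << r) | (x >> (64 - r))) & M
                | 
                |   def rnd(x, y, k):
                |       # an entire round: rotate, add, xor, rotate, xor
                |       x = ((ror(x, 8) + y) & M) ^ k
                |       y = rol(y, 3) ^ x
                |       return x, y
                | 
                |   def encrypt(x, y, l0, k0):  # Speck128/128
                |       ks, l, k = [], l0, k0
                |       for i in range(32):
                |           ks.append(k)
                |           l, k = rnd(l, k, i)  # schedule reuses rnd
                |       for k in ks:
                |           x, y = rnd(x, y, k)
                |       return x, y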
        
               | dhx wrote:
               | SM2 (Chinese), GOST (Russian) and NIST P (American)
                | parameters are "you'll just have to straight-up assume
                | these are something-up-our-sleeve numbers".
               | 
               | ECGDSA/brainpool (German) and ECKCDSA (Korean) standards
               | make an attempt to explain how they chose recommended
               | parameters but at least for brainpool parameters, the
               | justifications fall short.
               | 
                | The DiSSECT[1] project, published this year, is an
                | excellent approach to estimating whether the selected
                | parameters (often given without justification) are
                | suspicious.
               | GOST parameters were found to be particularly suspicious.
               | 
               | I wonder if a similar project could be viable for
               | assessing parameters of other types of cryptographic
               | algorithms e.g. Rijndael S-box vs. SM4 S-box selection?
               | 
               | [1] https://dissect.crocs.fi.muni.cz/
        
               | throwaway654329 wrote:
                | Regarding Simon and Speck: one simple answer is that
                | complicated attacks may exist, and simple attacks
                | certainly exist for the smaller block and key sizes.
               | 
               | However, it's really not necessary to have a backdoor in
               | ARX designs directly when they're using key sizes such as
               | 64, 72, 96, 128, 144, 192 or 256 bits with block sizes of
               | 32, 48, 64, 96 or 128 bits. Especially so if quantum
               | computers arrive while these ciphers are still deployed.
               | Their largest block sizes are the smallest available for
               | other block ciphers. The three smallest block sizes
               | listed are laughable.
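                | 
                | Rough birthday-bound arithmetic (mine, back of the
                | envelope) shows why: with an n-bit block you expect
                | repeated ciphertext blocks, and therefore leakage in
                | common modes, after roughly 2^(n/2) blocks under one
                | key.
                | 
                |   for n in (32, 48, 64, 128):
                |       blocks = 2 ** (n // 2)
                |       gib = blocks * (n // 8) / 2**30
                |       # the 64-bit row is the Sweet32 scenario
                |       print(f"{n:3}-bit block: ~2^{n//2} blocks"
                |             f" ~ {gib:g} GiB")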
               | 
                | They have larger key sizes specified on the upper end.
                | Consider that if the smaller keys are "good enough for
                | NSA", they will be used and exploited _in practice_. Not
                | all bits are equal either: Simon's or Speck's 128 bits
                | are doubtfully as strong as AES's 128 bits, certainly
                | with half the block size. It also doesn't inspire
               | confidence that AES had rounds removed and that the AES
               | 256 block size is... 128 bits. Suite A cryptography
               | probably doesn't include a lot of 32 bit block sizes.
               | Indeed BATON supposedly bottoms out at 96 bits. One block
               | size for me, another for thee?
               | 
               | In a conversation with an author of Speck at FSE 2015, he
               | stated that for some systems only a few minutes of
               | confidentiality was really required. This was said
               | openly!
               | 
               | This is consistent in my view with NSA again
               | intentionally pushing crypto that can be broken in
               | certain conditions to their benefit. This can probably be
                | practically exploited through brute force with their
               | computational resources.
               | 
               | Many symmetric cryptographers literally laugh at the NSA
               | designs and at their attempts at papers justifying their
               | designs.
               | 
                | Regarding NIST curves, the SafeCurves project shows that
                | implementing them safely is difficult. That doesn't seem
                | like an accident to me, but perhaps I am too cynical?
                | Side channels are probably enough for targeted breaks.
                | NIST-standardized ECC designs don't need to be broken in
                | ways that cryptographers respect - they just need to
                | work for NSA's needs.
        
               | throwaway654329 wrote:
               | NSA doesn't want NOBUS, they're not a person.
               | 
               | NSA leadership has policies to propose and promote the
               | NOBUS dream. Even with Dual_EC_DRBG, the claims of NOBUS
               | were incredibly arrogant. Just ask Juniper and OPM how
               | that NOBUS business worked out. The NSA leadership wants
               | privileged access and data at nearly any cost. The
               | leadership additionally want you to believe that they
               | want NOBUS for special, even exceptional cases. In
               | reality they want bulk data, and they want it even if the
               | NOBUS promises can fail open.
               | 
                | Don't believe the hype: security is hard enough, and
                | NOBUS relies on so many assumptions that it's a comedy.
                | We know about Snowden because he went public; does anyone
                | think we, the public, would learn if important keys were
                | compromised via their backdoors? It seems extremely
                | doubtful that even the IG would learn, even if NSA itself
                | could discover it in all cases.
        
             | fossuser wrote:
              | I think it's just both. It's a giant organization of people
              | arguing in favor of different things at different times
              | over its history; I'd guess there's internal disagreement.
              | Some argue it's critical to secure encryption (I agree with
              | this camp); others want to be able to break it for
              | offensive reasons, despite the problems that causes.
             | 
             | Since we only see the occasional stuff that's unclassified
             | we don't really know the details and those who do can't
             | share them.
        
               | throwaway654329 wrote:
               | There are plenty of leaked classified documents from NSA
               | (and others) that have been verified as legitimate. Many
               | people working in public know stuff that hasn't been
               | published in full.
               | 
               | Here is one example with documents:
               | https://www.spiegel.de/international/world/the-nsa-uses-
               | powe...
               | 
               | Here is another:
               | https://www.spiegel.de/international/germany/inside-the-
               | nsa-...
               | 
               | Please read each and every classified document published
               | alongside those two stories. I think you may revise your
               | comments afterwards.
        
           | throwaway654329 wrote:
           | You are not accurately reflecting the history that is
           | presented in the very blog post we are discussing.
           | 
           | NSA made DES weaker for _everyone_ by reducing the key size.
            | IBM happily went along. The history of IBM is dark. The
            | NSA-credited tweaks to DES can be understood as ensuring
            | that _a weakened DES stayed deployed longer_, which was
            | to their advantage. They clearly explain this in the history
            | quoted by
           | the author:
           | 
           | "Narrowing the encryption problem to a single, influential
           | algorithm might drive out competitors, and that would reduce
           | the field that NSA had to be concerned about. Could a public
           | encryption standard be made secure enough to protect against
           | everything but a massive brute force attack, but weak enough
           | to still permit an attack of some nature using very
           | sophisticated (and expensive) techniques?"
           | 
           | They're not internally conflicted. They're strategic
           | saboteurs.
        
             | fossuser wrote:
             | > "NSA credited tweaks to DES can be understood as ensuring
             | that a weakened DES stayed deployed longer which was to
             | their advantage. They clearly explain this in the history
             | quoted by the author"
             | 
              | I'm not sure I buy that this follows; wouldn't the weakened
              | key size also make people not want to deploy it, given that
              | known weakness? To me it reads more like some people wanted
              | a weak key so NSA could still break it, other people wanted
              | it stronger against differential cryptanalysis attacks, and
              | the two aren't really related. It also came across that way
              | in Levy's book, where they were arguing about whether they
              | should or should not engage with IBM at all.
        
               | throwaway654329 wrote:
               | It follows: entire industries were required to deploy DES
               | and the goal was to create one thing that was "strong
               | enough" to narrow the field.
               | 
               | Read the blog post carefully about the role of NBS, IBM,
               | and NSA in the development of DES.
               | 
               | It's hard to accept because the implications are
               | upsetting and profound. The evidence is clear and
               | convincing. Lots of people try to muddy the waters, don't
               | help them please.
        
               | fossuser wrote:
               | They had a privately known way to weaken DES that
               | effectively shortens the key length. They could have
               | pretended to allow a longer key length while secretly
               | retaining their privately known attack that lets them
               | shorten it (without also acting to strengthen DES against
               | it). They knew this in the 70s _20 years_ before it would
               | become publicly known. They actively strengthened DES
                | against this while not revealing the exploit. Doing this
                | secretly doesn't narrow the field (doing it publicly
                | might have), and it's also inconsistent with their
                | argument for short keys.
               | 
               | I read the blog post and I've read a lot about the
               | history of this - what you're saying isn't really
                | convincing. Often people I mostly agree with, maybe 90%,
                | take it to the extreme where everything must fit their
                | world view 100%. Rarely, imo, is that the case; reality
                | is usually more mixed.
               | 
               | If they're related maybe they wanted DES to be strong so
               | they could use it, but wanted the public to only have
               | access to short keys so they could also break the
               | public's use of it. Still, it's interesting they didn't
               | leave in a weakness they could exploit secretly despite a
               | longer key size.
               | 
               |  _edited for clarity_
        
               | throwaway654329 wrote:
               | You're making a lot of assumptions and guesses to imply
               | they helped overall when we know they weakened DES by
               | reducing the key size such that it was practically
                | breakable as a hobby project. At the time of DES's
                | creation, Hellman remarked that this was a bad enough
                | problem that it should have been fixed by raising the key
                | size. NSA, IBM, and others ignored the cryptographers who
                | were not compromised. Any benefit against differential
                | cryptanalysis seems clearly like a hedge against DES
                | being replaced sooner and against known adversary
                | capabilities. When did the
               | Russians learn that technique? Probably before the public
               | did, I would wager.
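                | 
                | For scale (the key-search rate below is rough, my own
                | number): a 56-bit keyspace is about 7.2e16 keys, and
                | EFF's 1998 Deep Crack machine searched on the order
                | of 9e10 keys per second.
                | 
                |   keys = 2 ** 56               # ~7.2e16 keys
                |   rate = 9e10                  # ~Deep Crack, 1998
                |   days = keys / rate / 86400
                |   print(f"56-bit sweep: ~{days:.0f} days")
                |   # the same machine vs. 2**64: ~2400 days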
               | 
               | The longer DES stays, the longer NSA retain their
               | capabilities. Any design changes made by NSA are for
               | their benefit first. That's the primary lesson from my
               | perspective.
        
               | fossuser wrote:
                | I don't think they helped overall; I'd agree that on net
                | they acted to make things less secure by arguing for the
                | small key sizes. We mostly agree. I just think
                | strengthening public DES against a security issue that
                | was not public at the time is an interesting example of
                | them doing the opposite of inserting a backdoor: people
                | were afraid their suggestions were weakening DES, but
                | they were strengthening it. That, paired with the
                | history, suggests some internal arguing about priorities.
        
             | bragr wrote:
             | >IBM happily went along. The history of IBM is dark.
             | 
              | Then, as now, I'm confused why people expect these kinds
              | of problems to be solved by corporations "doing the right
              | thing" rather than by demanding some kind of real
              | legislative reform.
        
               | mensetmanusman wrote:
               | Is there a third option beyond government and
               | corporations?
        
               | revscat wrote:
               | Libertarian and capitalist propaganda. The answer is
               | always a variation of "if you don't like it, don't buy
               | it/let the market decide." Even if the "market" heads
               | towards apocalypse.
        
               | throwaway654329 wrote:
               | Agreed. It can be both but historically companies
               | generally do the sabotage upon request, if not
               | preemptively. This hasn't changed much at all in favor of
               | protecting regular users, except maybe with the expansion
               | of HTTPS, and a few other exceptions.
        
         | matthewmcg wrote:
          | Right, I came here to make the same point. The first lawsuit
         | alluded to in the blog post title resulted in an important
         | holding that source code can be protected free expression.
        
         | [deleted]
        
           | [deleted]
        
       | sigil wrote:
       | Near the end of the post - after 50 years of axe grinding - djb
       | does eventually get to the point wrt pqcrypto. I find the below
       | excerpt particularly damning. Why not wrap nascent pqcrypto in
       | classical crypto? Suspect!
       | 
       | --
       | 
       | The general view today is that of course post-quantum
       | cryptography should be an extra layer on top of well-established
       | pre-quantum cryptography. As the French government cybersecurity
        | agency (Agence nationale de la sécurité des systèmes
       | d'information, ANSSI) put it at the end of 2021:
       | 
       |  _Acknowledging the immaturity of PQC is important: ANSSI will
       | not endorse any direct drop-in replacement of currently used
       | algorithms in the short /medium term. However, this immaturity
       | should not serve as an argument for postponing the first
       | deployments. ANSSI encourages all industries to progress towards
       | an initiation of a gradual overlap transition in order to
       | progressively increase trust on the post-quantum algorithms and
       | their implementations while ensuring no security regression as
       | far as classical (pre-quantum) security is concerned. ..._
       | 
       |  _Given that most post-quantum algorithms involve message sizes
       | much larger than the current pre-quantum schemes, the extra
       | performance cost of an hybrid scheme remains low in comparison
       | with the cost of the underlying post-quantum scheme. ANSSI
       | believes that this is a reasonable price to pay for guaranteeing
       | an additional pre-quantum security at least equivalent to the one
       | provided by current pre-quantum standardized algorithms._
       | 
       | But NSA has a different position: it says that it "does not
       | expect to approve" hybrids. Publicly, NSA justifies this by
       | 
       | - pointing to a fringe case where a careless effort to add an
       | extra security layer damaged security, and
       | 
       | - expressing "confidence in the NIST PQC process".
       | 
       | Does that mean the original NISTPQC process, or the current
       | NISTPQC process in which NIST, evidently surprised by attacks,
       | announced plans to call for new submissions?
       | 
       | Of course, if NSA/IDA have secretly developed an attack that
       | works for a particular type of post-quantum cryptosystem, then it
       | makes sense that they'd want people to start using that type of
       | cryptosystem and turn off the existing pre-quantum cryptosystem.
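        | 
        | For what it's worth, the hybrid construction ANSSI describes is
        | not exotic. A minimal sketch (placeholder names, mine, not any
        | particular library's API): derive the session key from both
        | shared secrets, so an attacker has to break the classical
        | exchange AND the post-quantum KEM to recover it.
        | 
        |   import hashlib
        | 
        |   def hybrid_key(classical_ss, pq_ss, transcript):
        |       # stays secret as long as EITHER input does
        |       return hashlib.sha512(
        |           classical_ss + pq_ss + transcript).digest()
        | 
        | This is essentially the idea behind OpenSSH's
        | sntrup761x25519-sha512 key exchange as well.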
        
         | tptacek wrote:
         | This is the least compelling argument Bernstein makes in the
         | whole post, because it's simply not the job of the NIST PQC
         | program to design or recommend hybrid classical/PQC schemes. Is
         | it fucky and weird if NSA later decides to recommend against
         | people using hybrid key establishment? Yes. Nobody should
         | listen to NSA about that, or anything else. But NIST ran a PQC
         | KEM and signature contest, not a secure transport
         | standardization. Sir, this is a Wendy's.
        
           | sigil wrote:
           | It's compelling in context. If the NSA influenced NIST
           | standards 3x in the past -- DES, DSA, Dual EC -- then
           | shouldn't we be on high alert this 4th time around?
           | 
           | That NSA is _already_ recommending against hybrid, instead of
           | waiting for the contest results, _might_ signal they've once
           | again managed to game the standardization process itself.
           | 
           | At the very least -- given the exhaustive history in this
           | post -- you'd like to know what interactions NSA and NIST
           | have had this time around. Thus, djb's FOIA. And thus the
           | lawsuit when the FOIA went unanswered. It all seems very
           | reasonable to me.
           | 
           | What's that old saying, "fool me thrice..."?
        
             | tptacek wrote:
             | Everybody is on high alert. Being on high alert doesn't
             | make Bernstein right.
             | 
             | I don't even support the premise of NIST crypto
             | standardization, let alone trust them to do it.
        
       | thorwayham wrote:
       | dig @1.1.1.1 blog.cr.yp.to is failing for me, but 8.8.8.8 works.
       | Annoying!
        
       | ris wrote:
       | There are ways of writing that make one look _less_ like a
       | paranoid conspiracy theorist.
        
       | jcranmer wrote:
       | If anyone is curious, the courtlistener link for the lawsuit is
       | here: https://www.courtlistener.com/docket/64872195/bernstein-v-
       | na...
       | 
       | (And somebody has already kindly uploaded the documents to RECAP,
       | so it costs you nothing to access.)
       | 
       | Aside: I really wish people would link to court documents
       | whenever they talk about an ongoing lawsuit.
        
         | Natsu wrote:
         | > Aside: I really wish people would link to court documents
         | whenever they talk about an ongoing lawsuit.
         | 
         | I just want to second that and thank you for the link. Most
          | reporting is just horribly bad at covering legal stuff,
          | because the stuff that makes clickable headlines is mostly
          | nonsense.
        
           | AndyMcConachie wrote:
           | And a big thank you to the wonderful people at the Free Law
           | Project for giving us the ability to find and link to this
           | stuff. They're a non-profit and they accept donations. (hint
           | hint)
        
         | tptacek wrote:
         | It's just a vanilla FOIA lawsuit, of the kind hundreds of
         | people file every month when public bodies fuck up FOIA.
         | 
         | If NIST puts up any kind of fight (I don't know why they
         | would), it'll be fun to watch Matt and Wayne, you know, win a
         | FOIA case. There's a lot of nerd utility in knowing more about
         | how FOIA works!
         | 
         | But you're not going to get the secrets of the Kennedy
         | assassination by reading this thing.
        
           | chasil wrote:
           | I will draw to your attention two interesting facts.
           | 
            | First, OpenSSH has disregarded the winning (CRYSTALS)
            | variants and implemented hybrid NTRU Prime. The Bernstein
            | blog post discusses hybrid designs.
           | 
           | "Use the hybrid Streamlined NTRU Prime + x25519 key exchange
           | method by default ("sntrup761x25519-sha512@openssh.com"). The
           | NTRU algorithm is believed to resist attacks enabled by
           | future quantum computers and is paired with the X25519 ECDH
           | key exchange (the previous default) as a backstop against any
           | weaknesses in NTRU Prime that may be discovered in the
           | future. The combination ensures that the hybrid exchange
           | offers at least as good security as the status quo."
           | 
           | https://www.openssh.com/releasenotes.html
           | 
            | Second, Daniel Bernstein has filed a public complaint against
            | the NIST process, and the FOIA stonewalling adds further
            | concern and doubt about whether the current results are fair.
           | 
           | https://www.google.com/url?q=https://groups.google.com/a/lis.
           | ..
           | 
           | What are the aims of the lawsuit? Can the NIST decision on
           | crystals be overturned by the court, and is that the goal?
        
             | Thorrez wrote:
             | >What are the aims of the lawsuit? Can the NIST decision on
             | crystals be overturned by the court, and is that the goal?
             | 
             | It sounds to me like the goal is to find out if there's any
             | evidence of the NSA adding weaknesses into any of the
             | algorithms. That information would allow people to avoid
             | using those algorithms.
        
               | thalassophobia wrote:
                | I think the fact that NIST refuses to give any
                | information on that is enough evidence in and of itself.
        
               | tptacek wrote:
               | The town I live in just outside of Chicago refused to
               | disclose their police General Orders to me; I had to
               | engage the same attorneys Bernstein did to get them. What
               | can I infer from their refusal? That the General Orders
               | include their instructions from the Lizard People
               | overlords?
        
             | djmdjm wrote:
             | We (OpenSSH) haven't "disregarded" the winning variants, we
             | added NTRU before the standardisation process was finished
             | and we'll almost certainly add the NIST finalists fairly
             | soon.
        
               | chasil wrote:
               | I will eagerly await the new kex and keytypes, and will
               | be sure to sysupgrade.
               | 
               | I will be _very_ curious if the default kex shifts away
               | from NTRU-Prime.
               | 
                | I might also point out that CRYSTALS-Kyber was coequal to
                | NTRU Prime at the time that you set your new default kex.
               | 
               | I trust that the changelog will have a detailed
               | explanation of all the changes that you will make, and
               | why.
               | 
               | I will "ssh-rotate" whatever you decide.
               | 
               | https://www.linuxjournal.com/content/ssh-key-rotation-
               | posix-...
        
             | LinuxBender wrote:
             | It's not the first time either and it won't be the last.
              | NIST chose Rijndael over Serpent for the AES standard even
              | though Serpent was arguably the more conservative design. I
              | vaguely recall they gave some smarmy answer. I don't think
              | anyone submitted a FOIA, not that it would matter. I've
              | been through that bloated semi-pseudo process and saw how
              | easy it was to stall people and never answer a simple
              | question.
        
               | bannable wrote:
               | Rijndael was selected over Serpent for performance
               | reasons.
        
               | chasil wrote:
               | This is what I know; wish I knew more.
               | 
                | Rijndael won due to software performance.
               | 
               | https://www.moserware.com/2009/09/stick-figure-guide-to-
               | adva...
        
             | tptacek wrote:
             | What are the aims of the lawsuit? NIST fucked up a FOIA
             | response. The thing you do when a public body gives you an
             | unsatisfactory FOIA response is that you sue them. I've
             | been involved in similar suits. I'd be surprised if NIST
             | doesn't just cough up the documents to make this go away.
             | 
             | "Can NIST's decisions on crystals be overturned by the
             | court?" Let me help you out with that: no, you can't use a
             | FOIA suit to "overturn" a NIST contest.
             | 
              | OpenSSH implemented NTRU Prime? What's your point? That we
             | should just do whatever the OpenSSH team decides to do? I
             | almost agree! But then, if that's the case, none of this
             | matters.
        
               | adrian_b wrote:
               | I assume that the point was that NSA is against using
               | hybrid algorithms like the one used by OpenSSH, which
               | combine a traditional algorithm with a post-quantum
               | algorithm, arguing that using both algorithms is an
               | unnecessary complication.
               | 
               | The position of D. J. Bernstein and also of the OpenSSH
               | team is that the prudent approach is to use only hybrid
               | algorithms until enough experience is gained with the
               | post-quantum algorithms, to be reasonably certain that
               | they are secure against the possible attacks.
               | 
                | If the documents requested through FOIA are obtained,
                | they are expected to support the view that the NSA's
                | recommendations should be ignored: the NSA has a very
                | long history of trying to convince the public that
                | certain cryptographic algorithms are secure enough even
                | while aware of exploitable weaknesses in them, because it
                | was in the NSA's interest that everybody else use those
                | algorithms.
               | 
               | As explained at the linked Web page, in the past NSA has
               | forced the standardization of algorithms that had too
               | short keys, i.e. DES and DSA, and has made partially-
               | successful attempts to standardize back-doored algorithms
               | like Clipper and their infamous random bit generator.
               | 
               | Similarly now, they want to enforce the use of only the
               | post-quantum winning algorithm, without the additional
               | protection of combining it with a traditional algorithm.
        
               | tptacek wrote:
                | Fucking _everybody's_ position is to combine classical
               | key exchanges with PQC KEMs. It wasn't NIST's job to
               | standardize a classical+PQC construction. The point of
               | the contest is to figure out which PQC constructions to
               | use. NIST also didn't recommend that everyone implement
               | their cryptographic handshakes in a memory-safe language.
               | But not doing that is going to get a bunch of people
               | owned by NSA too. Do you see how silly this argument is?
        
               | adrian_b wrote:
               | Of course it was not NIST's job to standardize a hybrid
               | algorithm and nobody claims such a thing.
               | 
               | However the silly position is that of the NSA, as shown
               | in
               | 
               | https://web.archive.org/web/20220529202244im_/https://pbs
               | .tw...
               | 
               | which attempts to strongly discourage the use of any
               | "crypto redundancy" and says that they will not approve
               | such algorithms.
        
               | tptacek wrote:
               | Obviously people do claim that the NIST contest is
               | suspect because it doesn't approve hybrid schemes; there
               | are people who claim it on this thread.
        
               | chasil wrote:
               | Ostensibly, nistpqc is about finding safe crypto, first
               | for TLS, second for ssh. You will argue differently, but
               | we all expect the same end product.
               | 
               | NIST has specifically asked for guidance on hybrid crypto
               | (as well you know), as I documented elsewhere on this
               | page.
               | 
               | You assert that NIST only accepts pure post-quantum
               | crypto. They ask for hybrid.
               | 
               | Color me jaded.
               | 
               | EDIT: Just for you, my fine fellow!
               | 
               | 'For example, in email to pqc-forum dated 30 Oct 2019
               | 15:38:10 +0000 (2019), NIST posted technical comments
               | regarding hybrid encryption modes and asked for feedback
               | "either here on the pqc-forum or by contacting us at pqc-
               | comments@nist.gov" (emphasis added).'
               | 
               | https://www.google.com/url?q=https://groups.google.com/a/
               | lis...
        
               | tptacek wrote:
               | Thanks, it's nice not to have to link somewhere deep into
               | the thread to support the point I just made.
        
       | kvetching wrote:
        
         | dang wrote:
         | We ban accounts that post like this, so please don't.
         | 
         | https://news.ycombinator.com/newsguidelines.html
        
         | [deleted]
        
       | xenophonf wrote:
       | Good god, this guy is a bad communicator. Bottom line up front:
       | 
       | > _NIST has produced zero records in response to this [March
       | 2022] FOIA request [to determine whether /how NSA may have
       | influenced NIST's Post-Quantum Cryptography Standardization
       | Project]. Civil-rights firm Loevy & Loevy has now filed suit on
       | my behalf in federal court, the United States District Court for
       | the District of Columbia, to force NIST to comply with the law._
       | 
       | Edit: Yes, I know who DJB is.
        
         | jcranmer wrote:
         | That is truly burying the lede...
         | 
         | I spent most of the post asking myself "okay, I'm guessing this
         | is something about post-quantum crypto, but _what_ are you
         | actually suing about? "
        
         | [deleted]
        
         | kube-system wrote:
         | Well, he is an expert in cryptic communication
        
       | lizardactivist wrote:
        | He's an expert, prominent, and someone the whole cryptography
        | community listens to, and he calls out the lies, crimes, and
        | blatant hypocrisy of his own government.
       | 
       | I genuinely fear that he will be suicided one of these days.
        
         | ok_dad wrote:
         | I think the United States is more about charging people with
         | crimes and ruining their lives that way rather than
         | disappearing people. Russia might kill you with Polonium and
         | make sure everyone knows it, but America will straight up
         | "legally" torture you in prison via several means and then
         | argue successfully that those methods were legal and convince
         | the world you weren't tortured. Anyone who's a target for that
         | treatment, though, knows that's a lie.
        
           | dmix wrote:
           | The FBI will just interview you over whatever and then charge
           | you for lying to a federal agent or dig up some other
           | unrelated dirt. While the original investigation gets
           | mysteriously dropped a year later.
        
           | danuker wrote:
           | McAfee and Epstein pop to mind. Maybe also Aaron Swartz.
        
             | oittaa wrote:
             | It seems silly to me how so many people immediately dismiss
             | anyone even suggesting that something fishy was going on
             | with those cases, when we already know about MKUltra,
              | the Tuskegee experiment, etc.
        
             | discordance wrote:
             | Assange too.
        
               | danuker wrote:
               | Not yet. Maybe he will survive to come out the other end,
               | like Chelsea Manning.
        
       | dmix wrote:
       | This is one hell of a well written argument.
        
       | gred wrote:
       | This guy is the best kind of curmudgeon. I love it.
        
       | bumper_crop wrote:
        | This definitely has the sting of bitterness in it; I doubt djb
        | would have filed this suit if NTRU Prime had won the NIST PQC
        | contest. It's hard to evaluate this objectively when there are
        | strong emotions involved.
        
         | [deleted]
        
         | cosmiccatnap wrote:
         | It's funny how often the bitterness of a post is used as an
          | excuse to dismiss the long and well-documented case being made.
        
           | bumper_crop wrote:
           | If NTRU Prime had been declared the winner, would this suit
           | have been filed? It's the same contest, same people, same
           | suspicious behavior from NIST. I don't think this suit would
           | have come up. djb is filing this suit because of alleged bad
           | behavior, but I have doubts that it's the real reason.
        
             | throwaway654329 wrote:
             | Yes, I think so. His former PhD students were among the
             | winners in round three and he has other work that has also
              | made it to round four. I believe he would have sued even if
              | he had won every single area in every round. This is the
              | Bernstein way.
             | 
             | The behavior in question by NIST isn't just alleged - look
             | at the FOIA ( https://www.muckrock.com/foi/united-states-
             | of-america-10/nsa... ). They're not responding in a
             | reasonable or timely manner.
             | 
             | Does that seem like reasonable behavior by NIST to you?
             | 
             | To my eyes, it is completely unacceptable behavior by NIST,
             | especially given the time-sensitive nature of the standardization
             | process. They don't even understand the fee structure
             | correctly; it's a comedy of incompetence with NIST.
             | 
             | His FOIA predates the round three announcement. His lawsuit
             | was filed in a timely manner, and it appears that he filed
             | it fairly quickly. Many requesters wait much longer before
             | filing suit.
        
         | pixl97 wrote:
         | When it comes to the number of times DJB is right versus the
         | number of times that DJB is wrong, I'll fully back DJB. Simply
         | put, the NSA/NIST cannot and should not be trusted in this case.
        
           | bumper_crop wrote:
           | You misread. I'm saying his reasons for filing are in
           | question. NIST probably was being dishonest. That's not the
           | reason there is a lawsuit though.
        
             | throwaway654329 wrote:
             | They're not in question for many people carefully tracking
             | this process. He filed his FOIA before the round three
             | results were announced.
             | 
             | The lawsuit is because they refused to answer his
             | reasonable and important FOIA in a timely manner. This is
             | not unlike how they also delayed the round three
             | announcement.
        
       | lawrenceyan wrote:
       | Here's an interesting question. Even if post-quantum cryptography
       | is securely implemented, doesn't the advent of neurotechnology
       | (BCIs, etc.) make that method of security obsolete?
       | 
       | With read and write capability to the brain, assuming this comes
       | to fruition at some point, encryption as we know it won't work
       | anymore. But I don't know, maybe this isn't something we have to
       | worry about just yet.
        
         | Banana699 wrote:
         | The thing you're missing is that BCIs and friends are,
         | themselves, computers, and thus securable with post-quantum
         | cryptography, or any cryptography for that matter, or any means
         | of securing a computer. And thus, for somebody to read-write to
         | your computers, they need to read-write to your brain(s), but
         | to read-write to your brain(s), they need to read-write to the
         | computers implanted in your brain(s). It's a security cycle
         | whose overall power is determined by the least-secure element
         | in the chain.
         | 
         | Any sane person will also not touch BCIs and similar technology
         | with a 100 lightyear pole unless the designing company reveals
         | every single fucking silicon atom in the hardware design and
         | every single fucking bit in the software stack at every level
         | of abstraction, and ships the device with several redundant
         | watchdogs and dead-man timers around it that can safely kill or
         | faraday-cage the implant on user-defined events or manually.
         | 
         | Alas, humans are very rarely sane, and I come to the era of bio
         | hacking (in all senses of the word) with low expectations.
        
         | yjftsjthsd-h wrote:
         | The encryption is fine, that's just a way to avoid it. Much
         | like how tire-iron attacks don't _break_ passwords so much as
         | bypass them.
        
           | lawrenceyan wrote:
           | Ok that's actually a great point. To make the comparison:
           | 
           | Tire-irons require physical proximity. And torture generally
           | doesn't work, at least in the case of getting a private key.
           | 
           | Reading/writing to the brain, on the other hand, requires no
           | physical proximity if wireless. And the person(s) won't even
           | know it's happening.
           | 
           | These seem like totally different paradigms to me.
        
             | ziddoap wrote:
             | I think we are a _long_ way away from being able to
             | wirelessly read a few specific bytes of data from the brain
             | of an unknowing person. Far enough away that I'm not sure
             | it's productive to begin thinking of how to design
             | encryption systems around it.
        
               | lawrenceyan wrote:
               | Memory and experience aren't encoded in the brain the way
               | they are in traditional computers. There's no concept of a "byte"
               | when thinking about the human computational model.
        
               | ziddoap wrote:
               | There is the concept of "byte" when talking about a
               | string of characters which make up a password, though,
               | which is why I said bytes. But yes, I am aware, and your
               | statement just further supports my point.
        
               | pferde wrote:
               | Not necessarily. A person could remember a password that
               | contains the name of a loved one differently in their brain
               | than some arbitrary string of letters and numbers. Those
               | letters and numbers can each be "encoded" differently in
               | their brain - e.g. maybe the letter 'S' is linked in their
               | brain to snakes because it kind of looks like one. Or any
               | kind of weird connection between certain parts of the
               | password and a smell they smelled twenty years ago. This
               | would all deeply affect how the actual string of characters
               | is "stored" in the brain.
               | 
               | Yes, after you'd extract the password from their brain,
               | you would then convert it to a string of bytes and store
               | it on your digital storage device, but you were talking
               | about accessing data in a human brain.
               | 
               | The point is, the human brain is weird when looked at from
               | the point of view of data storage. :)
        
               | ziddoap wrote:
               | > _[...] but you were talking about accessing data in a
               | human brain._
               | 
               | No, I wasn't. I used bytes as a _unit of measurement_ of
               | data. I guess if I said  "characters" instead of "bytes"
               | people would stop trying to explain this to me. Although
               | I sort of doubt that, because I said "yes, I know" and
               | then get another paragraph explaining the same thing to
               | me.
        
             | aaaaaaaaata wrote:
             | > And torture generally doesn't work, at least in the case
             | of getting a private key.
             | 
             | This seems incorrect.
        
             | PaulDavisThe1st wrote:
             | > torture generally doesn't work, at least in the case of
             | getting a private key.
             | 
             | Why not?
        
               | fudgefactorfive wrote:
               | You can beat me to a pulp, doesn't make me suddenly
               | remember a specific N byte string any faster.
               | 
               | Passwords are to be remembered, private keys are to be
               | stored. I suppose I'll tell you where it's stored, but
               | often even that doesn't help. (E.g. It's on a USB key I
               | didn't label and lost, or this is totally the admin pin
               | to my smartcard, ok you got me these 3 are the real pins,
               | uh oh it's physically wiped itself? Sad face for you)
        
               | [deleted]
        
         | [deleted]
        
         | lysergia wrote:
         | Yeah I've even had very personal dreams where my Linux root
         | password was spoken in the dream. I'm glad I don't talk in my
         | sleep. There are also truth serums that can be weaponized in war
         | scenarios to extract secrets from the enemy without resorting
         | to torture.
        
         | xenophonf wrote:
         | Cryptographic secrets stored in human brains are already
         | vulnerable to an attack mechanism that requires $5 worth of
         | interface hardware that can be procured and operated with very
         | little training. Physical security controls do a decent job of
         | preventing malicious actors from connecting said hardware to
         | vulnerable brains. I assume the same would be true with the
         | invention of BCIs more sophisticated than a crescent wrench.
        
       | rnhmjoj wrote:
       | If there's a suspicion that NIST's interests aren't aligned with
       | the public's (at least wrt cryptography; I hope they're at
       | least honest with the physical constants), why do we still allow
       | them to dictate the standards?
       | 
       | I mean, there's plenty of standards bodies and experts in the
       | cryptography community around the world that could probably do a
       | better job. At this point NIST should be treated as a compromised
       | certificate authority: just ignore them and move along.
        
       | politelemon wrote:
       | So, a question then: isn't one of the differences between this
       | selection and previous ones that some of the algorithms are open
       | source, with their code available?
       | 
       | For example, Kyber, one of the finalists, is here:
       | https://github.com/pq-crystals/kyber
       | 
       | And where it's not open source, I believe everyone included
       | reference implementations in the first-round submissions.
       | 
       | Does the code being available make it easy to verify whether
       | there are any shady shenanigans going on, even without NIST's
       | cooperation?
        
         | aaaaaaaaaaab wrote:
         | What? :D
         | 
         | Who cares about a particular piece of source code?
         | Cryptanalysis is about the _mathematical_ structure of the
         | ciphers. When we say the NSA backdoored an algorithm, we don't
         | mean that they included hidden printf statements in "the source
         | code". It means that mathematicians at the NSA have knowledge
         | of weaknesses in the construction that are not known publicly.
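         | 
         | To make that concrete with the best-documented historical example
         | (Dual_EC_DRBG), here is a rough sketch that elides the output
         | truncation. The generator keeps a secret state s_i and uses two
         | public curve points P and Q:
         | 
         |   r_i     = x(s_i Q)      (the output; truncated in the real spec)
         |   s_{i+1} = x(s_i P)      (the state update)
         | 
         | Nothing in any implementation's source code looks wrong. But if
         | someone secretly knows a scalar d with P = dQ, then from one
         | output r_i they can reconstruct the point R = s_i Q (up to sign)
         | and compute x(dR) = x(s_i P) = s_{i+1}, i.e. predict every future
         | output. The weakness lives in the construction, not in the code.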
        
           | politelemon wrote:
           | Well, that was why I asked the question. I didn't think
           | asking a question deserved downvotes and ridicule.
        
         | [deleted]
        
         | gnabgib wrote:
         | Worth noting DJB (the article author) was on two competing
         | (losing) teams to Kyber[0] in Round 3. And has an open
         | submission in round 4 (still in progress). That's going to
         | slightly complicate any FOIA until after the fact, or it
         | should. Not that there's no merit in the request.
         | 
         | [0]: https://csrc.nist.gov/Projects/post-quantum-
         | cryptography/pos...
        
           | greyface- wrote:
           | > the Supreme Court has observed that a FOIA requester's
           | identity generally "has no bearing on the merits of his or
           | her FOIA request."
           | 
           | https://www.justice.gov/archives/oip/foia-
           | guide-2004-edition...
        
           | throwaway654329 wrote:
           | It is wrong to imply he is unreasonable here. NIST has been
           | dismissive and unprofessional towards him and others in this
           | process. They look terrible because they're not doing their
           | jobs.
           | 
           | Several of his students' proposals won the most recent round.
           | He still has work in the next round. NIST should have
           | answered in a timely manner.
           | 
           | On what basis do you think any of these matters can or may
           | complicate the FOIA process?
        
         | lostcolony wrote:
         | Not really. For the same reason that "here's your github login"
         | doesn't equate to you suddenly being able to be effective in a
         | new company. You might be able to look things up in the code
         | and understand how things are being done, but you don't know
         | -why- things are being done that way.
         | 
         | A lot of the instances in the post even show the NSA giving a
         | why. It's not a particularly convincing why, but it was enough to
         | sow doubt. The reason to make all discussions public is so that
         | there isn't an after-the-fact "wait, why is that obviously odd
         | choice being made?" but instead a before-the-fact "I think we
         | should make a change". The burden of evidence is different for
         | that. A "I think we should reduce the key length for
         | performance" is a much harder sell when the spec already
         | prescribes a longer key length, than an after the fact "the
         | spec's key length seems too short" "Nah, it's good enough, and
         | we need it that way for performance". The status quo always has
         | inertia.
        
           | politelemon wrote:
           | Thanks for the response, that's making sense. I've also tried
           | following the PQC Google Groups but a lot of the language is
           | beyond my grasp.
           | 
           | Also... I don't understand why I've been downvoted for asking
           | a question; I'm trying to learn, but HN can certainly be
           | unwelcoming to the 'curious' (which is why I thought we were
           | here).
        
       | ehzy wrote:
       | Ironically, when I visit the site Chrome says my connection is
       | not secured by TLS.
        
         | encryptluks2 wrote:
         | Are you logging into the site?
        
         | kzrdude wrote:
         | I was hoping for chacha20+Poly1305
        
           | ziddoap wrote:
           | You can see for yourself if you visit the HTTPS version.
           | 
           | > _Connection Encrypted
           | (TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256, 256 bit keys,
           | TLS 1.2)_
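            | 
            | If you want to check this yourself, here is a minimal sketch
            | using only Python's standard ssl module (the hostname is just
            | the site under discussion; the exact suite you see can differ
            | depending on your client):
            | 
            |   import socket, ssl
            | 
            |   host = "blog.cr.yp.to"
            |   ctx = ssl.create_default_context()
            |   with socket.create_connection((host, 443)) as sock:
            |       with ctx.wrap_socket(sock, server_hostname=host) as tls:
            |           # cipher() returns (name, protocol_version, secret_bits)
            |           print(tls.cipher())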
        
       | benreesman wrote:
       | djb has got to be the single biggest pain in the ass for the NSA
       | and I love it.
        
       | tomgs wrote:
       | My background is in normal, enterprise-saas-style software
       | development projects, and the whole notion of post-quantum crypto
       | kind of baffles me.
       | 
       | Funnily enough, this post coincides with the release of a
       | newsletter issue[0] by a friend of mine - unzip.dev - about
       | lattice-based cryptography.
       | 
       | A bit of a shameless plug, but it really is a great bit of intro
       | for noobs in the area like myself.
       | 
       | [0] https://unzip.dev/0x00a-lattice-based-cryptography/
        
         | aaaaaaaaaaab wrote:
        
       | bsaul wrote:
       | Side question:
       | 
       | I've only recently started to dig a bit deeper into crypto
       | algorithms (looking into various types of curves, etc.), and it
       | gave me the uneasy feeling that the whole industry is relying
       | on the expertise of only a handful of guys to actually ensure
       | that the crypto schemes used today are really working.
       | 
       | Am I wrong? Are there actually thousands and thousands of people
       | with the expertise to actually prove that the algorithms used
       | today are really safe?
        
         | chasil wrote:
         | This "monoculture" post raised this point several years ago.
         | 
         | https://lwn.net/Articles/681616/
        
         | aumerle wrote:
         | Proof! The entire field of cryptography can prove absolutely
         | nothing other than that a single use of a one-time pad is secure.
         | The rest is all hand-waving that boils down to "no one I know
         | knows how to do this, and I can't do it myself, so I believe
         | it's secure."
         | 
         | So the best we have in cryptography is trusting "human
         | instincts/judgements" about various algorithms, which then
         | further reduces to trusting humans.
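         | 
         | For reference, the one provable case really is that simple. A
         | minimal sketch of a one-time pad (the proof needs the key to be
         | uniformly random, as long as the message, and never reused):
         | 
         |   import secrets
         | 
         |   msg = b"attack at dawn"
         |   key = secrets.token_bytes(len(msg))           # random, same length, used once
         |   ct  = bytes(m ^ k for m, k in zip(msg, key))  # encrypt: XOR
         |   pt  = bytes(c ^ k for c, k in zip(ct, key))   # decrypt: XOR again
         |   assert pt == msg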
        
         | NavinF wrote:
         | Most programmers don't need to prove crypto algorithms. There
         | are many situations where you can just use TLS 1.3 and let it
         | choose the ciphers. If you really need to build a custom
         | protocol or file format, you can still use libsodium's
         | secretbox, crypto_box, and crypto_kx functions which use the
         | right algorithms.
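         | 
         | For example, a minimal sketch of libsodium's secretbox via PyNaCl
         | (the Python binding; assumes pynacl is installed):
         | 
         |   import nacl.secret, nacl.utils
         | 
         |   key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)  # 32-byte key
         |   box = nacl.secret.SecretBox(key)
         |   ct = box.encrypt(b"hello")  # a random nonce is generated and prepended
         |   assert box.decrypt(ct) == b"hello"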
        
           | ziddoap wrote:
           | This is completely unrelated to the question being asked by
           | the parent. They aren't asking about the average programmer.
           | They are asking how many people in the world can truly
           | 'prove' (to some reasonable degree) that the cryptography in
           | use and the algorithms that are implementing that
           | cryptography are 'secure' (to some reasonable degree).
           | 
           | Put another way, they are asking how many people in the world
           | could verify that the algorithms used by libsodium,
           | crypto_box, etc. are secure.
        
           | l33t2328 wrote:
           | The grandparent post is asking about the people who need to
           | know enough to program TLS to
           | 
           | > let it choose
        
         | kibibyte wrote:
         | I don't know if that's easily quantifiable, but I had a
         | cryptography professor (fairly well-known nowadays) several
         | years ago tell us that she only trusted 7 people (or some other
         | absurdly low number), one of them being djb, to be able to
         | evaluate the security of cryptographic schemes.
         | 
         | Perhaps thousands of people in the world can show you proofs of
         | security, but very few of them may be able to take into account
         | all practical considerations like side channels and the like.
        
         | [deleted]
        
         | benlivengood wrote:
         | There may be thousands of people in the entire world who
         | understand cryptanalysis well enough to accurately judge the
         | security of modern ciphers. Most aren't living or working in
         | the U.S.
         | 
         | It's very difficult to do better. The mathematics is complex
         | and computer science hasn't achieved proofs of the hypotheses
         | underlying cryptography. The best we can achieve is heuristic
         | judgements about what the best possible attacks are, and P?=NP
         | is an open question.
        
           | Tainnor wrote:
           | > The mathematics is complex and computer science hasn't
           | achieved proofs of the hypotheses underlying cryptography.
           | 
           | No unconditional proofs (except for the OTP ofc), but there
           | are quite a few conditional proofs. For example, it's
           | possible to show that CBC is secure if the underlying block
           | cipher is.
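            | 
            | Roughly, the standard statement (constants elided, in LaTeX) for
            | CBC with random IVs over a block cipher E with block size n,
            | against an IND-CPA adversary A encrypting sigma blocks in total:
            | 
            |   \mathrm{Adv}^{\mathrm{ind\text{-}cpa}}_{\mathrm{CBC}[E]}(A)
            |     \le \mathrm{Adv}^{\mathrm{prp}}_{E}(B) + O(\sigma^2 / 2^n)
            | 
            | for some PRP distinguisher B built from A. So CBC is only as
            | good as the underlying block cipher, plus a birthday-bound term.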
        
       | jacooper wrote:
       | Filippo Valsorda and Matthew Green aren't too happy.
       | 
       | https://twitter.com/matthew_d_green/status/15556838562625208...
        
         | ghoward wrote:
         | Thanks for letting me know. I think I'll consider both of them
         | compromised.
        
         | throwaway654329 wrote:
         | Dismissing this lawsuit as a conspiracy theory is embarrassing
         | for both of them.
         | 
         | There is ample evidence to document malfeasance by the involved
         | parties, and it's reasonable to ask NIST to follow public law.
        
           | detaro wrote:
           | > _Dismissing this lawsuit as a conspiracy theory is
           | embarrassing for both of them._
           | 
           | They are not dismissing the lawsuit.
        
             | throwaway654329 wrote:
             | One says he's doing it wrong. The other says he hopes that
             | he wins, of course!
             | 
             | Meanwhile they go on to attack Bernstein, mischaracterize
             | his writing, completely dismiss his historical analysis,
             | mock him with memes as a conspiracy theorist, and to top it
             | off they question his internal motivations (which they
             | somehow know), painting him as some kind of sore loser, which
             | is demonstrably false.
             | 
             | The plot twist for the last point: he is still in the
             | running for round four and his former PhD students _did
             | win_ major parts of round three.
        
               | tptacek wrote:
               | Two things can easily be true: that NIST mishandled a
               | FOIA request, and that there isn't especially good reason
               | to accept on faith Bernstein's concerns about the PQC
               | process, which is unrelated to how they handle FOIA.
               | 
               | Meanwhile: you haven't actually added any light to this
               | subthread: the tweets we're talking about _do not_
               | dismiss the suit. Cryptographic researchers that aren't
               | stans of Daniel Bernstein (there are a lot of those) are
               | also unhappy about NIST clowning up FOIA.
               | 
               | You are in a deeply weird and broken place if you think
               | you can divide the world into "people who take what
               | Daniel Bernstein says on faith" and "people who trust
               | NIST". I don't know if you're in that place! But some
               | people on this thread clearly are.
        
               | onetimeusename wrote:
               | You wrote a large number of comments on this so I am
               | asking this here since it's fresh.
               | 
               | Can you comment on why you think djb thinks it is worth
               | investigating if the NSA is attempting to destroy
               | cryptography with weak pqc standards? I read through some
               | of the entries NIST just announced and there are indeed
               | attacks, grave attacks, that exist against Kyber and
               | Falcon. I have no reason to believe the authors of those
               | specs work with the NSA. Wouldn't a more reasonable
               | conclusion be that we need to do more work on pqc? Maybe
               | I have it wrong and he is just trying to rule out that
               | possibility but his long rant which was 80% about NIST
               | and their history with the dual EC backdoor really points
               | at djb concluding the NSA is deliberately trying to
               | weaken crypto by colluding with a bunch of people who
               | probably don't care about money or the NSA's goals that
               | much.
        
               | tptacek wrote:
               | You'd have to ask Bernstein. I think it's helpful to take
               | a bit of time (I know this is a big ask) to go see how
               | Bernstein has comported himself in other standards
               | groups; the CFRG curve standardization discussion is a
               | good example. The reason I said there's a lot of eye-
               | rolling about this post among cryptographers is that I
               | think this is pretty normal behavior for Bernstein.
               | 
               | I used to find it inspiring; he got himself crosswise
               | against the IETF DNS working group, which actively
               | ostracized him, and I thought the stance he took there
               | was almost heroic (also, I hate DNSSEC, and so does he).
               | But when you see that same person get in weird random
               | fights with other people, over and over again, well:
               | there's a common thread there.
               | 
               | Is it worth investigating whether NSA is trying to weaken
               | PQC? Sure. Nobody should trust NSA. Nobody should trust
               | NIST! There's value in NIST catalyzing all the academic
               | asymmetric cryptography researchers into competing
               | against each other, so the PQC event probably did
               | everybody a service. But no part of that value comes from
               | NIST blessing the result.
               | 
               | It's probably helpful for you to know that I think PQC
               | writ large is just a little bit silly. Quantum computers
               | of unusual size? I don't believe they exist. I think an
               | under-appreciated reason government QC spending happens
               | is because government spending is a goal in and of
               | itself; one of NSA's top 3 missions is to secure more
               | budget for NSA --- it might even be the #1 goal.
               | Meanwhile, PQC is a full-employment program for academic
               | cryptographers working on "Fun" asymmetric schemes that
               | would otherwise be totally ignored in an industry that
               | has more or less standardized on the P-curves and
               | Curve25519.
               | 
               | Be that as it may: whether or not NSA is working to
               | "weaken" CRYSTALS-Kyber is besides the point. NSA didn't
               | invent CRYSTALS. A team of cryptographers, including some
               | huge names in modern public key crypto research, did.
               | Does NSA have secret attacks against popular academic
               | crypto schemes? Probably. You almost hope so, because we
               | pay them a fuckload of a lot of money to develop those
               | attacks. But you can say that about _literally every
               | academic cryptosystem_.
               | 
               | You probably also don't need me to tell you again how
               | much I think formal cryptographic standards are a force
               | for evil in the industry.
        
               | onetimeusename wrote:
               | ok, thanks. I didn't know that about djb's history as far
               | as picking fights with standards groups. I don't know
               | much about him outside of the primitives he designed.
               | That makes some sense in context now because the
               | implication just seemed like a stretch. Cryptosystems
               | break and have flaws in them, that's nothing new. It's
               | just strange to leap to "The NSA did it", but again, I
               | didn't know he just tends to accuse people of that.
               | 
               | I agree about the PQC stuff and committees. Anyways,
               | thanks for clarifying this.
        
               | tptacek wrote:
               | Just bear in mind that this is just opinions and hearsay
               | on my part. Like, I think there's value in relaying what
               | I think I know and what I've heard, but I'm not a
               | cryptographer, I paid almost no attention to the PQC
               | stuff (in fact, I pretty much only ever swapped PQC into
               | my resident set when Bernstein managed to start drama
               | with other cryptographers whose names I knew), and there
               | are possibly other sides to these stories. I've seen
               | Bernstein drama where it's pretty clear he's deeply in
               | the wrong, and I've seen Bernstein drama where it's
               | pretty clear he wasn't.
               | 
               | The suit is good. NIST isn't allowed to clown up FOIA;
               | they have to do it right.
        
               | throwaway654329 wrote:
               | I am definitely not in that place. We clearly disagree on
               | a few points.
               | 
               | The issues raised in the blog post aren't just about NIST
               | mishandling the FOIA. Reducing it to the lawsuit is
               | already a bad-faith engagement.
               | 
               | The blog post is primarily about the history of NSA
               | sabotage as well as contemporary efforts, including
               | (NIST's) failures to stop this sabotage. Finally, it
               | closes the recent history by noting that there are
               | mishandling issues in the pq-crypto competition. The
               | lawsuit is at the end of a long chronological text with
               | the goal of finding more information to extend the facts
               | that we know. This is a noble goal, and it's hard to
               | accept any argument that the past in this area hasn't
               | been troubled.
               | 
               | Weirdly there is an assumption made immediately by
               | Filippo, made without basis in fact: he supposes
               | Bernstein somehow lost the contest and that this is _his
               | motivation_ for action. Bernstein hasn't lost, though
               | some structured lattices have won. He still has submitted
               | material in the running as far as I understand things.
               | Nonetheless we see that Filippo tells us the deepest
               | internal motivations of Bernstein, though we don't learn
               | how he learned these personal secrets. This is simply not
               | reasonable. Maybe it could be phrased as a question but
               | then the rhetorical tool of denying questions as a valid
               | form of engagement would start to fade away.
               | 
               | Back to the core of the tweets: One of the two says he
               | hopes he wins the suit, the other says he's doing it
               | wrong. We could read that as they're both hoping he wins,
               | and yet... it's hard to believe when the rhetoric centers
               | around Bernstein's supposedly harmful rhetoric _in the
               | blog post and lawsuit_ as being harmful to the community
               | at large.
               | 
               | Bernstein isn't attacking a singular person as Filippo is
               | attacking Bernstein. Filippo even includes a meme to
               | drive home the personal nature of the attacks.
               | 
               | For me personally, I used to find this meme funny until I
               | learned its history; this was once a blind spot of my very
               | own. The context and history of that meme and that scene
               | are dark.
               | 
               | So then, here is some light for you: This meme is a
               | parody from a comedy. In turn it is a parody of a famous
               | scene from a film portraying John Nash. It's about a very
               | famous mentally ill mathematician. Nash in this scene is
               | the iconic, quintessential conspiracy theorist insane
               | person once considered a genius. Nash is drawing
               | connections that aren't there and that aren't reasonable.
               | He was deeply mentally ill at that point in his life.
               | That is a brutal thing to say in itself about anyone,
               | but... it gets worse.
               | 
               | Nash was also famously virulently antisemitic in some of
               | his psychological breaks and outbursts. I don't hold him
               | responsible for his ravings as he was a paranoid
               | schizophrenic, but wow I would not throw up that specific
               | meme at a (Jewish) mathematician while implying he's a
               | crazy conspiracy theorist. It's some really gross mental
               | health hate mixed with ambiguity about the rest. It could
               | be funny in some contexts, I suppose, but not this one.
               | 
               | So in summary: that is a gross meme to post in a series
               | of ad-hominem tweet attacks calling (obviously Jewish
               | family name) Bernstein a conspiracy theorist, saying he
               | is making obviously crazy, baseless connections. The root
               | of his concern is _not insane_ and ignoring the history
               | of sabotage in this area by NSA is unreasonable.
               | 
               | I assume this meme subtext is a mistake and it wasn't
               | intended as antisemitic. Still after processing the
               | mental health punching down part of the meme, I had
               | trouble assuming good faith about any of it. Talk about
               | harmful rhetoric in the community.
               | 
               | I also note that they attack him in a number of other bad
               | faith ways which make me lose my assumption of good faith
               | generally about their well-wishing for his lawsuit being
               | successful.
               | 
               | Meanwhile, I don't take Bernstein on faith. I find his
               | arguments and points in the blog post convincing. I find
               | his history of work in the public interest convincing. I
               | don't care about popularity contests or personal
               | competition. Meanwhile you say you're not following the
               | contest.
               | 
               | Corruption of NIST and other related parties isn't just
               | possible, we know it has happened. We should be extra
               | vigilant that it doesn't repeat. FOIA is a weak mechanism
               | but it's something. Has any corruption or sabotage
               | happened here? We don't know yet, and more importantly NIST
               | have promised transparency that they haven't delivered. A
               | promise is a good start but it's not sufficient.
               | 
               | NIST have slipped their own deadlines, they have been
               | silent in concerning ways, and they're still failing to
               | provide critical details about the last round of NSA
               | sabotage _that directly involved NIST standardization_.
        
               | tptacek wrote:
               | I just want to jump back here for a second, because when
               | I responded to this comment last night, I hadn't really
               | read it (for I think obvious reasons, but also I was
               | watching Sandman). So I wrote some replies last night
               | that I'm not super proud of --- not because I said
               | anything wrong, but because I didn't acknowledge the
               | majesty of the argument I had been confronted with.
               | 
               | This right here is a comment that makes the following
               | argument, which I will helpfully outline:
               | 
               | * Filippo Valsorda wrote a tweet that included a meme
               | from "It's Always Sunny In Philadelphia"
               | 
               | * That meme is a parody of "A Beautiful Mind"
               | 
               | * "A Beautiful Mind" is about John Nash --- hold on to
               | that fact, because the argument is about to bifurcate
               | 
               | * John Nash was mentally ill
               | 
               | * John Nash was virulently anti-semitic (hold on to your
               | butts...)
               | 
               | * Ergo, Filippo Valsorda is both bigoted against the
               | mentally ill, and also an anti-semite.
               | 
               | Can we do other memes like this? I'd like your exegesis
               | of the "Homer Simpson dissolves backwards into the
               | hedges" meme next!
        
               | throwaway654329 wrote:
               | > ... I didn't acknowledge the majesty of the argument I
               | had been confronted with.
               | 
               | Gee, thanks, I think. Sorry to say we don't agree on your
               | summary of my comment.
               | 
               | > Filippo Valsorda wrote a tweet that included a meme
               | from "It's Always Sunny In Philadelphia"
               | 
               | From this, we already have serious disagreements. It's
               | part of a _series of tweets_ amplified by others. It
               | isn't a single tweet in isolation even when we only look
               | at the direct author. We do agree on the source of the
               | clip, though I think you weren't familiar with the
               | background _of the subject parodied in the clip_ as I
               | raised it. Perhaps you do not believe it or perhaps you
               | think that the parody somehow erases what was parodied
               | originally. Reasonable people can read it many ways.
               | 
               | > That meme is a parody of "A Beautiful Mind"
               | 
               | There are many memes with text included, though this was
               | a video, and it seems clearly a parody of Nash from that
               | film. Here is one example meme that I did not create:
               | https://me.me/i/mathematician-john-nash-during-a-
               | schizophren...
               | 
               | > John Nash was mentally ill
               | 
               | Yep. The implication of using such a meme to punch down
               | is mirrored in the words of the related tweets calling
               | him a conspiracy theorist. This wasn't, as you tried to
               | say, a single tweet; it's presented in a context that is
               | harsh and condemning.
               | 
               | > John Nash was virulently anti-semitic
               | 
               | Maybe; it's unclear if it was a byproduct of his mental
               | illness or a sincerely held belief. It's a third rail,
               | regardless. I won't hold a mentally ill person
               | accountable for stuff they say during an episode, and I
               | also won't use it as a joke.
               | 
               | > Ergo, Filippo Valsorda is both bigoted against the
               | mentally ill, and also an anti-semite.
               | 
               | This isn't my claim. My claim is that it's completely
               | inappropriate on many levels to post not only that meme
               | but to use it in tandem with direct personal attacks on
               | Bernstein. This seems especially relevant in a thread
               | supposedly about damaging behavior _of other people_ in
               | the community.
               | 
               | I would prefer you don't cover for mental health
               | stigmatization or antisemitic dog whistling even a tiny
               | bit, _especially if it was not intended_. Painting me as
               | crazy for my analysis is shitty. You asked me to bring
               | some light and then attack me for sharing my actual
               | thoughts. You didn't acknowledge my insight about Jewish
               | names, either. Was that news to you? Dismissively
               | omitting anything about that insight is weird.
               | 
               | Please leave no room for ambiguity here, it is a very
               | dangerous time in the world, and in America, especially
               | after the Tree of Life murders. There are many many other
               | examples of terrible stuff like that - and anything that
               | even remotely smells like that must be immediately
               | challenged in my view. No doubt this personal context
               | makes myself and others extra sensitive. That is
               | _exactly_ why I explained my understanding of the
               | meaning.
               | 
               | I am happy to provide an analysis of Homer Simpson memes
               | in context if it can help us break the ice and not end
               | this thread on hard or hateful terms.
        
               | tptacek wrote:
               | "Crazy" is not the word I would use about what you've
               | written here. It is unlikely you and I are going to have
               | any productive dialog after this, which is totally fine;
               | I'm happy to disengage here.
        
               | tptacek wrote:
               | There's nothing "bad faith" about it. The tweet is
               | supportive of the lawsuit, and not supportive of
               | Bernstein's weird, heavily-telegraphed, long-predicted
               | claims that a NIST contest he opted to participate in was
               | corrupted by dint of not prioritizing his own designs.
               | 
               | Your bit about the "obviously Jewish family name" thing
               | is itself risible, and you should be embarrassed for
               | trying to make it a thing.
        
               | throwaway654329 wrote:
               | Your argument that the selection doesn't pick his designs
               | doesn't square with SPHINCS+ winning, and with others
               | remaining in the running. His former PhD student won with
               | Kyber. Bernstein did very well here and you're misleading
               | people by suggesting he had his ass handed to him.
               | 
               | He has published (and it is linked from the blog) his
               | views on how to run cryptographic contests before their
               | recent selection finished (late). His comments are not
               | simply the result of the round three announcement.
               | 
               | As to the offensive meme, I note that you don't even
               | dispute the punching down about mental health. Gross.
               | 
               | Bernstein is a German-Jewish name. These names were historically
               | given to, and in some cases forced on, people to signal
               | something to others, usually negative. This is a hint, not a
               | fact of his beliefs. My understanding is that he does
               | come from a Jewish family. I won't presume to speak for
               | Bernstein's beliefs, just that I see something obviously
               | tense and probably wrong.
               | 
               | It's your choice to not care to comment about the
               | antisemitic connotations that I raised. My point was that
               | for some people this is impossible to not see. It is
               | highly offensive given the context. Now I understand that
               | you refuse to do so when shown. Also extremely gross.
        
               | tptacek wrote:
               | I didn't even notice a "punching down about mental
               | health" thing. You wrote a long comment, I skimmed it.
               | Your allegation that Filippo and Matt Green are
               | antisemitic is ludicrous.
               | 
               | I didn't say Bernstein had his ass handed to him. I said
               | that he wrote thousands and thousands of words about his
               | reasons to mistrust NIST (not just here but elsewhere,
               | and often), but still participated in the PQC contest,
               | raising these concerns only at its conclusion.
        
               | throwaway654329 wrote:
               | > I didn't even notice a "punching down about mental
               | health" thing. You wrote a long comment, I skimmed it.
               | 
               | That tracks, okay. It's the weekend and I'm a nobody on
               | the internet. Thank you for taking the time to continue to
               | engage with me.
               | 
               | > Your allegation that Filippo and Matt Green are
               | antisemitic is ludicrous.
               | 
               | That isn't an allegation that I am making, you are
               | misunderstanding and misrepresenting my statements. My
               | comment even disclaimed that this probably isn't
               | intentional, merely that it is one read of that meme. My
               | core point is this: posting that meme is unhelpful in a
               | thread about Bernstein's supposedly harmful behavior.
               | Maybe you think it's a funny joke, I don't.
               | 
               | Either way - funny joke or not - it certainly isn't a
               | healthy discourse for "the community" to call someone
               | names and to dismiss them as some kind of unhinged
               | conspiracy theorist.
               | 
               | > I didn't say Bernstein had his ass handed to him.
               | 
               | Indeed, I did not claim to quote you there. I am
               | characterizing your words into what I understand as your
               | point. Let's call this " _the sore loser discourse_ " -
               | it is repeated in this thread by others. It seems to be
               | implied by my read when you say: "...he opted to
               | participate in was corrupted by dint of not prioritizing
               | his own designs." I preemptively acknowledge that I may
               | have misunderstood you.
               | 
               | What do you mean to convey by "dint of not" roughly? Don't
               | SPHINCS+ (Standardized in round three) and Classic
               | McEliece (still in the running) count as prioritizing his
               | designs? Also, what is wrong with participating in this
               | standardization process? He seems to be unhappy with NIST
               | before, and during the process, and with ample cause. By
               | participating, it's clear he has learned more and by
               | winning parts of the competition, he's not a sore loser.
               | 
               | If he wasn't a part of this competition, people would
               | probably dismiss his criticism as simply being outside.
               | It's harder to dismiss him if he is part of it, and even
               | harder when his submissions win. It isn't a clean sweep,
               | but it's lifetime achievement levels for some people to
               | have a hand in just one such algorithm, selected in such
               | a process. He has a hand in several remaining submissions
               | as far as I understand the process and the submissions.
               | 
               | > I said that he wrote thousands and thousands of words
               | about his reasons to mistrust NIST (not just here but
               | elsewhere, and often),
               | 
               | So you note he has been saying these things for a long
               | time. On that we agree.
               | 
               | > but still participated in the PQC contest,
               | 
               | You go on to note that he then participated in the
               | process. He is documented in his attempts to use the
               | process tools to raise specific issues and to try to have
               | them settled by NIST as promised, with transparency. NIST
               | has failed to bring that transparency.
               | 
               | Confusingly (to me anyway) your next statement continues
               | with a contradiction:
               | 
               | > raising these concerns only at its conclusion
               | 
               | Which is it? Was he constantly raising these issues or
               | only raising them at the end (of round three)?
               | 
               | Alternatively I could read this as "at its (the blog
               | post) conclusion" which would be extremely confusing. I
               | presume this isn't what you meant but if so, okay, I am
               | really missing the point.
        
               | mr_woozy wrote:
        
               | mr_woozy wrote:
        
         | jeffparsons wrote:
         | I think this is a sloppy take. If you read the full back-and-
         | forth on the FOI request between D.J. Bernstein and NIST, it
         | becomes readily apparent that there is _something_ rotten in
         | the state of NIST.
         | 
         | Now of course that doesn't necessarily mean that NIST's work is
         | completely compromised by the NSA (even though it has been in
         | the past), but there are other problems that are similarly
         | serious. For example, if NIST is unable to explain how certain
         | key decisions were made along the way to standardisation, and
         | those decisions appear to go against what would be considered
         | by prominent experts in the field as "good practice", then NIST
         | has a serious process problem. This is important work. It
         | affects everyone in the world. And certain key parts of NIST's
         | decision making process seem to be explained with not much more
         | than a shrug. That's a problem.
        
           | tptacek wrote:
           | All you're saying here is that NIST failed to comply with
           | FOIA. That's not unusual. No public body does a reliably good
           | job of complying with FOIA, and many public bodies seem to
           | have a bad habit of pre-judging the "merits" of FOIA
           | requests, when no merit threshold exists for their open
           | records requirements.
           | 
           | NIST failing to comply with FOIA makes them an intransigent
           | public body, like all the rest of them, from your local water
           | reclamation board to the Department of Energy.
           | 
           | It emphatically does _not_ lend support to any of this
           | litigant's concerns about the PQC process. I don't know
           | enough (really, anything) about the PQC "contest" to judge
           | claims about its validity, but I do know enough --- like, the
           | small amount of background information needed --- to say that
           | it's risible to suggest that any of the participating teams
           | were compromised by intelligence agencies; that claim having
           | been made in this post saps its credibility.
           | 
           | So, two things I think a reasonable person would want to
           | establish here: first, that NIST's behavior _with respect to
           | the FOIA request_ is hardly any kind of smoking gun, and
           | second that the narrative being presented in this post about
           | the PQC contest seems somewhere between  "hand-wavy" and
           | "embarrassing".
        
             | jeffparsons wrote:
             | > It emphatically does not lend support to any of this
             | litigant's concerns about the PQC process.
             | 
             | I agree with most of what you're saying except for this. In
             | my view, unlike some of the other organisations you
             | mentioned, the _only value_ of NIST is in the quality and
             | transparency of its processes. My reading of the DJB/NIST
             | FOI dialogue is that there is reason to believe NIST has
             | serious process problems that go far beyond simply mishandling
             | an FOI request. From their own responses, it reads as if they
             | aren't able to articulate themselves why they would choose
             | one contestant's algorithm over another's. That kind of
             | undermines the entire point of having an open contest.
        
               | tptacek wrote:
               | The peer review NIST is refereeing happened in the open.
               | Thus far, Bernstein is the only person making these
               | claims. For all the words he burns on NIST's sordid
               | history, he chose to participate in this NIST-run
               | process, and imploded publicly only after the results
               | were announced. There are dozens of cryptographers with
               | reputations in the field comparable to Bernstein's who
               | also participated. Bernstein is the only one suggesting
               | that NSA bribed the contest winners.
               | 
               | From what I can tell, nobody who actually works in this
               | field is taking any of this seriously; what I see is a
               | whole lot of eye rolling and "there he goes again". But
               | you don't get any of that on HN, because HN isn't a forum
               | for cryptography researchers. All you get is Bernstein's
               | cheering section.
               | 
               | I was part of Bernstein's cheering section! I understand
               | the feeling. And, like, I'm still using ChaPoly and 25519
               | in preference to any of the alternatives! He's done
               | hugely important work. But he has, not to put too fine a
               | point on it, _a fucked up reputation_ among his peers in
               | cryptography research, and he's counting on you not to
               | know that, and to confuse a routine, workaday FOIA
               | lawsuit with some monumental new bit of litigation.
               | 
               | It's a deeply cynical thing for him to be doing.
               | 
               | He could have just announced, in his lovably
               | Bernsteinian+ way, that NIST had failed in its FOIA
               | obligations, and he was holding them to account. I'd be
               | cheering too. But he wrote a screed that culminated in an
               | allegation that NSA had bribed members of PQC teams to
               | weaken their submissions. Simply risible; it's
               | embarrassing to be part of a community that dignifies
               | that argument, even if I absolutely get why it's
               | happening. I have contempt for him for exploiting all of
               | you.
               | 
               | None of this is to take anything away from his FOIA suit.
               | I stan his FOIA attorneys. The suit, boring as it is, is
               | a good thing. He should win, and he almost certainly
               | will; L&L wouldn't have taken the case if he wasn't going
               | to. Just keep in mind, people sue and win over FOIA
               | mistakes all the time. In Illinois, you even get fee
               | recovery when you win. This isn't Bernstein v United
               | States!
               | 
               | + _I'm not being snarky; I was a multiple-decades-long
               | admirer of that style._
        
               | chasil wrote:
               | The main concern that I have is the NIST refusal to
               | consider a hybrid design as described in the blog,
               | coupled with the fact that OpenSSH has disregarded NIST
               | and standardized on hybrid NTRU-Prime.
               | 
               | There had to be substance to accomplish this, and it
               | moves all of UNIX plus Microsoft away from CRYSTALS. It
               | would seem hugely damaging to CRYSTALS as the winner of
               | the latest round.
        
               | tptacek wrote:
               | I don't think you understand what's going on here. The
               | point of the PQC "contest" is to figure out which PQC
               | constructions to use. It's not to design hybrid
               | classical/PQC schemes: everybody already knows how to do
               | that. The idea that NIST should have recommended
               | CRYSTALS-Kyber+Curve25519 is a little like suggesting
               | that they should have recommended Rijndael+DES-EDE.
               | 
               | It's simply not NIST's job to tell tls-wg how to fit PQC
               | into HTTPS, or the OpenSSH team how to fit it into SSH.
               | 
               | If you trust the OpenSSH team more than NIST, that's
               | fine. I think that's a reasonable thing to do. Just do
               | whatever OpenSSH does, and you don't have to worry about
               | how corrupt NIST's process is. I don't even think NIST is
               | corrupt, and I still think you'd be better off just
               | paying attention to whatever OpenSSH does.
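               | 
               | For concreteness, here is a minimal sketch of one way to
               | combine them (an illustration only, not any standardized
               | construction; the placeholder byte strings stand in for
               | real X25519 and Kyber shared secrets): hash both shared
               | secrets together, so the session key stays safe as long
               | as either input remains secret.
               | 
               |   import hashlib
               | 
               |   def combine_shared_secrets(classical_ss, pq_ss):
               |       # Session key is secure if *either* input
               |       # stays secret.
               |       h = hashlib.sha256()
               |       h.update(b"hybrid-kem-demo")  # domain label
               |       h.update(classical_ss)  # e.g. X25519 output
               |       h.update(pq_ss)         # e.g. Kyber output
               |       return h.digest()
               | 
               |   # Hypothetical placeholder secrets, not real ones.
               |   key = combine_shared_secrets(b"\x01" * 32,
               |                                b"\x02" * 32)
               |   print(key.hex())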
        
               | chasil wrote:
               | That would make it seem that the lengthy hybrid
               | discussion in the blog is a misdirection.
               | 
               | I will grant you that this does support your argument.
               | 
               | EDIT: Actually, what you have said does not seem at all
               | correct.
               | 
               | In DJB's Apon complaint, we find this text:
               | 
               | 'For example, in email to pqc-forum dated 30 Oct 2019
               | 15:38:10 +0000 (2019), NIST posted technical comments
               | regarding hybrid encryption modes and asked for feedback
               | "either here on the pqc-forum or by contacting us at pqc-
               | comments@nist.gov" (emphasis added).'
               | 
               | If hybrid encryption is entirely beyond the purview of
               | the NIST PQC competition, then why did this discussion
               | and feedback request ever take place?
        
               | tptacek wrote:
               | Look, I'm just not going to dignify the argument that
               | there is somehow some controversy over the NIST PQC
               | contest not recommending higher-level constructions to
               | plug PQC KEMs into Curve25519 key exchanges. I get that
               | this seems like a super interesting controversy to you,
               | because Bernstein's blog post is misleading you, but this
               | simply isn't a real controversy.
        
               | chasil wrote:
               | Hopefully, the judge will help, as before.
        
               | tptacek wrote:
               | It's somewhat unlikely that there will even be a judge
               | here.
        
               | djmdjm wrote:
               | Repeating this here. We (OpenSSH) have not disregarded
               | NIST, we just added a PQ algorithm before NIST finished
               | their competition and we'll almost certainly add support
               | for the finalist fairly soon.
        
               | chasil wrote:
               | I will eagerly await their arrival, and be sure to
               | sysupgrade.
        
         | silisili wrote:
         | What's with the infighting here? Nothing about the post comes
         | across as conspiracy theory level or reputation ruining. It
         | makes me question the motives of those implying he's crazy, to
         | be honest.
        
           | tptacek wrote:
           | Post-quantum cryptography is essentially a full-employment
           | program for elite academic public key cryptographers, which
           | is largely what the "winning" PQC teams consist of. So, yeah,
           | suggesting that one of those teams was compromised by an
           | intelligence agency is "conspiracy theory level".
           | 
           | Nobody is denying the legitimacy of the suit itself. NIST is
           | obligated to follow public records law, and public records
           | law is important. Filippo's message, which we're all
           | commenting on here, says that directly.
        
             | 7373737373 wrote:
             | Has the general notion of "conspiracy theory" ever carried
             | any positive value? It only seems to exist to discredit
             | "doubters against the majority consensus" without
             | substance. But I guess words like "crank" wouldn't even
             | exist if there weren't many people like it, so it carries
             | some "definitional" value.
             | 
             | Because these words show total disregard for someone's
             | opinion (put more formally: "unlike you/them, I
             | completely agree with the (apparent) majority consensus",
             | which the words also imply), they probably don't belong
             | in a serious discussion.
        
               | woodruffw wrote:
               | Our notion of "crank"/conspiracy theory is a logical
               | consequence of "extraordinary claims require
               | extraordinary evidence." When that evidence isn't
               | provided all that remains is an exceptionally convoluted
               | explanation, generally involving more parties than
               | necessary, hence "conspiracy."
        
               | 7373737373 wrote:
               | These words probably also tend to describe people who
               | would rather hold a belief in these theories than
               | treat them as statements without evidence (which both
               | sides probably should, because there is no evidence
               | against them either).
               | 
               | On the other hand, discussing statements without
               | evidence (even if they are not presented as beliefs)
               | has some (opportunity?) cost, which the "theorists"
               | are willing to pay.
        
         | jacooper wrote:
         | Man, mobile typos suck.
        
         | svnpenn wrote:
         | Filippo Valsorda seems to be happy to ignore the fact that NIST
         | already let an NSA backdoor in, as recently as 2014:
         | 
         | https://wikipedia.org/wiki/Dual_EC_DRBG
         | 
         | Is he really just going to ignore something from 8 years ago?
        
           | oittaa wrote:
        
             | tptacek wrote:
             | First, last I checked, Filippo does not in fact work at
             | Google.
             | 
             | Second: the guidelines on this site forbid you to write
             | comments like this; in fact, this pattern of comments is
             | literally the most frequent source of moderator admonitions
             | on HN.
             | 
             | Filippo hardly needs me to defend his reputation, but, as a
             | service to HN and to you in particular, I'd want to raise
             | your awareness of the risk of beclowning yourself by
             | suggesting that he, of all people, is somehow compromised.
        
               | oittaa wrote:
        
               | tptacek wrote:
               | It's funny how you think it's a mic drop that you didn't
               | bother to check basic facts before attempting to defame
               | someone by name, anonymous commenter.
               | 
               | I just told you this was a dumb hill to die on, but I
               | can't stop you from commenting like this.
        
             | throwaway654329 wrote:
        
           | throwaway654329 wrote:
           | Yes, he appears to be unreasonably dismissive of the
           | blindingly obvious history and the current situation.
           | 
           | As an aside, this tracks with his choice of employers - at
           | least one of which was a known and documented NSA
           | collaborator (as well as a victim, irony of ironies) _before he
           | took the job with them_.
           | 
           | As Upton Sinclair remarked: "It is difficult to get a man to
           | understand something when his salary depends upon his not
           | understanding it."
           | 
           | Joining Google after Snowden revealed PRISM and BULLRUN, as
           | well as MUSCULAR, is almost too rich to believe. Meanwhile,
           | he asserts and dismisses Bernstein as a conspiracy theorist.
           | It's a classic bad faith ad-hominem coincidence theory.
        
           | stefantalpalaru wrote:
        
       | sylware wrote:
       | I have the feeling that governments around the world are getting
       | sued more and more over serious digital matters. Here, once the
       | heat wave is finally over, I will see my lawyer again about the
       | interoperability of government sites with noscript/basic (X)HTML
       | browsers.
        
       | dt3ft wrote:
       | Perhaps the old advice ("never roll your own crypto") should be
       | reevaluated? If you're creative enough, you could combine and
       | apply existing algorithms in such a way that the result would
       | be very difficult to decrypt. Think 500 programmatic
       | combinations (steps) of encryption applying different
       | algorithms. Content encrypted in
       | this way would require knowledge of the encryption sequence in
       | order to execute the required steps in reverse. No amount of
       | brute force could help here...
        
         | TobTobXX wrote:
         | > Would require knowledge of the encryption sequence...
         | 
         | This is security by obscurity. Reputable encryption schemes
         | work under the assumption that the attacker has full
         | knowledge of the encryption/decryption process.
         | 
         | You could, however, argue that the sequence then becomes
         | part of the key. But this key [i.e. the sequence of
         | encryptions] would then be at most as strong as the
         | strongest encryption in the sequence, which kind of defeats
         | the purpose.
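         | 
         | A toy illustration of that point (deliberately
         | oversimplified, nothing like a real cipher): chaining two
         | XOR "ciphers" with different keys collapses into a single
         | XOR with a combined key, so the extra step adds no strength
         | at all.
         | 
         |   def xor_encrypt(data, key):
         |       # Toy "cipher": XOR with a repeating key.
         |       return bytes(b ^ key[i % len(key)]
         |                    for i, b in enumerate(data))
         | 
         |   msg = b"attack at dawn"
         |   k1, k2 = b"\x13\x37", b"\xca\xfe"
         | 
         |   # Two chained "encryption" steps...
         |   chained = xor_encrypt(xor_encrypt(msg, k1), k2)
         | 
         |   # ...equal one XOR with a single combined key.
         |   k3 = bytes(a ^ b for a, b in zip(k1, k2))
         |   assert chained == xor_encrypt(msg, k3)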
        
         | Tainnor wrote:
         | No, an important property of a secure cryptographic cipher is
         | that it should be as close to a random permutation of the input
         | as possible.
         | 
         | A "randomly assembled" cipher that just chains together
         | different primitives without much thought is very unlikely to
         | have that, which will mean that it will probably have
         | "interesting" statistical properties that can be observed given
         | enough plaintext/ciphertext pairs, and those can then be
         | exploited in order to break it.
        
         | anfilt wrote:
         | No, not at all; that advice is still good, and even more
         | important if you are talking about modifying algorithms.
         | You're going to want proofs of resistance or immunity to
         | certain classes of attacks.
         | A subtle change can easily make a strong primitive useless.
        
       | thrway3344444 wrote:
       | Why is the link in the URL http: not https: ? Irony?
        
         | cosmiccatnap wrote:
         | If you spend all day making bagels do you go home and make
         | bagels for dinner?
         | 
         | It's a static text blog, not a bank
        
           | theandrewbailey wrote:
           | The NSA has recorded your receipt of this message.
        
           | pessimizer wrote:
           | > It's a static text blog, not a bank
           | 
           | I want those delivered by https most, because http leaks the
           | exact page I've visited, rather than just the domain.
        
             | effie wrote:
             | If you care about preventing those kinds of leaks, do not
             | use mainstream browsers (they are likely to leak even your
             | https URLs to the browser company), and do not access those
             | pages directly using your home connection (there may be
             | mitms between you and the page).
        
           | creatonez wrote:
           | See: "Here's Why Your Static Website Needs HTTPS" by Troy
           | Hunt
           | 
           | https://www.troyhunt.com/heres-why-your-static-website-
           | needs...
        
         | sam0x17 wrote:
         | Well https uses the NIST standards so.... ;)
        
         | creatonez wrote:
         | This is just due to the way that the OP posted it, not how it
         | was originally published. This website forces HTTPS, using
         | the ChaCha20-Poly1305 cipher suite.
        
       | theknocker wrote:
        
       | efitz wrote:
       | Why don't we invert FOIA?
       | 
       | Why don't we _require_ that all internal communications and
       | records be public, available within 24 hours on the web, and
       | provide a very painful mechanism involving significant personal
       | effort of high level employees for every single communication or
       | document that is to be redacted in some way? The key is requiring
       | manual, personal (non-delegatable) effort on the part of senior
       | bureaucrats, and to allow a private cause of action for citizens
       | and waiver of immunity for bureaucrats.
       | 
       | We could carve out (or maybe not) specific things like allowing
       | automatic redaction of employee PII and PII of citizens receiving
       | government benefits.
       | 
       | After many decades, it's clear that the current approach to FOIA
       | and sunshine laws just isn't working.
       | 
       | [ed] fixed autocorrect error
        
         | voz_ wrote:
        
           | efitz wrote:
           | We should rethink the concept of a "secret". If it's _really_
           | a secret, it will still be worth the effort to protect.
        
             | acover wrote:
             | They are erring on the side of caution because people
             | have determined secret information from public
             | information - like the yield of a nuclear bomb
             | (classified) being inferred from the blast radius
             | (public).
             | 
             | Another example: they want to protect their means and
             | methods. But those means and methods are how they know
             | most information. Oftentimes it's easy to work backwards
             | from "they know X" to "therefore Y is compromised".
             | 
             | It's a hard problem similar to how to release anonymized
             | data. See K-anonymity attacks and caveats.
             | 
             | https://en.wikipedia.org/wiki/K-anonymity
        
           | chmod775 wrote:
           | Qualifiers such as evil aren't really useful when there
           | hasn't been a country acting honorably on that stage for a
           | long time, if ever.
           | 
           | Here's a phrasing that might be more appropriate:
           | 
           | "Since we're backstabbers and scoundrels, we should exercise
           | caution around each other."
        
           | vasco wrote:
           | Do you think it's tough for those regimes to pay someone to
           | do FOIA requests for them? Or to get jobs at government
           | agencies?
        
           | nix23 wrote:
           | Not sure if the US, with its torture base aka Guantanamo
           | and torture safe-houses around the world, really has the
           | right to call someone else "evil". I don't mean that as
           | "whataboutism", but that human lives are not worth more
           | in the US than in Mainland China.
        
           | denton-scratch wrote:
           | Surely "keeping things a little more hidden" depends on
           | reliable cryptography.
        
         | afrcnc wrote:
        
         | gorgoiler wrote:
         | The old Abe rhetoric was powerful but it always felt like it
         | was only hitting home on two of the three points. Obviously
         | government, by definition really, is of the people. The much
         | better parts were for the people and _by the people_.
        
         | chaps wrote:
         | The carve-out you mention is a decent idea on paper, but in
         | practice is a _difficult_ process. There's really no way to do
         | it in any significant degree without basically putting all gov
         | to a complete halt. Consider that government is not staffed
         | with technical people, nor necessarily critically minded people
         | to implement these systems.
         | 
         | There are ways to push for FOIA improvements that don't require
         | this sort of drastic approach. Problem is, it takes a lot of
         | effort on the parts of FOIA requesters, through litigation and
         | change in the laws. Things get surprisingly nuanced when you
         | really get down into what a "record" is, specifically for
         | digital information. I definitely wouldn't want to have "data"
         | open by default in this manner, because it would lead to
         | privacy hell.
         | 
         | Another component of this all is to consider contractors and
         | subcontractors. Would they fall under this? If so, to what
         | degree? If not, how do we prevent laundering of information
         | through contractors/subcontractors?
         | 
         | To a large degree, a lot of "positive" transparency movements
         | like the one you suggest can ironically lead to reduced
         | transparency in some of the more critical sides of
         | transparency. A good example of that is "open data", which
         | gives an appearance of providing complete data, but without the
         | legal requirements to enforce it. Makes gov look good but it
         | de-incentivizes transparency pushback and there's little way to
         | identify whether all relevant information is truly exposed. I
         | would imagine similar would happen here.
        
         | [deleted]
        
       | eointierney wrote:
       | Yippee! DJB for the win for the rest of us!
        
       | elif wrote:
       | Perhaps the best way to build trust in a cryptographic algorithm
       | is to have it devised by a certifiably neutral, general-purpose
       | mathematical neural net.
       | 
       | It could even generate an algorithm so complicated it would be
       | close to impossible for a human mind to comprehend the depth of
       | it.
        
         | creatonez wrote:
         | > It could even generate an algorithm so complicated it would
         | be close to impossible for a human mind to comprehend the depth
         | of it.
         | 
         | Okay... then some nefarious actor's above-human-intelligence
         | neural network instantly decodes the algorithm deemed too
         | complicated for human understanding?
         | 
         | I don't see how opaque neural nets are suddenly going to make
         | security-through-obscurity work.
        
         | tooltower wrote:
         | "Certifiably neutral"
         | 
         | So, by a process that hasn't been designed yet. Especially when
         | one considers how opaque most neural nets are to human
         | scrutiny.
        
           | elif wrote:
           | I mean, if the source, training data, and query interface are
           | public, it would be insanely difficult to hide a backdoor
           | 
           | There, I "designed" your impossible criterion in just a few
           | obvious steps you could have inferred.
        
             | tooltower wrote:
             | There are many, many papers that show how you can make
             | innocuous changes to inputs to make neural nets produce
             | the wrong result. You might be overestimating the
             | difficulty of this process.
        
       | mramadany wrote:
        
         | dang wrote:
         | We detached this subthread from
         | https://news.ycombinator.com/item?id=32363982.
        
         | tptacek wrote:
         | You could not have less of an idea of what you're talking about
         | here.
        
         | pvg wrote:
         | This isn't the sort of shit you can start here, take a look at
         | 
         | https://news.ycombinator.com/newsguidelines.html
        
           | mramadany wrote:
           | If you think I went around looking to dig up dirt, I didn't.
           | I just searched djb's name on Twitter to find more
           | discussions about the subject, as post-quantum cryptography
           | is an area I'm curious about.
           | 
           | Regarding asking for a disclosure, I thought that was widely
           | accepted around here. If the CEO of some company criticised a
           | competitor's product, we would generally expect them to
           | disclose that fact upfront. I thought that was appropriate
           | here given the dismissive tone of GP.
        
             | pvg wrote:
        
               | mramadany wrote:
        
               | pvg wrote:
        
             | andrewflnr wrote:
             | > asking for a disclosure
             | 
             | Bullshit, before you even finish the sentence. You didn't
             | ask, you accused. Did you read the context of the tweets
             | you linked?
        
         | lapinot wrote:
         | Not sure about the disclosure, having a grudge with djb is not
         | particularly a minority thing.
        
           | tptacek wrote:
           | Whatever "grudge" I have with Bernstein is, to say the least,
           | grudging.
        
       | sgt101 wrote:
       | yeah, but where do all these big primes come from?
        
       | [deleted]
        
       | frogperson wrote:
       | Seems odd to me a crypto blog isn't using https these days.
        
       | pyuser583 wrote:
       | Please include links with https://
        
         | oittaa wrote:
         | NSA employees downvoted this?
        
           | pyuser583 wrote:
           | Seriously! Tons of people ranting about crypto visiting a
           | non-TLS website!
        
       | tptacek wrote:
       | I may believe almost all of this is overblown and silly, as like
       | a matter of cryptographic research, but I'll say that Matt Topic
       | and Merrick Wayne are the real deal, legit the lawyers you want
       | working on something like this, and if they're involved,
       | presumably some good will come out of the whole thing.
       | 
       | Matt Topic is probably best known as the FOIA attorney who got
       | the Laquan McDonald videos released in Chicago; I've been
       | peripherally involved in some work he and Merrick Wayne did for a
       | friend, in a pretty technical case that got fierce resistance
       | from CPD, and those two were on point. Whatever else you'd say
       | about Bernstein here, he knows how to pick a FOIA lawyer.
       | 
       | A maybe more useful way to say the same thing is: if Matt Topic
       | and Merrick Wayne are filing this complaint, you should probably
       | put your money on them having NIST dead-to-rights with the FOIA
       | process stuff.
        
         | daneel_w wrote:
         | _> "I may believe almost all of this is overblown and silly, as
         | like a matter of cryptographic research ..."_
         | 
         | Am I misunderstanding you, or are you saying that you believe
         | almost all of DJB's statements claiming that NIST/NSA are
         | doctoring cryptography are overblown and silly? If that's the
         | case, would you mind elaborating?
        
           | tptacek wrote:
           | I believe the implication that NIST or NSA somehow bribed one
           | of the PQC researchers to weaken a submission is risible.
           | 
           | I believe that NIST is obligated to be responsive to FOIA
           | requests, even if the motivation behind those requests is
           | risible.
        
             | ckastner wrote:
             | > _I believe the implication that NIST or NSA somehow
             | bribed one of the PQC researchers to weaken a submission is
             | risible._
             | 
             | Could you elaborate on this? I didn't get this from the
             | article at all. There's no researcher(s) being implicated
             | as far as I can tell.
             | 
             | What I read is the accusation of NIST's decision-making
             | process possibly being influenced by the NSA, something
             | that we know has happened before.
             | 
             | Say N teams of stellar researchers submit proposals, and
             | they review their peers. For the sake of argument, let's
             | say that no flaw is found in any proposal; every single one
             | is considered perfect.
             | 
             | NIST then picks algorithm X.
             | 
             | It is _critical_ to understand the decision making process
             | behind the picking of X, _crucially_ so when the decision-
             | making body has a history of collusion.
             | 
             | Because even if all N proposals are considered perfect by
             | all possible researchers, if the NSA did influence NIST in
             | the process, history would suggest that X would be the
             | _least_ trustable of all proposals.
             | 
             | And that's the main argument I got from the article.
             | 
             | Yes, stone-walling a FOIA request may be common, but in the
             | case of NIST, there is ample precedent for malfeasance.
        
               | tptacek wrote:
               | Nobody should trust NIST.
               | 
               | I don't even support NIST's mission; even if you
               | assembled a trustworthy NIST, I would oppose it.
               | 
               | The logical problem with the argument Bernstein makes
               | about NSA picking the least trustworthy scheme is that it
               | applies to literally any scheme NIST picks. It's
               | unfalsifiable. If he believes it, his FOIA effort is a
               | waste of time (he cannot FOIA NSA's secret PQC attack
               | knowledge).
               | 
               | The funny thing here is, I actually do accept his logic,
               | perhaps even more than he does. I don't think there's any
               | reason to place more trust in NIST's PQC selections than
               | other well-reviewed competing proposals. I trust the peer
               | review of the competitors, but not NIST's process at all.
        
               | ckastner wrote:
               | > _The logical problem with the argument Bernstein makes
               | about NSA picking the least trustworthy scheme is that it
               | applies to literally any scheme NIST picks. It's
               | unfalsifiable._
               | 
               | That may be true in the strict sense, but in practice, I
               | think there would be a material distinction between a
               | NIST process of _"we defer our decision to the majority
               | opinion of a set of three researchers with unimpeachable
               | reputations"_ (a characterization from another comment)
               | and a process of _"NSA said we should pick X."_
               | 
               | In the strict sense, I can't trust either process, but in
               | practice [edit: as an absolute layperson who has to trust
               | _someone_], I'd trust the first process infinitely more
               | (as I would absolutely distrust the second process).
               | 
               | > _The funny thing here is, I actually do accept his
               | logic, perhaps even more than he does._
               | 
               | That's actually what I got from your other comments to
               | this story. But that confused me, because it was also
               | what I got from the article. The first two thirds of the
               | article are spent entirely on presenting NIST as an
               | untrustworthy body based on decades of history. Apart
               | from the title, PQC isn't even mentioned until the last
               | third, and that part, to me, was basically "NIST's claims
               | of reform are invalidated if it turns out that NSA
               | influenced the decision-making process again".
               | 
               | My vibe was that both of your positions are more or less
               | in agreement, though I have to say I didn't pick up on
               | any accusations of corruption of a PQC researcher in the
               | article (I attribute that to me being a layperson in the
               | matter).
        
             | [deleted]
        
             | Semaphor wrote:
             | > risible
             | 
             | just in case someone else never heard this word before:
             | 
             | > arousing or provoking laughter
        
             | jmprspret wrote:
             | I believe you have a very naive and trusting view of these
             | US governmental bodies. I don't intend that to be an
             | insult, but by now I think it's well established that
             | these agencies cannot be trusted (the NSA even less than
             | NIST).
        
               | daneel_w wrote:
               | I think it's naive and trusting only on the surface, but
               | with some clear intent and goal underneath. In the past
               | he has held a different stance, but it suddenly changed
               | some time after Matasano.
        
               | tptacek wrote:
               | Can I ask that, if you're going to accuse me of shilling
               | in an HN thread, you at least come up with something that
               | I'm shilling? I don't care what it is; you can say that
               | I'm shilling for Infowars Life ProstaGuard Prostate
               | Health Supplement with Saw Palmetto and Anti-Oxidant, for
               | all I care, just identify _something_.
        
               | daneel_w wrote:
        
               | [deleted]
        
               | lmeyerov wrote:
               | I'm not sure about corrupting NIST nor corrupting
               | individual officials of NIST, but I can easily imagine
               | NIST committees not understanding something, being
               | tricked, not looking closely, protecting big orgs by
               | default (without maliciousness), and overall being
               | sloppy.
               | 
               | Running standards without full transparency, in my
               | experience with web security and web GPU standards, is
               | almost always due to hiding weaknesses, incompetence,
               | security gaps of big players, & internal politics of
               | these powerful incumbents. Think some hardware vendor
               | not playing ball without a guarantee of privacy, or
               | some Google/Apple committee member dragging their feet
               | because of internal politics & monopoly plays.
               | Separately, mistakes may come from a standards
               | committee member glossing over stuff in emails because
               | they're busy: senior folks are the most technically
               | qualified yet also the most busy. Generally it's not
               | because some NSA/CIA employee is telling them to do
               | something sneaky or lying. Still FOIA-worthy (and why I
               | prefer public lists for standards), but for much lamer
               | reasons.
        
               | jmprspret wrote:
               | > ...but I can easily imagine NIST committees not
               | understanding something, being tricked, not looking
               | closely, protecting big orgs by default (without
               | maliciousness), and overall being sloppy.
               | 
               | I agree with this. And I think that this is more likely
               | to be the case. But I really think with all that we now
               | know about US governmental organisations the possibility
               | of backdoors or coercion should not be ruled out.
        
               | tptacek wrote:
               | Even when you're trying to be charitable, you're wildly
               | missing the point. I don't give a fuck about NIST or NSA.
               | I don't trust either of them and I don't even buy into
               | the premise of what NIST is supposed to be doing: I think
               | formal cryptographic standards are a force for evil. The
               | point isn't that NIST is trustworthy. The point is that
               | the PQC finalist teams are comprised of academic
               | cryptographers from around the world with unimpeachable
               | reputations, and it's ludicrous to suggest that NSA could
               | have compromised them.
               | 
               | The whole _point_ of the competition structure is that
               | you don't simply have to trust NIST; the competitors
               | (and cryptographers who aren't even entrants in the
               | contest) are peer reviewing each other, and NIST is
               | refereeing.
               | 
               | What Bernstein is counting on here is that his cheering
               | section doesn't know the names of any cryptographers
               | besides "djb", Bruce Schneier, and maybe, just maybe,
               | Joan Daemen. If they knew anything about who the PQC team
               | members were, they'd shoot milk out their nose at the
               | suggestion that NSA had suborned backdoors from them.
               | What's upsetting is that he knows this, and he knows you
               | don't know this, and he's exploiting that.
        
               | thayne wrote:
               | My reading wasn't that he thinks they built backdoors
               | into them, but that the NSA might be aware of weaknesses
               | in some of them, and be trying to promote the algorithms
               | they know how to break.
        
               | [deleted]
        
               | jmprspret wrote:
               | Thank you for actually explaining your POV. I don't
               | understand how you expected me or the other commenters to
               | gather this from your original comment.
               | 
               | If it's worth anything, you have changed my opinion on
               | this. You raise very good points.
        
               | tptacek wrote:
               | You're probably right about my original comment, and I
               | apologize. These threads are full of very impassioned,
               | very poorly-informed comments --- I'm not saying I'm
               | well-informed about NIST PQC, because I'm not, but, I
               | mean, just, wow --- and in circumstances like that I tend
               | to play my cards very close to my chest; it's just a
               | deeply ingrained message board habit of mine. I can see
               | how it'd be annoying.
               | 
               | I spent almost 2 decades as a Daniel Bernstein ultra-fan
               | --- he's a hometown hero, and also someone whose work was
               | extremely important to me professionally in the 1990s,
               | and, to me at least, he has always been kind and
               | cheerful; he even tried to give us some ideas for ECC
               | challenges for Cryptopals. I know what it's like to be in
               | the situation of (a) deeply admiring Bernstein and (b)
               | only really paying attention to one cryptographer in the
               | world (Bernstein).
               | 
               | But talk to a bunch of other cryptographers --- and,
               | also, learn about the work a lot of other cryptographers
               | are doing --- and you're going to hear stories. I'm not
               | going to say Bernstein has a bad reputation; for one
               | thing, I'm not qualified to say that, and for another I
               | don't think "bad" is the right word. So I'll put it this
               | way: Bernstein has a fucked up reputation in his field. I
               | am not at all happy to say that, but it's true.
        
               | gautamcgoel wrote:
               | Can you elaborate on his reputation?
        
               | tptacek wrote:
               | Based only on random conversations and no serious
               | interrogation of what happened, so take it for the very
               | little this pure statement of opinion is worth, I'd say
               | he has, chiefly, and in _my own words_, a reputation for
               | being a prickly drama queen.
               | 
               | He has never been that to me; I've had just a few
               | personal interactions with him, and they've been
               | uniformly positive. My feeling is that he was generous
               | with his time and expertise when I had questions, and
               | pleasant and welcoming in person.
               | 
               | He has, in the intervening years, done several things
               | that grossed me the fuck out, though. There are certainly
               | people who revel in hating the guy. I'm not one of them.
        
               | nicd wrote:
               | "I think formal cryptographic standards are a force for
               | evil."
               | 
               | May I ask what you view as the alternative? (No formal
               | cryptographic standard, or something else?)
        
               | tptacek wrote:
               | Peer review and "informal standards". Good examples of
               | things that were, until long after their widespread
               | adoption, informal standards include Curve25519, Salsa20
               | and ChaCha20, and Poly1305. A great example of an
               | informal standard that remains an informal standard
               | despite near-universal adoption is WireGuard. More things
               | like WireGuard. Less things like X.509.
        
               | lmeyerov wrote:
               | Both formal and informal peer review are why I like the
               | FOIA, and standards / competition discussion to be open
               | in general. I actually dislike closed peer review, or at
               | least without some sort of time-gated release.
               | 
               | Likely scenarios, and that closed review hides:
               | 
               | - Peer review happened... But was lame. Surprisingly
               | common, and often the typical case.
               | 
               | - If some discussion did come up on a likely attack...
               | What? Was the rebuttal and final discussion satisfactory?
               | 
               | It's interesting if some gov team found additional
               | things... But I'm less worried about that, they're
               | effectively just an 'extra' review committee. Though as
               | djb fears, a no-no if they ask to weaken something... And
               | hence another reason it's good for the history of the alg
               | to be public.
               | 
               | Edit: Now that storage and video are cheap, I can easily
               | imagine a shift to requiring all emails + meetings to be
               | fully published.
               | 
               | Edit: I can't reply for some reason, but having been an
               | academic reviewer, including for security, and having
               | won awards for best-of-year/decade academic papers, I
               | can say academic peer review may not be doing what most
               | people think: e.g., it is often more about novelty,
               | trends, and increments judged from a one-hour skim, or
               | catching only the super obvious things outsiders and
               | fresh researchers mess up on. Very different from, say,
               | a yearlong $1M dedicated pentest, which I doubt
               | happened. It's easy to tell which kind of
               | review happened when reading a report... Hence me liking
               | a call for openness here.
        
               | tptacek wrote:
               | You get that the most important "peer review" in the PQC
               | contest took the form of published academic research,
               | right? NIST doesn't even have the technical capability to
               | do the work we're talking about. My understanding is that
               | they refereed; they weren't the peer reviewers.
               | 
               |  _Replying to your edit_ I've been an academic peer
               | reviewer too. For all of its weaknesses, that kind of
               | peer review is the premise of the PQC contest --- indeed,
               | it's the premise of pretty much all of modern
               | cryptography.
        
               | Tepix wrote:
               | > _If they knew anything about who the PQC team members
               | were, they'd shoot milk out their nose at the suggestion
               | that NSA had suborned backdoors from them._
               | 
               | Please point to this suggestion.
        
               | tptacek wrote:
               | Reload the page, scroll to the top, and click the title,
               | which will take you to the blog post we're commenting on,
               | which makes the suggestion.
        
               | tptacek wrote:
               | I think you need to re-read my comment, because you have
               | not comprehended what I just wrote.
        
               | amluto wrote:
               | You said:
               | 
               | > the motivation behind those requests is risible.
               | 
               | It is quite hilarious that NIST suckered the industry
               | into actually using Dual EC, despite it being worse
               | than the other possible choices in nearly every
               | respect. And this ignores the fact that the backdoor
               | was publicly known for _years_. This actually happened;
               | it's not a joke.
               | 
               | The motivation behind the FOIA requests is to attempt to
               | see whether any funny business is going on with PQ
               | crypto.
               | 
               | If the NSA actually suckers any major commercial player
               | into using a broken PQ scheme without a well-established
               | classical scheme as a backup, that will be risible too.
        
               | woodruffw wrote:
               | Dual_EC keeps getting brought up, but I have to ask: does
               | anybody have any real evidence that it was widely
               | deployed? My recollection is that it basically didn't
               | appear anywhere outside of a handful of not-widely-used
               | FIPS-certified libraries, and wasn't even the default in
               | any of them except RSA's BSAFE.
               | 
               | The closest thing we have to evidence that Dual_EC was
               | exploited in the wild seems to be a bunch of
               | circumstantial evidence around its role in the OPM hack
               | which, if true, is much more of a "self own" than
               | anything else.
        
               | tptacek wrote:
               | It was widely deployed. NSA got it into BSAFE, which I
               | would have said "nobody uses BSAFE, it's not 1996
               | anymore", but it turned out a bunch of closed-source old-
               | school hardware products were using BSAFE. The most
               | notable BSAFE victims were Juniper/Netscreen.
               | 
               | Everybody who claimed Dual EC was a backdoor was right,
               | and that backdoor was materially relevant to our
               | industry. I couldn't believe something as dumb as Dual EC
               | was a real backdoor; it seemed like such idiotic
               | tradecraft. But the belief that Dual EC was so bad as
               | tradecraft that it couldn't be real was, apparently, part
               | of the tradecraft! Bernstein is right about that (even if
               | he came to the conclusion at basically the same time as
               | everyone else --- like, the instant you find out
               | Juniper/Netscreen is using Dual EC, the jig is up).
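               | 
               | For anyone wondering what the backdoor mechanically
               | looks like, here is a toy analogue (plain modular
               | exponentiation instead of the real elliptic-curve
               | points, and no output truncation): whoever holds the
               | trapdoor exponent can recover the generator's internal
               | state from a single output and predict everything that
               | follows.
               | 
               |   # Toy Dual-EC-style PRG; NOT the real design.
               |   p = (1 << 127) - 1      # a Mersenne prime
               |   Q = 5                   # public "point"
               |   d = 0xC0FFEE            # trapdoor exponent
               |   P = pow(Q, d, p)        # second public "point"
               | 
               |   def step(state):
               |       r = pow(P, state, p)   # ~ x(s*P)
               |       out = pow(Q, r, p)     # ~ x(r*Q), the output
               |       nxt = pow(P, r, p)     # ~ x(r*P), next state
               |       return out, nxt
               | 
               |   state = 123456789          # secret seed
               |   out1, state = step(state)  # attacker sees out1
               |   out2, _ = step(state)
               | 
               |   # Trapdoor: out1^d = (Q^d)^r = P^r = new state.
               |   recovered = pow(out1, d, p)
               |   assert recovered == state
               |   assert step(recovered)[0] == out2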
        
               | woodruffw wrote:
               | Thanks. I suppose I live a charmed life for thinking that
               | nobody was using BSAFE :-)
        
               | tptacek wrote:
               | I promise you, I know the feeling.
        
               | hovav wrote:
               | I don't think Juniper used BSAFE in ScreenOS -- they seem
               | to have put together their own Dual EC implementation on
               | top of OpenSSL, sometime around 2008. (This doesn't
               | change your point, of course.)
        
               | tptacek wrote:
               | Yeah, I think you're right; the Juniper revelation also
               | happened months after the BULLRUN stuff --- I remember
               | being upset about how Greenwald and his crew had hidden
               | all the Snowden docs in a SCIF to "carefully review
               | them", with the net result that we went many months
               | without knowing that one of the most popular VPN
               | appliances was backdoored.
        
               | petre wrote:
               | Not Dual EC, but ECDSA is used (by law) in EU smart
               | tachograph systems for signing data.
        
               | tptacek wrote:
               | ECDSA is almost universally used. It's deeply suboptimal
               | in a variety of ways. But that's because it was designed
               | in the 1990s, not because it's backdoored. This isn't a
               | new line of argumentation for Bernstein; he has also
               | implied that AES is Rijndael specifically because it was
               | so commonly implemented with secret-dependent lookups
               | (S-boxes, in the parlance); he's counting on a lay
               | audience not knowing the distinction between an
               | engineering principle mostly unknown at the time
               | something was designed, and a literal backdoor.
               | 
               | What's annoying is that he's usually right, and sometimes
               | even right in important new ways. But he runs the ball
               | way past the end zone. Almost everybody in the field
               | agrees with the core things he's saying, but almost
               | nobody wants to get on board with his wild-eyed theories
               | of how the suboptimal status quo is actually a product of
               | the Lizard People.
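               | 
               | To make the engineering issue concrete (a generic
               | sketch, not AES itself): a table lookup indexed by a
               | secret byte leaks through which memory it touches,
               | while a constant-time variant scans the whole table and
               | masks out everything except the wanted entry. (Python
               | is not actually constant-time; this just shows the
               | pattern.)
               | 
               |   import secrets
               | 
               |   SBOX = list(range(256))   # stand-in table, not AES
               | 
               |   def lookup_leaky(secret_byte):
               |       # Access pattern depends on the secret index,
               |       # so cache timing can leak it.
               |       return SBOX[secret_byte]
               | 
               |   def lookup_ct(secret_byte):
               |       # Touch every entry; keep only the match.
               |       result = 0
               |       for i, v in enumerate(SBOX):
               |           mask = -(i == secret_byte) & 0xFF
               |           result |= v & mask
               |       return result
               | 
               |   b = secrets.randbelow(256)
               |   assert lookup_leaky(b) == lookup_ct(b)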
        
               | mrderp wrote:
               | Is he claiming that it is a literal backdoor though?
               | Couldn't Bernstein have a point that the NIST picked
               | Rijndael as the winner of the AES competition because the
               | way it was usually implemented was susceptible to timing
               | attacks? Even if the engineering principle was mostly
               | unknown at the time, one might guess that e.g. NSA was
               | aware of it and may have provided some helpful feedback.
        
               | oefrha wrote:
               | > he's counting on a lay audience not knowing the
               | distinction between an engineering principle mostly
               | unknown at the time something was designed, and a literal
               | backdoor.
               | 
               | When you discount his theories with that argument, your
               | own reductio ad Lizardum (?) doesn't help. There's a
               | world of distinction between NSA inserting backdoors, for
               | which there's good evidence but maybe not every time, and
               | whatever you're trying to paint his theory as by invoking
               | the Lizard People.
        
               | tptacek wrote:
               | You haven't explained how my argument discounts his
               | theories. You're just unhappy that I used the term
               | "Lizard People". Ok: I retract "Lizard People". Where
               | does that leave your argument?
        
               | [deleted]
        
               | petre wrote:
               | I don't care about his theories. What matters that US
               | export controls on encryption were reduced due to his
               | previous lawsuit and he has offered alternative
               | encryption in the public domain.
        
               | smegsicle wrote:
               | > I believe the implication that NIST or NSA somehow
               | bribed one of the PQC researchers to weaken a submission
               | is risible.
               | 
               | maybe you don't know what risible means, but it reads
               | like you're saying that the NSA "somehow" coercing
               | someone is unlikely, which i'm sure you can agree is a
               | "very naive and trusting view"
        
               | escape_goat wrote:
               | Maybe he does know what risible means and is in fact
               | extremely well informed, much better informed than you
               | are, to the point where offering sarcasm on the apparent
               | basis of absolutely nothing but what you've learnt from
               | the internet is actually not a valuable contribution to
               | the conversation but instead embarrassing. Have you
               | considered this possibility as well?
        
               | tptacek wrote:
               | No part of what I said had anything to do with what NSA
               | would or wouldn't attempt to do.
               | 
               | If you don't understand what I wrote, ask questions. What
               | you did instead was leap to stupid conclusions.
        
               | jona-f wrote:
               | You used obscure language to make yourself look smart and
               | deal with the resulting confusion by calling people
               | stupid instead of clarifying what was said. Please get
               | your ego in order.
        
               | dcow wrote:
               | He said bribed, which quite explicitly means _payment
               | made to a person in a position of trust to corrupt his
               | judgment_. Coerced is not bribed. Period.
        
               | smegsicle wrote:
               | a risible distinction - a cursory reading of the article
               | will reveal that bribery was only brought forth as an
               | _example_ of coercion
        
               | tptacek wrote:
               | It's a fun word, right? "Risible"? I chose it carefully,
               | though.
        
               | carapace wrote:
               | Michael Palin in Monty Python's Life of Brian, "Do you
               | find it... risible?"
               | 
               | https://youtu.be/kx_G2a2hL6U?t=177
               | 
               | (I don't have anything constructive to add to the
               | conversation. ¯\\_(ツ)_/¯ )
        
               | dcow wrote:
               | Where?
               | 
               | If you RTFA you'd know it pertains to bribery, not
               | coercion.
        
               | throwaway654329 wrote:
               | To quote the article:
               | 
               |  _At the risk of belaboring the obvious: An attacker
               | won't have to say "Oops, researcher X is working in
               | public and has just found an attack; can we suppress
               | this somehow?" if the attacker had the common sense to
               | hire X
               | years earlier, meaning that X isn't working in public.
               | People arguing that there can't be sabotage because
               | submission teams can't be bribed are completely missing
               | the point._
               | 
               | He goes on to say: _I coined the phrase "post-quantum
               | cryptography" in 2003. It's not hard to imagine that the
               | NSA/IDA post-quantum attack team was already hard at work
               | before that, that they're years ahead of the public in
               | finding attacks, and that NSA has been pushing NISTPQC to
               | select algorithms that NSA secretly knows how to break._
               | 
               | Does this seem unreasonable, and if so, why?
               | 
               | He also remarks: _Could such a weakness also be exploited
               | by other large-scale attackers? Best bet is that the
               | answer is yes. Would this possibility stop NSA from
               | pushing for the weakness? Of course not._
               | 
               | Doesn't sound to me like he only has concerns about
               | bribery. Corruption of the standards to NSA's benefit is
               | one overarching issue. It's not the only one, he has
               | concerns about non-American capabilities as well.
               | 
               | There are many methods for the NSA to achieve a win.
               | 
               | Ridiculing people for worrying about this is totally lame
               | and is harmful to the community.
               | 
               | To suggest that a few dozen humans are beyond the reach
               | of the most powerful adversaries to ever exist is
               | extremely naive at best. However, that literally isn't
               | even a core point, as Bernstein notes clearly.
        
               | dcow wrote:
               | FFS nobody is saying that the general idea of being
               | skeptical is unreasonable. And nobody is being ridiculed
               | for doing such. This subthread is about the contents of
               | tptacek's comment, which _doesn't do_ what you are
               | saying. Saying DJB's claims are inconceivable _is the_
               | mischaracterization. People are very eager to paint a
               | picture nobody intended so they can say something and be
               | right.
               | 
               | I use djb's crypto. Everybody knows his speculation.
               | Everybody knows why he's pursuing more information.
               | Nobody disagrees more information would be a public good.
               | Some people are more skeptical than others that he'll
               | find anything substantial.
        
               | throwaway654329 wrote:
               | You said this up thread and I find it incorrect:
               | 
               | > If you RTFA you'd know it pertains to bribery, not
               | coercion
               | 
               | By _quoting the article_ it seems the text directly
               | contradicts your summary as being too narrow. General
               | coercion is also included as part of the concerns
               | raised by _TFA_. He isn't just talking about NSA giving a
               | person a sack of money.
               | 
               | Meanwhile in this thread and on Twitter, many people are
               | indeed doing the things you say that _nobody is doing_.
               | 
               | We almost all use Bernstein's crypto -- some as mere
               | users, others as developers, etc. I'm not sure what that
               | brings to the discussion.
               | 
               | I'm glad we agree that his work to gather more
               | information is a public good.
        
               | alariccole wrote:
               | The person is saying one thing then denying saying that
               | thing and being a jerk about it. Either a bot or someone
               | with a broken thesaurus. Glad you pointed it out because
               | it's ridiculous/risible.
        
               | kbenson wrote:
               | That person is very well known in this community, and in
               | other communities as well.
               | 
               | They are also known for making very specific arguments
               | that people misinterpret and fight over, but the actual
               | intent and literal meaning of the statements is most
               | often correct (IMO).
               | 
                | Whether this is a byproduct of exacting language that
                | tends to cause people interpretive problems, or a
                | deliberate tactic to expose those who are careless in
                | their reading and willing to make assumptions rather
                | than ask questions, is unknown to me. Either way, it
                | doesn't change how it tends to play out, from my
                | perspective.
               | 
               | In this case, I'll throw you a bone and restate his
               | position as I understand it.
               | 
                | NIST ran the competition in question in a way such that
                | all the judges refereed each other, and all are very
                | well known in the cryptographic field. The suggestion
                | that they could be bribed in this manner (note: not that
                | the NSA would never attempt it, but the implication that
                | it would _succeed_ with the people in question) is
                | extremely unlikely, and for DJB to suggest as much,
                | knowing his fame may matter to people more than the
                | facts of who these people are, is problematic.
        
               | MauranKilom wrote:
               | If that is the case, then what is the explanation for
               | NIST (according to DJB) 1. not communicating their
               | decision process to anywhere near the degree that they
               | vowed to, and 2. stone-walling a FOIA request on the
               | matter?
               | 
               | > Whether this is a byproduct of trying to be exacting in
               | the language used that tends to cause people interpretive
               | problems or a specific tactic to expose those that are a
               | combination of careless with their reading and willing to
               | make assumptions rather than ask questions is unknown to
               | me
               | 
               | Communicating badly and then acting smug when
               | misunderstood is not cleverness (https://xkcd.com/169/).
               | 
               | If many people do not understand the argument being made,
               | it doesn't matter how "exacting" the language is - the
               | writer failed at communicating. I don't have a stake in
               | this, but from afar this thread looks like tptacek making
               | statements so terse as to be vague, and then going
               | "Gotcha! That's not the right interpretation!" when
               | somebody attempts to find some meaning in them.
               | 
               | In short: If standard advice is "you should ask questions
               | to understand my point", you're doing it wrong. This
               | isn't "HN gathers to tease wisdom out of tptacek" - it's
                | on him to be understood by the readers (almost all of
                | whom are lurkers!). Unless he doesn't care about that,
               | but only about shouting (what he thinks are) logically
               | consistent statements into the void.
        
               | kbenson wrote:
               | > If that is the case, then what is the explanation for
               | NIST (according to DJB) 1. not communicating their
               | decision process to anywhere near the degree that they
               | vowed to, and 2. stone-walling a FOIA request on the
               | matter?
               | 
               | Why are you asking me, when I was clear I was just
               | stating my interpretation of his position, and he had
               | already replied to me with even more clarification to his
               | position?
               | 
               | > Communicating badly and then acting smug when
               | misunderstood is not cleverness
               | 
               | I don't disagree. My observations should not be taken as
               | endorsement for a specific type of behavior, if that's
               | indeed what is being done.
               | 
               | That said, while I may dislike how the conversation plays
               | out, I can't ignore that very often he has an intricate
                | and well thought out position that is expressed
               | succinctly, and in the few cases where someone treats the
               | conversation with respect and asks clarifying questions
               | rather than makes assumptions the conversation is clear
               | and understanding is quickly reached between most
               | parties.
               | 
               | I'm hesitant to lay the blame all on one side when the
               | other side is the one jumping to conclusions and then
               | refusing to accept their mistake when it's pointed out.
        
               | tptacek wrote:
               | The explanation for the FOIA process is that public
               | bodies routinely get intransigent about FOIA requests and
               | violate the statutes. Read upthread: I have worked with
               | Bernstein's FOIA attorneys before. Like everyone else, I
               | support the suit, even as I think it's deeply silly for
               | Bernstein to equate it to _Bernstein v US_.
               | 
               | If you made me guess about why NIST denied his FOIA
               | requests, I'd say that Bernstein probably royally pissed
               | everyone at NIST off before he made those requests, and
               | they denied them because they decided the requests were
               | being made in bad faith.
               | 
               | But they don't get to do that, so they're going to be
               | forced to give up the documents. I'm sure when that
               | happens Bernstein will paint it as an enormous legal
               | victory, but the fact is that these outcomes are
               | absolutely routine.
               | 
               | When we were FOIA'ing the Police General Orders for all
               | the suburbs of Chicago, my own municipality declined to
               | release theirs. I'd already been working with Topic on a
               | (much more important) FOIA case from a friend of mine, so
               | I reached out asking for him to write a nastygram for me.
               | The nastygram cost me money --- but he told me having him
               | sue would not! It was literally cheaper for me to have
               | him sue my town than to have him write a letter, because
               | FOIA suits have fee recovery terms.
               | 
               | I really can't emphasize enough how much suing a public
               | body to force compliance with FOIA is just a normal part
               | of the process. It sucks! But it's utterly routine.
        
               | tptacek wrote:
               | I'm not sure I'd use the same words, but yeah, the
               | argument I'm refusing to dignify is that NSA _could have
               | been successful_ at bribing a member of one of the PQC
               | teams. Like, what is that bribed person going to do? Look
                | at the teams; they're ridiculously big. It doesn't even
               | make sense. Again: part of my dismissiveness comes from
               | how clear it is that Bernstein is counting on his
               | cheering section not knowing any of this, even though
               | it's a couple of Google searches away.
        
               | throwaway654329 wrote:
               | One trivial example implied by the blog post: Such
               | corruption could be involved in the non-transparent
               | decision making process at NIST.
               | 
               | Regarding Dual_EC: we still lack a lot of information
               | about _how_ this decision was made internally at NIST.
               | That's a core point: transparency was promised in the
               | wake of discovered sabotage and it hasn't arrived.
        
               | tptacek wrote:
               | What do you mean, "how" the decision about Dual EC was
               | made? It's an NSA-designed backdoor. NIST standardized it
               | because NSA told them to. I'm sure NSA told NIST a story
               | about why it was important to standardize it. The
               | Kremlinology isn't interesting: it is NSA's chartered job
               | to break cryptography, and nobody should ever trust them;
               | the only thing NSA can do to improve cryptography is to
               | literally publish secret attacks, and they're not going
               | to do that.
        
               | throwaway654329 wrote:
                | What do I mean? Iran-Contra-, Watergate-, or 9/11
                | Commission-style levels of investigation. Given how
               | widely read the BULLRUN stories were, it's not credible
               | to suggest the details aren't important.
               | 
               | The American people deserve to know who picked up the
               | phone or held a meeting to make this happen. Who was
               | present, who at NIST knew what, and so on. Who internally
               | had objections and indeed who set the policy in the first
               | place. What whistleblower protections were in place and
               | why didn't the IG have involvement in public? Why did we
               | have to learn about this from Snowden?
               | 
               | NSA has a dual mandate, on that I hope we can agree. It's
               | my understanding that part of their job is to secure
               | things and that part of their job is to break stuff.
               | 
                | NIST has no such dual mandate; heads should roll at NIST.
               | We probably agree that NSA probably won't be accountable
               | in any meaningful sense, but NIST _must_ be - we are
               | stuck with them. Not trusting them isn't an option for
               | anyone who files their taxes or banks or does any number
               | of other regulated activities that require using NIST
               | standards.
        
               | dcow wrote:
               | Nowhere does the comment say _that the NSA "somehow"
                | coercing someone is unlikely_. Hence, it's a fair
                | question whether the comment has been comprehended,
                | because it seems it hasn't been in this thread. If
                | comprehension begets intelligence, then conclusions born
                | from misunderstanding exude stupidity.
               | 
               | And, dropping the pedantry, it's quite frustrating to be
               | deliberately or casually or in whatever way
               | misrepresented by drive-by commenters in an otherwise apt
               | discussion thread. Your comment and the one tptacek
               | responded to are patronizing and dismissive and really
               | don't contribute to any interesting discourse on the
               | topic. I think it's fair to dismiss _stupid drive-by low-
                | effort quips_, personally.
        
             | jeffparsons wrote:
             | > I believe the implication that NIST or NSA somehow bribed
             | one of the PQC researchers to weaken a submission is
             | risible.
             | 
             | Is that even a claim here? I'm on mobile right now so it's
             | a bit hard for me to trawl through the DJB/NIST dialogue,
             | but I thought his main complaint is that NIST didn't appear
             | to have a proper and clear process for choosing the
             | algorithms they did, when arguably better algorithms were
             | available.
             | 
             | So the suggestion wouldn't necessarily be that one of the
             | respected contestants was bribed or otherwise compromised,
             | but rather that NIST may have been tapped on the shoulder
             | by NSA (again) with the suggestion that they should pick a
             | specific algorithm, and that NSA would make the suggestion
             | they have because their own cryptographers ("true
             | believers" on NSA payroll) have discovered flaws in those
             | suggested algorithms that they believe NSA can exploit but
             | hopefully not adversaries can exploit.
             | 
             | There's no need for any novel conspiracies or corruption;
             | merely an exact repeat of previous NSA/NIST behaviour
             | consistent with NSA policy positions.
             | 
             | It's simultaneously about as banal as it gets, and deeply
             | troubling because of that.
        
               | tptacek wrote:
               | It is indeed a claim here; in fact, it's probably the
                | principal claim.
        
               | _notreallyme_ wrote:
                | The actual claim is that NSA may have already spent a
                | lot of time and effort analysing the underlying problems
                | of PQC algorithms without making their findings public.
               | 
               | DJB seems to suspect that they may influence NIST to
               | select algorithms and parameters within the range of what
               | they already know how to break.
        
               | tptacek wrote:
               | Huh? _Of course_ NSA spent a lot of time and effort
               | analyzing algorithms without making their findings
               | public. That is their literal job. The peer review NIST
               | is refereeing happened in the open. When people broke
                | SIDH, they didn't whisper it in anyone's ear: they
                | published a paper. That's how this stuff works. Bernstein
               | doesn't have a paper to show you; all he has is innuendo.
               | How you know his argument is as limp as a cooked
               | spaghetti noodle is that he actually stoops to suggesting
               | that NSA might have bribed one of the members of the PQC
               | teams.
               | 
               | If he had something real to say, he wouldn't have
               | embarrassed himself like that. How I think I know that
               | is, I think any reasonable person would go way out of
               | their way to avoid such an embarrassing claim, absent
               | extraordinary evidence, of which he's presented none.
        
               | armitron wrote:
               | Your animosity towards DJB is meritless given that he is
                | a subject matter expert, and your record in this domain,
                | compared to his, is abysmal if not entirely irrelevant.
               | 
               | Moreover, given your history of ridiculous arguments
               | -defending corrupt government mechanisms- that have
               | collapsed with the passage of time (e.g. Assange
               | extradition) I wonder why anyone would place any value in
               | anything you say.
        
               | tptacek wrote:
               | I'm pretty comfortable with the people who do and don't
               | take me seriously.
        
               | daneel_w wrote:
        
               | cycomanic wrote:
               | It is very hard to not take this comment as being made in
               | bad faith. You either are willfully ignorant or have
               | ulterior motives.
               | 
               | It is a matter of public record (as also detailed again
               | in the article) that the NSA colluded with NIST to get
                | weakened crypto standardised. This happened not just
                | once but multiple times, and when weaknesses became
                | known they repeatedly (and against better knowledge)
                | downplayed the impact. This is undisputed. After the
                | Dual EC scandal they promised that they would be more
                | transparent in the future. DJB alleges that important
                | information is missing on the decision-making process in
                | the most recent PQC discussion (I am willing to trust
                | him on that, but if you are an expert in the field I'm
                | sure you can elaborate here on why it is incorrect).
                | That's why he filed an FOIA request, which has not been
                | answered and which he is now filing a lawsuit over.
               | 
               | I would argue based on past behaviour we should trust DJB
               | much more than either the NSA or NIST, but it seems you
                | are more occupied with unsubstantiated attacks on his
                | person than with getting to the truth.
        
               | vasco wrote:
               | > he actually stoops to suggesting that NSA might have
               | bribed one of the members of the PQC teams
               | 
               | I don't know anyone in the teams to judge their moral
               | fiber, but I'm 100% sure the NSA is not above what is
               | suggested and your weird outrage at the suggestion seems
               | surprising knowing what is public knowledge about how the
               | NSA operates.
               | 
               | There are arguments here about NSA pressure on NIST. You
               | miss the point because apparently you're offended that
               | someone suggested your friends can be bribed. I mean,
               | maybe they can't, but this is about the NSA being
               | corrupt, not the researchers.
        
               | throwaway654329 wrote:
               | It _can_ be everybody involved. It should include NIST
               | based on the history alone.
               | 
               | Some of the commentary on this topic is by people who
               | also denied DUAL_EC until (correctly) conceding that it
               | was actually a backdoor, actually deployed, and that it
               | is embarrassing for both NSA and NIST.
               | 
                | This sometimes looks like reactionary denialism. It's a
                | safe position that forces others to do a lot of work; it
                | seems good faith coming from some people and not so much
                | from others.
        
               | tptacek wrote:
               | I'm people who denied that Dual EC was a backdoor (my
               | position wasn't an unusual one; it was that Dual EC was
               | too stupid to actually use, which made it an unlikely
               | backdoor). Dan Bernstein didn't educate me about that;
               | like anybody else who held that position, the moment I
               | learned that real products in the industry were built
               | with libraries that defaulted to Dual EC, the jig was up.
               | 
               | I'm honest about what I'm saying and what I've said. You
               | are not meeting the same bar. For instance, here you're
               | insinuating that my problem on this thread is that I
               | think NIST is good, or trustworthy, or that NSA would
               | never have the audacity to try to bribe anybody. Of
               | course, none of that is true.
               | 
               | I don't know how seriously you expect anybody to take
                | you. You wrote a 13-paragraph comment on this thread
                | based on Filippo's use of an "It's Always Sunny In
                | Philadelphia" meme, saying it was a parody of "A
                | Beautiful Mind",
               | which is about John Nash, who was mentally ill, and also
               | an anti-semite, ergo Filippo Valsorda is an anti-semite
               | who punches down at the mentally ill. It's right there
               | for everybody to read.
        
               | monocasa wrote:
               | I guess I'm not reading it that way. In fact, a FOIA
               | request is going after official records, which I wouldn't
               | expect would contain outright bribery.
               | 
                | Yes, DJB brings up their known bribing of RSA wrt the
                | whole Dual-EC thing. But my read of that bit of info was
               | the more general 'here's evidence that the NSA actively
               | commits funding towards infecting standards' rather than
               | 'the NSA's playbook just contains outright bribery and
               | that's what we expect to find in the FOIA requests given
               | to NIST'.
        
               | tptacek wrote:
               | The FOIA issue is 100% legitimate. NIST is required to
               | comply with FOIA.
        
               | what-imright wrote:
                | You clearly don't get it. They're playing dirty. At best
               | the FOIA will receive a document made on the fly with
               | nothing of value. The rules don't apply to the NSA. You
               | can do exactly nothing. But NIST, you can do something
               | about - reject any standard they approve. It's your
               | choice what algorithm you use, and we know NIST will
               | select a broken algorithm for the NSA, so just ignore
               | their 'standard'. The best solution is using layers of
               | crypto, trusting no single algorithm.
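                | 
                | As a sketch of what "layers of crypto" could look like
                | (Python, assuming the third-party `cryptography`
                | package; this illustrates the idea only, it is not a
                | vetted design):
                | 
                |     import os
                |     from cryptography.hazmat.primitives.ciphers.aead import (
                |         AESGCM, ChaCha20Poly1305)
                | 
                |     def layered_encrypt(msg):
                |         # Two independent keys and nonces, one per layer.
                |         k1 = AESGCM.generate_key(bit_length=256)
                |         k2 = ChaCha20Poly1305.generate_key()
                |         n1, n2 = os.urandom(12), os.urandom(12)
                |         inner = AESGCM(k1).encrypt(n1, msg, None)
                |         outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)
                |         return (k1, n1, k2, n2), outer
                | 
                |     def layered_decrypt(keys, ct):
                |         # Peel the layers in reverse order.
                |         k1, n1, k2, n2 = keys
                |         inner = ChaCha20Poly1305(k2).decrypt(n2, ct, None)
                |         return AESGCM(k1).decrypt(n1, inner, None)
                | 
                |     keys, ct = layered_encrypt(b"layered example")
                |     assert layered_decrypt(keys, ct) == b"layered example"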
        
               | tptacek wrote:
               | You should tell Bernstein that! Your logic implies he's
               | wasting his time with the suit.
        
               | simsla wrote:
               | "You shouldn't fight because the baddies are strong!" is
               | a horrible argument in my book. Discouraging and
               | disparaging other people's attempts is even worse.
        
         | api wrote:
         | I don't think it's a bad thing to push back and demand
         | transparency. At the very least the pressure helps keep NIST
         | honest. Keep reminding them over and over and over again about
         | dual-EC and they're less likely to try stupid stuff like that
         | again.
        
           | xt00 wrote:
            | Speaking of dual-EC -- it does seem like 2 questions are
            | often debated, though it can't be neglected that some of the
            | vocal debaters may be NSA shills:
           | 
           | 1. does the use of standards actually help people, or make it
           | easier for the NSA to determine which encryption method was
           | used?
           | 
           | 2. are there encryption methods that actually do not suffer
           | from reductions in randomness or entropy etc when just simply
           | running the algorithm on the encrypted output multiple times?
           | 
            | It seems that these questions often have piles of people ready
           | to jump in saying "oh, don't roll your own encryption, ooh
           | scary... fear uncertainty doubt... and oh whatever you do,
           | don't encrypt something 3X that will probably make it easier
           | to decrypt!!" .. but it would be great if some neutral 3rd
           | party could basically say, ok here is an algorithm that is
           | ridiculously hard to break, and you can crank up the number
           | of bits to a super crazy number.. and then also you can run
           | the encryption N times and just not knowing the number of
           | times it was encrypted would dramatically increase the
           | complexity of decryption... but yea how many minutes before
           | somebody jumps in saying -- yea, don't do that, make sure you
           | encrypt with a well known algorithm exactly once.. "trust
           | me"...
        
             | logifail wrote:
             | > some neutral 3rd party
             | 
             | Unfortunately, this would appear to be the bit we've not
             | yet solved, nor are we likely to.
        
             | tptacek wrote:
             | 1. Formal, centralized crypto standards, be they NIST or
             | IETF, are a force for evil.
             | 
             | 2. All else equal, fewer dependencies on randomness are
             | better. But all else is not equal, and you can easily lose
             | security by adding determinism to designs willy-nilly in an
             | effort to minimize randomness dependencies.
             | 
              | Nothing is, any time in the conceivable future, going to
              | change to make a broken RNG not game-over. So the
              | important thing
             | remains ensuring that there's a sound design for your RNG.
             | 
             | None of our problems have anything to do with how "much"
             | you encrypt something, or with "cranking up the number of
             | bits". That should be good news for you; generally, you can
             | run ChaPoly or AES-CTR and trust that a direct attack on
             | the cipher isn't going to be an issue for you. Most of our
             | problems are in the joinery, not the beams themselves.
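              | 
              | For instance (a minimal sketch in Python, assuming the
              | third-party `cryptography` package; the key handling and
              | message here are purely illustrative), an AEAD like
              | ChaCha20-Poly1305 is the easy part:
              | 
              |     import os
              |     from cryptography.hazmat.primitives.ciphers.aead import (
              |         ChaCha20Poly1305)
              | 
              |     key = ChaCha20Poly1305.generate_key()  # 256-bit key
              |     aead = ChaCha20Poly1305(key)
              | 
              |     # The "joinery" is things like this nonce: it must
              |     # never repeat for a given key, or security collapses.
              |     nonce = os.urandom(12)
              |     ct = aead.encrypt(nonce, b"attack at dawn", b"header")
              |     pt = aead.decrypt(nonce, ct, b"header")
              |     assert pt == b"attack at dawn"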
        
             | [deleted]
        
             | Thorrez wrote:
             | >2. are there encryption methods that actually do not
             | suffer from reductions in randomness or entropy etc when
             | just simply running the algorithm on the encrypted output
             | multiple times?
             | 
             | I think all block ciphers (e.g. AES) meet that definition.
             | For AES, for a specific key, there's a 1-to-1 mapping of
             | plaintexts to ciphertexts. It's impossible that running a
             | plaintext through AES produces a ciphertext with less
             | entropy, because if the ciphertext had less entropy, it
             | would be impossible to decrypt to get back the plaintext,
             | but AES always allows decryption.
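              | 
              | A quick way to see this concretely (a minimal sketch in
              | Python, assuming the third-party `cryptography` package is
              | installed; the key and block here are just illustrative):
              | 
              |     import os
              |     from cryptography.hazmat.primitives.ciphers import (
              |         Cipher, algorithms, modes)
              | 
              |     key = os.urandom(32)    # AES-256 key
              |     block = os.urandom(16)  # one 16-byte block
              | 
              |     def enc(data):
              |         c = Cipher(algorithms.AES(key), modes.ECB())
              |         e = c.encryptor()
              |         return e.update(data) + e.finalize()
              | 
              |     def dec(data):
              |         c = Cipher(algorithms.AES(key), modes.ECB())
              |         d = c.decryptor()
              |         return d.update(data) + d.finalize()
              | 
              |     # Encrypting twice and decrypting twice recovers the
              |     # block exactly: the permutation is invertible, so no
              |     # entropy is lost however many times you apply it.
              |     assert dec(dec(enc(enc(block)))) == block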
        
             | MauranKilom wrote:
             | > are there encryption methods that actually do not suffer
             | from reductions in randomness or entropy etc when just
             | simply running the algorithm on the encrypted output
             | multiple times?
             | 
             | Unless you can prove that all e.g. 2^256 possible 256 bit
             | inputs map to 2^256 different 256 bit outputs (for every
             | key, in the case of encryption), then chances are you lose
             | strength with every application because multiple inputs map
             | to the same output (and consequently some outputs are not
             | reachable).
        
               | comex wrote:
               | For encryption, as opposed to hashing, you can't have
               | multiple inputs map to the same output, because then you
               | wouldn't be able to decrypt the output.
        
               | adgjlsfhk1 wrote:
               | it's very easy to prove that all encryption functions are
               | 1 to 1. Otherwise, you couldn't decrypt the data.
        
           | tptacek wrote:
           | Transparency is good, and, as Bernstein's attorneys will ably
           | establish, not optional.
        
             | ddingus wrote:
             | It's as optional as the people can be convinced to not
             | worry about it.
        
         | stefantalpalaru wrote:
        
         | theknocker wrote:
        
         | encryptluks2 wrote:
         | I have no doubt that they are great at their job, but when it
         | comes to lawsuits the judge(s) are equally as important. You
         | could get everything right but a judge has extreme power to
         | interpret the law or even ignore it in select cases.
        
           | NolF wrote:
           | I wouldn't say they ignore the law, but legislation like FOIA
           | has a lot of discretion to balance competing interests and
              | that's where a judge would make the most difference despite
           | all the great articulations of the most brilliant lawyers.
        
             | tptacek wrote:
             | There are very few public bodies that do a solid, to-the-
             | letter job of complying with their open records
             | requirements. Almost all FOIA failings are due to the fact
             | that it isn't staffed adequately; FOIA officers, clerks,
             | and records attorneys are all overworked. When you do a
             | bunch of FOIA stuff, you get a feel for what's going on
             | with the other side, and you build a lot of empathy (which
             | is helpful in getting your data over the long run).
             | 
             | And then other times you run into bloody-mindedness, or
             | worse.
             | 
             | I don't think NIST has many excuses here. It looks like
             | they botched this straightforwardly.
             | 
             | It's a straightforward case. My bet is that they'll lose
             | it. The documents will get delivered. That'll be the end of
             | it.
        
       | taliesinb wrote:
       | Why is the submission URL using http instead of https? That just
       | seems... bizarre.
        
         | CharlesW wrote:
         | https://blog.cr.yp.to/20220805-nsa.html works too.
        
         | sdwr wrote:
         | Cryptography experts know when to care about security.
         | Cryptography enthusiasts try to slap encryption on everything.
        
         | [deleted]
        
         | effie wrote:
          | Why? Http is simpler, less fragile, and not dependent on the
          | good will of third parties; the content is public; and proving
          | authenticity of text on the Internet is always hard, even when
          | served via the https scheme. I bet Bernstein thinks there is
          | little point in forcing people to use https to read his page.
        
           | z9znz wrote:
           | MITM could change what the client receives, right?
        
             | effie wrote:
             | Yes. But if you worry about being a target for MITM
             | attacks, https alone does not fix that problem. You need
             | some reliable verification mechanism that is hard to fool.
             | The current CA system or "trust on first use" are only
             | partial, imperfect mechanisms.
        
           | oittaa wrote:
           | That's just wrong on so many levels. Troy Hunt has an
           | excellent explanation: https://www.troyhunt.com/heres-why-
           | your-static-website-needs...
        
             | effie wrote:
             | Troy Hunt points out that HTTP traffic is sometimes MITMed
             | in a way that clients and servers do not like, and HTTPS
             | sometimes prevents that. I never said otherwise. I am
              | saying that for certain kinds of pages it's not a major
              | concern. Like for djb's website.
             | 
             | Why not use HTTPS for everything? Because it also has
             | costs, not just benefits.
        
         | msk20 wrote:
          | Just FYI, on my Firefox it's saying "Connection Secure
          | (upgraded to https)"; it's actually using ECDHE CHACHA20
          | SHA256.
         | 
         | Note: I have "Enable HTTPS-Only Mode in all windows" on by
         | default.
        
       | ForHackernews wrote:
        | Maybe this is too much tinfoil hattery, but are we _sure_ DJB
        | isn't a government asset? He'd be the perfect deep-cover agent.
        
         | temptemptemp111 wrote:
        
         | rethinkpad wrote:
         | Though 99% of the time I would agree with you, the public has
         | to have faith in people who claim to be fighting (with
         | previously noted successes in Bernstein v. US) in our best
         | interests.
        
         | throwaway654329 wrote:
         | Please don't do the JTRIG thing. Dan is a national treasure and
         | we would be lucky to have more people like him fighting for all
         | of us.
         | 
         | Between the two, material evidence shows that NIST is the deep-
         | cover agent sabotaging our cryptography.
        
           | temptemptemp111 wrote:
        
       | crabbygrabby wrote:
       | Seems like a baaad idea lol.
        
         | yieldcrv wrote:
         | seems like they just need a judge to force the NSA to comply
          | with a Freedom of Information Act request, it's just part of the
         | process
         | 
         | I'm stonewalled on an equivalent Public Record Act request w/ a
         | state, and am kind of annoyed that I have to use the state's
         | court system
         | 
          | Doesn't feel super impartial, and a couple of law journals
          | have written about how it's not impartial at all in this state
          | and should be improved by the legislature
        
           | throwaway654329 wrote:
           | This is part of a class division where we cannot practically
           | exercise our rights which are clearly enumerated in public
           | law. Only people with money or connections can even attempt
           | to get many kinds of records.
           | 
           | It's wrong and government employees involved should be fired,
           | and perhaps seriously punished. If people at NIST had faced
           | real public scrutiny and sanction for their last round of
           | sabotage, perhaps we wouldn't see delay and dismissal by
           | NIST.
           | 
            | Delay in responding to these requests is yet another kind of
           | sabotage of the public NIST standardization processes. Delay
           | in standardization is delay in deployment. Delay means mass
           | surveillance adversaries have more ciphertext that they can
           | attack with a quantum computer. This isn't a coincidence,
           | though I am sure the coincidence theorists will come out in
           | full force.
           | 
           | NIST should be responsive in a timely manner and they should
           | be trustworthy, we rely on their standards for all kinds of
           | mandatory data processing. It's pathetic that Americans don't
           | have _several IG investigations in parallel_ covering NIST
           | and NSA behavior. Rather we have to rely on a professor to
           | file lawsuits for the public (and cryptographers involved in
           | the standardization process) to have even a glimpse of what
           | is happening. Unbelievable but good that _someone_ is doing
           | it. He deserves our support.
        
             | PaulDavisThe1st wrote:
             | Even though I broadly agree with what you've written here
             | ... the situation in question isn't really about NIST/NSA
             | response to FOIA requests at all.
             | 
             | It's about whether the US government has deliberately acted
             | to foist weak encryption on the public (US and otherwise),
             | presumably out of desire/belief that it has the right/need
             | to always decrypt.
             | 
             | Whether and how those agencies respond to FOIA requests is
             | a bit of a side-show, or maybe we could call it a prequel.
        
               | throwaway654329 wrote:
               | We are probably pretty much in agreement. It looks like
               | they've got something to hide and they're hiding it with
               | delay tactics, among others.
               | 
               | They aren't alone in failing to uphold FOIA laws, but
               | they're important in a key way: once the standard is
               | forged, hardware will be built, certified, deployed, and
               | _required_ for certain activities. Delay is an attack
               | that is especially pernicious in this exact FOIA case
               | given the NIST standardization process timeline.
               | 
               | As a side note, the NIST FOIA people seem incompetent for
               | reasons other than delay.
        
               | denton-scratch wrote:
               | > the situation in question isn't really about NIST/NSA
               | response to FOIA requests at all.
               | 
               | I disagree. To my mind, the issue is that a national
               | standards agency with form for certifying standards they
               | knew were broken, still isn't being transparent about
                | their processes. NIST's reputation has been mud since
                | the Dual_EC_DRBG debacle.
               | 
               | People are _not_ at liberty to ignore NIST
               | recommendations, and use schemes that are attested by the
               | likes of DJB, because NIST recommendations get built into
               | operating systems and hardware. It damages everyone
               | (including the part of NSA that is concerned with
               | national security) that (a) NIST has a reputation for
                | untrustworthiness, and (b) they aren't showing the
               | commitment to transparency that would be needed to make
               | them trustworthy again.
        
             | yieldcrv wrote:
             | > This is part of a class division where we cannot
             | practically exercise our rights which are clearly
             | enumerated in public law. Only people with money or
             | connections can even attempt to get many kinds of records.
             | 
             | As someone with those resources, I'm still kind of annoyed
             | because I think this state agency is playing chess
             | accurately too. My request was anonymous through my lawyer
             | and nobody would know that I have these documents, while if
             | I went through the court - even if it was anonymous with
             | the ACLU being the filer - there would still be a public
             | record in the court system that someone was looking for
             | those specific documents, so that's annoying
        
               | throwaway654329 wrote:
               | That's a thoughtful and hard won insight, thank you.
        
         | gruturo wrote:
         | Yeah, terrible idea, except this is Daniel Bernstein, who
         | already had an equally terrible idea years ago, and won. That
         | victory was hugely important, it pretty much enabled much of
         | what we use today (to be developed, exported, used without
         | restrictions, etc etc etc)
        
         | zitterbewegung wrote:
         | He won a case against the government representing himself so I
         | think he would be on good footing. He is a professor where I
         | graduated and even the faculty told me he was interesting to
         | deal with. Post QC is his main focus right now and also he
         | published curve25519.
        
           | matthewdgreen wrote:
           | He was represented by the EFF during the first, successful
           | case. They declined to represent him in the second case,
           | which ended in a stalemate.
        
             | throwaway654329 wrote:
             | The full story is interesting and well documented:
             | https://cr.yp.to/export.html
             | 
             | Personally my favorite part of the history is on the
             | "Dishonest behavior by government lawyers" page:
             | https://cr.yp.to/export/dishonesty.html - the disclaimer at
             | the top is hilarious: "This is, sad to say, not a complete
             | list." Indeed!
             | 
             | Are you implying that he didn't contribute to the first win
             | before or during EFF involvement?
             | 
             | Are you further implying that a stalemate against the U.S.
             | government is somehow bad for self representation after the
             | EFF wasn't involved?
             | 
              | In my view it's a little disingenuous to call it a
              | stalemate, implying everything was equal save EFF
              | involvement, when _the government changed the rules_.
             | 
             | He challenged the new rules alone because the EFF
             | apparently decided one win was enough.
             | 
              | When the judge dismissed the case, the judge said that
             | he should come back when the government had made a
             | "concrete threat" - his self representation wasn't the
             | issue. Do you have reason to believe otherwise?
             | 
             | To quote his press release at the time: ``If and when there
             | is a concrete threat of enforcement against Bernstein for a
             | specific activity, Bernstein may return for judicial
             | resolution of that dispute,'' Patel wrote, after citing
             | Coppolino's ``repeated assurances that Bernstein is not
             | prohibited from engaging in his activities.'' -
             | https://cr.yp.to/export/2003/10.15-bernstein.txt
        
               | matthewdgreen wrote:
               | I'm saying that the EFF are skilled lawyers who won a
               | major case, and they should not be deprived of credit for
               | that accomplishment.
        
               | throwaway654329 wrote:
               | Sure, EFF played a major role in that case as did
               | Bernstein. It made several lawyers into superstars in
               | legal circles and they all clearly acknowledge his
               | contributions to the case.
               | 
               | Still you imply that he shouldn't have credit for that
               | first win and that somehow he failed in the second case.
               | 
               | EFF shouldn't have stopped fighting for the users when
               | the government changed the rules to something that was
               | also unacceptable.
        
               | matthewdgreen wrote:
               | The original poster said "he won a case against the
               | government representing himself" and I felt that
               | statement was incomplete, if not inaccurate and wanted to
               | correct the record. I'm pretty sure Dan, if he was here,
               | would do the same.
        
               | zitterbewegung wrote:
                | Sorry, I didn't know that part. I have only seen
                | Professor Bernstein once (he had a post-QC t-shirt on,
                | so that's the only way I knew who he was). I have never
                | really interacted with him. He is also the only faculty
                | member allowed to have a non-UIC domain. Thank you for
                | correcting me.
        
               | throwaway654329 wrote:
               | You appear to be throwing shade on his contributions. Do
               | I misunderstand you?
               | 
               | A stalemate, if you already want to diminish his efforts,
               | isn't a loss by definition - the classic example is in
               | chess. He brought the government to heel even after EFF
               | bailed. You're also minimizing his contributions to the
               | first case.
               | 
               | His web page clearly credits the right people at the EFF,
               | and he holds back on criticism for their lack of
               | continuing on the case.
               | 
               | I won't presume to speak for Dan.
        
       | josh2600 wrote:
       | I just want to say, the problem here is worldwide standards
       | bodies for encryption need to be trustworthy. It is incredibly
       | hard to know what encryption is actually real without a deep
       | mathematics background and even then, a choir of peers must be
       | able to present algorithms, and audits of those algorithms with a
       | straight face.
       | 
       | Presenting broken-by-design encryption undermines public
       | confidence in what should be one of our most sacrosanct
       | institutions: the National Institute of Standards and Technology
       | (NIST). Many enterprises do not possess the capability to audit
       | these standards and will simply use whatever NIST recommends. The
       | danger is that we could be engineering embedded systems which
       | will be in use for decades which are not only viewable by the NSA
       | (which you might be ok with depending on your political
       | allegiance) but also likely viewable by any capable organization
       | on earth (which you are probably not ok with irrespective of your
       | political allegiance).
       | 
       | In short, we must have trustworthy cryptography standards. If we
       | do not, bedlam will follow.
       | 
       | Please recall, the last lawsuit that DJB filed was the one that
       | resulted in essentially "Code is speech" in our world
       | (https://en.wikipedia.org/wiki/Bernstein_v._United_States).
        
         | bananapub wrote:
         | how could NIST possibly be "one of our most sacrosanct
         | institutions" after the NSA already fucked them with
         | Dual_EC_DRBG?
         | 
         | whoever wants to recommend standards at any point since 2015
         | needs to be someone else
         | 
          | https://en.wikipedia.org/wiki/NIST_SP_800-90A for those who
          | have forgotten.
        
           | josh2600 wrote:
           | Look, my point is that there are lots of companies around the
           | world who can't afford highly skilled mathematicians and
           | cryptographers on staff. These institutions rely on NIST to
           | help them determine what encryption systems may make sense.
           | If NIST is truly adversarial, the public has a right to know
           | and determine how to engage going forward.
        
             | tptacek wrote:
             | They don't have to (and shouldn't) retain highly skilled
             | mathematicians. Nobody is suggesting that everyone design
             | their own ciphers, authenticated key exchanges, signature
             | schemes, and secure transports. Peer review is good; vital;
             | an absolute requirement. Committee-based selection
             | processes are what's problematic.
        
               | josh2600 wrote:
               | I'm just saying, you're speaking as an expert in the
                | field. Let's say you don't want to design any of that
               | stuff but you need some parts of those systems for the
               | thing you're building. How do you decide what you can or
               | can't trust without having deep knowledge of the subject
               | matter?
               | 
               | Maybe that's it, maybe you can't?
        
               | tptacek wrote:
               | How do you know that Noise is a good design and that a
               | cipher cascade isn't? Whatever (correctly) told you that,
               | apply it to other cryptographic problems.
        
               | josh2600 wrote:
               | I see. So maybe what you're really saying is "why are you
               | writing a system that has cryptographic primitives if
               | you're not a cryptographer/mathematician?"
        
               | tptacek wrote:
               | No, that is not at all what I am saying.
        
               | josh2600 wrote:
               | Let me ask this another way. I know how we determined
                | Noise was a good standard, and that was by talking to a lot
               | of people who had built sophisticated crypto systems and
               | then doing the research ourselves, but that's only
               | because we had the people on staff who had the capacity
               | to evaluate such systems.
               | 
               | If we didn't have those people, how would you suggest
               | figuring out which system to implement?
        
               | tptacek wrote:
               | Peer review is a good start. Noise, and systems derived
               | from it like WireGuard, are peer reviewed (check
               | scholar.google.com for starters), and NIST had nothing at
               | all to do with it.
        
               | nequo wrote:
               | It is incredibly hard to get a good grasp of the
               | consensus in a literature as a non-expert just by
               | searching Google Scholar. People spend years in graduate
               | school to learn to do that.
               | 
               | Are there reputable journals or conference proceedings
               | that you specifically recommend reading for high-quality
               | literature reviews?
        
               | nequo wrote:
               | Where does the non-cryptographer public find out about
               | the current consensus of the literature? Genuine
               | question.
        
               | TaylorAlexander wrote:
               | I guess if I saw what FAANG companies were using to
               | secure their own data that could be an indicator. Though
               | they could be compromised.
        
         | tptacek wrote:
         | There's an easier problem here, which is that our reliance on
         | formal standards bodies for the selection of cryptography
          | constructions is bad, and, hardly just at NIST, has been over
          | the last 20 years mostly a force for evil. One of the most
         | important "standards" in cryptography, the Noise Protocol
         | Framework, will probably never be a formal standard. But on the
         | flip side, no formal standards body is going to crud it up with
         | nonsense.
         | 
         | So, no, I'd say that bedlam will not follow from a lack of
         | trustworthy cryptography standards. We've trusted standards too
         | much as it is.
        
           | javajosh wrote:
           | Believing both "Don't roll your own crypto" and "Don't trust
           | the standards" would seem to leave the average developer in
            | something of a quandary, no?
        
             | tptacek wrote:
             | No. I don't think we should rely on formal standards, like
             | FIPS, NIST, and the IETF. Like Bernstein himself, I do
             | think we should rely on peer-reviewed expert cryptography.
             | I use Chapoly, not a stream cipher I concocted myself, or
             | some bizarro cipher cascade posted to HN. This is what I'm
             | talking about when I mentioned the Noise Protocol
             | Framework.
             | 
             | If IETF standards happen to end up with good cryptography
             | because they too adopt things like Noise or Ed25519, that's
             | great. I don't distrust the IETF's ability to standardize
             | something like HTTP/3. I do deeply distrust the process
             | they use to arrive at cryptographic architectures. It's
             | gotten markedly better, but there's every reason to believe
             | it'll backslide a generation from now.
             | 
             | (There are very excellent people who contribute to things
             | like CFRG and I wouldn't want to be read as disparaging any
             | of them. It's the process I have an issue with, not
             | anything happening there currently.)
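              | 
              | (For what it's worth, using something like Ed25519
              | directly is nearly a one-liner these days -- a minimal
              | sketch with Python's third-party `cryptography` package;
              | the message is made up:)
              | 
              |     from cryptography.hazmat.primitives.asymmetric import (
              |         ed25519)
              | 
              |     signing_key = ed25519.Ed25519PrivateKey.generate()
              |     verify_key = signing_key.public_key()
              | 
              |     message = b"peer-reviewed, not committee-designed"
              |     signature = signing_key.sign(message)
              | 
              |     # verify() raises InvalidSignature if the check fails
              |     verify_key.verify(signature, message)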
        
               | josh2600 wrote:
               | I guess this is my point: If you have strong
               | mathematicians and cryptographers, you don't end up using
               | NIST.
               | 
               | There are lots of companies who have need for
               | cryptography who don't know who to trust. What should
               | they do in a world where the standards bodies are
               | adversarial?
               | 
               | Maybe this is just the future, if you don't know crypto
               | you're doomed to either do the research or accept that
               | you're probably backdoored? Seems like a rough place to
               | be...
        
               | tptacek wrote:
               | So use whatever crypto Signal uses, or that WireGuard
               | uses. You're not working in a vacuum. You don't even
               | trust NIST to begin with, and yet we still encrypt
               | things, so I'm a little confuddled by the argument that
               | NIST's role as a trusted arbiter of cryptography is vital
               | to our industry. NIST is mostly a force for evil!
        
               | josh2600 wrote:
               | Signal's crypto doesn't solve all problems (neither does
               | wireguard).
               | 
               | For example, we built private information recovery using
               | the first production grade open source implementation of
               | oblivious RAM (https://mobilecoin.com/overview/explain-
               | like-i'm-five/fog you'll want to skip to the software
               | engineer section) so that organizations could obliviously
               | store and recover customer transactions without being
               | able to observe them. The signal protocol's techniques
               | might be part of a cryptographic solution but it is not a
               | silver-bullet.
               | 
               | I guess, notably, we never looked at NIST when designing
               | it so maybe that's the end of the discussion there.
        
               | tptacek wrote:
               | I didn't say Signal and WireGuard "solved all problems",
               | and neither does any given NIST standard! The track
               | record of cryptosystems built to, say, FIPS standards is
               | _extremely bad_.
        
               | dwaite wrote:
               | > No. I don't think we should rely on formal standards,
               | like FIPS, NIST, and the IETF.
               | 
               | I assume your concerns are with the process of
               | standardization, and not the idea of standards
               | themselves. After all, there are plenty of expert peer-
               | reviews going on in NIST and in the IRTF.
               | 
               | Noise is useful for building your own bespoke kit, but
               | there does need to be an agreement to use it in the same
               | manner if you hope for interoperability. Things like
               | public key crypto are precisely useful because the other
               | side can read the information back out at the end of the
               | process, even if they aren't running e.g. the same email
               | client version.
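                | 
                | To make that point concrete, a minimal sketch of a
                | public-key agreement (Python, assuming the third-party
                | `cryptography` package; names are illustrative):
                | 
                |     from cryptography.hazmat.primitives.asymmetric import (
                |         x25519)
                | 
                |     # Each side generates its own key pair and exchanges
                |     # only the public halves.
                |     alice = x25519.X25519PrivateKey.generate()
                |     bob = x25519.X25519PrivateKey.generate()
                | 
                |     # Both ends derive the same shared secret, which is
                |     # what an agreed-upon construction buys you.
                |     shared_a = alice.exchange(bob.public_key())
                |     shared_b = bob.exchange(alice.public_key())
                |     assert shared_a == shared_b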
        
               | tptacek wrote:
               | NIST is procedurally the least objectionable of all of
               | these standards bodies. Contests are better than
               | collaborations. But NIST itself is a force for evil, not
               | for the lurid message board reason of a shadowy cabal of
               | lizard people trying to weaken PQC, but because "NIST
               | standardization" keeps a lot of 1990s-era crypto in use
               | and prevents a lot of modern crypto from being deployed
               | in the industry.
        
               | InitialBP wrote:
               | Standards are for people who are not experts in the field
               | or don't have the time and energy to research the
               | existing crypto and actually sift through them to try and
               | decide what to trust and what not to trust.
               | 
               | Lack of standardization might just make it harder for Joe
               | to filter through the google searches and figure out what
               | algorithm to use. He may just pick the first result on
               | Google, which is an ad for the highest bidder on some
               | keywords which may or may not be good.
        
               | zaik wrote:
               | Standards are also essential for the interoperability of
               | systems.
        
               | tptacek wrote:
               | Formal standards aren't essential for interoperability.
        
         | [deleted]
        
       | mort96 wrote:
       | Weirdly, any time I've suggested that maaaybe being too trusting
       | of a known bad actor which has repeatedly published intentionally
       | weak cryptography is a bad idea, I've received a whole lot of
       | push-back and downvotes here on this site.
        
         | throwaway654329 wrote:
         | Indeed. Have my upvote stranger.
         | 
         | The related "just ignore NIST" crowd is intentionally or
         | unintentionally dismissing serious issues of governance. Anyone
          | who deploys this argument is questionable in my mind,
          | essentially a bad-faith actor, especially when the topic is
          | the problems brought to the table by NIST and NSA.
         | 
          | It is a telling sign that those people actively ignore the
          | areas where you have no choice: you _must_ have your data
          | processed by a party required to deploy FIPS-certified
          | software or hardware.
        
           | [deleted]
        
         | [deleted]
        
         | morpheuskafka wrote:
         | I'm working on a project that involves a customized version of
         | some unclassified, non-intelligence software for a defense
         | customer at my job (not my ideal choice of market, but it
          | wasn't weapons, so I'm okay with it). Some of the people on the
         | project come from the deeper end of that industry, with several
         | TS/SCI contract and IC jobs on their resumes.
         | 
         | We were looking over some errors on the sshd log and it was
         | saying it couldn't find the id_ed25519 server cert. I remarked
         | that that line must have stayed even though the system was put
          | in FIPS mode, which probably only allowed the NIST-approved
          | ECC curve, and related this story of how everyone else has
          | moved over
         | to ed25519 and the government is the only one left using their
         | broken algorithm.
         | 
         | One of the IC background guys (who is a very nice person,
         | nothing against them) basically said, yeah the NSA used to do
         | all sorts of stuff that was a bad idea, mentioning the Clipper
         | chip, etc. What blew my mind is that they seemed to totally
         | have reasonable beliefs about government surveillance and
         | powers, but then when it comes to someone like Snowden, thinks
          | they are a traitor and should have used the internal channels
         | instead of leaking. I just don't understand how they think
         | those same people who run NSA would have cared one bit, or
         | didn't know about it already. I always assumed the people that
         | worked in the IC would just think all this stuff was OK to
         | begin with I guess.
         | 
          | I don't know what the takeaway is from that; it just seems
          | like a huge case of cognitive dissonance.
        
           | [deleted]
        
           | 2OEH8eoCRo0 wrote:
           | While I am skeptical of US domestic surveillance, Snowden
           | leaked this information in the worst possible way.
           | 
           | Try internal whistleblower channels first. Not being heard?
            | Mail members of Congress? Contact Congress? Contact the
           | media?
           | 
           | Instead he fled to an adversary with classified material.
           | That's not good faith behavior imo. Traitor
        
             | zingplex wrote:
             | Regarding trying internal channels, Snowden says he tried
             | this
             | 
             | > despite the fact that I could not legally go to the
             | official channels that direct NSA employees have available
             | to them, I still made tremendous efforts to report these
             | programs to co-workers, supervisors, and anyone with the
             | proper clearance who would listen. The reactions of those I
             | told about the scale of the constitutional violations
             | ranged from deeply concerned to appalled, but no one was
             | willing to risk their jobs, families, and possibly even
             | freedom
             | 
             | The fleeing to a foreign adversary part would have been
             | completely avoidable if the US had stronger whistleblower
             | protections. It's perfectly reasonable to see what happened
              | to Chelsea Manning and Julian Assange and not want to
             | suffer a similar fate.
        
               | 2OEH8eoCRo0 wrote:
               | https://www.congress.gov/congressional-report/114th-
               | congress...
               | 
               | There is no record that he attempted to use internal
               | channels. He would have been afforded whistleblower
                | protection had he gone to Congress with his findings.
        
               | zingplex wrote:
               | > There is no record that he attempted to use internal
               | channels
               | 
               | From the beginning of the Snowden quote:
               | 
               | > I could not legally go to the official channels that
               | direct NSA employees have available to them
               | 
               | In addition, I find it difficult to take any
               | congressional report on this matter, including the one
               | you cited, seriously given that their primary source is a
               | group of people who have repeatedly lied to Congress
               | without consequence.
        
               | 2OEH8eoCRo0 wrote:
               | Why do you take Snowden's word as gospel but dismiss a
               | bipartisan Congressional Committee's findings? I think
               | that you are biased and nothing will change your mind.
               | Let's agree to disagree.
        
               | throwaway654329 wrote:
               | You could also take the word of a person from deep inside
               | the Obama administration:
               | https://greenwald.substack.com/p/ben-rhodes-book-proves-
               | obam...
        
               | [deleted]
        
           | sneak wrote:
           | I think the term "doublethink" was invented specifically for
           | government functionaries like the IC guy you describe.
           | 
           | Being consistently and perfectly dogmatic requires holding
           | two contradictory beliefs in your head at once. It's a skill.
        
             | l33t2328 wrote:
             | It's not doublethink to say the programs should have been
             | exposed and that Snowden was a traitor for exposing them in
             | a manner that otherwise hurt our country.
             | 
             | He could have done things properly, instead he dumped
             | thousands of files unrelated to illegal surveillance to the
             | media.
        
         | 616c wrote:
         | Another upvote from someone with many friends and colleagues in
         | NIST. I hope transparency prevails and NISTers side with that
         | urge as well (I suspect many do).
        
           | throwaway654329 wrote:
           | They could and should leak more documents if they have
           | evidence of malfeasance.
           | 
            | There are both legally safe avenues via the IG process
            | and legally risky ones: many journalists are willing to
            | work toward major change. Sadly, legal doesn't mean safe
            | in modern America, and some whistleblowers have suffered
            | massive retribution even when they played by "the rules"
            | laid out in public law.
           | 
           | As Ellsberg said: Courage is contagious!
        
         | [deleted]
        
         | glitchc wrote:
         | Many government or government affiliated organizations are
         | required to comply with NIST approved algorithms by regulation
         | or for interoperability. If NIST cannot be trusted as a
         | reputable source it leaves those organizations in limbo. They
         | are not equipped to roll their own crypto and even if they did,
         | it would be a disaster.
        
           | icodestuff wrote:
           | "Other people have no choice but to trust NIST" is not a good
           | argument for trusting NIST. Somehow I don't imagine the NSA
           | is concerned about -- and is probably actively in favor of --
           | those organizations having backdoors.
        
             | wmf wrote:
             | It's an argument for fixing NIST so that it is trustworthy
             | again.
        
               | throwaway654329 wrote:
               | This.
               | 
               | One wonders if NIST can be fixed or if it should simply
               | be abolished with all archives opened in the interest of
               | restoring faith in the _government_. The damage done by
               | NSA and NIST is much larger than either of those
               | organizations.
        
               | [deleted]
        
           | zamadatix wrote:
           | "Roll your own crypto" typically refers to making your own
            | algorithm or implementation of an algorithm, not choosing
            | the algorithm.
        
             | lazide wrote:
             | Would you really want every random corporation having some
             | random person pick from the list of open source cipher
              | packages? Which, last I checked, still included things like
             | 3DES, MD5, etc.
             | 
              | You might as well hand a drunk monkey a loaded
              | submachine gun.
        
               | CodeSgt wrote:
               | Surely I'm misunderstanding, are you really advocating
               | that people should roll their own encryption algorithms
               | from scratch? As in, they should invent novel and secure
               | algorithms in isolation? And this should happen.... at
               | every major enterprise or software company in the world?
        
               | lazide wrote:
               | You are completely misunderstanding yes.
               | 
               | I'm saying some standards body is appropriate for
               | validating/vetting algorithms, and having a standards
               | body advocate for known reasonable ones is... reasonable
               | and desirable.
               | 
               | That NIST has a history of being compromised by the NSA
               | (and other standards bodies would likely similarly be a
                | target) is a problem. But having everyone 'figure it
               | out' on their own is even worse. 'hand a drunk monkey a
               | loaded submachine gun' worse.
        
               | CodeSgt wrote:
               | That makes much more sense. Thank you for the
               | clarification.
        
               | pessimizer wrote:
               | > That NIST has a history of being compromised by the NSA
               | is a problem.
               | 
               | It's a disqualifying problem. If you go to a standards
               | body to prevent yourself from making unintentional
               | mistakes, and they have introduced _intentional_
               | mistakes, any other reasonable option is better.
        
               | lazide wrote:
                | Personally I'm of the opinion that everyone is
                | expecting the NSA to try now, so the odds of them
                | pulling it off at NIST are essentially zero (same with
                | other actors).
               | 
               | If you specialize as a cat burglar after all, hitting the
               | ONE PLACE everyone expects you to hit while they're
               | watching goes against the grain.
               | 
               | More likely they're suborning us somewhere else. But hard
               | to say for sure.
        
               | zamadatix wrote:
               | Every random corporation having some random person
               | picking from a list of open source cipher packages isn't
               | the only alternative to strictly requiring the algorithm
               | be NIST approved. It may be the worst possible
               | alternative one could conceive though, and one that would
               | probably take more work to do than something more
               | reasonable anyways.
        
               | lupire wrote:
        
               | l33t2328 wrote:
               | What's wrong with 3DES?
        
       | dataflow wrote:
       | Tangential question: while some FOIA requests do get stonewalled,
       | I continue to be fascinated that they're honored in other cases.
       | What exactly prevents the government from stonewalling
        | practically _every_ request that it doesn't like, until and
       | unless it's ordered by a court to comply? Is there any sort of
       | penalty for their noncompliance?
       | 
       | Tangential to the tangent: is there any reason to believe FOIA
       | won't be on the chopping block in a future Congress? Do the
       | majority of voters even know (let alone care enough) about it to
       | hold their representatives accountable if they try to repeal it?
        
         | linuxandrew wrote:
         | I know someone who works in gov (Australia, not US) who told me
         | all about a FOI request that he was stonewalling. From memory,
          | the request was open-ended and would have revealed more
          | than was probably intended, including some proprietary
          | trade secrets from a third-party contractor. That
         | said, it was probably a case that would attract some public
         | interest.
         | 
         | The biggest factors preventing governments from stonewalling
         | every FOI case are generally time and money. Fighting FOI cases
         | is time consuming and expensive and it's simply easier to hand
         | over the information.
        
           | dhx wrote:
            | At least in Australia, I gather it is somewhat common for
            | FOI offices to work with an FOI applicant to narrow the
            | request if it is so broad as to cost too much or take too
            | long to process, or is likely just to be returned as
            | hundreds of blacked-out pages.
           | 
            | Previous FOI responses show that savvier FOI applicants,
            | when they didn't get the outcome they desired, have also:
           | 
           | 1. Formally requested review of decisions to withhold
            | information from release. This almost always led to more
           | information being released.
           | 
           | 2. Waited and tried requesting the same or similar
           | information again in a later year when different people are
           | involved.
           | 
           | 3. Sent a follow up FOIA request for correspondence relating
           | to how a previous (or unanswered) request was or is being
           | processed by the FOI office and other parties responding to
           | the request. This has previously shown somewhat humorous
           | interactions with FOI offices such as "We're not going to
           | provide that information because {lame excuse}" vs FOI office
           | "You have to. CC:Executives" vs "No" vs Executives "It's not
           | your information" etc etc.
           | 
           | 4. Sent a follow up FOIA request for documentation, policies,
           | training material and the likes for how FOI requests are
           | assessed as well as how and by whom decisions are made to
           | release or withhold information.
           | 
           | 5. Sent a follow up FOIA request for documentation, policies,
           | staffing levels, budgets, training material and the likes for
           | how a typical event that the original FOIA request referred
           | to would be handled (if details of a specific event are not
           | being provided).
           | 
           | Responses to (2), (3) and (4) are probably more interesting
           | to applicants than responses to (1), (2) and original
           | requests, particularly when it is clear the applicant
           | currently or previously has knowledge of what they're
           | requesting.
        
           | dataflow wrote:
           | Interesting, thanks for the anecdote!
           | 
           | > The biggest factors preventing governments from
           | stonewalling every FOI case are generally time and money.
           | 
           | Is there any backpressure in the system to make the
           | employee(s) responsible for responding/signing off on the
           | disclosure actually care about how expensive it is to fight a
           | case? I would've thought they would think, "Well, the
           | litigation cost doesn't affect me, I just approve/deny
           | requests based on their merits."
        
         | Panzer04 wrote:
         | Presumably most government employees are acting in good faith -
         | why wouldn't they fulfil a reasonable FOIA request?
         | 
         | This is likely the result of some actors not acting in good
          | faith, who then have no choice but to stonewall lest their
         | intransigence be revealed.
        
           | lupire wrote:
           | All execs have to do is not staff the FOIA department, and
           | requests get ignored. People generally prefer free time to
           | doing paperwork, if boss allows.
        
       | graderjs wrote:
        | So the TLDR is... you do roll your own crypto? I mean, you
        | probably need to know how to create an RNG that passes
        | PractRand and SMHasher first, and also a hash function that
        | does the same, but cool.
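        | 
        | For illustration only, a toy sketch of the testing workflow
        | hinted at above: a simple xorshift64* generator streaming raw
        | bytes so a statistical suite can consume them. The
        | `RNG_test stdin64` invocation for PractRand is an assumption
        | (check its docs), the file name gen.py is made up, and this
        | toy generator is not claimed to pass anything; the point is
        | just the plumbing.
        | 
        |   # Toy 64-bit xorshift* generator streaming raw bytes to
        |   # stdout so a statistical test suite can read them, e.g.
        |   # (assumed invocation) `python3 gen.py | RNG_test stdin64`.
        |   import sys
        | 
        |   MASK = 0xFFFFFFFFFFFFFFFF
        | 
        |   def xorshift64star(seed=0x9E3779B97F4A7C15):
        |       # Vigna's xorshift64* recurrence; yields 64-bit words.
        |       x = seed
        |       while True:
        |           x ^= x >> 12
        |           x = (x ^ (x << 25)) & MASK
        |           x ^= x >> 27
        |           yield (x * 0x2545F4914F6CDD1D) & MASK
        | 
        |   out = sys.stdout.buffer
        |   gen = xorshift64star()
        |   # Emit 256 MiB of output, enough for a short test run.
        |   for _ in range(32 * 1024 * 1024):
        |       out.write(next(gen).to_bytes(8, 'little'))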
        
       | bsaul wrote:
       | holy crap, i wondered why the post didn't mention work by dj
       | bernstein outing flaws in curves submitted by nsa...
       | 
       | Well, didn't expect the post to actually be written by him.
        
       | xiphias2 wrote:
        | An interesting thing happening on the Bitcoin mailing list:
        | although it would be quite easy to add Lamport signatures as
        | an extra safety feature for high-value transactions, they
        | would be quite expensive and easy to misuse (they can be used
        | only once, which is a problem if money is sent to the same
        | address twice), so the current consensus among developers is
        | to "just wait for NSA/NIST to be ready with the algorithm". I
        | haven't seen any discussion of the possibility that they will
        | never be ready, on purpose, because of sabotage.
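        | 
        | For illustration, here is a minimal Python sketch of a
        | Lamport one-time signature, assuming SHA-256; the helper
        | names are made up and this is not any specific Bitcoin
        | proposal. It shows why a key must never sign twice: every
        | signature reveals half of the secret preimages for that key.
        | 
        |   # Minimal Lamport one-time signature sketch (illustrative).
        |   import hashlib, secrets
        | 
        |   def H(b):
        |       return hashlib.sha256(b).digest()
        | 
        |   def keygen():
        |       # 256 pairs of random 32-byte secrets; the public key
        |       # is the pairwise hashes of those secrets.
        |       sk = [(secrets.token_bytes(32), secrets.token_bytes(32))
        |             for _ in range(256)]
        |       pk = [(H(a), H(b)) for a, b in sk]
        |       return sk, pk
        | 
        |   def sign(sk, msg):
        |       # Reveal one preimage per bit of the message hash.
        |       d = int.from_bytes(H(msg), 'big')
        |       return [sk[i][(d >> (255 - i)) & 1] for i in range(256)]
        | 
        |   def verify(pk, msg, sig):
        |       d = int.from_bytes(H(msg), 'big')
        |       return all(H(sig[i]) == pk[i][(d >> (255 - i)) & 1]
        |                  for i in range(256))
        | 
        |   sk, pk = keygen()
        |   sig = sign(sk, b"tx 1")
        |   assert verify(pk, b"tx 1", sig)
        |   # Signing a second, different message reveals additional
        |   # preimages, which an observer can mix and match to forge
        |   # signatures on other messages; hence strictly one
        |   # signature per key (and per address, if the address
        |   # commits to that key).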
        
         | potatototoo99 wrote:
         | Why not start that discussion yourself?
        
         | jack_pp wrote:
          | Indeed, as potato said, link this article in the ML for them
          | to see that NIST cannot be fully trusted.
        
       ___________________________________________________________________
       (page generated 2022-08-06 23:01 UTC)