[HN Gopher] Quantum is unimportant to post-quantum
___________________________________________________________________
Quantum is unimportant to post-quantum
Author : woodruffw
Score : 81 points
Date : 2024-07-01 14:04 UTC (8 hours ago)
(HTM) web link (blog.trailofbits.com)
(TXT) w3m dump (blog.trailofbits.com)
| RcouF1uZ4gsC wrote:
| > But even if a quantum computer is never built, new PQ standards
| are safer, more resilient, and more flexible than their classical
| counterparts.
|
| I disagree.
|
| First of all, they are far more inconvenient. The key and
| signature sizes are bigger. With Curve25519 ECC we can have
| 256-bit keys.
|
| Secondly, flexibility is mentioned, but I think the last several
| years have shown that flexibility is a huge vulnerability in
| crypto systems.
|
| Finally, I feel pretty confident that the NSA doesn't know too
| much more than we do about RSA or ECC which have been studied,
| studied, implemented, and analyzed for decades in the open. I
| worry with Post-Quantum algorithms, that there is a good chance
| that the NSA knows far more about them than does the public
| cryptography community.
| setopt wrote:
| I guess we then have to bet... Is it more likely that NSA has a
| secret large-scale quantum computer, or that they have a secret
| vulnerability in post-quantum algorithms? And does it matter
| anyway if they have a secret backdoor in your OS :)
| vlovich123 wrote:
| Yeah, I feel like the article glosses over the fact that PQ is
| not about crypto flexibility in general but a specific kind of
| flexibility, immunity to a hypothetical QC, and a lot is
| sacrificed to achieve that goal (namely signature sizes and/or
| in some cases performance).
|
| And no, you shouldn't be using RSA but they completely gloss
| over that we have very very good ECC standards that in no way
| seem to be suffering the same problem as RSA (i.e., needing to
| regularly increase the key size) which is why afaict everyone
| has pretty much switched to ECC signatures unless they have
| some legacy reason.
| kibwen wrote:
| _> Secondly, flexibility is mentioned, but I think the last
| several years have shown that flexibility is a huge vulnerability
| in crypto systems._
|
| All the criticism of cryptographic agility that I have seen has
| involved an attacker negotiating a downgrade to a broken
| protocol. But if the protocol is not yet broken, then being
| agile isn't a concern, and if/when the protocol does become
| broken, then you can remove support for the broken protocol,
| which is what you'd be forced to do anyway, so a flexible
| approach just seems like a more gradual way to achieve that
| future transition.
|
| Being distrustful of untested post-quantum algorithms is fair.
| But you can always double-encrypt a message with both classical
| and post-quantum algorithms, which requires an attacker to
| break both of them in order to decipher it. This guards against
| both short-term algorithmic weaknesses in our nascent post-
| quantum algorithms, as well as long-term threats to classical
| algorithms from hypothetical quantum computers.
|
| Key sizes are larger, yes, but not show-stoppingly large; on
| par with secure RSA key sizes, which is tolerable. Kilobytes,
| not megabytes.
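The double-encryption idea is how hybrid deployments work in practice: establish one shared secret classically and one with a PQ KEM, then derive the session key from both, so an attacker must recover both inputs. A minimal Python sketch of just the combining step (the two input secrets are stand-ins, and `hybrid_kdf` is a hand-rolled HKDF-style function for illustration, not a library API):

```python
import hashlib
import hmac
import secrets

def hybrid_kdf(classical_ss: bytes, pq_ss: bytes, info: bytes, length: int = 32) -> bytes:
    """Derive one session key from two independently established shared
    secrets (HKDF-style extract-then-expand over their concatenation).
    An attacker must recover BOTH inputs to learn the output."""
    prk = hmac.new(b"hybrid-kdf-salt", classical_ss + pq_ss, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Placeholders: in a real deployment these would come from, e.g., an
# ECDH exchange and a PQ KEM decapsulation respectively.
classical_ss = secrets.token_bytes(32)
pq_ss = secrets.token_bytes(32)
session_key = hybrid_kdf(classical_ss, pq_ss, b"example-session")
```

Because the two secrets are concatenated before the extract step, a break of either algorithm alone leaves the derived key unpredictable.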
| CiPHPerCoder wrote:
| > All the criticism of cryptographic agility that I have seen
| has involved an attacker negotiating a downgrade to a broken
| protocol.
|
| Consider this an additional data point, then:
| https://paragonie.com/blog/2019/10/against-agility-in-
| crypto...
|
| In the years since I wrote that, several people have pointed
| out that "versioned protocols" are just a _safe_ form of
| "crypto agility". However, when people say "crypto agility",
| they usually mean something like what JWT does.
|
| What JWT does is stupid, and has caused a lot of issues:
| https://www.howmanydayssinceajwtalgnonevuln.com/
|
| If you want to use JWT securely, you have to go out of your
| way to do so: https://scottarc.blog/2023/09/06/how-to-write-
| a-secure-jwt-l...
|
| > But if the protocol is not yet broken, then being agile
| isn't a concern, and if/when the protocol does become broken,
| then you can remove support for the broken protocol, which is
| what you'd be forced to do anyway, so a flexible approach
| just seems like a more gradual way to achieve that future
| transition.
|
| This makes sense in situations where you have versioned
| protocols :)
|
| This doesn't work if you're required to support RSA with
| PKCS1v1.5 padding until the heat death of the universe.
| thadt wrote:
| Hmmm, some recent protocols (thinking of MLS[1] here) have
| moved into a middle territory where they have a lot of
| options for piecing together a cryptographic suite, but
| then version that whole suite within the protocol. You can
| still change suites without changing the protocol, but it's
| not nearly as 'agile' as earlier suite and primitive
| negotiations.
|
| Maybe something more like "cryptographic mobility" instead
| of "agility"? You can carefully decamp and move from one
| suite (versioned protocol) to another without changing all
| your software, but you're not negotiating algorithms and
| semantics on the fly.
|
| [1] https://datatracker.ietf.org/doc/rfc9420
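The "versioned suite" idea can be sketched in a few lines: one version number pins an entire, fixed suite, and deprecation means deleting an entry rather than re-ordering a preference list. (Suite names below are illustrative, not the actual MLS registry.)

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Suite:
    kem: str
    sig: str
    aead: str
    hash_alg: str

# Append-only registry: the version number is the only thing negotiated,
# and it selects every primitive at once.
SUITES = {
    1: Suite("X25519", "Ed25519", "AES-128-GCM", "SHA-256"),
    2: Suite("X25519+ML-KEM-768", "Ed25519", "AES-256-GCM", "SHA-384"),
}

def parse_header(version: int) -> Suite:
    # Unknown or withdrawn versions fail closed; there is no fallback list.
    try:
        return SUITES[version]
    except KeyError:
        raise ValueError(f"unsupported protocol version {version}") from None
```

The contrast with per-primitive agility is that there is no combination of algorithms an attacker can steer you into that the protocol authors never analyzed together.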
| tedunangst wrote:
| The problem is that the broken protocol is frequently not
| removed. It is shoved to the end of the list, where it
| remains available, but that's "good enough" because surely it
| won't actually get used.
| kibwen wrote:
| Sure, but the same concern exists even for applications
| that have inflexible protocols. You either have someone
| willing and able to say that enough is enough and break
| backwards compatibility, or you don't.
| ukdghe wrote:
| Considering that the question of whether classical methods can
| break the current breed of secure algorithms is still open, I
| see PQC as a hedge against the possibility that P=NP.
| IIAOPSW wrote:
| I don't know why this was downvoted. Even if P!=NP there are
| still other assumptions of hardness baked into modern
| cryptography which might turn out false. The abelian hidden
| subgroup problem (of which both RSA and elliptic curves are
| instances) may turn out to have a classical solution.
| tyoma wrote:
| For a long time I wondered why there was such a big push for PQ
| even though there was no quantum computer and a reasonably
| working one was always 15 years in the future.
|
| ... or was there a quantum computer somewhere and it was just
| kept hush hush, hence the push for PQ?
|
| The answer turns out to be: it doesn't matter if there is a
| quantum computer! The set of PQ algorithms has many other
| beneficial properties besides quantum resistance.
| bdamm wrote:
| And many problems, namely, enormous keys and signatures that
| make PKI nigh impossible for the embedded/IoT space.
| tyoma wrote:
| According to the linked post there are PQ algorithms that
| will fit this niche:
|
| > This variety of different trade-offs gives developers a lot
| of flexibility. For an embedded device where speed and
| bandwidth are important but ROM space is cheap, McEliece
| might be a great option for key establishment. For server
| farms where processor time is cheap but saving a few bytes of
| network activity on each connection can add up to real
| savings, NTRUSign might be a good option for signatures. Some
| algorithms even provide multiple parameter sets to address
| different needs: SPHINCS+ includes parameter sets for "fast"
| signatures and "small" signatures at the same security level.
| vlovich123 wrote:
| Embedded/IoT is typically slow and small, which is not a
| space PQ fits into.
|
| I also think the article is overly optimistic: it claims that
| ECC is "hard" because of the need for careful curve
| selection (even though we have very good established
| curves), yet I find it hard to believe that PQ algorithms
| are immune to parameter selection problems and
| implementation challenges.
| refset wrote:
| There has been research on the intersection of IoT and PQ
| signatures specifically at least, e.g. see "Short hash-
| based signatures for wireless sensor networks" [0] [1].
| Unlike SPHINCS+ which is mentioned in the article, if
| you're happy to keep some state around to remember the
| last used signature (i.e. you're not concerned about
| accidental re-use) then the scheme can potentially be
| _much_ simpler.
|
| [0] https://web.archive.org/web/20110401080052/https://ww
| w.cdc.i...
|
| [1] https://news.ycombinator.com/item?id=33925383 I wrote
| about this "Dahmen-Krauss Hash-Chain Signature Scheme"
| (DKSS) algorithm previously in a comment a couple of
| years ago
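For a feel of how simple hash-based signatures can get, here is the classic Lamport one-time scheme in Python. This is not the DKSS construction from the links above, just the textbook building block such stateful schemes are built on, and each key pair must sign exactly once:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(bits: int = 256):
    # One random preimage pair per message-digest bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    pk = [(H(zero), H(one)) for zero, one in sk]
    return sk, pk

def msg_bits(msg: bytes, n: int):
    digest = hashlib.sha256(msg).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(n)]

def sign(sk, msg: bytes):
    # Reveal one preimage per bit -- which is why the key is one-time.
    return [pair[bit] for pair, bit in zip(sk, msg_bits(msg, len(sk)))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, msg_bits(msg, len(pk)))))
```

Security rests only on the hash function's preimage resistance, which is the appeal of the whole family; the cost is the large keys and the statefulness being debated here.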
| bdamm wrote:
| The state is enormous. Dedicating megabytes and megabytes
| to key state is painful. And so is tracking state across
| components and through distribution channels. If you're
| not afraid of that then just use symmetric crypto and be
| done with it.
| refset wrote:
| > use symmetric crypto
|
| To be clear my comment is specifically only relating to
| signature schemes, not encryption.
|
| > The state is enormous
|
| The scheme I linked to points towards efficient
| "pebbling" and "hash chain traversal" algorithms which
| minimize the local state required in quite a fascinating
| way (e.g. see https://www.win.tue.nl/~berry/pebbling/).
|
| > tracking state across components and through
| distribution channels
|
| Assuming you have reliable ordering in those channels I
| don't see how the stateful nature of such schemes makes
| it hugely more complex than the essential hard problem of
| key distribution.
| AnotherGoodName wrote:
| The point is that a lot of secrets need to remain secrets for
| many years. If some government found a way to break elliptic
| curve in the same way that the number field seive broke rsa
| (hence we now need 2048-bit keys rather than 256bit keys we
| were using in the 90s) we'd be fucked for many years to come as
| all secrets are leaked.
|
| So there may not be quantum computers now. But if there are
| going to be in 20 years, we need our crypto to be resilient now.
| thehumanmeat wrote:
| Because SNDL is a long-planned attack on secrets that have a
| long lifetime.
|
| https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later
| vikramkr wrote:
| Also, we are talking about mitigating a large tangible downside
| risk to a sudden breakthrough in the space - all the secrets
| stop being secret. "Reasonable" timeline estimates for how far
| away we are matter for thinks like if/how much we invest in the
| tech, but optimistic timelines become pessimistic when
| defending against downsides and we should be pessimistic when
| preparing regulations and mitigations
| kadoban wrote:
| > ... or was there a quantum computer somewhere and it was just
| kept hush hush, hence the push for PQ?
|
| If there were a quantum computer somewhere, or close to one, it
| would be reasonably likely for it to be secret.
|
| Look at the history of crypto in the mid to late 20th century,
| for example. Small groups in the Allies, the NSA, etc.
| certainly had more knowledge than was public, by a wide margin
| of years to decades.
| ziofill wrote:
| I'm a physicist working on QC. We actually don't know if
| a "secret" QC exists somewhere, but given that major
| theoretical and engineering breakthroughs are needed to build a
| fault-tolerant one (and all QC companies are facing this
| regardless of whether their qubits are optical, superconducting,
| trapped ions, etc.), I'd put that possibility at near zero.
| Consider also the talent and expertise that is needed for such
| an endeavour...
| adastra22 wrote:
| There are pathways to quantum computers that are
| intrinsically highly fault tolerant: https://sqc.com.au/news/
| bn-l wrote:
| 50 million in a series A for a cold-fusion-like technology
| in Australia?? What's going on here? Did they discover
| something groundbreaking?
| beloch wrote:
| "Of course, one big concern is that everybody is trying to
| standardize cryptosystems that are relatively young. What if the
| industry (or NIST) picks something that's not secure? What if
| they pick something that will break tomorrow?"
|
| If information remains interesting to an adversary long-term,
| they can always archive classically ciphered text and apply
| future hardware and algorithmic advances to cracking it.
|
| This is why "post-quantum" cryptography may well be "quantum
| cryptography". QC must be broken _at the time of transmission_
| for an adversary to obtain any information at all. If you're
| trying to communicate something that will remain sensitive long-
| term, with QC you aren't betting against the future producing
| something surprising.
|
| QC already works, it's getting cheaper and faster, _and_ more
| network friendly. It's not ready for the internet yet, but it's
| getting there. We don't need it for information that changes
| every few years, like credit card info, but that's not _all_
| people use cryptography for, even today.
| awestroke wrote:
| That sounds very interesting. Where can I read more about QC
| over internet and how we're getting closer to it?
| DebtDeflation wrote:
| https://scottlocklin.wordpress.com/2019/01/15/quantum-
| comput...
| marcosdumay wrote:
| QC requires direct contact between sender and receiver (if you
| have direct contact, why do you care about cryptography at
| all?); can't be used on stored data; can't prove authenticity;
| and can inherently be broken by a man in the middle attack.
|
| So, no, it's not a viable replacement for anything.
| fsh wrote:
| Quantum key distribution is completely unsuitable to replace
| asymmetric cryptography, even when ignoring the (enormous)
| technological difficulties. It fundamentally cannot handle
| authentication, and requires a pre-shared key to prevent MITM-
| attacks, so one can replace it with any modern stream cipher
| (which is astronomically more economical and doesn't require
| physical point-to-point connections). Even exchanging an SSD
| with a long key and using a one-time pad is far superior to
| QKD.
| HappyPanacea wrote:
| Can't we sign the key with SPHINCS+ and be back to the same
| situation as with classical asymmetric cryptography?
| pclmulqdq wrote:
| Don't knock sending around gigantic one-time pads on SSDs
| until you try it :)
|
| In all seriousness, QKD also has huge SNR problems over long
| range. QKD is really a layer 1 protocol that needs you to
| entangle qubits far away from each other, and that relies on
| having a very clean physical layer. You can sort of do it
| over short-range fiber right now with a known-to-be-direct
| connection, but any other medium is still off the table. Once
| you have done the QKD layer 1, put any protocol you want on
| top of it.
| vlovich123 wrote:
| QC at a very, very limited scale works, and it's not clear
| that it can be made to scale to solve problems faster than
| classical computers.
| adastra22 wrote:
| > But even if a quantum computer is never built, new PQ standards
| are safer, more resilient, and more flexible than their classical
| counterparts.
|
| lol wat? Nothing could be further from the truth. Just a few
| years ago, two of the four finalists of the NIST PQC were broken
| so badly that you could break production key sizes in hours with
| a 10 year old Mac mini. Most PQC deployments aren't broken
| because they double up with classical crypto systems so that you
| have to break both, but people who were using these schemes were
| basically wasting their time.
|
| Key sizes and signatures are typically much, much larger than
| classical cryptosystems.
|
| PQC systems lack the flexibility of classical cryptography.
| Generally speaking the linear structure of classical schemes is
| what quantum computers exploit, and so it is gone from most
| (all?) PQC systems. That linear structure is what lets you do
| things like make a 2-of-2 by adding pubkeys and signatures. I'm
| not sure what is meant by flexibility if not stuff like this.
| some_furry wrote:
| > Just a few years ago, two of the four finalists of the NIST
| PQC were broken so badly that you could break production key
| sizes in hours with a 10 year old Mac mini.
|
| SIKE was not a Finalist, it was a Round 4 candidate.
|
| Candidates being broken before the process finishes is a sign
| of the process working.
| thesz wrote:
| https://cacm.acm.org/news/nist-post-quantum-cryptography-
| can...
|
| "Belgian researchers have cracked the SIKE cryptographic
| algorithm, a fourth and final-round candidate that the U.S.
| National Institute of Standards and Technology (NIST) was
| evaluating for its Post-Quantum Cryptography (PQC) standard."
|
| SIKE was a finalist; it entered the final round.
| ohxh wrote:
| > These are all special instances of a more general computational
| problem called the hidden subgroup problem. And quantum computers
| are good at solving the hidden subgroup problem. They're really
| good at it.
|
| I assume they mean the hidden subgroup problem _for abelian
| groups_? Later they mention short integer solutions (SIS) and
| learning with errors (LWE), which by my understanding both rely
| on the hardness of the shortest vector problem, corresponding to
| the hidden subgroup problem for some non-abelian groups. I
| haven't read into this stuff for a while, though.
___________________________________________________________________
(page generated 2024-07-01 23:01 UTC)