[HN Gopher] 2048 Bit RSA and the Year 2030
___________________________________________________________________
2048 Bit RSA and the Year 2030
Author : upofadown
Score : 117 points
Date : 2023-07-10 19:58 UTC (3 hours ago)
(HTM) web link (articles.59.ca)
(TXT) w3m dump (articles.59.ca)
| nailer wrote:
| Is there a quantum algorithm for cracking ECDSA like Shor's for
| RSA? I was hoping they'd mention it in the article.
| akvadrako wrote:
| In practice, a quantum computer would break ECC much sooner
| than RSA, since ECC uses much smaller keys and the main issue
| with quantum computers is scaling.
| frutiger wrote:
| Yes. See
| https://crypto.stackexchange.com/questions/53425/quantum-
| alg....
| bawolff wrote:
| I was under the impression that Shor's algorithm worked for
| ECDSA just as well as RSA.
| gary_0 wrote:
| Shor's algorithm can also break elliptic curve encryption:
| https://en.wikipedia.org/wiki/Post-quantum_cryptography
| mjevans wrote:
| Part of the issue as a prospective cryptographic
| user/consumer is that not only do I not know which
| algorithm(s) should be used, but the most likely library,
| https://github.com/open-quantum-safe/liboqs, also explicitly
| states that it shouldn't be used in production.
|
| Hybrid deployment (e.g. with ECC using a curve like 25519)
| is a great recommendation, and probably obvious: far more so
| than picking a winner among the available, possibly-safe
| post-quantum algorithms.
| tptacek wrote:
| It is more or less universally assumed in practice that PQ
| key agreement algorithms will be paired with classical
| (RSA/ECC) cryptography. There's not much need to discuss or
| standardize it; it's a given.
| ivlad wrote:
| Doesn't the NSA object to hybrid schemes on the weird
| grounds that current implementations are full of
| implementation errors, while all-new PQ-only ones somehow
| won't be?
|
| Edit: reference https://mailarchive.ietf.org/arch/msg/spa
| sm/McksDhejGgJJ6xG6...
| tptacek wrote:
| That doesn't make much sense as an objection, since the
| classical cryptography code is better ironed out than the
| PQ stuff, and the logic to combine the two is fairly
| simple.
|
| Unless there is an unexpected leap in the viability of
| quantum cryptanalysis, you should expect that all
| commercial/standard cryptography with PQ capabilities
| will run in a hybrid configuration.
|
| I'm only commenting here because there's a pervasive
| belief that this is controversial in cryptography
| engineering circles, or that NIST is trying somehow to
| prevent hybrid schemes from happening, which is simply
| not the case --- though they may not bother to
| standardize any particular mechanism of combining ECC/RSA
| with PQ exchanges (but: they don't standardize stuff like
| TLS ciphersuites, either).
| nailer wrote:
| So ECDHE becomes EC(Post Quantum Exchange)? Am I
| understanding that correctly?
| mjevans wrote:
| I suspect you do not understand given the phrasing.
|
| Think of Tunneling or layers or nesting dolls. The order
| doesn't particularly matter from a security perspective.
| Though today I'd wrap with the conventional algorithm on
| the OUTSIDE layer, since it takes less computational time
| to check/validate. The inner layer would then be one or
| more of the post-quantum algorithms; a particularly
| paranoid application might use more than one if something
| absolutely must remain a secret.
| tptacek wrote:
| A good Google search term is "CECPQ2". Roughly: run a PQ
| KX to reach a shared secret, run an independent X25519 to
| reach a second shared secret, and then HKDF them both
| (conceptually: just cat them together and use them as a
| single secret).
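|
| A minimal sketch of that combiner step in Python (my own
| illustration, not CECPQ2 itself; it assumes the pyca
| `cryptography` package, and the two secrets are random
| stand-ins for the outputs of the two real key exchanges):
|
|     import os
|     from cryptography.hazmat.primitives import hashes
|     from cryptography.hazmat.primitives.kdf.hkdf import HKDF
|
|     pq_secret = os.urandom(32)      # stand-in: PQ KEM output
|     x25519_secret = os.urandom(32)  # stand-in: X25519 output
|
|     # Concatenate both secrets and feed them to HKDF; an
|     # attacker must break *both* exchanges to get the key.
|     key = HKDF(
|         algorithm=hashes.SHA256(),
|         length=32,
|         salt=None,
|         info=b"hybrid key agreement",
|     ).derive(pq_secret + x25519_secret)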
| HappyPanacea wrote:
| Why not just XOR them? Is the reason we use HKDF that if
| one or both is only partially broken (only some bits, or
| some correlation) instead of fully broken, we still
| benefit from the information that is still there?
| flangola7 wrote:
| Every mainstream asymmetric cipher is broken by quantum
| computing. Post-quantum ciphers are only this year slowly
| rolling out in canary & beta platforms, and they demand
| more memory and CPU cycles.
|
| A conservative but reasonable risk assumption is to act as if
| all internet traffic prior to the year 2023 was transmitted in
| the clear. This includes Signal, PGP, and Tor.
| pjs_ wrote:
| https://arxiv.org/abs/2306.08585
| gregw2 wrote:
| Is factoring something that GPU/CUDA parallelism helps with?
| upofadown wrote:
| Not for the sort of technology we have today using the best
| known algorithm. The approach relies on the idea of sieving.
| Sieving takes a lot of memory. As an example, the most recent
| 829 bit factoring record used multiple GB per processor
| during the parallel phase and 1.5 TB for the final matrix
| reduction phase. Neither phase would really get much out of a
| bunch of processors attached to just a few GB.
| klabb3 wrote:
| I'm not a cryptographer, but I can see many more pressing reasons
| for migrating off RSA before 2030. Is there any reason to pick
| RSA for greenfield today?
|
| RSA, to my knowledge, is vulnerable to side channels and poor
| parameter choices. Implementation simplicity is an underrated
| security parameter. The fewer feet you have, the fewer of them
| you can shoot.
|
| The NSA data centers don't want to waste time on your RSA key
| anyway, much less your run-of-the-mill Russian black hat
| groups. What bites us in practice are 0-days in something
| stupid like Heartbleed or Rowhammer that can be automated and
| take a long time to patch.
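|
| A minimal illustration of one classic "poor parameter
| choice" failure (my own toy numbers, not from the article):
| if two RSA moduli were generated with a shared prime, say
| from a bad RNG, a plain GCD recovers the common factor and
| breaks both keys instantly.
|
|     from math import gcd
|
|     p, q1, q2 = 61, 53, 59   # toy primes; real ones ~1024 bits
|     n1, n2 = p * q1, p * q2  # two moduli accidentally sharing p
|
|     shared = gcd(n1, n2)     # polynomial time, no factoring
|     assert shared == p
|     print(f"n1 = {shared} * {n1 // shared}")  # n1 = 61 * 53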
| upofadown wrote:
| If you accept the current assumption of double exponential
| growth for both computing performance and algorithmic
| efficiency, then you would have to accept that 256 bit
| elliptic curve keys will become obsolete in the 2040s. So it
| might be just RSA and discrete logs today, but a requirement
| for pointlessly long EC keys will be along soon.
| klabb3 wrote:
| Yeah, but does it matter? Either that's true or it isn't. If
| it is true, we'll (as usual) have ample time to migrate. RSA,
| though, is already more complex and slower today, with about
| 8x larger key sizes. And the cryptanalysis track record
| against it is afaik (please correct me) much more successful
| than against ECC, so it's "higher risk" that the timeline
| gets pushed forward or that new patches are needed to avoid
| bad parameter choices.
|
| > So it might be just RSA and discrete logs today but a
| requirement for pointlessly long EC keys will be along soon.
|
| It wouldn't be pointless if computers can crack those sizes.
| It'd only be pointless if cryptanalysis can exploit structure
| to reduce the effective entropy, no?
| woodruffw wrote:
| This is my (applied) cryptographic understanding as well: RSA
| 2048 probably won't be broken by improvements to prime
| factorization by 2030, but will _continue to be_ a pointlessly
| dangerous (and slow) cryptosystem when compared to ECC.
| AnotherGoodName wrote:
| I'm not convinced on the first part, that it won't be broken
| by improvements to prime factorization.
|
| https://www.ams.org/notices/199612/pomerance.pdf has a great
| writeup on the history of the work around this. Essentially,
| you see improvements in complexity of the form
|
| Old best, Quadratic Sieve:
| exp(c (log n)^(1/2) (log log n)^(1/2))
|
| New best, General Number Field Sieve:
| exp(c (log n)^(1/3) (log log n)^(2/3))
|
| I can't help but feel that the exponent we've moved to 1/3
| could be moved further. Sure, we don't know how, and we've
| been stuck on the current best for just over 25 years, but I
| just feel that if you give me two methods and one moves a
| term like that, there's a good chance there's a way to
| reduce that term further. It'd be weird for that complexity
| statement to stay as is. It's telling me "the universe
| doesn't allow factorization any faster than a term raised to
| a power of 1/3" and I'm asking "why is 1/3 so special?". So
| I'm not convinced that there's not more here. I don't have a
| clue how, of course. But the history of RSA going from
| "256 bits is secure" to 512 bits to 1024 bits to 2048 bits
| being needed has me worried about the hardness of factoring.
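|
| As a rough back-of-the-envelope (my own illustration, not
| from the paper; I'm plugging in the standard asymptotic
| constants, c = 1 for QS and c = (64/9)^(1/3) for GNFS, so
| real costs differ), here is what that exponent change buys
| at common key sizes:
|
|     import math
|
|     def l_cost(bits, alpha, c):
|         # exp(c (ln n)^alpha (ln ln n)^(1-alpha)), as bits
|         ln_n = bits * math.log(2)
|         work = c * ln_n**alpha * math.log(ln_n)**(1 - alpha)
|         return work / math.log(2)
|
|     for bits in (1024, 2048, 3072, 4096):
|         qs = l_cost(bits, 1/2, 1.0)
|         gnfs = l_cost(bits, 1/3, (64 / 9) ** (1 / 3))
|         print(f"RSA-{bits}: QS ~2^{qs:.0f}, "
|               f"GNFS ~2^{gnfs:.0f}")
|
| For a 2048 bit modulus this gives roughly 2^146 for QS
| versus 2^117 for GNFS, which is why that 1/3 exponent
| matters so much.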
| forgotmypw17 wrote:
| The top reason to use RSA for me, as with most tech I use,
| is its longevity.
|
| The Lindy effect has been the best predictor of what will
| still work in five years.
| raggi wrote:
| The Lindy effect applies to non-perishables. There are
| alternatives which are much more likely to qualify as the
| cryptographic equivalent of non-perishable. Per this
| article, RSA does not.
| withinboredom wrote:
| The fact that the sun rose today has no effect on whether
| the sun will rise tomorrow.
| Strom wrote:
| And yet I doubt you would be willing to bet against the sun
| rising tomorrow.
|
| Our understanding is based on imperfect models, sure. That
| doesn't matter most of the time. It wouldn't matter in this
| bet.
|
| So much of what any lifeform does is based on past
| experience, even though that experience isn't the direct
| driver of future effects. Turns out that bets based on
| experience work really well.
| withinboredom wrote:
| Of course I'd bet the sun would rise tomorrow, because if
| it doesn't, I'm either dead or money will be worthless...
|
| The same applies here, would you bet on a horse that is
| flagging (RSA won't work forever)? We have the ability to
| take in new information, and throw away past information
| because it is no longer relevant. If you choose to ignore
| the new information, just because "it's always been that
| way", that doesn't seem rational.
| cbm-vic-20 wrote:
| Indeed. The sun has successfully risen every morning,
| hundreds of billions of times straight! That's a pretty
| good record.
| HappyPanacea wrote:
| The McEliece encryption algorithm was published in 1978 (one
| year later than RSA) and seems to be considered safe against
| both classical and quantum computers; the only downside is
| the huge public key size.
| bsder wrote:
| One of the other problems with RSA cracking progress is that
| not a lot of people care anymore.
|
| RSA is so slow that a _lot_ of people have switched to Elliptic
| Curve.
|
| That's going to dent progress as the smart people are all working
| on ECC instead of RSA.
| bawolff wrote:
| It's not totally implausible that a factoring breakthrough
| in ECC could have implications for RSA.
| woodruffw wrote:
| Could you say more? ECC's trapdoor function isn't prime
| factoring; it's a variant of the discrete log problem.
| ori_b wrote:
| https://arxiv.org/abs/quant-ph/0301141
| woodruffw wrote:
| Yes, I know that DLP and prime factorization are similar
| problems. I was trying to understand if the GP's comment
| was about improvements to prime factorization
| _specifically_ having implications for DLP (since
| "factoring breakthrough in ECC" is an unusual phrasing to
| me.)
| [deleted]
| roomey wrote:
| I think the point is that as computers get faster there is
| less trade-off in having longer bit lengths, rather than
| focusing on the potential to crack said keys.
|
| That is, if it costs very little to have larger keys, why not
| have larger keys?
|
| It is essentially hedging your bets as even if quantum computing
| key factorisation works, key lengths will still have an impact on
| the difficulty of factorisation, and it may make a difference in
| terms of practicality.
| nailer wrote:
| They cover that in the article:
|
| > Quantum computing of the sort intended to break RSA involves
| a breakthrough in both computing and algorithms. Normally some
| sort of new computing technology is invented and then
| algorithms are designed to enable that technology to do
| something useful. The quantum computing threat to RSA is
| different. We now have the algorithm (Shor's) but the computer
| to run it on only exists in our imagination.
|
| > If someone were to invent such a computer then RSA 2048 would
| be immediately and trivially breakable. RSA 3072 would also be
| trivially breakable. The same applies to RSA 4096 and 8192.
| hujun wrote:
| > That is, if it costs very little to have larger keys, why
| not have larger keys?
|
| It is often very expensive/difficult to change the length of
| an RSA key that is part of existing platform/infrastructure,
| like a key in a TPM/HSM/CA infrastructure, regardless of how
| fast the computer's CPU is.
| zokier wrote:
| But RSA has been on the way out for a long time, and
| short-keyed RSA doubly so. I would estimate that since maybe
| 2015ish, deploying stuff that is coupled to 2048 bit RSA
| would have been a mistake. That gives a generous 15ish year
| transition period. Anyone who cares even the slightest
| should manage that transition in that sort of timeframe.
| tptacek wrote:
| Why would deploying 2048 bit RSA be a mistake? If you
| believe 2048 is threatened in a meaningful time frame, when
| 1024 hasn't even been broken (thus sort of implying that
| the collapse of 2048 will occur in a much shorter time
| frame than the one separating 512 and 1024), is there any
| realistic RSA key size that should make you comfortable?
| bawolff wrote:
| > even if quantum computing key factorisation works, key
| lengths will still have an impact on the difficulty of
| factorisation, and it may make a difference in terms of
| practicality.
|
| I mean, the whole thing with quantum computer factoring is
| that it scales well. Getting to 2048 bit RSA seems really,
| really difficult. But if we ever get there, getting to 4096
| is probably just a tiny extra step.
| alphager wrote:
| The current state of quantum computing suggests that
| increasing the available qubits takes exponential effort.
| Going from RSA 2048 to RSA 4096 would not just double the
| effort required on the quantum side.
| upofadown wrote:
| Wouldn't that effectively prevent a threat to 2048 bit RSA
| in the first place?
| ihattendorf wrote:
| Almost like there's no free lunch. I don't see quantum
| computing ever really taking off without a fundamental
| breakthrough in our understanding of physics.
|
| Would love to be proven wrong though if my understanding
| is incorrect and there's actually a feasible path towards
| quantum computing at scale.
| bawolff wrote:
| I don't think there are any fundamental physics reasons
| preventing quantum computers. My understanding is that it
| is an engineering problem. A hard one, no doubt, but not a
| fundamental physics one.
|
| Anyway, my point was that getting a quantum computer to a
| decent scale is really difficult. If we manage to overcome
| that burden somehow, the difference between 2048 bit RSA
| and 4096 bit is peanuts.
| throw0101a wrote:
| Recommendations from various organizations can be found at:
|
| * https://www.keylength.com
|
| Anything recent (>=2016) seems to say 3072 for RSA.
| q845712 wrote:
| I have no specialist knowledge in this subfield, but after
| reading the article's argument that if you could sic the
| entire bitcoin network on 2048 bit RSA it would take 700+
| years, I have to wonder about perverse incentives.
|
| Another thing that's missing is the lifetime expectancy,
| e.g. "for how many years does something encrypted in 2030
| need to be unbreakable?"
|
| The author doesn't seem to be a big authority, so they have
| little to lose by staking their reputation on "you don't
| need it to be that good," whereas by the very nature of
| their authority, anyone in the resource you link is
| motivated to never be wrong under any circumstances. So if
| someone with reputation/authority/power to lose thinks
| there's a 0.001% chance that some new incremental
| improvements will allow fast-enough breaking of 2048 bit
| encryption created in 2030, within a window where that
| would be unacceptable, then they're motivated to guess
| high. The authority in this case doesn't directly bear the
| costs of too high a guess, whereas it could be very bad
| for, I dunno, some country's government, and by extension
| the org or people that made that country's standards
| recommendations, if some classified information became
| public 15 or 50 years earlier than intended just because
| it could be decrypted.
| traceddd wrote:
| I assume you're aware, but for clarity: it's not possible to
| sic the bitcoin network on anything, even cracking SHA-256,
| which it uses internally, due to hard-coded ASICs that
| incorporate specific quirks of the proof-of-work.
| slater wrote:
| Server appears to be hugged, web archive link:
|
| https://web.archive.org/web/20230710195916/https://articles....
| mmaunder wrote:
| That AI will accelerate algorithmic improvements is a new
| factor that has not previously been taken into account. The
| article's timeline may be too optimistic.
| Conscat wrote:
| There's practically no reason to think that AI will greatly
| accelerate the development of new algorithms.
| zone411 wrote:
| Why? Because we're nearing the theoretical limits or because
| AI will be too dumb to help or for other reasons?
| thomashabets2 wrote:
| I use GPG plus Kyber (quantum resistant). RSA may break, and
| Kyber might suck. But I'm hoping not both.
|
| https://github.com/ThomasHabets/kybertest
| Aardwolf wrote:
| For symmetric encryption, it says: 'Current key size: 112
| bits'
|
| However, the 3 linked examples, AES, ChaCha20, and Camellia,
| all use a key size of at least 128 bits, with 192 or 256
| bits also listed as options.
|
| What does this current NIST key size recommendation (effective as
| of 2019) of 112 mean then? Does anyone use this size?
| zokier wrote:
| The 112 bit effective key size almost certainly refers to
| 3DES: it has 168 key bits, but meet-in-the-middle attacks
| reduce its effective strength to about 112 bits.
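|
| A toy sketch of why meet-in-the-middle halves layered
| encryption (my own illustration with a made-up 8-bit keyed
| permutation, not DES): recovering (k1, k2) from
| E_k2(E_k1(m)) costs about 2 * 2^8 table operations instead
| of brute forcing all 2^16 key pairs.
|
|     import random
|     from collections import defaultdict
|
|     def make_perm(key):          # toy keyed 8-bit permutation
|         table = list(range(256))
|         random.Random(key).shuffle(table)
|         return table
|
|     ENC = [make_perm(k) for k in range(256)]
|     DEC = [{v: i for i, v in enumerate(t)} for t in ENC]
|
|     def enc(k, b): return ENC[k][b]
|     def dec(k, b): return DEC[k][b]
|
|     k1, k2 = 0x3A, 0xC5          # the "unknown" keys
|     m1, m2 = 0x42, 0x77          # two known plaintexts
|     c1, c2 = (enc(k2, enc(k1, m)) for m in (m1, m2))
|
|     # Tabulate the forward half, scan the backward half, and
|     # confirm each candidate on the second known pair.
|     forward = defaultdict(list)
|     for g1 in range(256):
|         forward[enc(g1, m1)].append(g1)
|     for g2 in range(256):
|         for g1 in forward[dec(g2, c1)]:
|             if enc(g2, enc(g1, m2)) == c2:
|                 print(f"keys: k1={g1:#04x} k2={g2:#04x}")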
| upofadown wrote:
| That's pretty much for 3DES, which is disliked for other
| reasons. The point still stands though...
| jbarrs wrote:
| I find it amusing that the table published a conservative cutoff
| year and an optimistic cutoff year. Based on the trends I've
| seen, most non-critical software would probably have made the
| switch in time for the conservative year, whereas anything
| security-critical like a bank would probably use the optimistic
| year.
| upofadown wrote:
| The author of the paper had a problem: his elliptic curve
| method seemed like it might overtake the best known
| factoring algorithm at the time. So the conservative
| estimate takes that into account. The elliptic curve method
| never managed supremacy, so the optimistic estimate is
| actually more relevant. That means that the actual
| prediction is 2040, but it seems the various national
| cybersecurity entities might have missed that point.
| sansseriff wrote:
| We can expect a quantum computer with 20 million noisy
| qubits to break RSA 2048 [1].
|
| I can't speak to coherence time or circuit depth concerns, but
| qubit counts are doubling roughly every year. Current chips have
| thousands of qubits, so the exponential scaling implies we'd have
| 20 million qubits by 2035-2040.
|
| edit: And from the paper, the required quantum volume
| ("megaqubitdays") scales between O(n^3) and O(n^4) with RSA
| key length. So a few years after breaking RSA 2048, you'd
| have a computer five times larger that could break RSA 3072.
|
| [1] https://quantum-journal.org/papers/q-2021-04-15-433/
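|
| A toy version of that extrapolation (my own arithmetic
| mirroring the reasoning above, not a forecast; the ~1,000
| starting count is an assumption):
|
|     import math
|
|     start_year, start_qubits = 2023, 1_000
|     target = 20_000_000          # noisy qubits, per [1]
|
|     doublings = math.log2(target / start_qubits)
|     print(f"{doublings:.1f} doublings -> "
|           f"~{start_year + math.ceil(doublings)}")
|     # 14.3 doublings -> ~2038, inside the 2035-2040 window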
| mirekrusin wrote:
| > Current chips have thousands of qubits,
|
| Really? I thought we had 70 max?
| cwillu wrote:
| I suspect D-Wave nonsense was involved in the confusion.
| fsh wrote:
| Google's "quantum supremacy" paper was about a 53 qubit chip
| in 2019 [1]. This year, they reported having 70 qubits [2].
| Looking at the papers (and supplements), the gate fidelities
| and qubit lifetimes have stayed roughly the same. This
| really doesn't look anything like Moore's law to me.
|
| [1] https://doi.org/10.1038/s41586-019-1666-5
|
| [2] https://doi.org/10.48550/arXiv.2304.11119
| marcosdumay wrote:
| Last time I took around 1 hour to do a review of the
| literature, I discovered that the size of those computers
| was growing at a _linear_ rate of around 4 qubits/year.
|
| It got slightly faster in the last few years; I don't know
| if inherently so or just due to random fluctuation. Yet
| people keep repeating the claim that the growth is
| exponential, based on no evidence at all.
| krastanov wrote:
| For example, is this developer roadmap far from the truth?
| https://newsroom.ibm.com/2022-11-09-IBM-Unveils-400-Qubit-
| Pl...
|
| While it is too early to call it exponential, it is wildly
| faster than 4 qubits/year. The same can be said about other
| hardware systems (trapped ions, neutral atoms, and with some
| caveats photonics and color centers too).
| orlp wrote:
| If we hadn't invented photolithography, we'd still be using
| computers the size of buildings, limited to universities and
| military installations.
|
| Right now no one knows how to build a scalable quantum
| computer. But as soon as we find out the equivalent of
| lithography for building quantum chips, the progress will
| come, and it will come quickly and suddenly.
| Mistletoe wrote:
| How do you know there is an equivalent process for quantum
| chips?
| krastanov wrote:
| There are some equivalents already in existence (spin
| qubits in CMOS processes) that use the exact same
| lithography techniques. There are a few other potential
| alternatives too. All of them suffer from harder
| engineering challenges than the much "easier" to produce
| (in small volumes) transmons and trapped
| ions/atoms/molecules. Thus a plausible future is one in
| which the first somewhat-useful quantum computers are
| tens of thousands of qubits in transmon/ion hybrid
| systems. Then (in that plausible future) the spin qubits
| in CMOS catch up and, thanks to their superior
| manufacturing scalability, blow past the capabilities of
| transmons/ions.
|
| Not too different from how we had to use vacuum tubes
| while we figured out how solid state systems could work...
| Or spinning hard drives before we figured out SSDs.
|
| Or maybe none of this will work out and the whole field
| will be a bust... but you know Clarke's laws:
| https://en.wikipedia.org/wiki/Clarke%27s_three_laws
| jessriedel wrote:
| At least as of 2020, it looked like both qubit counts and two-
| qubit gate quality were improving exponentially, and our naive
| extrapolation said RSA 2048 wouldn't get cracked by 2039 at 95%
| confidence.
|
| https://arxiv.org/abs/2009.05045
|
| As far as I can tell, this website suggests that two-qubit gate
| infidelity has continued to improve exponentially, although
| it's hard to tell if these are reliable datapoints.
|
| https://metriq.info/Task/38
|
| Because there's typically an engineering tension between qubit
| count and gate quality, what you want to track is something
| like quantum volume, which looks to be on trend
|
| https://metriq.info/Task/34
|
| but it's notable that Google achieved quantum supremacy without
| having amazing quantum volume numbers, so it's not a perfect
| metric
|
| https://spectrum.ieee.org/quantum-computing-google-sycamore
|
| Worth noting that once you cross the fault tolerant
| threshold you will probably see a big shift in how
| engineering effort is distributed: many researchers think
| it will be harder to improve gate quality than to just
| increase qubit counts to make up for it, so you may see
| gate quality stall and extrapolation become even less
| reliable.
| akvadrako wrote:
| _> once you cross the fault tolerant threshold_
|
| If we cross the threshold. Until we cross it, scaling the
| number of qubits requires an exponentially lower noise floor,
| so the Moore's law equivalent is basically adding a fixed
| number of qubits per year.
| tptacek wrote:
| More detail on progress towards quantum factoring of RSA (among
| other things):
|
| https://sam-jaques.appspot.com/quantum_landscape
|
| I'm not super concerned. Cynically: one big driver of research
| into PQ cryptography is that it's a full employment program for
| academic cryptographers. Not that there's anything wrong with
| that! I like a Richelot isogeny as much as the next guy, or I
| would if I understood what they were.
| sweis wrote:
| The record quantum computers can factor is 21 -- and that is
| by cheating: already knowing the factors are 3 and 7. There
| are other results that use special form composites, which
| don't count.
|
| So a QC can factor a 5 bit number with Shor's algorithm in
| 2023 (with some cheating). That record has not changed for
| 10+ years.
|
| I publicly bet 8 years ago that nobody would factor the number
| 35 by 2030. I hope I'm proved wrong.
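|
| For flavor, here is the classical arithmetic wrapped around
| Shor's algorithm (my own toy; the quantum speedup is solely
| in finding the order r, which this sketch brute-forces),
| factoring 21 the "Shor way":
|
|     from math import gcd
|
|     def order(a, n):        # smallest r with a^r = 1 mod n
|         r, x = 1, a % n     # classically the hard part; this
|         while x != 1:       # step is the quantum computer's
|             x = (x * a) % n # one job
|             r += 1
|         return r
|
|     n, a = 21, 2
|     r = order(a, n)         # r = 6 for a = 2
|     if r % 2 == 0:
|         f = gcd(a ** (r // 2) - 1, n)
|         print(n, "=", f, "*", n // f)   # 21 = 7 * 3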
| capableweb wrote:
| Unless you're worried about storing and/or transmitting a
| huge number of keys (on the order of "at least 100/second")
| and/or using one key "at least 100 times/second", why not
| just go for 4096 by default?
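|
| If anyone wants numbers for the cost side of that question,
| here is a quick benchmark sketch (assuming the pyca
| `cryptography` package; absolute times vary by machine):
|
|     import time
|     from cryptography.hazmat.primitives import hashes
|     from cryptography.hazmat.primitives.asymmetric import (
|         rsa, padding,
|     )
|
|     for bits in (2048, 4096):
|         key = rsa.generate_private_key(
|             public_exponent=65537, key_size=bits,
|         )
|         t0 = time.perf_counter()
|         for _ in range(50):  # private-key ops dominate cost
|             key.sign(b"msg", padding.PKCS1v15(),
|                      hashes.SHA256())
|         dt = (time.perf_counter() - t0) / 50
|         print(f"RSA-{bits}: {dt * 1000:.2f} ms/signature")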
| upofadown wrote:
| If you accept the current assumptions, then you would have to
| accept that 4096 bit RSA will be obsolete by 2060.
| thewataccount wrote:
| That's a significant amount of time if we're talking about
| long-term file storage.
|
| I've had my Dropbox account for over 10 years now. Being
| concerned about a timescale of 20 to 30 years seems
| reasonable for things like long term file storage IMO.
|
| Backblaze, Dropbox, Google Drive, OneDrive, and AWS are all
| over a decade old.
| yjftsjthsd-h wrote:
| So only 37 years? I think I can live with that.
| bee_rider wrote:
| It depends on what you are transmitting, right?
|
| Hypothetically if you are a journalist working with
| communications from a source in an authoritarian country
| (or a country that could become authoritarian in the next 4
| decades; and name one that couldn't, right?) it would be no
| good if you got some elderly person killed in the future.
|
| Or just like bank account details I guess?
| trilobyte wrote:
| In that case the volume of traffic such a communication
| medium would need to handle is likely small enough that
| you can bump the key size high enough to ensure greater
| longevity, ideally past the lifetimes of those involved,
| and accept that transmitting data will take a small
| fraction of time longer.
| JohnFen wrote:
| No cryptographic scheme remains unbreakable forever. If
| what you want is permanent protection, cryptography is
| not the solution. If people are encrypting things
| expecting that encryption will never be broken, they're
| misusing encryption.
|
| The point of cryptography isn't to keep secrets forever,
| it's to keep secrets for long enough that by the time
| those secrets are revealed, they are worthless.
| supertrope wrote:
| If extreme security is needed it's time to turn off your
| computer, leave your cellphone at home, don't tell
| details to colleagues who do not have a need to know, and
| maybe resort to dead drop methods so even you don't know
| who the source is yourself.
| bee_rider wrote:
| Some people are willing to take on some extra risk to
| talk to journalists, and the world is a better place for
| their bravery.
|
| And we're talking about thousands of bits, we spend way
| more than that on stupid things like making UI slightly
| prettier. I'm streaming music at, I guess, ~200kbps, why
| not spend a couple seconds worth of music on keys? (Who
| knows, maybe it will protect some famous journalist
| somehow and we'll end up ahead when it spares us a whole
| moment of silence).
|
| Edit: TBH I'm not really sure this is a useful way to
| look at things, but the music bandwidth/moment of silence
| parallel was too convenient to pass up.
| devmor wrote:
| I would hope that someone invents a more robust cryptography
| protocol in the next 37 years.
| wongarsu wrote:
| A bit less, since you have to worry about an attacker
| storing anything you send now for future attacks.
|
| But if you only want any given secret to stay safe for 20
| years, you can still use 4096 bit RSA for another 17 years.
| Which sounds like a good tradeoff: enough time for better
| algorithms to get established, but little risk of a breach
| you will care about.
| H8crilA wrote:
| Because that will soon become "why not 128k". Or perhaps we
| should use 2 megabytes for each prime? We don't know, but it's
| better to be safe than sorry.
___________________________________________________________________
(page generated 2023-07-10 23:00 UTC)