[HN Gopher] NSA Backdoor Key from Lotus-Notes (2002)
___________________________________________________________________
NSA Backdoor Key from Lotus-Notes (2002)
Author : hosteur
Score : 281 points
Date : 2023-09-18 10:47 UTC (12 hours ago)
(HTM) web link (www.cypherspace.org)
(TXT) w3m dump (www.cypherspace.org)
| Quentincestino wrote:
| [flagged]
| denysvitali wrote:
| You probably meant: https://thebestmotherfucking.website/
| dartvox wrote:
| [dead]
| leoh wrote:
| Related: https://github.com/goshacmd/nsa_panel
| consoomer wrote:
| Wasn't the original backdoor in a code example the NSA provided
| to companies interested in using cryptography? They gave an
| example seed or whatever, and most companies copy/pasted it
| instead of generating their own primes, so the NSA could break it
| trivially.
|
| My memory around this is fuzzy and I can't seem to find the
| original source.
| _def wrote:
| This one? https://en.wikipedia.org/wiki/Dual_EC_DRBG
| spzb wrote:
| Dupe (2002!) https://news.ycombinator.com/item?id=21859581
|
| With no context, I don't know why this is front page news today.
| Am I missing something?
| dredmorbius wrote:
| This would be a _repost_ rather than a dupe.
|
| HN considers dupes to be stories _with significant discussion_
| repeated within a year. (Items with little or no discussion can
| be resubmitted a few times.)
|
| Stories reshared _after_ a year are reposts, and are perfectly
| fine, though it's appreciated to have the item's original
| publication year included in the title.
|
| <https://news.ycombinator.com/item?id=37312416>
|
| <https://news.ycombinator.com/newsfaq.html>
| [deleted]
| baby wrote:
| Are you asking what reposts are?
| ollemasle wrote:
| Adding the date in the HN title would be better (it is not
| present in the article)
| spzb wrote:
| No. I'm pointing out that (a) it's not marked as being from
| 2002 and someone would therefore assume it was some newly
| discovered backdoor and (b) there's no context or commentary
| as to why it is relevant in 2023.
|
| Also, on closer inspection the story is from 1997
| https://catless.ncl.ac.uk/Risks/19.52.html#subj1
| dredmorbius wrote:
| I've pinged mods to fix the year based on that, thanks.
| boffinAudio wrote:
| I'd wager that it's still relevant today because the NSA is
| still the world's greatest wholesale violator of human
| rights, at massive scale, and literally nothing effective
| has been done about this situation - we are still
| tolerating this repression, because we don't see it and
| simply don't care enough about the human rights violations,
| as a people, to rein in this out-of-control agency.
|
| Bringing these articles to light is of great utility to
| those of us who do not consider the NSA state of affairs to
| be, in any way, tolerable.
| FredPret wrote:
| ... are you serious?
|
| You don't think military invasions & communist
| dictatorships constitute "wholesale violation of human
| rights at a massive scale"?
|
| If the NSA is spying on people, that's an invasion of
| their privacy, but it is nothing in comparison to those
| other violations.
| kmeisthax wrote:
| The NSA violates privacy at scale - a lot of little
| violations of civil liberties. It's the difference
| between robbing a man for everything he has, versus pick-
| pocketing 30 cents out of the pocket of every person on
| the planet.
|
| Furthermore, they're part of a larger intelligence
| apparatus that has absolutely committed very large and
| very harmful violations of civil liberties. The NSA's
| sister org, the CIA, was overthrowing democratically
| elected left-wingers in South America for decades,
| replacing them with brutal dictators and tyrants that
| gave both Hitler and Stalin runs for their money. The CIA
| wrote the book on how to do so, arguably even more so than
| the KGB did. In fact, the reason why Russia today[0] is
| so effective at information warfare and covert propaganda
| is specifically because they learned from observation.
|
| [0] Not(?) to be confused with Russia Today
| tptacek wrote:
| If you're thinking about overseas signals intelligence,
| then, like the signals intelligence practice in every
| industrialized state in the world, the chartered purpose
| of NSA is to conduct those privacy violations. The
| safeguards we're given against NSA --- take them as
| seriously as you want --- are about domestic
| surveillance.
| acdha wrote:
| > the NSA is still the worlds greatest wholesale violator
| of human rights, at massive scale, and literally nothing
| effective has been done about this situation - we are
| still tolerating this repression
|
| I don't approve of their actions but turning the
| hyperbole up to 11 doesn't help. There are millions of
| people in China who'd love to be only that repressed, for
| example.
| nonrandomstring wrote:
| I think a Microsoft coder recently came clean about some pretty
| funky stuff from the 90s and 00s. Hope I didn't hallucinate
| that.
| ranting-moth wrote:
| Link?
| qingcharles wrote:
| https://www.youtube.com/watch?v=vjkBAl84PJs
| ranting-moth wrote:
| Thanks!
| EvanAnderson wrote:
| I feel like you might be talking about Dave Plummer:
| https://www.youtube.com/@DavesGarage
|
| He recently gave a good talk at VCF, too:
| https://youtube.com/watch?v=Ig_5syuWUh0
| [deleted]
| thesuitonym wrote:
| It's amazing to me that the folks at the NSA had enough self-
| reflection to see that this is Big Brother behavior, but not
| enough to realize why that's a bad thing.
| masfuerte wrote:
| I'd guess that was snark from the Lotus engineer who embedded
| it.
| gregw2 wrote:
| It's worth reading Ray Ozzie's (the Lotus Notes creator's) comment
| on this from a 2013 HN discussion:
|
| https://news.ycombinator.com/item?id=5846189
|
| Before the software was released, Ray Ozzie and Kauffman openly
| described what they were doing at an RSA conference. This was not
| a secret back door. It was compliance with export controls
| everybody in the industry dealt with.
|
| Also worth reading barrkel's comment a couple comments down...
| grammers wrote:
| Whether secret or not, it was a backdoor that could be/was
| exploited. Today governments are asking for 'secret backdoors'
| from tech companies, not seeing the immense risks. Crazy times.
| ethbr1 wrote:
| For people younger than ~37, I'd remind them that crypto before
| 2000, especially in shipped commercial products, was playing
| under substantially different government restrictions.
|
| https://en.m.wikipedia.org/wiki/Crypto_Wars
|
| Effectively and in short, you were prohibited by the US
| government from shipping strong encryption in any
| internationally distributed product. Which generally meant
| everything commercial.
|
| Despite open source implementations of strong encryption
| existing (e.g. PGP et al.).
|
| Now, no one bats an eye if you ship the most secure crypto you
| want. Then, it was a coin flip as to whether you'd feel the
| full weight of the US government legal apparatus.
|
| It was a crazy, schizophrenic time.
| r3trohack3r wrote:
| IIRC this is part of what shifted hardware manufacturing out
| of the US.
|
| If you wanted to build in the U.S. you had to produce two
| versions of your product, one with "full encryption" and one
| with encryption hobbled.
|
| Or you could go build one version somewhere else and import
| it into the U.S.
| mike50 wrote:
| Similar situation with space hardware. Even COTS memory
| chips hardened for radiation and space are ITAR export
| restricted.
| archgoon wrote:
| I had never heard of this particular aspect of
| demanufacturing, that's fascinating. Do you know of any
| products where this was a deciding factor, or at least a
| major consideration? (I recognize you probably can't easily
| cite internal corporate documents)
| UI_at_80x24 wrote:
| For anybody who hasn't already read it, I highly recommend
| the book: "Crypto" by Steven Levy. I was 30% of the way
| through the book before I started recognizing real-world
| events, news stories, and whispered computer secrets, and
| realized that it wasn't fiction but was instead describing
| real history.
|
| https://www.goodreads.com/book/show/984428.Crypto?from_searc.
| ..
| matheusmoreira wrote:
| > It was a crazy, schizophrenic time.
|
| Still is. To this day, we have to debate and justify
| ourselves to these people. They make us look like pedophiles
| for caring about this stuff. They just won't give up, they
| keep trying to pass these silly laws again and again. It's
| just a tiresome never ending struggle.
|
| And that's in the US which is relatively good about this.
| Judges in my country were literally foaming at the mouth with
| rage when WhatsApp told them they couldn't provide decryption
| keys. Blocked the entire service for days out of spite,
| impacting hundreds of millions.
| [deleted]
| a1369209993 wrote:
| [flagged]
| tasty_freeze wrote:
| I down-voted this and I'll say why. I'm pretty dang
| liberal in my politics; my pushback isn't because I'm
| carrying water for right-wing groups.
|
| Q-Anon is a current right-wing conspiracy group that
| claims powerful Democrats are trafficking children; the
| "we must protect our kids from XYZ" justification crosses
| political lines. But they aren't alone.
|
| Back in the 90s there were a few years of "the satanic
| panic", where there were wild claims made about daycare
| centers doing unspeakable things to children, things that
| beggar belief just from a logistical perspective _.
| People spent years in prison over this. There was no
| whiff then of it being a conservative cause -- it mixed
| the usual conspiracy theory dynamics along with the
| Christian moral panic dynamics.
|
| Back in the 80s Tipper Gore, wife of then senator Al
| Gore, drove a campaign to label and censor music to
| "protect the children."
|
| _ e.g., children were coached into giving answers and
| making up scenarios. For instance, one child claimed that
| they were taken in an airplane and flown to a secret
| location with clowns and sex, then flown back to the
| class in time for their 2pm pickup. Stories about ritual
| animal sacrifice in their daycare room, stories about
| children being murdered even though none were reported
| missing.
| [deleted]
| wkat4242 wrote:
| It's not completely gone. If you implement crypto in an iOS
| app you have to get an "export license" even if you're not
| based in the US and don't publish your app there.
| fullspectrumdev wrote:
| I've had to sign ITAR related paperwork a few times for
| commercial software specifically because it was made in the
| US and being "exported" to the UK.
|
| Really boils my piss given a lot of it, upon inspection,
| just used OpenSSL under the hood.
| snakeyjake wrote:
| Windows 2000 came on a CD... and a floppy disk.
|
| The CD was a globally-legal image, and export-controlled
| strong crypto came on the floppy in countries where it was
| allowed.
|
| https://winworldpc.com/product/windows-2000-high-
| encryption/...
| LeifCarrotson wrote:
| How hard would it have been for a "rogue state" to get a
| copy of that floppy? I understand that times were
| different; you couldn't just PGP-encrypt it and attach a
| 1.44 MB blob to an email, sending it at 24 kbps. You
| couldn't just upload it to an anonymous filesharing site.
|
| But today it seems fundamentally obvious that once a single
| copy is leaked, it's all over... was that not true in 2000?
| icedchai wrote:
| It was. People were sharing pirated software on BBSes 40
| years ago! Downloading a floppy might take an hour. In
| the 90's, I knew kids who got jobs at ISPs just so they
| could run warez FTP sites off of the T1.
| fragmede wrote:
| Oh man, a T1. That brings back memories.
|
| Serial Port recently tried to set one up!
|
| https://youtu.be/MEda7SQxh18
| semi-extrinsic wrote:
| Gnutella, along with popular clients like LimeWire, was
| released around the same time as Windows 2000. People
| were doing decentralized filesharing of files larger than
| 1.44 MB just fine in 2000.
|
| Filesharing at that time was just wild, by the way. It
| was far too easy to set up your client such that you were
| _sharing the entire contents of your computer with the
| whole internet_. More often than not, this was done by
| the kids in the family on the same machine where mom and
| dad had their work stuff plus their private finances.
|
| So of course the files were leaked. If you were intending
| to share something illegal to distribute outside the US,
| you could easily get plausible deniability just by
| sharing everything on your computer and feigning
| ignorance.
| NL807 wrote:
| eDonkey and eMule were fire during those years.
| hunter2_ wrote:
| > a copy of that floppy
|
| Mostly off-topic, but your use of rhyme is reminiscent of
| https://www.youtube.com/watch?v=up863eQKGUI
| rconti wrote:
| We were sharing lots of 3-7MB files peer-to-peer at the
| time :D Napster, Limewire, Audiogalaxy, etc. Plenty of
| public FTP sites all over the place as well.
|
| Even in the late 90s, 128kbps ISDN connections were not
| unheard of, and 256kbps DSL was rolling out as well.
| wmf wrote:
| Of course all that stuff was leaked (and there were
| anonymous filesharing sites). The whole export-grade
| crypto thing was a legal fig leaf.
| pdw wrote:
| It was all extremely silly. Debian took a different
| approach: before 2005, they put all crypto packages in a
| separate "non-US" archive, hosted in the Netherlands.
| American developers weren't allowed to upload there. That
| way, Debian never exported crypto code from the United
| States, it only ever imported it.
| eastbound wrote:
| There was a story of a hundred programmers taking the
| program PRINTED ON PAPER to a conference in Sweden to
| type it in again, because somehow exporting the binaries
| was forbidden but exporting the printed version was not.
| Is it true? Which event organized this?
| wmf wrote:
| The PGP source code was published as a book so that it
| could be exported under the theory that the first
| amendment beats ITAR.
| https://philzimmermann.com/EN/essays/BookPreface.html
| wmf wrote:
| Yep, US export restrictions ended up spurring foreign
| investment in crypto like Thawte (founded by Mark
| Shuttleworth) and SSLeay (later forked as OpenSSL).
| sillywalk wrote:
| I believe OpenBSD (based in Canada) was in a similar
| situation.
| pkaye wrote:
| Not just US but other countries had their own restrictions.
| For example I think France didn't allow anything better than
| 40-bit encryption without key escrow.
|
| http://www.cnn.com/TECH/computing/9805/19/encryption/index.h.
| ..
|
| http://www.opengroup.org/security/meetings/apr98/french-
| regu...
| rozzie wrote:
| "Now, no one bats an eye if you ship the most secure crypto
| you want."
|
| The most surprising thing to me, in speaking with younger
| entrepreneurs over the past several years, is that they're
| not even aware of the obligation to file for an export
| license for any/all software containing crypto (such as
| that submitted to the App Store).
|
| I've not yet seen a case in which a mass market exemption
| isn't quickly granted, but devs still need to file - and re-
| file annually.
| bo1024 wrote:
| > It was a crazy, schizophrenic time.
|
| Or, we are currently experiencing a brief oasis of freedom in
| between extended periods of encryption lockdowns and
| controls.
| Jerrrry wrote:
| Yup, networks with a neuron count above a certain threshold
| (2+T?) will likely be on the IDAR restriction list again.
| SkyMarshal wrote:
| What's a neuron count?
| bagels wrote:
| Neuron in a neural network. Not sure if the parent is
| talking about models, software or hardware though.
| [deleted]
| bagels wrote:
| ITAR? Also, was there a time when there was a
| restriction based on neuron count?
| forgetfreeman wrote:
| That this is no longer the case is a fairly strong indication
| that The Powers That Be have durably resolved the issue of
| decryption.
| brudgers wrote:
| _Now, no one bats an eye if you ship the most secure crypto
| you want._
|
| To me, there are only two plausible explanations for the
| change:
|
| 1. The three letter agencies gave up on backdooring
| cryptography.
|
| 2. The three letter agencies successfully subverted the
| entire chain of trust.
|
| Only one of them is consistent with a workforce consisting of
| highly motivated codebreaking professionals working for many
| decades with virtually unlimited resources and minimal
| oversight.
|
| The other is what people want to believe.
|
| https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref.
| ..
| hn_throwaway_99 wrote:
| I think a 3rd option is actually much more likely and
| (semi) less conspiratorial:
|
| 3. NSA realized that "frontal assaults" against encryption
| were a lot less fruitful than simply finding ways to access
| info once it has been decrypted.
|
| Would have to search for the quote, but Snowden himself
| said exactly that, something along the lines of "Encryption
| works, and the NSA doesn't have some obscure 'Too Many
| Secrets' encryption-breaking machine. But _endpoint_
| security is so bad that the NSA has lots of tools that can
| read messages when you do." And indeed, that's exactly
| what we saw in things like the Snowden revelations,
| Pegasus, and I'd argue even things like side-channel attacks.
|
| Plus, I don't even know what "The three letter agencies
| successfully subverted the entire chain of trust" means. In
| the case of something like TLS root certificates that makes
| sense, but there are many, many forms of cryptography (like
| cryptocurrency) where no keys are any more privileged than
| any other keys - there is no "chain of trust" to speak
| about in the first place.
| maqp wrote:
| >I don't even know what "The three letter agencies
| successfully subverted the entire chain of trust" means.
|
| For one thing, they're interdicting hardware and
| inserting hardware implants:
|
| https://www.theguardian.com/books/2014/may/12/glenn-
| greenwal...
| wkat4242 wrote:
| I think that's basically what the parent's #2 point
| implies.
| smolder wrote:
| I've long (post-Snowden?) estimated the NSA's capabilities are
| roughly what you imply. Lots of implementation-specific
| attacks, plus a collection of stolen/coerced/reversed TLS
| certs so they can MITM a great deal of web traffic. US-
| based cloud represents another big backdoor for them to
| everyone's data there, I think.
| sethhochberg wrote:
| They've presumably got a pretty vested interest in making
| sure most communications are legitimately secure against
| most common attacks - arguably good for national security
| overall, but doubly good for making sure that if anyone
| can find a novel way in, it's them, and not any of their
| adversarial peers.
|
| There's a reason many corporate information security
| programs don't go overboard with mitigations for
| targeted, persistent, nation-state level attacks.
| Security is a set of compromises, and we've seen time and
| time again in industry that this sort of agency doesn't
| need to break your encryption to get what they need.
| knewter wrote:
| Did you forget about NIST curve recommendations?
| hn_throwaway_99 wrote:
| Not at all, considering that coincidentally just
| yesterday I was having an HN discussion on an unrelated
| topic about DJ Bernstein, https://en.wikipedia.org/wiki/D
| aniel_J._Bernstein#Cryptograp....
|
| You're right though, I guess I didn't mean to say that
| NSA would give up on or would not want back doors into
| widely deployed crypto algorithms, but even with
| Dual_EC_DRBG the suspicions were widely known and
| discussed before it was a NIST standard (i.e. I guess you
| could say it was a conspiracy, but it wasn't really a
| secret conspiracy), and the standard was withdrawn in
| 2014.
| hutzlibu wrote:
| When the NSA, for example, has access to the Intel ME or
| AMD's version of it (and I think they do), then they surely
| don't need to break any encryption. They don't even need
| to hack. They would just have direct access to most
| desktops/servers.
| smolder wrote:
| Attacking machines directly over the network is dangerous
| for them from the standpoint of detection, though. You
| can bet that any ME/PSP remote access exploits are used
| very carefully due to potential detection.
| hn_throwaway_99 wrote:
| Even this is too conspiratorial for me. Not because I
| believe the NSA wouldn't _like_ access, but because it's
| not the best approach. Convincing Intel or AMD to have a
| hidden back door, and to somehow keep it hidden, is
| a nearly impossible task. Compare that with just hunting
| for 0-days like the rest of the world, which the NSA has
| shown to be quite good at.
|
| Not saying there couldn't be a targeted supply chain
| attack (that's essentially what was revealed in some of
| the Snowden leaks, e.g. targeting networking cables
| leased by big tech companies), but I don't believe there
| is some widely dispersed secret backdoor, even if just
| for the reason that it's too hard to keep secret.
| maqp wrote:
| >Convincing Intel or AMD to have a hidden back door, and
| to somehow keep it hidden, is a nearly impossible
| task
|
| Interesting, how would an x86 instruction with a hardcoded
| 256-bit key be detected? IIRC it's really hard to
| audit the instruction space for a CISC architecture.
| hutzlibu wrote:
| Well sure, they would not use it for everyday standard
| cases, to limit exposure. Intel does have something to
| lose if this became public knowledge.
|
| But I cannot believe they resisted the temptation to use
| that opportunity to get such easy access to so many
| devices.
| ethbr1 wrote:
| Parent's point is that its very _existence_ (not just
| use, as this is hardware/firmware we're talking about)
| in widely deployed form would be too risky.
|
| Consequently, if there is an ME-subversion, it's only
| deployed / part-replaced for extraordinary targets. Not
| "every system."
| hutzlibu wrote:
| Huh? As far as I know every Intel ME has access to the
| internet, can receive pushed firmware updates, and has write
| access to everything else on the system. It does not need
| a modified version; they can just use the official way,
| the normal Intel ME on target devices, if they can cloak
| their access as the official server, which I think could
| be achieved by using just the key of the official server
| and then using another server posing as the official
| one.
|
| But it has been a while since I read about it and I never
| took it apart myself, so maybe what I wrote is not
| possible for technical reasons.
| toast0 wrote:
| I don't think that's the case. Don't you need to have a
| selected NIC, integrated properly to get the Intel ME
| network features? Typically branded as "Intel vPro"
|
| Otherwise, you need something in your OS to ship data
| back and forth between the ME and whatever NIC you have.
| bonzini wrote:
| vPro, also known as AMT, is proprietary and it's for
| professional desktop and laptop systems. ME instead is
| based on IPMI and is for server-class systems.
| ethbr1 wrote:
| That's... definitely not how sensitive networks work. To
| say nothing of airgapped ones.
|
| This seems like as good a short-form intro as any:
| https://blogs.cisco.com/learning/security-in-network-
| design-...
| hutzlibu wrote:
| I would believe really sensitive networks have ME
| deactivated anyway and need other, specialised
| infiltration methods.
|
| But when targeting a random individual in a hurry, I
| think it would be handy to just use the built-in
| backdoor.
| smolder wrote:
| At a minimum, _certain security-conscious consumers_
| (cough, DoD) were able to get Intel to include a hidden
| (not typically user-accessible) BIOS flag for disabling
| most features of the management engine. So they're at
| least concerned about it as a security risk. That doesn't
| _necessarily_ mean they also have backdoors into it, but
| it's not crazy to think they might. It's hard to be too
| conspiratorially minded with respect to intelligence
| stuff, if you aren't making the mistake of treating
| suppositions as facts.
| dannyw wrote:
| I have a workstation bought from eBay that has a "ME
| DISABLED" sticker on the chassis.
|
| Any analysis I could or should do?
| mkup wrote:
| Run Intel MEInfo utility, check if it reports "Alt
| Disable Mode" or anything like that. Article for some
| context: https://web.archive.org/web/20170828150536/http:
| //blog.ptsec...
| jraph wrote:
| I see another plausible explanation: The NSA is concerned
| with maintaining security of its own / the government's
| infrastructure / is interested in finding breaches in
| infrastructures of others.
|
| (this is speculation, I have no actual knowledge on this)
| lern_too_spel wrote:
| Only one is consistent with the documents that have been
| leaked since the change to export restrictions. The other
| is what the marketing department at Reynolds Wrap would
| like you to believe.
| sandworm101 wrote:
| They aren't backdooring modern open-source encryption. They
| may have some elite knowledge about some esoteric corner of
| the code that allows them to theoretically throw a data
| center at the problem for a month or two, but the days of
| easy backdoors to decrypting everything in real time are
| gone imho. It is just too easy to implement mathematically-
| strong encryption these days. Too many people know how to
| do it from scratch. The NSA's real job is keeping American
| systems safe. That is done through creating the best
| encryption possible. They are very good at that job.
| ethbr1 wrote:
| IMHO, the IC gave up on the _feasibility_ of maintaining
| hegemony over encryption, particularly in the face of non-
| corporate open source. You can't sue a book / t-shirt /
| anonymous contributors.
|
| Consequently, they still have highly motivated and talented
| cryptanalysts and vast resources, but they're attacking
| widely-deployed academically-sound crypto systems.
|
| Hypothetical encryption-breaking machines (e.g. large
| quantum computers) are too obviously a double-edged sword:
| who else has one? And given that possibility, wouldn't you
| switch to algorithms more secure against them?
|
| In reality, the NSA's preference would likely be that no
| such machine exists, but rather there are brute-force
| attacks that require incredibly large and expensive amounts
| of computational resources. Because if it's just a money
| problem, the US can feel more confident that they're near
| the top of the pile.
|
| Which probably means that their most efficient target has
| shifted from mathematical forced decryption to
| implementation attacks. Even the strongest safe has a
| weakest point. Which may still be strong, but is the best
| option if you need to get in.
| chaxor wrote:
| I don't know much about hardware, but is it not
| _possible_ that there is a small part of a chip somewhere
| deep in the highly complex systems we have that simply
| intercepts data prior to encryption and, if some condition
| is met (a remote connection sets a flag via hardware-set
| keys), encrypts/sends the data elsewhere? Something like
| that anyway. It seems possible, but idk how plausible it
| is, and whether things like the Linux kernel would even be
| able to report on it, if the hardware is not known well
| enough.
|
| Anyway, just suggesting something that wouldn't require
| quantum cryptography.
| ethbr1 wrote:
| As pointed out by another comment above, exfiltration
| then becomes the risky step.
|
| If that did exist, you'd still have to get packets out
| through an unknown network, running unknown detection
| tools. Possible, but dicey over the intermediate term.
|
| Who's to say they didn't just plug a box in, run a fake
| workload on it, and put all network traffic it emits
| under a microscope?
| yomlica8 wrote:
| Seems like you could just blast it out on one of the
| endless Microsoft telemetry or update channels that are
| chatting away all day and either intercept outside the
| network or with Microsoft's help. Only way to protect
| against that would be blocking all internet access.
| [deleted]
| hedora wrote:
| Note that ACME (Let's Encrypt) means that anyone that can
| reliably man-in-the-middle a server can intercept SSL
| traffic (modulo certificate revocation lists, and pinning,
| but those are mostly done by big sites with extremely broad
| attack surfaces).
|
| Similarly, most consumer devices have a few zero-days each
| year, if not more, so if you really want to decrypt
| someone's stuff, you just need to wait a few months.
|
| I think that both your explanations are probably incorrect
| though. It's a bit of "neither" in this case.
|
| They continue to backdoor all sorts of stuff (they recently
| were marketing and selling backdoored "secure" cell phones
| to crooks), and most chains of trust are weak enough in
| practice.
| woodruffw wrote:
| > Note that ACME (Let's Encrypt) means that anyone that
| can reliably man-in-the-middle a server can intercept SSL
| traffic (module certificate revocation lists, and
| pinning, but those are mostly done by big sites with
| extremely broad attack surfaces).
|
| I don't understand why you think ACME means this. Can you
| explain?
| icedchai wrote:
| Not the original poster, but if you can control responses
| to and from a server (MITM) you can get a TLS/SSL
| certificate issued for it easily. In the old days,
| getting a cert was quite a hassle! You used to have to
| fill out paperwork and perhaps even talk to a human. It
| could literally take weeks.
| woodruffw wrote:
| I don't think a MITM would be sufficient to fool ACME. As
| Let's Encrypt's guide explains[1], an attacker in the
| middle would still fail to possess the target's private
| key. As a result, the proof of possession check would
| fail.
|
| The attacker could sign with their own key instead, but
| this is trivially observable to the target (they don't
| end up with a correct cert, and it all gets logged in CT
| anyways.)
|
| [1]: https://letsencrypt.org/how-it-works/
| fragmede wrote:
| Would the target get notified by LetsEncrypt about this
| scenario though? Let's say I set up Certbot on my server.
| I'm not watching CT logs. How would I know about the
| double issuance?
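| One way to notice would be to watch CT yourself. A minimal
| Python sketch, assuming the public crt.sh JSON endpoint is an
| acceptable stand-in for a real CT monitor (the domain is a
| placeholder):
|
|     import json
|     import urllib.request
|
|     def ct_entries(domain):
|         # Query the crt.sh CT aggregator for certificates logged
|         # for this domain (endpoint and field names per crt.sh's
|         # public JSON output; treat them as assumptions).
|         url = f"https://crt.sh/?q={domain}&output=json"
|         with urllib.request.urlopen(url) as resp:
|             return json.load(resp)
|
|     # Flag any issuance you didn't perform yourself.
|     for entry in ct_entries("example.com"):
|         print(entry.get("not_before"), entry.get("issuer_name"))
|
| In practice, a hosted CT monitor or a periodic job that compares
| the results against your own issuance records serves the same
| purpose.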
| southernplaces7 wrote:
| I don't buy that it has to be just one or the other.
| Fundamentally, crypto is just very dense information, and
| once it became widely enough standardized by people who
| could easily share and apply it commercially, getting even
| the strongest crypto to the most basic user became
| extremely easy.
|
| Short of blocking the very essence of digital data spread
| and transactions, the three-letter agencies and the giant
| governments behind them realized that there was no way to
| effectively put that particular genie back in the bottle
| without fucking over too many other extremely well-
| connected commercial interests.
|
| Thus, while they didn't entirely give up on their bullshit,
| and keep looking to find arguments for privacy subversion,
| they realized that roundabout methods were a usable
| practical course.
|
| That's where we stand today: a world in which there's no
| obvious way to block something that's so cheap and easy to
| share and apply securely by so many people, but one
| governed by technocrats who meanwhile do what they can to
| subvert it.
|
| The fundamental math of crypto is secure, regardless of any
| conspiracy theories. AES-256, for example, can't just be
| broken by some secret Area 51 alien decoder ring. The
| mathematics of good modern crypto simply crush any human
| computing technology for breaking them regardless of
| budget. However, the agencies also know that in a complex
| world of half-assed civilian security and public habits,
| they still have enough methods to work with without delving
| into political firestorms.
| ethbr1 wrote:
| I've always thought the ratio of average residential
| network bandwidth to average file size is
| underappreciated as an arbiter of change.
|
| The only true solution to distribution / piracy is for
| the file to be so big as to be inconvenient.
|
| Which is why mp3 was such a game changer.
| intelVISA wrote:
| Fighting against crypto is a public and costly affair, it
| was deemed easier to twist Intel/AMD's arm a little on the
| silicon level.
| convolvatron wrote:
| and I believe it was a major contributor to us having poor
| infrastructure for PKI protocols today, since these
| restrictions meant that it was pointless to try to bake them
| into standards
| hinkley wrote:
| Except to Iran, Syria, North Korea...
|
| Also you couldn't just ship products with a spot where crypto
| went and remove the crypto. API designs had to go through
| mental gymnastics to allow crypto without explicitly adding
| crypto. Which is why you have odd constructs that take
| strings as arguments and give you encryption back. Sometimes.
|
| And since new languages copy patterns from old to remain
| familiar, these APIs are still frequently some of the most
| patience-testing.
| CTDOCodebases wrote:
| An ex-Microsoft dev did a good breakdown video of NSAKEY:
|
| https://www.youtube.com/watch?v=vjkBAl84PJs
| 13of40 wrote:
| It was an interesting time. I forget the person's name, but I
| talked briefly with the guy who implemented the crc32 and
| encryption algorithms for ZIP, and he (almost apologetically)
| said the encryption was designed to be exportable under those
| laws. It's still not trivial to break, but you can test
| millions of passwords on a ZIP archive entry in the time it
| takes to try one on a modern Office document.
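| A minimal Python sketch of that kind of attack against legacy
| ZipCrypto entries: because there is no key stretching, each
| candidate password costs almost nothing to test (the archive
| and wordlist paths here are hypothetical):
|
|     import zipfile
|     import zlib
|
|     def try_wordlist(zip_path, wordlist_path):
|         with zipfile.ZipFile(zip_path) as zf:
|             member = zf.namelist()[0]   # test against the first entry
|             with open(wordlist_path, "rb") as wl:
|                 for line in wl:
|                     pwd = line.strip()
|                     try:
|                         # Wrong passwords raise an error on decryption
|                         zf.read(member, pwd=pwd)
|                         return pwd
|                     except (RuntimeError, zipfile.BadZipFile, zlib.error):
|                         continue
|         return None
|
|     print(try_wordlist("archive.zip", "wordlist.txt"))
|
| Real crackers do the same loop in optimized native code, which is
| where the "millions of passwords" figure comes from; modern Office
| documents instead run each guess through an expensive key
| derivation function.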
| fullspectrumdev wrote:
| Partial known plaintext attacks are very, very useful when
| cracking ZIP "encryption".
|
| I've mostly used this to unpack ZyXEL firmware updates
| (reference below to this), but it also works on a lot of
| other stuff if you can get a partial plaintext. Some file
| formats' headers might work.
|
| https://www.fullspectrum.dev/the-hunt-for-
| cve-2023-28771-par...
| ChrisArchitect wrote:
| (2002)
|
| Some previous discussions all mentioning Lotus Notes in the
| title:
|
| _4 years ago_
|
| https://news.ycombinator.com/item?id=21859581
|
| _8 years ago_
|
| https://news.ycombinator.com/item?id=9291404
|
| _10 years ago_
|
| https://news.ycombinator.com/item?id=5846189
| dang wrote:
| Thanks! Macroexpanded:
|
| _NSA's Backdoor Key from Lotus Notes (2002)_ -
| https://news.ycombinator.com/item?id=21859581 - Dec 2019 (87
| comments)
|
| _NSA's Backdoor Key from Lotus Notes_ -
| https://news.ycombinator.com/item?id=9291404 - March 2015 (51
| comments)
|
| _NSA's Backdoor Key from Lotus Notes_ -
| https://news.ycombinator.com/item?id=5846189 - June 2013 (85
| comments)
| lelandfe wrote:
| Good ole' "NOBUS." More fun NSA fumbles:
|
| https://en.wikipedia.org/wiki/Clipper_chip
|
| https://en.wikipedia.org/wiki/Dual_EC_DRBG
| kmeisthax wrote:
| This and the Clipper Chip aren't NOBUS. The NSA doesn't want
| you to know that the cryptosystem has law-enforcement access
| capability. The FBI doesn't care if you know, as the kinds of
| criminals they are attacking don't do OPSEC.
| sneak wrote:
| NOBUS isn't just intentional vulnerabilities, it's any
| vulnerability assumed to only be exploitable by US IC,
| whether engineered or otherwise.
|
| I think these qualify.
| rvnx wrote:
| Well, the article mentions the backdoor in Dual_EC_DRBG
| mostly targeting TLS/SSL communications; now we have
| Cloudflare, a much more scalable solution.
| tptacek wrote:
| Dual EC is sort of the archetypical NOBUS backdoor.
| agazso wrote:
| I wonder how difficult it would be to brute-force the private key
| for a 760-bit RSA public key from 1998. Does anyone know?
| panki27 wrote:
| Always depends on what resources you have (compute, time). It's
| possible, but not easy.
|
| https://crypto.stackexchange.com/a/1982
| tgsovlerkhgsel wrote:
| https://en.wikipedia.org/wiki/Integer_factorization_records and
| https://en.wikipedia.org/wiki/RSA_numbers give some pointers.
| Specifically, the latter describes a 768 bit key being factored
| "on December 12, 2009, over the span of two years", with CPU
| time that "amounted approximately to the equivalent of almost
| 2000 years of computing on a single-core 2.2 GHz AMD Opteron-
| based computer".
|
| Later, in 2019, a 795 bit key was factored with CPU time that
| "amounted to approximately 900 core-years on a 2.1 GHz Intel
| Xeon Gold 6130 CPU. Compared to the factorization of RSA-768,
| the authors estimate that better algorithms sped their
| calculations by a factor of 3-4 and faster computers sped their
| calculation by a factor of 1.25-1.67."
|
| So assuming the better algorithms transfer to smaller numbers,
| someone who knows how to use them (factoring big numbers seems
| significantly harder than just running CADO-NFS and pointing it
| at a number and a cluster) could probably do it in a couple
| months on a couple dozen modern machines.
|
| For example, using the "795-bit computations should be 2.25
| times harder than 768-bit computations" from the publication
| accompanying the second factorization, we could assume 900/2.25
| = 400 Core-years of the Xeon reference CPU (which is 6 years
| old by now) would be needed to break the smaller key with the
| modern software. Two dozen servers with 64 equivalently strong
| cores each would need slightly over 3 months. Not something a
| hobbyist would want to afford just for fun, but something that
| even a company with a moderate financial interest in doing
| could easily do, provided they had people capable of
| understanding and replicating this work.
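| The closing arithmetic as a quick Python sketch (the server and
| per-server core counts are the assumptions from the paragraph
| above, not measurements):
|
|     # Numbers quoted above: RSA-795 took ~900 core-years in 2019,
|     # and the authors estimate 795-bit is ~2.25x harder than 768-bit.
|     RSA795_CORE_YEARS = 900
|     SCALING_795_TO_768 = 2.25
|     SERVERS = 24              # "a couple dozen" machines (assumption)
|     CORES_PER_SERVER = 64     # assumed core count per server
|
|     core_years_768 = RSA795_CORE_YEARS / SCALING_795_TO_768   # ~400
|     months = core_years_768 / (SERVERS * CORES_PER_SERVER) * 12
|     print(f"~{core_years_768:.0f} core-years, ~{months:.1f} months wall clock")
|     # -> ~400 core-years, ~3.1 months wall clock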
| rocketnasa wrote:
| Classic CPUs haven't held a candle to GPUs on very
| repetitive math calculations. AI this year has really shown
| the same difference. In other words, it isn't just
| graphics... https://www.spiceworks.com/it-security/identity-
| access-manag...
| tgsovlerkhgsel wrote:
| I assume there is some reason why the past factorizations
| weren't done with GPUs. It _could_ be just lack of a good
| implementation and insufficient numbers of people
| interested in the topic, but it could also be something
| about the algorithm not being very suitable for GPUs.
| 15457345234 wrote:
| Oddly specific question, something in particular on your mind?
| cmeacham98 wrote:
| Presumably they are referring to the 760 bit RSA key this
| entire post is about.
| 15457345234 wrote:
| But the header talks about a 64 bit key? I'm a bit lost
| actually.
|
| Edit: Okay, I see it now. The bulk key is 64 bits, but 24
| of those bits are encrypted to a 760-bit NSA public key and
| sent along with the message, so only 40 bits stay secret
| from the NSA.
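| A small sketch of why that escrow field mattered, assuming the
| 64/24/40-bit split described in the article (illustrative only,
| not the actual Notes wire format):
|
|     FULL_KEY_BITS = 64
|     ESCROWED_BITS = 24        # encrypted to the 760-bit NSA public key
|     remaining = FULL_KEY_BITS - ESCROWED_BITS
|
|     print(f"everyone else: 2^{FULL_KEY_BITS} = {2**FULL_KEY_BITS:.1e} keys to search")
|     print(f"NSA          : 2^{remaining} = {2**remaining:.1e} keys to search")
|     # A 2^40 search was feasible for a well-resourced attacker in the
|     # late 1990s; 2^64 was not.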
| btdmaster wrote:
| Someone has tried to factorize it before (2018)
| http://factordb.com/index.php?query=444376527415060195687748...
| [deleted]
___________________________________________________________________
(page generated 2023-09-18 23:00 UTC)