[HN Gopher] A brief history of the U.S. trying to add backdoors ...
       ___________________________________________________________________
        
       A brief history of the U.S. trying to add backdoors into encrypted
       data (2016)
        
       Author : whatever3
       Score  : 508 points
        Date   : 2024-02-03 20:11 UTC (1 day ago)
        
 (HTM) web link (www.atlasobscura.com)
 (TXT) w3m dump (www.atlasobscura.com)
        
       | loughnane wrote:
       | This topic comes up a bunch still. Someone please correct me, but
       | as I understand it anyone using new chips that use Intel ME (or
        | AMD's equivalent) has a gaping hole in their security that no OS
       | can patch.
       | 
       | I know puri.sm[0] takes some steps to try to plug the hole, but
        | haven't read up to see if it's effective or not.
       | 
       | [0] https://puri.sm/learn/intel-me/
        
         | bri3d wrote:
         | > anyone using new chips that use Intel ME (or AMD's
          | equivalent) has a gaping hole in their security that no OS can
         | patch
         | 
         | Not really; anyone using chips with Intel ME or AMD PSP have an
         | additional large binary blob running on their system which may
         | or may not contain bugs or backdoors (of course, also realizing
         | a sufficiently bad bug is indistinguishable from a backdoor).
         | 
         | There are tens to hundreds of such blobs running on almost any
         | modern system and these are just one example. I would argue
         | that ME and PSP are not the worst blob on many systems; they
          | have unsupported but almost certainly effective (me_cleaner /
          | ME code removal), supported and almost certainly effective
          | (HAP bit), and supported and likely effective (ME / PSP disable
         | command) mechanisms to disable their functionality, and they
         | are comparatively well-documented versus the firmware that runs
         | on every other peripheral (networking, GPU, etc.) and
         | comparatively hardened versus EFI.
        
           | loughnane wrote:
           | Yeah, this lives in the back of my mind too. I run debian on
           | 11th gen intel, but with the non-free blobs included to make
           | life easier. I've been meaning to try it without them, but
           | it's too tempting to just get things 'up' instead of hacking
           | on it.
        
             | mistrial9 wrote:
             | Debian has been hacked by Intel's blobs from my point of
             | view
        
             | matheusmoreira wrote:
             | There's little we can do about it short of running ancient
             | libreboot computers. We'll never be truly free until we
             | have the technology to manufacture free computer chips at
             | home, just like we can make free software at home.
        
               | voldacar wrote:
               | There's the talos II, if you can afford it.
        
               | tonetegeatinst wrote:
                | ASML fabs in every basement when? I think riskV is as
                | close to an open source CPU as we have at the moment;
                | unfortunately most riskV CPUs rely on the company keeping
                | IP like the CPU layout or the core architecture
                | protected, from what I understand of modern CPU design.
               | 
               | RISKV has been a great step forward and I'd love to see
                | it succeed, but I'm also aware of the lack of open source
                | architectures for GPUs or AI accelerators.
        
               | YoshiRulz wrote:
               | RISC-V* (Reduced Instruction Set Computing, 5th
               | incarnation)
               | 
               | And sure, companies can choose not to share chip designs,
               | but if you want an open-design CPU then you should be
               | checking for that specifically and not just filtering by
               | ISA. There exist such chips already, and I expect they'll
               | catch up with AArch64 chips (in terms of being able to
               | run desktop Linux) in <10 years, given the specs already
               | include SIMD and the high-end chips have clock rates
               | comparable to the oldest Windows-on-ARM laptops, like the
               | 1st-gen Surface.
        
         | wtallis wrote:
         | Most consumer products (as opposed to some of those marketed to
         | businesses) don't have enough of the components in place for
         | the ME to accomplish anything, good or bad.
        
           | loughnane wrote:
           | What do you mean? What sort of components?
        
             | wtallis wrote:
             | For starters, few consumer systems have the ME wired up to
             | a supported Intel NIC to provide the remote access
             | functionality that is usually seen as the scariest feature
             | among those related to the ME. The processors are usually
              | not vPro-enabled models so the firmware will refuse to
             | enable those features due to Intel's product segmentation
             | strategy. And even if all the right hardware is in place, I
             | think a system still needs to be provisioned by someone
             | with physical access to turn on those features.
             | 
             | For most consumers, the main _valid_ complaint about the ME
              | is that it's a huge pile of unnecessary complexity
             | operating at low levels of their system with minimal
             | documentation. Anything fitting that description is a bit
             | of a security risk, but the ME is merely one of many of
             | those closed firmware blobs.
        
         | charcircuit wrote:
         | >but as I understand it anyone using new chips that use Intel
          | ME (or AMD's equivalent) has a gaping hole in their security
         | that no OS can patch.
         | 
         | The existence of security coprocessors is not a security hole
         | and firmware updates to these processors can be released if a
          | security issue is found.
        
         | Onavo wrote:
          | Does Apple have a warrant canary? How do we know that the M
         | series of chips haven't been compromised?
        
           | adamomada wrote:
           | marcan of the Asahi Linux project got into a discussion on
           | reddit about this, and says that when it comes to hardware,
           | you just can't know.
           | 
           | > I can't prove the absence of a silicon backdoor on any
           | machine, but I can say that given everything we know about AS
           | systems (and we know quite a bit), there is no known place a
           | significant backdoor could hide that could completely
           | compromise my system. And there are several such places on
           | pretty much every x86 system
           | 
           | (Long) thread starts here, show hidden comments for the full
           | discussion https://old.reddit.com/r/AsahiLinux/comments/13voe
           | ey/what_is...
           | 
           | I highly recommend reading this if you're interested
           | https://github.com/AsahiLinux/docs/wiki/Introduction-to-
           | Appl...
        
         | dylan604 wrote:
         | Are these blob type of attacks accessible after boot?
         | Essentially, are these only accessible if you have physical
         | access? And at that point, isn't it game over anyways?
        
           | bri3d wrote:
           | Intel ME allows intentional remote access through the ME in
           | some enterprise scenarios (vPro). The driver support matrix
           | is quite small and this is a massively overblown concern IMO,
           | but it's the root of a lot of the hand wringing.
           | 
           | However, onboard firmware based attacks are absolutely
           | accessible remotely and after boot in many scenarios. It's
           | certainly plausible in theory that an exploit in ME firmware
           | could, for example, allow an attacker to escape a VM or
           | bypass various types of memory protection. Unfortunately the
           | actual role of the ME is rather opaque (it's known, for
           | example, to manage peripherals in s0ix sleep).
           | 
           | Ditto for any other blob. Maybe a specially crafted packet
           | can exploit a WiFi firmware. Maybe a video frame can
           | compromise the GPU.
           | 
           | These are also good persistence vectors - gain access
           | remotely to the NOR flash containing EFI, and you have a huge
           | attack surface area to install an early boot implant. (or if
           | secure boot isn't enabled, it's just game over anyway). On
           | Linux, it's often just hanging out in /dev as a block device;
           | otherwise, once an attacker has access to the full address
           | space, it's not too hard to bitbang.
           | 
           | These are all fairly esoteric attacks compared to the more
           | likely ways to get owned (supply chain, browser bugs,
           | misconfiguration), but they're definitely real things.
           | 
           | The closed-sourceness is only a tiny part of the problem, too
           | - a lot of the worst attacks so far are actually in open
           | source based EFI firmware, which is riddled with bugs.
           | 
           | Which takes me back to my original response to "isn't
           | everyone backdoored by ME" - sure, maybe, but if you're
           | looking for practical holes and back doors, ME is hardly your
           | largest problem.
        
             | aspenmayer wrote:
             | > The closed-sourceness is only a tiny part of the problem,
             | too - a lot of the worst attacks so far are actually in
             | open source based EFI firmware, which is riddled with bugs.
             | 
             | Can you elaborate and/or provide context/links?
        
               | bri3d wrote:
               | https://eclypsium.com/blog/understanding-detecting-
               | pixiefail...
               | 
               | https://binarly.io/posts/The_Far_Reaching_Consequences_of
               | _Lo...
        
               | aspenmayer wrote:
               | Makes sense given the context. When you said bugs in open
               | source EFI implementations, I thought you meant bugs in
               | things like rEFI/rEFInd/rEFIt.
        
         | sweetjuly wrote:
         | People always complain about ME/PSP but it misses the point:
         | there is no alternative to trusting your SoC manufacturer. If
         | they wanted to implement a backdoor, they could do so in a much
         | more powerful and secretive way.
        
       | hn8305823 wrote:
       | In case anyone is wondering about the context for this 2016
       | article, it was right after the 2015 San Bernardino attack and
       | the FBI was trying to get into one of the attacker's phones.
        | Apple resisted the request primarily because complying would
        | have meant signing rogue firmware/app/OS images that could be
        | installed on any iPhone, not just the attacker's.
       | 
       | https://en.wikipedia.org/wiki/2015_San_Bernardino_attack
        
         | KennyBlanken wrote:
         | The FBI has used damn near every major incident to push for
          | nerfing encryption, and in between they bray about child porn.
        
       | coppsilgold wrote:
        | FBI director James Comey has publicly lobbied for the insertion
       | of cryptographic "backdoors" into software and hardware to allow
       | law enforcement agencies to bypass authentication and access a
       | suspect's data surreptitiously. Cybersecurity experts have
       | unanimously condemned the idea, pointing out that such backdoors
        | would fundamentally undermine encryption and could be exploited by
       | criminals, among other issues.
       | 
       | "could exploited by criminals" is sadly a disingenuous claim. A
       | cryptographic backdoor is presumably a "Sealed Box"[1] type
       | construct (KEM + symmetric-cipher-encrypted package). As long as
       | the government can keep a private key secure only they could make
       | use of it.
       | 
       | There are plenty of reasons not to tolerate such a backdoor, but
       | using false claims only provides potential ammunition to the
       | opposition.
       | 
       | [1] <https://libsodium.gitbook.io/doc/public-
       | key_cryptography/sea...>
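        | 
        | A minimal sketch of that shape of construct in Python, using
        | PyNaCl's SealedBox (libsodium's sealed box); the key names here
        | are made up purely for illustration:
        | 
        |     from nacl.public import PrivateKey, SealedBox
        | 
        |     # The agency generates a long-term keypair and hands
        |     # vendors only the public half.
        |     escrow_sk = PrivateKey.generate()
        |     escrow_pk = escrow_sk.public_key
        | 
        |     # A device seals its per-file symmetric key to that public
        |     # key and stores the blob next to the ciphertext.
        |     file_key = b"32-byte symmetric key goes here."
        |     blob = SealedBox(escrow_pk).encrypt(file_key)
        | 
        |     # Only the holder of the private key can unseal the blob.
        |     assert SealedBox(escrow_sk).decrypt(blob) == file_key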
        
         | 2OEH8eoCRo0 wrote:
         | And Apple has a backdoor that only Apple can use. Why don't
         | criminals exploit Apple's backdoor?
        
           | frickinLasers wrote:
           | https://arstechnica.com/security/2023/12/exploit-used-in-
           | mas...
           | 
           | Looks like criminals were using it for four years undetected.
        
           | quickslowdown wrote:
           | Which backdoor do you mean? I'm not an Apple expert by any
           | means, but I thought they encrypted customer data in a way
           | that even they can't get to it? Wasn't that the crux of this
           | case, that Apple couldn't help the FBI due to security
           | measures, prompting the agency to ask for a backdoor?
        
             | 2OEH8eoCRo0 wrote:
             | What's an update? They can sign and push any code they want
             | remotely.
        
               | dataangel wrote:
               | IIRC the question is when the phone is totally locked,
               | e.g. if you turn it off then turn it back on and haven't
               | entered the PIN yet. In this state even apple can't get
               | an update to run, the secure hardware won't do it unless
               | you wipe the phone first. And your data is encrypted
               | until you unlock the phone.
               | 
               | In practice though most people are screwed b/c it's all
               | already in icloud.
        
               | fragmede wrote:
                | with advanced data protection, it's encrypted before it
                | hits iCloud, so neither Apple nor the feds can get at it.
        
           | catlifeonmars wrote:
           | Source/reference? I'm not aware of such a backdoor
        
             | adrian_b wrote:
             | See the posting above about the Arstechnica article.
             | 
             | During the last days of 2023 there was a big discussion,
             | also on HN, after it was revealed that all recent Apple
             | devices had a hardware backdoor that allowed bypassing all
             | memory access protections claimed to exist by Apple.
             | 
              | It is likely that the backdoor consisted of some cache
              | memory test registers used during production, but it is
              | absolutely incomprehensible how those test registers could
              | remain enabled for so many years instead of being disabled
              | at the end of the manufacturing process, leaving them
              | accessible to attackers who knew Apple's secrets. For
              | instance, any iPhone could be completely controlled
              | remotely after being sent an invisible iMessage.
        
               | sylware wrote:
               | "Convenient software/hardware bugs"... but "they are not
               | backdoors, I swear!"
        
               | rightbyte wrote:
                | Can't Apple just push a software update with some:
                | 
                |     if (user_id == "adrian_b")
                |         pwn();
                | 
                | ?
        
               | _kbh_ wrote:
                | > It is likely that the backdoor consisted of some cache
                | memory test registers used during production, but it is
                | absolutely incomprehensible how those test registers could
                | remain enabled for so many years instead of being disabled
                | at the end of the manufacturing process, leaving them
                | accessible to attackers who knew Apple's secrets.
               | 
               | I think we are nearly certain that the bug is because of
                | an MMIO-accessible register that allows you to write into
                | the CPU's cache (it's nearly certain this is related to
               | the GPU's coherent L2 cache).
               | 
                | But I don't think it's 'incomprehensible' that such a bug
                | could exist unintentionally. Modern computers, and even
                | more so high end mobile devices, are a huge basket of
                | complexity with so many interactions and coprocessors
                | all over the place that I think it's very likely a
                | similar bug exists undiscovered and unmitigated.
               | 
               | > For instance any iPhone could be completely controlled
               | remotely after sending to it an invisible iMessage
               | message.
               | 
                | I don't think the iMessage was invisible; I think it
                | deleted itself once the exploit had run. It's also worth
               | noting just how complicated the attack chain was and that
               | the attacker _needed_ a hardware bug just to patch the
               | kernel whilst having kernel code execution.
        
             | 2OEH8eoCRo0 wrote:
             | How is their update path not considered a backdoor? They
             | can sign and serve you any update that they want.
        
           | catlifeonmars wrote:
           | FWIW this is a fair and valid argument. Generally, no one
           | entity should have that much power. Doesn't really matter if
           | it's USG or a tech giant.
        
         | devwastaken wrote:
          | It's not a false claim; assuming the feds will keep such a key
          | "secure" is not backed by evidence. Top secret materials are
         | leaked all the time. Private keys from well secured systems are
         | extracted from hacks. The FBI having such a key would make them
         | a very profitable target for the various corps that specialize
         | in hacking for hire. For example, NSO group.
         | 
         | If the power doesn't exist, nobody can exploit it.
        
           | coppsilgold wrote:
           | Do military cryptographic keys leak often? Do nuclear codes
           | leak?
           | 
            | The times highly valuable cryptographic keys have leaked,
            | as with various cryptocurrency exchanges, it has generally
            | if not always been due to gross negligence.
           | 
            | Such a key would be highly sensitive and it would also
            | require very little traffic to use. You would just need to
            | send the secure system a KEM ciphertext (<100 bytes) and it
            | would respond with the symmetric key used for the protected
            | package.
           | 
            | I don't doubt they could secure it. They could even split
            | the key into shares and require multiple parties to be
            | present in the secure location.
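            | 
            | A minimal sketch of that kind of split in Python (n-of-n XOR
            | sharing where every share is required; the function names
            | are made up for illustration):
            | 
            |     import os
            |     from functools import reduce
            | 
            |     def xor(a: bytes, b: bytes) -> bytes:
            |         return bytes(x ^ y for x, y in zip(a, b))
            | 
            |     def split_key(key: bytes, parties: int) -> list:
            |         # All shares but one are random; the last is the key
            |         # XORed with the rest, so nothing short of the full
            |         # set reveals anything about the key.
            |         shares = [os.urandom(len(key))
            |                   for _ in range(parties - 1)]
            |         shares.append(reduce(xor, shares, key))
            |         return shares
            | 
            |     def recover(shares: list) -> bytes:
            |         return reduce(xor, shares)
            | 
            | (Real escrow proposals tend to use threshold schemes like
            | Shamir's so that k of n shares suffice, but the idea is the
            | same.)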
        
             | dvngnt_ wrote:
              | nuclear codes are probably not used as much as phone
              | backdoors would be. local police want access too, and so
              | do other governments, so I do believe it would leak
        
             | some_furry wrote:
             | > Do nuclear codes leak?
             | 
             | For many years, the code was 00000000.
             | 
             | https://arstechnica.com/tech-policy/2013/12/launch-code-
             | for-...
        
             | jliptzin wrote:
             | What are you going to do with a nuclear code without access
             | or authority to launch the nukes?
        
             | devwastaken wrote:
             | You're creating so many assumptions that nothing you've
             | stated could be concluded to be an honest reflection of
             | reality.
             | 
             | Nobody has to know the rate of leaks, it's irrelevant.
             | Gross negligence is not necessary, how would you even know?
             | Leaks by definition are rarely exposed, we only see some of
             | them.
             | 
             | A "highly sensitive" key doesn't mean anything. Assigning
             | more words to it doesn't somehow change the nature of it.
             | Humans are bad at securing things, that's why the best
             | security is to not have a system that requires it.
             | 
             | Whatever hypothetical solution you have would be crushed
             | under the weight of government committees and office
             | politics until your security measures are bogus.
        
         | whatshisface wrote:
         | > _As long as the government can keep a private key secure only
         | they could make use of it._
         | 
         | Your devices would be secure as long as a private key that
         | happened to be the most valuable intelligence asset in the
         | United States, accessed thousands of times per day, by police
         | spread across the entire nation, was never copied or stolen.
        
           | dylan604 wrote:
           | Well, it's a good thing that we don't have to worry about
           | corrupt police /s
        
         | catlifeonmars wrote:
         | > As long as the government can keep a private key secure only
         | they could make use of it.
         | 
         | Not disingenuous. Keys are stolen or leaked all the time. And
         | the blast radius of such a master key would be extremely large.
        
         | nonrandomstring wrote:
         | > false claims
         | 
         | As Pauli said, "That's not even wrong". It cannot even meet the
         | basic criteria for truth or falsehood.
         | 
         | It's simply naked hubris.
        
         | sowbug wrote:
         | You assume a perfect implementation of the backdoor. Even if
         | the cryptographic part were well-implemented, someone will
         | accidentally ship a release build with a poorly safeguarded
         | test key, or with a disabled safety that they normally use to
         | test it.
         | 
         | It's an unnecessary moving part that can break, except that
         | this particular part breaking defeats the whole purpose of the
         | system.
        
         | buffet_overflow wrote:
         | > As long as the government can keep a private key secure only
         | they could make use of it.
         | 
         | Well, keep in mind they would have to keep it secure in
         | perpetuity. Any leak over the lifetime of any of that hardware
         | would be devastating to the owners. Blue Team/Defensive
         | security is often described as needing to be lucky every time,
          | whereas Red Team/attackers just have to get lucky once.
         | 
         | This attack vector is in addition to just exploiting the
         | implementation in some way, which I don't think can be
         | handwaved away.
        
         | Rebelgecko wrote:
         | >As long as the government can keep a private key secure only
         | they could make use of it.
         | 
         | That's a big "if". Look at how the government has protected
         | physical keys...
         | 
         | Ever since the TSA accidentally leaked them, you can buy a set
         | of keys on Amazon for $5 that opens 99% of "TSA approved" locks
        
         | bayindirh wrote:
         | Let's see:
         | 
         | Mercedes recently forgot a token in a public repository which
         | grants access to _everything_.
         | 
         | Microsoft forgot its "Golden Key" in the open, allowing all
         | kinds of activation and secure boot shenanigans.
         | 
          | Microsoft's JWT private key was also stolen, making the login
         | page a decoration.
         | 
          | Somebody stole Realtek's driver signing keys for the Stuxnet
         | attack.
         | 
          | HDCP (HDMI) master key is broken.
         | 
         | BluRay master key is broken.
         | 
         | DVD CSS master key is broken.
         | 
         | TSA master keys are in all 3D printing repositories now.
         | 
          | Staying in the physical realm, somebody made an automated tool
          | to profile, interpret and print key blanks for locks with
          | "restricted keyways" which have no blanks available.
         | 
          | These are the ones I remember just off the top of my head.
         | 
         | So yes, any digital or physical secret key is secure _until it
         | isn't_.
         | 
         | It's not a question of if, but when. So, no escrows or back
         | doors. Thanks.
        
           | wkat4242 wrote:
            | I've been waiting for those Widevine keys to leak which would
           | finally let me choose what to play my stuff on. But it still
           | hasn't happened. They are getting better at secrecy sadly.
        
             | bayindirh wrote:
              | Since Widevine L3 is completely implemented in software,
              | there are tools you can use, but L2 and L1 have hardware
              | components, and secure enclaves are hard to break. Up to
              | par ones have self-destruction mechanisms which trigger
              | when you bugger them too much.
             | 
             | On the other hand, there are 4K, 10bit HDR + multichannel
             | versions everywhere, so there must be some secret sauce
             | somewhere.
             | 
             | This is not a rabbit hole I want to enter, though.
        
           | piperswe wrote:
           | It's apparently now trivial to brute force the private key
           | used for Windows XP-era Microsoft Product Activation, as
           | another example. (that's where UMSKT and the like get their
           | private keys from)
        
         | mnw21cam wrote:
         | > As long as the government can keep a private key secure...
         | 
         | Which government? Software crosses borders.
         | 
         | You can bet that if the US mandated a back door to be inserted
         | into software that was being exported to another country, that
         | country would want to either have the master key for that back
         | door, or a different version of the software with a different
         | back door or _without_ the back door. A software user could
         | choose the version of the software that they wanted to use
          | according to which country (if any) could snoop on them. It's
         | unworkable.
        
         | Hikikomori wrote:
         | Are they lobbying for this because they can't access stuff
         | today and "need" it or is just a psyop so we believe what that
         | they cannot access it today.
        
         | salawat wrote:
          | The same government that failed to keep all of its Top Secret
         | clearance paperwork secure? How soon we forget the OPM hack...
        
         | eviks wrote:
         | > As long as the government can keep a private key secure only
         | they could make use of it.
         | 
         | That's a disingenuous claim since it's known they can't
        
         | Geisterde wrote:
          | I'll take "what is vault 7" for $500.
        
       | progbits wrote:
       | As this is from 2016 it doesn't include this new fun revelation:
       | 
       | > On 11 February 2020, The Washington Post, ZDF and SRF revealed
       | that Crypto AG was secretly owned by the CIA in a highly
       | classified partnership with West German intelligence, and the spy
       | agencies could easily break the codes used to send encrypted
       | messages.
       | 
       | https://en.m.wikipedia.org/wiki/Crypto_AG
        
         | hedora wrote:
         | More details here:
         | 
         | https://web.archive.org/web/20200212014117/https://www.washi...
        
         | pgeorgi wrote:
         | The CIA/BND connection wasn't known, but the collusion with
         | certain agencies was known to different degrees for decades:
         | https://en.wikipedia.org/w/index.php?title=Crypto_AG&oldid=7...
        
           | lcnPylGDnU4H9OF wrote:
           | Considering that I remember reading the CIA's own historical
           | document on this operation, I would guess its usefulness had
           | run its course. If I'm not mistaken, it was the CIA who
           | released the document to journalists; it seemed like
           | bragging.
        
           | _kbh_ wrote:
            | To add another dimension to this, personally I think that the
           | Crypto AG relationship is what is referred to as "HISTORY" in
           | this leaked NSA ECI codenames list.
           | 
           | https://robert.sesek.com/2014/10/nsa_s_eci_compartments.html
           | 
           | > HISTORY HST NCSC (TS//SI//NF) Protects NSA and certain
           | commercial cryptologic equipment manufacturer relationships.
        
         | treflop wrote:
         | The guy who founded Crypto AG was really good friends with a
         | guy who became a top dog at the NSA.
        
         | p-e-w wrote:
         | > The company had about 230 employees, had offices in Abidjan,
         | Abu Dhabi, Buenos Aires, Kuala Lumpur, Muscat, Selsdon and
         | Steinhausen, and did business throughout the world.
         | 
         | That's a... _really_ strange list of office locations,
         | especially considering the relatively small number of
         | employees.
         | 
         | > The owners of Crypto AG were unknown, supposedly even to the
         | managers of the firm, and they held their ownership through
         | bearer shares.
         | 
         | How does this work in practice? If management doesn't know who
         | owns the company, how can the owners exercise influence on
         | company business?
        
           | quasse wrote:
           | Via lawyer / legal representative if I had to hazard a guess.
        
             | p-e-w wrote:
             | How does that representative prove that they _really_
                | represent the owners, if the owners aren't known to
             | management? How can they authorize someone without
             | revealing identifying information?
        
               | eviks wrote:
               | Where would this need to really prove anything arise
               | from? The intermediaries just hire and pay the managers,
               | that's enough
        
           | anticensor wrote:
           | Codify all the management policy in the main charter, leaving
           | nothing else to the board to decide?
        
         | 1337biz wrote:
          | It would be interesting to know which similar companies are
          | (at least in part) most likely agency fronts.
         | 
         | My guess would be quite a few in the soft privacy selling
         | business, such as VPN or email providers.
        
           | Goodroo wrote:
           | Proton mail is a CIA front email provider
        
             | AB1908 wrote:
             | It is impossible to tell if this is satire or not.
        
             | hairyplanner wrote:
             | I actually wish this was true. I want an email service that
             | would last forever and is secure enough from my threats,
             | namely security breaches of the email host and account
             | takeover from non state actors.
             | 
             | Gmail is close enough, but I want an alternative. An email
             | service run by the nsa or the cia would be great.
             | 
             | (No sarcasm is intended)
        
             | ykonstant wrote:
             | Hmm... should I choose a provider with a history of spying
             | on everyone and destabilization, or Google? ...OK, I'll go
             | with the CIA.
        
             | OneLeggedCat wrote:
             | Proton Mail's extremely bureaucratic operational deafness,
             | and their glacial pace of product features and open-
             | sourcing, would certainly lend support to that idea.
        
         | EthanHeilman wrote:
        | I wrote a blog entry on this subject with a very similar name [0]
         | which covers the CryptoAG story in more detail. It doesn't have
         | the 2020 news.
         | 
         | [0]: A Brief History of NSA Backdoors (2013),
         | https://www.ethanheilman.com/x/12/index.html
        
           | samstave wrote:
           | This is an epically cool blog post! - submit it to HN on its
           | own merits.
           | 
           | This was of particular interest to me:
           | 
           | >>> _"...1986 Reagan tipped off the Libyans that the US could
           | decrypt their communications by talking about information he
            | could only get through Libya decrypts on TV. In 1991 the
           | Iranians learned that the NSA could break their diplomatic
           | communications when transcripts of Iranian diplomatic
           | communications ended up in a French court case... "_
           | 
            | Because, in 1986 - that's effectively when a lot of the
           | phreaking and social engineering was at a peak - Cyberpunk
           | was moving from imagination --> zeitgeist --> reality.
           | 
           | Social engineering and line-printer litter recovery were
           | yielding the backdoors into the Telecom Switching system.
           | BBS's were raging [0].
           | 
            | So when you get a gaph-guffaw look into infosec in a slipup
            | like these ones, it reinforces in my mind that the 80s were
            | some really wild times all around as technology tsunami'd
            | from people's minds into business and reality.
           | 
           | [0] BBS Docu - https://www.imdb.com/title/tt0460402/
           | 
           | [1] phreaking - https://en.wikipedia.org/wiki/Phreaking
           | 
           | [2] history of phreaking -
           | https://www.youtube.com/watch?v=8PmkUPBhL4U
        
             | EthanHeilman wrote:
             | Thanks, just submitted
        
           | _kbh_ wrote:
           | > I wrote blog entry on this subject with a very similar name
           | [0] which covers the CryptoAG story in more detail. It
           | doesn't have the 2020 news. [0]: A Brief History of NSA
           | Backdoors (2013),
           | https://www.ethanheilman.com/x/12/index.html
           | 
            | Wow, this is super interesting. I noticed this paragraph in
            | the text.
           | 
           | > 2013, Enabling for Encryption Chips: In the NSA's budget
           | request documents released by Edward Snowden, one of the
           | goals of the NSA's SIGINT project is to fully backdoor or
           | "enable" certain encryption chips by the end of 201311. It is
           | not publicly known to which encryption chips they are
           | referring.
           | 
           | From what I know Cavium is one of these "SIGINT enabled" chip
            | manufacturers.
           | 
           | > https://www.electrospaces.net/2023/09/some-new-snippets-
           | from...
           | 
           | >> "While working on documents in the Snowden archive the
           | thesis author learned that an American fabless semiconductor
           | CPU vendor named Cavium is listed as a successful SIGINT
           | "enabled" CPU vendor. By chance this was the same CPU present
           | in the thesis author's Internet router (UniFi USG3). The
           | entire Snowden archive should be open for academic
           | researchers to better understand more of the history of such
           | behavior." (page 71, note 21)
           | 
           | > https://www.computerweekly.com/news/366552520/New-
           | revelation...
           | 
           | Unfortunately the relevant text for the second is pretty long
            | so I don't wanna quote it.
        
       | schmudde wrote:
       | Sharing this seems like an appropriate way of commemorating David
       | Kahn's passing (https://news.ycombinator.com/item?id=39233855).
       | <3
        
       | xyst wrote:
       | for a long time, the US considered cryptography algos as a
       | munition. Needed some arms license to export.
       | 
       | Also, US tried to convince the world only 56 bits of encryption
       | was sufficient. As SSL (I don't think TLS was a thing back then)
       | was becoming more mainstream, US govt only permitted banks and
       | other entities to use DES [1] to "secure" their communications.
       | Using anything more than 56 bits was considered illegal.
       | 
       | https://en.m.wikipedia.org/wiki/Data_Encryption_Standard
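        | 
        | For scale, a rough back-of-the-envelope in Python (the ~90
        | billion keys/sec figure is roughly EFF's 1998 Deep Crack
        | machine, from memory, so treat it as approximate):
        | 
        |     keyspace = 2 ** 56          # ~7.2e16 possible DES keys
        |     rate = 90e9                 # keys/sec, Deep Crack (1998)
        |     print(keyspace / rate / 86400)
        |     # ~9 days to exhaust the keyspace, ~4.5 days on average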
        
         | londons_explore wrote:
         | Even now, if you join a discussion on crypto and say something
         | like "Why don't we double the key length" or "Why not stack two
         | encryption algorithms on top of one another because then if
         | either is broken the data is still secure", you'll immediately
         | get a bunch of negative replies from anonymous accounts saying
         | it's unnecessary and that current crypto is plenty secure.
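          | 
          | For the "stack two algorithms" idea, a minimal sketch in
          | Python with the pyca/cryptography AEAD primitives (key and
          | nonce handling is glossed over; this just shows the layering,
          | not a vetted design):
          | 
          |     import os
          |     from cryptography.hazmat.primitives.ciphers.aead import (
          |         AESGCM, ChaCha20Poly1305)
          | 
          |     k1 = AESGCM.generate_key(bit_length=256)
          |     k2 = ChaCha20Poly1305.generate_key()
          |     n1, n2 = os.urandom(12), os.urandom(12)
          | 
          |     # Encrypt with AES-GCM, then wrap that ciphertext in
          |     # ChaCha20-Poly1305; reading the data now requires
          |     # getting through both layers.
          |     inner = AESGCM(k1).encrypt(n1, b"secret", None)
          |     outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)
          | 
          |     # Decrypt in reverse order.
          |     middle = ChaCha20Poly1305(k2).decrypt(n2, outer, None)
          |     assert AESGCM(k1).decrypt(n1, middle, None) == b"secret"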
        
           | ahazred8ta wrote:
           | EVERYTHING IS FINE. WOULD YOU LIKE A BRAIN SLUG?
        
           | paulpauper wrote:
           | two encryption algorithms will mean needing two completely
            | unrelated, unique passwords. This can be impractical and
           | increase odds of being locked out forever
        
             | ranger_danger wrote:
             | no it doesn't mean that at all
        
           | Geisterde wrote:
            | Well, I think that would severely inhibit future development.
            | Scaling on bitcoin has been a delicate game of optimizing
            | every bit that gets recorded, but also supporting future
            | developments that don't even exist yet; there is no undo
            | button either. New signature schemas and clever cryptography
            | tricks can do quite a bit, but when you slap another layer of
            | cryptography on you will inevitably make things worse in the
            | long run.
            | 
            | History's biggest bug bounty is sitting on the bitcoin
            | blockchain; if it were even theoretically plausible to crack
            | sha-256 like that then we would probably know, and many have
            | tried.
        
             | monero-xmr wrote:
             | The real security of Bitcoin is the choice of secp256k1.
             | Basically unused before Bitcoin, but chosen specifically
             | because he was more confident it wasn't backdoored.
             | 
             | https://bitcoin.stackexchange.com/a/83623
        
               | crotchfire wrote:
               | And ed25519 was out of the question, since -- being brand
               | new -- its use would have given away the fact that DJB
               | was among the group of people who presented themselves as
               | _Satoshi Nakamoto_.
        
               | voldacar wrote:
               | Evidence?
        
             | londons_explore wrote:
             | If you reveal you have broken sha-256, then your bug bounty
             | becomes worthless. The smart move is to steal and drain a
             | few wallets slowly.
             | 
             | And that's exactly what we see - and every time it happens,
             | the bitcoin community just laughs that someone must have
             | been bad at key management or used a weak random number
             | generator.
        
               | Geisterde wrote:
               | > management or used a weak random number generator.
               | 
               | Except that has been the case in every instance thus far.
               | The dev that lost his bitcoin last year was using arcane
               | software, after a biopsy they found the library being
               | used only had like 64 bits of entropy.
        
           | AnotherGoodName wrote:
            | The best is the claim that multiple encryption makes it
            | weaker, or that the result is only as strong as the weaker of
            | the two. If that were true we'd break encryption just by
            | encrypting once more with a weaker algo.
        
             | piperswe wrote:
             | The invalidity of that claim is a bit more nuanced. Having
             | an inner, less secure algorithm may expose timing attacks
             | and the like. There are feasible scenarios where layered
             | encryption (with an inner weak algo and outer strong algo)
             | can be less secure than just the outer strong algorithm on
             | its own.
        
           | KennyBlanken wrote:
           | I'll do you one better.
           | 
            | The head of security for Golang, a Google employee, was also
            | part of the TLS 1.3 committee, and in Golang it's impossible
            | by design to disable specific cipher suites in TLS 1.3.
           | 
           | The prick actually had the nerve to assert that TLS 1.3's
           | security is so good this should never be necessary, and that
           | even if it were, they'll just patch it and everyone can
           | upgrade.
           | 
           | So someone releases a 0-day exploit for a specific TLS
           | cipher. Now you have to wait until a patch is released and
           | upgrade your production environment to fix it - all the while
           | your pants are down. That's assuming you're running a current
           | version in production and you don't have to test for bugs or
           | performance issues upgrading to a current release.
           | 
           | Heaven fucking forbid you hear a cipher is exploitable and be
           | able to push a config change within minutes of your team
           | hearing about it.
           | 
           | I'd place 50/50 odds on it being a bribe by the NSA vs sheer
           | ego.
        
             | londons_explore wrote:
              | Seems like a stupid design, if only for the fact that in
              | some uses of TLS, where a very specific client is
              | connecting, you might want to enable precisely the one
              | cipher suite you expect that client to use.
              | 
              | Then all your performance tests can rely on the encryption
              | and key exchange always using the same amount of CPU
              | time, etc.
        
         | meroes wrote:
         | Do you have more on the legality aspect? I knew NSA pressured
         | for a weaker key but what aspect could be made illegal? I had
         | to write an undergrad paper on the original DES and I never saw
         | an outright illegality aspect but wouldn't be surprised. They
          | also put in their own substitution boxes, and surprisingly I
          | never found much info on how exactly the NSA could use them.
          | So much speculation, but why no detailed post-mortems in the
          | modern age?
        
           | ahazred8ta wrote:
           | In the US, since the 1950s, you need a permit to export any
           | product which has encryption. There are fines if you don't
           | file the right paperwork. In the 1970s and 80s they would
           | only approve keys of 40 bits or less.
           | 
           | https://en.wikipedia.org/wiki/Export_of_cryptography_from_th.
           | ..
        
           | MattPalmer1086 wrote:
           | It seems that they changed the S boxes to make them more
            | resistant to differential cryptanalysis (which they knew
            | about but the public didn't). So this is actually a case of
            | them secretly strengthening the crypto.
           | 
           | Presumably this is because they didn't want adversaries being
           | able to decrypt stuff due to a fundamental flaw. I guess it's
           | possible they also weakened it in another way, but if so
           | nobody has managed to find it.
        
       | mannyv wrote:
       | There's talk that the NSA put its own magic numbers into
        | elliptic curve seeds. Does that count?
       | 
       | https://www.bleepingcomputer.com/news/security/bounty-offere...
        
       | nimbius wrote:
        | Everyone forgets the Speck and Simon ciphers the NSA wanted in
        | the Linux kernel, which were ultimately removed from it entirely
        | after a lot of well-deserved criticism from heavy hitters like
        | Schneier.
       | 
       | https://en.m.wikipedia.org/wiki/Speck_(cipher)
        
       | whatever3 wrote:
       | I posted this because of the Enigma/Crypto AG mixup in the
       | article, but it doesn't seem that anyone noticed. Seemed relevant
       | considering the post about fabricated Atlas Obscura stories a few
       | days ago.
        
       | mmaunder wrote:
        | Honorable mention for the ITAR regs that prevented Phil
        | Zimmermann from exporting PGP's 128-bit encryption until
        | Zimmermann and MIT Press printed the source as a book protected
        | by the First Amendment and exported it, which enabled others to
        | OCR it and recompile it offshore.
        | 
        | Also, ITAR enabled Thawte in South Africa (where I'm from) as a
        | business to completely dominate sales of 128-bit SSL certs
        | outside the US. Thawte was eventually acquired by VeriSign for
        | $600 million and the founder Mark Shuttleworth used the cash to
        | become an astronaut and then founded Ubuntu.
        
         | inquirerGeneral wrote:
         | I didn't know any of this. Thanks.
         | 
         | "The U.S. Munitions List changes over time. Until 1996-1997,
         | ITAR classified strong cryptography as arms and prohibited
         | their export from the U.S.[5]
         | 
         | Another change occurred as a result of Space Systems/Loral's
         | conduct after the February 1996 failed launch of the Intelsat
         | 708 satellite. The Department of State charged Space
         | Systems/Loral with violating the Arms Export Control Act and
         | the ITAR.[6][7]
         | 
         | As a result, technology pertaining to satellites and launch
         | vehicles became more carefully protected." https://en.wikipedia
         | .org/wiki/International_Traffic_in_Arms_....
        
         | CWuestefeld wrote:
         | At the time, I had a t-shirt that said "this t-shirt is a
         | munition", because it also had on it the RSA public key
         | algorithm encoded as a barcode.
        
           | e40 wrote:
            | Came to say the same thing. It sparked a lot of interesting
           | conversations over the years.
        
           | mmaunder wrote:
           | Haha I remember that. OG fist bump!
        
           | dramm wrote:
           | Had the same t-shirt with the barcode readable source code on
           | it. I think prompted by seeing Greg Rose wear one, may have
            | gotten it from him/mutual friends. As a foreign citizen I
           | was never brave enough to wear it through a USA entry
           | airport.
        
           | a-dub wrote:
           | i seem to have vague recollection of a variant of this shirt
           | that had a perl script on it? or was the perl script for
           | decoding the barcode?
        
             | T3OU-736 wrote:
             | Memory claims: It was RSA in Perl, text shape of a dolphin,
             | but it may be wrong.
        
         | p-e-w wrote:
         | I never understood the story about the book-printed PGP source
         | code. Isn't source code protected speech under the first
         | amendment anyway, regardless of the form in which it is
         | transmitted? All kinds of media receive first amendment
         | protection, including things like monetary donations, corporate
         | speech, art, etc. I've never heard of there being a requirement
         | for the printed form. Did the interpretation of the first
         | amendment change recently in this regard?
        
           | int_19h wrote:
           | The idea was to make it blatantly clear that it's not a
           | "munition".
        
             | p-e-w wrote:
             | Why would that matter, if source code is protected speech
             | anyway?
             | 
             | And why is it more "clear" with a printed book vs. an
             | emailed text file?
        
               | __MatrixMan__ wrote:
               | This was a time when we were happy when they managed to
               | get congresspeople to understand that the internet is not
               | like a truck, but more like a series of tubes.
               | 
               | I think anchoring it to something old school like a book
               | was a good call.
        
               | rfw300 wrote:
               | I think if you're looking for a logical answer from first
               | principles, you won't find one. It's more that the legal
               | system runs on precedent, and a book fits far more
               | squarely in the fact patterns of previous First Amendment
               | cases. Likely the source code case would end up with the
               | same outcome, but it doesn't hurt to make it more
               | obvious.
        
               | int_19h wrote:
               | Legally speaking, it didn't really matter. But
               | symbolically, having the feds argue that a book
               | constitutes "munitions" would be bad optics for them in a
               | way that is more understandable to the average American,
                | compared to more arcane legal arguments about software
               | having 1A protections.
        
           | femto wrote:
           | The book was published in 1995 [1,2]. Free speech protection
           | for source code under US law wasn't decided until 1996, in
           | Bernstein v. United States [3].
           | 
           | [1] https://en.wikipedia.org/wiki/Phil_Zimmermann#Arms_Export
           | _Co...
           | 
           | [2] https://www.eff.org/deeplinks/2015/04/remembering-case-
           | estab...
           | 
           | [3] https://en.wikipedia.org/wiki/Bernstein_v._United_States
        
         | tjoff wrote:
         | > _[...] and this enabled others to OCR it, and recompile it
         | offshore._
         | 
         | Did it? Or did it just give them plausible deniability?
         | 
         | I remember playing with OCR as a kid and all the software I
         | could get my hands on gave horrendous results, even if the
         | input was as perfect as one could hope for.
         | 
         | And even today I sometimes run tesseract on _perfect_
         | screenshots and it still makes weird mistakes.
         | 
         | Would be interesting to know if the book had any extra OCR-
         | enabling features. I'm sure the recipients would get access to
         | proper tools and software but OCRing source-code still seems
         | like a nightmare back then.
        
           | consp wrote:
            | Banks used it on cheques for ages, why would it be that
           | difficult? You do need a compatible typesetting though.
        
             | ryanackley wrote:
             | Where did you get your information? MICR lines, the line of
             | numbers at the bottom of the check, use magnetic ink. The
             | acronym stands for Magnetic Ink Character Recognition. So
             | for ages, they didn't use optics at all.
             | 
             | In the modern day, cheque OCR is monopolized by one
             | company, Mitek. They may use tesseract somewhere in their
             | stack but I've never read that anywhere.
        
           | MereInterest wrote:
           | Not so much plausible deniability, as a clear First Amendment
           | argument. If you're forbidden from exporting computer code,
           | that was some new-fangled magic thing that nobody could
           | possibly understand. If you're banned from exporting a book,
           | well that has some clear and obvious precedent as a
           | restriction of free speech.
        
         | bouncycastle wrote:
         | RSA = Republic of South Africa
        
         | T3OU-736 wrote:
         | Thawte happily sold certs to US entities like universities (I
         | was a sysadmin at one of those universities with such a cert :)
         | )
        
       | ETH_start wrote:
       | For financial encryption, so essential is warrantless
       | surveillance to their control of finance, that they've
       | successfully argued that a neutral and immutable protocol
       | instantiating open source code on a distributed public blockchain
       | is property of a sanctionable entity, and thus within their
       | authority to prohibit Americans from using:
       | 
       | https://cases.justia.com/federal/district-courts/texas/txwdc...
        
       | hunglee2 wrote:
        | The biggest threat to a citizen's privacy is always their own
        | government.
        
         | BLKNSLVR wrote:
         | Yep. I use Chinese brand phones because if they're snooping all
         | my shit, they're much further away from me than my own
         | government and not likely to have sharing arrangements.
        
           | aspenmayer wrote:
           | Wouldn't Chinese branded phones be a higher priority target
           | by foreign agencies in the first place?
        
             | BLKNSLVR wrote:
             | It's likely an additional data point in some kind of
             | 'suspicious' rating.
             | 
             | I think I hit quite a few of those 'suspicious' check-boxes
             | that law enforcement would consider important, whilst
             | actually technically knowledgeable people wouldn't even
             | blink at them. Refer:
             | https://news.ycombinator.com/item?id=39050898
        
         | AtlasBarfed wrote:
         | I kinda disagree, because the government, even now, can be
         | shamed and outraged.
         | 
         | Corporations however? They are, by design, utterly amoral.
         | 
          | So the modern state is that corporations are hoovering up all
          | the data they can about you for "ad research and
          | optimization". I think I read recently that Facebook has
          | thousands of companies involved in the customer data supply
          | chain?
         | 
         | And if those companies have your data, it's not that YOUR
         | government has it guaranteed. It's that ALL governments have
         | your data.
        
       | quasarj wrote:
       | I like that typo in the image label - the Chipper Clip lol
        
         | bemusedthrow75 wrote:
          | We've had enough of chipper clips to last a lifetime!
          | 
          |     It looks like you're writing an article about encryption.
          |     Would you like help?
          | 
          |     (o) Insert a joke about Apple forcing a U2 album on us
          |     (o) Let me write the joke myself
          | 
          |     [x] Don't show me this tip again
        
         | xrd wrote:
         | That's the backdoor in the rot13 visible to the naked eye.
        
       | jarjar2_ wrote:
       | One of my favorite comics about cryptography.
       | https://xkcd.com/538/
       | 
       | Government routinely posits a desperate need for backdoors in
       | crypto and crypto secured products, but almost universally they
       | get the data they want without needing a manufacturer provided
       | backdoor. So why they insist on continuing to do that is beyond
       | me. It's almost security theater.
       | 
        | If they really want your protected information they will be able
        | to get it, either through a wrench or a legal wrench. Failing
        | that, they can use the practically unlimited resources at their
        | disposal, from the people they employ (or contract out to) to
        | the long axis along which most secured devices eventually
        | succumb: time.
       | 
       | My personal threat model isn't to defeat the government. They
       | will get the data eventually. My personal threat model is
       | corporations that want to know literally everything about me and
       | bad-faith private actors (scammers, cybercriminals and thieves)
       | that do too.
       | 
       | Ultimately it will take strict legislation and compliance
       | monitoring, along with penalties, to keep the government from
       | overstepping the bounds it already promises not to step over,
       | let alone new ones. It will take even stricter legislation to
       | stop corporations from doing it. There are significant financial
       | and political incentives for our ruling bodies not to do that,
       | unfortunately.
       | 
       | I mean honestly, when you have this kind of ability at your
       | disposal...
       | 
       | https://www.npr.org/2021/06/08/1004332551/drug-rings-platfor...
        
         | paulpauper wrote:
         | A backdoor, which works anywhere and way better than the wrench
         | option.
        
           | jarjar2_ wrote:
           | They don't need it, which was my point. They have all the
           | tools they need right now to get what they want. Why should
           | anyone grant them more?
        
             | lazide wrote:
             | Why would they not try to get a magic back door that makes
             | their lives easier, even if they don't need it?
        
         | dennis_jeeves2 wrote:
         | >Ultimately it will take strict legislation and compliance
         | measurement along with penalties to protect the government from
         | overstepping the bounds they promise not to step over already,
         | let alone new ones.
         | 
         | They will find ways to not comply, often blatantly. They have
         | no scruples.
        
       | jonathanyc wrote:
       | Now the argument coming from civil society for backdoors is based
       | on CSAM:
       | 
       | > Heat Initiative is led by Sarah Gardner, former vice president
       | of external affairs for the nonprofit Thorn, which works to use
       | new technologies to combat child exploitation online and sex
       | trafficking. In 2021, Thorn lauded Apple's plan to develop an
       | iCloud CSAM scanning feature. Gardner said in an email to CEO Tim
       | Cook on Wednesday, August 30, which Apple also shared with WIRED,
       | that Heat Initiative found Apple's decision to kill the feature
       | "disappointing."
       | 
       | > "Apple is one of the most successful companies in the world
       | with an army of world-class engineers," Gardner wrote in a
       | statement to WIRED. "It is their responsibility to design a safe,
       | privacy-forward environment that allows for the detection of
       | known child sexual abuse images and videos. For as long as people
       | can still share and store a known image of a child being raped in
       | iCloud we will demand that they do better."
       | 
       | https://www.wired.com/story/apple-csam-scanning-heat-initiat...
        
         | rpmisms wrote:
         | CSAM is evil, and I personally believe we should execute those
         | who distribute it.
         | 
         | I have an even stronger belief in the right to privacy, and
         | those in the government who want to break it should be executed
         | from their positions (fired and publicly shamed).
        
           | smolder wrote:
           | In some cases artwork where no child was harmed counts as
           | CSAM. Is that execution worthy?
        
         | int_19h wrote:
         | This isn't even a recent thing anymore. "iPhone will become the
         | phone of choice for the pedophile" was said by a senior
         | official in 2014, when full device encryption was starting to
         | become common.
        
         | matheusmoreira wrote:
         | The perfect political weapon. Anyone who opposes it is
         | automatically labeled a pedophile and child abuser. Their
         | reputation is destroyed and they will never oppose it again.
        
       | Joel_Mckay wrote:
       | Encryption is meaningless with CPU-level side-channel memory key
       | dumps active on most modern platforms. The reality is that if
       | you have been targeted for financial or technological reasons,
       | then any government will eventually get what it is after.
       | 
       | One can't take it personally, as all despotic movements also
       | started with sycophantic idealism.
       | 
       | Have a great day, =)
       | 
       | https://xkcd.com/538/
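       | 
       | (To make "side channel" concrete, a minimal timing-side-channel
       | sketch in TypeScript: a naive byte-by-byte comparison, not any
       | specific CPU flaw, whose early exit leaks through timing alone
       | how long the correct prefix of a guess is, next to a
       | constant-time version that doesn't.)
       | 
       |     // Naive comparison: bails out at the first mismatch, so its
       |     // running time grows with the length of the correct prefix.
       |     function naiveEquals(a: Uint8Array, b: Uint8Array): boolean {
       |       if (a.length !== b.length) return false;
       |       for (let i = 0; i < a.length; i++) {
       |         if (a[i] !== b[i]) return false; // early exit leaks position i
       |       }
       |       return true;
       |     }
       | 
       |     // Constant-time variant: touches every byte and folds all the
       |     // differences into one accumulator, so timing reveals nothing.
       |     function constantTimeEquals(a: Uint8Array, b: Uint8Array): boolean {
       |       if (a.length !== b.length) return false;
       |       let diff = 0;
       |       for (let i = 0; i < a.length; i++) {
       |         diff |= a[i] ^ b[i];
       |       }
       |       return diff === 0;
       |     }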
        
         | keepamovin wrote:
         | Agree with this. Makes me think that the code-breakers
         | themselves must be using specialized hardware to protect their
         | own side-channels. But for this to be feasible you need to have
         | big chipmakers in on it. Fascinating to consider
        
           | Joel_Mckay wrote:
           | No need, data collection is a different function than
           | exploitation. People that are turned into assets are often
           | not even aware how they are being used.
           | 
           | As a joke, I once insisted I could be bribed to avoid the
           | escalation of coercion; that was funny until someone actually
           | offered $80k for my company workstation one day.
           | 
           | It is a cultural phenomenon, as in some places it is
           | considered standard, acceptable practice.
           | 
           | My advice is to be as boring as possible, legally proactive,
           | and keep corporate financial profit modes quiet.
           | 
           | Good luck =)
        
       | keepamovin wrote:
       | I was so curious about the origins of the SHA algorithms that I
       | filed a FOIA request with the NSA about _SHA-0_ ^0, as I wanted
       | to understand how it was developed, and requested all internal
       | communications, diagrams, papers and so on responsive to that.
       | 
       | Interestingly I found that after I got a reply (rough summary:
       | _you are a corporate requester, this is overly broad, it will be
       | very expensive_ ) I could no longer access the NSA website. Some
       | kind of fingerprint block. The block persisted across IP
       | addresses, browsers, incognito tabs, and devices so it can't be
       | based on cookies / storage.
       | 
       | Still in place today:
       | 
       |     Access Denied
       |     You don't have permission to access
       |     "http://nsa.gov/serve-from-netstorage/" on this server.
       | 
       | 0: https://en.wikipedia.org/wiki/SHA-1#Development
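       | 
       | (Background on why SHA-0's paper trail draws this kind of
       | interest: the only published difference between SHA-0 and the
       | SHA-1 that replaced it is a one-bit left rotation added to the
       | message schedule, with the flaw it fixed left undisclosed at the
       | time. A TypeScript sketch of just that step, illustrative only,
       | not a full hash implementation:)
       | 
       |     // Message-schedule expansion for one 512-bit block (W[0..79]).
       |     // SHA-0 and SHA-1 differ only in the rotl32(x, 1) below.
       |     function rotl32(x: number, n: number): number {
       |       return ((x << n) | (x >>> (32 - n))) >>> 0;
       |     }
       | 
       |     function expandSchedule(block: Uint32Array, sha1: boolean): Uint32Array {
       |       const w = new Uint32Array(80);
       |       w.set(block.subarray(0, 16));  // W[0..15] come from the block
       |       for (let t = 16; t < 80; t++) {
       |         const x = w[t - 3] ^ w[t - 8] ^ w[t - 14] ^ w[t - 16];
       |         w[t] = sha1 ? rotl32(x, 1) : x >>> 0; // SHA-1 adds the rotate
       |       }
       |       return w;
       |     }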
        
         | p-e-w wrote:
         | > Some kind of fingerprint block. The block persisted across IP
         | addresses, browsers, incognito tabs, and devices so it can't be
         | based on cookies / storage.
         | 
         | Then what is it based on, if it happens across different
         | devices _and_ different IP addresses?
         | 
         | I find it very surprising that the NSA would go to such
         | technologically advanced lengths to block FOIA requesters _from
         | their website_ (which, needless to say, doesn't contain any
         | sensitive information).
        
           | keepamovin wrote:
           | Yeah, weird, right? Highly surprising, and possibly a
           | high-entropy, highly informative bit of signal. Maybe an
           | obvious way to admit SHA-0 is a pressure point.
           | 
           | Idk, maybe you can figure out the block, I think it's beyond
           | me. Here's a picture if that helps haha! :)
           | 
           | https://imgur.com/a/rNIjrB2
           | 
           | Highly unlikely to be a coincidence, but I took it to mean:
           | _Don't make these requests_ ... OK ... haha! :)
        
             | lcnPylGDnU4H9OF wrote:
             | This honestly seems kinda fun. If one were really
             | dedicated: buy a new device with cash; purchase and use it
             | outside your city of residence; don't drive there, take a
             | non-electric bike or walk; only use the device to connect
             | to the website from public wifi; never connect to your own
             | wifi; don't use the same VPN service as usual. Not sure if
             | I missed anything. Probably did.
        
               | numpad0 wrote:
               | Or walk into an internet cafe. Cafe membership systems,
               | if any, probably aren't yet connected enough to prevent
               | showing you the raw Internet for the first few minutes,
               | at least for a few more years. Everyone who's vocal
               | online should try this once in a while. Even Google
               | search results noticeably change depending on the social
               | class inferred from your location and whatnot.
        
             | rvnx wrote:
             | It's just Akamai being overzealous against bots.
             | 
             | It could simply be that you read more pages and triggered
             | anti-scraping rules.
             | 
             | I cannot access many .gov websites either, and maybe it was
             | after 5 pages or so.
        
             | MichaelDickens wrote:
             | This seems like a good way to learn what information your
             | system is leaking that it shouldn't be leaking, eg if you
             | use a VPN and they still block you, your VPN is probably
             | not doing what it claims to be doing. (AFAIK a correctly
             | implemented VPN would not send any of your computer or
             | browser information to nsa.gov.)
        
               | themoonisachees wrote:
               | VPNs do not do what you seem to think they do. A VPN is a
               | privacy tool as much as restarting your router to get a
               | new IP lease is a privacy tool.
        
           | ranger_danger wrote:
           | there are MANY different ways to fingerprint something or
           | someone, see e.g. https://abrahamjuliot.github.io/creepjs/ or
           | https://scrapeops.io/web-scraping-playbook/how-to-bypass-
           | clo....
           | 
           | also fun fact, even Tor Browser can't hide the real OS you're
           | running when a site uses javascript-based OS queries.
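           | 
           | (A rough sketch of the kind of signals such scripts combine,
           | in browser TypeScript; the attributes are standard web APIs,
           | and the hashing step is just illustrative.)
           | 
           |     // Collect a few high-entropy attributes that survive
           |     // incognito mode, cookie clearing, and (often) VPN use,
           |     // then hash them into a single comparable identifier.
           |     async function fingerprint(): Promise<string> {
           |       const canvas = document.createElement("canvas");
           |       const ctx = canvas.getContext("2d");
           |       if (ctx) {
           |         ctx.font = "16px Arial";
           |         ctx.fillText("probe", 2, 18); // varies by GPU/driver/fonts
           |       }
           |       const signals = [
           |         navigator.userAgent,
           |         navigator.platform,           // OS hint, even behind a VPN
           |         navigator.language,
           |         navigator.hardwareConcurrency,
           |         `${screen.width}x${screen.height}x${screen.colorDepth}`,
           |         Intl.DateTimeFormat().resolvedOptions().timeZone,
           |         canvas.toDataURL(),
           |       ].join("|");
           |       const bytes = new TextEncoder().encode(signals);
           |       const digest = await crypto.subtle.digest("SHA-256", bytes);
           |       return Array.from(new Uint8Array(digest))
           |         .map((b) => b.toString(16).padStart(2, "0"))
           |         .join("");
           |     }
           | 
           |     fingerprint().then((id) => console.log("fingerprint:", id));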
        
         | justinclift wrote:
         | That url (http://nsa.gov/serve-from-netstorage/) works via Tor,
         | so maybe try that? ;)
        
         | Anon4Now wrote:
         | I'm curious as to why the NSA still has http:// urls.
        
           | pseudocomposer wrote:
           | It redirects to HTTPS.
        
         | AtlasBarfed wrote:
         | They probably have someone specifically assigned to crack every
         | device you use.
        
       | mouzogu wrote:
       | how likely is it that whatsapp or telegram are backdoored?
       | 
       | i wonder what tools guerilla armies or drug lords use to
       | communicate...
       | 
       | or maybe it's better to hide in plain sight.
       | 
       | just use some kind of doublespeak that gives plausible
       | deniability.
        
         | KennyBlanken wrote:
         | Telegram is a poster-child for sketchy security.
        
       | KennyBlanken wrote:
       | This leaves out at least one other proven case - the NSA worked
       | to weaken an early encrypted telephone system that was sold to
       | numerous other governments, which allowed the NSA to listen in
       | on conversations.
       | 
       | Then there's this: https://www.cnet.com/tech/tech-industry/nsa-
       | secret-backdoor-...
       | 
       | And then there was the Tailored Access Operations group that
       | backdoored hundreds if not thousands of computers and networking
       | gear https://en.wikipedia.org/wiki/Tailored_Access_Operations
       | 
       | And then there's Bullrun where they partnered with commercial
       | software and hardware companies to insert backdoors, specifically
       | in many commercial VPN systems
       | https://en.wikipedia.org/wiki/Bullrun_(decryption_program)
       | 
       | Let's also not forget the backdooring of Windows NT:
       | https://en.wikipedia.org/wiki/NSAKEY
       | 
       | ...and Lotus Notes was backdoored as well.
        
       | qubex wrote:
       | The image isn't of an Enigma but of a Lorenz teleprinter
       | encryption system. It would now be referred to as a "stream
       | cipher".
        
       | mstef wrote:
       | i believe the cryptomuseum has a more extensive list than the one
       | in the link: https://www.cryptomuseum.com/intel/nsa/backdoor.htm
       | 
       | one in particular is interesting, as i have reverse engineered it
       | and proven its existence:
       | https://www.cryptomuseum.com/crypto/philips/px1000/nsa.htm
        
       ___________________________________________________________________
       (page generated 2024-02-04 23:01 UTC)