[HN Gopher] A brief history of the U.S. trying to add backdoors ...
___________________________________________________________________
A brief history of the U.S. trying to add backdoors into encrypted
data (2016)
Author : whatever3
Score : 163 points
Date : 2024-02-03 20:11 UTC (2 hours ago)
(HTM) web link (www.atlasobscura.com)
(TXT) w3m dump (www.atlasobscura.com)
| loughnane wrote:
| This topic comes up a bunch still. Someone please correct me, but
| as I understand it anyone using new chips that use Intel ME (or
| AMD's equivalent) have a gaping hole in their security that no OS
| can patch.
|
| I know puri.sm[0] takes some steps to try to plug the hole, but
| haven't read up to see if it's effective or not.
|
| [0] https://puri.sm/learn/intel-me/
| bri3d wrote:
| > anyone using new chips that use Intel ME (or AMD's
| equivalent) have a gaping hole in their security that no OS can
| patch
|
| Not really; anyone using chips with Intel ME or AMD PSP has an
| additional large binary blob running on their system which may
| or may not contain bugs or backdoors (recognizing, of course,
| that a sufficiently bad bug is indistinguishable from a backdoor).
|
| There are tens to hundreds of such blobs running on almost any
| modern system, and these are just one example. I would argue
| that ME and PSP are not the worst blobs on many systems: they
| have unsupported but almost certainly effective (me_cleaner /
| ME code removal), supported and almost certainly effective
| (HAP bit), and supported and likely effective (ME / PSP disable
| command) mechanisms to disable their functionality, and they
| are comparatively well-documented versus the firmware that runs
| on every other peripheral (networking, GPU, etc.) and
| comparatively hardened versus EFI.
| loughnane wrote:
| Yeah, this lives in the back of my mind too. I run debian on
| 11th gen intel, but with the non-free blobs included to make
| life easier. I've been meaning to try it without them, but
| it's too tempting to just get things 'up' instead of hacking
| on it.
| wtallis wrote:
| Most consumer products (as opposed to some of those marketed to
| businesses) don't have enough of the components in place for
| the ME to accomplish anything, good or bad.
| loughnane wrote:
| What do you mean? What sort of components?
| hn8305823 wrote:
| In case anyone is wondering about the context for this 2016
| article, it was right after the 2015 San Bernardino attack and
| the FBI was trying to get into one of the attacker's phones.
| Apple resisted the request primarily because the FBI wanted a
| signed tool that would allow installing rogue firmware/apps/OS
| on any iPhone, not just the attacker's.
|
| https://en.wikipedia.org/wiki/2015_San_Bernardino_attack
| coppsilgold wrote:
| FBI director James Comey has publicly lobbied for the insertion
| of cryptographic "backdoors" into software and hardware to allow
| law enforcement agencies to bypass authentication and access a
| suspect's data surreptitiously. Cybersecurity experts have
| unanimously condemned the idea, pointing out that such backdoors
| would fundamentally undermine encryption and could be exploited
| by criminals, among other issues.
|
| "could be exploited by criminals" is sadly a disingenuous claim. A
| cryptographic backdoor is presumably a "Sealed Box"[1] type
| construct (KEM + symmetric-cipher-encrypted package). As long as
| the government can keep a private key secure only they could make
| use of it.
|
| There are plenty of reasons not to tolerate such a backdoor, but
| using false claims only provides potential ammunition to the
| opposition.
|
| [1] <https://libsodium.gitbook.io/doc/public-key_cryptography/sea...>
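A minimal sketch of the "sealed box" escrow structure the comment describes: the data is encrypted under a fresh symmetric key, and only that small key is wrapped for the escrow holder, so recovering one device's data never exposes the master secret to traffic analysis. This is a stdlib-only toy; a real system would use X25519 key encapsulation plus an AEAD (e.g. libsodium's crypto_box_seal), and the SHA-256 XOR keystream here is illustrative only, not secure.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key+nonce (toy, SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Device side: encrypt the data under a fresh random symmetric key...
data_key = os.urandom(32)
nonce = os.urandom(16)
plaintext = b"suspect's data"
ciphertext = xor(plaintext, keystream(data_key, nonce, len(plaintext)))

# ...then "seal" only the 32-byte data key to the escrow holder. A real KEM
# would use the escrow agent's public key; a shared secret stands in here.
escrow_secret = os.urandom(32)  # stands in for the government's private key
wrapped_key = xor(data_key, keystream(escrow_secret, nonce, 32))

# Escrow side: unwrap the small data key, then decrypt the package.
recovered_key = xor(wrapped_key, keystream(escrow_secret, nonce, 32))
recovered = xor(ciphertext, keystream(recovered_key, nonce, len(plaintext)))
assert recovered == plaintext
```

Note that the escrow exchange is tiny, as the comment says: only the wrapped key (32 bytes here, under 100 bytes in a real KEM) crosses the boundary, never the bulk data.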
| 2OEH8eoCRo0 wrote:
| And Apple has a backdoor that only Apple can use. Why don't
| criminals exploit Apple's backdoor?
| frickinLasers wrote:
| https://arstechnica.com/security/2023/12/exploit-used-in-
| mas...
|
| Looks like criminals were using it for four years undetected.
| quickslowdown wrote:
| Which backdoor do you mean? I'm not an Apple expert by any
| means, but I thought they encrypted customer data in a way
| that even they can't get to it? Wasn't that the crux of this
| case, that Apple couldn't help the FBI due to security
| measures, prompting the agency to ask for a backdoor?
| 2OEH8eoCRo0 wrote:
| What about an update? They can sign and push any code they want
| remotely.
| dataangel wrote:
| IIRC the question is when the phone is totally locked,
| e.g. if you turn it off then turn it back on and haven't
| entered the PIN yet. In this state even apple can't get
| an update to run, the secure hardware won't do it unless
| you wipe the phone first. And your data is encrypted
| until you unlock the phone.
|
| In practice though most people are screwed b/c it's all
| already in icloud.
| fragmede wrote:
| with advanced data protection, it's encrypted before it
| hits iCloud, so neither Apple nor the feds can get at it.
| catlifeonmars wrote:
| Source/reference? I'm not aware of such a backdoor
| adrian_b wrote:
| See the posting above about the Arstechnica article.
|
| During the last days of 2023 there was a big discussion,
| also on HN, after it was revealed that all recent Apple
| devices had a hardware backdoor that allowed bypassing all
| memory access protections claimed to exist by Apple.
|
| It is likely that the backdoor consisted of some cache
| memory test registers used during production, but it is
| baffling how, for many years, those test registers were
| not disabled at the end of the manufacturing process and
| instead remained accessible to attackers who knew Apple's
| secrets. For instance, any iPhone could be completely
| controlled remotely after being sent an invisible iMessage.
| catlifeonmars wrote:
| FWIW this is a fair and valid argument. Generally, no one
| entity should have that much power. Doesn't really matter if
| it's USG or a tech giant.
| devwastaken wrote:
| It's not a false claim; the assumption that the feds would keep
| such a key "secure" is not backed by evidence. Top secret
| materials are leaked all the time. Private keys are extracted
| from well-secured systems in hacks. The FBI having such a key
| would make them
| a very profitable target for the various corps that specialize
| in hacking for hire. For example, NSO group.
|
| If the power doesn't exist, nobody can exploit it.
| coppsilgold wrote:
| Do military cryptographic keys leak often? Do nuclear codes
| leak?
|
| The times highly valuable cryptographic keys have leaked from
| cryptocurrency exchanges, it has generally, if not always,
| been due to gross negligence.
|
| Such a key would be highly sensitive and it would also
| require very little traffic to use. You would just need to
| send the secure system a KEM ciphertext (<100 bytes) and it
| would respond with the symmetric key protecting the package.
|
| I don't doubt they could secure it. Can even split the key
| into shares and require multiple parties to be present in the
| secure location.
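The key-splitting idea mentioned above can be sketched with a simple n-of-n XOR split: each share is random on its own, and the key is recoverable only when every party contributes. A real deployment would likely use Shamir's k-of-n secret sharing so that a threshold (not all) of parties suffices; this all-or-nothing toy just shows the principle.

```python
import os
from functools import reduce

def split(key: bytes, n: int) -> list[bytes]:
    """Split key into n shares; all n are required to reconstruct it."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    # The last share is the key XORed with all the random shares,
    # so XORing every share back together yields the key.
    last = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(key, *shares))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shares))

key = os.urandom(32)
shares = split(key, 3)
assert combine(shares) == key
assert combine(shares[:2]) != key  # any missing share leaves the key hidden
```

Each subset of fewer than n shares is statistically independent of the key, which is what lets the shares be held by separate parties in separate locations.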
| dvngnt_ wrote:
| Nuclear codes are probably not used as much as phone
| backdoors would be. Local police would want access too, and
| so would other governments, so I do believe it would leak.
| some_furry wrote:
| > Do nuclear codes leak?
|
| For many years, the code was 00000000.
|
| https://arstechnica.com/tech-policy/2013/12/launch-code-
| for-...
| jliptzin wrote:
| What are you going to do with a nuclear code without access
| or authority to launch the nukes?
| devwastaken wrote:
| You're making so many assumptions that nothing you've
| stated can be taken as an honest reflection of reality.
|
| Nobody has to know the rate of leaks; it's irrelevant.
| Gross negligence is not necessary, and how would you even
| know? Leaks by definition are rarely exposed; we only see
| some of them.
|
| A "highly sensitive" key doesn't mean anything. Assigning
| more words to it doesn't somehow change the nature of it.
| Humans are bad at securing things, that's why the best
| security is to not have a system that requires it.
|
| Whatever hypothetical solution you have would be crushed
| under the weight of government committees and office
| politics until your security measures are bogus.
| whatshisface wrote:
| > _As long as the government can keep a private key secure only
| they could make use of it._
|
| Your devices would be secure as long as a private key that
| happened to be the most valuable intelligence asset in the
| United States, accessed thousands of times per day, by police
| spread across the entire nation, was never copied or stolen.
| lulznews wrote:
| Most normies would gladly vote for backdoors; you gotta use
| these scary criminal arguments to get them to think about it.
| catlifeonmars wrote:
| > As long as the government can keep a private key secure only
| they could make use of it.
|
| Not disingenuous. Keys are stolen or leaked all the time. And
| the blast radius of such a master key would be extremely large.
| nonrandomstring wrote:
| > false claims
|
| As Pauli said, "That's not even wrong". It cannot even meet the
| basic criteria for truth or falsehood.
|
| It's simply naked hubris.
| sowbug wrote:
| You assume a perfect implementation of the backdoor. Even if
| the cryptographic part were well-implemented, someone will
| accidentally ship a release build with the wrong key, or with a
| disabled safety that they normally use to test it.
|
| It's just another moving part that can break, and breaking it
| defeats the whole purpose of the system.
| buffet_overflow wrote:
| > As long as the government can keep a private key secure only
| they could make use of it.
|
| Well, keep in mind they would have to keep it secure in
| perpetuity. Any leak over the lifetime of any of that hardware
| would be devastating to the owners. Blue Team/defensive
| security is often described as needing to be lucky every time,
| whereas Red Team/attackers just have to get lucky once.
|
| This attack vector is in addition to just exploiting the
| implementation in some way, which I don't think can be
| handwaved away.
| Rebelgecko wrote:
| >As long as the government can keep a private key secure only
| they could make use of it.
|
| That's a big "if". Look at how the government has protected
| physical keys...
|
| Ever since the TSA accidentally leaked them, you can buy a set
| of keys on Amazon for $5 that opens 99% of "TSA approved" locks.
| bayindirh wrote:
| Let's see:
|
| Mercedes recently forgot a token in a public repository which
| grants access to _everything_.
|
| Microsoft forgot its "Golden Key" in the open, allowing all
| kinds of activation and secure boot shenanigans.
|
| Microsoft's JWT signing key was also stolen, making the login
| page a decoration.
|
| Somebody stole Realtek's driver signing keys for the Stuxnet
| attack.
|
| The HDCP (HDMI content protection) master key is broken.
|
| The Blu-ray (AACS) master key is broken.
|
| DVD CSS master key is broken.
|
| TSA master keys are in all 3D printing repositories now.
|
| Staying in the physical realm, somebody made an automated tool
| to profile, interpret and print key blanks for locks with
| "restricted keyways" which have no blanks available.
|
| These are the ones I remember just off the top of my head.
|
| So yes, any digital or physical secret key is secure _until it
| isn't_.
|
| It's not a question of if, but when. So, no escrows or back
| doors. Thanks.
| mnw21cam wrote:
| > As long as the government can keep a private key secure...
|
| Which government? Software crosses borders.
|
| You can bet that if the US mandated a back door to be inserted
| into software that was being exported to another country, that
| country would want to either have the master key for that back
| door, or a different version of the software with a different
| back door or _without_ the back door. A software user could
| choose the version of the software that they wanted to use
| according to which country (if any) could snoop on them. It's
| unworkable.
| dupertrooper wrote:
| Russia good country!
| progbits wrote:
| As this is from 2016 it doesn't include this new fun revelation:
|
| > On 11 February 2020, The Washington Post, ZDF and SRF revealed
| that Crypto AG was secretly owned by the CIA in a highly
| classified partnership with West German intelligence, and the spy
| agencies could easily break the codes used to send encrypted
| messages.
|
| https://en.m.wikipedia.org/wiki/Crypto_AG
| hedora wrote:
| More details here:
|
| https://web.archive.org/web/20200212014117/https://www.washi...
| schmudde wrote:
| Sharing this seems like an appropriate way of commemorating David
| Kahn's passing (https://news.ycombinator.com/item?id=39233855).
| <3
| xyst wrote:
| For a long time, the US considered cryptographic algorithms a
| munition; exporting them required an arms license.
|
| Also, the US tried to convince the world that 56 bits of
| encryption was sufficient. As SSL (I don't think TLS was a
| thing back then) was becoming more mainstream, the US govt
| only permitted banks and other entities to use DES [1] to
| "secure" their communications. Exporting anything stronger
| than 56 bits was restricted.
|
| https://en.m.wikipedia.org/wiki/Data_Encryption_Standard
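Some rough arithmetic shows why a 56-bit keyspace is brute-forceable. The trial rate below (10^9 DES operations per second per device) is an assumption for illustration, in the ballpark of modern hardware; dedicated crackers like the 1998 EFF "Deep Crack" machine already managed days per key.

```python
# Exhausting a 56-bit keyspace at an assumed 10^9 trials/second:
keyspace = 2 ** 56
rate = 10 ** 9          # keys per second per device (assumed)
seconds = keyspace / rate

years_one_device = seconds / 86400 / 365      # about 2.3 years on one device
hours_1000_devices = seconds / 3600 / 1000    # about 20 hours on 1000 devices
print(f"{years_one_device:.1f} years on one device")
print(f"{hours_1000_devices:.1f} hours on 1000 devices")
```

Every extra key bit doubles the work, which is why the jump from 56-bit DES to 112/168-bit 3DES and 128-bit AES put exhaustive search out of reach.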
| londons_explore wrote:
| Even now, if you join a discussion on crypto and say something
| like "Why don't we double the key length" or "Why not stack two
| encryption algorithms on top of one another because then if
| either is broken the data is still secure", you'll immediately
| get a bunch of negative replies from anonymous accounts saying
| it's unnecessary and that current crypto is plenty secure.
| mannyv wrote:
| There's talk that the NSA put its own magic numbers into
| elliptic curve seeds. Does that count?
|
| https://www.bleepingcomputer.com/news/security/bounty-offere...
| nimbius wrote:
| Everyone forgets the Speck and Simon ciphers the NSA wanted in
| the Linux kernel; Speck was ultimately removed from it entirely
| after a lot of well-deserved criticism from heavy hitters like
| Schneier.
|
| https://en.m.wikipedia.org/wiki/Speck_(cipher)
___________________________________________________________________
(page generated 2024-02-03 23:00 UTC)