[HN Gopher] Vulnerabilities in TETRA radio networks
       ___________________________________________________________________
        
       Vulnerabilities in TETRA radio networks
        
       Author : porsupah
       Score  : 163 points
       Date   : 2023-12-09 17:02 UTC (5 hours ago)
        
 (HTM) web link (www.cryptomuseum.com)
 (TXT) w3m dump (www.cryptomuseum.com)
        
       | denysvitali wrote:
        | TL;DR: The only newsworthy vulnerability is the breaking of
        | TEA1 - which is anyway the least secure of them all and only
        | intended for commercial use (that is, not emergency services).
       | 
       | https://www.tetraburst.com/
        
         | pixl97 wrote:
         | The question is, did things like emergency services actually
         | use the higher levels, or did they just use TEA1?
         | 
         | It's kind of like saying...
         | 
         | Vendor: "We support up to 1 zillion bit encryption!"
         | 
         | User: "What's the default out of the box?"
         | 
         | Vendor: "10 bit"
        
         | riversflow wrote:
         | Hogwash, I think it's worth noting that this _European_ system
         | was intentionally backdoored.
         | 
         | Everybody plays the espionage game, Europe really is no
         | exception, they just like to use the US to keep their hands
         | (mostly) clean.
        
         | riedel wrote:
         | > TL;DR: The only newsworthy vulnerability is the breaking TEA1
         | 
          | This is IMHO a very unfair TL;DR. The news is that the
          | researchers claim there is a deliberate backdoor, which ETSI
          | denies. If that is true, the other proprietary parts cannot
          | be trusted either.
        
         | matthewdgreen wrote:
         | It appears to be used for infrastructure, including things like
         | power and transportation signals here in the US.
        
       | freeopinion wrote:
        | > The vulnerabilities were discovered during the course of 2021,
       | and were reported to the NCSC in the Netherlands in December of
       | that year. It was decided to hold off public disclosure until
       | July 2023, to give emergency services and equipment suppliers the
       | ability to patch the equipment.
       | 
       | Interesting discussion about responsible disclosure. It seems a
       | strange belief that you can tell all the radio operators about
       | the vulnerability without also telling exploiters. Aren't they
       | often one and the same? What's a reasonable approach here?
        
         | tptacek wrote:
         | Immediate public disclosure.
        
           | freeopinion wrote:
           | I'm inclined to agree. I'm not comfortable with the way this
           | unfolded.
           | 
           | > The Dutch NCSC (NCSC-NL) was informed in December 2021,
           | after which meetings were held with the law enforcement and
           | intelligence communities, as well as with ETSI and the
           | vendors. Shortly afterwards, on 2 February 2022, preliminary
           | advice was distributed to the various stakeholders and CERTs.
           | The remainder of 2022 and the first half of 2023 were used
           | for coordination and advisory sessions with stakeholders,
           | allowing manufacturers to come up with firmware patches,
           | updates or workarounds.
           | 
           | This reads to me as if malicious parties were notified some
           | 18 months before users were notified.
        
             | actionfromafar wrote:
             | Depends on who the stakeholders were.
        
               | freeopinion wrote:
               | Does it? Intelligence agencies were among the first
               | informed. Those are the bad guys.
               | 
               | I know "bad guys" is a harsh phrasing, but when it comes
               | to encrypted communication, they are literally the
               | definition of the adversary. Anybody in intelligence that
               | doesn't play for my team is a "bad guy". And since
               | everybody belongs to multiple conflicting teams, even a
               | person who plays on one of my teams is a "bad guy" from
               | the perspective of my other teams.
               | 
               | If the first place you go with a disclosure is to the
               | intelligence community, you are hurting users.
        
         | gwbas1c wrote:
         | > It seems a strange belief that you can tell all the radio
         | operators about the vulnerability without also telling
         | exploiters
         | 
         | I suspect that there was an update (or replacement) to the
         | radios that was generally described as an ordinary update /
         | maintenance.
        
           | freeopinion wrote:
           | Do you also suspect that the patch was generally ignored
           | because nobody knew it was important?
           | 
           | Should the vendor be allowed to continue to sell models they
           | know are compromised while their competition loses those
           | contracts? Shouldn't there be some consequence for such
           | fraud?
        
       | LocalH wrote:
       | Sounds like they took the "roll your own and don't tell anyone
       | how it works" approach. Security by obscurity is never security.
       | History has shown that the open encryption standards are the most
       | secure.
        
         | denysvitali wrote:
         | You can't easily put backdoors in cryptographic algorithms that
         | can be audited
        
           | anonym29 wrote:
           | ^ this post brought to you by RSA, ANSI, ISO, NIST, the NSA,
           | and the authors of DUAL_EC_DRBG
           | 
           | /s
        
             | gpderetta wrote:
             | ... Which iirc was immediately identified as suspicious
             | during auditing.
        
               | a1369209993 wrote:
                | And yet it became an official standard anyway, and was
                | occasionally actually used, despite the fact that it
                | was obviously backdoored to anyone who knew anything
                | about (elliptic-curve) cryptography. (It's literally a
                | textbook-exercise leaky RNG, of the sort that you would
                | find under "Exercise: create an elliptic-curve-based
                | RNG that leaks seed bits within N bytes of random
                | data." in an actual cryptography textbook.)
        
               | tptacek wrote:
               | You don't really need to understand elliptic curves to
               | understand Dual EC. It's a public key RNG. The
               | vulnerability is that there's a matching private key.
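                | 
                | A toy sketch of that structure, using a multiplicative
                | group instead of an elliptic curve (so this is not the
                | real DRBG - toy constants, no output truncation - just
                | the shape of the trapdoor):
                | 
                |     # Toy analogue of Dual EC: output = Q^state,
                |     # next state = P^state, where P = Q^d for secret d
                |     p = 2**127 - 1        # a convenient prime modulus
                |     Q = 3                 # public constant ("point Q")
                |     d = 0xC0FFEE          # the matching private key
                |     P = pow(Q, d, p)      # public constant ("point P")
                | 
                |     def step(state):
                |         out = pow(Q, state, p)   # what the RNG emits
                |         nxt = pow(P, state, p)   # next internal state
                |         return out, nxt
                | 
                |     seed = 123456789
                |     out1, state2 = step(seed)    # honest user
                |     out2, _ = step(state2)
                | 
                |     # Whoever knows d turns one output into the next
                |     # state: out1^d = Q^(seed*d) = P^seed = state2
                |     recovered = pow(out1, d, p)
                |     assert recovered == state2
                |     assert step(recovered)[0] == out2   # predictable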
        
               | a1369209993 wrote:
               | True, but my parenthetical was covering the opposite
               | issue: it's _possible_ to not realise DUAL_EC_DRBG is
               | broken (rather than impossible _to_ realise it) if your
               | only knowledge of cryptography is, say, hash functions
               | and stream ciphers (so you don 't recognise public key
               | cryptography from looking at it). It's _unlikely_ ,
               | because DUAL_EC_DRBG is really obviously broken, but I
               | wouldn't fault someone who knew nothing about elliptic-
               | curve cryptography for missing it, even if they were
               | familiar with other types of cryptography. (I _would_
               | fault them for claiming that it 's secure, rather than
               | recognizing that they don't know enough to evaluate its
               | security, but you can't conclude something's backdoored
               | _just_ from that.)
        
               | anonym29 wrote:
               | The assertion I was refuting was that they couldn't be
               | easily inserted into an audited library, not that they
               | wouldn't be detected.
        
           | tptacek wrote:
           | You certainly can.
        
         | stavros wrote:
         | The bigger issue here is that there's an intentional
         | vulnerability.
        
         | umvi wrote:
          | Obscurity should never replace security, but it can and does
          | augment security by increasing the cost of even studying the
          | system's security.
        
         | ok123456 wrote:
          | It's more a matter of intentionally reducing the keyspace
          | when generating keys. You can use weakly generated keys with
          | industry-standard encryption algorithms. When your 4096-bit
          | key effectively has only 32 bits of entropy, it doesn't
          | matter how well-trusted the algorithm is.
        
           | tptacek wrote:
           | I just skimmed the paper but it looked to me like the key
           | generation is the same in all profiles, but the TEA1 case has
           | a key setup that compresses the generated key down to 32
           | bits.
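            | 
            | Schematically that bottleneck is the whole story. A
            | minimal sketch (the compression and keystream functions
            | here are made up - the real TEA1 internals are proprietary
            | and have nothing to do with SHA-256 - the point is only
            | that 2^32 work breaks it regardless of the 80-bit key):
            | 
            |     import hashlib, os
            |     from itertools import product
            | 
            |     def key_setup(key80: bytes) -> bytes:
            |         # stand-in for the 80 -> 32 bit compression
            |         return hashlib.sha256(key80).digest()[:4]
            | 
            |     def keystream(k32: bytes, n: int) -> bytes:
            |         # stand-in keystream generator
            |         return hashlib.sha256(k32).digest()[:n]
            | 
            |     key80 = os.urandom(10)           # full 80-bit key
            |     frame = b"TETRA test frame"
            |     ks = keystream(key_setup(key80), len(frame))
            |     ct = bytes(p ^ s for p, s in zip(frame, ks))
            | 
            |     # Attacker ignores the 80-bit key and walks the 2**32
            |     # post-setup keys (slow in pure Python, trivial with
            |     # optimized code)
            |     def brute_force(ct, crib=b"TETRA"):
            |         for cand in product(range(256), repeat=4):
            |             k32 = bytes(cand)
            |             ks = keystream(k32, len(ct))
            |             pt = bytes(c ^ s for c, s in zip(ct, ks))
            |             if pt.startswith(crib):
            |                 return k32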
        
           | jnwatson wrote:
           | The researchers found several problems. The backdoor seems
           | intentional, but the others do not. They broke the TAA
           | protocol.
        
         | xyzzy4747 wrote:
         | Security has many layers. Obscurity can be one of them.
        
           | barsonme wrote:
           | Obscurity can certainly be part of defense in depth, but it
           | unequivocally does not make anything more (meaningfully)
           | _secure_.
           | 
           | For example, hiding the fact that your data is encrypted with
           | AES doesn't make an attacker any more likely to be able to
           | break AES. Similarly, hiding the fact that you use a weak
           | encryption algorithm doesn't keep an attacker from breaking
           | it.
        
         | londons_explore wrote:
         | And yet this one lasted 30 years. That's far longer than most
         | open encryption algorithms continue to be deemed secure.
         | 
          | Obviously you can debate whether having it 'appear' secure
          | for longer before someone publishes details of the flaw is
          | more important or not...
        
           | nicce wrote:
           | > And yet this one lasted 30 years.
           | 
            | The main goal of security through obscurity is hindrance:
            | make it slower and harder to detect possible
            | vulnerabilities.
           | 
           | So indeed, there is something to debate.
           | 
           | But I guess it helps only against those with limited
           | resources, not against nation states.
        
             | swells34 wrote:
              | This is analogous to physical security doors. They are
              | considered passive security, since they are a deterrent,
              | and are rated by the number of hours they are expected to
              | hold up against hand tools.
        
             | xmprt wrote:
             | Is it still true that nation states are at the forefront of
             | innovation and the largest security threats? At least in
             | the United States, I'd be surprised to learn that their
             | best and brightest minds are working in three letter
             | government agencies when they can work in industry for more
             | money and less bureaucracy.
        
               | tptacek wrote:
               | Yes. Additionally, there are extensive public/private
               | partnerships.
        
             | fmajid wrote:
             | > Main goal of security through obscurity is the hindrance
             | 
             | No, the main goal is to obfuscate just how incompetent the
             | authors of the spec are, and how clearly they illustrate
             | Dunning-Kruger.
        
               | nicce wrote:
               | > No, the main goal is to obfuscate just how incompetent
               | the authors of the spec are
               | 
               | If you agree that it obfuscates the meaning of the
               | author's work, then it also slows down other things
               | recursively...
        
           | perlgeek wrote:
           | It lasted 30 years in the sense it hasn't been publicly
           | broken before.
           | 
           | We don't know how many intelligence agencies have found some
           | of these and are happily listening in on "secure"
           | communication, concealing that fact successfully.
        
             | michaelt wrote:
             | Aren't these encrypted radios mostly for cops?
             | 
             | I mean, this is embarrassing - but who cares if the secret
             | police are spying on the regular police?
        
               | dghlsakjg wrote:
                | Whose secret police are spying on the civilian police?
               | 
               | Is it more concerning if it's the Russian secret police
               | spying on the Kyiv police?
        
               | perlgeek wrote:
                | Does the FBI use these? The FBI is tasked with
                | counterintelligence, and for a spy it could be highly
                | relevant to learn if they are being targeted.
        
             | nicce wrote:
             | This argument holds for any non-disclosed vulnerabilities,
             | however.
        
           | inopinatus wrote:
           | The TEA1 key compression weakness may have been known to
           | intelligence agencies as early as 2006. See
           | https://www.cryptomuseum.com/radio/tetra/ under section
           | "Compromise".
        
           | jjav wrote:
           | > And yet this one lasted 30 years.
           | 
           | What do you mean lasted? If it is an intentional backdoor, it
           | was vulnerable (to those who knew the backdoor) from day 1,
           | so it was never secure let alone 30 years.
        
           | snvzz wrote:
           | The publicly known attacks are recent, yes.
           | 
           | I know some group had it pwned at least 2010-ish. But won't
           | elaborate.
           | 
           | And I'm sure they weren't the first, nor the only ones.
        
       | k8svet wrote:
       | What exactly were TETRA radios used for? I assume they were
       | government/infra related, but then I don't understand why they'd
       | need to backdoor the keying
        
         | tptacek wrote:
          | It's not so much that they backdoor the keying as that there
          | are 4 different cipher profiles, and the one approved for
          | global rather than European use (TEA1) compresses the key
          | from 80 to 32 bits.
         | 
         | It's essentially a surreptitious version of what the US did in
         | the 1990s with "export ciphers".
        
           | gwbas1c wrote:
           | Which makes me question describing this as a "deliberate
           | backdoor."
        
             | tptacek wrote:
             | It's pretty clearly a deliberate backdoor.
        
               | swells34 wrote:
               | And that is supported by the known past actions of "some
               | government authorities". This is definitely not the first
               | time the US government has deliberately sabotaged crypto.
        
               | tptacek wrote:
               | This isn't an American product.
        
             | wkat4242 wrote:
             | It's deliberate in making the crypto so weak that our guys
             | can decrypt their guys' radio traffic.
             | 
             | How's that not a backdoor?
        
         | mcpherrinm wrote:
         | They are used for many things, like fire, ambulance, railways,
         | harbour operations, police, military, coast guard, and so on.
         | 
         | The weaker cipher mode, TEA1, is used when selling the radios
         | to anyone who may not necessarily be an ally or highly trusted.
         | This is the legacy of strong crypto being export-controlled.
         | 
         | It was public that these ciphers were weaker, but they were
         | actually much weaker than advertised. This is the backdoor.
        
         | qwertox wrote:
          | I think the most relevant use, in the context of a deliberate
          | backdoor, is its use by police and military forces.
          | Apparently some energy providers also use it for
          | remote-control tasks (no voice).
        
         | timthorn wrote:
         | There was also the Dolphin network in the UK, offering a public
         | national subscription TETRA network. It didn't prove
         | commercially viable.
         | 
         | https://www.rcrwireless.com/19980309/archived-articles/dolph...
        
       | wyck wrote:
        | The newsworthy item here is that this is an intentional
        | backdoor. The Wikipedia page lists the specific uses per
        | country and department.
        | https://en.wikipedia.org/wiki/Terrestrial_Trunked_Radio#Usag...
        
         | H8crilA wrote:
          | Do you remember when cryptography export was controlled? It
          | was implemented by limiting key size to a certain number of
          | (effective) bits (of security). This suite is just a victim
          | of that law, as it is a 1990s design.
        
           | wkat4242 wrote:
           | And when people were saying it was a stupid thing? This is
           | one of the many examples that prove it.
        
           | tptacek wrote:
           | It's not "just" a victim of that law unless they disclosed
           | that the export cryptography protocol was trivially
           | breakable. Export cryptography in the 1990s US was
           | documented.
        
           | creato wrote:
           | Reading the wiki page, it seems to be a European standard.
           | The law you are referring to sounds like a US law.
        
       | marcus0x62 wrote:
       | The interview that is linked[0] in the footnotes of the article
       | with the person from ETSI is absolutely wild... Some excerpts:
       | 
        | > KZ (interviewer): How did it go about meeting those
       | requirements, because that's the one they're saying has a
       | backdoor in it. Was that the condition for export?
       | 
       | > BM (ETSI): Backdoor can mean a couple of things I think.
       | Something like you'd stop the random number generator being
       | random, for instance. [But] what I think was revealed [by the
       | researchers] was that TEA1 has reduced key-entropy. So is that a
       | backdoor? I don't know. I'm not sure it's what I would describe
       | as a backdoor, nor would the TETRA community I think.
       | 
       | ...
       | 
       | > KZ: People ... believe they're getting an 80-bit key and
       | they're not.
       | 
       | > BM: Well it is an 80-bit long key. [But] if it had 80 bits of
       | entropy, it wouldn't be exportable.
       | 
       | ...
       | 
        | > KZ: You're saying 25 years ago 32 bit would have been secure?
       | 
       | > BM: I think so. I can only assume. Because the people who
       | designed this algorithm didn't confer with what was then EP-TETRA
       | [ETSI Project-TETRA is the name of the working group that oversaw
       | the development of the TETRA standard]. We were just given those
       | algorithms. And the algorithms were designed with some assistance
       | from some government authorities, let me put it that way.
       | 
       | ...
       | 
        | > BM: That's what we now know yeah - that it did have a reduced
       | key length.
       | 
       | > KZ: What do you mean we now know? SAGE created this algorithm
       | but the Project-TETRA people did not know it had a reduced key?
       | 
       | > BM: That's correct. Not before it was delivered. Once the
       | software had been delivered to them under the confidential
       | understanding, that's the time at which they [would have known].
       | 
       | ...
       | 
       | You've really got to wonder who at ETSI gave the thumbs up on
       | doing this interview.
       | 
       | 0 - https://www.zetter-zeroday.com/p/interview-with-the-etsi-
       | sta...
        
         | sillysaurusx wrote:
          | The researchers added a footnote explicitly refuting the
          | claim that 32-bit keys were secure 25 years ago, too.
         | 
         | > The Midnight Blue researchers have since demonstrated real-
         | life exploitations of some of the vulnerabilities, for example
         | at the 2023 Blackhat Conference in Las Vegas (USA). They have
         | shown that TETRA communications secured with the TEA1
         | encryption algorithm can be broken in one minute on a regular
         | commercial laptop and in 12 hours on a classic laptop from 1998
         | [III].
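          | 
          | Both figures are roughly what a plain 2^32 exhaustive
          | search would predict:
          | 
          |     keyspace = 2**32
          |     print(keyspace / 60)            # ~7.2e7 keys/s, 1 min
          |     print(keyspace / (12 * 3600))   # ~1.0e5 keys/s, 12 h
          | 
          | Tens of millions of trial decryptions per second is easy
          | on a current laptop, and ~100k/s is plausible for 1998
          | hardware, so the claimed times are consistent with a
          | straightforward key search.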
        
           | marcus0x62 wrote:
           | In the mid-late 90s, 40-bit encryption was common due to US
           | export control restrictions, and even then, that was thought
           | to be insecure against a nation state attacker.
           | 
           | In 1998, the EFF built a custom DES Cracker[0] for around
           | $250k that could crack a 56-bit DES message in around 1 week.
           | As was the custom at the time, they published the source
           | code, schematics, and VHDL source in a printed book to evade
           | (and, I guess, mock) export restrictions.
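            | 
            | For scale against a 32-bit key space (rough arithmetic,
            | taking the ~1 week figure at face value):
            | 
            |     ratio = 2**56 // 2**32     # 16_777_216
            |     print(ratio)               # ~16.8 million times larger
            |     print(7 * 24 * 3600 / ratio)    # ~0.036 s for 2**32
            | 
            | So hardware that swept 56-bit DES in about a week would
            | cover a 32-bit key space in well under a second - even by
            | 1998 standards, 32 bits was not a serious key length.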
           | 
           | 0 - https://en.m.wikipedia.org/wiki/EFF_DES_cracker
        
             | clankyclanker wrote:
              | (If that's the case I'm thinking of) it was actually
              | documented as a challenge to export restrictions; mocking
              | them was merely a pleasant byproduct.
             | 
             | The EFF's legal challenge was essentially that if crypto is
             | a munition, then this printed book explaining the crypto is
             | also at least as much of a munition, if not more so. They
             | gave the judge the choice between deciding that a printed
             | book is some sort of deadly tool, or deciding that crypto
             | wasn't conceptually a munition. Strangely, the judge ruled
             | in the EFF's favor.
        
               | marcus0x62 wrote:
               | That was Phil Zimmerman's book containing the PGP source
               | whixh was published a few years before the Deep Crack
               | book.
               | https://philzimmermann.com/EN/essays/BookPreface.html
        
       | neilv wrote:
        | > _Two of the vulnerabilities are deemed critical. One of them
        | appears to be an intentional backdoor [...] Reading the
        | contents of a firmware upgrade is not trivial though, as it is
        | heavily encrypted and relies on a Trusted Execution Environment
        | (TEE), embedded in the core processor of the radio._
       | 
       | I don't know whether the backdoor allegation is correct, but
       | unfortunately we should treat opaque ostensible security with
       | skepticism.
       | 
       | By their nature, such things often can be used for our protection
       | at the same time they are secretly used against us.
        
       ___________________________________________________________________
       (page generated 2023-12-09 23:00 UTC)