[HN Gopher] Attacking Titan M with Only One Byte
___________________________________________________________________
Attacking Titan M with Only One Byte
Author : ZeroCool2u
Score : 139 points
Date : 2022-08-15 12:51 UTC (10 hours ago)
(HTM) web link (blog.quarkslab.com)
(TXT) w3m dump (blog.quarkslab.com)
| ISL wrote:
| I'm very happy to see that a vulnerability introduced in the May
| 2022 update was found, diagnosed, and fixed in a June 2022 update.
|
| After something went wrong, a bunch of things went right very
| quickly. Nice to have good news on a Monday morning.
| bumbledraven wrote:
| It's a bit confusing to me. The doc says:
|
| > ... the vulnerable firmware was introduced by Google's Pixel
| security update of May 2022.
|
| However, further down, in the timeline section, the doc
| indicates that the issue was reported before May 2022, all the
| way back in March 2022:
|
| > 2022-03-02: Vulnerability reported. Opened issue 222318108 in
| Google's Issue Tracker, providing a technical description of
| the bug and a Proof of Concept (PoC) exploit...
| [deleted]
| trhway wrote:
| Sounds like amateur hour at that Google team. While the post's
| authors put the blame on the unsafety of C, the absence of user
| input validation, like that integer from a message, is a path to
| a very unhappy place regardless of language. The rest of the
| exploited spots in that Titan software seem similarly sloppy.
| atwood22 wrote:
| Most languages don't let you (or at least make it hard to)
| directly convert user input into memory locations though. The
| scope of the issue in other languages would likely be much more
| limited.
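|
| Roughly the kind of thing C happily allows (a made-up sketch of
| the bug class, not the actual Titan M code):
|
|     #include <stdint.h>
|
|     /* attacker-controlled message, e.g. parsed off the SPI bus */
|     struct msg { uint32_t index; uint8_t value; };
|
|     static uint8_t table[16];
|
|     void handle(const struct msg *m) {
|         /* no bounds check: m->index comes straight from the wire,
|            so this single-byte write can land far past the table */
|         table[m->index] = m->value;
|     }
|
| With a check like m->index < sizeof(table) the write stays
| confined; without it, one byte lands wherever the sender chooses.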
| trhway wrote:
| That's unrelated. For example, the input may be used only as a
| parameter for read operations - i.e. one can easily imagine a
| situation where, even in the safest language, using unvalidated
| input results in a call/query producing info outside of what
| would be expected for valid parameters.
| alexbakker wrote:
| This is amazing work!
|
| I was surprised to see that the reward was set at 10k initially.
| Granted, it was bumped to 75k later, but even that seems on the
| low side considering the degree of compromise that occurred here.
|
| I may have given up too early during my (fairly brief) research
| on CVE-2019-9465. I let the lack of firmware source code
| availability stop me at the time, but in hindsight the presence
| of "0dd0adde0dd0adde" in the ciphertext likely indicated a crash
| in Titan M as well. Perhaps there would have been a similarly
| interesting path to exploitation there.
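|
| (Read as little-endian 32-bit words, those bytes decode to
| 0xdeadd00d twice, which looks like a "dead" sentinel value rather
| than real ciphertext. A quick sketch of the decoding, assuming a
| little-endian host:)
|
|     #include <stdio.h>
|     #include <stdint.h>
|     #include <string.h>
|
|     int main(void) {
|         /* "0dd0adde0dd0adde", hex-decoded into raw bytes */
|         const uint8_t raw[8] = {0x0d, 0xd0, 0xad, 0xde,
|                                 0x0d, 0xd0, 0xad, 0xde};
|         for (size_t i = 0; i < sizeof(raw); i += 4) {
|             uint32_t w;
|             memcpy(&w, raw + i, sizeof(w));
|             printf("0x%08x\n", (unsigned)w); /* 0xdeadd00d, twice */
|         }
|         return 0;
|     }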
| vlovich123 wrote:
| The one bit I didn't understand was how they bypassed W/RX. How
| did they manage to get the new code to be marked as RX after
| writing?
|
| I thought I read the whole thing. Did I miss that explanation?
| [deleted]
| teo_zero wrote:
| AIUI, they didn't inject code, just mangled the stack to hijack
| the execution flow towards specific code fragments ("gadgets")
| already present in executable memory.
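|
| The classic shape of it, as a toy C sketch (a single return-address
| overwrite into existing code rather than a full gadget chain;
| purely illustrative, the offset is hypothetical, and real targets
| add canaries and other protections):
|
|     #include <stdint.h>
|     #include <stdio.h>
|     #include <string.h>
|
|     /* code that already exists in the image; stands in for a
|        "gadget" or privileged routine the attacker wants to reach */
|     static void already_there(void) {
|         puts("jumped into existing code, nothing was injected");
|     }
|
|     /* vulnerable handler: copies an attacker-chosen message into
|        a fixed-size stack buffer without checking the length, so
|        the copy can clobber the saved return address */
|     static void handle(const uint8_t *msg, size_t len) {
|         uint8_t buf[16];
|         memcpy(buf, msg, len);
|         (void)buf;
|     }
|
|     int main(void) {
|         uint8_t payload[40];
|         void (*target)(void) = already_there;
|
|         memset(payload, 'A', sizeof(payload));
|         /* offset of the saved return address is compiler/ABI
|            dependent; 24 is just for illustration */
|         memcpy(payload + 24, &target, sizeof(target));
|
|         handle(payload, sizeof(payload));
|         return 0;
|     }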
| keepquestioning wrote:
| Hmm, doesn't ARM have mitigations against return-oriented
| programming?
| pmalynin wrote:
| Yeah, some devices support PAC and use that feature to sign
| return pointers. But not everyone uses it (even when available),
| and there are ways to bypass PAC, from attacking the
| microarchitecture to finding signing oracles.
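|
| On AArch64 it's mostly a compiler flag; a rough sketch of what
| pac-ret does to a non-leaf function (my paraphrase of typical
| gcc/clang output, not anything taken from the Titan M firmware):
|
|     /* build for AArch64 with:
|        cc -O2 -mbranch-protection=pac-ret -c pac_demo.c
|        the compiler then brackets the function roughly like:
|            paciasp            ; sign the return address in LR
|            stp x29, x30, ...  ; spill frame pointer + signed LR
|            bl  helper         ; body
|            ldp x29, x30, ...
|            autiasp            ; authenticate LR; a forged value
|            ret                ; ends up faulting
|     */
|     int helper(int x);
|
|     int protected_fn(int x) {
|         return helper(x) + 1;
|     }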
| muricula wrote:
| PAC (pointer signing) and Branch Target Identification are
| not available on 32-bit ARM chips, and judging by the
| assembly in the blog post, the Titan M is a 32-bit chip.
| ZeroCool2u wrote:
| This is an elegant attack that effectively compromises all Titan
| M chips. They were even able to dump all securely stored private
| cryptographic keys, which Google acknowledges in the disclosure
| timeline.
|
| Even so, the award Google initially gave was only $10k USD(!).
| They finally bumped it to $75k USD after a complaint and review,
| but Google's bug bounty program advertises payouts of up to $1
| Million USD.
|
| If fully compromising Google's own security chip to dump all
| private keys isn't worth the full $1 Million bounty, I honestly
| don't know what is.
|
| Really, what would, in the mind of those on the internal
| committee, constitute justification for the $1 Million bounty?
| rob_c wrote:
| I think the $1M is probably reserved for something that would
| tank Google's stock, imo, but maybe I'm cynical.
| dylan604 wrote:
| Nah, you're not cynical enough. My cynical take is there is
| nothing they would ever pay $1M for. "Up to" is a marketing
| term to get people to think that if they work hard enough,
| they might qualify for this mythical unicorn bounty, but at
| the end of the day they just get peanuts.
| Lorin wrote:
| "You might win this car!"
| rob_c wrote:
| Really sign me up!
| dylan604 wrote:
| First, we'll need you to fill out this form with some basic
| information:
|       --First, Middle, Last Name
|       --Phone number, email address, social contacts
|       --Mother's Maiden Name
|       --First concert you attended
|       --Name of the street you grew up on
|       --Name of your best friend
|       --Name of your first pet
|       --Make/Model of your first car
| izacus wrote:
| > Really, what would, in the mind of those on the internal
| committee, constitute justification for the $1 Million bounty?
|
| Probably something that doesn't require prolonged physical access
| to the device to extract the keys?
| joshuamorton wrote:
| Quoting a random article from when this was initially
| announced:
|
| > Google says that if researchers manage to find "a full chain
| remote code execution exploit with persistence" that also
| compromises data protected by Titan M, they are willing to pay
| up to $1 million to the bug hunter who finds it.
|
| So a compromise that doesn't require physical access or root,
| presumably.
|
| Cue also the inevitable discussion that bug bounties are too
| low.
| posnet wrote:
| A remote 0-day on all Google internal infra?
| ClumsyPilot wrote:
| That's literally worth billions, and could be sold to many
| governments.
|
| If thr right people don't buy these zero-days, yhe wrong
| people will.
| google234123 wrote:
| Billions? Couldn't a government just get a person
| affiliated with them hired by Google?
| shadowtamperer wrote:
| You managed to spell "the" wrong in 2 different ways, 2
| different times. Good job.
| solmanac wrote:
| Probably an error-shift-encoded one-time pad scheme. The
| post is the publicly distributed ciphertext.
| NovemberWhiskey wrote:
| From my point of view, the fact that the exploit requires either
| running as root on a rooted device or direct access to an
| internal serial bus is enough to justify not paying at the top
| level.
|
| These things should probably be more transparent, but I would
| assume the $1M level would be for exploits that could be
| deployed on a fresh-from-the-box device with no rooting/mods.
| medo-bear wrote:
| Is internal serial bus access an issue? Your phone could get
| stolen.
| NovemberWhiskey wrote:
| Sure; but it's a sliding scale, right? Zero-click network
| exploits are more severe than interactive exploits, which are
| more severe than those that require physical access, etc.
| miohtama wrote:
| For a 2018 chip and for a company like Google, the decision to
| go with C despite all their knowledge of C/C++ memory issues
| (hello Chrome) is a bit sad.
| Lt_Riza_Hawkeye wrote:
| I would imagine it wasn't a hard decision - they likely
| needed to build something in an environment where they are
| already paying 40-80k C++ developers, and I would guess
| something like 1-10k Rust developers, who are scattered
| around various teams and may not want to hop on a new team
| right now. Also it was released to the consumer market in
| 2018, so probably built in 2016-17. Rust and other memory-
| safe systems programming alternatives didn't have nearly the
| same uptake back then - so maybe 1-2k would be a safer bet.
| Not to mention, Rust didn't get tier-1 ARM support until 2021
| - and even then, that's only when running on Linux, which the
| Titan M chip is most likely not running.
| Svoka wrote:
| Google's bug bounty program is a sham. When I tried to report a
| critical vulnerability in their authentication system, they told
| me it was a 'feature' and threatened to sue if I disclosed it.
|
| Then they fixed it two and a half years later, and wrote an
| article about what a complicated bug they had just found and how
| proud and secure they are with their pentesters.
| deepsun wrote:
| Do you have a link?
| Svoka wrote:
| It was a looong time ago (2011). You could use an app-specific
| password to change the master password and disable 2FA, with a
| script.
|
| Funny thing: I discovered this myself when I lost my phone with
| the Authenticator app. I took my mail client password and
| discovered that with some header magic I was able to hijack my
| own account. I couldn't believe it. So I created another
| account, protected it with 2FA, and did the same thing. Got
| gaslighted by the Google bug bounty team and decided it wasn't
| worth it.
| Lorin wrote:
| On what grounds could they sue you if it was truly considered
| a "feature", and they used that exact terminology in their
| response?
| Spivak wrote:
| You have inadvertently discovered the value of pentesters, which
| is making the handling of security issues a win for management.
| rurban wrote:
| Those committees have just lost contact with the real world.
| Ridiculous.
| tgv wrote:
| I guess there's someone who orders the marketing department
| to say "1 million", while telling the operational side "10k",
| because his bonus rides on it.
| UncleMeat wrote:
| That's not how it works. Bug bounties work like this.
|
| Somebody sets up a bounty program and defines a framework
| for deciding how much to pay out. Security is complicated
| as hell and you cannot possibly devise a framework that
| accounts for all possible things so this framework is
| necessarily brittle. For example, you might reasonably
| decide that the highest payouts require very minimal
| attacker capabilities (fully remote unauthenticated attacks
| being the top payouts). This makes sense since those are
| the easiest attacks to mount.
|
| So now a bounty comes in. It goes to a triage person or, at
| best, a small group. They refer to the framework. Your bug
| doesn't really match any of the categories but it kind of
| looks like this thing over here so it gets bucketed as
| such. Maybe there is some discussion. Ultimately, the rules
| say "max payout requires unauthenticated remote attacks" so
| the payout ends up lower, even if the attack is exciting.
| Maybe somebody managing the system takes a note to update
| the framework and policy moving forward. The community then
| rages about how this bug is actually a big deal and
| deserves a lot of reward.
|
| In my experience, the people managing these programs get
| rewarded based on the amount they pay out _going up_ , not
| down. But you need a payment framework otherwise each bug
| is paid out on somebody's whim (and trust me, the security
| researchers will complain to high heaven if they perceive
| inconsistency in bounty sizes). So you end up with novel
| bug structures that aren't handled well by the framework
| and get treated weirdly.
| eklitzke wrote:
| I'm still surprised by the $10k payout given the
| disclosure timeline. For example they indicate that on
| 2022-05-04 there was a "conference call between Quarkslab
| engineers, Google Android Security Team members, and a
| Titan engineering team member." Since there were both
| Android security team members and a Titan engineer in the
| phone call there were clearly engineers in the loop who
| understood the technical details of the issue as well as
| the severity. The initial $10k payout was done on
| 2022-06-07, a month later.
|
| One would imagine that this would have been escalated to
| some pretty senior security folks at Google before the
| payout was decided. That would mean that there would be
| some amount of discretion on Google's end as to the
| payout, since there would (presumably) be someone senior
| enough to look at this closely and authorize a higher
| amount even if there was some rubric that might seem to
| award a lower amount. Obviously this is ultimately what
| happened, as they eventually did increase the payout.
| It's a little strange to me though that this wasn't done
| sooner.
| UncleMeat wrote:
| > One would imagine that this would have been escalated
| to some pretty senior security folks at Google before the
| payout was decided.
|
| Bug bounties are routine. "How much do you want to pay
| out" is way down the list of things that leadership is
| focused on for these things. "How do we mitigate this"
| and "how does the researcher get paid" are often
| questions owned by different people and teams. Directors
| aren't swooping in to make payout decisions.
| medo-bear wrote:
| I would imagine a higher bounty would be for extracting device
| keys. From my reading, this exploit would not allow you to
| extract the key needed to unlock a powered-off device. I'm no
| security expert, so please correct me if I'm wrong.
| ZeroCool2u wrote:
| >2022-06-20: Quarkslab sent Google a new exploit that
| demonstrates code execution on the chip and exfiltration of
| encryption keys from it. A detailed description of the
| exploitation technique, the exploit's source code, and a
| video showing its use to exfiltrate a StrongBox-protected AES
| key were provided.
|
| This sounds close enough to me, but perhaps there's some
| subtle nuance between device keys and other keys in the chip.
| medo-bear wrote:
| > perhaps there's some subtle nuance between device keys
| and other keys in the chip.
|
| That's what I'm wondering too, particularly this line from the
| mitigations section of the report:
|
| >> However, we do want to point out an interesting feature
| that would have made the StrongBox key blob leak
| impossible. Indeed, an application can create a key that is
| authentication-bound, specifying
| setUserAuthenticationRequired(true) when building it with
| KeyGenParameterSpec. This way, users need to authenticate
| before using the key and the key blob is encrypted a second
| time using a special key derived from the user password
| that we do not have.
|
| So unless your phone doesn't have a password, I don't see how
| they can retrieve device keys.
| ZeroCool2u wrote:
| Yeah, okay I definitely missed that part. Makes more
| sense. Still, the bounty seems shockingly low. They
| probably could've gotten a lot more for it on the open
| market.
| medo-bear wrote:
| Yeah, I think it's a noteworthy attack and definitely worth more
| than $10k. Also, Google should be more open about how bug
| bounties are evaluated.
| lnyng wrote:
| For a moment I thought this was something related to the anime.
| [deleted]
| Stevvo wrote:
| I was pulled in hoping for a historical tale of attacking
| vulnerabilities in Titan ICBMs.
| flyaway123 wrote:
| Just curious - is the usage of the acronym here a soft
| euphemism? (So that only those who know, or care to know, get
| it.)
|
| edit: Thanks for the clarifications. That helps. I'm asking for
| 2 reasons: 1. Discussing "nukes" openly where I come from would
| raise some eyebrows. 2. I see acronyms used on HN frequently -
| sometimes ambiguously, even considering the context.
| HideousKojima wrote:
| I don't think so, people could just say "nukes" but there
| are plenty of nukes that aren't capable of hitting targets
| halfway around the world. ICBM seems like the fastest way
| of saying "nukes that can hit stuff really far away" while
| also making a distinction from sub launched and cruise
| missile launched nukes.
| tialaramex wrote:
| The payload on the missile doesn't need to be a nuclear
| weapon of any sort. The important thing about the ICBM is
| that being further away doesn't stop it. Nuclear weapons
| are the obvious choice because it's hard to imagine _why_
| you want to strike something so very far away, at such
| great expense, with conventional explosives.
|
| The German V2 rocket from World War II had a maximum range of
| about 320 km, so you literally couldn't fire one from, say,
| Berlin to London. They were actually launched from coastal
| sites in the Netherlands and other occupied countries, and as
| the Allies took territory after Overlord, the targets changed
| to cities nearer Germany because the launchers were pulled
| back.
| anamexis wrote:
| ICBM is a very common term. For example, the NY Times uses
| it in headlines.
| https://www.nytimes.com/2022/05/24/world/asia/north-korea-
| ba...
| its_bbq wrote:
| That would be "only one _bite_ "
| [deleted]
| 2OEH8eoCRo0 wrote:
| > As a reminder, there are two conditions to perform this attack.
| First, we need to be able to send commands to the chip, either
| from a rooted device (required to use nosclient), or physically
| accessing the SPI bus.
|
| > Then, we need a way to access the key blobs on the Android file
| system, which can be done again by being root, or with some
| exploit to bypass File Based Encryption or the uid access
| control.
| userbinator wrote:
| A lot of software can be cracked "with only one byte". Finding
| which one is the hard part.
|
| Don't lose sight of the fact that the purpose of this and other
| TPM-like devices is to hide secrets from their owner.
| tialaramex wrote:
| > hide secrets from its owner.
|
| It makes sense to use exactly the same technology even if you
| are "the owner" unless you are somehow only ever running
| software you wrote on data you obtained, and maybe not even
| then if other people are able to influence that data.
|
| Most of us use a lot of software we didn't write, to process
| data we got from some third party who may or may not have our
| best interests in mind.
| ClumsyPilot wrote:
| Well, there are cases where the owner and the current user
| aren't one and the same, like an ATM terminal, or when someone
| else is using your computer.
|
| The whole security landscape seems to have many catch-22s.
| IncRnd wrote:
| > Don't lose sight of the fact that the purpose of this and
| other TPM-like devices is to hide secrets from its owner.
|
| That's a complete misunderstanding of a TPM's security model. A
| TPM guards against key theft in a compromised environment by
| securely storing artifacts and authenticating the platform. The
| user doesn't enter this threat picture. It is the platform that
| gets authenticated, not the user.
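|
| The core mechanism behind "authenticating the platform" is the
| PCR extend operation: each boot-stage measurement is folded into
| a register as PCR = H(PCR || measurement), so the final value
| attests to the whole chain. A minimal sketch of that folding
| using OpenSSL's SHA-256 (illustrative only, not a real TPM
| interface; the stage names are made up):
|
|     #include <stdio.h>
|     #include <string.h>
|     #include <openssl/sha.h>
|
|     /* fold one measurement into a PCR-like register:
|        pcr = SHA256(pcr || measurement) */
|     static void extend(unsigned char pcr[SHA256_DIGEST_LENGTH],
|                        const unsigned char *meas, size_t len) {
|         SHA256_CTX ctx;
|         SHA256_Init(&ctx);
|         SHA256_Update(&ctx, pcr, SHA256_DIGEST_LENGTH);
|         SHA256_Update(&ctx, meas, len);
|         SHA256_Final(pcr, &ctx);
|     }
|
|     int main(void) {
|         unsigned char pcr[SHA256_DIGEST_LENGTH] = {0};
|         const char *boot_chain[] = {"bootloader", "kernel",
|                                     "initrd"};
|
|         for (size_t i = 0; i < 3; i++)
|             extend(pcr, (const unsigned char *)boot_chain[i],
|                    strlen(boot_chain[i]));
|
|         /* any change anywhere in the chain yields a different
|            final value, which is what gets checked or attested */
|         for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
|             printf("%02x", pcr[i]);
|         printf("\n");
|         return 0;
|     }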
| nyanpasu64 wrote:
| SafetyNet and "authenticating the platform" are used for
| remote attestation, so app authors can deny phones access to
| apps and services if they're running root/kernel-level CPU
| code chosen by the user (or by an attacker without access
| to the phone's OS signing keys) rather than by the phone's
| manufacturer.
| IncRnd wrote:
| That's a great example. Thanks!
| rob_c wrote:
| Very cool.
|
| I wonder why companies still leave the UART pins accessible.
| Fine, they're on the chip, but just removing the trace to slow
| down attack development is surely worth the cost of a board
| revision...
| dmitrygr wrote:
| If _THAT_ is your idea of security, I hope you do not work on
| any hardware whose security matters. The first thing anyone
| would do is find the pins and connect to them. It buys you
| nothing, and if anything it tells me that I should go look for
| them.
|
| Visible and labeled UART pins tell me that you've (hopefully)
| thought through the consequences of me having access to them.
| A hidden UART tells me that most likely nobody ever gave it
| half a thought.
| rob_c wrote:
| Do you have __any idea__ how difficult removing the chip and
| re-surface-mounting it for an attack is...
|
| Removing the trace means an extra step, which is the whole
| point. Ffs.
| dmitrygr wrote:
| Yes I do. Done it. At home. For fun. Which means anyone
| motivated to do it can easily get it done too.
| rob_c wrote:
| With the data intact, after etching a custom PCB for a custom
| chip? I'd be impressed if that skillset overlaps with someone
| hacking bytecode.
| nickzana wrote:
| Isn't it better to leave them exposed and make it easier for
| security researchers who genuinely want to test the chip?
| Someone interested in and capable of developing and
| using/selling an exploit won't be deterred by needing a special
| cable to get a UART console, whereas a security researcher
| might appreciate the simpler access.
|
| So long as it doesn't weaken the actual security model,
| companies should make their products as easy to analyze as
| possible imo.
| rob_c wrote:
| Then sell a dev unit, surely, not a consumer-grade device...
___________________________________________________________________
(page generated 2022-08-15 23:01 UTC)