[HN Gopher] TKey is a RISC-V computer in a USB-C case, that can ...
___________________________________________________________________
TKey is a RISC-V computer in a USB-C case, that can run security
applications
Author : jandeboevrie
Score : 92 points
Date : 2023-12-25 17:26 UTC (5 hours ago)
(HTM) web link (dev.tillitis.se)
(TXT) w3m dump (dev.tillitis.se)
| goodpoint wrote:
| Way too expensive.
| SpaceNoodled wrote:
| And they couldn't even put application ROM on there.
| rdl wrote:
| Qty 1, $80 seems tolerable for a dev device. They should
| explicitly list 10 for $500 and 100 for $3000 or something
| though.
|
| This is competing with $5-30K HSMs as well as $30-400 "hardware
| wallets" or second factor keys, as well as (free) embedded
| secure enclaves.
| rwmj wrote:
| I would guess because of the FPGA. The advantage is it's
| (somewhat, arguably) more secure than a RISC-V ASIC from some
| random vendor.
| charcircuit wrote:
| Do not forget that they also need to recoup the $$$$$ spent
| getting it certified. For niche electronics this can be a big
| percentage of the price. Making something into an actual
| commercial product, especially globally, results in extra
| markup.
| treyd wrote:
| I'm kinda skeptical of security devices like this that don't have
| their own screens that let you know what you're authenticating.
| Obviously there are some applications where it's fine, but if
| the PC it's plugged into is compromised it could be MITMing the
| auth.
| gorkish wrote:
| Yes agree the end user needs feedback at the authenticator.
| This device has an RGB LED and a touch sensor, so there are
| still some possibilities.
| rdl wrote:
| I'd put a key inside my app which then talks out of band to
| multiple devices (including maybe a trusted one?) -- e.g.
| "run app on the device, request input using the directly-
| connected PC, message tunneled out to my phone for
| confirmation")
|
| A screen and 2-3 buttons would be nice, though; also a
| tamper-evident/tamper-responding package. But my main desired
| use case for this is a lightweight app-specific HSM so I'd
| just need to validate integrity of device before connecting
| it to host. USB Armory II is the main alternative.
| fmajid wrote:
| Have a look at Bunnie Huang's Precursor project, but it's much
| more expensive.
| rdl wrote:
| If they could add some hardware tamper-evidence/tamper-response
| (design-to-meet FIPS 140-3 level 3 or EAL whatever, although no
| need to actually get certified), this could be super useful as a
| cheap application-specific HSM (which can run app logic inside,
| rather than the lobotomized/zombie signing oracle PKCS "HSM" use
| which is common.)
|
| Obviously can prototype without this functionality, and can build
| out the toolchain/etc. I'm a lot more excited about Tropic Square
| than most alternatives, but this is shipping now.
| kfreds wrote:
| Regarding FIPS 140-2 level 2 tamper evidence and level 3 tamper
| response I'm interested to hear what you (and others here on
| HN) value and why.
|
| Level 2 can be accomplished with nail polish and glitter, or
| plastic potting. Yubikey's potting is a good example of level 2
| tamper evidence.
|
| For level 3 tamper response, e.g. of a rack-mounted server
| case, it is enough to put micro-switches under the lid. For
| that type of product level 2 tamper evidence could be
| accomplished by covering the server case's screws with copper
| cans stamped with serial numbers.
|
| When I first learned what is actually required for FIPS 140-2
| level 2 and 3 I thought it was security theater. Then I
| realized that in practice such a server case will be (1) in a
| locked server cabinet, in a server room with access control,
| within a building with access control. Add a two-person
| requirement for entering the room and unlocking the cabinet,
| and the attacker will have to be quite sophisticated and
| capable to not be discovered way before any tamper evidence or
| tamper response comes into play.
|
| In other words, if you don't care about FIPS for regulatory
| reasons, or insurance reasons, or company policy reasons, then
| how much does level 3 tamper response really matter?
|
| Regarding CC EAL, I'll simply say that I don't believe there is
| any other hardware security product that is more open source
| hardware and software than the TKey. Thanks to it being FPGA-
| based it is somewhat protected from various attacks on the
| supply chain. If you're looking to do an Ed25519 signature I
| don't believe there is a single device more simple or easily
| verifiable than the TKey and its Ed25519 signer application.
| I'd love to be proven wrong.
| wepple wrote:
| Can anyone comment on the design decision to not store device
| firmware? Is this common?
|
| I would've assumed it would generally be safer to have permanent
| but updatable firmware, to reduce the attack surface for loading
| malicious firmware
| wepple wrote:
| I guess as a follow-up, you possibly can't actually "update"
| firmware for this, without it creating entirely new keys that
| would have to be registered with every application
| kfreds wrote:
| You're mostly correct. The immutable firmware derives unique
| key material for each device+application combination.
| However, nothing prevents the loaded application from loading
| another application. This allows the developer to construct
| their own update logic. For instance, the first application
| could simply exist to verify a digital signature over some
| hash, then load an application, hash it, compare it to the
| trusted hash, and then execute it. The first application
| could hand over its own key material.
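|
| That chain-loading pattern might be sketched roughly like this
| (illustrative Python, not the actual firmware; `execute` and the
| preceding signature check are stand-ins, and the real device
| works on raw memory rather than Python objects):

```python
import hashlib
import hmac

def chain_load(next_app: bytes, trusted_hash: bytes, execute) -> None:
    """First-stage app: measure the next application and only hand
    over control if it matches a hash we already trust (e.g. one
    whose digital signature was verified earlier)."""
    measured = hashlib.blake2s(next_app).digest()
    if not hmac.compare_digest(measured, trusted_hash):
        raise ValueError("untrusted application, refusing to load")
    execute(next_app)

# Illustration with made-up bytes.
app = b"next-stage-app"
good_hash = hashlib.blake2s(app).digest()

loaded = []
chain_load(app, good_hash, loaded.append)
assert loaded == [app]

# A tampered application must be rejected before execution.
try:
    chain_load(b"tampered-app", good_hash, loaded.append)
except ValueError:
    pass
else:
    raise AssertionError("tampered app must be rejected")
```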
| kfreds wrote:
| Sure. Specifically regarding not storing firmware: The TKey
| does have roughly 6 KB of immutable firmware which is
| responsible for loading device applications sent by the host.
| This firmware receives and then hashes the application together
| with the Unique per Device Secret. The resulting hash is handed
| to the loaded application as key material, which is now unique
| per device+application combination. This model doesn't prevent
| storing applications on the device, but to enable the user to
| have flexibility to load other apps we would need to change
| things a bit.
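|
| The derivation described above can be sketched roughly as
| follows (illustrative Python using stdlib BLAKE2s; the UDS and
| application bytes are made up, and the firmware's exact
| construction may differ):

```python
import hashlib

def derive_app_key(uds: bytes, app_binary: bytes) -> bytes:
    """Measured boot as a KDF: hash the loaded application together
    with the Unique per Device Secret (UDS); the digest becomes the
    key material handed to the loaded application."""
    h = hashlib.blake2s()
    h.update(uds)
    h.update(app_binary)
    return h.digest()

# Hypothetical values for illustration only.
uds_device_a = b"\x01" * 32
uds_device_b = b"\x02" * 32
signer_v1 = b"ed25519-signer-v1"
signer_v2 = b"ed25519-signer-v2"

# Same app on different devices -> different key material.
assert derive_app_key(uds_device_a, signer_v1) != \
       derive_app_key(uds_device_b, signer_v1)
# Different apps on the same device -> different key material.
assert derive_app_key(uds_device_a, signer_v1) != \
       derive_app_key(uds_device_a, signer_v2)
```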
|
| Regarding security: This is a question of measured vs verified
| boot. Until the TKey, users of USB authenticators had to choose
| between security and flexibility. Yubikey is not updatable at
| all, but several USB authenticators have updatable firmware
| using signed firmware updates. Some of them are open source
| software, and of those a few are open source hardware in the
| sense that they provide BOM, schematics and PCB design. We do
| all of the above, and in addition we provide the HDL / "chip
| design" / FPGA configuration. In any case, having a USB
| authenticator which is updatable using a signed update begs the
| question of who is allowed to sign updates, which then leads to
| central control and a long tail of interesting use cases not
| being signed. TKey aims to combine security, openness and
| flexibility. We accomplish this thanks to unconditional
| measured boot as a KDF, as opposed to verified boot.
|
| There is more information on our website.
| rwmj wrote:
| Which FPGA are they using?
|
| Edit: There's a lot to like here, but a lot that is confusing. An
| FPGA-based version of PicoRV32 could be really secure. Your
| attack vector would be the FPGA vendor doing something at the
| hardware level (hard to pull off), or the toolchain being
| compromised.
|
| But what FPGA it is matters, also what toolchain they are using
| to programme the FPGA (yosys is not mentioned ...). The whole
| "locked down" FPGA bitstream sounds very fishy as well.
|
| PicoRV32 does fit into Lattice FPGAs and they are fully reverse-
| engineered and supported by yosys.
| kfreds wrote:
| FPGA: Lattice ice40up5k
|
| Toolchain: Yosys.
|
| For the convenience of most end users we configure and lock the
| FPGA. This allows them to start using it right away. The core
| cryptographic technology relies on a Unique per Device Secret
| (UDS). If we didn't lock the FPGA's configuration memory from
| reads a physical attacker would be able to read it out in
| seconds.
|
| Users that want to provision their own hardware design / FPGA
| configuration / bitstream can simply buy the TKey Unlocked and
| the TKey Programmer. They can then configure the on-die OTP
| NVCM and lock the FPGA themselves. Configuration and locking of
| the ice40up5k was not possible to do with open tooling until we
| made it happen, as part of the project to create the TKey.
|
| Since you seem knowledgeable it might interest you that:
|
| * The OTP NVCM uses antifuse technology, so it's most likely
| not possible to read out the UDS with an electron microscope.
| The physical attacker will have to circumvent the locking
| mechanism and read out the NVCM through probing.
|
| * One of the pins can be used to toggle SPI slave configuration
| mode even after NVCM has been configured and locked. This
| allows a physical attacker to configure their own bitstream.
| Unfortunately EBR and SPRAM also keep their state across warm
| reboots. As mitigations we (1) store the UDS in LCs until it is
| used by the KDF, (2) use our TRNG to randomize when the UDS
| readout happens, (3) accelerate the hashing (Blake2s G
| function) in LCs, (4) randomize address and data layout using a
| non-cryptographic PRP, and some other things I don't remember
| at the moment. Depending on the user's security concerns we
| recommend the use of a user-supplied secret in addition to the
| UDS. In that case the TKey by itself doesn't contain all the
| key material, making a physical attack insufficient. The KDF
| can be read in the manual.
|
| Edit: Clarified _physical_ attacker. Added details about the
| chip.
| wslh wrote:
| Genuine question: how many transistors and/or logic gates do we
| need to perform ECDSA and feel more secure than using an FPGA or
| other secure elements? I see that the secure element used by
| Ledger crypto wallets is [1]; would an ASIC be better at
| reducing the attack surface? And can they have memory inside, or
| just cache? I don't know much about electronics.
|
| [1]
| https://octopart.com/stm32wb55ccu6-stmicroelectronics-100293...
| kfreds wrote:
| The answer is, as always, it depends. I'll do my best to
| characterize the problem:
|
| If we only care about minimizing logic gates we could use
| SERV, the world's smallest RV32 core, and run a bare metal
| ECDSA implementation on it. Let's use it without the M
| extension, so RV32I. I'm not sure what SERV's max clock
| frequency is, but assuming we configured it on the ice40 and
| it runs at 40 MHz I'm guessing a single ECDSA signature would
| take hours to compute.
|
| Due to the math involved in ECC it is quite challenging to
| make a "hardware-only" ECC signer. The ones I've seen are
| effectively ECC accelerators with some kind of state machine
| or microcode to run the algorithm.
|
| In the case of TKey we use picoRV32 configured as RV32ICZmmul
| (multiply without divide). We use the FPGA's DSPs to
| accelerate multiplication. On the TKey an Ed25519 signature
| takes less than a second, which we believe is acceptable for
| many use cases, and I'm willing to bet there is no Ed25519
| signer that is more open source hardware and software than
| the TKey.
|
| As GP points out using an FPGA is in fact an excellent way to
| mitigate various supply chain attacks. It's like hardware
| ASLR, to paraphrase bunnie in his CCC talk.
| wslh wrote:
| Thank you! BTW I assume you will support U2F, Crypto, etc. in
| the future? Or do you expect third parties to develop on it?
|
| From a quick glance at the product it seems I should buy the
| unlocked version to have full control of the device. Could a
| future device have a display and some more sensors and/or
| buttons, so I know what I am signing?
|
| I am currently in South America so waiting to travel to one
| of your shipping locations to buy several TKeys.
| kfreds wrote:
| I believe we already have a U2F prototype for Linux. In
| general we are quite selective about which applications
| we take on development and maintenance responsibilities
| for.
|
| Given that this is the most open source hardware USB
| authenticator we hope the communities that value this
| level of openness, design assurance and design
| verifiability will adopt the TKey and build whatever
| applications they need for it. Having said that we see
| lots of opportunities for us to make it easier for
| developers to build what they need.
| 70rd wrote:
| Supply chain attacks can be mitigated with "golden chip
| analysis": you destructively analyse a known good chip after
| measuring various power and timing benchmarks across
| adversarial configurations, then repeat those measurements on
| all future chips and check that they are within the margin of
| error.
| josephcsible wrote:
| > Note well: In the end-user version (not TKey Unlocked) the FPGA
| configuration is locked down. This means you cannot change the
| FPGA bitstream or read out the bitstream (or the Unique Device
| Secret, UDS) from the configuration memory, even if you break the
| case and insert it into a programmer board.
|
| That seems like it's only useful for "security" against the
| owner, rather than for any legitimate form of security.
| AVincentInSpace wrote:
| How so?
| josephcsible wrote:
| Consider a DRM system that does challenge/response against
| said secret so that only one computer at a time can use
| something. As the owner of this device, if I want to clone
| it, I should be able to.
| its-summertime wrote:
| You can buy an unlocked version and clone it, no?
| charcircuit wrote:
| But then the device could be used by 2 computers at once
| defeating the security. Another use case would be as a
| second factor of authentication. Making it impossible to
| clone is essential for a "something you own" factor.
| FiloSottile wrote:
| That's the whole point of a TKey as a security device: the
| secret available to an application depends on both the device
| it's running on and the application, and can't be extracted, so
| you can do things like "sign a blob only if it follows these
| rules" and enforce it in hardware. If the device wasn't locked,
| you could just... change the rules.
|
| How is that not a legitimate form of security?
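|
| A toy model of "enforce the rules in hardware" (HMAC stands in
| for Ed25519 here; the policy, names and byte values are
| invented for illustration):

```python
import hashlib
import hmac

POLICY_PREFIX = b"tx:"  # invented rule baked into the app binary

def app_sign(uds: bytes, app_binary: bytes, message: bytes) -> bytes:
    """The signing key is derived from device + application, so
    changing the rules changes the app hash, which changes the key:
    a modified app can sign, but not with the original key."""
    key = hashlib.blake2s(uds + app_binary).digest()
    if not message.startswith(POLICY_PREFIX):
        raise ValueError("policy violation: refusing to sign")
    return hmac.new(key, message, hashlib.blake2s).digest()

uds = b"\x07" * 32
app = b"policy-signer:" + POLICY_PREFIX

sig = app_sign(uds, app, b"tx:send 1 coin")
assert len(sig) == 32

# A patched app (different rules) derives a different key, so its
# signatures don't match those of the original app.
assert app_sign(uds, app + b"-patched", b"tx:send 1 coin") != sig
```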
| josephcsible wrote:
| If it's my device, then I should be able to change the rules.
| kfreds wrote:
| You're more than welcome to change the rules. Please read
| my other comments in this thread, and you'll hopefully find
| answers to your concerns. The TKey Unlocked gives you all
| the control you're asking for.
| teruakohatu wrote:
| If you can change the rules then someone else can also
| change the rules.
|
| And any ability to change the rules will greatly increase the
| attack surface.
| ooterness wrote:
| Buy the unlocked version, program it and lock it down yourself.
| kfreds wrote:
| I created the TKey together with my colleagues. AMA. :)
| fsflover wrote:
| How is it better than Precursor?
| https://www.crowdsupply.com/sutajio-kosagi/precursor
| INTPenis wrote:
| I'm not affiliated with either but I just looked at the
| precursor page because I'm a huge nerd and love crowdfunded
| projects.
|
| They seem to be in slightly different leagues. Tkey is small
| and meant to be easily carried with you to authenticate with
| ssh, gpg and more. While precursor is a full development
| board that can probably do everything tkey can, and more.
| kfreds wrote:
| Whether TKey or Precursor is better depends on your needs.
| Here are some differences between them:
|
| * TKey uses open tooling for everything. Precursor uses
| Vivado.
|
| * Precursor has a screen and a keyboard, allowing the user to
| interact directly with the trusted device in a completely
| different way than the TKey.
|
| * The Precursor is bigger than the TKey.
|
| * The Precursor costs much more than the TKey.
|
| Finally I'd just like to mention that bunnie's and xobs's work
| on Precursor, as well as their other projects, has been a great
| inspiration to the TKey project.
| fsflover wrote:
| Thank you!
| 70rd wrote:
| Different product at a different price point with different
| objectives. Security keys are suitable for mass deployment
| across an entire workforce, precursor aims to be a
| launchboard for general purpose secure and trustable
| computation. Even at mass production prices, precursor would
| likely still cost more than 100$.
| xw3089 wrote:
| Genuinely curious, what's the argument for locking down the
| FPGA?
| adastra22 wrote:
| What assurance do I have (I'm speaking cryptographically) that
| the per-device secret isn't known to you?
| 7e wrote:
| "There is no way of storing a device application (or any other
| data) on the TKey. A device app has to be loaded onto the TKey
| every time you plug it in."
| 5- wrote:
| hardware wise, looks to be a usbc version of fomu.
|
| https://www.crowdsupply.com/sutajio-kosagi/fomu
| kfreds wrote:
| One major difference between TKey and Fomu is that Fomu uses
| the ice40 for USB hardware logic as well as USB device
| firmware. TKey uses a dedicated USB chip (CH552, which comes
| with open source firmware).
|
| Fomu could likely not fit USB logic as well as the security-
| related cores we have. That, and the fact that we don't want
| the attack surface of the USB firmware running in the same core
| that handles the KDF.
|
| They are very different products for very different use cases.
| 5- wrote:
| thanks, that's an important distinction.
|
| i like the conceptual purity of fomu, but it does take quite
| a bit of work to get to the point of talking to it over usb.
___________________________________________________________________
(page generated 2023-12-25 23:00 UTC)