[HN Gopher] Apple, Google and Microsoft Commit to Expanded Suppo...
___________________________________________________________________
Apple, Google and Microsoft Commit to Expanded Support for FIDO
Standard
Author : KoftaBob
Score : 308 points
Date : 2022-05-05 13:13 UTC (9 hours ago)
(HTM) web link (fidoalliance.org)
(TXT) w3m dump (fidoalliance.org)
| [deleted]
| Andrew_nenakhov wrote:
| Played with the Yubikeys a couple of days ago. Rather nice
| thingies that are very easy to lose somewhere.
| 0daystock wrote:
| It's reasonably safe to leave them connected to the devices you
| regularly authenticate from, unless your threat model includes
| an adversary willing to use physical attacks.
| oynqr wrote:
| Unless you have some way of authenticating all of your
| hardware with the key, taking it with you still leaves plenty
| of options for a physical attacker.
| Animats wrote:
| Does this imply that every service you connect to knows your
| unique identity?
| diffeomorphism wrote:
| No.
| NovemberWhiskey wrote:
| No; each site to which you authenticate gets a separate,
| probabilistically unique credential id.
| Mindless2112 wrote:
| No, each registration generates a new key pair; but maybe:
|
| > _The signature counter is a strictly monotonic counter and
| the intent is that a relying party can record the values and so
| notice if a private key has been duplicated, as the strictly-
| monotonic property will eventually be violated if multiple,
| independent copies of the key are used._
|
| > _There are numerous problems with this, however. Firstly,
| recall that CTAP1 tokens have very little state in order to
| keep costs down. Because of that, all tokens that I'm aware of
| have a single, global counter shared by all keys created by the
| device. [...] This means that the value and growth rate of the
| counter is a trackable signal that's transmitted to all sites
| that the token is used to login with._
|
| http://www.imperialviolet.org/2018/03/27/webauthn.html
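| A rough sketch of the relying-party side of that counter check
| (hypothetical names, not any particular library):
|
|     # Run after the WebAuthn assertion signature has been verified.
|     # `stored` is the last counter we saved for this credential;
|     # `reported` comes from the new authenticator data.
|     def signature_counter_ok(stored: int, reported: int) -> bool:
|         if stored == 0 and reported == 0:
|             # Authenticator doesn't implement a counter at all.
|             return True
|         # A cloned key eventually replays or falls behind the
|         # original, breaking strict monotonicity.
|         return reported > stored
|
| Because most CTAP1 tokens keep one global counter for every
| credential, its absolute value and growth rate are the cross-site
| tracking signal the quoted post is warning about.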
| account-5 wrote:
| Genuine question. What benefit does this FIDO system provide over
| my password manager?
|
| I've read the what is Fido page and what Fido does page. I don't
| get it.
|
| It seems like the device is a single point of failure.
| criddell wrote:
| It seems like FIDO is going to win the race for the next
| generation authentication scheme. I was rooting for Steve
| Gibson's SQRL.
| davidkhess wrote:
| Last time I looked at SQRL, it had an unfixable man-in-the-
| middle problem.
|
| Though, general acceptance of QR Codes does seem to have
| finally taken off.
| jeroenhd wrote:
| I hope this cross device system will be cross platform, but I
| wouldn't be surprised if you could only choose between macOS/iOS,
| Chrome/Chrome, or Edge/Edge sync.
|
| Funnily enough, a system for signing web authentication requests
| from a mobile device is far from new: I've been using
| https://krypt.co/ for years (though it's on the long road of
| sunsetting right now) and I hope that will last long enough for
| the new cross device standard to replace it.
| sdfgdfgbsdfg wrote:
| It won't, at least not in the short term. For that to happen
| trusted platform modules would need an api to export a private
| key wrapped with a certificate signed by (none/one/all/a
| quorum) of members in the circle of trust and itself. This will
| need standardizing. Only Apple has implemented it so far
| because it has total control of its ecosystem. I think for
| Windows and Chrome to work like this, they'll need to start
| requiring TPM vendors to implement this in their drivers, but I
| can't see it being cross compatible with the API in the apple
| TPM any time soon, especially because the circle of trust is
| now as weak as the weakest TPM, and it's a reputation risk for
| apple if a credential gets compromised because some non-apple
| device trusted by the user in an apple circle of trust got
| breached
| jeroenhd wrote:
| I think the first iteration of this system will definitely
| receive the synced key material in RAM.
|
| It's possible that the TPM spec will be updated to allow for
| loading pre-encrypted data into the TPM store as a response
| to this. Alternatively, existing secure computing systems
| (SGX/TrustZone) can also be used to decrypt the synchronised
| key _relatively_ securely.
| sdfgdfgbsdfg wrote:
| receiving synced key material in RAM significantly alters
| the threat model. Apple's current passkey implementation
| does not, at any point, handle unwrapped key material in
| the operating system. I expect all other implementations to
| follow.
| blibble wrote:
| TPMs don't generally store encrypted data (bar their master
| key)
|
| instead they wrap/seal everything with a layer of crypto,
| then you can pass that wrapped object around as much as you
| want; only the TPM can unseal it
|
| a TPM could easily be instructed to seal an internally
| generated secret with additional escrow keys for
| MS/Apple/...
|
| that plus remote attestation could make it so you can never
| see the key in the clear
| jeroenhd wrote:
| As far as my understanding goes this sealed secret is
| device specific and connected to the TPM master key. That
| would mean you could pass it around, but you'd need to
| have the blob on the device itself to actually use it.
|
| The problem is that you need private/public key pairs
| that are synchronised across devices for FIDO to work
| properly cross-device. When you register an account on
| your phone, you need that account key on your desktop to
| use it there, and that's nearly impossible without some
| kind of key sharing mechanism.
| sdfgdfgbsdfg wrote:
| Yes but what the OP is saying is that the TPM does not
| store the encrypted passkey, rather, the passkey is
| wrapped with this TPM's public key by another TPM that
| already trusts this TPM, so this TPM can import a passkey
| that's been wrapped with its own public key and store it
| unencrypted. See Apple's circle of trust:
| https://support.apple.com/guide/security/secure-keychain-
| syn...
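| The wrapping idea itself is easy to sketch in software; this
| assumes plain RSA-OAEP purely for illustration (the real
| TPM/iCloud Keychain scheme is more involved):
|
|     from cryptography.hazmat.primitives import hashes
|     from cryptography.hazmat.primitives.asymmetric import rsa, padding
|
|     oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
|                         algorithm=hashes.SHA256(), label=None)
|
|     # The new device generates a key pair; only the public half leaves it.
|     new_device_key = rsa.generate_private_key(public_exponent=65537,
|                                                key_size=3072)
|
|     # An already-trusted device wraps the passkey secret for the newcomer.
|     passkey_secret = b"per-site private key material"
|     wrapped = new_device_key.public_key().encrypt(passkey_secret, oaep)
|
|     # Only the new device (ideally only its secure element) can unwrap.
|     assert new_device_key.decrypt(wrapped, oaep) == passkey_secret
|
| The hard part isn't the crypto, it's deciding which public keys are
| allowed into the circle in the first place, which is what the Apple
| document above is about.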
| jeroenhd wrote:
| I understand that, but that's not supported by any
| current standard as far as I know. We'll need a new TPM
| standard for this, which probably also means it will take
| years before every device supports this feature as modern
| computers can easily last five to seven years if you
| replace the batteries and don't cheap out. FIDO needs
| something that works now, or maybe tomorrow.
| blibble wrote:
| you can do it easily enough with the current TPM
| operations (2.0, not 1.2)
| sdfgdfgbsdfg wrote:
| Agreed, and that's why I say in my original comment that
| I don't see it happening in the short term. If we had
| something that worked now or maybe tomorrow and was
| acceptable, it would simply be virtual authenticators; an
| authenticator implemented entirely in software. There's
| no practical reason why password managers like 1Password
| can't do that beyond attestation which nobody checks
| anyway. But in the end, I don't see the big three
| participating in sharing. The threat model changes so
| much that especially for Microsoft (in cell phones) and
| Google (in desktops) that means trusting an adversarial
| OS they have no control over
| daenz wrote:
| It would be nice for Amazon to commit as well. AWS has support
| for only a single Yubikey, which is mostly useless, unless you
| don't care about being locked out of your account if you lose
| that one key.
| arianvanp wrote:
| They support multiple if you use AWS SSO instead of IAM
| directly though.
| aaaaaaaaata wrote:
| Yeah, they should be ashamed of themselves -- forget losing it,
| what if it fails...?
|
| It's bad practice to register just one hardware key, if your
| service has no side-doors.
| missosoup wrote:
| So their vision of the future is that to do anything online, one
| MUST have a phone (ahem, portable wiretap)? And they're going to
| be keeping my secrets for me, for my own good?
|
| I'm not sure I'm down with any of that.
| jeroenhd wrote:
| I doubt they'll do away with tools like smart cards or Yubikeys
| any time soon. Laptops and modern computers also contain a TPM,
| so you don't necessarily need to have a phone for secrets
| storage.
|
| If push comes to shove, I'm sure someone will develop a
| lightweight Android emulation layer you can run in the cloud
| that pretends to be a phone enough that you can use it.
| dane-pgp wrote:
| > Laptops and modern computers also contain a TPM
|
| The root of trust for which extends to who knows where, and
| you're not allowed to look at the source code or learn how it
| works because that would threaten Hollywood's profit margins.
|
| We're basically building a system of DRM for access to human
| beings, and making the whole world dependent on these
| unaccountable entities.
| jeroenhd wrote:
| TPMs allow for arbitrary key storage by the operating
| system. They're not necessary for DRM. In fact, I've wiped
| my TPM several times to upgrade the firmware and I've had
| no trouble playing DRM content whatsoever.
|
| Technologies like Intel's management engine and SGX or
| their AMD/Qualcomm/Apple counterparts are definitely
| problematic for user freedom in the way they're
| implemented. However, the TPM system itself is quite
| neutral: usually, you can clear it from the UEFI, lock it
| with a password (though that might need to be done from the
| OS) leaving whatever hostile OS you may run unable to exert
| any control on the device whatsoever.
|
| I'm personally a big fan of technologies like TPMs and
| secure boot as long as they're user configurable. I want to
| be able to install my own keys and force the system to only
| boot operating systems of my choice. Secure boot with just
| the MS keys is quite silly and ever since that one version
| of Grub could be exploited it's basically useless; secure
| boot with user or enterprise keys can be an incredible tool
| for defence in depth, for example when visiting countries
| where border agents may try to gain access to your data
| without your permission or knowledge (China, USA, etc.).
|
| If I had my way, I'd use Coreboot together with Secure
| Boot, with encryption keys stored in a TPM, the transfer of
| which goes through an encrypted channel (a feature of TPM
| 2.0 that nobody uses) after unlocking it with a password.
| Sadly, most Linux OS developers have a grudge against these
| technologies because they're used by companies such as
| Microsoft and Apple to reduce user freedom on some of their
| devices.
| nybble41 wrote:
| The user-hostile part of the TPM is the built-in key
| signed by the manufacturer which shows that it's an
| "approved" TPM which won't--for example--release any of
| the keys stored inside to the device's owner. This is
| what allows the TPM to be used as part of a DRM scheme.
|
| If it weren't for that small detail then I would agree
| that TPMs can be useful for secure key storage and the
| like, working for the device's owner and not against
| them. The actually _useful_ (to the owner) parts of the
| TPM do not require the manufacturer's signature.
| blibble wrote:
| > Secure boot with just the MS keys is quite silly and
| ever since that one version of Grub could be exploited
| it's basically useless
|
| this isn't true: there's a hash blacklist which is
| supposed to be regularly updated by your OS update
| mechanism
|
| windows update does it anyway
| jeroenhd wrote:
| Is there a way to list this blacklist? I have several
| computers which haven't received updates in years and I
| strongly doubt that the internal blacklist has been
| updated.
| blibble wrote:
| mokutil --dbx
|
| official list is here:
| https://uefi.org/revocationlistfile
|
| (I have my own root configured for all of my machines so
| only stuff I've signed can boot)
| 0daystock wrote:
| My vision of future authentication (shared by colleagues in
| security) is based on strong hardware credentials and
| additional layer-7 context about identity, device and location.
| Basically, more identification of you and your browser using
| cryptographically-guaranteed and immutable events. It is
| actually the deprecation of passwords altogether and generally
| moving the trust boundary away from the control of the user
| entirely. I also don't enjoy it, but it would solve a lot of
| current problems we see in information security.
| [deleted]
| dane-pgp wrote:
| > additional layer-7 context about identity, ... more
| identification of you
|
| Mass surveillance. You can just say mass surveillance.
| 0daystock wrote:
| Every technology is a double-edged sword. Like firearms,
| security controls can be used to guarantee peace and
| freedom or wage war and distress. The responsibility is
| with the administrator of that tool, not the tool itself.
| xdennis wrote:
| I don't know if you're being sarcastic, but your vision
| sounds like a nightmare and not very far removed from
| Gattaca.
|
| > moving the trust boundary away from the control of the user
| entirely. I also don't enjoy it, but it would solve a lot of
| current problems we see in information security.
|
| Every despot throughout history has noted that freedom can be
| traded for security, but I thought that most of us would
| agree that freedom is more important.
| 0daystock wrote:
| Society is replete with trade-offs sacrificing freedom for
| collective security. You can make moral judgements about
| this all day, but it won't change the dynamics of our
| lives.
| anthony_r wrote:
| It's literally the opposite. You "must" have a cryptographic
| device (a dongle) that is only doing that one thing,
| authentication. Doesn't have a built in radio (unless for NFC,
| if you want it), doesn't have any microphone or camera, doesn't
| store any data beyond what's needed to authenticate, doesn't
| communicate except to authenticate - bi-directionally, so
| phishing is no longer a thing, or at least it's a lot harder.
|
| It's very hard to make a privacy case against FIDO. Practically
| speaking it's one of the best things that happened to
| privacy&security since the invention of asymmetric
| cryptography. The deployment of this tech reduces phishing
| effectiveness to near zero, or in many cases literally zero.
| eMGm4D0zgUAVXc7 wrote:
| > It's very hard to make a privacy case against FIDO.
|
| With username and password, I have full control over my
| privacy in a very easy to understand fashion: If I randomly
| generate them I know I cannot be tracked (as long as I ensure
| my browser doesn't allow it by other means).
|
| With those keys I have an opaque piece of hardware which
| transfers an opaque set of data to each website I use and I
| have NO idea what data that is because I do not manually type
| it in. I need to trust the hardware.
|
| Sure, I could read the standard, but it very likely is
| complex enough that it is impossible to understand and trust
| for someone who has no crypto background.
|
| And I also have no guarantee that the hardware obeys the
| standard. It might violate it in a way which makes tracking
| possible. Which is rather likely, because why else would big
| tech companies push this if it didn't benefit them in some
| way?
| oynqr wrote:
| How about using one of the open hardware + open software
| security keys?
| kccqzy wrote:
| I think you brought up something very important:
| explainability.
| anthony_r wrote:
| > Which is rather likely, because why else would big tech
| companies push this if it didn't benefit them in some way?
|
| They switched to this internally a long time ago which
| basically eliminated phishing attacks against employees.
| There are security teams inside those megacorps that have a
| general objective of reducing the number of account
| takeovers, and non trivial resources to accomplish that.
| Not everything is a conspiracy.
|
| Also, I am sure you will be able to stick to just passwords
| for a pretty long time while the world moves on to
| cryptographic authentication. I'm not being sarcastic here.
| danuker wrote:
| > There are security teams inside those megacorps that
| have a general objective of reducing the number of
| account takeovers
|
| Said security teams have at most zero incentive to ensure
| that the privacy of the policy subjects is preserved.
| matheusmoreira wrote:
| > There are security teams inside those megacorps that
| have a general objective of reducing the number of
| account takeovers
|
| The same corporations that routinely intercept all
| network traffic.
| stjohnswarts wrote:
| The primary concern with FIDO is a company like Google or
| Apple revoking your access, leaving you with no (or limited)
| ways of recovering your account.
| BluSyn wrote:
| Doesn't require phone? Supported by desktop browsers also.
| Third party "auth managers" should be possible -- likely
| integrated into existing password managers?
| Ajedi32 wrote:
| This is huge! It sounds like they're _finally_ going to implement
| cross-device synced credentials; a move I've been advocating now
| for the last two and a half years[1].
|
| Widespread support for this feature is, in my opinion, the last
| thing needed to make WebAuthn viable as a complete replacement
| for passwords on the web.
|
| The white paper is here: https://media.fidoalliance.org/wp-
| content/uploads/2022/03/Ho... Seems like they announced this back
| in March and I missed it somehow.
|
| [1]:
| https://hn.algolia.com/?query=ajedi32%20webauthn&type=commen...
| PostThisTooFast wrote:
| eulers_secret wrote:
| Is there any way to use this system without an extra device (no
| phone, no key, only my pc)?
|
| If not, is there a FOSS implementation of these required new
| devices? Maybe an emulator for one?
|
| Can I download and manage my own keys, manually?
|
| Can I self-host the authentication layers so I don't need to use
| a 3rd party?
| sjustinas wrote:
| > Is there any way to use this system without an extra device
| (no phone, no key, only my pc)?
|
| I've used rust-u2f in the past, although it seems to be Linux
| only. https://github.com/danstiner/rust-u2f
| eulers_secret wrote:
| Thanks, I'll check it out!
|
| Looks like it answers some of my concerns. Super cool!
| austinbv wrote:
| The problem with any key based auth or biometric auth is a user
| can be compelled by LEO to hand over private keys or open a
| biometric lock.
|
| Passwords are protected by the 5th amendment.
| devwastaken wrote:
| You can be compelled by the court to divulge passwords. It's
| one of those areas of legal interpretation, and there's
| precedent against it, as a search will turn up.
| rad88 wrote:
| The litigation on that matter is ongoing. What you said is not
| true right now. If you try to fight an order for your password,
| you'll wind up in court and probably lose, and then have to
| choose whether to act in contempt.
| babypuncher wrote:
| For most people living in a western democracy, this is a pretty
| minor consideration to their threat model.
|
| Most people default to what is easiest. Before TouchID, most
| iPhone users did not lock their phones with a password. Making
| biometrics readily available and default means more people are
| walking around with more secure devices than would be if we
| only encouraged people to use the absolute most secure options
| available.
| jbverschoor wrote:
| For Apple devices the keys are stored in a secure element. You
| _need_ your password to access them when booting, or after
| certain timeouts. Until then you can't use FaceID/TouchID.
| la6472 wrote:
| Why do we need another AuthN protocol? We should extend OIDC as
| needed instead of _again_ trying to reinvent the wheel.
| drdaeman wrote:
| In WebAuthn you're actually in possession of your own
| identity (or, to be more precise, your identity is
| established between you and website).
|
| In OpenID, OAuth and OpenID Connect the paradigm is
| completely different, where your identity is provided by
| someone else.
| bdamm wrote:
| Because the interaction with the hardware authenticator is
| local.
|
| OIDC and WebAuthn can work together.
| [deleted]
| stingraycharles wrote:
| The standard answers for these things is to use both; they're
| not mutually exclusive, and for important things you almost
| certainly want both.
| staticassertion wrote:
| The expansion mentioned in the article is explicitly
| passwordless.
| [deleted]
| danuker wrote:
| > Passcodes can therefore be compelled if their existence,
| possession and authentication are "foregone conclusions," the
| court said in the August 2020 ruling, determining the 5th
| Amendment's foregone conclusion exception applied in the case.
|
| https://www.reuters.com/business/legal/us-supreme-court-nixe...
| matheusmoreira wrote:
| What if you forget the password?
| Wohlf wrote:
| Same as if you forget your safe combination, you're charged
| with contempt of court.
| [deleted]
| lnxg33k1 wrote:
| I think the main reason I'm never buying into FIDO keys again
| is that mine point-blank stopped working and I had to sweat to
| get back into the websites that supported it (thankfully back
| then not many). But if identity is the responsibility of a
| closed piece of hardware and it breaks, you're locked out.
| bdamm wrote:
| Normally you can assign multiple keys to one identity. That's
| baked into WebAuthn and pretty much all the implementations
| I've seen do it.
| Rafert wrote:
| The actual exchange with the server uses public key
| cryptography. How you unlock the key material locally can be
| done in a number of ways: PIN, password, fingerprint scan,
| voice recognition, etc.
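| A bare-bones sketch of that exchange (Ed25519 picked arbitrarily;
| real WebAuthn assertions carry more structure than a raw
| challenge):
|
|     import os
|     from cryptography.hazmat.primitives.asymmetric.ed25519 import (
|         Ed25519PrivateKey,
|     )
|
|     # Registration: the key pair is created on the authenticator and
|     # only the public key is sent to the server.
|     device_key = Ed25519PrivateKey.generate()
|     server_stored_pub = device_key.public_key()
|
|     # Login: the server sends a fresh random challenge; the device
|     # signs it once the user unlocks it locally (PIN, fingerprint...).
|     challenge = os.urandom(32)
|     signature = device_key.sign(challenge)
|
|     # The server verifies with the stored public key. No secret ever
|     # crosses the wire, so there is nothing to steal and replay.
|     server_stored_pub.verify(signature, challenge)  # raises if invalid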
| mordae wrote:
| LOL, I was expecting the title to continue "... Fraud, According
| to $EU_COUNTRY Government". Seems like I don't trust them.
| danijelb wrote:
| I'm glad to see that the tech industry seems to be re-learning
| that creating and adopting interoperable standards is the way to go
| beefee wrote:
| I fear services will force the use of certain devices, like those
| on the FIDO certified products list [0]. Will there be a way to
| use open hardware, open firmware, and user-controlled hardware
| attestation keys? Or will that be considered a fraud/bot risk?
|
| [0] https://fidoalliance.org/certification/fido-certified-
| produc...
| brightball wrote:
| Finally!
|
| Assume every password for every user on your site has been
| leaked. What now?
|
| This dramatically reduces the potential exposure to phishing and
| password leaks.
| RandyRanderson wrote:
| Saved you a goog and pop-over:
|
| "FIDO ... is an open industry association ... whose stated
| mission is to ... help reduce the world's over-reliance on
| passwords"
|
| https://en.wikipedia.org/wiki/FIDO_Alliance
| earthboundkid wrote:
| Aw yeah, BBSes are back baby!
|
| One question, who is going to pay for all the long distance
| calls?
| [deleted]
| throwaway52022 wrote:
| I don't trust Google or Apple to be my main authentication
| provider, or to manage syncing my private key. Their customer
| service is terrible and they are way too arbitrary on locking
| folks out.
|
| I would trust my bank (well, my credit union). I can go see them
| in person if I need to, and they take my lawyer seriously. They
| also take security seriously, they're properly regulated, and
| ultimately they're my main concern if someone stole my
| credentials, so I'd like them to be on the hook for protecting
| them.
| mavhc wrote:
| I wouldn't trust my bank to not give my account to someone
| pretending to have forgotten my password
| epistasis wrote:
| There is no comparison between Google and Apple customer
| support, and they should not be mentioned in the same sentence.
| Google support is nonexistent. With Apple, I can chat online or
| get in person support. They are more like a bank.
| [deleted]
| [deleted]
| maxwelldone wrote:
| > Their customer service is terrible
|
| Let me add my recent experience to the bucket. A few days ago I
| upgraded my legacy Workspace account to a business account. (I
| was in a time crunch; couldn't evaluate alternatives.) I entered
| my debit card details at checkout and got a generic error
| message asking me to "try again later." Thought there was
| something wrong with their service and tried the next day. Same
| error. After some 15 minutes of searching forums, it turns out
| debit cards are not supported in my country on account of SMS-
| based OTP, which doesn't work for subscription services. (If
| they could mention it in the haystack of their help pages, why
| can't they say that right when I sign up?)
|
| Anyway, more searching led to an alternative. There's an option
| to request invoiced billing where I would get a monthly bill and
| pay it - debit card works here. Clicking that option took me to
| a form. Filled it in, got a call from a sales guy a few hours
| later. Sadly, he had no clue about my problem, despite being
| from my country. On top of that he told me he's from a different
| team and doesn't deal with sales queries (WTF, then why did he
| call me?). Told me he'd email me some options and, at that
| point, I wasn't hopeful. Thought he would send me some stuff I
| had already seen on their forums. On seeing said email, my
| disappointment sank even lower. The generic mail had absolutely
| nothing to do with my issue and the help URLs were totally
| unrelated.
|
| I just ended up using my friend's credit card to complete the
| transaction. I'm seriously considering moving elsewhere.
|
| Is product management this pathetic at Google? I'm sure if you
| went for a PM interview they'd judge you nine ways to Sunday.
| For what? Everything Google does seems like it's built by three
| robots in a trench coat collaborating unsuccessfully with other
| robots in trench coats.
| rootusrootus wrote:
| > I'm seriously considering moving elsewhere.
|
| I recently moved my family's legacy GSuite service over to
| Fastmail, and it seems like they've carefully planned for
| this exact scenario. Account setup on each device is as
| simple as downloading a configuration profile with a QR code.
| And Fastmail has a built-in option to authenticate to your
| old Google account and pull all your mail over to the new
| account, preserving all the details, and then keep sync'ing
| until you're ready to turn the old account off. I thought I
| was going to have to sync things myself. Nope! Took all of
| five minutes to set up my account and sync. Couple weeks
| later I deactivated the old GSuite accounts.
|
| And now I'm a customer again, which feels good, even though
| it means spending actual money.
| 0daystock wrote:
| This announcement isn't about that and neither provider is
| asking to sync your private key. In fact the opposite is true:
| with FIDO2, you're in much greater control of your account
| security because authentication creds are now on a hardware
| token rather than bearer credentials you type, which an
| adversary can steal and replay. Many of us believe we're very
| good at
| protecting our passwords, but this isn't true in reality and
| FIDO2/U2F standards objectively make accounts more secure
| precisely because they remove humans from the equation.
| throwaway52022 wrote:
| Except it kind of is - the way I read this is "Apple/Google
| will turn your phone into a hardware FIDO token, but will use
| iCloud/whatever to reduce the huge painpoint of having more
| than one hardware token and keeping them all in sync"
|
| I really love the idea of FIDO and making sure that my
| authenticator only authenticates to sites that I've approved,
| but having multiple keys right now is a huge pain, but I'm
| not excited about "just sign up for Apple and that pain goes
| away" because I sure as hell don't trust Apple not to cause
| me pain in the future.
| toomuchtodo wrote:
| Your average user is more concerned about losing their
| password than they are about authenticator sovereignty.
| Moving towards cryptographic primitives for auth versus
| shared secrets is a net benefit versus current state.
|
| > but having multiple keys right now is a huge pain, but
| I'm not excited about "just sign up for Apple and that pain
| goes away" because I sure as hell don't trust Apple not to
| cause me pain in the future.
|
| Compromise is necessary, and probably a bit of regulation
| from government to enforce good outcomes from exception
| handling. Passkeys need to be stored and managed somehow,
| and your average user does not want to do that, just like
| they don't want to run their own mail server, syncthing
| instance, or mastodon instance.
|
| EDIT: (HN throttling, can't reply) @signal11 You can
| already be locked out of all of those accounts without
| recourse.
| signal11 wrote:
| > Your average user is more concerned about losing their
| password than they are about authenticator sovereignty
|
| Right up to the point when they're locked out from their
| Google, iCloud or Facebook accounts with little recourse
| or appeal. And then they discover it's not just Google, a
| whole host of other services don't work.
|
| And it does happen, and I for one don't want to wait for
| legislation to mitigate this blatant attempt at yet more
| centralisation.
|
| Better to not centralise in the first place.
| zozbot234 wrote:
| Many authenticator apps allow you to extract and back up
| the private key yourself, with no involvement of any 3rd
| party. But it's a totally optional workflow and you're
| never asked for that private key while authenticating, so
| the mass phishing and spear-phishing attacks seen with
| passwords are still infeasible.
| judge2020 wrote:
| This is a net benefit over synced passwords, which everyone
| already trusts them to do. You haven't been forced to use a
| (syncing) password manager over a physical password book in
| the past, and you won't be forced to use Passkeys[0] or the
| Android equivalent in the future; hardware security keys
| will still be usable since this announcement is about
| embracing the FIDO Standard.
|
| 0: https://developer.apple.com/documentation/authentication
| serv...
| mbrubeck wrote:
| > neither provider is asking to sync your private key.
|
| Yes, they are. According to the white paper linked in the
| press release:
|
| _Just like password managers do with passwords, the
| underlying OS platform will "sync" the cryptographic keys
| that belong to a FIDO credential from device to device._
|
| https://media.fidoalliance.org/wp-
| content/uploads/2022/03/Ho...
|
| Ars Technica had a better write-up of these announcements
| back in March: https://arstechnica.com/information-
| technology/2022/03/a-big...
| eMGm4D0zgUAVXc7 wrote:
| epistasis wrote:
| Not at all, because before, anybody could take your account
| away from you if you did not accurately compare two visual
| strings, potentially in Unicode.
|
| By replacing that operation, which humans can not perform
| reliably, with computer operations, users are no longer
| subject to others taking control of their account.
|
| It is wonderful.
| [deleted]
| dwaite wrote:
| This announcement is partially about the platform-integrated
| authenticators being made into 'virtual' authenticators
| backed by a platform vendor-specific cloud ecosystem. So for
| example, a credential registered on an iPhone may be
| synchronized over iCloud Keychain so that it also works to log
| in on my Mac via TouchID.
|
| This is something which has always been part of the model
| - an authenticator is just an abstract thing that represents
| an authentication factor, generates keys for a particular
| use, and doesn't share private keys outside its boundaries.
|
| This announcement possibly marks a transition where sites
| supporting Web Authentication (with a bring-your-own-
| authenticator model) will go from seeing 90%+ hardware-bound
| authenticators to seeing 90%+ platform-integrated,
| synchronizing authenticators. Bundled into that prediction is
| a hope that this (and other proposed changes) will lead to a
| 10x increase in adoption.
| jurmous wrote:
| In the Netherlands the banks provide the iDIN system, so you
| can authenticate on more sites with the bank provided logins.
| Each bank has a slightly different system, often using bank
| cards and card readers, or ways to authenticate through an
| authorised banking app on individual mobile phones.
|
| - https://www.idin.nl/en/about-idin/
|
| - https://nl.wikipedia.org/wiki/IDIN - (use the translate
| function in your browser to read it, as there is no English
| version)
|
| And besides that we also have a government-provided login
| system which can even work with your ID card, but it mostly
| works with government systems and health insurance companies.
|
| - https://en.wikipedia.org/wiki/DigiD
|
| - https://www.digid.nl/en
| eMGm4D0zgUAVXc7 wrote:
| Given that banks usually MUST validate their customers'
| identity cards, the opportunities for tracking users with
| this must be superb.
|
| I'd frankly prefer "insecure" user+pass over all of these
| guardrails which are 90% about control over the users and 10%
| about security.
| jve wrote:
| Tracking from the bank, or both? Anyway, in Latvia we have a
| similar system and it is a convenient way to authenticate
| within services where you MUST prove you are person X.Y.Z.
|
| For example, an electric company, if you auth via this
| method, will provide you with contracts, electricity usage
| graphs for all the sites you own, and other info you
| must access as a customer. Same goes for the recycling
| company. These usually provide a way to register using an
| email matching whatever email you had in the contract (thus
| linking to a real person anyway).
|
| And then there are other services where you request some data
| electronically and they must "register" each request, for
| example requesting extended data on land/house ownership. You
| can't have that with a non-real-life-identifiable entity.
|
| So login via a bank is usually a login option with companies
| you either have a legal relationship with, or where you must
| provide a real-life identity and would otherwise have to show
| your passport in real life.
| avianlyric wrote:
| We have GDPR and consumer focused regulators in the EU. Our
| governments are actually out to protect citizens from
| corporate malfeasances, as opposed to either ignoring it,
| or out right enabling it.
|
| If a company abuses this data, you have strong forms of
| recourse available to you as a citizen, and banks are
| incentivised to remove bad actors, to ensure they don't
| become embroiled in enforcement action triggered by a 3rd
| party.
| londons_explore wrote:
| > they also take security seriously,
|
| Despite what most people think, banks are often a really long
| way behind on security. Banks don't care about security of any
| individual customer, merely security of the bank as a whole.
| That means if 0.01% of customers lose all their funds due to
| credential stuffing, it isn't an issue - the bank will just
| refund them if needed.
|
| Unlike say ssh with key authentication, where it would be a
| total failure if 0.01% of attackers were allowed to login
| without the key.
| [deleted]
| CKMo wrote:
| This is a good step forward. I just hope they work on ensuring
| the end user experience for less technical people doesn't seem
| more complicated than passwords.
|
| It's a work in progress.
| nekomorphism wrote:
| bedast wrote:
| The weakest link in security is always going to be humans.
| Account compromise is more often a human problem than a
| technological one (spamming requests, password reuse, simple
| passwords, (spear) phishing, direct social engineering, etc).
|
| If I'm understanding correctly, they're aiming to reduce multi-
| factor auth back down to a single factor that's "easier" than
| passwords. Easier to use. Easier to social engineer a compromise.
|
| I get regular requests to get into my Microsoft account using
| their new login form that sends a key code rather than prompting
| for password. "Passwordless" just means that prompt goes to an
| app where a user unlocks their device to approve the login.
|
| This seems like worse security, not better. I'm okay with an
| approval prompt if it's part of a multi-factor auth system. Not
| if it's the only auth.
| imwillofficial wrote:
| "The weakest link in security is always going to be humans."
|
| This is not true whatsoever.
|
| Humans will always be a weakness for sure. But hardly the
| "weakest", and hardly "always"
| vlan0 wrote:
| >"Passwordless" just means that prompt goes to an app where a
| user unlocks their device to approve the login.
|
| Set up a yubikey with an attested cert/pub key. Require a PIN to
| use said yubikey. Requiring attestation will prove that the
| private key was generated on the device and will only ever live
| on that yubikey. That's your best bet.
|
| It also satisfies the multi-factor needs. The something you
| have is the yubikey. The something you know is the PIN.
| vngzs wrote:
| There's a frequent misconception that hardware keys are no
| better than, say, a TOTP seed on a secure element of your
| phone.
|
| The core practical difference between a hardware key and that
| TOTP code on a secure element is the hardware key, when
| registered with a domain, is programmed with the domain name in
| it. Lookalike domains - or anything besides the _exact_ domain
| you registered the key with - fail to 2FA because they are
| unregistered. This essentially prevents (spear)phishing attacks
| from stealing login credentials.
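| Part of that is the authenticator scoping keys to the registered
| domain, and part is the relying party checking the origin the
| browser baked into the signed client data. A simplified sketch
| of the latter check (field handling abbreviated; real challenges
| are base64url-encoded):
|
|     import json
|
|     def client_data_ok(client_data_json: bytes, expected_origin: str,
|                        expected_challenge: str) -> bool:
|         # Reject assertions minted on the wrong site or for a stale
|         # challenge; the browser fills in the origin, not the page.
|         data = json.loads(client_data_json)
|         return (data.get("type") == "webauthn.get"
|                 and data.get("origin") == expected_origin
|                 and data.get("challenge") == expected_challenge)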
| bedast wrote:
| But this seems like a technological solution to a very human
| problem. If I can trick a user into approving the login, then
| hardware fobs, secure elements, etc, are meaningless.
|
| The audience here is likely to assume that the security is
| solid. And it probably is. But this is a technology targeting
| your average user. It'll certainly be easier for the end
| user. But it seems like it introduces a human-based attack
| vector that may be easier to exploit.
| vngzs wrote:
| What I mean is: with modern hardware keys you literally
| can't trick a user into approving a remote login, or a
| login to a fake domain. It's not possible unless you can
| control the domain that the user is logging into (say
| you've got code execution on their machine or compromised
| their network and broken TLS, attacks which are
| significantly more complex than phishing). Hardware keys
| enforce that the device can only authenticate against the
| real domain, not a phishing domain. The core of this is a
| real improvement of how 2FA works at the protocol layer,
| rather than simply a change to how the user interacts with
| the device.
|
| Hardware keys also require that the key can only
| authenticate a local session, so there's also no risk that
| your "hardware key tap" can be captured and used by a
| remote adversary who doesn't control the local computer.
| zozbot234 wrote:
| Doesn't TOTP use current time as part of the challenge? Why
| couldn't a refinement of TOTP add the domain name as a
| further element?
| somethingAlex wrote:
| That's pretty much what happened here. Obviously it's going
| to look a bit different afterwards because you have to
| mathematically tangle the time, key, and domain together.
| You can't really do that with the six digits of a
| traditional OTP code.
|
| And like the other reply stated, if you can't
| mathematically tie them together, you have to rely on the
| user validating the domain (which you can't).
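| For what it's worth, the "TOTP plus domain" idea is easy to
| sketch (purely illustrative, not any real standard); the
| practical problems are that the result is no longer a short code
| a human can read off a fob and type, and that a disconnected fob
| has no trustworthy way to learn which domain the user is
| actually on:
|
|     import hashlib, hmac, struct, time
|
|     def domain_bound_otp(secret: bytes, domain: str, step: int = 30) -> str:
|         # Hypothetical refinement: mix the relying party's domain
|         # into the HMAC along with the time step.
|         counter = int(time.time()) // step
|         msg = domain.encode() + struct.pack(">Q", counter)
|         digest = hmac.new(secret, msg, hashlib.sha256).hexdigest()
|         return digest[:12]  # already too long to type comfortably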
| PeterisP wrote:
| If a user manually enters a code from TOTP
| device/calculator into a website, that TOTP
| device/calculator has no way to know which exact website
| domain it is - if the user visiting notmybank.com thinks
| they are visiting mybank.com, they'll get the right code
| for mybank.com from their device and get pwned.
|
| The key part of the FIDO protocol is that it prevents the user
| from getting the code intended for one domain and sending
| it to a different domain.
| netheril96 wrote:
| You can't rely on the end users to check the domain name,
| because
|
| * Most users have no idea what a domain name is.
|
| * It is tedious to compare the domain name character by
| character.
|
| * Phishing sites have used many UI tricks historically to
| make their domain name look authentic (e.g lookalike
| Unicode characters).
| NovemberWhiskey wrote:
| Absolutely right - put another way: the responsibility of the
| user is reduced from "be absolutely certain that you're
| entering your credentials to the web site that you think
| you're authenticating at" to "provide consent to
| authenticate".
| j_san wrote:
| But isn't the "thing" about FIDO (or maybe just security keys?)
| that the domain is also integrated into the challenge the
| client/key has to solve?
|
| So from what I understand an attacker couldn't as easily phish me
| by pretending e.g. to be Google. With a password or even a TOTP
| code the attacker could just pose as Google and forward the
| credentials to the actual site.
| bedast wrote:
| You're looking at an exploit from a technological point of
| view, which I expect this community is likely to do. Think of
| it from the perspective of the average user. I know for a
| fact if my mom was told by an attacker "if you see an
| approval request for your account, just accept it" she would
| do so. It's taken time to train her not to give anyone her
| password.
|
| I've read of attackers with valid passwords spamming logins
| in hopes to trick a user into approving the auth. Whether
| it's because it woke up the user and they're in a sleep fog,
| or they're busy and not paying attention.
|
| Microsoft, at some point, changed their login flow so that,
| by default, when you enter your username, it sends a pin. I
| receive regular attempts at this. This isn't going to work
| out for the attacker because they have to get the pin. But if
| all that's required is a button press, the attacker could
| just make the login request and wait.
|
| With multi-factor auth, where a password is in use, you have
| to get past the password before getting to that auth
| approval. It reduces how much noise the user gets and the
| chances of success for the attacker.
| cmeacham98 wrote:
| You don't understand FIDO/webauthn/etc. The scenario you
| describe is impossible. This is the genius - the user is
| totally cut out of the equation, there is _no action_ your
| mom can take on phishing-website.com to send the
| credentials of google.com, because the key will refuse to
| do so.
| bedast wrote:
| What this article is about is authenticating the request
| with an app on your phone, not a hardware key. This ends
| up being a device totally disconnected from the device
| requesting the auth, and neither has to be in the same
| geographic location unless proximity checks are implemented
| alongside the spec.
| j_san wrote:
| Depending on how it's implemented it could still use the
| same mechanism, couldn't it? (genuine question)
|
| For me the question is if this is a webauthn thing in
| general or a security key thing (to include the domain in
| the challenge to prevent phishing)
| bedast wrote:
| The article specifically discusses auth via app, but if
| it's involving the FIDO alliance, it'd be weird to
| exclude hardware keys, I guess. I still don't like the
| idea of going single factor, but if it's with a hardware
| key, I can see it being better than with an app since it
| has to directly interact with the process itself.
|
| But, of course, if this is optional, I still have to
| reference the end users. I'm willing to pay for an
| authentic FIDO key, which can be a tad costly. Your
| typical user might be more inclined to go for a cheap one
| that does enough to get into the account, and may not be
| trustworthy, or would prefer not to do it at all.
| theplumber wrote:
| That's why with webauthn humans are not part of the auth scheme
| anymore. All the auth is negotiated between machines (web
| browser -> domain name -> hardware key storage).
| 0daystock wrote:
| > If I'm understanding correctly, they're aiming to reduce
| multi-factor auth back down to a single factor that's "easier"
| than passwords.
|
| It isn't only easier, it's significantly more secure. FIDO/U2F
| is basically immune to phishing, because there's no one-time
| code to type and steal; there's a cryptographically backed
| signing assertion guaranteeing the person with physical
| possession of the token is in control. This is so airtight
| (because almost all account compromise is done remotely, not
| through physical in-person attacks) that I would even be
| personally comfortable disclosing my password for accounts
| secured by FIDO/U2F.
| bedast wrote:
| In a multi-factor scheme, I would agree with you. I use
| FIDO/U2F myself...as a secondary factor.
|
| There are active attacks that attempt to exploit human lack
| of vigilance in an authentication approval flow. With a
| password as a first factor, it reduces the chances that these
| attempts make it to the user.
|
| You and I are probably fine in terms of vigilance. If I see
| an auth request, say, from my Okta app, that I did not
| initiate, I know it's something I need to investigate and
| will not automatically approve it. But consider the typical
| user...
| foobarian wrote:
| I feel like I woke up in some parallel universe where TCP/IP
| didn't take off and FidoNet ended up as the Internet.
| chondl wrote:
| I had the same thought. Living in the past.
| TIPSIO wrote:
| I'm sure people way smarter than me have this figured out. From
| the Google post:
|
| > When you sign into a website or app on your phone, you will
| simply unlock your phone -- your account won't need a password
| anymore.
|
| > Instead, your phone will store a FIDO credential called a
| passkey which is used to unlock your online account. The passkey
| makes signing in far more secure, as it's based on public key
| cryptography and is only shown to your online account when you
| unlock your phone
|
| So if I was a dumb kid, I could log in to my parents' bank
| accounts (or worse) if my mom gave me her 4-digit phone password
| for games earlier?
| amelius wrote:
| No, because it should be evident by now that a phone is a
| personal computer, not to be shared with other people.
| josteink wrote:
| I can tell you don't have kids ;)
| wlesieutre wrote:
| And personal computers have supported multiple people with
| separated data and settings since what, the 1980s?
| jeroenhd wrote:
| For you perhaps. Phones are shared more often than you think.
| And no, they don't use multi-account features built into
| modern mobile operating systems.
| davidkhess wrote:
| Normal usage would require a reauthentication - i.e. FaceId or
| TouchId - to produce the passkey.
| kogus wrote:
| Currently on the iPhone, if your FaceID or TouchID fail
| repeatedly, you have the option to type in the passcode,
| which grants the same access. I'm not sure if the same is
| true on Android.
|
| I think the more general point is that "able to unlock the
| phone" is not / should not be the same as "I have verified
| that this is you" for sensitive applications and information.
| michaelt wrote:
| I just tested with two banks' apps. They both allow touch
| ID with fallback to a bank-account-specific PIN - not the
| phone passcode.
|
| Of course, if you've enrolled your kid's fingerprints
| they'd have access.
| TIPSIO wrote:
| Ah cool, the Google post made it seem a bit more automatic
| and instant.
|
| > you will simply unlock your phone
|
| Then I guess that really is no different from opening an app.
| jeroenhd wrote:
| If that kid can get their parent's finger on the fingerprint
| scanner, sure. The authentication part of the process is moved
| to the device's security system, so that's fingerprints,
| passcodes, and facial recognition.
| criddell wrote:
| I don't think fingerprint scanners on consumer devices are
| always great. My daughter has one on her laptop and last week
| I tried my finger and it worked.
| jeroenhd wrote:
| Honestly, biometrics are terrible for authorization.
| They're more of a username than a password and we shouldn't
| use them like passwords. The same is true for facial
| recognition algorithms, no matter how advanced.
|
| They're so damn convenient, though. I trust the fingerprint
| scanner on my phone and my laptop, but there are definitely
| bad scanners out there.
| criddell wrote:
| Why do you trust your laptop scanner? Have you let other
| people try to unlock with their fingerprint?
|
| FWIW, my daughter's laptop is a Dell.
| jeroenhd wrote:
| I've tried unlocking my laptop's scanner with my other
| hand and I've asked other people to put their finger on
| it to see if it does some kind of weird matching based on
| finger type. No problems so far. It even works across
| both Windows and Linux if I use the right Windows reboot
| incantations.
|
| Since there is nothing genetic about fingerprints, I'd
| personally consider your daughter's laptop to be
| defective if you're able to unlock it. A critical part of
| the laptop's security mechanisms is clearly broken and
| should be looked at. I can't find many other stories
| about Dell specifically so this may be a specific unit or
| product line that's broken.
|
| You may even have something to gain by reporting it; I
| don't know if Dell or their manufacturers do bug
| bounties, but this definitely sounds like something that
| should be accepted in such a program. Even if they don't,
| writing a short blog about it with the laptop's brand and model
| and the model of the fingerprint reader might get the press
| rolling, forcing Dell to take action. This is simply
| unacceptable.
| danans wrote:
| > Since there is nothing genetic about fingerprints
|
| While it's true that even identical twins don't have the
| same fingerprints [1], it's not true that there are no
| genetic factors in the general shape of fingerprints [2].
| I agree that it's unacceptable if a fingerprint reader
| isn't good enough to distinguish identical twins based on
| the differences in fingerprints though, as those should
| be the most similar fingerprints possible, essentially
| setting a floor on the minimum uniqueness in the problem.
|
| It seems like they would use identical twin derived
| validation data sets to ensure this.
|
| 1. https://www.nytimes.com/2004/11/02/health/the-claim-
| identica...
|
| 2. https://www.mcgill.ca/oss/article/did-you-know/you-
| inherit-p...
| dlivingston wrote:
| Apple's bioauth is very good and I trust both the
| fingerprint and face authentication. YMMV with other
| manufacturers and devices.
| c3534l wrote:
| I don't want my password to be something I leave behind
| on everything I touch, which the police have because I
| was arrested once, which can be ascertained from high
| quality photos, and which I can't ever change once
| stolen.
| Nathanba wrote:
| .. but biometrics can be lost too. I could lose my finger, I
| could have a facial injury. The algorithm could be changed
| and suddenly I can't log into anything anymore. Or I simply
| age and my FaceID stops working some day. I don't know;
| biometrics only sound smart initially, but they seem very
| brittle if you think about it. Plus there are plenty of
| stories of people who were able to unlock somebody else's
| phone randomly. Just google "unlocked my friends phone via
| faceid". This all seems like such theater for nothing. I
| think a simple "own this usb stick = it's proof that you are
| you" is a very nice 2 factor without any biometrics. Create a
| usb stick that needs to be unlocked via a passcode to work
| and voila.
| threeseed wrote:
| * On Apple devices TouchID allows you to register multiple
| fingers. And if you have a severe facial injury it will
| fall back to a password if it can't identify you.
|
| * No one is unlocking their friend's phone via FaceID unless
| they are unconscious and they have deliberately disabled
| the awake-detection feature.
|
| * It is not theatre for nothing. It is a far more secure
| and convenient form for authentication.
| [deleted]
| josteink wrote:
| That's a whole lot of text saying "things will be simpler" on
| repeat, without specifying how things are going to be and why
| that is simpler.
|
| Anyone got a link to something less hand-wavey and more concrete?
| oversocialized wrote:
| fmakunbound wrote:
| Reading through the threads here: if HN can't articulate FIDO
| and the differences between it and the now decades-old password
| model to each other, I think regular jack-offs are going to have
| trouble.
|
| People have the mental model that their secret is stored in their
| gray matter/post-it note/password manager, and now you're telling
| them it's in their phone, and somewhat related to the phone's
| security model, or maybe a "yubikey", or behind biometrics, or
| maybe not, it depends, and Big Co. has a copy, of something, and
| it's synced, and one possibly "migrates" between Big Cos., and Big
| Co. might deliberately/accidentally disable all your websites, or
| losing your device/yubikey/piece of paper means you're screwed,
| possibly...
| rad88 wrote:
| Yeah, well what I want is a (physical, literal) membership card
| like I have at the gym or library. I think "regular" people can
| learn to use USB tokens, and that they might make more
| intuitive sense than passwords. These places don't challenge me
| for the "secret password" when I come in, I just present or
| scan my card.
|
| It's very tricky obviously, in terms of engineering and
| operations, for an internet based company to arrange anything
| similar. But I don't think it's too mentally foreign for the
| user (assuming we develop good standards).
|
| So cards make sense to me. Way more sense than passwords. Maybe
| someone else feels more comfortable with the details living
| inside their phone, but that doesn't affect my mental model.
| Users don't need to understand or be taught the entire
| standard.
| tialaramex wrote:
| As you will have seen in lots of other posts to this topic,
| people want privacy and "I just show my membership ID
| everywhere, what's the problem?" unsurprisingly is not what
| they had in mind.
|
| So, FIDO preserves privacy by minting unique credentials for
| each site where you use it. This is invisible to the user of
| course, for them it's just the case that you use your FIDO
| authenticator everywhere (that it works) and now it's secure.
| rad88 wrote:
| I understand that. I was responding to the idea that
| hardware tokens like yubikey, in fact all alternatives to
| passwords, are too complicated for regular people to
| understand. And also saying that multiple options, to
| accommodate different people/scenarios, are fine and don't
| have to be complicated from the user's perspective. By way
| of analogy (admittedly I didn't make that very clear).
| [deleted]
| toxik wrote:
| Unrelated to what you wrote, but it is actually jagoff or jag-
| off. It is not related to jacking off.
| [deleted]
| stavros wrote:
| FIDO is an authentication standard. It doesn't care where your
| secrets are, it just mandates a way to use them to log in to
| websites. You can still use a password manager, it will just
| basically contain a single encryption key for all sites.
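| To illustrate the "single key" point: a hardware authenticator
| can serve any number of sites from one stored secret by deriving
| a distinct key pair per relying party, roughly like this
| (illustrative derivation, not the exact scheme any particular
| key uses):
|
|     from cryptography.hazmat.primitives import hashes
|     from cryptography.hazmat.primitives.kdf.hkdf import HKDF
|
|     def per_site_seed(master_secret: bytes, rp_id: str) -> bytes:
|         # One master secret on the device, a distinct seed (and thus
|         # a distinct key pair) per site, so sites can't correlate you.
|         return HKDF(
|             algorithm=hashes.SHA256(),
|             length=32,
|             salt=None,
|             info=b"webauthn-seed:" + rp_id.encode(),
|         ).derive(master_secret)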
| davidkhess wrote:
| I think this is really great news and am glad to see FIDO move
| forward as I think it greatly increases account security.
|
| One aspect of FIDO that could still be troublesome is account
| recovery in case of inadvertent loss of passkey. OOB recovery
| with SMS or email is considered too weak and the main recommended
| alternatives are to maintain multiple authenticators (i.e.
| multiple copies of your passkeys), re-run onboarding processes
| for new users or just abandon the account.
|
| It's going to be interesting to see how those alternatives play
| out in real world situations.
| jeroenhd wrote:
| Reading this announcement, the idea seems to be that FIDO keys
| will be synchronised across devices. That means you can lose
| your phone and still get access to your accounts from your
| desktop.
|
| You might even be able to get access by simply logging in to
| your Microsoft/Apple/Google account on a new device if they
| implement this system stupidly enough.
| davidkhess wrote:
| Yes, these will be stored in cloud storage like iCloud
| Keychain. But I can go into my iCloud Keychain and delete
| individual passkeys - or I may have only one Apple device and
| then lose it. Or some malware clears out all of my iCloud
| Keychain.
| danieldk wrote:
| _One aspect of FIDO that could still be troublesome is account
| recovery in case of inadvertent loss of passkey._
|
| I think the idea is that passkeys are synced between devices,
| see e.g.:
|
| https://developer.apple.com/videos/play/wwdc2021/10106/
|
| I haven't looked deeply enough into passkeys yet, but aren't we
| replacing "what if I lose my device" with "what if company XYZ
| decides to nuke my access to my synchronized passkeys"?
| photochemsyn wrote:
| Suggested edit of mission statement in the name of increased
| accuracy:
|
| "The standards developed by the FIDO Alliance and World Wide Web
| Consortium and being led in practice by these innovative
| companies is the type of forward-leaning thinking that will
| ultimately _make the American people easier to track online_.
|
| This will be done by linking all online activity to unique
| personal attributes, i.e. "their fingerprint or face, or a device
| PIN." It's basically another step towards the China model of
| total mass surveillance of the population.
|
| [edit: all the justifications for this proposal - aren't they
| mostly solved by the use of password managers?]
| ChikkaChiChi wrote:
| Why aren't we doing more to validate the identity of the service
| we are trying to connect to? CAs don't allow me to establish my
| own personal web of trust. If I connect once to my bank in a
| method I deem safe, I should be able to store their credentials
| in an easy to validate way.
|
| That way if I fall for a phishing attack, the browser can CLEARLY
| indicate to me that I'm encountering a new entity, not one I have
| an established relationship with.
|
| Concurrently, OSes need to do a way better job of supporting
| two-factor locally and out of the box. To even use a Yubikey
| personally, you have to install their software and disable the
| existing login methods, or else you can still log in the
| original way you set up.
|
| While we're at it, browsers and operating systems should actually
| lock out the second a key is no longer connected/in range. I know
| smart cards can behave similarly, but this needs to be
| grandparent-level easy to set up and control.
|
| I would feel much safer with my elderly family having "car keys"
| to their PC.
| Arnavion wrote:
| The closest thing to avoiding being phished by a different
| "secure" entity is that your password manager will refuse to
| autofill (*) your credentials. But it's true that this is far
| from sufficient - this kind of autofill is wonky and doesn't
| work with all pages, so users can get conditioned to working
| around it by manually copying and pasting from the password
| manager to the browser, which defeats the protection. Many
| users prefer to always copy-and-paste anyway, because not
| having to install the password manager's corresponding browser
| addon can seem more secure to them.
|
| (*): Note that "autofill" only means "automatically populate
| credentials", not "automatically populate credentials without
| any user interaction". Clicking the username field, choosing a
| credential from a dropdown that the password manager populated
| for you based on which credentials match the website in
| question, and then having it be applied is also "autofill".
| ChikkaChiChi wrote:
| You're right, that's woefully insufficient. The
| authentication challenge should clearly indicate (using color
| _and_ text) whether or not the challenge is an established
| part of your trust network, and the hardware token should be
| able to validate the authenticity of the challenge modal
| itself.
|
| Users should be able to take an action they trust, while at
| the same time having the choice of that action taken away (or
| made more cumbersome) if they are about to get themselves
| into trouble.
|
| There are people _far_ smarter than me working on these
| problems, but I feel like they are so hyperfocused on state-
| security that they refuse to listen to anyone regarding
| actual usability.
| blibble wrote:
| U2F/FIDO2 are immune to this problem as the magic exchange
| requires the origin hostname to decrypt/verify the remotely
| stored blob
|
| wrong origin? can't work at all, ever
| ChikkaChiChi wrote:
| Thank you for your response. I'm going to read up more. I
| wasn't aware this was a feature.
| tialaramex wrote:
| Specifically what's going on here in the cheapest FIDO
| devices is roughly this:
|
| On every site where you enroll, random private keys are
| generated - this ensures you can't be tracked by the keys:
| your Facebook login and your GitHub login with WebAuthn are
| not related. So although if both accounts are named
| "ChikkaChiChi" there are no prizes for guessing it's the
| same person, WebAuthn does not help prove it.
|
| A private key used to prove who you are to, say, example.com
| is not stored on the device. Instead, it's _encrypted_ (in
| AEAD mode) using a symmetric key that is really your device's
| sole "identity", the thing that makes it different from the
| millions of others, together with a unique "Relying Party" ID
| or RPID, which for WebAuthn is basically (the SHA256 of) the
| DNS name. The result is then sent to example.com during your
| enrolment, along with the associated public key and other
| data.
|
| They can't decrypt it. In fact, they aren't formally told
| it's encrypted at all; they're just given this huge ID
| number for your enrolment, and from expensive devices (say,
| an iPhone) it might not be encrypted at all, it might just
| really _be_ a huge randomly chosen ID number. Who knows?
| Not them. But even if they were 100% sure it was encrypted,
| too bad: the only decryption key is baked inside your
| authenticator, which they don't have.
|
| What they do have is the _public_ key, which means when you
| can prove you know that private key (by your device signing
| a message with it) you must be you. This "I'm (still) me"
| feature is deliberately all that cheap Security Keys do,
| out of the box, it's precisely enough to solve the
| authentication problem, with the minimum cost to privacy.
|
| Now, when it's time to log in to example.com, they send
| back that huge ID. Your browser says: OK, any Security Keys
| that are plugged in, I just got this enrolment ID from
| example.com, who can use it to authenticate? Each
| authenticator looks at the ID and tries to decrypt it,
| knowing its symmetric key and the fact it's for
| example.com. AEAD mode means the result is either "OK" and
| the private key, which it can then use to sign the "I'm
| (still) me" proof for WebAuthn and sign you in, or "Bzzt,
| wrong" with no further details, and that authenticator tells
| the browser it didn't match, so it must be some other
| authenticator.
|
| This means that if you're actually at example.org instead of
| example.com, the AEAD decryption fails and your authenticator
| doesn't even know _why_ it didn't work; as far as it knows,
| maybe you forgot to plug in the right authenticator. Not only
| do you not send valid credentials for example.com to the wrong
| site, your devices don't even know what the valid credentials
| _are_, because they can't decrypt the message unless it's for
| the correct site.
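|
| (A toy sketch of that wrap/unwrap step, using WebCrypto
| AES-GCM. The names and the iv-plus-ciphertext layout are
| purely illustrative - real authenticators do this inside
| their own hardware, each in their own way.)
|
|     // deviceKey stands in for the authenticator's internal
|     // AES key; rpId is e.g. "example.com".
|     async function wrapKey(
|       deviceKey: CryptoKey, rpId: string, privKey: Uint8Array,
|     ): Promise<Uint8Array> {
|       const iv = crypto.getRandomValues(new Uint8Array(12));
|       const aad = new TextEncoder().encode(rpId);
|       const ct = await crypto.subtle.encrypt(
|         { name: "AES-GCM", iv, additionalData: aad },
|         deviceKey, privKey);
|       // The "credential ID" the site stores is just this blob.
|       return new Uint8Array([...iv, ...new Uint8Array(ct)]);
|     }
|
|     async function unwrapKey(
|       deviceKey: CryptoKey, rpId: string, credId: Uint8Array,
|     ): Promise<Uint8Array | null> {
|       const aad = new TextEncoder().encode(rpId);
|       try {
|         const pt = await crypto.subtle.decrypt(
|           { name: "AES-GCM", iv: credId.slice(0, 12),
|             additionalData: aad },
|           deviceKey, credId.slice(12));
|         return new Uint8Array(pt); // right device, right origin
|       } catch {
|         return null; // wrong device or wrong origin: "Bzzt"
|       }
|     }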
| EGreg wrote:
| Can I use the webauthn API to have the user confirm arbitrary
| actions or only authenticate? Like, what if I send a different
| challenge every time?
| adhesive_wombat wrote:
| Since Teams can't even be bothered to get Yubikeys working on
| Ubuntu, I'll believe Microsoft when I see it.
| hitovst wrote:
| Can't say we weren't warned.
| alberth wrote:
| Dumb question: why are biometrics being used to replace the
| _password_ , shouldn't the biometric replace the _username_?
| somethingAlex wrote:
| Fundamentally, if you want to support multiple, unlinked
| accounts per person you'll still need some sort of "account
| designator."
|
| If you don't then the biometric marker can just replace both
| password and username. The reason why the username exists for
| the password is because it's problematic to guarantee
| uniqueness of passwords across your users. One is unique and
| public and the other is not and private.
| moritonal wrote:
| Stunning question really. I imagine (absolute guess) it's
| because biometrics provide a 1-100% likelihood of a match, not
| a unique ID?
| taeric wrote:
| I think, at the end of the day, there really isn't much of a
| difference between the two, for authentication. One is just a
| public part of who you are, and remains fairly static.
|
| An argument for keeping the username separate is it is often
| used for identification. That is, you identify on this site as
| alberth. Not as any biometric scan. Even if you change which
| finger you want to use to authenticate, you'd still be alberth
| to everyone else here.
|
| I think there are arguments in favor of letting you change a
| display name. You'd probably still keep a name that is static.
| (What Twitter does?)
| 0daystock wrote:
| I believe it's because people generally find the idea to be
| comfortable and familiar based on fictional representations in
| movies, etc. I'm of the opinion biometric information is
| totally private, yet easily spoof-able, and thus should only
| be used to identify - not authorize - me.
| hansel_der wrote:
| it's not a dumb question, but that will not stop anyone,
| because biometric authentication works very well in practice
| (convenient and foolproof) despite not being very secure.
|
| e.g. lockpicking how-tos and the existence of glass cutters
| don't dissuade people from having a locked front door.
| md_ wrote:
| The biometrics don't authenticate you to the remote service.
| They authenticate you to the device that has the keys that
| authenticate you to the remote service.
|
| Biometrics are a convenient replacement for a screen lock
| pattern/PIN, but not a necessary one, of course.
|
| https://www.wired.com/story/fido-alliance-ios-android-passwo...
| is a good explainer.
| _trackno5 wrote:
| No, it shouldn't. If you wanted to associate various devices
| (phone, laptop, etc.) with the same account, it wouldn't work:
| the fingerprint produced by each device is different.
|
| You associate biometric credentials with a username for that.
| sph wrote:
| Sounds to me like we're replacing the username and the
| password, i.e. _something you know_, with the username and
| your phone, i.e. _something you have_.
|
| It sounds like it's still a one-factor authentication system,
| just a different one.
| aaaaaaaaata wrote:
| The "one" factor is a zk proof.
| avianlyric wrote:
| The expectation is that you also have "something you are", as
| provided by your device's biometric authentication.
|
| The standard allows a service to demand that the
| authenticating device perform an additional factor of
| authentication, usually either a PIN or biometrics, and your
| device attests to having done so during authentication.
|
| So then you have two complete factors: "something you have"
| (your phone) and "something you are" (biometrics) or
| "something you know" (the unlock PIN for the device).
| sph wrote:
| But I'm responding to the GP, who said biometrics are a
| username, not a secret, which I agree with. I'm not sure
| _something you are_ counts as a security factor.
| avianlyric wrote:
| > But I'm responding to the GP, who said biometrics are a
| username, not a secret, which I agree with.
|
| If you were sending an actual copy of your biometric data
| to the remote authentication service, then maybe you
| could make that argument.
|
| But that never happens, no FIDO biometric device sends a
| biometric fingerprint that could be reproduced by a
| different device. The device authenticates you with
| biometrics, then uses that data to unlock a private key,
| which is then used to answer a challenge-response request
| from the authenticating service.
|
| If you don't have the device, then it's pretty much
| impossible for you to correctly answer that
| challenge-response, despite being in possession of the
| biometric features that the device would use to authenticate
| you.
|
| So you can't use your biometrics as a username, because the
| device measuring the biometric data pushes that data through
| a one-way, randomly generated (at device manufacture) hash
| function that exists within that device only. Take your
| biometrics elsewhere (i.e. the same device type/model but a
| different physical object) and you'll get a different output
| even with identical inputs - which would make for a pretty
| useless username.
|
| > I'm not sure _something you are_ counts as a security
| factor.
|
| You should take that up with NIST then:
| https://www.nist.gov/itl/smallbusinesscyber/guidance-
| topic/m...
| Spivak wrote:
| Something you are is an authentication factor that doesn't
| need to be secret to be secure. That's the whole point.
| You can have a high-res 3D model of my finger, but you
| can't create a human with my fingerprint.
|
| In the same way that the security of something you know
| is a scale based on "how difficult is your password to
| guess" or "how hard is it to crack the hash" the security
| of something you are is a scale based on "how difficult
| is it for someone to create a fake that tricks this
| specific machine into thinking it's reading metrics from
| a live human."
|
| The security lives in the system reading the metrics not
| your body which is why you don't have to rotate your face
| every 90 days.
|
| A cheap fingerprint reader is the 4 digit pin of
| something you are. Retina scans that take temperature,
| look for blood flow and eye movement are the correct
| horse battery staple.
| alberth wrote:
| Follow-up dumb questions:
|
| - so what happens if you don't have your phone at time of
| login?
|
| - if I enroll on iPhone, is my identity forever tied to Apple
| or can it be migrated to Android if I ever wanted to change
| platforms?
|
| - Can Apple/Google/Microsoft ever block/ban my account,
| preventing me from logging into my bank, etc that use FIDO
| login?
| Hamuko wrote:
| > _if I enroll on iPhone, is my identity forever tied to
| Apple or can it be migrated to Android if I ever wanted to
| change platforms?_
|
| A good FIDO implementation will give you the ability to
| enroll multiple authenticators. In fact, if you can't,
| you're basically going against the WebAuthn spec.
|
| _" Relying Parties SHOULD allow and encourage users to
| register multiple credentials to the same account. Relying
| Parties SHOULD make use of the excludeCredentials and
| user.id options to ensure that these different credentials
| are bound to different authenticators."_
|
| Basically, you should enroll your iPhone and a backup key.
| And if you get an Android device, you log in with the
| Android device using a backup key, and enroll the Android
| device and remove the iPhone. Alternatively, you remove the
| iPhone authentication using the iPhone, and enroll the
| Android device using an alternative authentication method
| (like traditional username/password).
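|
| (A hedged sketch of that registration call in the browser; the
| names and values are placeholders - a real relying party sends
| the challenge, user.id and the already-registered credential
| IDs from its server:)
|
|     async function registerBackupKey(
|       userId: Uint8Array, existingIds: BufferSource[],
|     ) {
|       return navigator.credentials.create({
|         publicKey: {
|           rp: { id: "example.com", name: "Example" },
|           // user.id: a stable, opaque handle for the account
|           user: { id: userId, name: "alice",
|                   displayName: "Alice" },
|           challenge: crypto.getRandomValues(new Uint8Array(32)),
|           pubKeyCredParams: [
|             { type: "public-key", alg: -7 }, // ES256
|           ],
|           // Refuse authenticators that are already enrolled,
|           // so the new registration really is a different key.
|           excludeCredentials: existingIds.map((id) => ({
|             type: "public-key" as const, id,
|           })),
|         },
|       });
|     }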
| avianlyric wrote:
| > - so what happens if you don't have your phone at time of
| login?
|
| You can't login. Same as it is with any 2FA system where
| you don't have access to the second factor.
|
| > - if I enroll on iPhone, is my identity forever tied to
| Apple or can it be migrated to Android if I ever wanted to
| change platforms?
|
| At a minimum, services should support multiple
| authentication devices/tokens, so you can enrol both an iOS
| device and an Android device, or any other FIDO device, e.g.
| a YubiKey.
|
| This is already the standard approach for FIDO tokens, and
| basically a requirement for existing services, because we
| don't currently have FIDO token syncing.
|
| I would hope that these syncing services will also allow
| you to export your private key. But that's a slightly scary
| prospect because it would allow the holder of that key to
| authenticate as you anywhere.
|
| > - Can Apple/Google/Microsoft ever block/ban my account,
| preventing me from logging into my bank, etc that use FIDO
| login?
|
| Services will still need a credential recovery process;
| people lose phones every day. I imagine your bank will
| happily reset your credentials if you turn up in person
| holding government identification.
| dane-pgp wrote:
| > Can Apple/Google/Microsoft ever block/ban my account,
| preventing me from logging into my bank, etc
|
| If you don't accept their 10,000-word ever-changing terms
| of use, and if you don't let them check for the marks on your
| forehead or right hand, then yes, you won't be able to buy
| or sell.
| anotheracctfo wrote:
| Same thing that happened when my work required 2FA for
| checking email, I simply stopped checking email on my
| personal phone.
|
| It's not like InfoSec cares if the business functions;
| that's not their job.
| maxfurman wrote:
| If you don't have your phone, you can't log in. SMS 2FA has
| the same problem.
|
| You technically should be able to migrate from one provider
| to another, it remains to be seen how easy Apple and Google
| will make the process.
|
| That last one is a great question that I don't know the
| answer to.
| judge2020 wrote:
| > You technically should be able to migrate from one
| provider to another, it remains to be seen how easy Apple
| and Google will make the process.
|
| On a UX level, the transfer to another syncing security
| key "provider" is going to be interesting, if they even
| do that at all - I kind of doubt they'll have a "transfer
| your iCloud passkeys to your Chrome password manager" and
| they'll instead say "go to each service and enroll a new
| security key via your new syncing key manager". On a
| technical level, I wholly imagine there'll be a tool that
| pulls iCloud Passkeys[0] via the MacOS Keychain
| application and then inserts them into your new key
| manager.
|
| 0: https://developer.apple.com/documentation/authenticati
| onserv...
| judge2020 wrote:
| > - so what happens if you don't have your phone at time of
| login?
|
| Depends on if they allow you to turn off password+2FA login
| entirely, which I only see being possible with something
| like Advanced Protection Program[0] which can already be
| used to enforce "Only allow authentication with my
| password+security keys; there is no way for Google Support
| to remove 2fa; if I lose the keys, the account's lost".
|
| > - if I enroll on iPhone, is my identity forever tied to
| Apple or can it be migrated to Android if I ever wanted to
| change platforms?
|
| I imagine they'll say "login to each website" (which you
| can do via iOS if you use qr android login[1]) then "re-
| enroll with your new provider", but I hope there will be an
| actual export/import or migration experience.
|
| > - Can Apple/Google/Microsoft ever block/ban my account,
| preventing me from logging into my bank, etc that use FIDO
| login?
|
| Assuming they don't change how Chrome and iCloud keychain
| currently works, everything synced should stay on your
| already signed-in devices, so hopefully you can continue to
| use your devices as authenticators until you can log into
| each service and register a regular, hardware key for sign-
| in.
|
| 0: https://landing.google.com/advancedprotection/
|
| 1: https://www.chromestory.com/2021/12/qr-
| code-2fa/#:~:text=Her... I personally tried this with my
| iPhone, and my phone prompted me to use an iCloud Passkey.
| I was able to confirm that, after enrolling my iPhone as a
| security key on GitHub, this 'BLE WebAuthn' feature allowed
| me to sign in to GitHub in my desktop Chrome browser via my
| phone. The only downside is that the desktop must have a
| Bluetooth card, but hopefully motherboards will continue to
| come integrated with WiFi+Bluetooth.
| nyuszika7h wrote:
| > - Can Apple/Google/Microsoft ever block/ban my account,
| preventing me from logging into my bank, etc that use FIDO
| login?
|
| You don't need an Apple/Google/Microsoft account to use
| WebAuthn on another website, it's based on the biometrics
| on your local device. Syncing that credential across your
| devices with the same account is just an optional extra
| feature.
| epistasis wrote:
| Without making any explicit argument for it, what I see coming
| out of Fido and U2F are really changing the importance of the
| long-standing "something you have, something you know..."
| mindset around security. That prior mode was not helping us
| design systems that take the human capabilities of the user
| into account.
|
| Prior security seemed to focus entirely on attackers, and their
| agency, and what they could potentially do. But we also need to
| pay attention to what users can do in order to build a secure
| system. Requiring users to read domain name strings,
| potentially in Unicode, every time, and make sure they are
| correct, to prevent phishing, is a really bad design. Instead,
| have the website authenticate themselves to the user,
| automatically, and have the machine do the string comparisons.
|
| Similarly, the distinction between user and password for a
| biometric doesn't make much sense in this case. It's neither.
| The user is identified by a different mechanism, the biometric
| is merely a way for the device to see that the user is there.
|
| There are always lots of attack modes for biometrics, but they
| are convenient and good enough to cover nearly all common and
| practical attack modes. And a huge problem of the 90s and 2000s
| security thinking was focusing on the wrong attack modes for
| the internet age.
| avianlyric wrote:
| > Without making any explicit argument for it, what I see
| coming out of Fido and U2F are really changing the importance
| of the long-standing "something you have, something you
| know..." mindset around security. That prior mode was not
| helping us design system that take human capabilities of the
| user into account.
|
| I don't think that's quite true. It's a continuation of the
| old "something you know", "something you have" and "something
| you are" authentication factors, and the idea that at least
| two factors should be used to authenticate.
|
| The username/password approach is only a single factor,
| "something you know".
|
| Common 2FA solutions use "something you know" (you're
| password) and "something you have" (a device proven via OTP
| or SMS).
|
| FIDO with biometrics trades all that for 2FA driven by
| "something you are" (biometrics) and "something you have"
| (you're devices Secure Enclave).
|
| You don't send you biometrics to the service your
| authenticating with. Rather you're using your biometrics to
| prove "something you are" to your device, which your device
| then mixes with a private key which proves you're in
| possession of a known device. All of that is then used to
| authenticate with your service.
|
| In order to enable a cloud-synced private key, you need the
| syncing process to require 2FA to enable new devices. The 2FA
| process can be clunky and slow, because you only need to do it
| once per device enrolment. Indeed, it needs to be clunkier,
| because you don't have a biometric factor available for use,
| as the enrolment process is normally used to onboard both a
| device _and_ a device-specific biometric factor.
|
| After that your device becomes a known authenticated device,
| which can be used as a "something you have" factor for
| authentication.
|
| All of this isn't a change from long standing authentication
| strategy. It's just a refinement of process to make the
| underlying authentication strategy user friendly.
| danuker wrote:
| > you're password
|
| Pardon my pedantry, but you should only use the apostrophe
| (') to show you are joining two words.
|
| In this case, the words are "you" and "are", merging into
| "you're". "you are password" is what I read.
| kozd wrote:
| Are you a bot?
|
| > "but you should only use the apostrophe (') to show you
| are joining two words. In this case, the words are "you"
| and "are", merging into "you're"."
|
| Is obvious, unnecessary and condescending.
| epistasis wrote:
| You make very good points! The user-focused design work of
| the FIDO group feels like a large departure from traditional
| designs, but it need not be viewed that way in terms of those
| elements.
| vlozko wrote:
| A username (or email, same thing really) is required because
| there needs to be an identity to match a source of auth to.
| postalrat wrote:
| Why?
| snowwrestler wrote:
| The short answer is that you can lose a biometric but still be
| you. So a biometric is not a username.
|
| Also, the biometric does not actually replace a service
| password in this instance, it just helps authenticate you
| locally to a device. The key on the device is what is actually
| replacing your password.
|
| Depending on the device or settings you choose, you don't need
| to use biometrics at all if you don't want to.
| CyberRage wrote:
| From a theoretical point of view or practical?
|
| Username is simply an ID. Password is how we truly verify who
| the user is.
|
| Bio-metrics are just convenient because they are unique and
| hard\impossible to replicate.
| alberth wrote:
| > Bio-metrics are just convenient because they are unique and
| hard\impossible to replicate.
|
| But if your biometric is able to be faked, you can't change
| it like you can change a typical text based password. There's
| no "reset your password" equivalent for biometrics.
| CyberRage wrote:
| Oh gosh... your raw bio-metrics are never stored
| anywhere...
|
| The signal from the sensor is used as a "seed" to generate a
| key using robust cryptography.
|
| Different sensors will output different "data" based on the
| sensor type.
| imoverclocked wrote:
| > your raw bio-metrics are never stored anywhere...
|
| Unless you have a driver's license in California, where
| they require inked versions of your biometrics.
| CyberRage wrote:
| That's governments for you (btw, not only CA but other
| places as well). I would definitely be more worried about
| that than about my biometrics on my phone.
| deelowe wrote:
| Let's ignore the part about biometrics being faked since
| this seems to be a point of contention.
|
| Isn't it a fair argument that secret keys should be
| mutable by the user? In the future, some unforeseen event
| COULD occur which compromises or otherwise renders the
| particular biometric unusable. Now what?
| CyberRage wrote:
| But they are... Firstly, that's how it works: even if you
| use the same finger to generate hundreds of keys, they
| should all be different, because we are using
| noise\randomness within the algorithm itself. Different
| sensors will generate different outputs, and therefore it
| is pointless to worry about the key being stolen.
|
| I think what you want is secret keys completely detached
| from the user. We have that as well, with hardware tokens.
| stjohnswarts wrote:
| Once they have a way to fake your biometric, though, they
| have it forever; that's the point. With a password you have
| a way to provide a key known only to you, and while it can
| be faked, it can also be reset. You can't reset your
| fingerprint without surgery.
| CyberRage wrote:
| I don't get the point... If someone steals your
| fingerprint, he stole your fingerprint.
|
| As I explained, you can't get the fingerprint from the
| device\key; it is simply not there.
|
| It isn't the fault of the implementation\technology
| if someone stole your fingerprint; it didn't lead to your
| biometrics being compromised.
|
| What's easier to do? stealing someone's fingerprint or
| cracking\guessing their password.
|
| Definitely the latter.
| nybble41 wrote:
| > What's easier to do? stealing someone's fingerprint or
| cracking\guessing their password.
|
| > Definitely the latter.
|
| You sure about that? A properly generated (i.e. random)
| password won't be cracked or guessed in any reasonable
| amount of time, whereas a model of your fingerprint(s)
| can be lifted from any object you've touched and used to
| create a silicone mold capable of fooling many
| fingerprint readers. And you only have 10 of them at
| best; once all your fingerprints are known to potential
| attackers that's it; you can't use fingerprint
| authentication any more for the rest of your life.
| [deleted]
| CyberRage wrote:
| Let me follow up and ask: why do people go nuts over
| biometrics?
|
| Password-style biometric authentication is the last place I
| would look for biometric compromise.
|
| We leave biometric traces everywhere, all the time. Do
| you cover your face and wear gloves in public? Hmmmm...
| hansel_der wrote:
| > Oh gosh... your raw bio-metrics are never stored
| anywhere...
|
| right, who would do that... i mean for what purpose...
| CyberRage wrote:
| I mean you don't have to give it away if you think Google
| is storing databases of fingerprints for the lizard
| masters to track you down.
|
| FIDO simply wants to make authentication stronger. You
| can use hardware keys that have a key burnt into them
| which is unique and much harder to brute-force than
| passwords.
|
| Again, according to how biometrics are described in
| whitepapers\industry, we extract features from the
| fingerprint\face - sometimes very little compared to the
| actual biometric - and use them to derive a key. That key
| cannot be reversed to get the original features, and
| different algorithms use different features.
| dane-pgp wrote:
| > that key cannot be reversed to get the original
| features
|
| "As a result, the early common belief among the
| biometrics community of templates irreversibility has
| been proven wrong. It is now an accepted fact that it is
| possible to reconstruct from an unprotected template a
| synthetic sample that matches the bona fide one."
|
| -- Reversing the irreversible: A survey on inverse
| biometrics
|
| https://www.sciencedirect.com/science/article/pii/S016740
| 481...
| stjohnswarts wrote:
| they aren't impossible to replicate tho
| CyberRage wrote:
| Well, it depends on how you define replicate. I'm not aware
| of a technology that can perfectly recreate someone's
| face\fingerprint.
|
| A photo\mask isn't perfect, and in some instances they
| actually fail to work against sensors because of that.
|
| It is more a question of how robust the authentication
| method is. (Can a photo\mask fool it? That can happen
| sometimes, but it usually requires a pretty high-quality
| sample.)
| aftbit wrote:
| Are there any FIDO security keys that explicitly support backing
| up and restoring their master secrets? I would love to move on
| from Username + Password + TOTP, but my current workflow
| requires that I be able to regain access to my digital
| accounts using nothing but a few-page paper backup including
| core service passwords & exported TOTP secrets.
| eikenberry wrote:
| Yes there are. FIDO specifies different authentication levels
| and level 1 allows access to the master keys (it allows for
| pure software implementations). I think this page gives a
| decent overview of the levels...
| https://fidoalliance.org/certification/authenticator-certifi...
| dwaite wrote:
| The vendors here are proposing a platform synchronization
| method such that these are both backed up as well as shared
| across devices within a particular platform account.
|
| There likely is a hardware key that supports export and import
| of keys (even if that winds up being a fork of say the Solo key
| firmware). However, as an end-user one doesn't want to
| accidentally forget to export keys for a while, nor do they
| want to worry about how to properly secure a backup. So, you
| likely would want additional infrastructure such as vendor
| software which would do this for you on a schedule.
|
| There are interesting models which could work here, such as a
| factory-paired 'set' of keys being sold in the same package,
| where only the second key (the one you kept in your fire safe)
| has the necessary keys to decrypt and load such a backup.
|
| The question is whether a security manufacturer would be
| interested in this, as the presence of such a mechanism may
| prevent them from getting certain security certifications and
| being able to sell/be used in certain markets and scenarios.
| snarf21 wrote:
| I wish FIDO was built into the phones (enclave) requiring a
| biometric and passcode. For 99% of users this would be superior
| to email/password and get rid of a lot of hacks/phishing. It
| doesn't require extra hardware to buy and simply requires a
| minor protocol update to have the challenge on a laptop/desktop
| show as a QR-code (or could be sent via BT). The mobile sends
| the response out of band to a destination set at creation.
|
| For users with a greater threat model (worry about enclave
| being hacked), they can use physical FIDO keys.
| amf12 wrote:
| > I wish FIDO was built into the phones (enclave) requiring a
| biometric and passcode
|
| Pixel 6 supports this (Titan Security Key built-in), but it
| only works with Google accounts, I think. I hope more phones
| support this.
| tialaramex wrote:
| Not just Google accounts, most Pixel phones (I think I have
| a Pixel 2 here) do WebAuthn. I use it for GitHub
| (occasionally) and my private pastebin setup which is
| WebAuthn protected for ease of use - and I could use it for
| Facebook (but I never book faces on my phone) and other
| services.
|
| One bonus feature does need the Google account. If you're
| signing into say banana.example with WebAuthn, using Chrome
| on a PC, and Chrome can't see any FIDO authenticators
| plugged in, it will try your phone! It asks Google, hey,
| does this user have a phone (Chrome is signed in to your
| Google account)? Google says yeah, they have "amf12's
| Pixel 6". The Chrome browser uses Bluetooth on the PC to
| say "Hey, amf12's Pixel 6 are you out there? Can you do
| WebAuthn?" if your phone hears the Bluetooth message it's
| like "Hi, amf12's Pixel 6 here. Standing ready to do
| WebAuthn" and then _via the Google servers_ it 's arranged
| that your login attempt on Chrome, on the PC, is proxied to
| the phone, where it appears on screen ("Do you want to log
| in to banana.example as amf12?") and you can OK that from
| the phone. Nice workflow actually, although the technology
| is remarkably complicated.
| X-Istence wrote:
| That is exactly what Safari supports. Safari supports
| TouchID, FaceID (on iOS), and also supports storing data in a
| remote device with a QR code.
| dwaite wrote:
| The proposal here is using iCloud Keychain, leveraging the
| secure enclave. The only catch (for some security-minded
| folks, a big one) is that iCloud Keychain acts similarly to a
| resizable HSM cluster.
| snarf21 wrote:
| Let me be more specific, this should work in apps not just
| the browser and should work with my logging into my laptop in
| the browser and leveraging my phone as a FIDO "key".
| dwaite wrote:
| > This should work in apps not just the browser
|
| Apple, Google and Microsoft all have native API variants of
| the Web Authentication API. These typically use
| entitlements requiring authorization back to a website.
| This means e.g. a Twitter application could leverage the
| same authenticator registrations as twitter.com, leveraging
| both platform and roaming security keys.
|
| >... and should work with my logging into my laptop in the
| browser and leveraging my phone as a FIDO "key".
|
| The press release details this commitment; for instance, I
| can use an Android phone to log into a website on my Mac.
| An example of such an option should be visible on all
| shipping desktop Chrome browsers if you do a Web
| Authentication registration or authentication request (I
| believe unfortunately currently titled something like 'Add
| an Android phone'). On the Apple side, being able to
| leverage this is currently sitting behind a developer
| feature toggle.
|
| One can hope that this will be extended to say the Windows
| platform itself - at that time, I would expect to be able to
| use my iPhone or Android phone to log into any Windows
| machine on an AAD-backed domain.
| giaour wrote:
| This should already be supported on most phones:
| https://webauthn.me/browser-support
| lrvick wrote:
| Most iOS and Android devices have support for WebAuthn right
| now out of the box. Go give it a try.
| vngzs wrote:
| With a sufficiently programmable hardware key, yes, you can
| back up the secrets. See an enumeration of methods in [0]. Be
| careful if you plan on doing this; make sure the tradeoffs make
| sense to you. You probably want to do the programming from an
| airgapped, trustworthy Linux machine.
|
| Beware that if you do this and lose your primary key, or if it
| is stolen, then an attacker can impersonate you. Setting up
| multiple unique keys is probably more useful in general, even
| if it's more cumbersome.
|
| [0]: https://dmitryfrank.com/articles/backup_u2f_token
| [deleted]
| [deleted]
| rdl wrote:
| Ideally there would be a way to create "tickets" or something
| from an authenticator in advance and then use them for
| registration without physical access to the device. Then I
| could have 100 tickets from my backup on my master, keep the
| physical backup in a secure offsite location, and enroll new
| services using master + backup-tickets. When I run out of
| tickets, generate 100 more.
|
| Being able to export/back up/restore master secrets would be
| nice too.
| tadfisher wrote:
| This sounds suspiciously like PGP subkeys. Having not read
| into how FIDO works, I'm going to now assume it works by
| supplying a "public key" to a third party, and the third
| party authenticates by having you encrypt a nonce with a
| private key. How far off am I?
| est31 wrote:
| FIDO involves creating a new public/private key pair for
| each website, to prevent cross website tracking. The keys
| are derived from a secret stored on the device, so the
| device doesn't need to store anything but that secret,
| which enables it to be used with a limitless number of
| websites.
|
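| (A rough sketch of that per-site derivation idea - purely
| illustrative, with made-up names; real authenticators use
| their own schemes:)
|
|     // Derive a per-site seed from one device secret plus the
|     // relying party ID, so each site sees an unrelated key.
|     async function deriveSiteSeed(
|       deviceSecret: Uint8Array, rpId: string,
|     ): Promise<ArrayBuffer> {
|       const ikm = await crypto.subtle.importKey(
|         "raw", deviceSecret, "HKDF", false, ["deriveBits"]);
|       return crypto.subtle.deriveBits(
|         { name: "HKDF", hash: "SHA-256",
|           salt: new Uint8Array(32),
|           info: new TextEncoder().encode(rpId) },
|         ikm, 256); // 32 bytes to seed the site's key pair
|     }
|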
| Edit: there seems to have been a paper that studied the
| very question of "can you create keys asynchronously so
| that you can later use them with a backup key":
| https://eprint.iacr.org/2020/1004.pdf
| https://www.youtube.com/watch?v=urJ2DhpLAEk
|
| I still need to read the paper, however, to see how feasible
| an implementation of this is, and how much buy-in it needs
| from website and browser vendors.
| lxgr wrote:
| Cryptographically speaking it's signing a challenge, not
| encrypting a value (which would be a public key operation),
| but generally speaking yes, that's the idea of it!
|
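| (Roughly, and glossing over the fact that WebAuthn really
| signs the authenticator data plus a hash of the client data
| that contains the challenge - an illustrative sketch:)
|
|     // Authenticator side: prove possession of the private key
|     // by signing the server-chosen challenge material.
|     async function signAssertion(
|       priv: CryptoKey, data: Uint8Array,
|     ): Promise<ArrayBuffer> {
|       return crypto.subtle.sign(
|         { name: "ECDSA", hash: "SHA-256" }, priv, data);
|     }
|
|     // Server side: check the signature with the public key it
|     // stored at enrolment. No secret ever crosses the wire.
|     async function verifyAssertion(
|       pub: CryptoKey, sig: ArrayBuffer, data: Uint8Array,
|     ): Promise<boolean> {
|       return crypto.subtle.verify(
|         { name: "ECDSA", hash: "SHA-256" }, pub, sig, data);
|     }
|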
| One of the things FIDO adds beyond a protocol for "plain"
| hardware-generated and stored keys is the idea of
| attestation, i.e. authenticators being able to express
| statements like "keys can never leave this authenticator"
| or "this key requires PIN or fingerprint verification" -
| all assuming you do trust the manufacturer.
| tialaramex wrote:
| However, you _should not_ require attestation for public
| services. If you let Jim sign up with his dog's name as
| a password, but then refuse to let Sarah sign in because
| her FIDO device wouldn't provide "attestation" you're
| crazy.
|
| Attestation _probably_ isn't the correct choice for
| almost anybody, but the cases where it could at least
| _make sense_ are if you're an employer checking employee
| authenticators: if you gave everybody a Yubico Fantastic
| XXV authenticator, you might decide it makes sense to
| verify the credentials enrolled are from Yubico Fantastic
| XXV authenticators. But still probably not. On the whole,
| everywhere I see UIs for managing attestation it makes me
| a little bit sad, because it's an attractive nuisance.
| Azure AD for example does this.
| lxgr wrote:
| Sure, don't require attestation for services where 2FA is
| completely optional.
|
| But for sensitive systems/services, why not make use of
| the advanced capabilities that hardware authenticators
| offer? I'm using one in a corporate environment, and in
| my view it makes a lot of sense.
|
| I'd also not be upset if my bank would let me bypass the
| mandatory "account restricted, call customer support"
| dance every time I initiate a transfer over $10 between
| my own accounts using only a "trusted" authenticator
| brand... And banks typically care a lot about security
| properties like "this authenticator does _not_ allow
| extracting private keys ".
| lrvick wrote:
| Existing devices with BIP39 random seed backup support have
| you covered here with far less complexity.
| pat2man wrote:
| Ledger (https://www.ledger.com) supports FIDO and lets you do
| backups. You really need a screen to do it correctly, otherwise
| there is little point in having an external device.
| lxgr wrote:
| This might be true for cryptocurrency transaction initiation,
| but in the WebAuthN model, what's the benefit of having a
| screen?
|
| The result of a WebAuthN challenge procedure is almost always
| a session cookie (TLS channel binding if you're really
| fancy), so the only thing that an authenticator could display
| on your screen is "do you want to authenticate as user x to
| website y", which arguably does not add that much value.
| nybble41 wrote:
| > ... so the only thing that an authenticator could display
| on your screen is "do you want to authenticate as user x to
| website y" ...
|
| That is exactly why you want it.
|
| Consider, for a moment, that you have a key which is used
| to log in to your bank account and some other, much less
| critical site. Perhaps a GitHub account where you store
| some hobby projects.
|
| Without an unforgeable indication on the authenticator to
| show what you're logging in to, malware can wait until
| you're logging in to the second site, and thus expecting a
| prompt to use the authenticator, but actually trigger the
| authentication process for your bank off-screen. You tap
| the button or whatever on the authenticator thinking that
| you're logging in to your GitHub account but actually all
| your money is being siphoned off to who-knows-where.
|
| A key that signs whatever request is presented to it
| without any indication to the user of what the request
| actually _was_ is _dangerous_.
| lxgr wrote:
| If you have malware on your computer (that can compromise
| the browser), it can just wait until you actually log in
| to your bank and then grab the session cookie/proxy away
| your authentication.
|
| It's a different story if the operation you are
| confirming with a security key actually can be rendered
| on the display, e.g. "pay $100 to someshop.com" (as in
| SPC [1]). In that scenario, there is actually nothing to
| steal except for the signed message itself, which would
| be useless to anybody that's not someshop.com, but given
| that WebAuthN almost always just yields a session cookie,
| I don't really see the benefit.
|
| [1] https://www.w3.org/TR/secure-payment-confirmation/
| nybble41 wrote:
| > If you have malware on your computer (that can
| compromise the browser), it can just wait until you
| actually log in to your bank and then grab the session
| cookie/proxy away your authentication.
|
| Sure, but you might never log in to your bank from this
| particular computer precisely because you _don 't_ trust
| it. But you think it's fine to log in to your hobby
| account since that doesn't store anything you really
| consider important.
|
| If you assume there is never any malware on the host then
| you don't need the key at all--the host can store the
| secrets and handle the authentication on its own.
| lxgr wrote:
| Oh, that's a good point - I personally never use my
| security key at untrusted computers, but I guess this
| could be a somewhat common use case.
|
| > If you assume there is never any malware on the host
| then you don't need the key at all--the host can store
| the secrets and handle the authentication on its own.
|
| True, a permanently plugged in authenticator is largely
| equivalent to just using a password manager (which also
| protects against skimming, if used exclusively via
| autofill, never via copy/paste), but unlike a password
| manager, it makes unsafe actions explicitly impossible
| for non-sophisticated users. I'd consider this a strong
| advantage.
|
| It also survives OS reinstalls, ransomware etc.
| tialaramex wrote:
| The comment you're replying to is about _cloning_ WebAuthn
| credentials. This is a delicate operation, because you are
| effectively cloning your entire identity. So yeah, a screen
| seems reasonable for that purpose.
| diggernet wrote:
| TacticalCoder has previously commented here
| (https://news.ycombinator.com/item?id=26844019) about the
| ability to set a specific seed on Ledger devices, so you can
| make a backup key that behaves identically to your main key.
| gazby wrote:
| Just so you don't feel alone with the replies being of the
| typical variety, I'm 100% with you. The flaws in the "backup
| token" approach are rehashed constantly but the world keeps
| turning as though they're irrelevant.
|
| I look forward to hardware tokens reaching a popularity level
| where we see implementations in software and this conversation
| can be rendered moot.
|
| Shout out to Mozilla and Dan Stiner for their work so far.
| zozbot234 wrote:
| Software ("virtual") implementations are already possible in
| WebAuthn. It's up to the service whether to allow enrollment
| via a software authenticator; most services will want to
| allow this, seeing as it's still way more secure than
| ordinary username/password.
| throw1138 wrote:
| Are there any existing implementations I should be aware
| of?
| md_ wrote:
| https://github.com/danstiner/rust-u2f
| https://github.com/google/OpenSK
| sodality2 wrote:
| OpenSK is mainly intended for flashing onto a device like
| a nordic dongle.
| throwaway52022 wrote:
| For web apps/services, the browser needs to be involved
| here too, right? (And maybe the OS?) How can I tell Chrome
| on my desktop to use my "software token" instead of Chrome
| looking for a hardware token over USB or finding it via
| NFC, so the remote service can ultimately interact with my
| (virtual) token?
|
| (I don't even want to think about how to tell Mobile Safari
| on my iPhone how to find my key)
|
| EDIT: My ideal setup, I think, is an app on my phone that I
| can use as my token - somehow signaling to my
| desktop/laptop that it's nearby and can be used as a token
| and ideally popping up a notice on the phone lock screen
| when there's an authentication request so I can quickly get
| to it. Then in my app, I'm free to export and backup my
| keys for all of the sites I'm enrolled with as I see fit. I
| know, I know, maybe being able to export the keys makes the
| setup less secure, but I will trust myself not to
| accidentally give the backup to a phishing site. (And I do
| worry that I'll accidentally get phished using a TOTP app,
| so I'd like to switch to FIDO, but I don't want the pain of
| multiple keys)
| mjevans wrote:
| I do NOT want to use my phone. It cannot be considered to
| be a secure device given the 'network' baseband control
| chipset will never be owned by the phone's buyer and has
| full access to the device.
| tpush wrote:
| The baseband CPU doesn't have full access on any decent
| phone.
| lxgr wrote:
| Storing your keys in secure hardware on a phone is almost
| certainly more secure than storing a key in software on
| your desktop hard drive.
|
| If you don't trust your hardware, it's almost always game
| over. Desktops have devices running dubious firmware as
| well, but at least with a hardware key store, the window
| of compromise ends at the time you update - a stolen key
| stays stolen forever.
| lxgr wrote:
| A much more secure way of doing this is to use the
| platform's/OS's most secure way of storing private keys,
| which in many cases is hardware (Secure Enclave on iOS,
| TrustZone or "real" secure hardware like Titan M on
| Android, TPM on Windows/Linux).
|
| This is already supported by many browsers (unfortunately
| Mozilla/Firefox are dragging their feet on this one [1])
| and gives you exactly the user experience you want.
|
| [1] https://bugzilla.mozilla.org/show_bug.cgi?id=1536482
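|
| (A small, hedged example of how a site can check for such a
| platform authenticator before offering it:)
|
|     // Ask the browser whether a user-verifying platform
|     // authenticator (Touch ID, Windows Hello, Android
|     // biometrics, ...) is available on this device.
|     async function canUsePlatformAuth(): Promise<boolean> {
|       if (typeof PublicKeyCredential === "undefined") {
|         return false; // WebAuthn not supported at all
|       }
|       return PublicKeyCredential
|         .isUserVerifyingPlatformAuthenticatorAvailable();
|     }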
| zozbot234 wrote:
| This does not solve the backup issue. It's effectively
| using the phone or computer as a whole as a hardware key,
| which introduces multiple failure modes compared to
| external hardware keys while also adding to privacy
| concerns. It might have some extremely niche use for
| on-prem devices in enterprise settings where the
| inability to sever the authentication element from the
| actual hardware might be convenient; other than that,
| TPMs are essentially a misfeature given the existence of
| smartcards and hardware keys.
| lxgr wrote:
| The backup issue is solved by using an external
| authenticator for initial provisioning of new devices.
|
| In a compliant implementation, you can add a new external
| authenticator from an existing trusted device, and a new
| trusted device from an existing external authenticator.
|
| > while also adding to privacy concerns.
|
| What concerns are you thinking about here?
|
| > TPMs are essentially a misfeature given the existence
| of smartcards and hardware keys.
|
| TPMs are essentially built-in smartcards (with a few
| other optional features like measurements/attestation,
| but these have never really taken off as far as I know,
| other than giving TPMs the reputation they have) and are
| very well suited for use as platform authenticators.
| danstiner wrote:
| Thanks for the shout-out!
|
| I wrote that U2F implementation in software because I wanted
| phishing protection without needing to carry a hardware key.
| Well, and to learn Rust :) It's certainly a security trade-
| off to just store secrets in your keychain like I choose to;
| it is not meant to be a replacement for a hardware key, and in
| fact I have a Yubikey I use when the situation calls for it.
|
| I'd love to use TPM and biometrics to implement U2F/WebAuthn
| on Linux and have a proper, secure solution. Similar to what
| Apple has done with Touch ID. But that's no easy task. TPM
| support is poor on Linux, and other options like relaying
| auth requests to your phone for approval and storing secrets
| in the Secure Enclave are no easier.
| zozbot234 wrote:
| Is this actually needed? Looks like the online part of this is
| just WebAuthn, which could be supported by the same tools we
| use for TOTP. You would "enroll" a visible master secret that
| you could then back up and optionally store in a hardware
| security key. The device itself wouldn't need to allow for
| extracting the secret again, because you backed it up at
| enrollment.
| lrvick wrote:
| Ledger and similar hardware wallets support one time BIP39
| backup covering all current and future accounts today.
| jupp0r wrote:
| Yubikey recommends a backup key for that very reason. Most
| providers allow you to register multiple keys.
| markatto wrote:
| I use the "multiple security keys" approach, and the biggest
| problem is keeping track of which keys are registered with
| which services and making sure the list is up to date. A few
| examples of situations where this is a problem:
|
| 1) I don't keep all of my keys on my person, so if I want to
| sign up for a service when I'm not at home, I have to
| remember to go back and add my other keys at a later time. If
| I wanted to, for example, keep a backup key in an offsite
| location such as a safe deposit box, this would be even more
| painful.
|
| 2) If I lose a key, I need to go and change every service to
| deactivate the lost key and add my replacement key. This is
| both time-consuming and error-prone, as it requires me to
| keep a full list of providers that I use keys with somewhere.
|
| 3) Some providers do not even allow you to register multiple
| keys.
| kayodelycaon wrote:
| And where do you store the backup key?
| graton wrote:
| In each one of my PCs and also on my key-ring is how I do
| it.
| kayodelycaon wrote:
| If you sign up using one key, do the other keys work with
| that account? Unless they do, you're greatly increasing
| the complexity of creating new accounts anywhere.
|
| That's basically what I'm getting at. Do I need to do
| significant amounts of extra work to keep an off-site
| backup in another state?
| graton wrote:
| I don't personally consider it greatly increasing
| complexity. At account creation I register the Yubikeys
| at the PC and on my keyring. When I first login from a
| different PC I use the Yubikey from my keyring to login
| and then register the Yubikey at this new PC.
| tadfisher wrote:
| Sounds like we should be doing FIDO with TPM then.
| howinteresting wrote:
| Sounds miserable compared to using a password manager.
| MrStonedOne wrote:
| Yep! Just store your backup key in a safe-deposit box with
| your bank.
|
| Then go get it every time you sign up for a new account so
| you can make it the backup for that account
|
| then go store it again.
|
| and again. and again. and again.
|
| oh no! you lost your key! time to go to the bank to get your
| backup, sign in to all the accounts, remove the old key,
| register a new backup, oh wait, got to wait for the new
| backup to ship, so i guess you can't do that yet. hope you
| don't lose your key in the meantime, anywho, time to spend a
| few hours painstakingly removing your lost key from all the 9
| thousand sites you use.
|
| yay! its a week to a month later, you finally got your new
| yubikey shipped, time to go log into to 9 thousand websites
| again to set it up as the backup for all of the sites.
|
| Ok, time to take it down to the bank.
|
| whats this? a cool new app my friend wants to show me, ok,
| time to go drive to the bank and get my backup key out of
| storage and sign up for this cool new app.
|
| You know, this whole driving to the bank thing, its kinda
| inconvenient, maybe i should just store it in my closet safe.
|
| What do you mean the gas line under my house exploded? but
| both my yubikeys are in there!
|
| ----
|
| The above is fiction, and even under fiction it seems
| ridiculous how this would really go is even worse:
|
| "Go get my backup key to use for this new app my friend
| showed me? fuck that"
|
| . .
|
| "What do you mean i can't reset my password, but i lost my
| yubikey!"
|
| "No, i didn't want to get up to grab my backup token when i
| was registering."
|
| "Oh wait! i bet i still have the recovery codes as a pdf in
| my downloads folder. its a good thing no viruses ever think
| to look in there"
|
| ----
| lrvick wrote:
| There is no need for your described complexity.
|
| More advanced FIDO devices like the Ledger allow you to
| back up the initial random seed, allowing you to create a
| duplicate device from the backup any time you wish. No
| sites you signed up with will know or care that you swapped
| devices as the new device will generate identical keys via
| a deterministic KDF from the seed.
|
| You can put this seed far away and would only ever need it
| when you wish to replace a lost or broken authentication
| device.
|
| Aside: no major US banks issue safe deposit boxes anymore,
| other than Wells Fargo, which will stop issuing them soon as
| well.
| Zamicol wrote:
| "providers allow you to register multiple keys"
|
| Why isn't my identity just a Merkle root? I don't understand
| the need to register individual keys.
| giaour wrote:
| You'd also need some way to revoke keys signed by the root
| if a valid hardware key were lost, stolen, or confiscated.
|
| I think Yubico will actually do something like this for
| large enough customers, though revocation is left as an
| exercise for the customer. When I worked for AWS, I was
| issued a couple company YubiKeys, and there was a web
| portal where I could revoke a token's association with my
| account.
| lxgr wrote:
| How would you add a new key at a later point in time (i.e.
| after your initial registration of e.g. a main and a backup
| key, after having lost the main key and wanting to add a
| new backup/main key)?
|
| The FIDO/WebAuthN model does by design not include a
| stateful/centralized authority that could maintain the
| required state for any such solution.
| jedberg wrote:
| While it's not a bad idea, I'm pretty sure Yubikey recommends
| a backup key so that they can sell twice as many Yubikeys.
| nybble41 wrote:
| A backup key is a good idea, but there needs to be a way to
| _enroll_ the backup key for a new account (ideally
| automatically) without the key being physically present,
| since otherwise you can't (practically) store the backup key
| off-site. To this end, you should be able to enroll a key
| using only the public part of the keypair.
| traceroute66 wrote:
| > Are there any FIDO security keys that explicitly support
| backing up and restoring their master secrets?
|
| Why would you need that? On most services that I use that
| support FIDO, you can register as many keys as you like.
|
| Seems to me that is a much more secure option than to provide a
| potentially exploitable option of allowing key extraction.
| eikenberry wrote:
| I have 100s of passwords and dozens of TOTP keys in my
| password manager. Enrolling 2 keys with every one of these
| sites, and having to re-enroll with all of them if you lose
| one of those, is unworkable. It only really makes sense for
| centralized auth solutions like you'd have at work, not for
| day-to-day personal things. I want a FIDO key that I can use
| for day-to-day things.
| lrvick wrote:
| Back up a 24-word seed phrase once on a FIDO device that
| supports BIP39.
|
| Now go enroll 100 sites in it and then lose or destroy the
| device.
|
| Enter the 24-word backup into a new device and access to
| your 100 sites is restored.
| li2uR3ce wrote:
| I want to avoid having to fetch my backup key every time I
| want to set up a new account. The backup key is kept offsite
| and thus inconvenient. It's offsite because that protects me
| from fire or other such catastrophic events.
|
| This is what keeps me preferring password-based security:
| I can back up my encrypted password database offsite with
| ease. Everything else provides a hard path to recovery.
| traceroute66 wrote:
| > I want to avoid having to fetch my backup key every time
| I want to setup a new account. The backup key is kept
| offsite and thus inconvenient.
|
| As stated, you can have as many backup keys as you like.
|
| Thus you are not limited to two keys.
|
| Buy one more key. Keep one off-site and two on-prem. Rotate
| them once in a while to keep the off-site one fresh.
| zozbot234 wrote:
| > As stated, you can have as many backup keys as you
| like.
|
| That does not solve anything unless the backup keys are
| enrolled to each and every one of the services you use.
| Adding more backup keys and storing them more and more
| securely just makes it harder to be sure that you've
| enrolled them all to the latest service.
|
| This is not an issue if users are explicitly allowed to
| enroll "virtual" soft authenticators that they can back
| up and restore as they see fit, but that's an additional
| requirement that comes at some compromise, since some
| services might instead want to ensure that you're
| enrolling a non-cloneable credential. (E.g. your physical
| bank or non-remote employer, that can easily verify your
| identity via additional means if needed to restore your
| access.) The WebAuthn spec allows for both models.
| md_ wrote:
| I think their point is that they don't want to have the
| window of vulnerability (to loss) when they have added a
| new account but haven't yet rotated their off-site key?
|
| That said, the real answer is that FIDO keys can be
| synced by e.g. Apple (as described in more detail here:
| https://www.wired.com/story/fido-alliance-ios-android-
| passwo...). So you can potentially just make your offsite
| backup be a hardware key _that gets you into your iCloud
| keychain_ , and (if you are willing to trust Apple) use
| your iCloud for backing up all your other accounts' keys.
| MrStonedOne wrote:
| light_cone wrote:
| First of all, _most_ services is not _all_ services, so you
| have a use case here.
|
| Also, you could make FIDO keys that support restoring but not
| backing up. If you could set up a FIDO key with a custom
| random seed _as an expert option_, then you could have a
| secure key, and keeping the seed private would be your expert
| problem.
|
| I would adopt such a solution, whereas now I don't adopt the
| proposed solution because I cannot add a new service while
| keeping the backup key off-site.
|
| Maybe another solution would be to have _absolutely all_
| services accept several keys (enforced by protocol), and in
| addition accept adding an off-site key using only its
| fingerprint, without requiring it to be physically present.
| andrey_utkin wrote:
| You cannot register more than one security key with *AWS*,
| which a lot of devs have been begging for, for years. Can't
| find that support ticket URL now.
| Spivak wrote:
| And probably never will because the official unofficial
| solution is to use SSO and have your identity provider
| handle that.
| docandrew wrote:
| That's a pretty big foul. I use a different hardware key
| plugged into each workstation I use and then some "roaming"
| keys that I can use for backup, travel, etc.
| lrvick wrote:
| To work around this I sync multiple Ledger devices with
| identical seed phrases, which gives me duplicate FIDO
| devices that can be shared with any teams that need break-
| glass root account access.
| lrvick wrote:
| Hardware wallets like the Ledger allow you, only once at
| initialization, to back up the initial random seed to paper
| in the form of 24 English words via the BIP39 standard.
|
| You can use this seed by hand or on a duplicate device to
| deterministically recreate all keys, be they WebAuthn, PGP,
| Bitcoin, or otherwise.
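|
| The words-to-seed step is itself a small, public algorithm.
| A rough sketch of the BIP39 seed derivation (the 12-word
| phrase below is a published test phrase, just for
| illustration; how a device derives individual keys on top
| of this seed is vendor-specific):
|
|      import hashlib
|      import unicodedata
|
|      def bip39_seed(mnemonic, passphrase=""):
|          # BIP39: PBKDF2-HMAC-SHA512 over the NFKD-normalized
|          # words, salt "mnemonic" + passphrase, 2048 rounds,
|          # 64 bytes of output.
|          def n(s):
|              return unicodedata.normalize("NFKD", s).encode()
|          return hashlib.pbkdf2_hmac(
|              "sha512", n(mnemonic),
|              n("mnemonic" + passphrase), 2048, dklen=64)
|
|      phrase = ("legal winner thank year wave sausage worth "
|                "useful legal winner thank yellow")
|      print(bip39_seed(phrase).hex())  # same words, same seed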
| docandrew wrote:
| I think the whole point of HSMs is that you can't back up
| (read: exfiltrate) the master secrets. Having said that,
| certain Yubikeys can store PGP keys, and you can put the
| same secret key on several different Yubis. If you're relying
| on a hardware key it's probably a good idea to have a backup
| key and make sure both are registered with whatever system
| you're accessing. LastPass and GitHub at least support adding
| several different security keys.
| adrian_b wrote:
| The ability to have a backup does not imply any ability to
| exfiltrate the master secrets.
|
| It is enough to have a means to wipe out any information
| contained in the device, including any master secret.
|
| At that point, there should be a means to enter a new master
| secret in the blank device, before proceeding to use it
| normally.
|
| If a device provides this feature and it does not contain any
| other secret information introduced in it by the
| manufacturer, then it allows the owner to have any kind of
| backup that is desired.
|
| I am also one of those who would never use a security device
| that contains any kind of secret information that is not
| under my control.
| docandrew wrote:
| If I'm the one entering the master secret in, then the
| device is a glorified password manager. The point of an HSM
| is that nobody, not even the user, can access the secrets.
| I'm not saying there isn't a use case for such a device, or
| that it isn't possible, only that the security guarantees
| you get from it are different. The security model you're
| describing is the same as someone entering their secret key
| in the "notes" app in a phone, leaving it in Airplane mode
| with FDE and wipe after a certain number of incorrect PIN
| entries. You can call that an "HSM", but it's not what I'd
| consider one.
| adrian_b wrote:
| The device that you name "HSM" is the kind of device that
| is suitable for a company to distribute to its employees
| to login into the company network or servers.
|
| It is not a device useful to have for an individual user.
| On the other hand, hardware password managers are useful
| for individual users.
| zozbot234 wrote:
| A password manager that does not allow one to log in to
| the wrong site is still very useful. Also, just because
| you're entering a master secret in doesn't mean it's any
| easier to get it out. The user could simply be required
| to generate the master secret herself and back it up on
| her own.
| docandrew wrote:
| Sounds basically like a key store/loader like this one:
| https://www.cryptomuseum.com/crypto/usa/kyk13/index.htm.
| I have a little experience with its successor. There's a
| legit purpose for it, but it's a different animal than an
| HSM in my opinion.
| TacticalCoder wrote:
| > If a device provides this feature and it does not contain
| any other secret information introduced in it by the
| manufacturer, then it allows the owner to have any kind of
| backup that is desired.
|
| Precisely. The Ledger Nano S (and probably the Nano X too)
| allows you to do exactly what you describe, the very way you
| describe it: three wrong PINs, on purpose or not, and the
| device resets itself to factory default. As you wrote, at
| that point it's unusable until you enter a master secret
| again (either your old one or a new one: the device has no
| way to know and doesn't care).
| miohtama wrote:
| iPhones can wipe on too many failed attempts as well:
|
| https://osxdaily.com/2021/05/27/set-iphone-erase-
| automatical...
| lxgr wrote:
| You'll need to generate the key on a less secure host to do
| that, though, which partially defeats the purpose of a
| hardware key in the first place.
|
| As far as I understand, "real" HSMs (i.e. the expensive, rack
| sized type of security key) sometimes offer the ability to
| export their root key to other models by the same
| manufacturer using a specific ceremony.
|
| Arguably this also significantly weakens the security of the
| keys protected in the HSM, but at least it does not
| automatically expose it to software.
| TacticalCoder wrote:
| But is that a problem, though? I generated my own HSM/U2F
| keys by throwing dice, and the seed is basically just one
| 256-bit number. I did have to compute a matching checksum,
| though (the scheme I used represents the 256-bit number as
| a list of 24 words, out of a dictionary of 2048 words, where
| some bits of the last word act as a checksum).
|
| This only needs to be done once. For example by booting an
| old computer with no wifi capabilities and no hard disk
| from a Linux Live CD.
|
| > You'll need to generate the key on a less secure host to
| do that, though, which partially defeats the purpose of a
| hardware key in the first place.
|
| I kinda disagree with that. I generated my key by throwing
| physical dice. No random number generator to trust here. I
| only needed an offline/airgapped computer to compute the
| checksum and that program cannot lie: I know the first 256
| bits out of the 264 bits so the program computing the
| checksum cannot lie to me, it's only giving me 8 bits of
| checksum.
|
| Then I only need to trust the specs, not the HSM vendor.
|
| Now, sure, my old Pentium III without any hard disk and
| without any WiFi, without any physical ethernet port, may
| be somehow compromised and exfiltrate data through its fans
| or PSU or something but what are the odds here? Especially:
| what are the odds compared to the odds of having a rogue
| HSM vendor selling you U2F keys for which it fully knows
| the secret?
|
| I'd argue this requires less trust than the trust required
| in buying a pre-initialized HSM device.
| lxgr wrote:
| > booting an old computer with no wifi capabilities and
| no hard disk from a Linux Live CD.
|
| By doing that, you are increasing your trusted code base
| by several orders of magnitude. This might be
| fine for your purposes, but in a corporate environment,
| it might very much not be.
|
| > Then I only need to trust the specs, not the HSM
| vendor.
|
| You do trust the HSM (vendor) no matter how you use it.
| Ironically, the more modern a cryptographic protocol is,
| the more opportunity for surreptitious key exfiltration
| there is. This could be in the form of predictable (using
| a shared secret) initialization vectors, wrapped keys and
| much more.
|
| You also trust an HSM to be more tamper-resistant and/or
| more hardened against logical attacks than a regular
| computer, or there would not really be a point in using
| one in the first place.
| docandrew wrote:
| Yeah, for the paranoid you'd have an air-gapped box with
| your PGP tools on it that you'd plug the hardware key into
| for setup.
| withinboredom wrote:
| You can generate the keys inside the Yubikey. Then just
| have two keys instead of a shared key. That's actually
| better IMHO since that allows you to revoke one if you lose
| it instead of having a compromised backup.
| lxgr wrote:
| Ah, sure, if your use case allows registering multiple
| keys that is indeed a good way to solve it. Unfortunately
| that's not always the case (as pointed out in other
| threads).
| sheerun wrote:
| A backup key could have the ability to revoke the original
| one, problem solved...
| lxgr wrote:
| How would that work in practice? Key/certificate revocation
| is notoriously hard.
| withinboredom wrote:
| With GitHub, you just remove it from your account. For
| 2fa, same story.
| yuav wrote:
| For HSMs with FIPS 140 Level 3 certification, the master key
| can only enter and leave in encrypted form. Backup/restore
| and cloning are possible, but there are mechanisms like
| hardware and firmware validation to ensure only the same
| type of device and certified vendor software can make use
| of it.
| thinkmassive wrote:
| compliance != security
| [deleted]
| TacticalCoder wrote:
| > I think the whole point of HSMs is that you can't back up
| (read: exfiltrate) the master secrets.
|
| You're getting it backwards though. You are right that the
| whole point of an HSM is to not leak secrets when connected
| to a compromised computer. However there's nothing wrong with
| an HSM device that can be initialized with a "seed" of your
| liking, as long as that initialization step is done in a
| fully offline / airgapped way.
|
| Ledger (whose CEO was, before creating Ledger, one of the
| members of the team working on the FIDO specs) makes a
| "hardware wallet" for cryptocurrencies that can run a FIDO
| app. And it's totally possible to initialize your U2F device
| with a seed of your liking.
|
| Now I did test this a while ago (out of curiosity) and it was
| all working, but I'm not really using it atm: I don't know
| where it's at regarding the latest FIDO2 specs.
|
| But the point is: what GP asked for can totally be done.
|
| I understand some may want to move the goalposts and say:
| _"ok but then the problem now is not losing your piece of
| paper where you wrote that seed"_. But that is another topic
| altogether, and it changes nothing about the fact that you
| can have an HSM used for U2F that can be backed up.
| docandrew wrote:
| How is that seed any different from a password then?
| lrvick wrote:
| It never comes into contact with the memory of a network
| connected system, therefore putting it beyond the reach
| of phishing or malware thus solving the overwhelming
| majority of risks actual passwords are exposed to.
| Kab1r wrote:
| For one, it's only ever seen once: during initialization.
| jjeaff wrote:
| In addition, I believe it is not stored on the device. So
| it can't be exfiltrated.
| TacticalCoder wrote:
| > Are there any FIDO security keys that explicitly support
| backing up and restoring their master secrets?
|
| Yup, there are for sure, for I tried it and it works. Now: I
| tried it out of curiosity and I'm not actually using it atm,
| so I don't know where it's at, but...
|
| I tried on Ledger hardware wallets (stuff meant for
| cryptocurrencies, but I tried them precisely for the U2F app):
| initialize the device with a seed of my own and then register
| to FIDO2/U2F/webauthn sites. Worked fine.
|
| Took a _second_ hardware wallet, initialized it with the
| exact same seed: boom, it's working fine and I could log in
| using that 2nd HSM device as the 2FA.
|
| Note that as soon as I used the 2nd device, the first one
| wasn't working anymore: if I wanted it to work again, I'd need
| to reinstall the U2F app on the HSM device (the way the device
| works is that it only accepts apps that are signed, and that
| is enforced by the HSM itself: the HSM has the public key of
| the Ledger company, so it can only install "apps", like the
| U2F app, that are actually signed by Ledger... I'm not saying
| that's 100% foolproof, but it's not exactly the most hackable
| thing on earth either).
|
| The reason you cannot use both devices at once is how the
| signature counter is used: it has to be monotonically
| increasing, and when the app is installed on the HSM, it
| uses the current time to initialize its counter.
|
| I haven't checked these lately: I know the specs evolved and I
| know Ledger said they were coming with a new U2F app but I
| didn't follow the latest developments.
|
| Still: I can 100% confirm that it's not only doable but has
| actually been done.
| TacticalCoder wrote:
| > requires that I am able to regain access to my digital
| accounts using nothing but a few page paper backup including
| core service passwords & exported TOTP secrets.
|
| EDIT: you basically save a 256-bit master seed as a list of
| 24 words (out of a fixed dictionary of precisely 2048 words,
| so 11 bits of entropy per word). 264 bits altogether: the
| last word contains 3 bits of the seed and 8 bits of checksum.
|
| Trivial to write down. Very little chance of miswriting it,
| because: _a)_ you must prove to the HSM you wrote your seed
| down correctly and _b)_ the dictionary is known and hardly
| any word can be mistaken for another.
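|
| For concreteness, that encoding looks roughly like this (a
| sketch; it assumes "english.txt" is a local copy of the
| standard 2048-word BIP39 list, and it draws the entropy from
| the OS rather than from dice):
|
|      import hashlib
|      import secrets
|
|      # Assumed: a local copy of the BIP39 English wordlist.
|      WORDLIST = open("english.txt").read().split()
|
|      def entropy_to_words(entropy: bytes) -> list[str]:
|          # 256 entropy bits + 8-bit checksum (first byte of
|          # SHA-256 of the entropy) = 264 bits = 24 x 11 bits.
|          check = hashlib.sha256(entropy).digest()[0]
|          bits = int.from_bytes(entropy, "big") << 8 | check
|          return [WORDLIST[(bits >> (11 * i)) & 0x7FF]
|                  for i in reversed(range(24))]
|
|      print(" ".join(entropy_to_words(secrets.token_bytes(32))))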
| password4321 wrote:
| There are some software tokens (Wear OS watch, Android phone),
| but they are purposely not exportable from the Android
| Keystore.
|
| https://github.com/herrjemand/awesome-webauthn#software-auth...
|
| There was mention of a secure backup proposal, but it doesn't
| appear to have been touched after being a draft for a year:
|
| https://github.com/Yubico/webauthn-recovery-extension
| lrvick wrote:
| Devices like Ledger already have seed phrase backup/restore
| via BIP39, but this is only safe or practical on devices
| with a screen, which Yubico is adamantly against ever
| supporting.
| NovemberWhiskey wrote:
| By design, ideally not. WebAuthn optionally includes (as a
| 'SHOULD') a signature counter concept that allows relying
| parties to be confident that an assertion isn't coming from
| a previously-cloned version of the authenticator.
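|
| The relying-party side of that check is tiny; a minimal
| sketch (the names in the commented-out usage are made up for
| illustration):
|
|      def counter_ok(stored: int, received: int) -> bool:
|          # Per the spec's signature-counter guidance: if
|          # either value is nonzero, the received signCount
|          # must be strictly greater than the stored one;
|          # otherwise the credential may have been cloned.
|          if stored == 0 and received == 0:
|              return True  # authenticator has no counter
|          return received > stored
|
|      # Sketch of use, per authentication:
|      # if not counter_ok(cred.sign_count, authn.sign_count):
|      #     flag_possible_clone(cred)   # policy decision
|      # cred.sign_count = authn.sign_count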
| zozbot234 wrote:
| The specification states that cloning should merely be
| included in the service's underlying threat model. E.g. you
| might well be able to log in from an authenticator that fails
| the "signature counting" step if it's expected that the
| authenticator would allow for backing up and restoring its
| stored credentials.
| NovemberWhiskey wrote:
| I think the point is it's up to the relying party to make
| the decision; so an uneven user experience is possible.
| zozbot234 wrote:
| If an "insecure" authenticator reveals itself at
| enrollment, the only "unevenness" is that attempts to
| enroll it might fail, perhaps with a request to use a
| securely attested authenticator instead - you would never
| be locked out from any service after the fact. This is
| better than "cloning" a supposedly secure device and then
| failing a count check while trying to authenticate.
| NovemberWhiskey wrote:
| Yeah; being told "you can't use that here" for some sites
| is a pretty uneven user experience, wouldn't you say? Not
| sure why the scare quotes are necessary here.
| mmis1000 wrote:
| I think FIDO was not meant to be backed up in the first
| place. It's more that a derived key exists to authenticate a
| certain device safely, without it being stolen and used to
| authenticate other devices you didn't expect. Making it
| easily backupable actually defeats its whole purpose if it
| is intended to be used this way.
|
| And for services that actually want it to be used as the
| major key, I think they can just let an already-
| authenticated device authorize another (and even decide
| whether this new device can authorize yet another or not).
| (Like the way Google does it: a popup on the user's phone,
| asking if the user would like to let the computer log in.)
|
| I think what we actually need is a common protocol to
| authorize a new FIDO device from an existing one. Although
| you can do it currently, every website has a different flow
| for it, and there is currently no common way. A common and
| machine-understandable way to authorize a new device from an
| existing one would ease the pain.
| lxgr wrote:
| At least the WebAuthN standard seems to be moving in a
| different direction [1], which is also surprising to me.
|
| In a nutshell, it will be possible for relying parties (i.e.
| websites) to detect multi-device/backup capable
| authenticators if required, but disabling multi-device
| functionality would require a very explicit opt-out, not an
| opt-in, on the relying party's side.
|
| [1] https://github.com/w3c/webauthn/issues/1714
| mmis1000 wrote:
| That seems to make FIDO just a non-human-
| readable/rememberable account/password pair. Somewhat of a
| downgrade from the original hardware-enforced
| implementation, but it also makes it more usable for the
| majority of people, because keeping something without losing
| it is just a pain for many people (where did my fxxking key
| go again?). And it is still 1000x better than people using
| the same password on every website.
| lxgr wrote:
| It's much more than a non-rememberable password: One of
| the most important attributes of WebAuthN/FIDO is that
| it's fundamentally impossible to fall victim to phishing.
|
| Assuming your browser isn't itself compromised, it is
| impossible to authenticate using a key for service.com on
| evil.com. Passwords can't do that. (PAKEs or other
| challenge/response protocols theoretically could, if
| implemented in the browser and not on websites, but
| that's a different story.)
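|
| Concretely, the guarantee comes from two fields the relying
| party checks on every login; a sketch of just that fragment
| of WebAuthn assertion verification (the signature check over
| these bytes is omitted here):
|
|      import hashlib
|      import json
|
|      def origin_bound(client_data, auth_data, rp_id, origin):
|          # client_data: decoded clientDataJSON bytes;
|          # auth_data: raw authenticator data. The browser,
|          # not the page, fills in the origin and the
|          # rpIdHash, so an assertion made on evil.com can
|          # never verify against service.com's RP ID.
|          c = json.loads(client_data)
|          if c["origin"] != origin:
|              return False
|          rp_hash = hashlib.sha256(rp_id.encode()).digest()
|          return auth_data[:32] == rp_hash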
| [deleted]
| _pdp_ wrote:
| Everyone forgets that the majority of people don't use password
| managers, let alone FIDO/WebAuthn and similar technologies. It
| will take a really, really long time to replace passwords.
| mrkramer wrote:
| I never understood why TLS Client Authentication[0] is not used
| more; that way, all the other standards, including FIDO,
| wouldn't be needed.
|
| [0] https://blog.cloudflare.com/introducing-tls-client-auth/
| zozbot234 wrote:
| Client certificates have privacy concerns when used with
| ordinary web sites, as opposed to the API requests assumed in
| that blogpost. WebAuthn provides a tweaked featureset that
| addresses these concerns.
| NovemberWhiskey wrote:
| Within the DoD, it's ubiquitous. See
| https://en.wikipedia.org/wiki/Common_Access_Card
| jedberg wrote:
| It's the classic key distribution problem. You'd have to get
| client certificates to people securely. It works for
| corporations because they can send you a device with the client
| cert pre-loaded.
| NovemberWhiskey wrote:
| Firstly, PKI generally doesn't have a key distribution
| problem because keys never get distributed; they get
| generated in-place, certificate signing requests are sent to
| certificate authorities, certificates are signed and
| returned.
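|
| That enrollment flow, in miniature (a sketch using the
| pyca/cryptography package; submitting the CSR to a CA is out
| of scope here):
|
|      from cryptography import x509
|      from cryptography.hazmat.primitives import hashes
|      from cryptography.hazmat.primitives import serialization
|      from cryptography.hazmat.primitives.asymmetric import ec
|      from cryptography.x509.oid import NameOID
|
|      # The private key is generated in place and never
|      # leaves this machine.
|      key = ec.generate_private_key(ec.SECP256R1())
|
|      # Only the CSR (public key + requested identity) goes
|      # to the CA, which returns a signed certificate.
|      name = x509.Name(
|          [x509.NameAttribute(NameOID.COMMON_NAME, "alice")])
|      csr = (x509.CertificateSigningRequestBuilder()
|             .subject_name(name)
|             .sign(key, hashes.SHA256()))
|      pem = csr.public_bytes(serialization.Encoding.PEM)
|      print(pem.decode())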
|
| Secondly, in the TOFU model that applies to WebAuthn, you
| don't even need to have a certificate authority - you can
| self-sign.
|
| The problem is really, as alluded to in another comment, that
| if you share a single certificate across multiple sites then
| you are sharing a common tracking ID between them (e.g. your
| certificate fingerprint).
|
| Logout is also a user experience pain point unless the
| certificates are stored on e.g. a smart card that can be
| removed.
| _trackno5 wrote:
| Has anyone found a link to the new specification draft? I'm
| wondering if the spec for the authenticator will be open so
| that anyone could build their own.
| stjohnswarts wrote:
| no thanks. I don't want one account that apple/google/whoever can
| revoke and ruin my online life. fuck that. I'll take my chances
| with 2FA and passwords. When it finally gets breached (if you
| haven't been cancelled!) imagine how much one online cracker
| will be able to do. This also allows them unlimited access to
| follow you
| all around and see what you do, where you log in, etc.
| _trackno5 wrote:
| > I don't want one account that apple/google/whoever can revoke
| and ruin my online life
|
| What are you even going on about?
|
| As long as you have access to your FIDO authenticator, you'll
| still be able to use it regardless of what they do with your
| account.
| [deleted]
| teeray wrote:
| Does this provide any benefit over a (properly used) password
| manager?
|
| I'd be happy with just:
|
| - an "alphabet", "minlength" and "maxlength" attributes on
| password fields so password managers generate perfect passwords
| every time
|
| - a well-known URI for password managers to do zero-touch
| password rotation.
|
| - actual elements for login components to close the confused
| deputy attack for password managers.
|
| All these things would be much easier changes for crufty old
| sites to make, which would aid adoption.
| yakak wrote:
| First you need to invent a password manager that can be
| properly used? The one I have runs on my computer and trusts
| everything else I've ever installed not to have put in a
| mechanism to observe memory allocated to my browser.
| teeray wrote:
| Is that how most credentials are stolen these days?
| Sahbak wrote:
| Most attacks are social engineering. Everything else, to
| the best of my knowledge, does not target passwords.
| fullstop wrote:
| A keylogger can take my password but a Yubikey can not be
| copied.
| vngzs wrote:
| If you're copying passwords out of a password manager and
| pasting them into password fields, then yes, you're getting a
| significant improvement to phishing protection with a hardware
| key. If you're using the password manager's autofill feature,
| and that autofill feature is bug-free, then you're not getting
| any additional phishing protection.
|
| Your passwords can still be stolen, however. Any hardware
| authentication mechanism is going to ensure that no matter how
| compromised your local machine is, the worst an attacker can do
| is steal one active session. They can't steal the secret
| required to initiate any future session.
| tadfisher wrote:
| If the password manager stores the requesting site with the
| secret, either manually or through TOFU, then it has an
| opportunity to provide better phishing protection than manual
| copying.
|
| This is how Android Password Store [1] works, and it
| regularly triggers a phishing warning (that I have to
| override with multiple taps) when I'm trying it out by
| attempting to autofill a password for one app with the
| password associated with a different app ID.
|
| Granted, I also use it with my Yubikey, because that's what
| holds the GPG decryption key.
|
| [1]: https://github.com/android-password-store/Android-
| Password-S...
| mark_l_watson wrote:
| Does anyone here know what privacy/tracking issues are with this
| standard?
| jeroenhd wrote:
| I haven't found any so far. Each account gets a new
| public/private key pair so accounts can't be traced back to
| each other. Usernames are optional and might even become a
| thing of the past, making username reuse less of an issue for
| linking accounts.
|
| It all depends on the sync method provided. If synchronisation
| isn't end-to-end protected, you're handing
| Apple/Google/Microsoft the keys to the kingdom which is pretty
| bad.
| bachmeier wrote:
| It fully resolves all existing privacy and tracking issues. You
| will have no privacy and will be tracked all the time. Problem
| solved.
| 0daystock wrote:
| FIDO2 privacy is actually pretty good and well thought out.
| There's a theoretical risk of a website sending authentication
| challenges for two different accounts and having both
| assertions signed by the same credential, basically correlating
| those accounts together, but this is unlikely to weaken
| collective privacy at scale.
| RL_Quine wrote:
| Kind of. Yubikeys intentionally have a very small number of
| devices signed with one CA key and then they produce a new CA
| key, so those devices do have a basically unique identifier.
| 0daystock wrote:
| Can you share any more information about that? Is this
| identifier shared as part of the FIDO2/U2F spec?
| md_ wrote:
| I think they're referring to attestation
| (https://fidoalliance.org/fido-technotes-the-truth-about-
| atte....), which requires that attestation certificates
| be shared with a minimum of 100,000 other devices in
| order to ensure they're not unique IDs.
|
| Maybe the parent misread the spec as saying a _maximum_
| of 100,000? Or something?
| dane-pgp wrote:
| The point being, the FIDO Alliance reserves the right to
| blacklist any device that an attacker manages to extract
| the secret keys from, which has the consequence that
| 99,999 other people have their devices bricked.
|
| Also, the Alliance could decide to blacklist a
| manufacturer just because they haven't implemented some
| new policy (like requiring a DNA scan of the user) so you
| better make sure that you buy a device from one of the
| "too big to fail" providers.
| md_ wrote:
| > The point being, the FIDO Alliance reserves the right
| to blacklist any device that an attacker manages to
| extract the secret keys from, which has the consequence
| that 99,999 other people have their devices bricked.
|
| 1. By what mechanism can they blacklist a device? A given
| _relying party_ can choose to use or not use attestation
| and, if they choose to use it, which certificates to
| trust. But that's between you and the RP. Authentication
| doesn't "talk to" the FIDO Alliance--which is just a
| standards body and does not (AFAIK) even publish anything
| like a CRL for "bad" attestation keys, so I don't
| understand what you are talking about here.
|
| 2. The intention of the attestation, as I understand it,
| is to enable RPs to use attestation to limit
| authenticators to e.g. those that pass FIPS certification
| (or similar enterprisey requirements), not to ban a whole
| batch because one key is known to be compromised. That's
| crazy; can you point out where anyone other than you has
| ever proposed this?
|
| 3. DNA scan? What are you talking about?
|
| 4. This assertion you are making, while bizarre and
| wrong, is very different than the assertion the
| grandparent made ("Yubikeys intentionally have a very
| small number of devices signed with one CA key...so those
| devices do have a basically unique identifier"), which,
| while also wrong, is I think a genuine mistake and not a
| bad-faith argument.
| dane-pgp wrote:
| > A given relying party can choose to use or not use
| attestation and, if they choose to use it, which
| certificates to trust.
|
| True, and a website could decide to issue its own
| certificates rather than get one from a CA trusted by
| browsers, but in practice (and potentially one day by
| law) most sites will defer to the FIDO Alliance to
| determine which devices are "sufficiently secure".
|
| > the FIDO Alliance--which is just a standards body and
| does not (AFAIK) even publish anything like a CRL for
| "bad" attestation keys
|
| "The FIDO Alliance Metadata Service (MDS) is a
| centralized repository of the Metadata Statement that is
| used by the relying parties to validate authenticator
| attestation and prove the genuineness of the device
| model."[0]
|
| > That's crazy; can you point out where anyone other than
| you has ever proposed this?
|
| "If the private ECDAA attestation key sk of an
| authenticator has been leaked, it can be revoked by
| adding its value to a RogueList."[1]
|
| > DNA scan? What are you talking about?
|
| I picked a deliberately extreme example to make the point
| that there are requirements for these devices that users
| might not be happy with (but might not have any choice
| about, once the capability becomes ubiquitous). That
| specific example may never come to pass, but I don't
| think we should assume that allowing RPs to put arbitrary
| conditions on the hardware we use is a power that won't
| be abused.
|
| For added context: "FIDO will soon be launching a
| biometric certification program that ensures biometrics
| correctly verify users. Both certifications show up as
| metadata about the authenticator, providing more
| information to enable services to establish stronger
| trust in the authenticators.)"[2]
|
| > This assertion you are making, while bizarre and wrong
| ... a bad-faith argument.
|
| Maybe you should have assumed good faith.
|
| [0] https://fidoalliance.org/metadata/
|
| [1] https://fidoalliance.org/specs/fido-
| uaf-v1.2-rd-20171128/fid...
|
| [2] https://fidoalliance.org/fido-technotes-the-truth-
| about-atte...
| throwaway52022 wrote:
| I've resisted switching to a hardware key because I know that I'm
| going to break it, and that seems like a huge pain in the ass. I
| really want to be able to make a couple of backup keys, or maybe
| put another way, I want to be able to put the private key on the
| device myself, I don't necessarily care that the key is generated
| on the device and never leaves the device. I don't care if that
| slightly reduces my security - I'm not protecting nuclear
| weapons, my threat model is not state actors trying to attack me,
| my threat model is me leaving my key in my pants pocket before
| putting it in the washing machine.
| [deleted]
| vlan0 wrote:
| >my threat model is me leaving my key in my pants pocket before
| putting it in the washing machine.
|
| "YubiKey Survives Ten Weeks in a Washing Machine"
|
| I think you'll be safe! :)
|
| https://www.yubico.com/press-releases/yubikey-survives-ten-w...
| eMGm4D0zgUAVXc7 wrote:
| Source: The people who sell YubiKeys.
| sumitgt wrote:
| I've actually put my Yubikeys through worse. They've always
| survived :)
|
| These were the larger ones. Not sure how the smaller "nano"
| ones would perform.
| vlan0 wrote:
| You know what...I have a couple spares. I'll run the
| experiment myself :)
| lxgr wrote:
| While I'm generally a fan of YubiKeys too, I've had one break
| without warning. Backups are highly recommended - you can
| lose them too, after all.
| john567 wrote:
| I was about to say the same thing. These things are quite
| sturdy and if you lose your key (as people do) you retire
| the old one and make a new one. These things are neither
| expensive nor irreplaceable. Of course if you lose your key
| it's going to hurt, as it should.
|
| I've had a Yubi Key for almost 5 years now. Zero issues.
| md_ wrote:
| This announcement is primarily not about hardware keys, right?
| xur17 wrote:
| Has anyone tried a Ledger or Trezor device for something like
| this? Your FIDO U2F private key is deterministically generated
| [0] based upon your seed phrase, which you can backup, and
| restore on other devices.
|
| [0]
| https://www.reddit.com/r/ledgerwallet/comments/udzx1c/ledger...
| eswat wrote:
| I have a Ledger as a _backup_ key. Keyword is backup since
| it's less convenient to use than a Yubikey due to needing to
| put in a pincode first. Though that could be a security
| feature in and of itself.
| aaaaaaaaata wrote:
| Any input on which (Trezor/Ledger) is least likely to be a
| honeypot, and/or a good device?
| xur17 wrote:
| What do you mean by a honeypot? Both are pretty well
| trusted devices, and users easily have tens of billions of
| dollars deposited on them. Given that, I feel pretty
| comfortable using them for 2fa.
| lxgr wrote:
| At least Ledger actually does support U2F as an installable
| application, but that's the predecessor to FIDO and has some
| weaknesses in comparison. I'm also not sure whether WebAuthN
| supports legacy U2F authenticators without the browser
| performing some protocol translation.
| nybble41 wrote:
| Ledger and Trezor both support U2F, which is a FIDO
| protocol but not the latest version. The Trezor Model T
| additionally supports FIDO2 (WebAuthn).
| lrvick wrote:
| I have abused and soaked every model of Yubikey. I even melted
| the casings off every model with acetone to look up chip
| specs, and Yubico responded by switching to a new
| unidentified glass hybrid compound that no common solvents
| seem to impact.
|
| In all cases the Yubikeys still worked even as bare abused
| PCBs. You need a blowtorch or a drill to break one.
| adam-p wrote:
| I killed a Yubikey in my pocket (without washing) after four
| years. Get two and keep the backup safe.
| https://adam-p.ca/blog/2021/06/backup-yubikey/
| db65edfc7996 wrote:
| Hear hear. I already have enough to worry about besides this
| little magical security wand failing/getting lost. I require a
| bullet proof method-of-last-resort mechanism in place for the
| inevitable day when the fob is no longer available.
| vngzs wrote:
| You just register 2-3 keys. It's not so bad.
| graton wrote:
| That's what I do. I have a nano Yubikey installed in each of
| my computers. Plus a Yubikey on my keychain. I register all
| of them with each account.
|
| All of the accounts require username/password and the
| Yubikey. I'm not willing to not have a password.
| RandomChance wrote:
| I think this is the best approach; the one mobile key
| allows you to add your other devices as you use them, so
| it's not adding a ton of overhead.
| eswat wrote:
| AWS has entered the chat.
| beefee wrote:
| The services I interact with that support WebAuthn usually
| only allow you to register one key. Backup and recovery is a
| confusing puzzle for most of these services.
| Hamuko wrote:
| Tell the services you interact with that they're basically
| going against the spec.
|
| _" Relying Parties SHOULD allow and encourage users to
| register multiple credentials to the same account. Relying
| Parties SHOULD make use of the excludeCredentials and
| user.id options to ensure that these different credentials
| are bound to different authenticators."_
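|
| In practice that just means the site hands back the
| credential IDs it already has when you enroll another key.
| A sketch of the creation options a relying party could serve
| (plain dict; field names follow the WebAuthn
| PublicKeyCredentialCreationOptions dictionary, and
| "example.com" is a placeholder):
|
|      import secrets
|
|      def registration_options(user, registered_cred_ids):
|          # excludeCredentials lists the credentials already
|          # on the account, so the browser refuses to
|          # re-register an authenticator that is already
|          # enrolled, nudging the user toward a *new* key.
|          return {
|              "challenge": secrets.token_bytes(32),
|              "rp": {"id": "example.com", "name": "Example"},
|              "user": {
|                  "id": user["id"],
|                  "name": user["name"],
|                  "displayName": user["name"],
|              },
|              "pubKeyCredParams": [
|                  {"type": "public-key", "alg": -7},  # ES256
|              ],
|              "excludeCredentials": [
|                  {"type": "public-key", "id": cid}
|                  for cid in registered_cred_ids
|              ],
|          }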
| 2OEH8eoCRo0 wrote:
| Is it a SHOULD vs SHALL issue? Link to full spec?
| Hamuko wrote:
| It's SHOULD as per RFC2119, so basically you need to have
| a good reason with an understanding of the implications
| to ignore it.
|
| One of the implications here being that you have zero
| available authenticators if your main authenticator
| breaks.
|
| https://www.w3.org/TR/webauthn-2/
| rootusrootus wrote:
| I haven't run into any like that, but I'm with you -- if I
| could only store one webauthn key, I wouldn't use it at
| all. Too risky.
| dividedbyzero wrote:
| I believe AWS root accounts don't support more than one
| key to be added.
| droopyEyelids wrote:
| I don't think any AWS account allows more than one!
| aaaaaaaaata wrote:
| This has been talked about in HN comments almost daily
| for like a week -- does anyone from AWS/Amazon read this
| forum, or are they too busy performing blood sacrifices
| trying to recruit graduates?
| blibble wrote:
| more like 2 years
|
| they do know about it (I had a friend who was a PM
| there), but it's low priority...
| plexicle wrote:
| They don't. And it's also not supported in the mobile
| app, which is a huge pain.
| georgyo wrote:
| It's actually horrible! Even key rotation is horrible!
|
| My yubikey is getting to about 10 years old, and I have
| replacements for it but find it very difficult to switch. It
| will eventually fail as all things do and it will be
| problematic.
|
| The problem is that I have several dozen accounts connected
| to it and I don't know all of them. So either I'm carrying
| and trying multiple keys at all times, or I'm not getting
| into a site that hasn't been rotated yet.
|
| Multiple keys on all sites is also basically impossible.
| You need to register all the keys, and ideally those keys are
| in different places.
| bpye wrote:
| I'm going to need to work this out soon. I picked up a pair
| of new YubiKey 5Cs yesterday with their sale. I've been
| using a YubiKey Neo for years for U2F, TOTP and GPG.
|
| Moving the GPG key is easy - though I might try using the
| FIDO2 support in SSH instead. However for every TOTP and
| U2F key I'm going to have to re-enroll the new keys... It
| feels like there should be a better way.
| andrey_utkin wrote:
| I've recently started to track my service dependency
| graph! So like, to keep using github I need my password
| store, my email and one of my two security keys. To use
| my email, I need...
|
| Please contact me if you're interested, I will release
| the tooling I have.
| lrvick wrote:
| You can backup/restore FIDO2 keys via BIP39 on supported
| devices like a Ledger or Trezor.
| docandrew wrote:
| FIDO2 with resident SSH ed25519 keys works great, just
| make sure the OpenSSH client and server versions on all
| the machines you'll be using the key on support it. I
| wish there was a way to sign Git commits somehow using
| them instead of PGP.
| bradstewart wrote:
| It would indeed be nice if you could "cross sign", CA
| style, a new yubikey with an old one, and that would
| somehow get passed along to the various services.
|
| I have _not_ thought about the various attack vectors
| that this may or may not enable though.
| browningstreet wrote:
| So if you're travelling overseas for 3-4 weeks, you need to
| keep extras in all your different luggage/bags, just in case?
| eMGm4D0zgUAVXc7 wrote:
| Do you only have 2-3 backups of your workstation?
|
| I have many more backups of my workstation etc.; should I now
| buy dozens of crypto hardware key thingies and constantly
| switch them around to match the backup disks?
|
| For those who do offsite backups: Is an offsite backup
| possible across the Internet? Or do you have to physically
| drive the key to the offsite location?
|
| When I create a new account somewhere, does that mean I have
| to move N backup keys out of their drawer to the workstation
| and register each of them on the account?
|
| And how to even create a backup and keep it in sync?
|
| With backup disks, it is a matter of shutting down the
| machine, removing one disk from the RAID1, and you have a
| backup (the removed disk is the backup). Or doing "dd if=..."
| if you don't use RAID.
|
| Is something as simple possible with those fancy crypto toys?
| Or is some arcane magic required to copy them?
|
| Is this perhaps all as usual: An attempt to get more control
| and tracking of users, disguised as "security"?
| lrvick wrote:
| With devices that support BIP39 backups like the Ledger or
| Trezor, you are backing up the random seed that generates
| all possible future accounts deterministically.
|
| Back up once, set up 100 accounts, lose the authentication
| device, restore the backup to a new device, regain access
| to all 100 accounts. Easy.
| throwaway52022 wrote:
| I keep an off-site backup at my parents house. Right now it
| includes a printed copy of my backup codes, so if my house
| burns down and everything is a total loss here, I at least
| don't have to start from zero. They live far enough away that
| my backup offsite backup can get to be a few months out of
| date but that's usually fine. (If I were to make a major
| change in something I'd make a special visit)
|
| I don't want to spend a bunch of time when I visit to find
| that key and add it to all of my new accounts and hope I got
| everything - I want to make a backup of my current key right
| before I visit and when I visit, I just put the new backup
| key in the desk drawer and take the old one home with me.
| lrvick wrote:
| Use a Ledger or Trezor, which supports backing up the random
| seed, allowing you to back up all current and future
| accounts in one shot.
| xdennis wrote:
| Why do people always say this? Do you not know how expensive
| they are?
| lrvick wrote:
| Most people have smartphones which ship with WebAuthn so
| they are good to go. Granted phones are like $500+ so by
| contrast a Yubikey is a much cheaper alternative for a
| secondary backup device for most people.
| michaelt wrote:
| Eh, retrieving a key from off-site storage every time you
| open a new account is a pretty big inconvenience, even for a
| security enthusiast.
| vngzs wrote:
| Right. Some core services get this treatment, like email
| and important online accounts. For others I rely on reset
| mechanisms tied to those email accounts if I lose the
| primary key and haven't had the chance to register the
| secondary. Every few months I'll sync up anything that has
| been missed.
|
| It's not perfect, but it's a hell of a lot better than
| TOTP.
| xur17 wrote:
| I realize FIDO is better than TOTP since it prevents
| phishing attacks, etc, but as a user, the ability to
| backup my seeds is extremely convenient.
| lrvick wrote:
| You can backup FIDO with BIP39 on some devices like
| Ledger.
| lrvick wrote:
| You can use devices like Ledger that support BIP39 backup
| allowing you to create duplicate devices any time from a 24
| word random seed.
|
| Now your one time backup covers all current and future
| services.
| lxgr wrote:
| That's exactly why FIDO [1] and WebAuthN [2] are moving
| towards a concept of backup-able/cross-device-sync-able
| authenticators.
|
| That is arguably less secure in some contexts, but there
| are workarounds, and I do see the point that for most
| services, availability/key loss recovery is as much of a
| concern as is security.
|
| [1] https://fidoalliance.org/multi-device-fido-credentials/
|
| [2] https://github.com/w3c/webauthn/issues/1714
| xur17 wrote:
| This. For a while I tried to keep a list of accounts I
| needed to add my offsite key to, and then every year or so
| I'd retrieve the key, and add in bulk, but that became way
| too complicated.
|
| While not ideal, I'd be happy if I could register with the
| public key of my offsite key or something similar. Really I
| think there should be a way to register a public persona,
| and add / remove keys from that persona at will.
|
| Or, just let me (somehow) generate multiple hardware
| devices with a shared seed.
| lrvick wrote:
| You can generate duplicate FIDO devices with a shared
| seed right now. Hardware wallets like Ledger support this
| today.
| rvz wrote:
| Not a great thing to see the big three once again, driving the
| standards here. You should be worried.
|
| But as long as the ridiculous SMS 2FA is removed or replaced by
| something better, then fine. But we'll see how this goes.
|
| From the web side of this standard, this also tells me that
| Mozilla has no influence anywhere and will be the last ones to
| implement this standard in Firefox.
|
| Oh dear.
| [deleted]
| theklr wrote:
| Both FIDO and W3C are neutral places for this. Mozilla being a
| part of W3C and FIDO will have a say. This is just a PR stunt
| that the tech journalists ate up. They've been working together
| to make this for the last two years, it's just an announcement
| to accelerate the work on it.
| jeroenhd wrote:
| Mozilla is a member (https://fidoalliance.org/members/), so I
| doubt they'll be left to their own devices. They'll
| probably lack the manpower to implement the additions well (I
| mean, you can't even paste a URL to an IPv6 address in Firefox
| for Android, which is one of the most basic features of a
| browser), but then again they already have Firefox Sync and a
| working WebAuthn system.
| [deleted]
| WaitWaitWha wrote:
| I like the idea, but not sure about the implementation.
|
| What happens when there is an outage at Google or Apple?
|
| What happens when my cell phone is lost, stolen, broken, or
| replaced?
|
| What happens if I do not have/want a cell phone?
| pionar wrote:
| > What happens if I do not have/want a cell phone?
|
| WebAuthn works with hardware tokens, so something like a
| Yubikey will also work.
| judge2020 wrote:
| The FIDO Standard talked about here includes regular security
| keys, so if you don't want to use a passkey, you can get a
| physical security key; and while I imagine the push for
| passwordless will be large, I doubt they'll completely remove
| passwords anytime soon.
| bcatanzaro wrote:
| This passwordless signin process sounds neat, but will it
| increase Google's power to lock people out of things? I don't
| understand why Google doesn't have an ombudsman - consumers have
| no recourse when Google locks them out, and it seems the
| consequences of Google locking you out are ever increasing. I
| think we're going to need legislation to force Google to make a
| proper appeals process.
| CyberRage wrote:
| How does FIDO have anything to do with Google in particular?
| Log in with Google is not FIDO authentication...
| avianlyric wrote:
| > I don't understand why Google doesn't have an ombudsman -
| consumers have no recourse when Google locks them out
|
| Coming soon to an EU country near you!
|
| The EU Digital Markets Act addresses this issue directly, by
| requiring "gatekeepers" to provide human customer service and
| clear processes for appealing bans etc., generally forcing
| companies to provide something akin to due process.
| judge2020 wrote:
| Google's power to lock people out of their website is already
| here with OAuth2.
|
| This standard is unrelated; it works by having the
| browser/device itself sync the virtual security keys[0], much
| in the same way they sync passwords currently. That's the only
| thing changing here, giving people the choice (and encouraging
| them) to sign in via "what you have" instead of "what you
| know", but along with that they want to alleviate the UX
| concerns of people not being ready to carry around a separate
| physical security key.
|
| 0:
| https://developer.apple.com/documentation/authenticationserv...
| bcatanzaro wrote:
| Thanks, I appreciate this.
| [deleted]
| moritonal wrote:
| At some point (unless you're an Android dev) you have to accept
| a bit of responsibility. If you use Gmail, Drive and Android
| and then decide to use Android as your preferred implementation
| of FIDO (when YubiKeys etc. exist), I struggle to see how, if
| you're locked out, it's not partially the consumer's fault.
|
| You choose to use Google, and can attest to the fact it's not
| that difficult not to.
| ThunderSizzle wrote:
| Don't use Google Drive. Don't use Gmail. That solves half the
| horror stories already.
| kragen wrote:
| Jeez, I hope they don't expect us to put them in charge of the
| nodelist for Zone 1.
| rasengan0 wrote:
| Your phone or else?
|
| I already have a FIDO supported Yubikey, what if I don't have a
| phone?
|
| Or don't want to upgrade or use one in the future?
| lizardactivist wrote:
| FIDO also enables great opportunities for tracking. But we're not
| supposed to talk about that.
| md_ wrote:
| Hmm, what opportunities? Genuinely curious what you mean here.
| aaaaaaaaata wrote:
| If you use the same public keypair across many unrelated
| sites, an observer could blah blah blah
___________________________________________________________________
(page generated 2022-05-05 23:00 UTC)