[HN Gopher] Recent 'MFA Bombing' Attacks Targeting Apple Users
       ___________________________________________________________________
        
       Recent 'MFA Bombing' Attacks Targeting Apple Users
        
       Author : vdddv
       Score  : 317 points
       Date   : 2024-03-27 07:10 UTC (15 hours ago)
        
 (HTM) web link (krebsonsecurity.com)
 (TXT) w3m dump (krebsonsecurity.com)
        
       | type_Ben_struct wrote:
       | I'm still disappointed by Apple's implementation of security
       | keys. I want to be able to prevent all 2FA methods other than
       | security keys, but it still seems possible in certain flows to
       | authorise a new login with another iOS device, making it
       | vulnerable to this attack.
        
         | lloeki wrote:
         | Interesting. I was contemplating moving to security keys
         | (which, according to the setup flow, "replaces verification
         | codes"), but IIUC you're saying one can still fall back to
         | verification codes in some flows?
        
         | dm wrote:
         | What flows have you found not to use security keys?
        
           | ThePowerOfFuet wrote:
           | All of them.
        
         | antihero wrote:
         | Just change over to using HSMs instead of push.
         | 
         | https://support.apple.com/en-gb/HT213154
        
           | EasyMark wrote:
            | I would if I were doing something that needed heavy
            | security, but I'm just a boring average joe. My critical
            | accounts are protected by TOTP on one (backed up) device
            | only; other things are kind of "good enough" with passkeys
            | and passwords. If I ever become a criminal mastermind or
            | double agent I'll probably dive into such methods though.
        
           | 2OEH8eoCRo0 wrote:
            | YubiKeys aren't HSMs; Yubico sells an HSM though.
           | 
           | https://www.yubico.com/product/yubihsm-2-series/yubihsm-2-fi.
           | ..
        
       | Zetobal wrote:
       | Same problem with Instagram. It's insane that so many giant
       | companies have no rate limits in their recovery flows.
        
         | WatchDog wrote:
         | The problem with adding rate limits, at least a global per user
         | rate limit, is that you then create a new denial of service
         | issue, preventing people from being able to recover their
         | account.
        
           | tdudhhu wrote:
           | Why? You can rate limit the business logic but still show the
           | user the default flow.
           | 
            | For example: if a user requests a password reset link 10
            | times a minute, you can send the link only once but display
            | every time that a reset link was sent by email.
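            | 
            | A minimal sketch of that idea (names and numbers made up,
            | not any real provider's flow): rate limit the actual send,
            | but return the same response either way.
            | 
            |     import time
            | 
            |     RESET_COOLDOWN_SECONDS = 15 * 60
            |     _last_sent = {}  # account id -> time of last email
            | 
            |     def send_reset_email(account_id):
            |         # stand-in for the real mailer
            |         print(f"(would email a reset link to {account_id})")
            | 
            |     def request_password_reset(account_id):
            |         now = time.time()
            |         last = _last_sent.get(account_id, 0.0)
            |         if now - last >= RESET_COOLDOWN_SECONDS:
            |             _last_sent[account_id] = now
            |             send_reset_email(account_id)
            |         # Same message whether or not an email went out.
            |         return "A reset link was sent by email."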
        
             | WatchDog wrote:
              | This flow is a bit different from a password reset email:
              | it's a notification with a direct call to action, allow or
              | deny.
             | 
             | You can't debounce them like you can with a reset password
             | email flow.
             | 
              | With a typical password reset email, the actual password
              | resetting is done by the user after they click the link in
              | the email. Only someone with access to the email can
              | proceed, and they can only proceed on the same device
              | where they clicked the email link.
             | 
             | In this flow, there is no further on-device interaction.
        
           | AtNightWeCode wrote:
           | Rate limiting per user is mostly a thing of the past. You set
           | other rate limits and various rules and then get the rate
           | limit per user for free.
        
             | millzlane wrote:
             | > Rate limiting per user is mostly a thing of the past
             | 
             | Someone please tell this to fidelity. After 3 wrong
             | password attempts they lock your account.
        
               | k8svet wrote:
                | Fidelity are clowns. They've put impressive effort into
                | breaking every god damn third-party integration AND
                | using Akamai to block scraping. I can scrape Ameriprise
                | fine, but no matter how creative I get, Fidelity gives
                | back a weird error on login.
                | 
                | (This is on top of them not sending any actionable email
                | when they changed my contributions to 0 in between pay
                | periods.)
                | 
                | I'm rolling my 401k out as often and as fast as
                | possible. I hate American banks _so much_.
        
               | mananaysiempre wrote:
               | > us[e] Akamai to block scraping
               | 
               | Would https://github.com/lwthiker/curl-impersonate help?
               | Haven't tried with Akamai, but did help with another
               | widely used CDN that shall remain unnamed (but has
               | successfully infused me with burning hate for their
               | products after a couple of years' worth of using an
               | always-on VPN to bypass Internet censorship and/or a
               | slightly unusual browser).
        
               | k8svet wrote:
                | I'm using this to fill forms interactively and emulate a
                | user:
                | https://github.com/rust-headless-chrome/rust-headless-chrome
                | 
                | Afaict, it drives a stock Chromium instance. I'm not sure
                | how Fidelity is detecting it, but they detect it even in
                | normal headful mode. Idk if there's some JS that notices
                | there are no mouse movements.
               | 
               | It's just not worth the headache. I despise bending over
               | backwards for companies like this. But obviously I have
               | no choice since they're my 401k plan facilitator.
        
               | ryandrake wrote:
                | > Fidelity are clowns. They've put impressive effort
                | into breaking every god damn third-party integration AND
                | using Akamai to block scraping.
                | 
                | What's funny/sad is they probably pat themselves on the
                | back thinking their security is so advanced and awesome.
                | Financial services web integrations are all total clown
                | shows.
        
               | EasyMark wrote:
                | But can't you buy API access? I would assume that's more
                | of a business decision to promote paid-for API access
                | rather than "security" against scraping.
        
               | dpkirchner wrote:
               | And they convert usernames to sets of digits so they can
               | be entered more easily on phones. Naturally this results
               | in a lot of collisions.
        
           | faeriechangling wrote:
            | If you're getting DoSed by identical prompts, you already
            | can't recover your account, since you'll likely hit the
            | wrong prompt. There's no protection here against an MFA
            | fatigue attack.
        
           | EasyMark wrote:
            | It wouldn't be hard to add to the app though. Obviously if
            | you get a flood it's bullshit, and anything more than a
            | couple can be ignored. It's not rocket surgery.
        
           | forgotmyinfo wrote:
           | You're telling me Facebook, with its billions of dollars and
           | leetcode interviews, can't figure this out? That is outside
           | the realm of computable functions?
        
       | rvz wrote:
       | Yet another reason why phone number verification is the most
       | insecure way to verify users, and it doesn't matter whether it's
       | a company like Apple using it or your bank with its so-called
       | 'military grade encryption'. The point still stands [4], with
       | countless examples [0] [1] [2] [3].
       | 
       | Unless you want your users to be SIM swapped, there is no reason
       | to use phone numbers for logins, verification, or 2FA.
       | 
       | [0] https://news.ycombinator.com/item?id=36133030
       | 
       | [1] https://news.ycombinator.com/item?id=34447883
       | 
       | [2] https://news.ycombinator.com/item?id=27310112
       | 
       | [3] https://news.ycombinator.com/item?id=29254051
       | 
       | [4] https://www.issms2fasecure.com
        
         | saagarjha wrote:
         | This has nothing to do with SIM swapping or phone numbers.
        
           | isoprophlex wrote:
            | TFA talks specifically about a victim buying a brand new
            | phone, registering a new Apple ID, and getting MFA bombed
            | immediately when putting in his old SIM...
        
             | klabb3 wrote:
             | > and getting MFA bombed immediately when putting in his
             | old SIM...
             | 
              | I think it's technically unrelated to the SIM; rather, to
              | create the new Apple ID he used his existing (compromised,
              | lol) phone number for "verification" or something. Which
              | is weird in a way, because then Apple must allow multiple
              | accounts per phone number?
        
           | jasode wrote:
           | _> phone numbers._
           | 
           | On the official Apple reset form, the "phone number" is one
           | of the id options the hackers can use to MFA bomb the target:
           | 
           | https://iforgot.apple.com/password/verify/appleid
           | 
           | The gp proposes a different "private identification string"
           | that's not public. Public IDs such as "email address" or
           | "phone number" are susceptible to what this article is
           | talking about.
        
             | xamarin wrote:
             | Yes, like password :)
        
             | consp wrote:
             | > On the official Apple reset form, the "phone number" is
             | one of the id options the hackers can use to MFA bomb the
             | target
             | 
              | Funny thing is you cannot set a passphrase or equivalent
              | recovery code unless you have an apple device. So users
              | who have an apple account for development purposes (I hate
              | apple device UX and won't ever use anything apple again
              | other than to approve releases and manage certificates)
              | and have no apple products are cursed to use their phone
              | number.
        
               | EasyMark wrote:
               | I used to be hardcore about stuff like this, but as I
               | grew older I guess I gave up some of my morality and
               | bought things like $150 iphone # and moved on with life
               | if it was making me $$$.
        
             | gruez wrote:
             | Given that the gp was talking about victims being "SIM
             | swapped", I strongly suspect he's referring to the classic
             | sim swap attack where you sim swap, then use the newly
             | registered sim to receive a password reset code. If it just
             | involves discovering your phone number, you wouldn't need
             | to sim swap at all.
             | 
             | >The gp proposes a different "private identification
             | string" that's not public. Public IDs such as "email
             | address" or "phone number" are susceptible to what this
             | article is talking about.
             | 
             | This is a non-starter for the general public. If they can
             | barely remember their password what are the chances they'll
             | remember a "private identification string" or whatever?
        
           | xamarin wrote:
            | That is not true. Please read the article: he even bought a
            | new phone, and that did not stop the attack, because of the
            | same phone number. I would not even call this an MFA attack,
            | as they did not need his password. It is more like a
            | password recovery attack.
        
         | mhdhn wrote:
         | What's the recommended alternative for mere mortal hackers?
        
           | antihero wrote:
           | Use HSMs for your Apple ID MFA.
        
             | EasyMark wrote:
              | That brings in a whole new world of complexity and change
              | that isn't for everyone.
        
         | yieldcrv wrote:
         | I think we should start bringing product liability lawsuits
         | against any organization whose accounts can affect user
         | financial data and that uses SMS one-time codes as the default
         | or has them enabled by default, with the heaviest legal
         | remedies for financial organizations where that's the only
         | option.
         | 
         | we should also update PCI DSS compliance or whatever relevant
         | security standard to call SMS one-time codes totally insecure
         | 
         | we can also reach out to the insurers these companies use and
         | tell them to force removal of SMS one-time codes
         | 
         | do a multi-pronged assault on SMS one-time passcodes
        
           | guappa wrote:
           | I think the more urgent thing is to not use the social
           | security number both as the ultimate secret, and also as a
           | number you must give to hundreds of people.
        
             | yieldcrv wrote:
             | non sequitur, make a different thread for that cause
        
               | guappa wrote:
                | Well, if you fine companies for using SMS for
                | security... you should put the CEO in jail for
                | authenticating with a social security number... if we go
                | by just the number of people who get affected by skimmed
                | SMS and by stolen SSNs.
        
               | chgs wrote:
                | Not sure what SMS one-time codes have to do with this
                | story either.
        
               | yieldcrv wrote:
               | It's one of the MFA methods Apple allows
        
             | klabb3 wrote:
             | > both as the ultimate secret, and also as a number you
             | must give to hundreds of people
             | 
             | Don't forget the final nail in the coffin, which completes
             | the trifecta: it's entirely immutable - damage radius =
             | infinite.
        
             | ImPostingOnHN wrote:
             | I think the more urgent thing is to end world hunger.
        
             | CatWChainsaw wrote:
             | That. I'm in favor of stopping this societal wave of making
             | phone numbers the equivalent of digital SSNs (they're
             | critical for digital life, everyone wants them, nothing
             | good happens when you hand them out that freely).
        
           | adrr wrote:
            | Never will happen on the consumer side. Consumers lose
            | their devices way too often to make TOTP or passcodes
            | viable.
            | 
            | Financial institutions can detect if your phone number has
            | been ported or forwarded.
            | 
            | The bigger threat is phishing and password reuse between
            | accounts. I ran tech at an investment firm / neobank with
            | over a million customers and never saw an attack on SMS
            | 2FA. We had email 2FA for a while, and a significant number
            | of people shared passwords between their email and their
            | bank.
        
       | lloeki wrote:
       | "recent"?
       | 
       | This happened to me and my wife (each starting a few days apart)
       | in 2021, or maybe 2022 but no later. It started with a couple
       | requests a day, then ramped up to every hour or something. IIRC
       | we also both got a couple SMS claiming to be from Apple.
       | 
       | As soon as it ramped up I set up both accounts to use recovery
       | keys, which is a move I had planned anyway on grounds that it
       | should not be in Apple's (or someone coercing/subverting Apple,
       | be it law enforcement or a hacker) power to get access to our
       | accounts. This obviously stopped the attackers dead in their
       | tracks.
       | 
       | For similar reasons I set up advanced data protection as soon as
       | it was available and disabled web access. Only trusted devices
       | get to see our data, and only trusted devices get to enroll a new
       | device.
        
         | viraptor wrote:
         | It's not a recent approach, but this is a recent campaign using
         | it against many people. Someone likely got a list of hacked
         | passwords from some recent dump and is going through the Apple
         | accounts from it.
        
           | lloeki wrote:
            | I ventured as much. Given the number of messages and the
            | personal details gathered, I also guess attacker tools have
            | been significantly improved or streamlined.
        
           | criddell wrote:
           | How would that explain Chris' experience at the Genius Bar?
        
         | JKCalhoun wrote:
         | I was unsure what this Recovery Key was:
         | https://support.apple.com/en-us/109345
         | 
         | It is kind of scary too -- lose the key and no one can get you
         | back into your account.
        
           | pmontra wrote:
            | > A recovery key is a randomly generated 28-character code
            | 
            | That's easy to back up. You can even print it and bury it in
            | a sealed box in the garden or put it in a book or whatever.
            | It depends on who you are protecting against.
        
             | Klathmon wrote:
             | But you shouldn't ONLY store it in a box or in your house.
             | 
             | That means you're one natural disaster away from losing
             | everything.
             | 
             | As much as it can "weaken" security, an electronic backup
             | is still recommended for most
        
               | alistairSH wrote:
               | _As much as it can "weaken" security, an electronic
               | backup is still recommended for most_
               | 
               | Maybe I'm being dense (probably), but where would you
               | save it?
               | 
               | iCloud? No, that doesn't work - you need the key to
               | access iCloud.
               | 
               | Some other cloud storage service? No, that doesn't work -
               | you need your phone to generate a token for access and
               | your phone was destroyed in the same fire as the paper
               | backup.
               | 
               | Seems like the safe choice is a lock box at a bank or
               | similar. Or a fireproof safe at home.
        
               | bombcar wrote:
               | Get it tattooed on a (normally not seen) part of your
               | body. Like under your hair! ;)
               | 
               | Of course, a code like that can be in multiple places,
               | possibly where it won't be recognized as such.
        
               | alistairSH wrote:
               | And pray you never need to update the passcode!
               | 
               | I'm imagining this spiraled around somebody's upper
               | thigh... "fakePassw0rdonetwothreefour"
        
               | Klathmon wrote:
               | Personally, I encrypt my backup/recovery/setup keys in a
               | CSV file using a password that I have memorized, and send
               | them to family members to store in their accounts/cloud
               | storage.
               | 
               | But safety deposit boxes are a good choice too, just be
               | careful to balance your own convenience. If you can't
               | easily update your backups, you're really unlikely to
               | include new accounts in them
        
               | thfuran wrote:
               | That also means you can't easily update passwords.
        
               | danieldk wrote:
               | You could put your passwords in 1Password or iCloud
               | Keychain, so you only need to back up those credentials.
        
               | fennecbutt wrote:
               | Doesn't that just mean that Apple's X character key is
               | protected only by a password presumably of lesser length?
               | 
               | I suppose a phrase works too, and easy to remember.
        
               | tacocataco wrote:
               | What happens if you suffer a TBI and can't remember the
               | password?
               | 
               | I guess you'd have bigger problems at that point.
        
               | tshaddox wrote:
               | Perhaps an estate lawyer could be trusted with the
               | information in case you become incapacitated or dead.
        
               | s3krit wrote:
                | Engraving it onto something like titanium would be
                | better than a fireproof safe - those are only safe for X
                | amount of time (I want to take a stab in the dark and
                | say about 90 minutes?). This is how I have backed up
                | some (since retired) crypto seed phrases in the past.
        
               | tshaddox wrote:
               | Where do you keep the titanium plate? I'd be more worried
               | about losing it due to a natural disaster than merely
               | having it destroyed beyond readability in a natural
               | disaster.
        
               | dylan604 wrote:
               | What happens if there's a typo in the engraving? Who's
               | doing the engraving? How much do you trust the people you
               | are providing the key to do it? When does the paranoia
               | kick in vs being diligent?
        
               | 0cf8612b2e1e wrote:
                | This was at least an innovation in the bitcoin
                | community. There are several assemble-at-home systems
                | where you can build a physical manifestation of a
                | secret. One is metal cards you punch with a hammer and
                | nail. Another is essentially a tube where you string
                | along metal letters of the password.
        
               | dylan604 wrote:
               | Sure, sounds perfect. Let me send some crypto person that
               | has invested in a home stamping kit the secret to my
                | crypto wallet. At least they won't know what it's for,
                | so they can't hijack my wallet. Phew. Had me nervous
                | that committing the cardinal sin of sharing my secret
                | with someone I don't know was going to come back to
                | haunt me.
        
               | 0cf8612b2e1e wrote:
               | You assemble it at home? You do not send anyone your
               | secrets.
               | 
               | Also, the idea is simple enough you could DIY your own
               | version with stuff from any hardware store.
        
               | CydeWeys wrote:
               | > Some other cloud storage service? No, that doesn't work
               | - you need your phone to generate a token for access
               | 
               | You definitely don't need your phone for access. I use
               | Yubico security keys for everything like this. I have
               | several of them that are on all my accounts and I don't
               | keep them in the same place.
        
               | tzs wrote:
               | One possibility is to encrypt a copy with a key that you
               | are pretty sure you can remember, and store that
               | encrypted copy someplace public on the web. Periodically
               | check that you do still remember the key.
               | 
               | The conventional way to do this would be encrypt it with
               | a symmetric cipher keyed from a password or passphrase.
               | I've been using an unconventional approach where the
               | secret you have to memorize is an algorithm rather than a
               | password/phrase. Programmers might find an algorithm
               | easier to memorize than a passphrase.
               | 
               | Here's an example of this general idea. The algorithm is
               | going to be a hash. This one will take a count and a
               | string, and output a hex string. In English the algorithm
                | is:
                | 
                |     hash the input string using sha512, giving a hex string
                |     while count > 0:
                |         prepend the count and a "." to the current hash
                |         and apply sha512, then decrement count
               | 
               | The recovery code I want to backup is
               | 3FAEAB4D-BA00-4735-8010-ADF45B33B736.
               | 
               | I'd pick a count (say 1969) and a string (say "one giant
               | leap for mankind"), actually implement that algorithm,
               | run it on that input and string. That would give me a 512
               | bit number. I'd take
               | "3FAEAB4D-BA00-4735-8010-ADF45B33B736" and turn it into a
                | number too (by treating it as 36 base-256 digits). I'd
               | xor those two numbers, print the result in hex, and split
               | it into 2 smaller strings so it wouldn't be annoyingly
               | wide.
               | 
                | Then I'd save the input count, input string, and the
                | output:
                | 
                |     1969 one giant leap for mankind
                |     ed428dffa23f4f14ae2a7b7e842019fc11b5726d726b96c11ec266758be67cb0
                |     f2a78a320a85df809afe83c6c7840e2d175cceadb455260735405cd047459cc9
               | 
               | I'd then delete my code.
               | 
               | I could then do a variety of things with the "1969 one
                | giant leap for mankind" and the two hex strings. Put them
                | in my HN description. Include them in a Reddit comment.
               | Put them on Pastebin. Take a screenshot of them and put
               | it on Imgur.
               | 
               | To recover the code from one of those backups, the
               | procedure is to implement the algorithm from above, run
               | it with the count and string from the backup to get the
               | 512 bit hash, take the 512 bits of hex from the backup,
               | xor them, and then treat the bytes of the result as
               | ASCII.
               | 
               | Then delete the implementation of the algorithm. With
               | this approach the algorithm is the secret, so should
               | never exist outside your head except when you are
               | actually making or restoring from backup.
               | 
               | When picking the algorithm take into account the
               | circumstances you might be in when you need to use it for
               | recovery. Since you'd probably only be needing this if
                | something so bad happened that you lost most of your
                | devices and things like your fireproof safe, you might
                | want to
               | pick an algorithm that does not require a fancy computer
               | setup or software that would not be in a basic operating
               | system installation.
               | 
               | The algorithm from this example just needs a basic Unix-
               | like system that you have shell access to:
                | 
                |     #!/bin/sh
                |     COUNT=$1
                |     shift
                |     KEY=`/bin/echo -n $* | shasum -a 512 | cut -d ' ' -f 1`
                |     while [ $COUNT -ge 1 ]; do
                |         KEY=`/bin/echo -n $COUNT.$KEY | shasum -a 512 | cut -d ' ' -f 1`
                |         COUNT=`expr $COUNT - 1`
                |     done
                |     echo $KEY
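                | 
                | For comparison, a rough Python sketch of the same scheme
                | with the XOR step folded in (byte alignment and output
                | formatting are simplified here, not identical to the
                | splitting described above):
                | 
                |     import hashlib
                | 
                |     def chain_hash(count, text):
                |         key = hashlib.sha512(text.encode()).hexdigest()
                |         for c in range(count, 0, -1):
                |             key = hashlib.sha512(
                |                 f"{c}.{key}".encode()).hexdigest()
                |         return bytes.fromhex(key)   # 64-byte mask
                | 
                |     def make_backup(count, text, secret):
                |         mask = chain_hash(count, text)
                |         data = secret.encode()
                |         return bytes(a ^ b for a, b
                |                      in zip(data, mask)).hex()
                | 
                |     def restore(count, text, backup_hex):
                |         mask = chain_hash(count, text)
                |         data = bytes.fromhex(backup_hex)
                |         return bytes(a ^ b for a, b
                |                      in zip(data, mask)).decode()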
        
               | zrm wrote:
               | Keep one copy in your fire-resistant safe at home. Then
               | encrypt a copy, give the encrypted copy to your best
               | friend and the decryption key to a family member, or keep
               | one of these things in your desk at work. Neither of them
               | have access unless they both figure out what it is and
               | collude with each other, but you have a recovery system
               | in case you lose your own copy.
        
               | tacocataco wrote:
               | Why can't you bury a 2nd box in your friends yard who
               | lives across the country?
        
               | CatWChainsaw wrote:
               | Okay, and when your friend moves, and you buried it years
               | ago, so they forgot to dig it up what with everything
               | else going on in their life at moving time?
        
             | forgotmyinfo wrote:
             | Never underestimate the security and safety of a hidden
             | piece of paper! If it's good enough for wills for the last
             | 500 years, it's good enough for a password.
        
               | lxgr wrote:
               | A better analogy would be a piece of paper with your
               | username.
               | 
               | Finding somebody's will doesn't give you access to any of
               | their data or funds.
        
             | fortran77 wrote:
             | I keep one-time keys between pages of some books on my
             | shelf, and a copy in a safe deposit box. I suppose if I
                | were publicly known to have tons of money in "crypto" or
             | were a target of a nation-state, this wouldn't be safe
             | enough. But I think it's OK for my gmail and OneDrive, etc.
        
           | m_a_g wrote:
           | > When you set up a recovery key, you turn off Apple's
           | standard account recovery process.
           | 
           | > However, if you lose your recovery key and can't access one
           | of your trusted devices, you'll be locked out of your account
           | permanently.
           | 
           | I considered it before but I think it's just too much risk as
           | I rely heavily on iCloud. On the other hand, I don't see the
           | risk with the current method if you're smart enough not to
           | fall for things like MFA bombing tactics.
        
             | asd4 wrote:
              | The security researcher in the article was concerned about
              | accidentally confirming the prompt on his watch.
              | 
              | I don't think it's a matter of being "smart enough". Human
              | error can easily creep in when dismissing tens or hundreds
              | of prompts.
        
               | hunter2_ wrote:
               | The prompt UX should step into a special "bombed" mode
               | when a frequency threshold is crossed, at which point
               | accepting a prompt has fat-finger protection such as
               | double confirmation steps, and declining all (or perhaps
               | all that share a commonality, like same initiating IP
               | address) becomes possible.
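                | 
                | A rough sketch of the frequency-threshold part (window
                | and threshold values invented for illustration):
                | 
                |     import time
                |     from collections import deque
                | 
                |     WINDOW_SECONDS = 3600   # look at the last hour
                |     BOMB_THRESHOLD = 5      # prompts before "bombed" UX
                |     _recent = deque()
                | 
                |     def record_prompt_and_check():
                |         # True once the UX should enter "bombed" mode.
                |         now = time.time()
                |         _recent.append(now)
                |         while _recent and now - _recent[0] > WINDOW_SECONDS:
                |             _recent.popleft()
                |         return len(_recent) >= BOMB_THRESHOLD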
        
               | lazide wrote:
               | Or you know, not allow this kind of brute forcing at all?
        
           | nerdjon wrote:
            | You can set up a recovery contact in case you do lose the
            | key. I just set that up with my partner, and the chance of
            | losing the key and both of us losing all of our Apple
            | devices I think is fairly slim.
            | 
            | I also stuck that key in 1Password (sure, it's less safe,
            | but if my 1Password was breached I have far bigger problems
            | than this key being retrieved).
            | 
            | Then keep a hard copy in a safe. Been contemplating sending
            | my parents a safe (who live several states away) with keys
            | on a sheet of paper without context that only I have the
            | combination to. But not sure yet.
        
             | rjbwork wrote:
             | >Then keep a hard copy in a safe. Been contemplating
             | sending my parents a safe (who live several states away)
             | with keys on a sheet of paper without context that only I
              | have the combination to.
             | 
             | A friend of mine who was (maybe is? he knows I'm not a fan
             | so we don't talk about it much) big into crypto stores his
             | secrets in similar safes with trusted friends and family
             | around the country. I think it's a good idea for things
             | like this tbh.
        
               | nerdjon wrote:
                | I think it is a good idea in theory also; there is just
                | that voice that says "well, now that key is out of my
                | possession" and it scares me a bit.
                | 
                | I think I might need to look up whether there is a known
                | pattern to these keys such that it could easily be
                | figured out what one is, even if it is just on a sheet
                | with no context. Particularly 1Password, which I think
                | has a pattern if I remember correctly.
        
               | robmccoll wrote:
               | You could split the key a few ways if you don't want to
               | trust that one of your stores won't be compromised
               | https://en.m.wikipedia.org/wiki/Shamir%27s_secret_sharing
        
               | IncreasePosts wrote:
                | Or, just apply some simple, easy-to-remember permutation
                | to the key that no one would be likely to guess - e.g.
                | rot13 the key, add 1 to every character, or move the
                | first 14 characters of the key to the end of the key,
                | etc.
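                | 
                | A toy version of the rot13 + rotate idea (illustration
                | only; a transform this guessable adds little real
                | protection):
                | 
                |     import codecs
                | 
                |     def disguise(key):
                |         # rot13 the letters, then move the first 14
                |         # characters to the end (assumes len(key) >= 14)
                |         r = codecs.encode(key, "rot13")
                |         return r[14:] + r[:14]
                | 
                |     def recover(disguised):
                |         r = disguised[-14:] + disguised[:-14]
                |         return codecs.decode(r, "rot13")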
        
               | e40 wrote:
               | > Particularly 1Password which I think is a pattern if I
               | remember correctly.
               | 
               | What does that mean?
        
               | hackinthebochs wrote:
                | Probably that the key has features that allow 1Password
                | (and potentially anyone) to recognize that it's a
                | 1Password key, e.g. fixed size, patterns of spaces or
               | dashes, specific digits, embedded error correction, etc.
        
               | nerdjon wrote:
               | Yeah that is what I mean.
               | 
               | Similar to how a lot of package companies have a certain
                | pattern, length, whatever for their tracking numbers. If
                | there was a somewhat reliable way to say "this is a
                | 1Password key" or "this is an iCloud key", it means that
                | even without context it could be an issue.
        
             | catlikesshrimp wrote:
              | Hard copy? Etch the string into a hard surface. My
              | favorite is a rock in my garden. The characters face the
              | ground to shield them from erosion. The visible surface of
              | the rocks (all of them) is painted white for aesthetics.
              | 
              | Survives a fire or earthquake. No tornadoes or tsunamis
              | here. Nobody has stolen any such rocks from here.
        
             | lxgr wrote:
             | How many people own a safe? I personally don't know anybody
             | that does. I do know that safes sometimes get stolen.
        
           | josephcsible wrote:
           | Such a high risk of being locked out permanently is more than
           | most people can stomach. Why can't they offer a last-resort
           | option like showing up in person at an Apple Store with
           | government-issued photo ID?
        
             | toomuchtodo wrote:
             | Because they aren't required to by law. I have filed
             | comments with the FTC that this recovery path should be
             | legally mandated for digital accounts, I encourage others
             | to do the same. It doesn't have to be an Apple Store
             | (insider risk, see SIM swapping analogy); could be USPS or
             | another government identity proofer they partner with.
             | Login.gov uses USPS for in person identity proofing, for
             | example.
             | 
             | Your data and account ownership interest doesn't disappear
             | because of failure to possess the right sequence of bytes
             | or a string. Can you imagine if your real estate or
             | securities ownership evaporated because you didn't have the
             | right password? Silliness.
        
               | kccqzy wrote:
               | Well previously when stock trades involved exchanging
               | physical certificates, I could imagine that ownership
               | could evaporate if you lost that piece of paper. Or just
               | think about cash: you do lose that ownership when you
               | lose that magical piece of paper. It's a simpler world
               | when what you have physically determines what you own.
        
               | josephcsible wrote:
               | If the deed to land or the title to a car gets destroyed,
               | what happens? It doesn't suddenly forever become
               | unownable.
        
               | jrkatz wrote:
               | Physical certificates are not a thing of the past and can
               | be restored upon loss or destruction:
               | https://www.investor.gov/introduction-
               | investing/investing-ba...
        
               | toomuchtodo wrote:
               | People want a just world (imho, n=1, based on all
               | available evidence, etc), recourse, and protections, not
               | a simple world. Interestingly, cash will likely be the
               | last to go in the near future from a "possession of
               | value" as the world goes cashless (although whether this
               | is "good" or "bad" can be argued in another thread).
               | 
               | https://en.wikipedia.org/wiki/Cashless_society
        
               | AnthonyMouse wrote:
               | This should not be required by law because many people
               | specifically don't want it. I'm content to keep my own
               | redundant copies of a recovery key and suffer the
               | consequences of my own actions, rather than allowing
               | someone to steal my account just because they made a
               | convincing fake ID or hacked some government system. In
               | general centralized identity systems are a single point
               | of failure and hooking more things into them is a bad
               | thing.
               | 
               | > Your data and account ownership interest doesn't
               | disappear because of failure to possess the right
               | sequence of bytes or a string.
               | 
               | Somehow you have to establish that you _are_ the owner of
               | the account, in a way that nobody else can do it. This is
                | very much not a trivial problem, and government IDs
                | don't provide any kind of solution to it.
               | 
               | If you need a driver's license, how do you get a driver's
               | license? With a birth certificate? Okay, how do you get a
               | copy of your birth certificate when you don't have a
               | driver's license?
               | 
               | If there is a path to go from your house burning down and
               | you having zero documents to you having a valid ID again
               | without proving you've memorized or otherwise backed up
               | any kind of secrets, an attacker can do the same thing
               | and get an ID in your name. This is why identity theft is
               | a thing in every system that relies on government ID.
               | Requiring all systems to accept government ID is
               | requiring all systems to be subject to identity theft.
        
               | toomuchtodo wrote:
               | I argue for and advocate that this capability should
               | exist, but not be mandatory. If you do not want to tie
               | your personal identity to your digital identity,
               | certainly, you should be able to not do so and rely
               | solely on a cryptographic primitive, recovery key, or
               | other digital mechanism to govern access of last resort.
               | If your account access is lost forever, it's on you and
               | that was a choice that was made.
               | 
               | > Somehow you have to establish that you are the owner of
               | the account, in a way that nobody else can do it. This is
               | very much not a trivial problem, and government IDs don't
               | provide any kind of solution to it.
               | 
               | This is actually very easy. You can identity proof
               | someone through Stripe Identity [1] for ~$2/transaction.
               | There are of course other private companies who will do
               | this. You bind this identity to the digital identity
               | once, when you have a high identity assurance level
               | (IAL). Account recovery is then trivial.
               | 
               | > If you need a driver's license, how do you get a
               | driver's license? With a birth certificate? Okay, how do
               | you get a copy of your birth certificate when you don't
               | have a driver's license?
               | 
               | This is government's problem luckily, not that of private
               | companies who would need to offer account identity
               | bootstrapping. Does the liquor store or bar care where
               | you got your government ID? The notary? The bank? They do
               | not, because they trust the government to issue these
               | credentials. They simply require the state of federal
               | government credential. Based on the amount of crypto
               | fraud that has occurred (~$72B and counting [2]),
               | government identity web of trust is much more robust than
               | "not your keys, not your crypto" and similar digital only
               | primitives.
               | 
               | NIST 800-63 should answer any questions you might have I
               | have not already answered:
               | https://pages.nist.gov/800-63-3/ (NIST Digital Identity
               | Guidelines)
               | 
               | [1] https://stripe.com/identity
               | 
               | [2] https://www.web3isgoinggreat.com/charts/top
               | 
               | (customer identity is a component of my work in financial
               | services)
        
               | AnthonyMouse wrote:
               | > This is actually very easy. You can identity proof
               | someone through Stripe Identity [1] for ~$2/transaction.
               | 
               | "Pay someone else to do it" is easy in the sense that
               | doing the hard thing is now somebody else's problem, not
               | in the sense that doing it is not hard. That also seems
               | like a compliance service -- you are required to KYC,
               | service provides box-checking for the regulatory
               | requirement -- not something that can actually determine
               | if someone is using a fraudulent ID, e.g. because they
               | breached some DMV or some other company's servers and now
               | have access to their customers' IDs.
               | 
               | > This is government's problem luckily, not that of
               | private companies who would need to offer account
               | identity bootstrapping.
               | 
               | But it's actually the user's problem if it means the
               | government's system has poor security and allows someone
               | else to gain access to their account.
               | 
               | > Based on the amount of crypto fraud that has occurred
               | (~$72B and counting [2]), government identity web of
               | trust is much more robust than "not your keys, not your
               | crypto" and similar digital only primitives.
               | 
               | The vast majority of these are from custodial services,
                | i.e. the things that _don't_ keep the important keys in
               | the hands of the users. Notably this number (which is
               | global) is less than the losses from identity theft in
               | the US alone.
               | 
               | The general problem also stems from "crypto transactions
               | are irreversible" rather than "crypto transactions are
               | secured by secrets". Systems with irreversible
               | transactions are suitable for storing and transferring
               | moderate amounts of value, as for example the amount of
               | ordinary cash a person might keep in their wallet. People
               | storing a hundred million dollars in a crypto wallet and
               | not physically securing the keys like they're a hundred
               | million dollars in gold bars are the fools from the
               | saying about fools and their money.
        
             | lazide wrote:
             | Have you seen how easy it is to get fake government ID?
             | It's damn near a rite of passage for teenagers so they can
             | buy alcohol. $20-$50 if you know the right person or can
             | wander the dark web right.
             | 
             | I'm not sure you want that to be the absolute best digital
             | security you can get.
        
               | SoftTalker wrote:
               | Yes it is vulnerable to an attacker who is willing to
               | present himself in person with a fake ID to target a
               | specific account. However it's not scalable or remotely
               | exploitable.
        
               | lazide wrote:
               | Since it requires a human looking at an ID and then
               | pressing a button, the system triggered by the button
               | press is likely quite exploitable no? Or even worse,
               | scanning and storing an ID, which allows spoofing if
               | those get compromised.
               | 
               | Recovery key isn't susceptible to that - and isn't
               | susceptible to fake-id-spotting-ability or bribeability
               | of staff either.
        
               | josephcsible wrote:
               | Okay, then also require a photo when opting in to this,
               | and make sure the person who shows up looks like said
               | photo too.
        
             | semiquaver wrote:
             | > Why can't they offer a last-resort option like showing up
             | in person at an Apple Store with government-issued photo
             | ID?
             | 
             | This is easy to defeat and completely subverts the purpose
             | of the system. If you are not comfortable with self-custody
             | then don't opt in.
        
             | matwood wrote:
             | I think their option for last resort is the trusted
             | contact.
        
             | recursive wrote:
             | How would this work? If this was possible, that would mean
             | an Apple employee is verifying the ID. This has failure
             | modes. See SIM swapping attacks.
        
               | mlyle wrote:
               | There's a wide set of possible approaches between "let
               | any employee validate any ID" and "never let someone into
               | an account that they have lost the credential to."
               | 
               | E.g. you could make it costly to attempt, require a
               | notarized proof of identity -and- showing up at the Apple
               | store, and enforce a n-day waiting period. A different
               | employee does the unlock (from a customer service queue)
               | than accepts the paperwork.
               | 
               | We don't lock people out of financial accounts forever
               | when they forget a credential. It could definitely be
               | solved for other types of accounts.
        
               | josephcsible wrote:
               | Aren't SIM swapping attacks only such a problem because
               | you can get a new SIM _without_ showing up in person with
               | ID?
        
               | recursive wrote:
               | No, they're also a problem because you can run into a
               | storefront and snatch the employee's authenticated
               | tablet, regardless of what company policy is.
        
             | hedora wrote:
             | This is the default behavior if you don't turn this stuff
             | on. They store your account recovery key in an escrow
             | device.
             | 
             | The main problem is that walking into an apple store with a
             | government-issued warrant works just as well as walking in
             | with a government-issued ID.
        
           | fennecbutt wrote:
           | Put in safe deposit boxes at 2 different banks or something,
           | I guess.
        
             | lxgr wrote:
             | That raises the TCO of iCloud considerably.
        
           | jareklupinski wrote:
           | > lose the key and no one can get you back in to your account
           | 
           | sounds like a feature
           | 
           | "want to totally restart your entire digital life? just rip
           | up your key :) never worry about something from your past
            | coming back to you ever again!"
        
             | bilbo0s wrote:
             | Only if you do everything at Apple.
             | 
             | You make posts on twitter, it's not protected the same way.
        
               | dylan604 wrote:
               | I want to be upset that you've made a comment so obvious,
               | yet sadly, there will be people in the wild that don't
               | understand the silos platforms build. However, I doubt
               | any of them are here reading this, but you never know.
        
               | hedora wrote:
               | Go read any thread about passkeys. :-)
        
             | headmelted wrote:
             | That seems like the worst option. Everything up to the free
             | tier would stay there forever with no way for you to ever
             | request it to be deleted.
        
               | sgjohnson wrote:
               | Turn on Advanced Data Protection before you rip up the
               | key. Then it's all as good as deleted.
        
           | ThomasBb wrote:
           | " If you use Advanced Data Protection and set up both a
           | recovery key and a recovery contact, you can use either your
           | recovery key or recovery contact to regain access to your
           | account."
        
           | themagician wrote:
           | You can regenerate a new key from any logged in device, so
           | you have to lose the key AND every device.
        
             | lxgr wrote:
             | Except if Apple decides, based on undocumented heuristics,
             | that you do in fact need the key, as far as I've heard.
        
           | lloeki wrote:
           | > lose the key and no one can get you back in to your
           | account.
           | 
           | Incorrect: only Apple cannot.
           | 
           | You can voluntarily declare:
           | 
           | - recovery accounts: these trusted accounts can help you
           | authenticate anytime.
           | 
           | https://support.apple.com/en-us/HT212513
           | 
           | - legacy contacts: these trusted contacts can access your
           | account in the event of your death.
           | 
           | https://support.apple.com/en-us/102631
           | 
            | As for the "lose recovery key" situation: it is no different
            | than hardware token 2FA + recovery codes. Print multiple
            | copies and spread them to trusted third parties.
        
         | antihero wrote:
         | Also, buy some (at least three) YubiKeys and use them for your
         | Apple ID verification instead of the dumb push MFA.
         | 
         | https://support.apple.com/en-gb/HT213154
        
           | someguydave wrote:
           | But is it the case that the Yubikey is essentially treated
            | the same as a trusted device? What if I want to untrust my
            | devices and only trust Yubikeys (without removing the device
            | from my iCloud account)?
        
             | antihero wrote:
             | I don't seem to have the push option now
        
               | someguydave wrote:
               | Yes but my understanding is that you can remove the
               | Yubikey without possessing it, just with a "trusted
               | device". I want to mark all of my devices untrusted (wrt
               | icloud account changes) and rely only on Yubikeys
        
           | someguydave wrote:
           | From your apple doc:
           | 
           | " When you use Security Keys for Apple ID, you'll need a
           | trusted device _or_ a security key to:
           | 
           | Sign in with your Apple ID on a new device or on the web
           | 
           | Reset your Apple ID password or unlock your Apple ID
           | 
           | Add additional security keys or remove a security key"
           | 
           | Yubikeys do nothing except enlarge your attack surface.
        
         | vdddv wrote:
         | Interesting that using the recovery key stopped the issue for
         | you, but does not seem to do its job now. From the article "Ken
         | said he enabled a recovery key for his account as instructed,
         | but that it hasn't stopped the unbidden system alerts from
         | appearing on all of his devices every few days.
         | 
         | KrebsOnSecurity tested Ken's experience, and can confirm that
         | enabling a recovery key does nothing to stop a password reset
         | prompt from being sent to associated Apple devices. "
        
           | hx833001 wrote:
           | A password reset prompt is sent to the devices, but
           | unfortunately the article leaves out that the prompt only
           | enables you to reset the password on the device that receives
           | the prompt. So it is not a security issue, just an annoyance.
        
         | fortran77 wrote:
         | Wow! You'd think they'd rate limit these! Once you've done it
         | twice, go to once every 15 minutes, then every hour, then 4
         | hours, then a day, etc. Like bad logins.
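         | 
         | Something like this escalating schedule (using the intervals
         | above; a rough sketch, not anything Apple actually does):
         | 
         |     # minutes required before reset attempt N is allowed
         |     DELAYS = [0, 0, 0, 15, 60, 240, 1440]
         | 
         |     def min_wait_minutes(attempt_number):
         |         i = min(attempt_number, len(DELAYS) - 1)
         |         return DELAYS[i]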
        
           | nilsherzig wrote:
           | That would allow me to log you out of your accounts
        
             | prepend wrote:
              | No, it wouldn't affect login status. Just a delay between
              | reset attempts.
             | 
             | No reset actually occurs until one prompt is accepted.
        
           | WorldMaker wrote:
            | Krebs notes that the recovery form does have some form of
            | CAPTCHA on it, which mostly just goes to show that CAPTCHA
            | systems are a poor and increasingly deficient rate limiter.
            | 
            | ETA: Also, from a user experience standpoint, even one
            | attempt a week is still enough to deeply annoy a user
            | getting popups on their devices. This is one of those cases
            | where rate limits probably still can't solve the user
            | irritation.
        
       | _def wrote:
       | I wonder how long it will take before another goal of these phone
       | calls is to gather enough samples to convincingly clone your
       | voice.
        
         | Sarkie wrote:
         | Good fucking point this
        
           | ted_bunny wrote:
           | Bad comment, this. Just upvote.
        
         | rvz wrote:
         | Exactly this.
         | 
         | Another reason not to use phone calls (or phone numbers) to
         | verify users, even with so-called 'voice identification' or
         | 'voice ID', which can easily be broken with advanced voice
         | cloning.
        
           | _def wrote:
           | Recently I was baffled by how far we've come with this. It
           | doesn't work perfectly, but could be enough to fool someone.
           | Just one pip install and a short voice sample away:
           | https://github.com/coqui-ai/TTS
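           | 
           | Roughly, the voice-cloning flow with that repo looks like
           | this (a sketch from memory of the XTTS usage in its README;
           | the model name and arguments may have changed since):
           | 
           |     # pip install TTS
           |     from TTS.api import TTS
           | 
           |     # Load a multilingual voice-cloning model.
           |     tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")
           | 
           |     # Clone the voice in a short reference recording and
           |     # speak arbitrary text with it.
           |     tts.tts_to_file(
           |         text="Hi, it's really me, please read me the code.",
           |         speaker_wav="short_voice_sample.wav",
           |         language="en",
           |         file_path="cloned_output.wav",
           |     )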
        
         | ManBeardPc wrote:
         | There is already a variant where they try to get someone to say
         | "yes" and then use a recording of it as "proof" that you agreed
         | to some contract.
        
           | guappa wrote:
           | Phone providers have been doing this one in Italy for over a
           | decade.
        
           | nebula8804 wrote:
           | OMG that explains so much. I kept getting these calls where
           | they would ask "Am I speaking with the head of the
           | household?"...crap
        
           | 14 wrote:
           | I actually don't answer unknown callers with "hello" or any
           | words at all. I simply say "mmmhhmm" or make a dumb sound; if
           | it is automated, it will still trigger the automatic message.
           | Someone asked why and I said voice cloning software; they
           | said wtf, you have nothing to steal. Just feels risky, idk
           | why.
        
             | jmkni wrote:
             | https://www.youtube.com/watch?v=YFWgyi-zzmE
        
         | gruez wrote:
         | You're probably not going to get a voice clone from someone
         | saying "hello?" 100 times. However, you don't really need to
         | "MFA Bomb" people to clone their voice; just call them with a
         | plausible-sounding reason that will cause them to engage in an
         | extended conversation (e.g. "hey, this is your uber/doordash
         | driver/doctor/school/daycare").
        
           | rainbowzootsuit wrote:
            | I just really want to hear you say "passport"!
        
       | honzaik wrote:
       | I am confused. What happens after clicking allow? Does Apple
       | just provide a password reset form to the person on the iForgot
       | website, or does it show up only on the device?
        
         | viktorcode wrote:
         | I think it will show you the confirmation code on the device.
         | Then the scammer will call to learn the code.
        
       | chrisjj wrote:
       | > even though I have my Apple watch set to remain quiet during
       | the time I'm usually sleeping at night, it woke me up with one of
       | these alerts.
       | 
       | So... Apple Watch "quiet" is broken??
        
         | brookst wrote:
         | I find sleep focus mode much more reliable than the silent
         | switch. It's confusing they have both.
        
           | aareet wrote:
           | I think it is just a transition period until they can get rid
           | of models with the switch in their lineup. Since the action
           | button is now configurable, it could soon turn back into just
           | focus modes as the configurable way to silence your phone
        
       | prmoustache wrote:
       | I have hated Push MFA since it was introduced.
       | 
       | How hard is it to just type a code, really? In the end, to fight
       | against push bombing you end up with push notifications that ask
       | you for a code anyway.
        
         | gruez wrote:
         | At least for iCloud sign-ins (not sure about password resets,
         | too lazy to check), clicking "allow" doesn't allow the sign-in;
         | it only displays a 6-digit code that you have to enter to log
         | in.
        
         | antihero wrote:
         | You can instead opt to use HSMs for your Apple ID MFA. I have
         | 3x YubiKeys in various locations for this exact purpose.
         | 
         | https://support.apple.com/en-gb/HT213154
        
           | gruez wrote:
           | They mention "FIDO(r) Certified* security keys", which
           | presumably means physical keys only, and not soft keys like
           | the ones that keepassxc/bitwarden provide? If so, that might
           | be too much of a hassle for me. I care about my security, but
           | I don't care enough to drop $100 on 3 separate security keys
           | and find 3 separate places to keep them secure.
        
             | hocuspocus wrote:
             | You need two keys, not three.
             | 
             | But yes I wish you could use one hardware key as backup and
             | one software key for day-to-day usage, or at least the
             | security key in a trusted device (up to you to have a
             | circular dependency to your main device or not).
        
           | someguydave wrote:
           | It does not help you when a trusted device is stolen; the
           | Yubikeys can be disabled if they unlock your phone or device.
        
       | mavamaarten wrote:
       | I've been getting these on my LinkedIn account for a couple of
       | days now. Every few hours I get an email with a magic login link.
       | They seem legitimate, originating from various locations around
       | the globe.
        
         | standing_user wrote:
         | Happened to me yesterday. I was baffled, but then I found that
         | you can request the one-time password just by using the email
         | associated with the LinkedIn account, so the password wasn't
         | compromised.
         | 
         | I have changed the password and main email, and in LinkedIn's
         | privacy settings I removed the visibility of the email.
        
           | pinebox wrote:
           | Linkedin _will_ silently change your visibility settings
           | without your consent.
        
             | estiaan wrote:
              | Do you have a source for that? Or any more info? It's not
              | that I doubt it; I ask because some details like my work
              | email, job title and place of employment have been leaking
              | into the hands of marketing companies and I am trying to
              | figure out how.
        
               | forgotmyinfo wrote:
                | Your own company could've sold it to data brokers. Look
                | into Equifax's Work Number score; it includes fun things
                | like where you worked and how much you made. But no,
                | let's not unionize or anything.
        
               | dpifke wrote:
               | Companies with union labor also sell their employees'
               | data to Equifax.
               | 
               | Unions are on board with this, see e.g.
               | https://unitedafa.org/news/2020/5/9/employment-
               | verification-...
        
         | m-p-3 wrote:
         | I get these too. I wish I could turn the feature off in my
         | account, especially since I already have multiple forms of 2FA
         | (TOTP, Passkeys).
        
         | donovanallen wrote:
         | It's been Uber for me
        
       | mcintyre1994 wrote:
       | That message is horribly designed if it allows a password reset
       | to happen on any other device after you click allow. It
       | specifically says "Use this iPhone to reset". I'd have assumed it
       | asks the person who clicked allow to set a new password, on the
       | same device where they clicked allow.
       | 
       | Then again, if it shows on the watch too (and isn't just mirroring
       | a phone notification, since it ignores quiet mode), I can't
       | imagine the idea is you click allow on your watch and then type a
       | password on its keyboard?
        
         | fortran77 wrote:
         | > That message is horribly designed if it allows a password
         | reset to happen on any other device after you click allow
         | 
         | This was a lifesaver when my 90-year-old mother forgot her iMac
         | password (and I forgot that I had created a second admin
         | account on her machine). After getting locked out of the iMac,
         | we were able to reset it because we were able to get into her
         | iPad (which she forgot the PIN to, but fortunately we found it
         | written down).
        
       | woadwarrior01 wrote:
       | Quite shocking how oblivious a lot of ostensibly tech savvy
       | people are to the existence of hardware security tokens. Yubikeys
       | have been around for over 15 years now, although Apple only added
       | support for hardware tokens recently.
       | 
       | https://support.apple.com/en-us/HT213154
        
         | recursive wrote:
         | I know they exist. I just don't really know how they work or
         | what they do.
        
         | someguydave wrote:
         | They don't help in the case that your unlocked phone is stolen
        
       | nerdjon wrote:
       | The lack of rate limiting is surprising, either on the server
       | side or the OS side (or both).
       | 
       | I mean, they already lock my iPhone after too many failed
       | passcode attempts, and the lockout gets longer each time; I feel
       | like the lock here should be the same.
       | 
       | A better prompt would also go a long way.
        
       | rekoil wrote:
       | At some point, the ability to trigger these prompts (or ones like
       | them, such as the Bluetooth-based "set up new device" prompts
       | that were in the news last year) on Apple devices is itself the
       | problem, right?
       | 
       | Obviously it must be possible to reset one's password, but per
       | the article it's apparently possible to make 30 password reset
       | requests in a short amount of time.
       | 
       | What possible non-malicious reason could there be for that to
       | happen?
        
         | gruez wrote:
         | None, it's just that they haven't bothered adding a check for
         | them. This isn't necessarily an indictment of them. It makes
         | sense in hindsight, but between sprints, OKRs/KPIs, and
         | promotion packets, it's easy to let non-sexy functionality like
         | this slip through the cracks.
        
           | forgotmyinfo wrote:
           | It's distressing and sad that we've come to expect so little
           | from the trillion-dollar market cap companies to which we are
           | beholden to participate in modernity.
        
             | zubairshaik wrote:
             | It's not as alarming if we just reframe it. Apple's
             | software is written by developers, like many HN readers,
              | and they follow similar internal processes. There is nothing
             | inherent about having a large market cap that makes
             | everyone involved superhuman. Some issues always slip
             | through the cracks.
             | 
             | I'm surprised to see this comment on HN where many readers
             | see how the sausage is made. There's no secret sauce, no
             | matter how far up in FAANG/MANGA you get.
        
       | chefandy wrote:
       | I've been too immersed in university happenings recently. It took
       | me clicking on the link and reading until "password reset
       | feature" to realize that this wasn't some bizarre phishing attack
       | involving Masters of Fine Arts degrees.
        
       | fennecbutt wrote:
       | B-but iPhones are secure and are the best and Apple spends so
       | much money on security to keep us safe and doesn't need any
       | government/EU oversight at all. Proof that Apple's "it's for your
       | own good" has always just been marketing.
       | 
       | (Don't get me wrong, let's go after Google, MS, Sony, et al
       | too!!!)
        
         | ghodith wrote:
         | I don't see where EU regulations would have helped in this
         | case.
        
       | WarOnPrivacy wrote:
       | _he received a call on his iPhone that said it was from Apple
       | support._
       | 
       |  _" I said I would call them back and hung up," Chris said,
       | demonstrating the proper response to such unbidden
       | solicitations_."
       | 
       | We're long-conditioned to assume that calling a large company and
       | reaching a human will be difficult to impossible - and if we
       | succeed, it will be an unpleasant experience. Much more so for a
       | major tech company.
       | 
       | To the extent that this scam succeeds, it's partially due to
       | intentional business design.
        
         | someguydave wrote:
         | This is true, and it is because the public is mostly too inept
         | to be responsible for themselves
        
           | WarOnPrivacy wrote:
           | > This is true, and it is because the public is mostly too
           | inept to be responsible for themselves
           | 
           | So why is an inept public responsible for major corps' choices
           | to mostly remove phone-to-human cust svc - and not corp
           | poisoning by MBAs?
        
         | metanonsense wrote:
         | A few weeks ago, we had a major problem with our Apple
         | developer account (which is registered to my name). For days, I
         | tried everything to avoid calling customer support (for the
         | above reasons) and only agreed when our release team started
         | panicking. I was more than surprised by how incredibly good
         | Apple's support team was. Recovering from the problem was quite
         | difficult (and the circumstances that led to it made me
         | question Apple's SW dev capabilities), but the support
         | experience was simply perfect.
        
       | JohnMakin wrote:
       | my mfa applications do not work on any other device, even if it's
       | restored from icloud. However, this would still be incredibly
       | concerning.
        
       | tanelpoder wrote:
       | There's an important omission in the article and the top comments
       | here don't mention it either: Accidentally tapping "Allow" does
       | not allow the attacker to change the password on their web
       | browser. When you tap Allow on your device, you are shown the
       | 6-digit pin on _your_ device and _you_ can use it to change your
       | password on _your_ device. The final part of the attack is that
       | the attacker calls you using a spoofed Apple phone number and
       | asks _you_ to read out the 6-digit pin to them. If _you_ choose
       | to give out the 6-digit pin to the attacker over an incoming
       | phone call, then they can use it in their browser to reset your
       | password.
       | 
       | It's surprising that Krebs chose to omit this little detail in
       | the security blog and instead seemed to confirm that someone
       | could completely give away access to their account while
       | sleeping.
        
         | WheatMillington wrote:
         | He describes this in the very first paragraph of the article:
         | 
         | >Assuming the user manages not to fat-finger the wrong button
         | on the umpteenth password reset request, the scammers will then
         | call the victim while spoofing Apple support in the caller ID,
         | saying the user's account is under attack and that Apple
         | support needs to "verify" a one-time code.
        
           | rootusrootus wrote:
           | That seems to be an entirely different point. Krebs suggests
           | repeatedly that all you need to do to get hacked is click
           | "Allow" in the push notification. This is demonstrably false.
           | 
           | "Assuming the user manages not to fat-finger the wrong
           | button" means "assuming the user clicks Don't Allow". They
           | call on the phone to try and convince the user to say Allow
           | next time.
           | 
           | Of course that's kinda BS too, because the only time "Allow"
           | gives you a six digit code is if you successfully
           | authenticate your apple ID on a new device. If you get the
           | reset password dialog, the result of Allow is not a six digit
           | code, it just allows you to reset the password. Yourself. On
           | your device.
        
             | WheatMillington wrote:
              | Are you reading the second half of the sentence I posted?
              | Sorry, but I'm not understanding where you are coming from -
              | Krebs lays out clearly in the first paragraph how the
              | attack works, and you seem to be deliberately ignoring that.
        
               | rootusrootus wrote:
               | No? I thought I specifically addressed that. They call
               | you on the phone and ask for a code you won't have, even
               | if you hit Allow.
               | 
               | What I find interesting is that Krebs didn't do any
               | legwork to verify the claims before publishing.
        
         | mattmaroon wrote:
         | Fair, and good to know, but I could still easily see reasonable
         | people (not just 80 yr olds with their Obamaphone) falling for
         | this.
         | 
         | And even if not, there's a severe annoyance factor here that
         | could be simply removed by Apple rate limiting these requests.
         | Why can they send you hundreds of these in a short time?
        
       | chatmasta wrote:
       | > he received a call on his iPhone that said it was from Apple
       | Support (the number displayed was 1-800-275-2273, Apple's real
       | customer support line)
       | 
       | This happened to me exactly once, and it was two days after I
       | ordered a new MacBook from the online Apple Store. Since I was
       | expecting a shipment, I almost picked it up. But instead I called
       | Apple Support myself, and asked if they had called me, and they
       | said they had not.
        
       | rootusrootus wrote:
       | This seems like it is entirely a human problem, not any kind of
       | technical failure. The fix is the same as it always was -- people
       | need to be trained to say no by default, do not trust inbound
       | calls _ever_, and never ever share your credentials.
       | 
       | If you follow that advice, this attack poses no risk other than
       | annoyance. If you do not give your password to the creep who
       | calls you claiming to be apple support, you will be okay.
        
         | dimgl wrote:
         | > people need to be trained to say no by default, do not trust
         | inbound calls ever
         | 
         | This really sucks though. It basically means that our current
         | phone system is inherently broken and something that was
         | potentially useful before is no longer useful due to malicious
         | actors.
        
         | ascorbic wrote:
         | A system that lets an attacker send hundreds of push
         | notifications, effectively making a phone unusable until you
         | click "allow" is a technical failure. So is one that lets an
         | attacker spoof Apple's caller ID. Sure, that one is a failure
         | with caller ID in general, but it's not beyond Apple's ability
         | to special-case its own numbers.
        
       | shuntress wrote:
       | It still seems wrong to me that we, as a society, have basically
       | accepted this level of crime as just a constant sort of
       | background noise in daily life.
        
       | kevrmoore wrote:
       | This happened to me about 2 yrs ago. It catches you off guard
       | when you receive a spoofed call from Apple Care as you are being
       | bombarded with PW reset requests from your iCloud. Of course, the
       | hacker is really good and answers all the Apple-related questions
       | fluidly. I believe my account data came from the big Ledger hack,
       | so they were targeting crypto holders. iCloud security was so
       | weak back then!
        
       ___________________________________________________________________
       (page generated 2024-03-27 23:01 UTC)