[HN Gopher] Reverse Engineering iOS 18 Inactivity Reboot
___________________________________________________________________
Reverse Engineering iOS 18 Inactivity Reboot
Author : moonsword
Score : 482 points
Date : 2024-11-17 21:50 UTC (1 days ago)
(HTM) web link (naehrdine.blogspot.com)
(TXT) w3m dump (naehrdine.blogspot.com)
| chews wrote:
| thank you for such a great writeup, this is an excellent
| breakdown!
| threeseed wrote:
| I suspected this was being managed in the Secure Enclave.
|
| That means it's going to be extremely difficult to disable this
| even if iOS is fully compromised.
| karlgkk wrote:
| If I'm reading this right:
|
| Reboot is not enforced by the SEP, though, only requested. It's
| a kernel module, which means if a kernel exploit is found, this
| could be stopped.
|
| However, considering Apple's excellent track record on these
| kind of security measures, I would not at all be surprised to
| find out that a next generation iPhone would involve the SEP
| forcing a reboot without the kernel's involvement.
|
| What this does is reduce the window of time (to three days)
| between when an iOS device is captured and when a usable*
| kernel exploit is developed.
|
| * there is almost certainly a known kernel exploit out in the
| wild, but the agencies that have it generally reserve using
| them until they really need to - or they're patched. If you
| have a captured phone from, for example, a low-stakes
| insurance fraud case, it's not at all worth revealing your
| ownership of a kernel exploit.
|
| Once an exploit is "burned", they distribute them out to
| agencies and all affected devices are unlocked at once. This
| now means that kernel exploits must be deployed within three
| days, and it's going to preserve the privacy of a lot of
| people.
| toomuchtodo wrote:
| Would be nice if Apple would expose an option to set the
| timer to a shorter window, but still great work.
| jojobas wrote:
| Or to disable it entirely. Someone could set up an iPad to
| do something while always plugged in; it would be bloody
| annoying to have it locked cold every three days.
| mjevans wrote:
| I'd rather have a dedicated Kiosk mode that has a profile
| of allow-listed applications and one or more that are
| auto-started.
| aspenmayer wrote:
| Maybe one or two of these will do what you want?
|
| https://support.apple.com/en-us/105121
|
| > With Screen Time, you can turn on Content & Privacy
| Restrictions to manage content, apps, and settings on
| your child's device. You can also restrict explicit
| content, purchases and downloads, and changes to privacy
| settings.
|
| https://support.apple.com/en-us/111795
|
| > Guided Access limits your device to a single app and
| lets you control which features are available.
| duskwuff wrote:
| Or "single-app mode", which is a more tightly focused
| kiosk mode:
|
| https://support.apple.com/guide/apple-configurator-
| mac/start...
| grahamj wrote:
| Conspiracy theory time! Apple puts this out there to
| break iPad-based DIY home control panels because they're
| about to release a product that would compete with them.
| aspenmayer wrote:
| It's more likely than you think!
|
| > Apple's Next Device Is an AI Wall Tablet for Home
| Control, Siri and Video Calls
|
| https://news.ycombinator.com/item?id=42119559
|
| via
|
| > Apple's Tim Cook Has Ways to Cope with the Looming
| Trump Tariffs
|
| https://news.ycombinator.com/item?id=42168808
| duskwuff wrote:
| > Apple puts this out there to break iPad-based DIY home
| control panels
|
| If you were using an iPad as a home control panel, you'd
| probably disable the passcode on it entirely - and I
| believe that'd disable the inactivity reboot as well.
| aspenmayer wrote:
| You could also set the auto-lock in display settings to
| never.
| grahamj wrote:
| I dunno, does this SEP check only happen when the device
| is locked? I don't recall that being mentioned.
| stephen_g wrote:
| I'm not sure, but I wouldn't expect the inactivity
| timeout to trigger if the device was already in an
| unlocked state (if I understand the feature correctly) so
| in kiosk mode or with the auto screen lock turned off and
| an app open I wouldn't expect it to happen.
| jojobas wrote:
| Maybe you want it locked and only showing notification
| headers.
| stephen_g wrote:
| Having to put your passcode in every three days is not
| the end of the world. It would also make sense that if
| you turned off the passcode entirely it wouldn't
| restart.
| alwayslikethis wrote:
| In GrapheneOS, you can set it to as little as 10 minutes,
| with the default being 18 hours. That would be a lot more
| effective for this type of data exfiltration scenario.
| technics256 wrote:
| You can do this yourself with the Shortcuts app.
|
| Create an automation to run a shutdown on a time
| interval you choose. Change shutdown to "restart".
| KennyBlanken wrote:
| You clearly haven't tried it or even googled it - because
| it's impossible to do it unattended. A dialog pops up
| (and only when unlocked) asking you to confirm the
| reboot. It's probably because they were worried users
| might end up in a constant reboot/shutdown cycle, though
| presumably they could just implement a "if rebooted in
| the last hour by a script, don't allow it again" rule.
| dmitrygr wrote:
| > Reboot is not enforced by the SEP, though, only requested
|
| We (the public) do not know if SEP can control nRST of the
| main cores, but there is no reason to suspect that it cannot.
| karlgkk wrote:
| We actually do know: it cannot, directly*. What it could do
| is functionally disable RAM, but that would basically cause
| the phone to hard lock and even cause data corruption in
| some limited cases.
|
| This is still being actively researched. I have no
| evidence, but would not be surprised to find out that a SEP
| update has been pushed that causes it to pull RAM keys
| after the kernel panic window has closed.
|
| * This may have been changed since the last major writeup
| came out for the iPhone 11.
| grahamj wrote:
| > Reboot is not enforced by the SEP, though, only requested.
| It's a kernel module, which means if a kernel exploit is
| found, this could be stopped.
|
| True. I wonder if they've considered the SEP taking a more
| active role in filesystem decryption. If the kernel had to be
| reauthenticated periodically (think oauth's refresh token)
| maybe SEP could stop data exfiltration after the expiry even
| without a reboot.
|
| Maybe it would be too much of a bottleneck; interesting to
| think about though.
| karlgkk wrote:
| > If the kernel had to be reauthenticated periodically
| (think oauth's refresh token)
|
| If the kernel is compromised, this is pointless I think.
| You could just "fake it".
|
| SEP is already very active in filesystem encryption. The
| really important thing is evicting all sensitive information
| from memory. Reboot is the simplest and most effective, and
| the end result is the same.
| grahamj wrote:
| It's involved in handling the keys, but I don't think disk
| I/O is processed by the SEP. If it were, the SEP could simply
| stop providing access.
| markasoftware wrote:
| Kernel exploits would let someone bypass the lockscreen and
| access all the data they want immediately, unless I'm missing
| something. Why would you even need to disable the reboot
| timer in this case?
| karlgkk wrote:
| Hypothetically, I suppose there's value in disabling the
| timer if you're, for example, waiting for a SEP exploit
| that only works in an AFU state?
|
| But, I don't know where the idea of disabling a reboot
| timer came in? I'm only simply saying that now, you have to
| have a kernel exploit on hand, or expect to have one within
| three days - a very tall order indeed.
| KennyBlanken wrote:
| > * there is almost certainly a known kernel exploit out in
| the wild, but the agencies that have it generally reserve
| using them until they really need to - or they're patched.
|
| There's literally emails from police investigators spreading
| word about the reboots, which state that the device goes from
| them being able to extract data while in AFU, to them not
| being able to get anything out of the device in BFU state.
|
| It's a bit pointless, IMHO. All cops will do is make sure
| they have a search warrant lined up to start AFU extraction
| right away, or submit warrant requests with urgent/emergency
| status.
| karlgkk wrote:
| I sort of addressed this in my original comment, but local
| police likely do not have access to an AFU vuln, and
| generally get it after it's been patched. Then they go on
| an unlocking spree. This prevents that.
| op00to wrote:
| If the reboot doesn't happen, the kernel panics; at least
| that's what the article says.
| aaronmdjones wrote:
| That's only because the kernel tells the userland to
| reboot. If the kernel is compromised, they can stop it from
| telling userland to reboot _and_ stop the kernel panicking.
| pnw wrote:
| Great writeup! And it's good to see Apple pushing the envelope on
| device security.
| Etheryte wrote:
| Wouldn't really say Apple is pushing the envelope here; as
| covered in the previous threads about this topic, a number of
| Android flavors have done this for a long time.
| dewey wrote:
| The power of defaults is not to be underestimated. Yes, you
| probably can do it with some Android distribution but the
| amount of people using that would be microscopic.
| bananapub wrote:
| > Wouldn't really say Apple is pushing the envelope here
|
| come on dude. they're doing it by default, for > billion
| people, with their army of lawyers sitting around waiting to
| defend lawsuits from shitty governments around the world.
| abhishekjha wrote:
| How do these things work with devices inside a NAT gateway? Most
| of our devices are inside a LAN. Even if a server gets started,
| it won't be visible to the outside world, unless we play with the
| modem settings.
|
| Now, a hacker/state actor who has penetrated a device can upload
| data from the local device to a C&C server.
|
| But that seems risky, as you need to do it again and again. Or do
| they just get into your device once and upload everything to the
| C&C server?
| aspenmayer wrote:
| This particular feature doesn't rely on network connectivity or
| lack thereof.
|
| Here's some info about how some spyware works:
|
| https://www.kaspersky.com/blog/commercial-spyware/50813/
| meindnoch wrote:
| What are you even talking about?
| alwayslikethis wrote:
| Great writeup, but I wonder why so much emphasis is put on the
| 'not connected to a network' part. It seems like a timed
| inactivity reboot is a simpler idea than any type of inter-device
| communication scheme. It's not new either; GrapheneOS has had this
| for a while now and the default is 18 hours (and you can set it
| to 10 minutes) which would be a lot more effective as a
| countermeasure against data exfiltration tools.
| nneonneo wrote:
| This is because earlier reports coming out of law enforcement
| agencies suggested that the network was involved in making even
| older devices reboot. This blog post is an effort to debunk
| that claim.
| lathiat wrote:
| If you're targeting these evidence grabbing/device exploiting
| mobs, generally the phones get locked into a faraday cage to
| drop the mobile network so that they can't receive a remote
| wipe request from iCloud.
| dblitt wrote:
| Does anyone have insight into why Apple encrypts SEP firmware?
| Clearly it's not critical to their security model so maybe just
| for IP protection?
| jonpalmisc wrote:
| They have a long history of encrypting firmware. iBoot only
| recently stopped being encrypted, with the launch of PCC, and
| prior to iOS 10 the kernel was encrypted too.
|
| The operating theory is that higher management at Apple sees
| this as a layer of protection. However, word on the street is
| that members of actual security teams at Apple want it to be
| unencrypted for the sake of research/openness.
| saagarjha wrote:
| Someone high up is an idiot presumably
| thrdbndndn wrote:
| Two questions:
|
| 1. surely unconditionally rebooting locked iPhones every 3 days
| would cause issues in certain legit use cases?
|
| 2. If I read the article correctly, it reboots to re-enter
| "Before First Unlock" state for security. Why can't it just go
| into this state without rebooting?
|
| Bonus question: my Android phone would ask for my passcode (can't
| unlock with fingerprint or face) if it thinks it might be left
| unattended (a few hours without moving etc.), just like after
| rebooting. Is it different from "Before First Unlock" state? (I
| understand Android's "Before First Unlock" state could be
| fundamentally different from iPhone's to begin with).
| bonyt wrote:
| > 1. surely unconditionally rebooting locked iPhones every 3
| days would cause issues in certain legit use cases?
|
| I wonder if this explains why the older iPhone I keep mounted
| to my monitor to use as a webcam keeps refusing to be a webcam
| so often lately and needing me to unlock it with my password...
| athrun wrote:
| I have the same setup and what works for me is putting the
| phone into Supervised mode using the Apple Configurator.
|
| From there, you can enable single app mode to lock it into
| the app you're using for the webcam (I use Camo).
| diggan wrote:
| > it reboots to re-enter "Before First Unlock" state for
| security. Why can't it just go into this state without
| rebooting?
|
| I think the reason is to make sure anything from RAM is wiped
| completely clean. Things like the password should be stored in
| the Secure Enclave (which encryption keys stored in RAM are
| derived from) but a reboot would wipe that too + any other
| sensitive data that might be still in memory.
|
| As an extra bonus, I suppose iOS does integrity checks on boot
| too, so this could be a way to trigger those as well. Seems to me
| like a reboot is a "better safe than sorry" approach, which isn't
| a bad one.
| gizmo686 wrote:
| Reboots don't typically wipe RAM, although wiping RAM is
| relatively easy if you are early enough in the boot process
| (or late enough in the shutdown process).
| bayindirh wrote:
| With ASLR and tons of activity happening during the boot
| process, it's almost guaranteed that you'll damage the keys
| you need. Plus, we don't know how shutdown processes are
| done. It might be wiping the keys clean before resetting
| the processor.
| johncolanduoni wrote:
| I'd expect that the RAM encryption key is regenerated each
| boot, so the RAM should be effectively wiped when the key
| from the previous boot is deleted from the memory
| controller.
| diggan wrote:
| > Reboots don't typically wipe RAM.
|
| Typically yeah, I think you're right. But I seem to recall
| reading that iOS does some special stuff when shutting
| down/booting related to RAM but of course now I cannot find
| any source backing this up :/
| oneplane wrote:
| It is very different as the cryptography systems can only
| assure a secure state with a known root of trust path to the
| state it is in.
|
| The big issue with most platforms out there (x86, multi-vendor,
| IBVs etc.) is you can't actually trust what your partners
| deliver. So the guarantee or delta between what's in your
| TEE/SGX is a lot messier than when you're Apple and you have
| the SoC, SEP, iBoot stages and kernel all measured and assured
| to levels only a vertical manufacturer could know.
|
| Most devices/companies/bundles just assume it kinda sucks and
| give up (TCG Opal, TPM, BitLocker: looking at you!) and make
| most actual secure methods optional so the bottom line doesn't
| get hit.
|
| That means (for Android phones) your baseband and application
| processor, boot rom and boot loader might all be from different
| vendors with different levels of quality and maturity, and for
| most product lifecycles and brand reputation/trust/confidence,
| it mostly just needs to not get breached in the first year it's
| on the market and look somewhat good on the surface for the
| remaining 1 to 2 years while it's supported.
|
| Google is of course trying hard to make the ecosystem hardened,
| secure and maintainable (it has been feasible to get a lot of
| patches in without having to wait for manufacturers or telcos
| for extended periods of time), including some standards for FDE
| and in-AOSP security options, but in almost all retail cases it
| is ultimately an individual manufacturer of the SoC and of the
| integrated device to make it actually secure, and most don't
| since there is not a lot of ROI for them. Even Intel's SGX is
| somewhat of a clown show... Samsung does try to implement their
| own for example, I think KNOX is both the brand name for the
| software side as well as the hardware side, but I don't
| remember if that was strictly Exynos-only. The supply chain for
| UEFI Secure Boot has similar problems, especially with the PKI
| and rather large supply chain attack surface. But even if that
| wasn't such an issue, we still get "TEST BIOS DO NOT USE"
| firmware on production mainboards in retail. Security (and
| cryptography) is hard.
|
| As for what the difference is in BFU/AFU etc. imagine it like:
| essentially some cryptographic material is no longer available
| to the live OS. Instead of hoping it gets cleared from all
| memory, it is a lot safer to assume it might be messed with by
| an attacker and drop all keys and reboot the device to a known
| disabled state. That way, without a user present, the SEP will
| not decrypt anything (and it would take a SEPROM exploit to
| start breaking in to the thing - nothing the OS could do about
| it, nor someone attacking the OS).
|
| There is a compartmentalisation where some keys and keybags are
| dropped when locked, hard locked and BFU locked, the main
| differences between all of them is the amount of stuff that is
| still operational. It would suck if your phone would stop
| working as soon as you lock it (no more notifications,
| background tasks like email, messaging, no more music etc).
|
| On the other hand, it might be fine if everything that was running
| at the time of the lock-to-lockscreen keeps running, but no new
| crypto is allowed during the locked period. That means
| everything keeps working, but if an attacker were to try to
| access the container of an app that isn't open it wouldn't
| work, not because of some permissions, but because the keys
| aren't available and the means to get the keys is
| cryptographically locked.
|
| That is where the main difference lies with more modern
| security, keys (or mostly, KEKs - key encryption keys) are a
| pretty strong guarantee that someone can only perform some
| action if they have the keys to do it. There are no permissions
| to bypass, no logic bugs to exploit, no 'service mode' that
| bypasses security. The bugs that remain would all be HSM-type
| bugs, but SEP edition (if that makes sense).
|
| Apple has some sort of flowchart to see what possible states a
| device and the cryptographic systems can be in, and how the
| assurance for those states work. I don't have it bookmarked but
| IIRC it was presented at Black Hat a year or so ago, and it is
| published in the platform security guide.
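|
| To make the KEK idea concrete, here is a toy sketch in CryptoKit
| terms (purely illustrative; the real class keys live in the SEP
| and are never handed to the application processor like this):
|
|     import CryptoKit
|     import Foundation
|
|     // Toy model of key wrapping: a data key exists at rest only in
|     // wrapped (encrypted) form, so discarding the KEK makes anything
|     // encrypted under that data key cryptographically unreachable.
|     func wrapNewDataKey(under kek: SymmetricKey) throws -> Data {
|         let dataKey = SymmetricKey(size: .bits256)
|         let raw = dataKey.withUnsafeBytes { Data($0) }
|         return try ChaChaPoly.seal(raw, using: kek).combined
|     }
|
|     // Unwrapping only works while the KEK is still held
|     // (i.e. the "unlocked" states described above).
|     func unwrapDataKey(_ wrapped: Data,
|                        using kek: SymmetricKey) throws -> SymmetricKey {
|         let box = try ChaChaPoly.SealedBox(combined: wrapped)
|         return SymmetricKey(data: try ChaChaPoly.open(box, using: kek))
|     }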
| spijdar wrote:
| The short answer to your last two questions is that "before
| first unlock" is a different state from requiring the
| PIN/passcode. On boot, the decryption keys for user profile
| data are not in memory, and aren't available until they're
| accessed from the security coprocessor via user input. The
| specifics depend on the device, but for Pixel devices running
| GrapheneOS you can get the gist of it here:
| https://grapheneos.org/faq#encryption
|
| The important distinction is that, before you unlock your phone
| for the first time, there are no processes with access to your
| data. Afterwards, there are, even if you're prompted for the
| full credentials to unlock, so an exploit could still shell the
| OS and, with privilege escalation, access your data.
|
| Before first unlock, even a full device compromise does
| nothing, since all the keys are on the <flavor of security
| chip> and inaccessible without the PIN.
| dwaite wrote:
| > Why can't it just go into this state without rebooting?
|
| Because the state of the phone isn't clean - there is
| information in RAM, including executing programs that will be
| sad if the disk volume their open files are stored on goes
| away.
|
| If your goal is to get to the same secure state the phone is in
| when it first starts, why not just soft reboot?
| TimeBearingDown wrote:
| this also clears out deeper OS rootkits if they could not
| achieve reboot persistence, which is not uncommon.
| Kwpolska wrote:
| What legit use case involves not touching your phone at all for
| 3 days?
| Hackbraten wrote:
| Maybe you want people to be able to reach you on a secondary,
| inbound-only phone number.
|
| I've also heard people re-purpose old phones (with their
| batteries disconnected, hopefully) as tiny home servers or
| informational displays.
| adastra22 wrote:
| Not a phone, but at my old apartment I used to have an iPad
| mounted on the wall. It was a dynamic weather display, Ring
| doorbell answerer, multimedia control, etc. Would suck if
| every 3 days I had to enter my passcode again.
| Shank wrote:
| I haven't tested this, but I assume this wouldn't occur if
| the device is fully unlocked and powered on. Most kiosk-adjacent
| deployments are set up so that they never turn the screen off
| and remain unlocked.
| grishka wrote:
| Looks like something that doesn't need to have a passcode
| on it in the first place.
| layer8 wrote:
| I have something like this as well, connected to my Apple
| account for calendar and reminder access etc. I wouldn't
| want every random guest to have access to that.
| myflash13 wrote:
| iPad has a kiosk mode for these use cases.
| YoumuChan wrote:
| I connect an iPhone 12 to my vehicle's CarPlay all the time.
| Recently I have often found the startup unreliable, which
| defeats the whole purpose.
| layer8 wrote:
| It means that in the future you can't use old iPhone hardware
| to run an unattended server or similar anymore (unless you
| simulate user activity by adding some hardware that taps on
| the display every three minutes, or something). This is why I
| don't like that it's a hardcoded non-configurable setting. It
| cripples potential use cases for the hardware.
| Someone wrote:
| > If I read the article correctly, it reboots to re-enter
| "Before First Unlock" state for security. Why can't it just go
| into this state without rebooting?
|
| 1. Getting there reliably can be hard (see the age-old
| discussions about zero-downtime OS updates vs rebooting), even
| more so if you must assume malware may be present on the system
| (how can you know that all that's running is what you want to
| be running if you cannot trust the OS to tell you what
| processes are running?)
|
| 2. It may be faster to just reboot than to carefully bring back
| stuff.
| alphan0n wrote:
| If I were looking for low hanging fruit, I suspect it wouldn't
| reboot if you were to replicate the user's home WiFi environment
| in the faraday cage, sans internet connection of course. Or
| repeatedly initializing the camera from the lock screen.
| Syonyk wrote:
| From the article:
|
| > _Turns out, the inactivity reboot triggers exactly after 3
| days (72 hours). The iPhone would do so despite being connected
| to Wi-Fi. This confirms my suspicion that this feature had
| nothing to do with wireless connectivity._
| happytoexplain wrote:
| >In the After First Unlock (AFU) state, user data is decrypted
|
| Note that this is a slight simplification because, I assume, the
| reality is irrelevant to understanding the topic:
|
| There are a few different keys [0] that can be chosen at this
| level of the encryption pipeline. The default one makes data
| available after first unlock, as described. But, as the
| developer, you can choose a key that, for example, makes your
| app's data unavailable _any_ time the device is locked. Apple
| uses that one for the user's health data, and maybe other extra-
| sensitive stuff.
|
| [0]: https://support.apple.com/guide/security/data-protection-
| cla...
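|
| For a sense of what that developer choice looks like in code, a
| minimal sketch using the standard Foundation options (the option
| names map to the protection classes in the Apple doc above):
|
|     import Foundation
|
|     // Opt a file into Complete Protection (Class A): unreadable
|     // whenever the device is locked, rather than the default
|     // "until first unlock" class.
|     func saveSensitiveBlob(_ blob: Data, to url: URL) throws {
|         try blob.write(to: url, options: [.atomic,
|                                           .completeFileProtection])
|     }
|
|     // The same class can be applied to an existing file.
|     func upgradeProtection(of url: URL) throws {
|         try FileManager.default.setAttributes(
|             [.protectionKey: FileProtectionType.complete],
|             ofItemAtPath: url.path)
|     }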
| wepple wrote:
| How useful do you think this is in practice? Wouldn't it rely
| on app-level memory scrubbing and page clearing and such as
| well, if you wanted to truly make sure it's unavailable? Do
| Apple offer APIs to assist there?
| myflash13 wrote:
| > The class key is protected with a key derived from the user
| passcode or password and the device UID. Shortly after the
| user locks a device (10 seconds, if the Require Password
| setting is Immediately), the decrypted class key is
| discarded, rendering all data in this class inaccessible
| until the user enters the passcode again or unlocks (logs in
| to) the device using Face ID or Touch ID.
| happytoexplain wrote:
| This means it can't be read from storage, but AFAIK
| anything you've read into your app's memory sandbox is
| still sitting there decrypted until your app releases it or
| is closed or has its memory wiped by system housekeeping.
| happytoexplain wrote:
| It's a good point - I am not an expert, but I think this
| feature just doesn't protect memory (tying one of the keys to
| rebooting helps, but the Data Protection feature itself
| doesn't seem to protect memory). However, that doesn't moot
| in-storage protection. There are other features protecting
| memory (and other features protecting data in storage - there
| are tons of security features).
|
| I am not aware of APIs for securely clearing your app's
| memory (aside from lower level, more manual APIs). This may
| be one of those cases that relies mostly on sandboxing for
| protection. I also imagine it's hard to circumvent sandboxing
| without rebooting. But I'm making a lot of guesses here.
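|
| By "lower level, more manual" I mean roughly this kind of thing,
| a sketch of scrubbing a buffer you own (it does nothing for
| copies held by frameworks, the kernel, or swapped pages):
|
|     import Foundation
|
|     // Overwrite a sensitive buffer before dropping it; this only
|     // covers the bytes this Data instance owns.
|     func scrub(_ secret: inout Data) {
|         secret.resetBytes(in: 0..<secret.count)
|         secret.removeAll()
|     }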
| axoltl wrote:
| There's a decent amount of data protected by Class A keys
| (which are only available when a device is 'actively
| unlocked') and some amount of data protected by Class B keys
| (which are asymmetric keys to allow data to be encrypted
| while the device is locked but only decrypted when the device
| is unlocked by way of a private key encrypted with a Class A
| key). The security guide[0] isn't super obvious about what
| data is protected with what keys:
|
| > The Mail app database (including attachments), managed
| books, Safari bookmarks, app launch images, and location data
| are also stored through encryption, with keys protected by
| the user's passcode on their device.
|
| > Calendar (excluding attachments), Contacts, Reminders,
| Notes, Messages, and Photos implement the Data Protection
| entitlement Protected Until First User Authentication.
|
| I can confirm that when they say "keys protected by the
| user's passcode" they mean "protected with class A or B". The
| most shameful omissions there in my opinion are Messages and
| Photos, but location data is (from a law enforcement
| perspective) obviously a big one.
|
| 0: https://help.apple.com/pdf/security/en_US/apple-platform-
| sec...
|
| Edit: Additionally, as to your API question, the system
| provides notifications for when content is about to become
| unavailable allowing for an app developer to flush data to
| disk:
|
| https://developer.apple.com/documentation/uikit/uiapplicatio.
| ..
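|
| A minimal sketch of hooking those notifications (UIKit names as
| documented; what you do inside the handlers is up to the app):
|
|     import UIKit
|
|     // React to Data Protection availability changes so pending
|     // writes can be flushed before class A/B files become
|     // unreadable, and caches reloaded when they are back.
|     final class ProtectedDataObserver {
|         private var tokens: [NSObjectProtocol] = []
|
|         init() {
|             let center = NotificationCenter.default
|             tokens.append(center.addObserver(
|                 forName: UIApplication
|                     .protectedDataWillBecomeUnavailableNotification,
|                 object: nil, queue: .main,
|                 using: { _ in
|                     // Flush pending data, drop decrypted caches.
|                 }))
|             tokens.append(center.addObserver(
|                 forName: UIApplication
|                     .protectedDataDidBecomeAvailableNotification,
|                 object: nil, queue: .main,
|                 using: { _ in
|                     // Protected files are readable again; reload.
|                 }))
|         }
|
|         deinit {
|             tokens.forEach { NotificationCenter.default.removeObserver($0) }
|         }
|     }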
| ghssds wrote:
| My question is: why three days specifically instead of a user-
| configurable delay?
| Etheryte wrote:
| Apple's whole thing is offering whatever they think is a good
| default over configuration. I can't even begin to count all the
| things I wish were configurable on iOS and macOS, but aren't.
| Makes for a smooth user experience, sure, but is also
| frustrating if you're a power user.
| Slartie wrote:
| Because this way, the delay is parameterized within the Secure
| Enclave firmware by hard-coding it, which is a thing that only
| Apple can do.
|
| If you were to allow a user to change it, you'd have to
| safeguard the channel by which the users' desired delay gets
| pushed into the SE against malicious use, which is inherently
| hard because that channel must be writable by the user.
| Therefore it opens up another attack surface by which the
| inactivity reboot feature itself might be attacked: if the
| thief could use an AFU exploit to tell the SE to only trigger
| the reboot after 300 days, the entire feature becomes useless.
|
| It's not impossible to secure this - after all, changing the
| login credentials is such a critical channel as well - but it
| increases the cost to implement this feature significantly, and
| I can totally see the discussions around this feature coming to
| the conclusion that a sane, unchangeable default would be the
| better trade-off here.
| axxto wrote:
| > if the thief could use an AFU exploit to tell the SE to
| only trigger the reboot after 300 days, the entire feature
| becomes useless
|
| Then why not simply hardcode some fixed modes of operation?
| Just as an example, a forced choice between 12, 24, 48, or a
| maximum of 72 hours. You can't cheat your way into convincing
| the SE to set an unlimited reset timer. I'm sure there must
| be a better reason.
| pushupentry1219 wrote:
| I haven't read the whole thing, but from skimming the beginning,
| this is pretty similar to how AOSP's BFU vs AFU unlock works.
| archeantus wrote:
| Great post. They talked about the possibility of iOS 18
| wirelessly telling other phones to reboot, but then afaik didn't
| address that again. Maybe they did and I missed it?
| C4K3 wrote:
| They conclude that there's no wireless component to the
| feature.
|
| _This feature is not at all related to wireless activity. The
| law enforcement document's conclusion that the reboot is due
| to phones wirelessly communicating with each other is
| implausible. The older iPhones before iOS 18 likely rebooted
| due to another reason, such as a software bug._
| zarzavat wrote:
| If you think about it, if the attacker is sophisticated
| enough to break the phone within a 72 hour window, then they
| are definitely sophisticated enough to use a faraday
| container. So communication between phones wouldn't help very
| much.
|
| Moreover, you'd have to have some inhibitory signal to
| prevent everybody's phones restarting in a crowded
| environment, but any such signal could be spoofed.
| sunnybeetroot wrote:
| This may explain why, since iOS 18, my device randomly reboots
| (albeit it only takes 5 seconds max). I am a daily user, so
| perhaps the reboot I experience is a bug.
| sroussey wrote:
| Yes, lots of complaints on forums about this bug. Saw it happen
| to my phone today.
| echoangle wrote:
| If it takes only 5 seconds, it doesn't sound like a reboot.
| Does it show a black screen and the apple logo during this
| event?
| sunnybeetroot wrote:
| No Apple logo, just black screen with loading spinner
| followed by requiring passcode to unlock
| future10se wrote:
| That might be what's informally called a "respring", where
| the SpringBoard process is restarted.
|
| SpringBoard is the process that shows the home screen, and
| does part of the lifecycle management for regular user
| apps. (i.e. if you tap an icon, it launches the app, if you
| swipe it away in the app switcher, it closes the app)
|
| It is restarted to make certain changes take effect, like
| the system language. In the jailbreaking days, it was also
| restarted to make certain tweaks take effect. Of course, it
| can also just crash for some reason (which is likely what
| is happening to you)
| kaba0 wrote:
| Hi, is there some further info on iOS "internals" like
| this? I was always interested in how it works, but I
| found much less information compared to Android (which
| obviously makes sense given one is more or less open-
| source), even though these probably don't fall in the
| secret category.
| sss111 wrote:
| mine used to do that when the battery needed replacement
| h1fra wrote:
| I always assumed that it was memory reaching capacity or
| routine cleanup more than a reboot. This often happened to me
| after intensive use
| jjallen wrote:
| If this is such a security benefit why not do it after 24 hours
| instead? How many people go that long without using their phones?
|
| How many people are using their phones for some other purpose for
| which they want their phones to never reboot? And what are they
| actually doing with their phones?
| saagarjha wrote:
| Because it harms the user experience.
| jjallen wrote:
| How though? Users haven't used their phone in a day or more?
| How would they notice except for having to reenter their
| passcode which takes two seconds?
| IshKebab wrote:
| Read the introduction.
| Shank wrote:
| Not being able to glance at any push notifications or get
| incoming caller ID would be pretty disruptive.
| layer8 wrote:
| That's not the case if you also have other Apple devices
| on the same account.
| Wowfunhappy wrote:
| I'm sure this is why but I had the same thought as GP. Under
| what circumstances would 24 hours be disruptive, but three
| days would be okay?
|
| If you're using the iPhone as some type of IoT appliance,
| either time limit would be disruptive. But if you e.g. enable
| Guided Access, the phone will stay unlocked and so shouldn't
| reboot.
|
| If you're using the iPhone as a phone, who the heck doesn't
| touch their phone in 24 hours? Maybe if you're on some phone-
| free camping trip and you just need the iPhone with you as an
| emergency backup--but in that case, I don't think Inactivity
| Reboot would be particularly disruptive.
|
| Maybe Apple will lower the window over time?
| layer8 wrote:
| > How many people go that long without using their phones?
|
| For people who don't leave the house that often and have other
| Apple devices, this suddenly becomes much more frequent.
| jesprenj wrote:
| > In law enforcement scenarios, a lot of the forensically
| relevant data is available in the AFU state. Law enforcement
| takes advantage of this and often keeps seized iPhones powered
| on, but isolated from the Internet, until they can extract data.
|
| In Slovenia, devices have to be turned off the moment they are
| seized from their owner, prior to putting them into airplane mode.
| Razengan wrote:
| Also when thieves or muggers rob someone, the first thing they
| do is turn on Airplane Mode or force power-off.
|
| WHY the hell don't those actions require a passcode or bio
| authentication??
| saagarjha wrote:
| They could just put it in a foil-lined pocket instead.
| miki123211 wrote:
| I don't think people would be fine with being unable to power
| any electronic device down at need, even if they're not the
| owner.
|
| It feels like something that needs to be as easy as possible,
| for safety reasons if not anything else.
|
| Now what I'd like to see is an extension of their protocol
| that is used to locate iPhones that would also let them
| accept a "remote wipe" command, even when powered down.
| mccraveiro wrote:
| You can definitely block enabling Airplane Mode without a
| passcode on iOS. I disabled access to Control Center when the
| iPhone is locked, so thieves won't be able to do so.
| zarzavat wrote:
| This doesn't work if they steal it out of your hand while it's
| unlocked.
| 4lun wrote:
| Slight mitigation to this is you can add an automation
| via the Shortcuts app to be triggered when airplane mode
| is enabled, and set the actions to immediately lock your
| device and disable airplane mode
|
| Downside is that you need to manually disable the
| automation if you actually wish to use airplane mode (and
| also remember to re-enable it when done)
| NamTaf wrote:
| I've set two automations: 1) When airplane mode is
| activated, lock the screen. 2) When airplane mode is
| activated, turn it back off. That'll give me the most
| opportunity to either track it and/or lock it down
| remotely.
|
| I can remember to disable the shortcut whenever I fly and
| need to enable it.
|
| If they pop my SIM (my provider doesn't use eSIMs...)
| then there's a PIN on it to prevent use in another
| device.
| mavhc wrote:
| I assume apple has something similar to
| https://support.google.com/android/answer/15146908
|
| Theft Detection Lock uses AI, your device's motion
| sensors, Wi-Fi and Bluetooth to detect if someone
| unexpectedly takes your device and runs away. If Theft
| Detection Lock detects that your device is taken from
| you, it automatically locks your device's screen to
| protect its content.
| Wowfunhappy wrote:
| You need to be able to forcibly power off the phone when it's
| frozen.
| newZWhoDis wrote:
| FYI: Everyone should disable control center from the Lock
| Screen to prevent that attack (airplane mode activation while
| locked).
|
| iPhones are still trackable while powered off, at least for a
| while.
| Shank wrote:
| To me the biggest takeaway is that Apple is sufficiently paranoid
| to add this feature. Some people (like John Gruber) advocate for
| activating bio lockout at the border by squeezing the volume and
| power buttons. I would say if you're the type of person who would
| do this, you should go one step further and power off.
|
| Similarly, if you're in a situation where you cannot guarantee
| your phone's security because it's leaving your possession, and
| you're sufficiently worried, again, power off fully.
| phinnaeus wrote:
| What do you do if you're at the border and they demand both the
| physical device and the password?
|
| Let's assume "get back on the plane and leave" is not a viable
| option.
| mzhaase wrote:
| Burner phone
| cherryteastain wrote:
| GrapheneOS duress password [1] and user profiles [2] are
| quite solid solutions for this scenario
|
| [1] https://grapheneos.org/features#duress
|
| [2] https://grapheneos.org/features#improved-user-profiles
| andyjohnson0 wrote:
| From the link:
|
| > GrapheneOS provides users with the ability to set a
| duress PIN/Password that will irreversibly wipe the device
| (along with any installed eSIMs) once entered anywhere
| where the device credentials are requested (on the
| lockscreen, along with any such prompt in the OS).
|
| In a border interrogation scenario, isn't that just likely
| to get you arrested for destroying evidence?
| verandaguy wrote:
| Depends on the border. In most democracies, and at most
| borders, and in most LE cases, there is a line between
| "destruction of my own property/data" and "destruction of
| evidence," where the latter usually needs a court
| document notifying the subject of the potential charge of
| their requirement to preserve evidence (for example, a
| subpoena, or in some cases, a direct request to avoid
| spoliation).
| myflash13 wrote:
| That's the theory. This is not how things work in practice, even in
| "democracies". Speaking as a person who has been harassed
| at the US border from Canada many times, I've learned it
| depends more on how the border agent "feels" about you.
| These people are often uneducated bullies who don't know
| or don't care about the law anyway. And if you start
| objecting on some legal basis, they can legally make
| things a LOT harder for you, including simply denying
| entry for no reason (yes, they have such a right). Better
| to cooperate rather than give the appearance of
| "destroying evidence" (even if completely legal) or
| you're in for a world of hurt if you got the wrong guy.
| darkwater wrote:
| Well, if you are a "normal person" with actually nothing
| to hide, yes, cooperating as much as you can is probably
| the best thing to do. But if you are some "special
| person" (activist, journalist, diplomat etc) wiping out
| everything might be your best option.
| seanw444 wrote:
| I have a solution to that problem that works 100% of the
| time:
|
| I don't leave the US.
| iAMkenough wrote:
| 2 out of 3 people in the US live within U.S. Customs and
| Border Protection jurisdiction, where border agents can
| search without warrant if they determine they have
| "reasonable suspicion."
|
| Additionally, SCOTUS ruled in 2022 (Egbert v Boule) that
| someone who has had their Fourth Amendment rights
| violated by CBP agents are not entitled to any damages
| unless Congress clearly defines a punishment for the
| violation by a federal agent.
| seanw444 wrote:
| True, that's ridiculous. But luckily I am one of the 1
| out of 3.
| wepple wrote:
| That's a significantly higher bar. It's not foolproof though.
|
| I believe in most countries, customs can inspect your
| luggage. They can't force you to reveal information that
| they're not even certain you have.
|
| Under your situation, the best idea is to simply have a wiped
| device. A Chromebook, for example, allows you to log in with
| whatever credentials you choose, including a near-empty
| profile.
| bananapub wrote:
| > I believe in most countries, customs can inspect your
| luggage. They can't force you to reveal information that
| they're not even certain you have.
|
| this isn't a very useful way to think about it.
|
| they can definitely search your luggage, obviously, but the
| border guards/immigration officials/random law enforcement
| people hanging around/etc can also just deny non-citizens
| entry to a country, usually for any or no reason.
|
| there's documented cases of Australia[0] demanding to
| search phones of even citizens entering the country, and
| the US CBP explicitly states they may deny entry for non
| citizens if you don't give them the password and while they
| can't deny entry to citizens, they state they may seize the
| device then do whatever they want to it[1].
|
| 0: https://www.theguardian.com/world/2022/jan/18/returning-
| trav...
|
| 1: https://www.cbp.gov/travel/cbp-search-authority/border-
| searc...
| ThePowerOfFuet wrote:
| You say no.
|
| Or, with GrapheneOS, you give them the duress password, on
| the understanding that you will have to set the device up
| from scratch IF you ever see it again.
| wutwutwat wrote:
| you can be forced to place your thumb on a sensor, or have
| the device held to your face.
|
| you can't be forced to remember a password you "forgot"...
|
| biometric authentication is not always your friend
| kevincox wrote:
| > you can't be forced to remember a password you
| "forgot"...
|
| No, but the border agents also aren't required to let you
| into the country. (Generally unless you are a citizen.)
|
| So border agents are very different than general laws of
| the country because while there may be legal protections
| about what they may be able to force you to do there are
| much less protections about when you have the right to pass
| the border (other than entering countries where you are a
| citizen).
| wutwutwat wrote:
| I never said anything about crossing a border. I said
| nobody can force you to remember something, for any
| reason, border crossing or otherwise
| projektfu wrote:
| I don't think there is a technological solution for this
| unless you have some sort of sleight-of-hand. Typically,
| border agents of countries with lots of transit do not
| stop people for very long. Some other countries (North
| Korea, perhaps) might put everyone through the wringer
| because they do not have a lot of crossings. If a border
| agent of a relatively free country is stopping you, they
| probably have some suspicion, in which case it is best to
| not be holding evidence in your hand.
|
| There are steganographic methods to hide your stuff. You
| can also use burners on either side of the border
| crossing and keep your main line clean. But bringing a
| device full of encrypted data (even if it's just your
| regular photo collection) that you refuse to unlock will
| probably be suspicious.
|
| I know that there are times when there are no reasons for
| suspicion and people get stopped anyway. The border agent
| didn't like your look, or racism, or an order came down
| from on high to stop everyone from a particular country
| and annoy them. If that's the case, it's probably still
| best to not have a lot of incriminating evidence on your
| person, encrypted or not.
| ReptileMan wrote:
| Don't carry a phone with you. You can always buy one after
| the airport.
| thesuitonym wrote:
| If that's in your threat profile, you should not be traveling
| with a phone. If this is a real threat for you, no amount of
| hardware/software security will beat a wrench:
| https://xkcd.com/538/
| mptest wrote:
| Also, Lockdown Mode and pair locking your device. Pair locking,
| IIRC, is how you protect against Cellebrite-type attacks.
| vsl wrote:
| Doesn't the volume+power gesture transition into BFU, i.e. isn't
| it equivalent to power-cycling?
| jonpalmisc wrote:
| No. This is a myth, and while it does force you to enter your
| password instead of using biometrics on the next unlock, it
| is not the same as returning to BFU.
| maccard wrote:
| > I would say if you're the type of person who would do this,
| you should go one step further and power off.
|
| I'd travel with a different device, honestly. I can get a new-
| in-box Android device for under £60 from a shop, travel with
| that, set it up properly on the other side, and then either
| leave it behind or wipe it again.
| wutwutwat wrote:
| The £60 burner sounds like a leader on the device security
| front. No way it could possibly be running an ancient version
| of Android that is no longer getting security patches, or is
| hacked up to shit by the device manufacturer to reskin it and
| install their vulnerable suite of bloatware, or built off of
| a base OS and firmware flocked to by folks for the ease of
| gaining root access and running whatever you want at the
| kernel level.
| kshacker wrote:
| It could be doing all that actually but you are not obliged
| to install all your apps on the burner, just the basic
| minimum.
| wutwutwat wrote:
| You're still walking around with a microphone and gps
| tracker connected to a cellular network even if the only
| thing you do is power it on
| brewdad wrote:
| If that's your threat model, don't carry ANY phone.
| Probably best not to carry any modern electronic device
| at all.
| wutwutwat wrote:
| Real criminals who don't want to be caught don't carry
| phones for this exact reason.
| colimbarna wrote:
| Sometimes the alternative blows up in your face though.
| maccard wrote:
| There's no guarantee your $1000 flagship isn't doing that
| either.
|
| I chose it because it's a mainstream provider (Nokia)
| readily available running a supported version of android
| (12).
|
| If you want to install a custom rom, you can get an older
| flagship (galaxy s9) and flash it for about the same price.
|
| My point is if your threat model is devices seized at
| border, then a burner phone is far more suitable for you
| than a reboot.
| wutwutwat wrote:
| Levels of trust. I have more trust in the largest, most
| heavily scrutinized device manufacturer making an attempt
| at security than I do in a rando burner device
| reseller. To be clear, I don't trust either fully, but
| one gets way less trust than the other.
| avianlyric wrote:
| The whole point of a burner is that you don't trust it.
| You only store what you absolutely need to store on
| there, if anything, and basically assume it's compromised
| the second it leaves your sight.
|
| The advantage of a burner phone is that it can't contain
| anything important, because you've never put anything
| important on it, or connected it to any system that
| contains important data. So it doesn't really matter if
| it's compromised, because the whole point of a burner, is
| that it's so unimportant you can _burn it_ the moment it
| so much as looks at you funny.
| wutwutwat wrote:
| Something a lot of people don't really consider is that
| people who are doing things that could get them unwanted
| attention wouldn't have incriminating evidence on
| any device, burner or otherwise. So the theoretical ways
| to avoid getting busted, like using a burner, are for
| movie villains and Bond-type secret agents. Real
| criminals (smart ones anyway) aren't conducting anything
| important over any network, be it IP, telephony, Morse
| code, smoke signal, or otherwise, regardless of the
| burnability of the device they would be using to do so.
| 486sx33 wrote:
| If I had to guess, there must have been an exploit in the wild
| that took advantage of this. It sounds like it eliminates the
| oldest tools in one swoop. Which is pretty sweet
| gruez wrote:
| Even without an exploit in the wild, having such a feature is
| critical for security. Otherwise any device that's seized by
| police can be kept powered on indefinitely, until firms like
| Cellebrite can find an exploit.
| wang_li wrote:
| > advocate for activating bio lockout at the border
|
| This is a terrible idea. When you're crossing a border you have
| to submit to the rules of entry. If one of those rules is that
| you let them create an image of your phone with all of its
| contents, that's the rule. If you say no, then, if you're
| lucky, you get to turn around and return to where you came
| from. If you're not lucky, then you get to go to jail.
|
| What needs doing is the ability to make a backup, then a way to
| reconcile the backup at a later date with the contents of a
| device. That is, I should be able to back up my phone to my home
| computer (or cloud I guess) and then wipe my phone or
| selectively delete contents. Then I travel abroad, take photos
| and movies, exchange messages with people, and so on. Then when
| I get home I should be able to restore the contents of my phone
| that were deleted without having to wipe all the new stuff from
| the trip.
| lofaszvanitt wrote:
| More security theatre.
| bayindirh wrote:
| Elaborate.
| mjlee wrote:
| I had to look up what SRD meant. It's a Security Research Device
| - "a specially fused iPhone that allows you to perform iOS
| security research without having to bypass its security
| features."
|
| https://security.apple.com/research-device/
| 486sx33 wrote:
| Nice work, and appreciate the time you spent!
|
| "I also downloaded an older kernel where Apple accidentally
| included symbols and manually diffed these versions with a focus
| on the code related to inactivity reboot. The kernel has three
| strings relating to the feature:" sounds like a little luck there
| for sure!
___________________________________________________________________
(page generated 2024-11-18 23:01 UTC)