[HN Gopher] Disclosure of three 0-day iOS vulnerabilities
       ___________________________________________________________________
        
       Disclosure of three 0-day iOS vulnerabilities
        
       Author : jayhoon
       Score  : 1829 points
       Date   : 2021-09-24 00:28 UTC (22 hours ago)
        
 (HTM) web link (habr.com)
 (TXT) w3m dump (habr.com)
        
       | soheil wrote:
       | Has anyone on HN been able to run any of these 0-days to see if
       | they work on their device?
        
         | ntSean wrote:
         | Yep, here is a thread which has screenshots on the latest iOS
         | update:
         | https://twitter.com/keleftheriou/status/1441252412061204486
        
       | eurasiantiger wrote:
       | I am seriously considering throwing my $800 phone at a concrete
       | wall.
        
       | diebeforei485 wrote:
        | This is crazy. At this point it's pretty well established that
        | Apple isn't really going to pay you much, if at all. Might as
        | well just disclose in 90 days.
        
         | sneak wrote:
         | Full disclosure is always responsible, even if the vendor is
         | not notified in advance.
        
           | mrslave wrote:
           | This is a part of our industry I do not follow beyond
           | headlines. A lot of those headlines are about hackers trying
           | to be responsible getting screwed out of supposed bounties
           | that to my mind already appear quite small. Also responsible
           | companies doing very little to quickly close them. Does
           | anyone have any insight into how the market for
            | vulnerabilities operates? Is there a significant disparity
           | in price between official/responsible disclosures and private
           | sales?
        
             | diebeforei485 wrote:
             | Private buyers almost certainly pay a higher amount and
             | their payments arrive much sooner.
             | 
              | Apple's published rates are high (up to $1M), but in
              | practice they pay a lot less.
        
             | TrueDuality wrote:
             | As someone that actively works in the security industry and
             | has spent quite a bit of time tracking this... Yes, there
             | is a massive disconnect in pricing for private acquisitions
             | of vulnerabilities in commonly used software.
             | 
              | Almost always there's a 2-5 order-of-magnitude difference
              | in price between a bug bounty and what a company like
              | Zerodium pays. When they have a valuable enough customer
              | asking for something specific, they'll even give bonus
              | rates of 2x-10x above their normal rates.
             | 
              | Here's a tweet where Zerodium is doing exactly that:
             | https://twitter.com/Zerodium/status/1437884808257024008
        
               | netsec_burn wrote:
               | Do you know of anyone personally who was paid?
        
               | Aissen wrote:
                | Oh sure, Zerodium pays (over time, as long as the bug is
                | unpatched), if you don't care how your exploits are used
                | (will they be used to target Middle East journalists or
                | jeopardize our democracies by watching over elected
                | representatives? who knows); sure, they vet their
                | customers, and the customers swear they won't do anything
                | bad with it.
               | 
               | Note: not sure they would pay for these private
               | information leaks. They'd probably prefer a local
               | escalation and then do the data collection themselves.
        
               | black_puppydog wrote:
               | Holy fuck what kind of turds are these? Their "temporary"
               | bounty boost listing includes specifically Moodle.
               | 
                | That's an application that will be used by minors to a
                | large extent, meaning they're literally leaving kids
                | around the world unsafe.
               | 
               | How any of this can be legal is beyond me.
               | 
                | Btw they're also targeting Pidgin; I'm imagining this
                | might be related to OTR sessions over Tor...?
                | 
                | Edit: I remembered Moodle is used by universities as
                | well, so not overwhelmingly, but still....
               | 
               | Edit 2: IMHO working or having worked for one of these
               | companies should be a career ending move. Simply not
               | acceptable to be working in this field anymore. Not by
               | legal means of course, but as an industry we should
               | simply consider people who were willing to sign a
               | contract with these criminals to be unemployable. "Sorry
               | we don't do business with turds."
        
               | ChuckNorris89 wrote:
               | _> as an industry we should simply consider people who
               | were willing to sign a contract with these criminals to
               | be unemployable._
               | 
                | By that same logic we could add mass ad/surveillance
                | companies like Google and Facebook to the list. IMHO
               | those do way more damage to society as a whole. Where do
               | we draw the line?
        
               | black_puppydog wrote:
               | The fact that drawing any specific line is always wrong
               | to an extent, and that it is difficult, is not a good
               | argument against drawing a line at all.
               | 
                | There are tons of jobs you're not even legally allowed
                | to do, no matter how profitable. We're literally talking
               | about people who deal in vulnerabilities in software used
               | by minors, with the express intent of keeping these open.
               | 
               | In my book, that is beyond the line. Change my mind.
        
             | eyegor wrote:
             | So most public companies don't even run bug bounties. The
             | ones that do may or may not acknowledge your disclosure,
             | and they decide what your vulnerabilities are worth
             | regardless of any scales they might post on a blog. So in a
             | best case scenario, you get maybe 10-100k for a world
             | ending RCE + escalation but most of the time you get no
             | response or <1k. On the gray market, though, something like
             | that will easily sell for over 100k, sometimes several
             | million. Generally it's frowned upon in academic circles,
             | but there are a handful of large brokers like zerodium who
             | are happy to pay out for interesting bugs.
        
               | illusionofchaos wrote:
                | Zerodium is not interested in this kind of bug. If they
                | own at least one RCE+LPE chain, they can already access
                | all data on any device, and more.
        
       | Ansil849 wrote:
       | > My actions are in accordance with responsible disclosure
       | guidelines (Google Project Zero discloses vulnerabilities in 90
       | days after reporting them to vendor, ZDI - in 120). I have waited
       | much longer, up to half a year in one case.
       | 
       | "Responsible" disclosure guidelines only benefit corporations.
        | They do not protect consumers. Why should independent researchers
        | - working for free, no less (and sorry, the well-below-minimum-
        | wage pittance that is most bounties does not count as not working
        | for free) - have to cow tail to corporate guidelines?
       | 
       | If you find a vulnerability, do everyone a favor and disclose it
       | immediately. This places pressure on the corporation to fix it
       | immediately, instead of ignoring it indefinitely.
        
         | samhw wrote:
         | FYI, it's "kowtow", not "cow tail".
         | 
         | Also, it's "This makes it immediately available to exploit
         | before a fix can even _theoretically_ be developed", not "This
         | places pressure on the corporation to fix it immediately".
        
       | davewritescode wrote:
       | I'm not defending Apple but looking at the code published here,
       | it's clear that most, if not all, of these bugs could be caught
       | via static analysis which Apple obviously uses as part of its
       | approval process.
       | 
        | Frankly, I'm a lot more concerned about bugs that deal with
        | input handling than SDK bugs that developers can use to do
        | bad things.
       | 
       | This is likely a non-issue for those of us who haven't jailbroken
       | our devices.
        
         | quotemstr wrote:
         | > of these bugs could be caught via static analysis which Apple
         | obviously uses as part of its approval process.
         | 
         | What? Are you suggesting that OS security bugs are in fact non-
         | issues because static analysis can detect programs that exploit
         | these bugs?
         | 
         | No, it doesn't work that way. You can _always_ encode program
         | logic in a way that will defeat static analysis. All you have
         | to do is write a little interpreter with PEEK, POKE, and JUMP
         | opcodes, then encode your actual exploit logic using the little
          | instruction set you've just created. You can make this sort of
         | indirection as elaborate as you want, and there's no way for
         | static analysis to see through it all. When this kind of
         | technique is used for DRM, it takes months of expert human
         | analysis to figure out what's going on. No way some scanner is
         | going to do that, especially if (unlike in the DRM cracking
         | case) there's no clear indication that there's something to
         | discover and decode in the first place.
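          | 
          | To make that concrete, here's a minimal sketch in Swift of the
          | kind of indirection described above (purely illustrative; the
          | opcodes, encoding, and example program are invented):
          | 
          |     // Toy VM: the "real" logic lives in a byte array that a
          |     // static scanner sees only as opaque data, not as code.
          |     enum Op: UInt8 { case peek = 0, poke = 1, jump = 2, halt = 3 }
          | 
          |     func run(_ program: [UInt8], memory: inout [UInt8]) {
          |         var pc = 0
          |         var reg: UInt8 = 0
          |         while pc < program.count {
          |             guard let op = Op(rawValue: program[pc]) else { return }
          |             switch op {
          |             case .peek:  // reg = memory[operand]
          |                 reg = memory[Int(program[pc + 1])]; pc += 2
          |             case .poke:  // memory[operand] = reg
          |                 memory[Int(program[pc + 1])] = reg; pc += 2
          |             case .jump:  // pc = operand
          |                 pc = Int(program[pc + 1])
          |             case .halt:
          |                 return
          |             }
          |         }
          |     }
          | 
          |     // The "program" can arrive over the network as plain bytes.
          |     var mem = [UInt8](repeating: 0, count: 16)
          |     run([0, 3, 1, 7, 3], memory: &mem)  // peek 3; poke 7; halt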
        
         | illusionofchaos wrote:
         | > static analysis which Apple obviously uses as part of its
         | approval process
         | 
          | This analysis is a joke: it just scans the strings inside
          | binaries against the list of symbols corresponding to what
          | Apple considers to be private API. The gamed exploit can be
          | uploaded to the App Store and the binary will pass their
          | analysis with flying colors.
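          | 
          | For instance, a string-list scan is trivially defeated by
          | assembling the name at runtime. A sketch in Swift (where
          | "secretMethod" is a made-up placeholder, not a real private
          | API):
          | 
          |     import Foundation
          | 
          |     // The literal "secretMethod" never appears in the binary,
          |     // so a scan of its strings finds nothing to flag.
          |     let sel = NSSelectorFromString(
          |         ["sec", "ret", "Met", "hod"].joined())
          |     let obj = NSObject()
          |     if obj.responds(to: sel) {
          |         _ = obj.perform(sel)
          |     }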
        
         | kif wrote:
          | If Apple allowed a jailbreaking application to make it to the
          | App Store, then I do not trust their processes not to fail
          | again.
        
       | Cieplak wrote:
       | Why must iOS use WiFi to run critical security updates? I assume
       | it's a kickback from telecoms to reduce network bandwidth from
       | users with unlimited mobile data plans?
        
         | vadfa wrote:
         | You got downvoted but this is correct. For a long time iOS
         | didn't allow you to perform big downloads (apps, updates, etc)
         | from the device if you were on mobile data because they didn't
         | want to upset the carriers. But I believe those days are over.
        
         | diebeforei485 wrote:
         | This has changed on the newer phones (12, 13):
         | 
         | https://www.macrumors.com/2020/10/21/iphone-12-can-download-...
        
         | [deleted]
        
         | 542458 wrote:
         | Does iOS have any way of telling if you're on an unlimited data
         | plan? If not, maybe it's just to prevent the footgun (and
         | subsequent bad PR) of somebody accidentally updating the OS
         | over an expensive metered connection. But it could also be a
         | carrier demand, not really sure.
        
           | rovr138 wrote:
           | Settings > Cellular
           | 
           | It shows my carrier, amount of data used and shows remaining
           | on my plan.
           | 
            | Mine reads:
            | 
            |     Usage: Used 7.43GB - Unlimited
            | 
            | If I click on it, it has 3 fields: Data, Calls, Messages.
            | 
            | Data reads the same here. Calls and Messages simply say
            | 'Unlimited'.
        
             | easton wrote:
             | My phone does not have this (iPhone on 15.0 in the US,
             | AT&T).
        
               | rovr138 wrote:
               | Huh, I'm on the US too, T-Mobile.
        
               | shever73 wrote:
               | Mine doesn't have this either (Europe), and I have
               | unlimited data too. I have a "Data Plan" setting under
               | "Mobile Data", which is not activated, so I'm guessing
               | that setting is only there if your provider gives you a
               | data plan that Apple recognises.
        
       | solarkraft wrote:
       | I get happy about iOS security vulnerabilities, because they
       | allow users to mess with the software of the device they own
       | through jailbreaking.
       | 
       | Hopefully one of these will end up in a jailbreak.
        
       | jacquesm wrote:
       | If Apple can't handle properly disclosed vulnerabilities on their
       | main revenue generating platform what does this say about other
       | companies? Nothing good I'm afraid.
       | 
       | Meanwhile the contact list on my dumbphone is perfectly safe.
        | Time and again that's been proven to be the right decision;
        | convenience seems to trump security in the eyes of many, but I
        | just don't want to give this up until there is a 'cloud free'
        | smartphone.
        
         | desertraven wrote:
         | Would you consider something like the PinePhone once it's a bit
         | more usable?
        
         | matbatt38 wrote:
          | A dumb phone might just be backdoored as well - how do you
          | trust it? Do you have an open source dumb phone?
        
         | ChuckNorris89 wrote:
          | _> If Apple can't handle properly disclosed vulnerabilities on
          | their main revenue generating platform what does this say about
          | other companies?_
         | 
          | It doesn't say anything about other companies, it just says
          | that Apple doesn't give two shits about relationships with
          | security researchers, despite their massive resources and
          | wealth, even when smaller or FOSS teams do much better. Apple
          | are the kings of user experience, which made them insanely
          | wealthy, but that's about it. In every other respect they are
          | anti-consumer, anti-developer, anti-repairability, anti-
          | researcher, anti-openness, anti-standardization AF and act like
          | major a-holes in general to anyone outside their org who isn't
          | a customer.
         | 
         | It's not that Apple can't be better on the other fronts if they
         | actually wanted to, it's that they actively choose not to be,
         | as that has no impact on their stock price or consumer
          | experience and, in consequence, their executive pay. So why do
         | things differently if people still buy your products?
         | 
          | At this point, I wouldn't be surprised if the _"Apple is more
          | secure and has fewer vulnerabilities"_ reputation just stems
          | from researchers getting tired of dealing with Apple's BS of
          | not acknowledging or paying them, so instead they just keep
          | quiet and sell the 0-days they find on the exploit markets
          | (hard-working honest researchers still need to eat and pay
          | rent), only for those exploits to later end up in the hands of
          | shady companies like NSO or nation states, therefore leading
          | to no news in the mainstream media about Apple-related
          | vulnerabilities. Win-win for Apple I guess.
        
         | jgilias wrote:
         | I realize you don't mean it that way, but this comes off as a
         | bit 'whatabout-ish'.
         | 
          | It doesn't say absolutely anything about other companies. It
         | just says that Apple doesn't take security nearly as seriously
         | as their Marketing and Sales department would want us to
         | believe.
        
       | WA wrote:
       | I'm wondering how the health-data incident works with respect to
       | the GDPR.
       | 
       | Apple says it stores Health data in a protected way on the
       | device.
       | 
       | In reality, health data is leaked through logs and can be
       | accessed by any other app. It is impossible to tell whether or
       | not this data has been accessed in the wild.
       | 
       | Since Apple failed to implement their claimed security features
       | properly and you need to assume exploitation by apps in the wild
       | in the worst case, this would require a disclosure to GDPR
       | authorities. Did they do it? Were they fined yet?
        
         | evercast wrote:
         | According to my understanding of GDPR, the data would need to
         | contain personal information i.e. something that allows you to
         | link it to an identifiable person. Quick search on google gives
         | the following definition of personal data [1]:
         | 
         | "Personal data are any information which are related to an
         | identified or identifiable natural person."
         | 
         | So if there is no personal data in the logs, it should not be a
         | GDPR breach.
         | 
         | [1] https://gdpr-info.eu/issues/personal-data/
        
           | cybrox wrote:
            | Not sure how far-fetched an accusation can be in this case.
            | 
            | If the data is accessible in plain text on a device that is
            | clearly linked to an identifiable natural person, and is
            | data that an attacker can easily access, the point of "just
            | this one log file not containing the data" is pretty much
            | moot.
           | 
           | Would be an interesting case.
        
             | WA wrote:
             | It really is interesting. Apple could potentially claim
             | that they didn't connect _Personally Identifiable
             | Information_ with the leaked health data, but a third party
             | app, which gathered that data, did.
             | 
             | Depends on what exactly was in the logs. Did it contain my
             | emergency contact from Apple Health? Or my own contact
             | data? That would be bad.
        
               | illusionofchaos wrote:
                | You can see the logs in JSON inside the Settings app.
                | Also, if two vulnerabilities are used together, you can
                | get the full name and email and connect them to the
                | health data.
        
       | ThePhysicist wrote:
       | Can Apple retroactively identify apps that might have exploited
       | these vulnerabilities to exfiltrate personal data? In my
       | understanding they receive the full source code of an app for
       | review, so they probably have an archive with all revisions that
       | they could go through using automated tools to identify exploit
       | code? Would be good to know if these exploits have been used in
       | the wild, being able to exfiltrate the entire address book
       | without any user involvement whatsoever is quite scary.
        
         | 0x0 wrote:
          | There is no way they could prove that an app HASN'T exploited
          | this. They don't get source code, only compiled binaries, and
          | with Objective-C's extremely dynamic nature, any app could
          | technically receive an HTTP response containing class and
          | method names as strings to dynamically look up and invoke,
          | maybe even based on the app's IP address or only on specific
          | dates. So calls to these exploitable APIs could have happened
          | and there would be no way to prove otherwise.
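          | 
          | As a sketch of that pattern in Swift (the names here are
          | harmless, hard-coded stand-ins for whatever a server would
          | send):
          | 
          |     import Foundation
          | 
          |     // Class and selector names are plain data; in an attack
          |     // they'd arrive in an HTTP response, possibly only for
          |     // chosen targets or on specific dates.
          |     func invoke(className: String, selectorName: String) {
          |         guard let cls = NSClassFromString(className)
          |                         as? NSObject.Type else { return }
          |         let sel = NSSelectorFromString(selectorName)
          |         let obj = cls.init()
          |         if obj.responds(to: sel) { _ = obj.perform(sel) }
          |     }
          | 
          |     invoke(className: "NSDate", selectorName: "description")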
        
           | illusionofchaos wrote:
           | Furthermore, no one stops you from developing an app and
            | planting an RCE vulnerability inside the binary. Then you can
           | exploit it remotely when necessary and execute the code that
           | exploits any iOS vulnerabilities known to you.
        
             | 0x0 wrote:
             | True but it is complicated by the fact that code signing is
             | generally enforced for executable segments. (JIT
             | compilation entitlements are generally not available to
             | apps beyond Apple's own MobileSafari builds)
        
         | brigandish wrote:
         | It would probably take the exploitation of a security hole in
         | Apple's systems to find out, as they clearly have no desire nor
         | incentive to do this.
         | 
         | Is it odd that I'm now hoping this might happen while also
         | hoping for them to start patching up security holes?
         | 
         | Edit: typo
        
         | sumedh wrote:
         | > In my understanding they receive the full source code of an
         | app for review
         | 
          | I did not know that. Is it even legal that Apple gets to look
          | at your IP?
        
         | saagarjha wrote:
         | Apple doesn't get your app's source code when reviewing it,
         | they just receive the binary.
        
       | jupp0r wrote:
       | Thanks to @illusionofchaos for sticking to responsible disclosure
       | and putting themselves under legal risk for the benefit of
       | Apple's users.
       | 
       | Who knows if any of these are exploited in the wild already
       | (Pegasus, etc) and by whom.
        
       | rkagerer wrote:
       | Good on you. I'd be happy to make a small contribution to your
       | legal fund if Apple tries to send lawyers after you. Even if
       | these vulns weren't critical there's no excuse for their inept
       | handling. Thank you for pressuring them to get their act
       | together.
        
       | resist_futility wrote:
       | Just looking at the readme sample code it appears to be using
       | APIs unavailable in iOS, so where is the vulnerability?
        
         | illusionofchaos wrote:
          | It's just marked as unavailable. Apple does that to try to keep
          | people from using XPC on iOS. Use the full code from GitHub; it
          | has a bypass for that Xcode check.
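            | 
            | For context, one way such an availability annotation can be
            | sidestepped (a sketch of the general dlsym technique; the
            | repository's actual bypass may differ) is to resolve the
            | symbol at runtime instead of linking against it:
            | 
            |     import Darwin
            | 
            |     // The symbol exists at runtime even though the iOS SDK
            |     // headers mark it unavailable.
            |     typealias XPCConnCreate = @convention(c) (
            |         UnsafePointer<CChar>, UnsafeMutableRawPointer?, UInt64
            |     ) -> OpaquePointer?
            | 
            |     if let sym = dlsym(dlopen(nil, RTLD_NOW),
            |                        "xpc_connection_create_mach_service") {
            |         let create = unsafeBitCast(sym, to: XPCConnCreate.self)
            |         let conn = "com.apple.gamed".withCString {
            |             create($0, nil, 0)
            |         }
            |         // ... continue with the usual xpc_* message flow
            |         _ = conn
            |     }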
        
           | resist_futility wrote:
           | Unless they have evidence of it getting past Apple and into
           | the App Store, just doing it dynamically doesn't change
           | anything
        
             | illusionofchaos wrote:
              | If you have a developer account that you are willing to
              | sacrifice and don't mind the possibility of legal action,
              | you can try that. I've managed to upload the binary built
              | from the source code of the gamed exploit repository on
              | GitHub to App Store Connect and install it onto my own
              | device via TestFlight. I didn't submit it for review, but
              | if the functionality had been concealed, it would easily
              | have passed.
             | 
              | As far as I know, the way review happens is that reviewers
              | just install apps onto their iPads, tap through all the
              | screens they can find and make their decisions based purely
              | on that. So if an app connects to a server and asks what it
              | should do, it's possible to make an app behave differently
              | for reviewers than for all other users.
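              | 
              | That pattern takes only a few lines. A sketch in Swift
              | (the endpoint URL and flag value are hypothetical):
              | 
              |     import Foundation
              | 
              |     // The app ships looking innocent; the hidden path
              |     // only runs when the operator's server says so.
              |     let url = URL(string: "https://example.com/flag")!
              |     URLSession.shared.dataTask(with: url) { data, _, _ in
              |         let flag = data.flatMap {
              |             String(data: $0, encoding: .utf8)
              |         }
              |         if flag == "on" {
              |             // concealed behavior, never shown in review
              |         } else {
              |             // ordinary, innocuous behavior
              |         }
              |     }.resume()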
        
       | 58x14 wrote:
       | This is such an incredible amount of vulnerable mission-critical
       | data.
       | 
        | - all contacts, including 3rd party messaging apps, with
        |   metadata (interactions, timestamps, other stats)
        | - full address book
        | - whether any app is installed
        | - SSID of connected wifi
        | 
        | and formerly,
        | 
        | - medical info
        | - device usage
        | - screen time
        | - device accessories
       | 
        | I don't keep anything mission-critical on mobile, but this is
        | still a gargantuan set of exploits, and each appears extremely
        | straightforward to validate and pay out the security researcher
        | for (and maybe even patch). It's utterly tragic how Apple (and
        | others) have devolved into the same monolithic, careless
        | organizations they once displaced.
       | 
       | I really, really hope something changes. Soon.
        
         | blub wrote:
         | The only mitigating factor is that they're not remote
         | vulnerabilities.
         | 
         | That being said, this is more or less the industry standard.
         | And even if the other person mentioning this was downvoted,
         | they are right: this has been the case since forever and can
         | only be remedied through laws making companies responsible for
         | their failures. But neither the US government nor said
         | companies want this. It will have to get so bad that it visibly
         | harms US or EU security for something to move in this space.
        
           | user-the-name wrote:
           | And if you try to deploy these in an actual app, you will be
           | getting banned very, very hard.
        
             | jtbayly wrote:
             | Do you have any evidence that Apple has been checking for
             | these vulnerabilities in apps? I mean, if you tried now,
             | sure, I'm guessing you'd get caught (or will be soon). But
             | these have been around a long time.
        
               | user-the-name wrote:
               | No, I mean if you try it now.
        
               | Mindwipe wrote:
                | That seems a generous guess, given that the mitigation
                | of fixing the bug would be _much_ less arduous to do and
                | hasn't been done (for two of the exploits).
        
         | hsbauauvhabzb wrote:
          | If these are gargantuan, how would you describe a remote zero-
          | click complete device compromise (complete with
          | camera/microphone access)? What about an exploit that can cause
          | the user's phone to explode?
        
           | 58x14 wrote:
           | I would be an order of magnitude less concerned with
           | camera/mic access, compared to perfect historical proof of my
           | usage and communication patterns.
           | 
           | Exploits often feel like pathogens, probably why they share
           | the term virus. If a virus has a high mortality rate,
           | contagion is lower, because it frequently kills the host
           | before it can spread.
           | 
           | Similarly, I think a 'complete device compromise' is much
           | more likely to be identified, prioritized, and patched. The
           | vulnerabilities mentioned here represent the highest severity
           | without an immediately noticeable fallout. Props to the
           | researcher.
           | 
           | P.S. I wonder if the researcher's Russian nationality
           | (assumed from their other post) had any impact on their lack
           | of payout.
        
             | pvg wrote:
             | It's hardly 'perfect historical proof', not to diminish the
             | seriousness of the vulnerability. But more importantly, the
             | mechanism matters a great deal. This particular
             | vulnerability requires the install of a malicious app, a
             | much higher bar than a 'drive by' exploitation. This leaves
             | a trace and exposes the attacker to consequences. No
             | (statistically speaking) app producer with any interest in
             | continuing to use the platform would deploy such an exploit
             | even if they had access to it.
        
               | akiselev wrote:
               | _> This particular vulnerability requires the install of
               | a malicious app, a much higher bar than a  'drive by'
               | exploitation._
               | 
               | It's a much higher bar when it's a targeted attack but
               | not necessarily if it's a dragnet like when a malicious
               | party buys a browser extension from the creator to
               | harvest user data. The only real difference between the
               | two scenarios is iOS's significantly stricter review
               | process and sandbox - if this exploit can bypass both
               | [1], it doesn't matter whether the malicious developer
               | can be traced because it'll just be some indie dev who
               | just sold his username/password and signing keys for
               | $X0-Y00k to some shell corp in the Bahamas.
               | 
               | [1] Are these exploits detectable through static analysis
               | or some other automation? (I have no idea)
        
               | pvg wrote:
               | The 'only real difference' is a pretty big difference -
               | the iOS developer is much more strongly identified. It's
               | also not the only difference - what you can do with the
               | access is different and what you end up doing with the
                | access is different. But in both cases, there are strong
                | disincentives against doing very overtly malicious shit
                | - few extension takeovers go around stealing your online
               | banking password, even though they could.
               | 
               | A drive-by exploit has a lot fewer of these constraints.
        
               | saagarjha wrote:
               | Dumb implementations can be caught via static analysis.
               | Smart ones are not going to be caught until Apple
               | realizes they are exploiting people and reverse engineers
               | them by hand.
        
               | jtbayly wrote:
               | > No (statistically speaking) app producer with any
               | interest in continuing to use the platform would deploy
               | such an exploit even if they had access to it.
               | 
               | Except Facebook. Or another behemoth that felt they could
               | weather Apple's wrath if it ever came to it. Or a company
               | that Apple had granted special permission to do this,
               | like they did with Uber.
        
               | pvg wrote:
               | Apple killed a bunch of FB's tracking a few months ago
               | and the story is still making the rounds and is on the
               | front page as we speak.
               | 
               | The point isn't that this isn't a serious vulnerability
               | or is somehow unexploitable. It's just that there's a
               | great deal of friction in exploiting it for relatively
               | paltry returns. Nobody is going to get into a spat with
               | Apple and invite regulatory and law enforcement attention
               | to traceably and with undeniable intent steal your
                | contact list. Nobody is going to (like another commenter
                | hypothesized) launch a supply chain attack to steal your
               | contact list with this vuln. At that point you'd find a
               | better vuln. This one isn't all that much better than
               | just misleading people into giving you contact list
               | permission.
               | 
               | The OP is saying it's somehow worse than drive-by or low-
                | interaction vulns that own entire devices and have
               | been repeatedly found in the wild. I don't think that
               | holds up.
        
               | hsbauauvhabzb wrote:
                | I've always been curious how many developers might drop
                | this in their code but only activate it against
                | potentially valuable targets.
        
               | pvg wrote:
                | I'd think also almost zero. A lot of sleazy data
                | collection operates under at least some fig-leaf pretense
                | of user consent; in this case there's none. Once the
                | vulnerability is discovered, Apple could find out whether
                | you've ever deployed such code. Then you'd probably have
                | problems bigger than just a contractual dispute with
                | Apple.
        
               | tonypace wrote:
               | Maybe some devs are allowed to do this. And that's why it
               | wasn't patched.
        
               | funcDropShadow wrote:
                | To exploit this you only have to infect one of the many
                | dependencies of a user-installed app. It is not true that
                | the attacker has to forge an identity in the App Store.
                | It is enough to insert your code somewhere along the
                | whole software supply chain. And that can be long - think
                | about React Native apps using NPM dependencies.
        
           | kbenson wrote:
           | > What about an exploit that can cause the users phone to
           | explode?
           | 
           | Possibly world ending, at least from the perspective of the
           | user whose phone explodes next to their face?
        
             | hsbauauvhabzb wrote:
             | I'm not saying the above issues don't matter, but they're
             | hardly the most critical things you could do to an iPhone.
        
             | klyrs wrote:
             | Do like GM, keep your phone at least 5 feet away from other
             | phones while not in use.
        
           | sam0x17 wrote:
           | You can do a lot more damage with someone's bank account than
           | you can by exploding their phone.
        
             | midasuni wrote:
              | Currently holding my phone, with a full charge. Basically
              | a hand grenade, about 12" from my face, with (thanks to
              | oversized phones) both hands on it.
              | 
              | At best I'd be blind and unable to use my hands. I don't
              | give a stuff about my bank account compared with that.
        
               | kube-system wrote:
               | There are no explosives in a phone though. Lithium
               | batteries _might_ burn when short circuited... however,
               | this requires physically damaging the battery, or
               | shorting the circuitry. This requires some seriously
               | silly hardware vulnerabilities (lithium cell protection
               | is typically implemented in dedicated silicon), not
               | simply a software vulnerability. And if you could
               | successfully perform an exploit like this, you would
               | probably drop the phone before it hurt you.
               | 
               | https://youtu.be/osfgkFyq7lA?t=246
        
             | saagarjha wrote:
             | Depends on where the phone is at the time. Do it while
             | they're on a call and it's going to _really_ suck.
        
             | hsbauauvhabzb wrote:
             | Won't matter much if it burns your house down while you're
             | inside.
             | 
             | A complete compromise can also get access to your bank,
             | mail accounts, message history, mic and camera. Which
             | vulnerability would you prefer be used against you?
        
               | bnjemian wrote:
               | I'd marry mic access, F#$% camera access, and kill bank
               | access. Maybe that's just a personal opinion, though.
        
           | mdoms wrote:
           | Gargantuan.
        
         | jonplackett wrote:
          | Is that really the case? Or were they just not such a big
          | target before, when everyone was spending most of their time
          | on a Windows desktop? Maybe they just got away with it more
          | easily in the past.
        
           | AuthorizedCust wrote:
           | Your apologetics don't work. iOS has had humongous user
           | counts and growing market share in its largest market for
           | over 10 years.
           | 
           | Source for historical market share:
           | https://www.statista.com/statistics/266572/market-share-
           | held...
        
           | frankfrankfrank wrote:
            | Until we understand and push through a system (whether law or
            | practice) that makes harming others, especially against their
            | will and intentionally, far more costly than the massive
            | returns and profits it produces today, NONE of these kinds of
            | behaviors will ever cease.
           | 
           | The examples are numerous;
           | 
           | * Violation of human right to privacy and property
           | 
           | * Violation of human right to not being tracked
           | 
           | * Illegitimate wars
           | 
           | * Pollution
           | 
            | * Drugs (legal and illegal)
           | 
           | * Government incompetence
           | 
           | * Siphoning off potential and sabotaging developing countries
           | through human resource poaching called "immigration"
           | 
           | * The fed fraud
           | 
           | * Government theft and fractional enslavement through
           | taxation
           | 
           | * More fraud through inflation
           | 
            | * Yet more fraud through money "printing"
           | 
           | * And for emphasis; being "secure in their persons, houses,
           | papers, and effects, against unreasonable searches and
           | seizures" (and no, that does not only apply to the
           | government)
           | 
           | ...and probably many more that I am forgetting are all
           | immensely profitable activities that damage and destroy and
           | defraud large numbers of people while providing immense
           | profits and benefits to a very small set of people who are
           | also the most powerful.
           | 
            | You may disagree with what I have to say, but fundamentally,
            | regardless of which set of things you do and don't support,
            | they all have an underlying mechanic: they defraud everyone
            | while immensely profiting a parasitic ruling class, and that
            | applies both to the things you think are good (e.g.,
            | immigration) and bad (e.g., wars). The parasitic ruling class
            | has us squabbling over meaningless crumbs while they are
            | bursting from picking our pockets and exploiting us as they
            | always have, even if their con and lies have changed over
            | time.
        
             | AuthorizedCust wrote:
             | You're getting into implied relative privation here. Yes,
             | bad things exist, and yes, they aren't controlled in a way
             | we'd like. That doesn't excuse Apple and isn't that
             | relevant to an argument about Apple's immediate
             | responsibility for its errors.
        
             | sul_tasto wrote:
             | I'm convinced that the modern role of US political parties
             | is to simply keep the masses squabbling. I wish we could
             | get rid of gerrymandering and the two party dominance.
        
               | wslack wrote:
               | Unfortunately, the Supreme Court decided 5-4 that the
                | courts should have no power here, so it's up to state
               | legislatures.
               | (https://www.npr.org/2019/06/27/731847977/supreme-court-
               | rules...)
        
               | Revercomb wrote:
               | Totally agree here. It has to be intentional at this
               | point.
        
               | smolder wrote:
               | Yes. Hyperpartisanship is the way to maintain minority
               | control of an ostensibly majority-controlled system. Both
               | parties feign gridlock over popular policy but cooperate
               | on what their owners want. Populist support for policies
               | doesn't matter when people are kept divided along party
               | lines. "Radical" nonconformists who aim to do what voters
               | want can always be defeated from within their party if
               | not by the other side.
        
             | mr_overalls wrote:
             | > Siphoning off potential and sabotaging developing
             | countries through human resource poaching called
             | "immigration"
             | 
             | You think immigration should be illegal?
             | 
             | > Government theft and fractional enslavement through
             | taxation
             | 
             | You really had me in agreement with the first few items.
             | You are not a serious person.
        
               | maerF0x0 wrote:
                | I'm going to guess that they're referring to only
                | allowing "cream of the crop" immigration. (e.g. H-1B or
                | other visas are often only open to top candidates, not
                | everyone)
        
               | M277 wrote:
               | The problem with the argument that this is crippling
               | developing countries is that it's often the case that the
               | top talent can't really shine in their country. They
               | can't realize their potential, either due to the lack of
               | means or corruption or envy or whatever. I am from a
               | third world country and a large part of our scientists
                | did great things _because_ they emigrated; at the same
                | time, there are many great minds here who, to do anything
                | at all, have to fight an endless uphill battle.
               | 
               | Like, I'd totally love to have our top engineers and
               | scientists come from the West and do some amazing things
               | here (they 100% can make a great change), but I feel that
               | they actually just can't.
        
               | maerF0x0 wrote:
                | Also it ignores, as I've observed with Filipinos in
                | Canada, that they often send like half of their paycheck
                | back "home". Some of them live in developed-world
                | "squalor" (like many people to a house, or taking the
                | bus) so their families can live like royalty in a low-CoL
                | area.
        
         | headmelted wrote:
         | Cynically I feel like we're maybe expecting a lot here. Privacy
         | (and I'd assume security, given it's a necessity to achieve
         | that) _was_ a well-timed marketing drive at Apple, but that was
         | _months_ ago. You can't expect them to support these minor
         | features forever!
         | 
         | Besides, why focus on something as superficial as keeping your
         | private data safe when the new iPhone now comes in a _gorgeous_
         | pink finish. And with Ceramic Shield and the lightning-fast A15
         | chip? It's truly the best iPhone they've _ever_ made.
         | 
         | These puppies sell themselves without all that expensive
         | privacy talk.
         | 
         | Honestly their attitude to the bug bounty makes me wonder if
         | there's not a small group of engineers that keep screaming
         | about this problem just to have the door closed behind them and
         | a "lol nerds" giggle heard from the execs on the other side of
         | the door.
        
         | rastafang wrote:
         | > I really, really hope something changes. Soon.
         | 
         | I would not hold my breath... it's been like this for decades
         | at least.
        
       | Bellamy wrote:
        | All the hard-working security researchers are this much
        | appreciated by the rotten Apple.
        
       | tyingq wrote:
       | >com.apple.gamed
       | 
       | I like the poetic nature of exploiting that one.
        
       | lisper wrote:
       | After the disclosure of the last critical 0-day, I went to update
        | the OS on my four iDevices. I upgraded three of them to iOS 14.8
       | with no trouble, but when I went to update the fourth it wouldn't
       | let me update to 14.8 but rather only offered me the option of
       | upgrading to 15.0. I didn't want to upgrade to 15.0, so I called
       | Apple support and the first-line tech said, "Oh, I can definitely
       | help you with that." I thought to myself that I'd give long odds
       | against, but let's wait and see. Long story short, the matter has
       | now been escalated two levels and is still not resolved. Funny
       | side-story: at one point the first-tier tech suggested I try
        | upgrading the phone using iTunes. iTunes has not existed in macOS
       | for quite a while now. The way you talk to iDevices now is
       | through the Finder (which actually makes a lot more sense), but
       | apparently their tech support didn't get the memo.
       | 
       | Apple used to be the company that made devices that were secure
       | and "just worked". Now they are as bad as Microsoft in the bad
       | old days, and no one makes computers that "just work" any more.
       | 
       | :-(
        
         | Ansil849 wrote:
         | > Apple used to be the company that made devices that were
         | secure and "just worked".
         | 
         | This is a complete myth. In fact, not only did Apple devices
         | break all the time, but they were near-impossible for regular
         | users to repair on their own. A simple proof: how many broken
          | iPods did people use to have lying around?
        
           | vultour wrote:
            | I've never even heard of a broken iPod. Who are these people
            | that have several lying around?
        
             | kube-system wrote:
              | Way back when they had spinning disks, they were pretty
              | failure-prone. They were easy to replace, though; I vaguely
              | remember replacing one myself.
        
           | pram wrote:
           | None? I had an iPod Touch that lived in my glovebox for 9
           | years and it never died. My Rio Karma on the other hand
           | basically disintegrated. Thanks for asking!
        
           | lisper wrote:
           | > This is a complete myth.
           | 
           | No, it isn't. Snow Leopard was awesome. Mavericks was also
           | pretty solid. In fact, I'm still running that on my machines
           | today.
        
             | Ansil849 wrote:
             | Yes, it is. Snow Leopard and Mavericks are not devices. The
             | quote I am responding to is:
             | 
             | > Apple used to be the company that made devices that were
             | secure and "just worked".
             | 
             | Unless your first generation iPod still works wonders.
        
               | lisper wrote:
               | Sorry, I'm old-school. A computer is a device.
        
               | Ansil849 wrote:
               | > A computer is a device.
               | 
                | Correct. A computer is a device. Snow Leopard and
                | Mavericks - your two examples - are, however, not
                | computers.
        
               | lisper wrote:
               | Oh, good grief, do you really need to get that pedantic?
               | Apple made computers (devices) that ran those operating
               | systems, and that combination mostly Just Worked (at
               | least it did for me).
        
               | Ansil849 wrote:
                | Pedantry is trotting out the fact that yeah, you can
                | find plenty of people still running Windows-whatever on
                | their Gateways, or a lovingly cared-for TRS that still
                | "just works". The point is that by and large, Apple
                | devices are built with planned obsolescence in mind
                | (see the lawsuit they settled a few months ago about
                | literally this).
        
               | masswerk wrote:
               | Ok, old (first gen) Mac Pro, preferably running Snow
               | Leopard.
               | 
               | BTW, I've one still running. Also a G3 from 1999 still
               | up.
        
               | blueboo wrote:
               | Mechanical hard drives that lived in pockets and
               | backpacks inevitably died. People seem to have been happy
               | with the lifespan they got, though.
        
         | [deleted]
        
         | sascha_sl wrote:
         | They might've sliced it into two parts on macOS, but it's still
         | iTunes, with all the issues iTunes had.
        
         | concinds wrote:
          | Sadly, Apple always gives the "latest and greatest" version all
          | the security updates, and is more selective about backports;
          | see here: https://support.apple.com/en-gb/HT212814 for stuff
          | you're unprotected from if you stay on iOS 14.
         | 
         | This includes arbitrary code execution with kernel privileges.
         | 
         | This has been the case for a long time for Apple, which forced
         | me to break my Catalina boycott (remember the macOS Vista
         | stories here?) because I don't want unfixed, publicized 0days
         | on my machine.
        
         | saagarjha wrote:
         | Here, I'll save you the trouble: download the iOS 14.8 IPSW for
         | your device, and then in the Finder when your device is
         | connected hold down the option key and hit "Update". Then
         | select the IPSW and it'll update your device with that.
        
       | meibo wrote:
        | Interesting to note: these exploits aren't caused by memory-
        | safety issues, like a lot of other Apple exploits, but by plain
        | API abuse and bad design.
        
         | jareklupinski wrote:
         | that's why these were "Wontfix: Working Poorly as Designed" :^)
        
       | xpuente wrote:
       | Security through obscurity is a very bad idea. This is the main
       | Achilles heel of Apple.
        
         | ls65536 wrote:
         | Obscurity and secrecy seem to be so ingrained in the Apple
         | culture that I don't see this changing anytime soon short of
         | some kind of existential crisis or a major threat to a
         | substantial amount of their revenue. And even then it isn't
         | likely to be easy to turn the tide.
         | 
         | Right now we see these negative externalities of security
         | vulnerabilities being "paid for" by their customers, in the
         | vast majority of cases seemingly unwittingly, but that can only
         | go on for so long before it backfires (even if it's a long
         | time).
        
       | filoeleven wrote:
       | Does this rise to the level of a class action lawsuit? Especially
       | if I theoretically started receiving spam texts with more
       | personal info (my name, contacts' names) in them starting about
       | seven hours ago?
       | 
       | I haven't, I'm just curious.
        
       | gargs wrote:
       | Unfortunately, the fact of the matter is that the only way for
       | Apple to recognize a problem is to have a lot of news and
       | influential tech media cover it.
        
       | aguasfrias wrote:
       | That contact and messages access exploit is quite neat. Some
       | might say it even looks straightforward. I can only assume this
       | is the "private API" that darkly patterned apps use, because it
       | looks that obvious.
        
       | chewyfruitloop wrote:
        | 0-day ... what they mean is ... I found an issue. It's hardly a
        | 0-day; they've not posted any evidence of it actually being used
        | in the wild, just that "it can be used". Why is a normal "ooh
        | look, a bug or two" all of a sudden hair-on-fire, world-is-
        | burning news... oh yeh ... I get more attention this way by
        | exaggerating.
        | 
        | Apple may have a terrible bug bounty and response process ... but
        | call a spade a spade.
        | 
        | ... I saw a bird this morning ... help, there's dinosaurs on the
        | loose!!!!
        
         | breakingcups wrote:
         | What is your definition of 0-day? Because they are exactly
         | right, this is a 0-day. Whether it's already actively being
         | exploited or not has no bearing on the definition.
         | 
          | I'll refer you to
          | https://en.wikipedia.org/wiki/Zero-day_(computing) to make up
          | your own mind.
        
           | chewyfruitloop wrote:
            | normally I'd define it as something found in the wild being
            | exploited already ... not a bug that's been found, reported
            | and "ignored"
            | 
            | this seems to be a zero day just because Apple haven't seen
            | fit to respond to the reporter
        
             | samhw wrote:
             | > normally I'd define it as something found in the wild
             | being exploited already
             | 
             | Yeah, but ... that's not what it means. You can choose to
             | define "spoon" as "fork" too, but I don't see how it's
             | useful to go around complaining about other people using
             | the actual definition of the word.
        
         | ksml wrote:
         | A zero day is a zero day, regardless of whether it's been
         | exploited in the wild. There's a decent chance that well-funded
         | bad actors have already found these, and you have no idea
         | whether they've been used or not.
        
       | alphabettsy wrote:
       | Maybe it's just me, but these aren't what I think of when I hear
       | 0-day.
       | 
       | These are serious, but I was guessing remote code execution or
       | sandbox escape. It seems like we're talking about bypassing
       | privacy controls though.
       | 
       | That said, Apple needs to take this much more seriously. They
       | created the program reluctantly and it shows.
        
         | soheil wrote:
         | There needs to be a distinction between 0-days that make remote
         | code execution possible and those that don't. This is still a
         | pretty damning data leak.
        
           | wwn_se wrote:
            | There is a system: CVSS scoring.
            | https://www.first.org/cvss/specification-document
        
         | H8crilA wrote:
         | FYI, 0-day just means "first time made public".
        
           | tinus_hn wrote:
           | It means a vulnerability is made public while there is no
           | patch available. As opposed to releasing the information a
           | number of days after the patch was released.
        
             | cjbprime wrote:
             | It doesn't even really mean that anymore. From the blog
             | post here:
             | 
             | > I've reported four 0-day vulnerabilities this year
             | 
             | They're just using 0-day to mean "a new vulnerability
             | finding disclosed to the vendor privately", which is
             | becoming the new definition of the term.
        
           | alphabettsy wrote:
           | Completely aware of that. Just a weird perception thing for
           | me.
        
         | gzer0 wrote:
          | From an earlier post: This is a boatload of mission-critical
          | data.
          | 
          | - all contacts, including 3rd party messaging apps, with
          |   metadata (interactions, timestamps, other stats)
          | - full address book
          | - whether any app is installed
          | - SSID of connected wifi
          | 
          | - medical info
          | - device usage
          | - screen time
          | - device accessories
        
           | alphabettsy wrote:
           | I must have missed the medical info.
           | 
           | I guess I don't consider contacts mission-critical though I
           | definitely would not want them exposed and generally don't
           | give apps access to them.
        
       | annoyingnoob wrote:
       | If Apple will not do the ethical thing then why should people try
       | to help them? I say give Apple 30 days and then sell it to
       | someone else. Let Apple live their own reality.
        
       | raman162 wrote:
        | Unfortunate situation for this researcher; shame on Apple for
        | not handling these vulnerabilities quickly.
        | 
        | I used to believe that iPhones were more secure than Android
        | and was considering making the switch. After reading this
        | article and with some other recent news (CSAM[1], spam on the
        | App Store[2]) I don't think I'll be hopping on the iOS train
        | anytime soon.
       | 
       | [1]:https://www.apple.com/child-safety/ [2]:
        
         | tonmoy wrote:
          | While Apple's recent behavior does seem bad, I'm personally
          | wondering if there is some quantitative measure comparing
          | iOS/Android before I make the opposite switch. I wonder if
          | iOS still may be more privacy-friendly compared to
          | alternatives regardless of the recent issues (I genuinely
          | have no clue)
        
           | raman162 wrote:
            | IMO, that still seems to be Apple, but the lines are
            | getting closer than ever
        
       | zibzab wrote:
       | > ... one was fixed in 14.7, but Apple decided to cover it up and
       | not list it on the security content page. When I confronted them,
       | they apologized, assured me it happened due to a processing issue
       | and promised to list it on the security content page of the next
       | update. There were three releases since then and they broke their
       | promise each time.
       | 
       | I think this is 100% intentional.
        
       | omnicognate wrote:
        | > medical information (heart rate, count of detected atrial
        | fibrillation and irregular heart rhythm events)
       | 
       | > menstrual cycle length, biological sex and age, whether user is
       | logging sexual activity, cervical mucus quality, etc.
       | 
       | Wat? How, and under what circumstances is it collecting stuff
       | like cervical mucus quality??
       | 
       | Edit: ah, maybe I misread - it's "whether the user is logging
       | cervical mucus quality" I think. Still, wtf?
        
         | GoToRO wrote:
          | You need to select female under Health.
        
           | omnicognate wrote:
            | I don't have an iDevice. Is this a built-in health app or
            | an API for apps to record stuff?
        
             | saagarjha wrote:
             | The Health app stores all this data and third party apps
             | can ask to record to this database.
        
       | PostThisTooFast wrote:
       | "O-day?"
        
       | xvector wrote:
       | I suspect when Apple doesn't want to pay the bug bounty, they
       | just ignore it.
        
       | Amin699 wrote:
        | The vulnerability allows any user-installed app to determine
        | whether any app is installed on the device given its bundle ID.
       | 
        | XPC endpoint com.apple.nehelper has a method accessible to any
        | app that accepts a bundle ID as a parameter and returns an
        | array containing some cache UUIDs if the app with the matching
        | bundle ID is installed on the device, or an empty array
        | otherwise.
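        | 
        | As a minimal sketch (not the researcher's exact PoC), a query
        | against that endpoint via the C XPC API could look roughly
        | like the Swift below; the message key is a placeholder, since
        | the real format of the request dictionary is private:
        | 
        |     import XPC // public on macOS; private territory on iOS
        | 
        |     // Connect to the daemon's Mach service.
        |     let conn = xpc_connection_create_mach_service(
        |         "com.apple.nehelper", nil, 0)
        |     xpc_connection_set_event_handler(conn) { _ in }
        |     xpc_connection_resume(conn)
        | 
        |     // Ask about an arbitrary bundle ID. "app-bundle-id" is
        |     // a hypothetical key, not the actual protocol.
        |     let msg = xpc_dictionary_create(nil, nil, 0)
        |     xpc_dictionary_set_string(msg, "app-bundle-id",
        |                               "com.example.target")
        |     let reply =
        |         xpc_connection_send_message_with_reply_sync(conn, msg)
        |     // Per the description above, a non-empty array of cache
        |     // UUIDs in `reply` means the app is installed; an empty
        |     // array means it is not.
        |     _ = reply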
        
       | JoeOfTexas wrote:
        | The problem is that cybersecurity is a ridiculously hard
        | problem. The junior to senior developers are just using
        | existing frameworks with poor documentation. Any consumer
        | technology will be beaten into submission.
       | 
       | It's the same never-ending war as anti-cheat vs cheat.
        
         | api wrote:
         | It would really help a lot if:
         | 
         | (1) We used safe languages like Go, Rust, C#, Java, and Swift
         | instead of 1970s YOLO languages like C and C++.
         | 
         | (2) Our operating systems were designed from the ground up to
         | be secure in a modern environment.
         | 
         | Unix (all flavors) and Windows were both built long before
         | security was anywhere near as much of a concern as it is today.
         | Their security posture is very lax by modern standards.
         | Applications have a ton of permissions by default including
         | seeing the whole network and filesystem. Everything runs with
         | only basic memory isolation. Everything has access to an
         | enormous system call surface area.
         | 
          | It is extremely hard to retrofit security onto an insecure
          | system, especially a complex one with lots of legacy support
          | requirements. You are going to be playing a whole lot of
          | "whack-a-mole."
         | 
         | A modern secure OS would begin with the principle of least
         | privilege and be built from the ground up to isolate
         | applications as much as possible from anything they are not
         | entitled to access.
         | 
         | Oddly enough the web browser might be the best attempt. When
         | you browse the web you are basically swimming in malware, and
         | you are _relatively_ safe.
        
           | ziml77 wrote:
           | Entitlements are great, right up until someone has to start
           | manually deciding what access to give. After being blocked or
           | questioned by the OS a handful of times, people are going to
           | either disable the security or effectively disable it with
           | wildcard matches.
        
         | charlchi wrote:
          | If companies paid platform/architecture engineers or
          | SecOps/sysadmin types fair amounts of money and treated them
          | with respect, instead of throwing more and more money at
          | hordes of mindless devs who are "pushing product", as
          | virtually every single company does, maybe this problem
          | wouldn't exist.
         | 
         | I've had this conversation too many times. Security isn't hard,
         | it's just that nobody has respect for it. The guy who
         | understands software security isn't getting the respect he
         | deserves. The situation is so bad, some companies are literally
         | hiring people who know the equivalent of script kiddie
         | "penetration-testing".
         | 
          | Pay security engineers enough and listen very carefully to
          | what they have to say. The only organizations that seem to
          | understand this basic concept are the intelligence agencies
          | and a few other high-profile companies.
        
         | foxfluff wrote:
          | > The problem is that cybersecurity is a ridiculously hard
          | problem.
         | 
          | I don't believe that. Making _perfectly_ secure software at
          | large scale is indeed very hard, but a lot of the security
          | issues we see every day have little to do with lack of
          | perfection. There's a ton of low-hanging fruit out there.
         | 
         | > The junior to senior developers are just using existing
         | frameworks with poor documentation.
         | 
          | Yes, I do believe that the modern way of quickly duct-taping
          | junk together while paying little attention to security does
          | make for lots of security issues, which might give someone
          | the impression that cybersecurity is ridiculously hard.
         | 
          | Security requires proactive effort. It's not ridiculously
          | hard, but it needs to be done, and it requires time (just
          | like anything else). Leaving it to "hope you don't write
          | buggy code" and "does a colleague notice the gaping hole in
          | code review" is not doing security, it's just winging it.
         | 
         | And I've never really been asked to do security, even when
         | literally working on a security product. Everyone just seems to
         | assume security is an automatic byproduct of skilled
         | programming, but it's not when the pressing concern is "how
         | many hours/days/.. does it take to ship this new feature oh and
         | we have three dozen other features that need to be implemented
         | soon" and "can we get it sooner?" and "why is it taking so
         | long, it's not that hard!"
         | 
         | It needs to be discussed, planned, designed, reviewed, tested,
         | verified, questioned, audited. Just like anything else, if you
         | want quality. None of this is ridiculously hard, but it needs
         | to be done for it to be done. If it's not being done, it's that
         | the organization doesn't care about it.
         | 
         | In a lot of ways it's similar to technical debt. Are you taking
         | the time to avoid it, reduce it, get rid of it? No? Then you're
         | accumulating it. Security issues accumulate just like technical
         | debt when ignored. And even the most skilled programmers do
         | generate technical debt (because they don't jump into a problem
         | with perfect knowledge of what needs to be done). Depending on
         | the organization, they may or may not get to fix it. Many
         | organizations just don't care and would rather have the devs
         | work on something "more productive."
        
         | Silhouette wrote:
          | _The problem is that cybersecurity is a ridiculously hard
          | problem._
         | 
         | Security is challenging, but there are lots of things we know
         | how to do better that many software developers still aren't
         | doing. For example, if you're still writing application-level
         | software in an inherently dangerous programming language in
         | 2021 and that application handles important data in a connected
         | device, you're part of the problem. If you ship a networked
         | hardware product with a standard password for all devices or
         | forget to disable the backdoor access used during development,
         | if you use encryption or hashing methods that are known to be
         | weak, etc. These things obviously won't solve all security
         | problems, but they are low-hanging fruit that is still being
         | left on the tree far too often.
        
         | dangerface wrote:
          | Doesn't Apple make the frameworks they use? I'm pretty sure
          | they could just patch them when they are notified of a
          | vulnerability.
        
         | hungryforcodes wrote:
         | You know, I'd love to think that the problem is cyber security
         | is hard -- which it IS -- but I'm starting to get the feeling
         | that the actual problem is that Apple doesn't care about this
         | kind of stuff. So many incredible vulnerabilities going back
          | generations of iPhones and iOS... the zero-click iMessage one
         | floored me.
        
           | tyrfing wrote:
            | > the zero-click iMessage one floored me.
           | 
           | In case there is any confusion, there has been at least one
           | of those a year for the past 3 years.
        
             | easton wrote:
              | What is dumb about it to me is that the solution in my
              | mind is simple: don't give Messages.app private API
              | access. It gets access that other messaging apps from the
              | App Store can't have, and that's what's causing these
              | vulnerabilities, but all it needs is APNS and access to
              | the SMS service (which is private but shouldn't be
              | dangerous... right?).
        
               | kenferry wrote:
                | This last one was an issue in image decoding... it was
                | a public API.
               | 
               | It's extremely common for an attacker to find a way to
               | exploit a maliciously crafted image. Take a look at
               | libpng, https://www.cvedetails.com/vulnerability-
               | list/vendor_id-7294...
        
         | jonathanstrange wrote:
         | You talk as if the Apple engineers couldn't have fixed those 3
         | vulnerabilities in half a year because it's too difficult for
         | them...
        
         | esalman wrote:
            | Apple makes a ridiculous amount of money, and many Apple
            | fanboys I know believe their devices are hack-proof.
        
           | toofy wrote:
           | i'm not sure anyone who is at least semi-informed believes
           | their devices to be hackproof.
           | 
           | it's more likely they've considered which security issues
           | they're personally concerned about and decided the other
           | choices are as bad or worse.
           | 
            | once we actually dig into the specifics of an issue
            | (especially one as complicated as personal threat models
            | combined with actual usability) it's rarely a "my device is
            | now unhackable" cartoon caricature.
        
           | blowski wrote:
           | I don't think of my devices as being hack-proof, but as being
           | the best set of trade-offs for me personally between
           | security, privacy, usability, etc.
        
             | esalman wrote:
              | Agreed. I use Windows/Android for the same reason: I know
              | what I'm giving up, but the upsides are important enough
              | to me. I am talking about people who don't have much of a
              | clue about either.
        
         | emsy wrote:
          | The problem is not that cyber-security is hard, but that a
          | trillion dollar company is incapable of handling security
          | disclosures.
        
           | Mtinie wrote:
           | In which case--if it's a legitimate deficiency--that doesn't
           | bode well at all for any other commercial enterprise. This
           | arms race is always tilted in favor of the attacker.
        
             | derac wrote:
             | There are much smaller companies which handle security
             | disclosures much better.
        
               | fragmede wrote:
                | but there are many many _many_ more that don't, which
                | impeaches the few that do. The fact that I can reach
                | the CEO of my local small business on the phone is
                | great when they're sympathetic to my plight. It's
                | actually to the company's detriment when it turns out
                | they're a raging asshole.
        
           | numpad0 wrote:
            | I remember some early Android phones that ran manufacturer-
            | maintained kernels turning out to be immune to then-
            | undisclosed RCEs years before disclosure.
            | 
            | I think it's less about disclosure handling and more about
            | being motivated to pay for or build black-magic code
            | analysis tools.
        
             | fragmede wrote:
             | got any links? that sounds like a fascinating story to
             | read!
        
         | godelski wrote:
          | > The problem is that cybersecurity is a ridiculously hard
          | problem.
         | 
         | This is why I wonder why "minimize your data exposure" is such
         | a controversial opinion.
        
           | jeltz wrote:
            | Because it makes developing new features slightly slower,
            | and that is, for mostly stupid reasons, deemed
            | unacceptable.
        
         | wyattpeak wrote:
         | Cybersecurity is a genuinely hard problem, but stuff like this
          | is dropping the ball entirely. It's not hard to fix exploits
          | like faulty permission-checking _after_ they've been reported
          | to you.
         | 
         | Sure, there are always going to be problems you miss. I can
         | forgive them shipping with zero-days, it happens. Failing to
         | respond to reports is just that: failing.
        
           | Bhilai wrote:
            | Not to mention that there are still no mandatory security
            | courses in the CS curriculum at most major universities.
        
           | eloisius wrote:
            | It really helps add some color to the motivations behind
            | notarization. It seems ridiculous to me that I have to jump
            | through so many hoops to run an executable that I trust,
            | especially when Apple can't be bothered to follow up on
            | real vulnerabilities that have already been reported.
        
             | concinds wrote:
             | Worse is the fact that the bulk of Apple's security on Mac
             | is codesigning/notarisation, and a primitive signature-
             | based built-in antivirus. Windows seems to be doing so much
             | better that it's not even close, and in ways that aren't
             | user-hostile.
        
             | chii wrote:
              | Exactly - any PR propaganda about notarization or signing
              | making it safer for users is just BS. It's a gate-keeping
              | mechanism that adds a layer of power for Apple and
              | prevents any control of the app market from slipping
              | away.
        
               | noptd wrote:
               | Along with a solid bit of resume building for SecEng, I'd
               | say yeah exactly that.
        
         | Bellamy wrote:
         | Existing frameworks?
         | 
          | Does anyone know the technology stack Apple uses?
        
         | howinteresting wrote:
         | Or better, we could stop making excuses for trillion-dollar
         | companies with terrible security practices.
        
           | toofy wrote:
            | i agree that we need to hold massive corporations to a
            | higher standard, but security _is_ incredibly difficult.
            | this comment is more directed at all of the commenters who
            | are dismissive of how difficult good security is.
            | 
            | even some of the sharpest security/privacy minds on the
            | planet, who work for organizations that exist purely to
            | make secure software or hardware, stumble often.
           | 
           | now add in that we actually expect our devices to also be
           | usable, and yes, security is very _very_ _very_ difficult.
           | 
           | if someone hasn't learned by now that every device will
           | eventually be cracked, we should question their ability to
           | reason.
           | 
           | now again, we should absolutely expect more from
           | apple/google/$company than we do, but we also can't hand-wave
           | away that security is very hard.
        
             | howinteresting wrote:
              | I mean, Google does a far better job than Apple at
              | responding to reported vulnerabilities, pays a team to
              | find vulnerabilities, has extensive fuzzing
              | infrastructure, etc. Even among its peers Apple stands
              | out as particularly bad.
        
         | TaylorAlexander wrote:
          | > The problem is that cybersecurity is a ridiculously hard
          | problem.
         | 
         | This is hard for me to believe for a company the size of Apple.
         | They were recently the wealthiest company on the planet and are
          | worth over a trillion dollars IIRC. They could slow down
          | their software development process, focus less on adding new
          | features, and prioritize having fewer security holes. It
          | seems like such a huge risk to them that their devices are
          | basically always vulnerable. But the general public never
          | really hears about these 0-days, so they may have some
          | ability to ignore the problem.
         | 
         | The cynic in me imagines that someone at Apple knows about
         | these bugs and they are shared with spooks for exploitation.
         | One could imagine that even for a responsible disclosure
         | program, they could share details of every new vulnerability
         | with some three letter agency who could have months of use out
         | of them before a patch is finally released.
        
           | dspillett wrote:
           | _> recently the wealthiest company on the planet ... They
           | could slow down their software development process, focus
           | less on adding new features, and prioritize fewer security
           | holes._
           | 
            | My inner cynic [1] has a slightly different take on that:
            | you don't get to be the wealthiest company on the planet by
            | doing the right thing at the expense of the profitable
            | things.
            | 
            | [1] He says, pretending it isn't also his outer and all-
            | consuming cynic!
        
             | skylanh wrote:
              | Ah, then the cynical part of you will recognize that
              | selling a widget is more lucrative than avoiding the loss
              | of someone else's private data.
             | 
             | Or at least that's what I just realized.
        
               | someguydave wrote:
                | people are unwilling to wait or pay more for high-
                | quality products when a low-quality substitute that is
                | just barely adequate is available
        
         | tester34 wrote:
          | Have they considered asking security questions in interviews
          | instead of algos?
          | 
          | like you know, a shitty-performing algo can always be
          | rewritten; a leak or 0-day cannot be reverted
        
         | wyager wrote:
         | > It's the same never-ending war as anti-cheat vs cheat.
         | 
         | Building secure software and building anti-cheat are nothing
         | alike.
         | 
         | Anti-cheat is fundamentally impossible (on a true general
         | purpose computer) because you're building software that has to
         | run in an environment where it can be dissected and modified.
         | 
         | Building a secure device (something which has to satisfy
         | certain security properties for all inputs through a
         | constrained API) is fundamentally possible. We're just too lazy
         | and cheap to do it. I don't even think it's "ridiculously hard"
          | - we would just have to spend more money on correct-by-
          | construction designs, formal methods, etc. instead of
          | spending money on flashy UIs and random features no one
          | cares about.
         | 
         | The number of employees at Apple working on security is
         | vanishingly small compared to the number of employees, say,
          | working on random Siri features. And the way they approach
         | security in many apps is also wrong. Instead of having people
         | with formal correctness backgrounds designing the APIs against
         | which apps are built, they just have developers with no
         | security background putting everything together and then have a
         | few security people trying to pick up the pieces.
        
         | concinds wrote:
         | This seems like much more of an organisational dysfunction
         | problem than a computer science problem. I haven't heard
         | anything like this about Microsoft or Google: both seem
         | responsive and eager to fix within 90 days (mostly), have
          | responsible browser update models (where fixes for 0-days can
         | be pushed to the whole world within hours) instead of Apple's
         | irresponsible "you need a 3GB OS update even if the only fix is
         | 3 lines of code in Safari, but we don't push updates when ready
         | since we don't want to cause update fatigue, so you'll need to
         | wait 4 weeks for the fix to come in the next programmed OS
         | update", and so much more. Even Android seems decent security-
         | wise, since many of the processes that would cause security
         | concerns get updated through Play Store, even in really old
         | phones.
         | 
         | Another example: my grandma doesn't have WiFi, and I bought her
         | an iPhone, but iOS updates can't be done on 4G, which can only
         | be bypassed by internet-tethering a computer (which she doesn't
         | have), downloading the full IPSW, and installing it. She was on
         | iOS 14.0 until 2 weeks ago when I fixed it. And with Safari
         | being, according to security researchers, much less secure than
          | Chrome, that makes me shudder. This isn't an "anecdote" or an
          | edge case; not everyone lives in a developed country, and
          | millions are just like my grandma. Apple's poor security
          | design leaves her vulnerable for zero good technical reason,
          | where Android wouldn't. (They only dropped Google Play
          | Services support for Jelly Bean a few months ago, so even an
          | old Android phone would be reasonably secure.) Caring about
          | security requires thinking of details like this.
        
           | guntars wrote:
           | The other day I tried to update my Macbook over a personal
           | hotspot and it happily downloaded about 2GB before the
           | computer went to sleep and I was greeted with a "Whoopsie,
           | failed to download the update, try again" message when I woke
           | it and, of course, it would just start over again. They don't
           | even support resuming the download! That's just embarrassing.
        
             | wrycoder wrote:
             | I've now tried three times to update Safari. The first two
             | times I left the laptop on charge with the lid open (I have
             | it set to auto update). The third time I did it manually
             | and watched. It got the download ok, but failed during
             | install. It told me the update failed, but with no reason
             | or suggestion regarding what to do next.
        
               | wrycoder wrote:
               | Edit: I should point out that not everyone has a high
               | data cell account. For example, I'm retired, and we don't
               | spend much time on the road. When we do, we're careful
               | how we use data, and we depend on wifi at restaurants and
               | hotels.
               | 
               | We are pretty heavy Internet users at home (Starlink),
               | but the cell plan is 2GB per month, shared between me and
               | my wife. We rarely overrun that.
               | 
               | I think that there are a lot of retired people with small
               | phone plans and no WiFi. All they do is a few phone calls
               | and a bit of email with the family. So, updating those
               | phones frequently is problematic. They probably take them
               | to the phone store where they bought them.
        
             | robertlagrant wrote:
             | Strangely I did the same thing last night and it did
             | resume, although you only see the resumption when it
             | starts. There's no prior indication that it will resume.
        
               | merb wrote:
                | it depends on your free storage. if it is less than X %
                | they will try to download it in one go. the resumable
                | download will download it all in chunks and then concat
                | them.
        
               | guntars wrote:
               | Just tried it again. I have 70GB available. Got on the
               | hotspot, started the download, unplugged the laptop,
               | waited for it to go to sleep, woke it up - same result. I
               | had to accept the EULAs again and it started to download
                | from about 50 MB.
               | 
               | Having attempted (unsuccessfully) to write a resumable
               | HTTP/HTTPS downloader, which is what I suspect
               | nsurlsessiond is using behind the scenes - it's really
               | hard to get it right. Meanwhile I expect to be able to
               | start a BitTorrent download, throw the laptop down the
               | stairs, take out the hard drive and be able to
               | successfully resume it on a different computer because
               | that's a protocol that was actually designed for it.
        
               | codetrotter wrote:
               | But Apple has the luxury of controlling both the server
               | and the client. Why isn't it just a matter of using HTTP
               | range requests?
               | 
               | https://developer.mozilla.org/en-
               | US/docs/Web/HTTP/Range_requ...
               | 
                | And if they need to get more clever than that, why not
                | have a BitTorrent-like map of chunk hashes? You download
                | the update and check the hash of the whole thing; if it
                | fails the hash check, download the chunk hashes, say a
                | 256-bit hash for every individual 32 MB chunk of the
                | file. Then, for any chunk that fails its hash check, use
                | an HTTP range request to download just that chunk again.
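                | 
                | A range-request resume is only a few lines with
                | URLSession; a rough sketch (the URL and byte offset
                | here are made up):
                | 
                |     import Foundation
                | 
                |     // Ask the server for only the bytes we don't
                |     // have yet.
                |     var req = URLRequest(url: URL(
                |         string: "https://updates.example.com/u.bin")!)
                |     let have: Int64 = 52_428_800 // partial size on disk
                |     req.setValue("bytes=\(have)-",
                |                  forHTTPHeaderField: "Range")
                | 
                |     let task = URLSession.shared.dataTask(with: req) {
                |         data, resp, _ in
                |         // 206 Partial Content: range honored, append
                |         // `data` to the partial file. A plain 200 means
                |         // the server ignored the Range header.
                |         if let http = resp as? HTTPURLResponse,
                |            http.statusCode == 206, let data = data {
                |             print("got \(data.count) more bytes")
                |         }
                |     }
                |     task.resume()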
        
               | guntars wrote:
               | We can only speculate, but my guess is some combination
               | of "didn't feel the need to" and "the code responsible
               | for this is nested 6 layers deep between 3 different
               | frameworks".
        
           | slvng wrote:
            | Microsoft has been all over the cybersecurity news due to
            | their repeated vulnerabilities in Exchange and Azure AND
            | the way they have handled disclosures made to them.
        
             | artiszt wrote:
             | u must have missed reading this : ".. Microsoft or Google:
             | both seem responsive and eager to fix within 90 days
             | (mostly) .."
             | 
             | but then u not need to read it. and who would given all the
             | exquisite experiences with M$ and/or Goo compared to
             | nightmares Apple delivers to u, overpriced, ofc
             | 
             | in a sense it is good reading tho after all in that it
             | indicates that exactly those are not the one's one does
             | meet in Apple-communities
        
               | iso1210 wrote:
               | It's spelt "you"
        
           | rixed wrote:
           | > my grandma doesn't have WiFi
           | 
           | Tech workers have difficulty taking into consideration
           | lifestyles they don't know exist, which is understandable. At
           | the end of the day this comes as another consequence of the
           | lack of diversity in tech, I guess.
        
             | msh wrote:
             | I don't think this is diversity specific. I know plenty of
             | young people who just use LTE and tether their computer to
             | their phone.
             | 
              | I think it's more a legacy thing: when iPhones first came
              | out this made some sense, to spare the mobile networks
              | the load.
              | 
              | If you have a 5G iPhone you can set it to download
              | updates over 5G, but it's just plain stupid that you
              | can't do it over 4G. Over 2G or 3G it makes a little bit
              | of sense.
        
             | funcDropShadow wrote:
             | So, is this the lack of grandmas working at Apple in
             | software development? This is nothing you can fix by
             | following some diversity ideology. This is a question of
             | respecting different requirements from different user
             | groups. You cannot mirror every user group in the
             | development teams. How do you represent people of old age,
             | with illnesses, or certain disabilities in a development
             | team? How do you represent people living at certain
             | locations in the world in a development team? How do you
             | represent poor people or people of low education? Or how do
             | you represent people which live under oppressive regimes or
             | are the continued victims of criminals?
             | 
              | All these groups have different needs and expectations of
             | the products they buy or lease. And development teams and
             | their business cannot expect to understand that by having a
             | more diverse development team. They have to do better
             | requirements engineering, they have to listen better to
             | their customers and they have to decide and prioritize
             | those needs and expectations. E.g. A/B testing has abysmal
             | consequences for the needs and expectations of minorities.
        
               | shorner wrote:
               | Now that you mention it, I think there's a real lack of
               | grandmas in tech... maybe I shouldn't be saying this
               | publicly, but we don't have any at our company.
        
               | OldHand2018 wrote:
               | One of our product managers is a grandmother (she is
               | actually taking an early retirement soon because her 4th
               | grandchild is on the way).
               | 
               | We are in the B2B/B2EDU space so the "As a grandmother, I
               | think..." line of thought does not apply. However, she
               | has frequently had insights and observations that none of
               | us would have come up with. Once implemented, they have
               | been very successful/profitable.
               | 
               | So yes, absolutely, unless your company wants to be in a
               | very specific niche, the lack of true diversity in your
               | company is a drag on your success.
               | 
               | PS - my grandfather had a tech job in Sunnyvale. He
               | passed away last year at the age of 92. Point is -
               | everyone alive has lived in a world with pervasive
               | technology and computing. "They're old and can't
               | understand this stuff" is pure BS.
        
               | funcDropShadow wrote:
               | > One of our product managers is a grandmother
               | 
               | > However, she has frequently had insights and
               | observations that none of us would have come up with
               | 
               | > So yes, absolutely, unless your company wants to be in
               | a very specific niche, the lack of true diversity in your
               | company is a drag on your success.
               | 
                | The conclusion you are drawing here does not follow
                | from your two observations above. The fact that your
                | product manager has had insights that nobody else in
                | the team had does not mean a "lack of true diversity in
                | your company is a drag on your success." Some diversity
                | may help in
               | certain situations and in others not. The above mentioned
               | insights and observations might just be the result of
               | competence and more experience of the product manager or
               | incompetence of the rest of the team. There may be many
               | other reasons. We don't know. We have one observation and
               | should refrain from generalizing. That this is because of
               | more diversity is just a speculation. A speculation that
               | fits an often repeated narrative, but that doesn't make
               | it a logical conclusion.
        
               | kiklion wrote:
               | Sometimes you need to accept you just aren't the target
               | audience of a product.
               | 
                | You may love cars. You might think Teslas are amazing.
               | But if you live on a small island without an electrical
               | grid, it might not be the car for you just because it
               | doesn't come with its own solar panels.
        
               | funcDropShadow wrote:
               | But what do you do when you realize you are neither the
               | target sheep, ehh, customer, of iOS nor Android? But that
               | is just the other side of the coin of what I wrote about
               | the development teams. Sometimes, they just have to
               | accept that diversity is not going to solve all problems.
        
               | EvanAnderson wrote:
               | Owning an iPhone with no other means of Internet
               | connection makes one not the target market for an iPhone?
        
               | taylodl wrote:
                | All these different groups you're speaking of are
                | called _personas_. Apple should have each of their
                | personas identified based on their technical prowess,
                | their life experiences, health, how they connect for
                | software updates, and whatever else may differentiate
                | one group of users from another and needs to be
                | considered and accounted for.
               | 
               | Once you have that you create a user journey map for each
               | of those personas - in this case they need a user journey
               | map for updating. Your test teams then have to take each
               | persona and run through testing with those constraints
               | and capabilities in mind.
               | 
               | They don't have a high-speed network or are using a
               | cellular network? One or more personas should have
               | accounted for that. Color blind? One or more personas
                | should have accounted for that, and there's software
                | available that can render your screen as it appears to
                | those who are colorblind.
               | 
               | These personas should be corporate-wide - these are your
                | _customers_ after all. I would be shocked if Apple isn't
               | doing something like this, but then again, after getting
               | a glimpse at how the sausage is made I've come to the
               | conclusion Big Tech isn't any better at creating and
               | testing software (well, not _that_ much better) than
               | anyone else.
        
               | funcDropShadow wrote:
               | Of course, these groups are called personas in software
                | engineering circles. I was replying to people who
               | believe diversity will somehow fix all problems of this
               | world. And they do not talk about personas. They talk
               | about minorities, victims and races.
               | 
                | Before Apple or any other consumer company describes
                | personas, there is an explicit or implicit decision of
               | which personas to consider and which not. But most
               | consumer targeted companies hide which personas they
               | consider and which not.
        
               | rixed wrote:
               | > They talk about minorities, victims and races.
               | 
               | When I mentioned diversity, what I actually had in mind
               | was that all tech workers in California have wifi, and
               | probably find this so normal to connect your phone to
               | wifi that they expect any normal user will do this on a
               | daily basis.
               | 
                | I'm not a grandma, but I'm a tech worker with a very
                | different lifestyle. Like the grandma of the person I
                | was replying to, I have no wifi. Like her, I have to
                | fake it from time to time so that my Android phone will
                | back up photos, agree to download Google Drive
                | documents, etc.
               | 
               | So the diversity I had in mind is rather a diversity in
               | location and lifestyle.
        
               | taylodl wrote:
               | That's what I understood you to mean too. I just added in
               | differently-abled people. I've never seen a persona based
               | off minorities, "victims" (not sure what that means in
               | this context?), or races. It's always been capabilities
               | and life experiences.
        
           | Abishek_Muthian wrote:
           | > "you need a 3GB OS update even if the only fix is 3 lines
           | of code in Safari,...",
           | 
            | If Apple really cared, it would have changed this scenario
            | by now. A 7-year-old Android device which can
            | download/sideload the latest Firefox is technically more
            | secure for web browsing than the latest iOS device which is
            | yet to receive an OS update with the critical patch for
            | Safari.
            | 
            | But as Tim Cook proudly claims, people who use Apple
            | products are those who 'don't want to take risky decisions
            | themselves'; maybe they'd be willing to wait for that OS
            | update however long it takes.
        
           | wrycoder wrote:
           | Depending on her data plan, downloading several tens of gigs
           | over her 4G could have resulted in a rather exciting phone
           | bill.
        
           | scns wrote:
            | I hate blaming people and pointing fingers. It seems to me
            | that the current CEO, who was the CFO at the same company
            | before, would be well advised to drop the "numbers &
            | metrics" MBA mindset.
           | 
           | Written five years ago but sadly still relevant:
           | https://steveblank.com/2016/10/24/why-tim-cook-is-steve-
           | ball...
        
             | Revercomb wrote:
             | Agreed here.
        
           | PostThisTooFast wrote:
           | Not to mention Apple's amateur-hour, dogshit-stupid policy of
           | forcing everyone to use an E-mail address as his user ID.
           | 
           | How many people probably think they need to use their E-mail
           | account password for every one of these idiotic organizations
           | that forces them to log in with their E-mail address? I'm
           | guessing a SHITLOAD. Your grandma definitely is not going to
           | understand the difference.
           | 
           | And that makes every one of those organizations the
           | gatekeeper to every user's E-mail account. One disgruntled
           | employee, one poorly secured system, or one nefarious forum
           | operator can now access untold numbers of E-mail accounts and
           | steal identities galore.
           | 
           | You don't see banks or brokerages forcing people to use a
           | goddamned E-mail address as a user ID, but Apple has doubled
           | down on this bullshit policy even after being called out on
           | it. No excuse.
        
           | mschuster91 wrote:
           | > but iOS updates can't be done on 4G
           | 
           | > This isn't an "anecdote" or an edge case, not everyone
           | lives in a developed country and millions are just like my
           | grandma
           | 
           | In too many countries, mobile data is _incredibly_ expensive.
           | If Apple were to allow over-the-air OS updates, you can bet
           | it would take only a week until the first class-action
           | lawsuit by people having their data caps blown through
           | because they did not understand that updates are _huge_.
        
             | concinds wrote:
             | I get that, but given that some countries have insanely
              | restrictive _Wi-Fi_ data caps (I heard 200GB, which I
              | would probably blow through in a week or less), I don't
              | see the problem. Even if updates-over-4G are disabled by
              | default,
             | power users should be able to turn it on. My grandma has
             | 70GB for 15EUR a month; she only makes FaceTime calls, so
             | uses on average 4GB of data a month. I'd love to use
             | SharePlay to share her screen, and teach her how to use her
             | phone, but that won't be possible, because she can't
             | install the update; it'll have to wait until the next time
             | I travel to see her.
             | 
             | I hear that Android lets you enable updates-over-4G, but
             | allows carriers to block the feature. That's detestable!
             | Surely, if I pay for data, I should be able to use it for
             | anything (legal) I deem important? I don't like
             | corporations making those decisions for me.
        
               | mschuster91 wrote:
               | > My grandma has 70GB for 15EUR a month
               | 
               | I'm German, and ... wtf, I'm envious. We have three
               | carriers: O2 (which is cheap, but their network is
               | horrible), Vodafone (which is expensive, has a good
               | network, and is constantly plagued by issues with its
               | customer service) and Telekom (expensive, good network
               | and decent customer service).
               | 
               | Telekom's cheapest plan is 40EUR a month which has 6GB,
               | and the biggest plan before flat-rate (at 85EUR) is 24 GB
               | for 60EUR a month...
        
               | FabHK wrote:
               | To be fair, looking around a bit, you can get
               | 
               | 14 GB for 13EUR (https://www.handyvertrag.de - it's a
               | brand of Drillish, reseller of the O2/Telefonica/E-Plus
               | network)
               | 
               | 40 GB for 19EUR
               | (https://bestellung.vitrado.de/offer/index/2b5pv1)
        
               | mschuster91 wrote:
               | Yeah, all of that is O2. The problem is, their service is
               | _straight shit_ outside and barely tolerable inside of
                | urban areas at best. My s/o is on their network, it's
               | _ridiculous_.
        
             | danmur wrote:
             | Given they can apply censorship per region I'm sure they
             | could apply a simple setting change per region if they
             | wanted to.
        
             | iaml wrote:
              | Google makes you explicitly confirm that you want to
              | download using cellular each time you update, noting that
              | it might incur charges. There, fixed; how come the
              | geniuses at Apple haven't considered this? Given that
              | nowadays there's a toggle for 5G, and someone on HN told
              | me it's due to contracts with carriers, I reckon the
              | answer is that sweet sweet money.
        
             | seszett wrote:
             | In many other countries, 4G is so inexpensive that many
             | people use it as their primary Internet connection though.
             | 
             | They don't see a need for a modem hooked to a wired line
             | necessitating a second subscription and procedures to
             | follow when you move apartments, when in any case you will
             | have a 4G connection that follows you around on your
             | smartphone.
        
               | Reason077 wrote:
               | I've been using 5G as my primary internet connection
               | since January 2020. It's much faster than wired broadband
               | in buildings that don't have fibre installed. It's also
                | cheaper (I pay £30/month for unlimited data), there's no
               | contract locking you in to 12 or 24 months of service,
               | and I can take it with me whenever I travel or if I move
               | house.
        
               | jeffwass wrote:
               | If you have a desktop, how do you provide the 5G
               | connection?
               | 
               | Is there a better way than providing a wifi hotspot with
               | your mobile phone?
        
               | Reason077 wrote:
               | I have a Huawei E6878-870 5G mobile WiFi (other brands
               | are available), which provides a "proper" WiFi base
               | station, so no need for a phone hotspot. I have a
               | separate cheap SIM card for my phone.
        
           | samhw wrote:
           | > instead of Apple's irresponsible "you need a 3GB OS update
           | even if the only fix is 3 lines of code in Safari..."
           | 
           | Your comment seems to be implying that Apple could simply
           | ship the delta in the source code: i.e. something like `git
            | diff --minimal --word-diff=porcelain HEAD^1`. Would this not
           | require iOS to be compiled on the device, and store its own
           | source code, a la `.git`? How would you address the issue of
           | the compiler itself needing to be updated as part of this
           | process?
           | 
           | Or are you suggesting they would ship the diff for the
            | _binary_? For one, I don't think the delta in the resulting
           | binary would be as small as the delta in the source, due to
           | addresses changing, etc. There are tools like bsdiff which
           | can 'intelligently' handle this, but I believe they aren't
           | widely used for _operating system_ updates, as opposed to
           | standard userspace binaries.
           | 
           | In addition to this, the diff shipped would be relative to
           | whatever version the device is currently using, which would
           | necessitate storing a very large number of diffs, especially
           | for binaries, or else computing it on the fly (which simply
           | shifts the burden from disk space to CPU cycles).
           | 
           | Or have I misunderstood you entirely?
        
             | willis936 wrote:
              | They're suggesting a patch in the classical sense of the
              | term. A whole-hog OS update does not qualify. These
              | complications were addressed 20+ years ago, when whole-
              | hog OS updates were unreasonable. It requires a certain
              | amount of cleverness, but that may be too much to ask of
              | the fruit company.
             | 
             | https://en.wikipedia.org/wiki/Patch_(computing)
        
               | samhw wrote:
               | I understand the change would be a patch, but that's
               | separate from the question of how you encode and ship it,
               | surely? How are you suggesting a small patch would be
               | shipped?
        
               | howinteresting wrote:
               | Android does them just fine:
               | https://source.android.com/devices/tech/ota/reduce_size
               | 
               | In particular, bsdiff has been around for a very long
               | time and is an industry standard binary diffing tool.
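                | 
                | (For reference, usage is `bsdiff old new patch` to
                | produce a delta and `bspatch old new patch` to apply
                | it.)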
        
               | samhw wrote:
               | Yeah, I mentioned bsdiff in my original comment. I don't
               | think Android uses it for operating system updates,
               | though, as far as I can tell. I believe that page is
               | describing APK updates (i.e. app updates).
        
               | howinteresting wrote:
               | Well, monthly OS updates for Android tend to be just a
               | few megabytes, so I'm presuming they use some sort of
               | binary diffing algorithm.
        
               | rescbr wrote:
                | In the simplest way, not unlike Windows Update: snapshot
                | the filesystem, start a filesystem transaction, unzip
                | the changed binary files, check the new files'
                | integrity, end the transaction.
                | 
                | Indeed, Apple used to distribute patches this way in
                | the past.
                | 
                | You could also ship a list of updated system file
                | hashes, compare them to the installed files, and just
                | download the changed ones, like rsync.
                | 
                | Better than shipping a whole new disk image for every
                | small update they do.
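                | 
                | A sketch of that manifest comparison (paths and hashes
                | here are made up; a real manifest would be signed and
                | fetched first):
                | 
                |     import CryptoKit
                |     import Foundation
                | 
                |     // Hash a local file so it can be checked
                |     // against the server-provided manifest.
                |     func sha256Hex(of url: URL) throws -> String {
                |         let data = try Data(contentsOf: url)
                |         return SHA256.hash(data: data)
                |             .map { String(format: "%02x", $0) }
                |             .joined()
                |     }
                | 
                |     // Placeholder manifest: path -> expected hash.
                |     let manifest = [
                |         "usr/lib/libfoo.dylib": "9f86...", // fake
                |     ]
                | 
                |     // Download only files whose hashes differ.
                |     for (path, expected) in manifest {
                |         let local = URL(fileURLWithPath: "/" + path)
                |         if (try? sha256Hex(of: local)) != expected {
                |             print("changed, fetch: \(path)")
                |         }
                |     }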
        
               | samhw wrote:
               | Yeah, I'm sure it's theoretically doable, but it's one of
               | those things which would almost certainly require
               | substantial work given the massively complex edifice of
               | iOS. (And the iOS IPSW format, or even iOS DMGs/ADIs, are
               | very different from the OS X patches you mentioned.)
        
               | EvanAnderson wrote:
               | It's cheaper to make things that aren't maintainable. We
               | optimize for the dev, not the platform or the user.
               | 
               | It's the same lazy dev culture that gives us Electron
               | apps, or the lazy sysadmin culture that gives us docker.
               | 
                | It's cheaper to create a massive, incomprehensible, and
                | unmaintainable edifice that requires massive storage /
                | processor / network inefficiency to maintain than well-
                | thought-out and "clever" systems that work efficiently.
               | 
               | Personally, I wish the days of optimizing for low-
               | resource machines, slow network connections, and
               | expensive storage weren't gone. As an end-user I think
               | the results of moving away from a culture of optimization
               | haven't been good. I think the ship has sailed, though.
        
           | acdha wrote:
           | > I haven't heard anything like this about Microsoft or
           | Google: both seem responsive and eager to fix within 90 days
           | (mostly)
           | 
           | Print Nightmare was part of a risky call in the printing
           | subsystem design which was recognized as such when they made
           | it in the 90s. That specific vulnerability was disclosed to
           | Microsoft in 2020, accidentally disclosed in public in June,
           | flailed at with incomplete patches and bad documentation all
           | summer while ransomware gangs exploited it, and has
           | theoretically finally been patched in the September release
           | following multiple incomplete patches.
           | 
            | The Exchange Autodiscover bug announced yesterday had
            | previously been reported, with Microsoft telling the
            | reporter that it wasn't a bug. Oops.
           | 
           | This is hard but it's also important to remember that this is
           | an industry-wide failure because it's been cheaper to clean
           | afterwards than invest in proactive cleanup of old "stable"
           | code, and it will likely continue as long as there are no
           | financial consequences for a breach. Adding consequences
           | would change that dynamic but would also be a massive change
           | to the industry. It could endanger open source and would
           | almost certainly make everything more expensive.
        
             | smolder wrote:
             | It would not _certainly_ make everything more expensive.
             | The cost of breaches is significant, though difficult to
             | quantify, so the cost of effective prevention could be
             | lower.
        
               | acdha wrote:
               | That's _possible_ but fundamentally I'm looking at this
               | from the perspective of requiring new work to be done by
               | expensive people (infosec, dev, ops are all in-demand
               | skills) -- maybe some of that comes from changing team
               | priorities, in which case the cost is less feature work,
               | but in most cases I'd expect that to be hiring. There
               | aren't many breaches which have direct costs greater than
               | that _because_ companies have been able to avoid
               | penalties or compensation in most cases. If the cost of a
               | breach was greater than, say, a year of free credit
               | monitoring that calculation could change dramatically.
               | 
                | Ransomware has already changed this somewhat: now the
                | cost is halting operations for a potentially lengthy
                | period of time, and that has spurred a lot more awareness
                | that the current model is insufficient but, from what
                | I've seen, not significant effort to change it.
        
           | user-the-name wrote:
           | Apple's entire bug reporting system has always been massively
           | dysfunctional. It is little surprise this extends to their
           | handling of externally reported security issues as well.
        
       | raspasov wrote:
       | I am not able to compile the first two of these (after the first
       | two I stopped trying) with the newest Xcode on iOS 15.0. I
       | haven't tried the other two or older tools.
       | 
       | The source code as given also had syntax errors in it.
        
         | larsnystrom wrote:
         | This makes one question whether there really are any security
         | vulnerabilities at all. Perhaps Apple isn't fixing these
         | because there's nothing to fix? I don't know enough to say
         | whether these vulnerabilities are real or not.
        
           | WA wrote:
            | The Gamed 0-day compiles just fine. I can run it on my iPhone
            | with iOS 14.8 and it shows information for 300 contacts,
            | apparently gathered from various sources, without me granting
            | any permission at all.
        
             | raspasov wrote:
              | Which Xcode version are you using to compile?
        
               | WA wrote:
               | 12.5.1
        
               | raspasov wrote:
                | I am on 13.0; perhaps they fixed it, because I can't
                | compile for older iOS versions with Xcode 13.0.
        
             | singularity2001 wrote:
              | maybe they 'fixed' it by not allowing any app of this kind
              | (specific API call) in the App Store? wouldn't challenge
              | you to try and publish an example app though...
        
               | twodayslate wrote:
               | This is trivial to bypass
        
         | illusionofchaos wrote:
          | Follow the links to GitHub; the code there compiles perfectly.
          | The PoC inside the article is just a shortened version.
        
           | raspasov wrote:
           | Thank you for the explanation! Will give it a shot.
           | 
            | P.S. Yes, it works! Perhaps add a comment in the short
            | version where it says
           | 
           | //This shortened version does not compile, use the GitHub
           | version of the code
           | 
           | It said "proof-of-concept" and I generally expect PoC to work
           | as presented. My bad for not reading everything carefully.
        
             | illusionofchaos wrote:
             | Good idea, I've added the comment
        
         | tonmoy wrote:
         | Just curious, what kind of compilation error are you getting?
        
           | raspasov wrote:
           | 'init(machServiceName:options:)' is unavailable in iOS
        
             | illusionofchaos wrote:
             | This is just a check built into Xcode to try to keep you
             | from accessing XPC in iOS. The code on GitHub bypasses this
             | by calling this method dynamically through Objective-C
             | runtime
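              | 
              | Roughly, the trick looks like this (a simplified sketch
              | rather than the exact code from GitHub; the ObjC signature
              | and the memory management details are assumptions, glossed
              | over for brevity):
              | 
              |     import Foundation
              | 
              |     // Alloc an NSXPCConnection through the ObjC runtime;
              |     // the compiler never sees the unavailable initializer.
              |     let sel = NSSelectorFromString(
              |         "initWithMachServiceName:options:")
              |     let obj = (NSXPCConnection.self as AnyObject)
              |         .perform(NSSelectorFromString("alloc"))!
              |         .takeRetainedValue() as! NSObject
              | 
              |     // Assumed ObjC signature, as in the macOS headers:
              |     // -(instancetype)initWithMachServiceName:(NSString *)
              |     //     name options:(NSXPCConnectionOptions)options
              |     typealias InitFn = @convention(c) (
              |         NSObject, Selector, NSString, UInt
              |     ) -> NSXPCConnection
              |     let initFn = unsafeBitCast(obj.method(for: sel),
              |                                to: InitFn.self)
              | 
              |     // Service name taken from the article's gamed PoC.
              |     let connection = initFn(
              |         obj, sel, "com.apple.gamed" as NSString, 0)
              |     connection.resume()
              | 
              | Because the selector is only resolved at runtime, the
              | availability attribute in the SDK headers never comes into
              | play.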
        
       | Thorrez wrote:
       | Props to the author. One small critique though:
       | 
       | > I've reported four 0-day vulnerabilities this year between
       | March 10 and May 4, as of now three of them are still present in
       | the latest iOS version (15.0) and one was fixed in 14.7
       | 
        | It would have been clearer if a timeline were given for each of
        | the 4 vulnerabilities. The article only gives a timeline for the
        | last vuln (the fixed one).
        
         | idoubtit wrote:
         | The second sentence of the article gives a sufficient timeline.
         | 
         | > I've reported four 0-day vulnerabilities this year between
         | March 10 and May 4
         | 
         | So the vulnerabilities were reported at least 140 days ago. He
         | also mentions 3 upgrades of iOS were published after his
         | reports.
        
       | r00fus wrote:
        | At some point Apple has to realize that their bounty program, as
        | it's currently being run, is tarnishing their privacy branding.
        
       | thatswrong0 wrote:
        | Explain like I'm naive: why would Apple's bug bounty program be
        | so poorly run? Is it simply a sign of organizational failure?
        | (e.g.
       | perhaps the managers running the program have been promoted to a
       | position that they simply don't belong in, and higher up execs
       | don't care? Or are they prioritizing profit over success?)
       | 
        | I would think that, given the profitability and positioning of
        | Apple in the marketplace, they would be heavily incentivized
        | to provide large bounties for finding such destructive
        | vulnerabilities. And I imagine that there are plenty of security
        | people working there who genuinely care about fixing these
        | issues right away.
        
         | raesene9 wrote:
          | Here's my informed guesswork as a total outsider. We've seen
          | similar problems recently with Microsoft, where
          | legitimate-sounding issues are denied bounties, so this kind
          | of issue is not unique to Apple.
         | 
          | My guess would be that MSRC and Apple's equivalent have an OKR
          | about keeping bounties under a certain level. Security is seen
          | as a cost centre by most companies, and what do "well run"
          | companies do with cost centres... they minimize them :)
         | 
         | I don't think that organizationally either company wants to
         | have bad security, and I don't think that individual staff in
         | those companies want to have bad security, but I do think that
         | incentive structures have been set-up in a way that leads to
         | this kind of problem.
         | 
          | I've seen this described as a lack of resources in the
          | affected teams, but realistically these companies have
          | effectively limitless resources; it's just that they're not
          | assigning them to these areas.
        
           | EMM_386 wrote:
            | > My guess would be that MSRC and Apple's equivalent have an
            | OKR about keeping bounties under a certain level. Security is
            | seen as a cost centre by most companies, and what do "well
            | run" companies do with cost centres... they minimize them :)
           | 
           | It would have to be this.
           | 
           | If you start to increase the payout, you get more people
           | wanting the payout.
        
           | theluketaylor wrote:
            | Apple does not have cost and profit centers. They maintain a
            | single profit and loss statement for the entire company.
           | 
           | That doesn't mean the functional areas don't have a budget or
           | resource constraints, but Apple's structure is quite
           | different from most companies.
           | 
           | I'd agree with the other comments that pin Apple's specific
           | issues on their insular culture that discourages all forms of
            | external communication if you're not part of marketing. Great
            | bug bounty programs require good communication with
            | researchers and some level of transparency, two things Apple
            | is structured to avoid and discourage.
        
         | Bhilai wrote:
          | Having worked at a large tech company with a big bug bounty
          | program and seen tonnes of bugs come through, my experience
          | is that usually there is a wide disconnect between the bug
          | bounty program (situated in one org of the company) and the
          | engineering group responsible for fixing the bug (which is in
          | a different part of the company). This is exacerbated by
          | misaligned incentives: the bug bounty team wants fixes ASAP,
          | while PMs/TPMs don't care about security and want engineers to
          | be busy on feature development vs fixing bugs. On top of that,
          | if the leader of that org comes from a non-tech background,
          | then it's even harder to convince them to prioritize security.
          | Bug bounty teams are mostly powerless pawns in the politics
          | between leaders of several different orgs with varying
          | cultures of caring about security.
         | 
         | This is roughly how I have seen things work internally:
         | 
         | * When a bug report comes in, the bug bounty triage team tries
         | their best to triage the bug and if legit, passes it on to the
         | infosec team situated within the organization where the bug
         | belongs.
         | 
          | * The security team for the org then scrambles to figure out
          | which exact team the bug belongs to and assigns it to them.
          | 
          | * A day or two later that team picks up the bug and there is
          | the usual back and forth on ownership: "oh, this is the part
          | that another team wrote and no one from that team still works
          | at the company" or "it's not us, it's another team, please
          | reassign."
         | 
          | * Even when the right team is assigned to the bug, there are
          | discussions about priority and severity - "oh we don't think
          | it's a high sev issue" types of discussions with PMs who have
          | no knowledge about security.
         | 
          | * Even when everything gets aligned, sometimes the fix is so
          | complicated that it can't be done within SLA. In the meantime,
          | security researchers threaten to go public and throw tantrums
          | on Twitter while the bug bounty team chases internal teams for
          | a fix.
         | 
         | * When the bug cannot be fixed within SLA, the engineering
         | folks file for an exception. This then gets escalated to a
         | senior leader who needs to approve an exception with agreement
          | from a leader within security. This takes a couple of days to
          | weeks, and in the meantime the security researcher has
          | completely lost it because they think no one is paying
          | attention to this crazy, oh-so-critical bug they spent day and
          | night working on.
         | 
          | * When the exception is granted, the bug bounty team swallows
          | the pill and tries to make up excuses for why it can't be
          | fixed soon. Eventually, the 90 days are over and the
          | researcher feels disrespected, develops animosity, and starts
          | to think everyone on the other side is a complete idiot.
         | 
          | * A blog post shows up on HN, gets picked up by infosec
          | Twitter, and slowly the media catches up. Now, internally,
          | everyone scrambles to figure out what to do. The bug bounty
          | team says "we told you so" and the engineering team figures
          | out a magical quick band-aid solution that stops the bleeding
          | and partially fixes the bug.
        
           | raylu wrote:
           | it feels like we're conflating two issues here: fixing the
           | bug on time and paying out the researcher. at the point where
           | the bug is too complicated to fix within SLA and the
           | exception has been escalated to senior leadership, surely the
           | bug bounty team can pay the researcher?
        
           | mswtk wrote:
           | That honestly sounds like a failure to communicate with the
           | researcher first and foremost. If it's difficult to
           | prioritize the fix internally due to organizational politics,
           | that's one thing, but that shouldn't stop the bounty team
           | from communicating the status to the researcher. In fact,
           | that should be the simplest part of the whole process, as
           | it's completely within the purview of the bug bounty team. If
           | they handle that right and build some trust, they might be
           | able to successfully ask the researcher for an extension on
           | disclosure.
           | 
           | Case in point, Apple likely could have come out of this
           | looking much better if they didn't ignore and then actively
           | lie to illusionofchaos. That really isn't a very high bar to
           | clear.
        
         | plorkyeran wrote:
         | Apple has always been infamously bad at doing anything with
         | external bug reports. Radar is a black hole that is
         | indistinguishable from submitting bug reports to /dev/null
         | unless you have a backchannel contact who can ensure that the
         | right person sees the report.
         | 
         | Bug bounty programs are significantly more difficult to run
         | than a normal bug reporting service, so the fact that they're
         | so bad at handling the easy case makes it no surprise that
         | they're terrible at handling security bugs too.
        
           | tehabe wrote:
            | I used to submit bug reports for things I found in macOS or
            | other applications, like Pages including a huge picture in
            | the files for no reason at all. But those bug reports would
            | usually be closed and "linked" to another bug report you
            | don't have access to. Essentially shutting you out. At some
            | point you just give up. Eventually bugs do get fixed, but
            | there is no pattern to it.
        
           | mrtksn wrote:
            | I actually got a response to a bug report, saying "We fixed
            | that, can you try it on the next beta and send us a code
            | sample to reproduce it if the bug is still there". But that
            | bug was about the way SwiftUI draws the UI.
        
         | dangerface wrote:
          | It's company culture: some companies see failure as a problem
          | to be avoided, others see failure as the road to success.
          | 
          | Evidently Apple sees failures like bugs as a problem to be
          | avoided and just tries to avoid the problem, with the obvious
          | result that they have problems and look like a failure.
         | 
         | Companies that accept failure as a consequence of trying will
         | learn and improve until they achieve success.
        
         | Bud wrote:
         | It's interesting to me that in this entire thread, nobody is
         | even mentioning or considering the possibility that COVID has
         | impacted Apple's operations.
         | 
         | It obviously has. It has affected every tech company. Certainly
         | it has affected mine. Whether this is an example of that, I
         | don't know, of course, but I think it's plausible.
        
           | Cipater wrote:
           | I'm curious. Would you accept it if Apple came out and said
           | that the reason this is happening is because of the COVID
           | pandemic affecting their operations?
           | 
           | Surely even if it were true, that is no excuse for a company
           | like Apple?
        
           | ok123456 wrote:
           | So what? Everyone has been affected. Apple doesn't somehow
           | get a pass at not doing their basic duties.
           | 
            | Sitting at home looking at a monitor and typing is largely
            | the same as doing the same thing in an office. They're not
            | service sector workers, doctors, nurses, or truck drivers,
            | who have actually had to deal with the impact of this head
            | on.
        
           | saagarjha wrote:
            | I see no reason COVID has anything to do with Apple's poor
            | response to external reporters, which has been a problem for
            | decades.
        
           | howinteresting wrote:
           | Apple could have chosen to simply not release a new iPhone or
           | iOS version this year, if it wanted to. To the extent that
           | they prioritized that over fixing up their security infra,
           | that's on them.
        
         | grey-area wrote:
         | I imagine they are just overwhelmed.
         | 
         | Let's say they have a team of 6 engineers tasked with this.
          | They probably receive hundreds of reports a day, many bogus,
          | some real, but _all_ long-winded descriptions like this one,
          | framed to make the vuln seem as bad as possible. In addition,
          | many vuln reports are generated by automated tools and sprayed
          | to thousands of sites/vendors daily in the hope of one of them
          | paying out; they seem coherent at first glance but are often
          | nonsense or not really a vuln at all, and of course there are
          | many duplicates or near duplicates.
         | 
         | If each of these takes 20 mins to triage, 1 hour to properly
         | look at and days to confirm, you can see how a team of any
         | reasonable size would soon be completely submerged and unable
         | to respond to anything but the most severe and succinct
         | vulnerability reports in a timely way.
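          | 
          | Back of the envelope: even 200 reports a day at 20 minutes
          | each is ~67 engineer-hours of triage alone, against the 48
          | hours a 6-person team has in a day, before anyone spends an
          | hour properly looking at anything.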
        
           | EMM_386 wrote:
           | > Let's say they have a team of 6 engineers tasked with this.
           | 
            | They are a _trillion_ dollar company. They can have as many
            | engineers as they'd like.
        
             | grey-area wrote:
             | You'd probably be surprised how small some departments are
             | at large companies, if they're seen as a cost centre rather
             | than a profit centre.
             | 
             | I agree they could and should do a lot better, I'm just
             | imagining the probable reasons for this level of
             | dysfunction - the most likely explanation to me is an
             | overwhelmed, overworked department submerged in so many
             | requests that they can't filter out the useful ones or
             | respond in a timely way.
             | 
             | Just as one other example of this, the bug reporting system
             | at Apple is antiquated and seen as a black hole outside the
             | company, probably again due to underinvestment.
        
           | moonchrome wrote:
            | I don't buy this. They fixed one reported issue, got back to
            | him, acknowledged the lack of disclosure, apologised,
            | promised to fix it, and never actually disclosed it, 3
            | reports later.
            | 
            | It's not a case of "someone missed this"; it's "this seems
            | dysfunctional".
        
           | ninth_ant wrote:
           | I feel like Apple could afford to staff a security program
           | like this. Much, much smaller and less wealthy companies
           | manage it.
        
             | raylu wrote:
             | much smaller, less wealthy companies attract far fewer
             | reports
        
         | jareklupinski wrote:
         | Apple didn't get rich by writing a lot of checks
         | 
         | ironically, spoken by Gates @0:52
         | https://www.youtube.com/watch?v=H27rfr59RiE
        
         | sseppola wrote:
         | Best explanation I've heard was in Darknet Diaries about Zero
          | Day Brokers, which was a fantastic listen!
          | (https://open.spotify.com/episode/4vXyFtBk1IarDRAoXIWQFf?si=3...)
         | 
         | The short version is that if the bounties become too large
         | they'll lose internal talent who can just quit to do the same
         | thing outside the org. Another reason was that they can't offer
         | competitive bounties for zero days because they'll be competing
         | with nation states, effectively a bottomless bank, so price
         | will always go up.
         | 
         | I don't know much about this topic, but surely there are some
          | well-structured bounty programs Apple could copy to find a
         | happy middle ground to reward the white hats.
        
           | shmatt wrote:
           | this is the real reason. not anything internal/culture
           | related
           | 
           | A good iOS 0-day is worth hundreds of millions of dollars in
           | contracts with shady governments. Apple can't compete with
           | that multiple times a year
        
             | heavyset_go wrote:
             | According to Zerodium, iOS exploits are cheaper than
             | Android exploits because they are so plentiful in
             | comparison.
        
         | techrat wrote:
          | > Explain like I'm naive: why would Apple's bug bounty program
          | be so poorly run?
         | 
         | Hubris.
         | 
         | Apple's culture is still fundamentally the same from the day
         | they ran ads saying "Macs don't get viruses" to today. They
          | used misleading ad copy to convince people they could just
         | buy a Mac and be safe, not needing to do anything else...
         | ignoring that Macs still got malware in the form of trojans,
         | botnets and such... and encouraging a culture of ignorance that
         | persists to this day. "It just works." etc.
         | 
          | So now their primary user base is largely people who have zero
          | safe online habits.
         | 
         | And that sort of mentality feeds back into the culture of the
         | company... "Oh, we're not Windows, we don't get viruses. We
         | don't get viruses because our security is good. Our security is
         | good, obviously, because we don't get viruses." It, in effect,
         | is a feedback loop of complacency and hubris. (A prime example
         | of this is how Macs within the Cupertino campus were infected
         | with the Flashback botnet.)
         | 
         | Since their culture was that of security by obscurity (unlike,
         | say, Google's explicit design in keeping Chrome sandboxed and
          | containerized for sites), closed source, and again, hubris...
          | it's
         | coming back to bite Apple in the ass despite their ongoing "We
         | don't get viruses" style smugness. If it's not about Macs not
         | getting viruses, it's about how Apple values your privacy
         | (implying others explicitly don't) and like with everything
         | else, it's repeated often enough to where the kool aid from
         | within becomes the truth.
         | 
         | Apple's culture is that of smugness, ignorance and yep...
          | hubris. Why should they have a serious, respectable bug bounty
          | program when they've been busy telling themselves that they
          | simply don't have the kinds of security problems they've
          | bragged about never having?
        
         | achikin wrote:
          | I think they don't want to admit such a big security breach
          | publicly. To some extent, privacy is their business.
        
         | fassssst wrote:
         | It doesn't matter how big of a company they are, the only thing
         | that matters financially is whether they're growing or not.
        
         | sam0x17 wrote:
         | Bug bounty programs are the antithesis of Apple's internal
         | methodology, culture, and way of doing business. They keep
         | everything close to the chest, they shun "outsiders", etc.. The
         | idea that someone outside of Apple, from the unwashed masses,
         | could find a flaw in Apple's own software is a pretty big pill
         | for them to swallow. Thus it doesn't surprise me there are
         | problems with their bug bounty program. I think if they could
         | they would prefer to just silence all vulnerability/bug reports
         | with a gag order rather than acknowledging or even
         | investigating them.
        
           | snuser wrote:
            | that's just dumb, like, third parties do all the work and
            | contact you about critical bugs; the only effort on Apple's
            | part is verification and some coordination, which shouldn't
            | be a huge issue for a company the size of Apple.. just hire a
            | team to do it and be done with it. the whole 'secrecy
            | culture' is a bunch of hogwash
        
             | sam0x17 wrote:
             | And yet here we are ;)
        
             | 015a wrote:
             | Apple is all about silos.
             | 
             | So a security threat gets reported to this bug bounty team.
             | They are able to reproduce and confirm. The bug is in some
             | deep, crusty part of the kernel; the code for which isn't
             | available to this team, because Silos.
             | 
             | The team who does have access to this Silo is tracked down.
             | It gets processed into a ticket. Maybe it gets done, maybe
             | it doesn't. Their backlog is already maxed out, as is
             | everyone's.
             | 
             | The security team says "we've done all we can".
             | 
             | This is not a matter of "lol just hire a team". You need
             | leadership aligned, so the manager's manager can see their
             | task list, or open security issues, and say "what the fuck,
             | why are you not prioritizing this".
             | 
             | That's not Apple. Apple is product-driven. They actually,
             | legitimately don't care about Privacy and Security. Their
             | manager-managers get mad when products aren't delivered on
              | time. They may also push back on purposeful anti-privacy
              | decisions. It's not in their culture to push back on
             | Security issues, or latent Privacy issues resulting from
             | those Security issues.
             | 
             | "Just tear down the silos" > Working for Apple is a cult.
             | The silos are a part of the cult; left-over from one of the
             | worst corporate leaders of all time, Jobs. Try telling a
             | cult member their beliefs are false.
             | 
             | "Grant the security team access to everything" > And, I
             | guess, also hire the smartest humans to ever exist on the
             | planet to be able to implement security improvements across
             | thousands of repositories in dozens of programming
             | languages, billions of lines of code? And, even if they
             | could, push those changes through a drive-by review and
             | deployment with a team on the other side of the planet
             | you've never met? You, coming into their house, and
             | effectively saying "y'all are incompetent, this is
             | insecure, merge this" (jeeze ok, we'll get to it, maybe
             | in... iOS 18)
        
             | jcims wrote:
             | I've worked on the bug bounty program for a large company.
             | We did the whole thing. It's hard. The part you're talking
             | about can be the hardest.
             | 
              | It's probably hard to believe reading this, because it
              | sounds like it should be easy. I don't have any good
              | answers
             | there. I'm also not suggesting that customers and
             | researchers accept that, but saying it's easy just
             | diminishes the efforts of those that run good ones.
        
               | vincnetas wrote:
                | could you try a little bit harder to provide an example
                | of why it is "harder than it looks"? you repeated
                | multiple times that it's hard, but what exactly
                | (approximately) makes it hard?
        
               | jcims wrote:
               | It's mostly just 'human factors'. What I'm describing
               | below applies across the spectrum of bug reports from
               | fake to huge to everything in between. Nothing of what
               | I'm listing below is an attempt to directly explain or
               | rationalize events in the article, it's just some context
               | from my (anecdotal) experience.
               | 
               | - The security researcher community is composed of a
               | broad spectrum of people. Most of them are amazing.
               | However, there is a portion of complete assholes. They
               | send in lazy work, claim bugs like scalps and get super
               | aggressive privately and publicly if things don't go
               | their way or bugs don't get fixed fast enough or someone
               | questions their claims (particularly about severity).
               | This grates on everybody in the chain from the triagers
               | to the product engineers.
               | 
               | - Some bounty programs are horribly run. They drop the
               | ball constantly, ignore reports, drag fixes out for
               | months and months, undercut severity...all of which
               | impact payout to the researcher. These stories get a lot
               | of traction in the community, diminishing trust in the
               | model and exacerbating the previous point because nobody
               | wants to be had.
               | 
               | - Bug bounties create financial incentives to report
               | bugs, which means that you get a lot of bullshit reports
               | to wade through and catastrophization of even the
               | smallest issues. (Check out @CluelessSec aka
               | BugBountyKing on twitter for parodies but kind of not)
                | This reduces the signal-to-noise ratio and allows actual
                | major issues to sit
               | around because at first glance they aren't always
               | distinguishable from garbage.
               | 
               | - In large orgs, bug bounties are typically run through
               | the infosec and/or risk part of the organization. Their
               | interface to the affected product teams is going to
               | generally be through product owners and/or existing
               | vulnerability reporting mechanisms. Sometimes this is
               | complicated by subsidiary relationships and/or outsourced
               | development. In any case, these bugs will enter the pool
               | of broader security bugs that have been identified
               | through internal scanning tools, security assessments,
               | pen tests and other reports. Just because someone
               | reported them from the outside doesn't mean they get
               | moved to the top of the priority heap.
               | 
               | - Again in most cases, product owns the bug. Which means
               | that even though it has been triaged, the product team
               | generally still has a lot of discretion about what to do
                | with it. If it's a major issue and the product team stalls
               | then you end up with major escalations through EVP/legal
               | channels. These conversations can get heated.
               | 
               | - The bugs themselves often lack context and are randomly
               | distributed through the codebase. Most of the time the
               | development teams are busy cranking out new features,
               | burning down tech debt or otherwise have their focus
               | directed to specific parts of the product. They are used
               | to getting things like static analysis reports saying
               | 'hey commit you just sent through could have a sql
               | injection' and fixing it without skipping a beat (or more
               | likely showing its a false positive). When bug reports
               | come in from the outside, the code underlying the issue
               | may have not been touched for literally years, the teams
               | that built it could be gone, and it could be slated for
               | replacement in the next quarter.
               | 
               | - Some of the bugs people find are actually hard to solve
               | and/or the people in the product teams don't fully
               | understand the mechanism of action and put in place basic
               | countermeasures that are easily defeated. This
               | exacerbates the problem, especially if there's an asshole
               | researcher on the other end of the line that just goes
               | out and immediately starts dunking on them on social
               | media.
               | 
               | - Most bugs are just simple human error and the teams
               | surrounding the person that did the commit are going to
               | typically want to come to their defense just out of
               | camaraderie and empathy. This is going to have a net
               | chilling effect on barn burners that come through because
               | people don't want to burn their buddies at the stake.
               | 
               | All of this to say it takes a lot of culture tending and
               | diplomacy on the part of the bounty runners to manage
               | these factors while trying to make sure each side lives
                | up to their end of the bargain. Most of running a bounty
                | is administrative and applied technical security skill;
                | this part is not... which is why I said it can be the
                | hardest.
        
               | OJFord wrote:
               | I think it's the phrases 'some coordination' and 'company
               | the size of Apple'. It's rarely the case (well,
               | hopefully?!) that a fix is as trivial as 'oh yeah, oops,
               | let's delete that `leak_data()` line' - it's going to
               | involve multiple teams and they're all going to think
               | anything from 'nothing to do with us' to 'hm yes I can
               | see how that happened, but what we're doing in our piece
               | of the pie is correct/needs to be so, this will need to
               | be handled by the [other] team'.
               | 
               | Not to say that people 'pass the buck' or are not taking
               | responsibility exactly, just that teams can be insular,
               | and they're all going to view it from the perspective of
               | their side of the 'API', and not find a problem. (Of
                | course with a strict _actual_ API that couldn't be the
               | case, but I use it here only loosely or analogously.)
               | Someone has to coordinate them all, and ultimately
               | probably work out (or _decide_ somewhat arbitrarily - at
               | least in technical terms, but perhaps on the basis of
                | cost or complexity or politics et cetera) whose problem
               | to make it.
        
               | sam0x17 wrote:
               | What's worse, typically an exploit doesn't involve
               | knowledge of the actual line of code responsible -- it's
               | just a vague description of behavior or actions that
               | leads to an exploit, making it much easier to pass the
               | buck in terms of who is actually responsible for fixing
                | it. The kicker is that if your department/project/
                | whatever fixes it, you're also taking responsibility for
                | causing this error / this huge affront to the Apple
                | way...
        
           | black_puppydog wrote:
            | That makes Apple (the org, not the fanboys) sound a bit
           | cultish... Can't say I'm surprised though...
        
             | xondono wrote:
             | It is in SV after all..
        
             | xvector wrote:
             | It _is_ cultish.
        
             | minxomat wrote:
             | There are entire books that romanticise the cult aspect of
             | working at Apple.
        
         | logshipper wrote:
         | Disclaimer: I am not an Apple insider by any means, and this is
         | all a hypothesis.
         | 
         | Their management of the bug bounty program seems like a
         | reflection of their secretive (and perhaps sometimes siloed)
         | internal culture. I'd argue that for any bug bounty program to
          | be successful, there needs to be an inherent level of trust and
          | very transparent lines of communication - seeing as Apple
          | lacks those internally (based on what I've read in reporting
          | about the firm), it is not particularly surprising that their
          | program happens to be run in the shadows, as the OP describes.
         | 
         | I forget the exact term for it, but there is a "law" in
         | management which postulates that the internal communication
         | structures of teams are reflected in the final product that is
         | shipped. The Apple bug bounty seems to be an example of just
         | that.
         | 
          | Edit: It's called Conway's Law
        
       | floatingatoll wrote:
       | It must be nice to give up $100k by being impatient. I do
       | understand that OP probably feels a moral reason to do so, but
       | that $100k would be life-changing for me, even if it took 3 years
       | to pay out.
        
         | Bellamy wrote:
         | Sue Apple and lose another 100k.
        
         | diebeforei485 wrote:
         | There is no $100K coming.
         | 
         | Apple hopes you'll stay silent by dangling a hypothetical $100K
         | (or whatever large amount) in the vague future. Once they've
         | fixed the bug, they no longer have an incentive to pay you so
         | they won't.
        
           | FireBeyond wrote:
           | Haven't they done this in the past? "Oh thank you!" then
           | "Actually we already knew about it and had a fix planned, so
           | no bounty for you"?
        
             | diebeforei485 wrote:
             | Yes.
             | 
             | In some cases when they did pay, they paid significantly
             | less than their published rates.
        
               | moepstar wrote:
               | From the PoV of a security researcher - why even bother
               | disclosing responsibly (moral obligations aside)?
               | 
                | Best-case scenario: you don't get sued into oblivion,
                | you get ghosted and gaslit, and you receive pocket change
                | an arbitrary amount of time later.
               | 
                | Compared to that, I suppose the exploit brokers have
                | their stuff together - after all, time _is_ money -
                | chances are someone else may stumble upon the same
                | vulnerability...
        
               | floatingatoll wrote:
               | If the payout is higher priority to you than the ethics
               | of selling an exploit that governments around the world
               | will end up using to hunt and capture or kill political
               | dissidents, then you are of course free to sell it on the
               | exploit market :) I prefer to sleep at night, though.
        
           | babesh wrote:
           | I think the behavior is very Russian.
           | 
           | Hacker: You have a vulnerability bounty program. Well here
           | are three. Pay up.
           | 
           | Apple: [silence]
           | 
           | Hacker: [interprets this correctly as a fuck you.] Fuck me?
           | Fuck you!
           | 
           | Me: Love it!
        
           | floatingatoll wrote:
            | You're claiming that they maliciously lie and refuse to pay
            | out because, based on OP, they screwed up on release notes
           | and didn't get it solved within the 90 day crunch period
           | between WWDC and release.
           | 
            | It took so little evidence for you to decide it's hopeless
            | and declare your prediction as fact. Maybe you felt this way
           | _before_ this post? Otherwise I'm just not sure how to
           | respond.
        
           | floatingatoll wrote:
           | Seems more likely it'll just take 3-4 years with months of
           | silence at a time, based on the extremely few security Radars
           | I've ever filed as a developer. 90 days to publication is
           | certainly a valid choice, but it's also a personal choice
           | that reduces a probable $100k payment in X years to a certain
           | $0 payment today. I would be fine with that delay. OP is not,
           | and that's fine too. I don't know whether that's an
           | acceptable choice or not to anyone else, but Apple should be
           | disclosing their communication practices a lot more clearly
           | here. I discourage participation by anyone who isn't willing
           | to wait a year between replies.
        
         | FireBeyond wrote:
         | Given that (according to the author) they've already lied at
         | least twice ("processing error, will be in next release")...
         | 
          | ... what gives you such high hopes that he will ever get his
          | $100K?
        
         | asteroidbelt wrote:
         | Don't count someone else's money.
        
         | 93po wrote:
         | This fellow has a lot more to gain than $100k by the popularity
         | and prestige he'll gather from publishing this. Especially
          | considering that Apple will never change their ways until
          | they're publicly shamed, the long-term outcome of shaming them
          | is worth more than $100k if they actually change their
          | policies to take security researchers and the bug bounty
          | seriously.
        
           | floatingatoll wrote:
           | I would not consider Apple particularly concerned about shame
           | in regard to bounty program delays in communication and
           | publication, no matter how much people try.
        
             | 93po wrote:
             | I agree, but the shame of getting 0-day exploits published
             | on the web by someone who doesn't work at Apple might shame
             | them enough to change.
        
       | kybernetyk wrote:
       | At least the Game Center one is something an app developer could
       | easily stumble upon. I don't want to know how many apps are
       | already exploiting this.
        
         | illusionofchaos wrote:
         | That's exactly how it happened for me. I noticed that when an
         | app logs into Game Center, the notification is shown inside the
         | app, and not in a remote process like when you choose contacts
          | or compose an email. That led to easily discovering everything
         | else.
        
       | 1vuio0pswjnm7 wrote:
        | With these Apple-related vulnerability announcements on HN, we
        | usually see a response from a satisfied Apple owner along the
        | lines of "This is fixed in [some new version number]". The thing
        | is, the problem isn't whether something is fixed, it's that it
        | was broken to begin with. It passed "QA" at a trillion dollar
        | company and it's a pre-installed fixture^1 on some relatively
        | expensive hardware item. If there is such an "it's fixed"
        | response, it usually rises to the top comment. The underlying
        | message seems to be, "No worries. You can safely forget about
        | this and keep loving Apple hardware running the crappy but
        | user-friendly, irremovable (work-in-progress) software that
        | comes pre-installed on it."
       | 
        | How long till this "It's fixed" comment appears? It might come
        | in a different submission. For some folks, Apple can do no
        | wrong. No amount of truth can change their views. The only issue
        | to these folks is "fixing"; they are content to use software
        | that is a WIP but marketed as ready for primetime, and to
        | dismiss any ideas about using software that is considered
        | finished.
       | 
        | The best place for important data is external media, not a
        | "smart" phone running an OS and software that the user does not
        | have control over. "Everyone else is doing it" doesn't transform
        | a stupid practice into a smart one; it just means no one will
        | criticise the choice. That of course also opens the door for
        | those following the herd to attack anyone who dares to suggest
        | deviating from the mainstream, because "how can the majority of
        | people be wrong".
       | 
       | 1 The purchaser cannot remove it and install their own
       | replacement.
        
         | zibzab wrote:
         | Consider also time of introduction (of the vulnerability) to
         | time of patch.
         | 
         | We have now seen bugs that have been around for 4-5 years
         | before being discovered by ethical hackers and subsequently
         | patched.
        
         | Mindwipe wrote:
         | That would be a nice problem to have.
         | 
          | No, the issue is that, despite disclosure, two of them still
          | aren't fixed.
        
         | shepherdjerred wrote:
         | The only bug-less software is software that was never written.
         | 
         | Please point me to a consumer OS that doesn't have security
         | vulnerabilities.
        
           | Krasnol wrote:
           | The joke here is that they advertise security.
           | 
           | This is their main claim atm.
        
         | azinman2 wrote:
         | No os/mobile platform is free of security bugs. No amount of
         | "QA" will be enough. Just look at the number of vulnerabilities
         | literally any OS has, or even any component such as Chrome or
         | Safari.
         | 
         | It is a shame that the author didn't get replies in time and
         | felt the need to disclose. I'm sure it'll at least get quickly
         | patched now.
        
       | smoldesu wrote:
       | I really wonder how different the mobile security landscape would
       | look if researchers were treated seriously. This is a mistake
       | that Google, Apple and even Microsoft (for the short time they
       | made phones) all made, and now we have to live with the
        | consequences of that. Along with our "post privacy" world, we
        | may as well mark this as the epoch of "post security". It just
        | depends on how much someone is willing to spend to engineer one
        | (or three) zero-days against your target's machine. This used to
        | be merely _feasible_, but recent ransomware and surveillance
        | malware has proven that it's commoditized now.
        
         | krater23 wrote:
          | I never thought that hacking like in Star Trek would ever
          | become reality. But it looks like we are slowly heading
          | there...
        
         | bcook wrote:
         | >This is a mistake that Google, Apple and even Microsoft (for
         | the short time they made phones) all made, and now we have to
         | live with the consequences of that.
         | 
         | What consequences are we living with?
        
           | bottled_poe wrote:
           | Fear of data leaks for one. That's never going to be zero,
           | but it could be a lot better than it is.
        
       | MrGando wrote:
        | Obscure ad-tech companies would love to get the list of installed
        | apps, like they (Facebook & Twitter too) were aggressively doing
        | about 5 years ago.
        
         | asteroidbelt wrote:
         | Facebook and Twitter exploited 0-day security vulnerabilities
         | to collect private data?
         | 
          | Can you show a proof link, please?
        
           | robterrell wrote:
           | I believe the poster was referring to the practice of testing
           | if an app was installed by calling [UIApplication
           | canOpenURL:] --- as I recall Twitter was found to check for
           | hundreds of different apps, to feed into their ads and
           | analytics business, and Apple later changed it to only be
           | callable 50 times.
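            | 
            | For illustration, a probe along those lines might look
            | roughly like this (a sketch; the scheme list is made up):
            | 
            |     import UIKit
            | 
            |     // Each custom URL scheme that resolves betrays an
            |     // installed app. Since iOS 9, every scheme probed this
            |     // way must also be declared under
            |     // LSApplicationQueriesSchemes in the app's Info.plist.
            |     let probes = ["fb", "twitter", "spotify", "whatsapp"]
            |     let installed = probes.filter { scheme in
            |         guard let url = URL(string: "\(scheme)://") else {
            |             return false
            |         }
            |         return UIApplication.shared.canOpenURL(url)
            |     }
            |     print("Installed (by scheme):", installed)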
        
             | saagarjha wrote:
             | Nit: for 50 different schemes, not 50 times.
        
               | MrGando wrote:
               | And you have to provide those schemes in your App's plist
               | as well if I recall correctly. Gated pretty hard.
        
       | tumblewit wrote:
        | I really hate the path Apple is taking. They make excellent
        | products; really, the average Joe simply loves Apple products.
        | But they need to stop acting anti-consumer and anti-developer to
        | "protect" their IP. At this point they could release the
        | schematics of the iPhone 13 and people would still buy Apple's
        | iPhone rather than one from someone who copied them. Rant over.
        
         | someguydave wrote:
         | Actually, these vulnerabilities are good evidence that Apple
         | does not make excellent products.
        
           | xyst wrote:
           | Probably means they make "good looking" products. But
           | underneath the hood, it's spaghetti.
        
       | neha1989 wrote:
        | Apple might have left these vulnerabilities for Pegasus-like
        | software, including the software used by the FBI and other
        | agencies.
        
         | samhw wrote:
         | I don't know why you're being downvoted. My mind immediately
         | went to this, given the relative obviousness of the exploits
         | (compared to the massive supercomputer fuzzing runs done by
         | Project Zero &c), and given their refusal to either document or
         | fix these serious vulnerabilities in several releases.
        
       | fortran77 wrote:
       | I wonder if the culture of leaking and dissent within Apple is
       | enabling information to leak which assists the 0day authors?
        
         | dylan604 wrote:
         | did you mean 0day exploit authors? otherwise, wouldn't the
         | actual authors of the 0day be Apple employees on the iOS team?
        
         | rastafang wrote:
         | They don't need leaks to find vulnerabilities... although the
         | source code would probably help.
        
       | aetherspawn wrote:
        | If your annual revenue is above $100M, you should be held
        | accountable to a strict version of GDPR, enforced by an
        | ombudsman, that requires you to patch all data-leaking
        | vulnerabilities within 90 days or pay out everyone who bought
        | your product.
       | 
       | I just updated to iOS 15 and it now tells you which sites you
       | have been compromised on, or had your passwords/info compromised
       | on. To be clear, I use a password manager with a unique password
       | on every site, so it is difficult for something like this to have
       | a significant impact. Nethertheless, I was compromised on
       | hundreds of sites and products, and those were just the accounts
       | that iOS Safari knew about. None of them bothered to reach out
       | and tell me. Even my coffee machine was compromised. Ridiculous.
        
         | Cipater wrote:
         | Hang on, you have a coffee machine that is capable of being
         | compromised? How exactly?
         | 
         | Further to this, you claim that you have been compromised on
         | HUNDREDS of sites even though you use a unique password
         | everywhere?
         | 
         | How is this happening to you? Isn't this a huge concern?
        
           | rablackburn wrote:
            | Right, it's hard to compromise a kettle and a manual
            | grinder. The only thing I could possibly consider a benefit
            | of a networked coffee machine is that you can schedule it /
            | script it with Home Assistant.
           | 
           | But even then, I'm pretty sure you can buy simple electric
           | ones with timers...
        
             | consp wrote:
             | Only if it is HTCPCP(-TEA) compatible. Otherwise they
             | should not bother.
             | 
             | /s, since it is not the specialized coffee machine on the
             | office floor which is the biggest problem but the thousands
             | of ones at home where people do not even bother to firewall
             | it.
        
           | Hackbraten wrote:
           | > Further to this, you claim that you have been compromised
           | on HUNDREDS of sites even though you use a unique password
           | everywhere?
           | 
           | It's not too far-fetched.
           | 
            | I've been compromised on dozens of sites out of ~1,500 sites
            | on which I have accounts, all of them with unique email
            | addresses and unique passwords. Those dozens of accounts are
            | just the ones I happen to know about (through HIBP, incoming
            | email spam, or the occasional site owner's disclosure), so
            | they're probably just the tip of the iceberg.
           | 
           | Sites are being breached left and right. If you're lucky, the
           | site owner tells you. Many won't.
        
           | wil421 wrote:
            | When I was looking for an espresso machine, a lot of them
            | had touch screens and connected features. They will wake up
           | before you get out of bed and have hot water ready. I
           | specifically bought one without touch screens and all that
           | crap. It takes maybe 30 seconds for the water to heat up.
           | 
           | Lots of people will get their regular coffee maker ready the
           | night before with water and ground beans. At a specific time
           | in the AM it will brew. Lots of old models have timers you
           | can set. I avoid smart devices like the plague. My Bosch
           | fridge is a smart fridge and I plan on putting it on a VLAN.
        
             | 0xffff2 wrote:
             | >My Bosch fridge is a smart fridge and I plan on putting it
             | on a VLAN.
             | 
             | As another new owner of a Bosch fridge, why put it on
             | anything at all? I just peeled the sticker that told me how
             | to connect off, threw it in the trash, and treat it just
             | like my old non-connected fridge. Is there actually some
             | beneficial feature that makes it worth connecting at all?
        
               | wil421 wrote:
               | The right french door is hard to close compared to our
               | last side by side fridge. For now it helps me remember to
               | close it when I'm in a different room and can't hear the
               | alarm.
        
               | lowbloodsugar wrote:
               | That is hilarious.
               | 
               | Problem: Right french door is hard to close and user
               | often leaves it open. User does not hear alarm.
               | 
               | Solution 1: Make french door easier to close / close
               | automatically.
               | 
               | Solution 2: Make alarm louder. Adjustable even.
               | 
               | Solution 3: Add networked computer, software, mobile
               | phone application, and wire them all up.
        
               | wil421 wrote:
                | It's caused issues with my wife thinking the kids' milk
               | has gone bad because the door was ajar. Trust me when I
               | say peace of mind and possibly getting hacked is much
               | better than an angry wife plus hungry kids.
               | 
               | French door fridges have this flat piece that slides
               | behind one of the doors to make it airtight. Our previous
               | place had a Samsung that did the same thing, except it
               | was the left door and easier to shut.
               | 
               | It's made by Germans, and they don't always make things
               | usable or even serviceable (see: German cars).
        
               | lowbloodsugar wrote:
               | I apologize, I did not mean to disparage your solution
               | for a purchase you already made. I was cynically
               | imagining the thinking of the manufacturer! Also, yes, we
               | have a fridge with french doors, so I understand the
               | problem. Ours just has a really effective alarm. Have you
               | tried sticking two small wedges under the front feet so
               | that the doors get a little more momentum when swung
               | closed? That was actually in our manual. (Yes I read my
               | fridge's manual).
        
         | kf6nux wrote:
         | Aren't Ombuds generally limited to investigations and
         | recommendations (not enforcement)? What country colors your
         | context? (e.g. I'm in the US where federal Ombuds aren't much
         | of a thing, though similar roles may be filled by other
         | persons).
        
         | whoknowswhat11 wrote:
         | God, I would HATE it if the US followed the EU with this
         | craziness. I'm already sick of the cookie popups; now layer on
         | the GDPR insanity and we will definitely lose the privacy
         | fight to users who will be sick of this nonsense as well.
         | 
         | I've seen studies that show crap like GDPR (which makes
         | basically all normal interaction cumbersome) has like 10% of
         | folks clicking around to "opt-out" while 90% can't be bothered.
         | And of course, you COULD just clear your own cookies.
         | 
         | There is no more real security in the EU. Your mental health
         | records will be leaked there. The EU will spy on you like
         | crazy. And more.
        
           | cmeacham98 wrote:
           | GDPR cookie consent banners that make it more difficult to
           | opt out than opt in are illegal, and only continue to exist
           | because the GDPR is poorly and inconsistently enforced.
        
             | rndgermandude wrote:
             | Correct. The vast majority of such cookie banners you see
             | are illegal according to the GDPR, i.e. whenever you see
             | one that doesn't have equally prominent "Accept"/"Reject"
             | options next to each other.
             | 
             | The only reason these illegal banners still get used is a
             | lack of enforcement. Right now, the enforcement process is
             | rather slow, which is in part due to all this stuff being
             | "new" (the cookie ePrivacy is technically from 2009
             | already, but regulatory bodies with a clear focused mandate
             | to enforce infractions only really came into existence with
             | the GDPR) and thus regulatory bodies and sometimes courts
             | still trying to figure out the legal details, and acting
             | slow and (overly) cautious in order not to embarrass
             | themselves by issuing fines that are later thrown out in a
             | high court. (And then there is Ireland...). And more
             | generally, the law is rather slow regardless; the time it
             | takes to conclude any "important" case is measured in
             | years, and sometimes decades.
             | 
             | There are civil organizations such as noyb[1] trying to get
             | things going and "nudge" regulators into action, but even
             | with that it will be a few more years at least until the
             | legal questions around "what is an acceptable cookie
             | banner" are settled.
             | 
             | [1] https://noyb.eu/en/noyb-aims-end-cookie-banner-terror-
             | and-is...
        
             | thatswrong0 wrote:
             | Most of the cookie consent banners I see are illegal in
             | that case...
        
               | maccard wrote:
               | And how many have you reported to your local data
               | commissioner?
        
               | rcMgD2BwE72F wrote:
               | That's right.
        
             | throwawayswede wrote:
             | That's the point. GDPR without good enforcement is useless
             | and meaningless. I'd even argue that until something is
             | done about enforcement, if ever (something more than a
             | random fine, which is just considered a cost of doing
             | business), all GDPR has done since it took effect is allow
             | these companies to come up with more elaborate ways to
             | scam (I'm looking at the assholes who work at, run, or are
             | remotely involved with trustarc.com).
        
             | Reventlov wrote:
             | Cookie consent banners have nothing to do with GDPR, but
             | with the ePrivacy directive. GDPR clarifies what "consent"
             | means, but it is not what led to the proliferation of
             | cookie banners.
             | 
             | Please note that if you only have strictly necessary
             | cookies, you don't need a cookie banner, and if your
             | cookies are anonymous, you don't need one either!
             | 
             | The proliferation of cookie banners just means that people
             | running such websites are usually terrible with regards to
             | consent, personally identifiable information, and so on.
        
               | cmeacham98 wrote:
               | Non-essential cookies are personal data as regulated by
               | the GDPR. It is true the EPD started the cookie consent
               | popup craze though.
        
               | RamblingCTO wrote:
               | Finally, at least one person in this thread who
               | understands that GDPR is not cookie banners. I mean WTF,
               | we're on hacker news. Oh wait, yeah, we're on hacker
               | news.
        
           | mmarq wrote:
           | The only interactions made cumbersome by GDPR are those with
           | organizations that abuse their users/customers' data.
           | 
           | Otherwise you don't even need a cookie banner.
        
             | tonypace wrote:
             | So, all of them.
        
               | neon_electro wrote:
               | Given that, is the law bad, or is the default behavior of
               | companies?
        
               | Nextgrid wrote:
               | Doesn't mean the law is bad.
        
           | howaboutnope wrote:
           | > crap like GDPR (which makes basically all normal
           | interaction cumbersome)
           | 
           | Only if you count "tracking users on first visit, before
           | they do anything else" as normal. Otherwise, no banner is
           | needed; sites could simply put a link to opt in to tracking
           | in the header or footer, and not track unless the user opts
           | in.
           | 
           | This is like passing a law making it illegal to just hit
           | people in the street, requiring that you ask them for
           | consent first. So most of the people who want to hit others
           | come up with some gish gallop that most people fall for,
           | and then hit them.
           | 
           | And people bitch about the _law_, and claim it "makes it
           | necessary for people to chew off the ear of other people
           | they pass by in the streets"... with a straight face, that's
           | what they twist it into, with an air of indignation even...
           | and not just for a few weeks, until they read up and the
           | initial misunderstandings are cleared up, but year in and
           | year out, because they _never_ read up, and the falsehoods
           | you just posted keep getting repeated.
        
             | filleokus wrote:
             | >> crap like GDPR (which makes basically all normal
             | interaction cumbersome)
             | 
             | GDPR does make a lot of things cumbersome, not only if you
             | are doing "bad" things.
             | 
             | Remember that GDPR covers information gathered and stored
             | on paper as well. And it covers not only companies but also
             | organisations, like children's soccer clubs.
             | 
             | So let's say you have a printed list where kids and their
             | parents sign up with names and phone numbers; you should
             | probably have a data integrity policy and someone akin to
             | a DPO. In your small non-profit soccer club!
             | 
             | (My problem with GDPR is that it doesn't really, at least
             | so far, hinder the worst trackers, but incurs large costs
             | all across society, even where handling personal data
             | isn't really a problem)
        
               | howaboutnope wrote:
               | > GDPR does make a lot of things cumbersome, not only
               | if you are doing "bad" things.
               | 
               | That's a far cry from "they make these cookie banners
               | necessary". Tracking people without consent on first
               | visit is what makes them necessary. The anger should
               | consistently be directed at the people who violate the
               | boundaries of others, not at the law that requires
               | consent for it.
               | 
               | > So let's say you have a printed list where kids and
               | their parents sign up with names and phone numbers; you
               | should probably have a data integrity policy and someone
               | akin to a DPO. In your small non-profit soccer club!
               | 
               | "We'll ask them if it's okay to store it, and once they
               | leave the club we delete their contact information after
               | N months." Now you have a policy. The person who does
               | everything else, the person who is already secretary,
               | receptionist, accountant, project manager, janitor,
               | coach, counselor, CEO, is now _also_ the PDO.
               | 
               | Human rights being trampled on by an ever-increasing
               | mesh of surveillance by big agencies and corporations as
               | well as little informants is such a gross violation,
               | such a terrible trajectory to put society on, that mere
               | _complication and discomfort_ is not something that can
               | ever trump it in my book. I would even say if you can't
               | put food on the table without ignoring the human rights
               | of others, just don't put food on the table -- because
               | that's the negotiable part, while the preservation of
               | human rights is not. We _need_ human rights, we don't
               | _need_ ad-hoc low-effort soccer clubs. Like, at all.
               | Just get a ball and some friends in that case.
        
               | matthewmacleod wrote:
               | _So let's say you have a printed list where kids and
               | their parents sign up with names and phone numbers; you
               | should probably have a data integrity policy and someone
               | akin to a DPO. In your small non-profit soccer club!_
               | 
               | Yes! You should!
               | 
               | This is the same as if your small, non-profit club deals
               | with dangerous chemicals - it needs to make sure that the
               | appropriate risk assessments are done, and safety
               | information is available to users. Or any club dealing
               | with children - it may need to make sure that the people
               | have an appropriate background check.
               | 
               | Likewise, holding personal data is a risk to the people
               | whose data is held. If you want to hold on to that data,
               | your responsibility should include making sure that it is
               | stored and used safely. If you don't want to pay that
               | cost, then stop holding it.
        
               | filleokus wrote:
               | Your view is of course fully valid, and probably the view
               | reflected in the GDPR legislation.
               | 
               | To use your metaphor of chemicals:
               | 
               | I see the current situation as if the soccer club is
               | handling a 1L container of consumer-grade vinegar
               | weedkiller, and is required to do pretty cumbersome
               | things to document their use and keep it "safe". Many of
               | them have consulted some firm or expert to get boiler-
               | plate documentation, because even if fines are unlikely
               | they are anxious about them.
               | 
               | At the same time, we have enormous commercial actors that
               | handle millions of liters of radioactive wastewater in
               | rusty containers. These companies have, for sure, spent a
               | lot of money on "compliance". Some small improvements
                | have surely been made, but the fundamental business
                | practice among these actors of handling radioactive
                | wastewater has not changed. Some "large" fines have
                | been given, but they barely make a dent in the enormous
                | profitability of handling these toxic things.
               | 
               | At least not yet, 3 years in. Maybe it will change in the
               | future, and the big actors will fundamentally change
               | their behaviour.
               | 
               | If that happens, I can agree that the weedkiller
               | documentation is worth the cost, but so far I'm
               | sceptical.
               | 
                | (Since this is an Apple thread, I think it's
                | interesting to compare the _real_ privacy gain of GDPR
                | as a whole vs Apple's simple tracking popup)
        
             | tonypace wrote:
              | You cannot claim that GDPR is a good law, not with the
              | galaxy-sized loophole where you can track the vast
              | majority of people just like before, as long as you annoy
              | them first.
             | 
             | Better than nothing, sure, but not good.
        
               | howaboutnope wrote:
               | I didn't say it's perfect, but it's surely better than
               | it's presented by people who get _nothing_ about it
               | right.
        
               | Nextgrid wrote:
               | The GDPR explicitly disallows "annoying" users into
               | granting consent. Here are the ICO guidelines about it:
               | https://ico.org.uk/for-organisations/guide-to-data-
               | protectio...
               | 
               | If you annoy users into clicking "accept" then you are in
               | breach and may as well just track users without asking at
               | all.
        
             | willio58 wrote:
             | What's the incentive for a user to opt-in to tracking?
        
               | bogwog wrote:
               | Some people claim that they want personalized ads (at
               | least on HN and similar communities)
               | 
               | I've never met anyone in real life who wasn't creeped out
               | by a targeted ad. Everyone nowadays has a story about how
               | they were having a conversation with someone about
               | something, and then one of their devices served them an
               | advertisement for the thing they were talking about.
               | 
               | Everyone finds that creepy as hell, but it's hard to
               | attribute it to any particular device/company, and even
               | if you find out that it's your Alexa that's spying on
                | you, it's hard to throw it out. Not only is it a waste
                | of money, it's a loss of convenience, and there's some
                | social pressure involved. Like, do you really want to
                | be seen as that one paranoid weirdo who doesn't trust
                | technology that everyone else is using?
        
               | snovv_crash wrote:
               | I mean, isn't that kind of the point?
        
           | arghwhat wrote:
           | The cookie popup is something else, and the result of
           | misinterpreted legislation and companies trying to cry foul
           | that they can't do whatever they want anymore. It's not at
           | all needed for common cookie usage - just the unexpected
           | soul-selling kind.
           | 
           | GDPR restricts companies from using data however they want,
           | lets you obtain a copy of your data, and lets you require
           | that it be deleted. It also requires companies to document
           | their processing, provide a responsible contact person, etc.
           | 
           | It only benefits you as a consumer, and the downside in
           | proper uses is that the webpage/app might have a page
           | somewhere with data processing information should you be
           | curious. Nothing major.
           | 
           | Any nuisance comes from poor implementations, or from
           | companies no longer being able to do terrible unexpected
           | things, like selling your data to hundreds of companies
           | just because you accessed a page -- nonsense no one would
           | expect or want.
           | 
           | And with everyone clicking "no" (which must be at least as
           | easy as clicking "yes" as determined in court), the practice
           | would die eventually.
        
           | arvindamirtaa wrote:
           | Is there a point to this...? Or did you just want to crap on
           | GDPR?
           | 
           | Not saying it's good or bad. But just...relevance?
        
       | xvector wrote:
       | With such a successful privacy marketing campaign, Apple doesn't
       | really need to care about security anymore to make boatloads of
       | money. The public is misled into thinking iPhones are secure and
       | news outlets won't report on these minutiae.
        
       | jimmont wrote:
       | Technology today on the software side is broadly at the place
       | medicine was in the era of elixirs and potions. Given the
       | variability in medical practice across the US, maybe it's still
       | a prevalent pattern in a different form.
        
         | raspasov wrote:
         | 100%. Perhaps medicine is not that far ahead either. There are
         | a number of drugs (esp. the ones that deal with the brain)
         | where we observe the effects but the mechanism of action is
         | not well understood.
        
       | jonahx wrote:
       | Are there any partial mitigations you can take until these are
       | patched?
        
         | soheil wrote:
         | Immediately delete any apps you don't want others to know you
         | downloaded or to be linked to you. If your wifi name is for
         | some reason something sensitive, rename it. The address
         | book/sms one is tricky; maybe make a backup of your iPhone
         | and, if you're truly paranoid, delete all your contacts and
         | sms messages and restore them when Apple releases a patch?
         | 
         | This is truly a massive fail on the part of Apple and I hope
         | there is as big of a backlash from their users.
        
         | babesh wrote:
         | Don't update your apps till after Apple releases a patch. The
         | first two are API calls that apps can make.
         | 
         | An app wishing to exploit these vulnerabilities has to be
         | coded to make these calls. Most apps don't dynamically
         | construct arbitrary API calls. In fact, you can't do that in
         | Swift AFAIK. You have to drop to Objective-C or C to do that.
         | 
         | So most apps would need to be updated to exploit the
         | vulnerabilities. The only exceptions would be apps that are
         | intentionally constructed to call arbitrary APIs, or at least
         | to call them with arbitrary parameters. The first would be a
         | violation of developer agreements, but that hasn't stopped
         | people in the past. Also, these aren't even private APIs.
         | These are public APIs that got exploited due to not properly
         | checking parameters/entitlements.
         | 
         | I wonder if Apple isn't running static analysis tools right now
         | to look for these vulnerabilities against all apps.
        
           | pjmlp wrote:
           | The whole point of Swift is to be next generation Objective-C
           | and C on Apple platforms, no need to drop down to other
           | languages.
           | 
           | In fact, the proofs of concept shown in the article are all
           | written in Swift.
        
             | babesh wrote:
             | I wasn't clear. It is dynamically constructing an API call
             | that Objective-C allows. The objc_msgSend stuff.
        
               | pjmlp wrote:
               | Which you can call directly from Swift.
        
               | babesh wrote:
               | You are right.
               | 
               | https://steipete.com/posts/calling-super-at-runtime/
        
               | illusionofchaos wrote:
               | Look at the code of the gamed exploit that I've uploaded
               | to GitHub: the app is written in Swift and calls
               | Objective-C runtime functions directly.
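               | 
               | Roughly, the pattern looks like this -- a minimal
               | sketch with placeholder class and selector names, not
               | the actual gamed API:
               | 
               |   import Foundation
               | 
               |   // Build and send an Objective-C message at runtime.
               |   // The strings could be assembled or fetched, so they
               |   // need not appear anywhere in the binary.
               |   func dispatch(_ clsName: String, _ selName: String) {
               |       guard let cls = NSClassFromString(clsName)
               |               as? NSObject.Type else { return }
               |       let obj = cls.init()
               |       let sel = NSSelectorFromString(selName)
               |       guard obj.responds(to: sel) else { return }
               |       // Goes through objc_msgSend under the hood
               |       _ = obj.perform(sel, with: "attacker-chosen arg")
               |   }
               | 
               |   // hypothetical names:
               |   dispatch("SomeDaemonProxy", "handleRequest:")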
        
           | saagarjha wrote:
           | It's pretty trivial to encode a backdoor into your app that
           | would let you remotely call native code of your choice.
        
             | yccs27 wrote:
             | I guess this is the reason Apple restricts apps from
             | executing downloaded code.
        
               | saagarjha wrote:
               | This is without downloading additional code. Reuse
               | attacks such as ROP, or you could just embed an
               | interpreter with the ability to alter native register
               | state. It's not hard to get Turing completeness into your
               | app in a way that lets it call whatever it wants.
        
               | babesh wrote:
               | Yeah, it wouldn't be too hard to write an interpreter.
               | It's a lot like a compilers class project.
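               | 
               | A minimal sketch of the idea (all names hypothetical):
               | ship an innocuous dispatch loop, then feed it
               | "instructions" from a server after review.
               | 
               |   import Foundation
               | 
               |   // Each "instruction" is "ClassName selectorName",
               |   // fetched at runtime -- no new native code is ever
               |   // downloaded, yet arbitrary APIs get called.
               |   func interpret(_ script: [String]) {
               |       for line in script {
               |           let parts = line.split(separator: " ")
               |                           .map(String.init)
               |           guard parts.count == 2,
               |                 let cls = NSClassFromString(parts[0])
               |                     as? NSObject.Type else { continue }
               |           let obj = cls.init()
               |           let sel = NSSelectorFromString(parts[1])
               |           if obj.responds(to: sel) {
               |               _ = obj.perform(sel)
               |           }
               |       }
               |   }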
        
           | worrycue wrote:
           | > I wonder if Apple isn't running static analysis tools right
           | now to look for these vulnerabilities against all apps.
           | 
           | On a side note, this is one more reason Apple can cite for
           | their App Store exclusivity. If there is a vulnerability in
           | the OS exploitable by apps, and they can't get a patch out in
           | time, they can screen and prevent the download of such
           | dangerous apps.
           | 
             | Not a popular position here, I know. But I'm correct, no?
        
             | babesh wrote:
             | No. Those static analysis tools don't catch everything.
             | There are relatively well known and somewhat widespread
             | tricks to avoid being caught by them.
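             | 
             | One classic trick, as a minimal sketch (illustrative
             | bytes, nothing specific to any real scanner): assemble
             | the suspicious selector at runtime so it never appears
             | whole in the binary.
             | 
             |   import Foundation
             | 
             |   // The bytes XOR-decode to "init" with key 0x42; a
             |   // real payload would hide a more interesting selector.
             |   let encoded: [UInt8] = [0x2B, 0x2C, 0x2B, 0x36]
             |   let selName = String(bytes: encoded.map { $0 ^ 0x42 },
             |                        encoding: .utf8)!
             |   let sel = NSSelectorFromString(selName)
             |   // ...then dispatch via perform(sel) as usual.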
        
               | babesh wrote:
               | I speculate that GameKit is basically abandonware at
               | Apple. They even got rid of the Game Center app a few
               | years ago.
               | 
               | There probably hasn't been hardening of it in years and
               | the initial work was probably developed in haste.
               | 
               | This is systemic. Apple has a bad habit of abandoning
               | software that isn't a priority. So, one shouldn't be
               | surprised that Apple hasn't fixed these exploits. And I
               | wonder if the author has fully mined GameKit for exploits
               | yet. Perhaps there are more to be found.
               | 
               | The architecture of iOS and OSX isn't conducive to
               | security AFAIK. As one can see, security is more of an
               | add-on instead of being architected in.
        
               | illusionofchaos wrote:
               | I haven't checked further; maybe the authentication
               | token can be used to gain access to the Apple account
               | and more data. Also, one other method could be used to
               | write arbitrary data outside of an app sandbox, which
               | might be useful for further exploitation.
        
               | worrycue wrote:
               | Catching some is better than catching none. Apple will be
               | evolving their analysis tools too as they go along.
        
       | throwawayswede wrote:
       | As bad as this is for people who use iOS, I think it's good in
       | the long run.
       | 
       | People here are getting bogged down in details about how it is
       | possible for this to happen and what sort of internal policies
       | Apple has that make it possible, but that doesn't really
       | matter. Any company even 10% the size of Apple should not be
       | given the benefit of the doubt, because obviously they'd all
       | prefer to avoid the major/minor embarrassment if they can.
       | Bounty programs exist not only because companies care about the
       | security of their customers, but also as a way to promote the
       | company as security-conscious and avoid having 0days sold on
       | the black market.
       | 
       | But to overcome this you can just continue publishing 0-days
       | straight to the public. Put really easy-to-use source code on
       | github/bucket/srht/etc., allowing script kiddies and copy-
       | pasters to make use of them easily. This will either drive
       | people to lose trust or force Apple to scramble to release
       | fixes; either way it will push them to respect researchers and
       | fix their bounty program, or to set up better security
       | guidelines in general.
       | 
       | Props to the author for following through and releasing.
        
       | exabrial wrote:
       | Bug Bounty programs represent everything Silicon Valley hates:
       | Interaction with customers and relationship management. They are
       | _forced_ to actually communicate and respond to people and can't
       | get away with just treating them like cattle.
        
       | speedgoose wrote:
       | Why did anyone at Apple decide that it was acceptable to log
       | medical data in such an unsafe way?
       | 
       | I currently work at an IT health care company in Europe, and we
       | must always store the data fully encrypted with strict access
       | control. We even decided not to persist any medical data on
       | user devices, to avoid taking unnecessary risks. And there,
       | Apple logs everything on the iPhone? Why?
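       | 
       | For contrast, a minimal sketch of what "fully encrypted"
       | storage can look like (CryptoKit; key management via the
       | Keychain/Secure Enclave elided):
       | 
       |   import CryptoKit
       |   import Foundation
       | 
       |   // Encrypt before persisting: a sandbox escape that reads
       |   // the file gets only AES-GCM ciphertext plus an auth tag.
       |   func persist(_ record: Data, key: SymmetricKey,
       |                to url: URL) throws {
       |       let sealed = try AES.GCM.seal(record, using: key)
       |       // .combined is non-nil for the default nonce size
       |       try sealed.combined!.write(to: url)
       |   }
       | 
       |   func load(from url: URL, key: SymmetricKey) throws -> Data {
       |       let box = try AES.GCM.SealedBox(
       |           combined: try Data(contentsOf: url))
       |       return try AES.GCM.open(box, using: key)
       |   }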
        
         | saagarjha wrote:
         | Isn't it better to persist medical data on the device rather
         | than putting it on Apple's servers?
        
           | jonathanstrange wrote:
           | What's funny about it is that apparently some of their
           | WatchOS/device combos have FIPS 140-2 and FIPS 140-3
           | certifications. Pretty useless security theatre if you then
           | shuffle the data around to other operating systems or into
           | arbitrary servers with complex infrastructures.
        
           | cybrox wrote:
           | Not if your API design is so bad that any third party can
           | access the data, apparently.
           | 
           | Aside from that, I'm pretty sure they'll also get stored on
           | their servers, if you have not declined all the nagging
           | iCloud sync requests.
        
         | evercast wrote:
         | I have some doubt as to whether what the author claims is
         | "medical data" is indeed medical. Practically speaking, the
         | data he mentions seems like the things collected by Apple
         | Watch and stored in the Health app. There is indeed heart rate
         | tracking, but can we really label this data as medical? IMHO
         | "medical" would relate more to professional diagnosis,
         | treatment etc. which according to Apple is stored in an
         | encrypted form [1]. Garmin devices also collect heart rate,
         | sleep stats etc. and I have never thought of these as medical
         | (health-related yes, but not medical). The line is thin though.
         | 
         | Since you work in the industry, perhaps you could share your
         | opinion on how such data should be treated?
         | 
         | [1] https://www.apple.com/healthcare/health-records/
        
           | the_errorist wrote:
           | > menstrual cycle length, biological sex and age, whether
           | user is logging sexual activity, cervical mucus quality, etc.
           | 
           | These are hardly data collected by Apple Watch, unless
           | someone is being inventive with one. These come from
           | HealthKit. Which is alarming as HealthKit can also sync your
           | EHR from health providers.
        
           | chmod775 wrote:
           | Diagnostic data is a category of medical data. So yes, that
           | stuff is considered medical data.
        
           | kube-system wrote:
           | Even if "health data" and "medical data" aren't synonymous,
           | it's a distinction without a difference to their privacy
           | importance.
        
           | lazka wrote:
           | According to the GDPR, health data is a special category
           | that needs extra care, and heart rate falls in that
           | category:
           | 
           | "Information derived from the testing or examination of a
           | body part or bodily substance"
        
             | the_errorist wrote:
             | "cervical mucus quality" sounds like it fits that
             | definition.
        
       | b8 wrote:
       | Isn't this why most researchers just sell their 0days to Zerodium
       | or drop them publicly? I've heard of multiple companies doing
       | this type of BS. There was a person called
       | Polarbear/sandboxescaper who dropped a few Win10 LPEs on GitHub.
       | They claimed that Zerodium also only pays out a small amount and
       | then resells the exploit.
        
         | ianhawes wrote:
         | Yes, that person did drop 0days publicly and then promptly
         | faced an FBI investigation causing a tremendous level of stress
         | and irreparable mental health damage.
        
           | ryanlol wrote:
           | She faced an FBI investigation over the threats she was
           | making during her rants, not because she dropped 0days
           | publicly. It is not very cool of you to falsely insinuate
           | that these things are related.
        
           | b8 wrote:
           | It looks like they no longer work for Microsoft. Weren't
           | they making some not-so-great comments before the FBI
           | investigation, though?
        
             | ryanlol wrote:
             | She said all kinds of things that would trigger
             | investigations, everything from threatening the president
             | to searching for foreign state hackers to attack the US
             | with her.
             | 
             | Surprisingly MSFT still hired her after this
        
         | google234123 wrote:
         | They aren't worth anything to Zerodium afaik
        
           | b8 wrote:
           | Maybe they would've been to ZDI.
        
       | devwastaken wrote:
       | One more reason why "closed systems" are not magically superior.
       | Closed systems still have vulnerabilities, and the culture that
       | creates and maintains the closed system shuns those who find
       | flaws in it. So much so that "security researcher" becomes, in
       | their minds and practices, synonymous with "black hat actor".
       | Why would you report vulns to a company that doesn't want them?
       | Go sell them elsewhere and use that money to get a better
       | device. Yet researchers persist, for the good of secure
       | technology.
        
       | xyst wrote:
       | Apple is going downhill. It's amazing how a trillion dollar
       | company can't even get this right. Wouldn't be surprised if the
       | IS&T group is running the bounty program. Probably offshored to
       | hell at this point.
        
       ___________________________________________________________________
       (page generated 2021-09-24 23:01 UTC)