[HN Gopher] Private Cloud Compute Security Guide
       ___________________________________________________________________
        
       Private Cloud Compute Security Guide
        
       Author : djoldman
       Score  : 248 points
       Date   : 2024-11-06 13:48 UTC (9 hours ago)
        
 (HTM) web link (security.apple.com)
 (TXT) w3m dump (security.apple.com)
        
       | jagrsw wrote:
       | If Apple controls the root of trust, like the private keys in the
       | CPU or security processor used to check the enclave (similar to
       | how Intel and AMD do it with SEV-SNP and TDX), then technically,
       | it's a "trust us" situation, since they likely use their own ARM
       | silicon for that?
       | 
       | Harder to attack, sure, but no outside validation. Apple's not
       | saying "we can't access your data," just "we're making it way
       | harder for bad guys (and rogue employees) to get at it."
        
         | ozgune wrote:
         | +1 on your comment.
         | 
         | I think having a description of Apple's threat model would
         | help.
         | 
         | I was thinking that open source would help with their
         | verifiable privacy promise. Then again, as you've said, if
         | Apple controls the root of trust, they control everything.
        
           | bootsmann wrote:
           | They define their threat model in "Anticipating Attacks"
        
           | dagmx wrote:
           | Their threat model is described in their white papers.
           | 
           | But essentially it is trying to get to the end result of "if
           | someone commandeers the building with the servers, they still
           | can't compromise the data chain even with physical access"
        
         | skylerwiernik wrote:
         | I don't think they do. Your phone cryptographically verifies
         | that the software running on the servers is what it says it is,
         | and you can't pull the keys out of the secure enclave. They
         | also had independent auditors go over the whole thing and
         | publish a report. If the chip is disconnected from the system
         | it will dump its keys and essentially erase all data.
        
           | plagiarist wrote:
            | I don't understand how publishing cryptographic
            | signatures of the software is a guarantee. How do they
            | prove the server isn't keeping a copy of the code to
            | compute signatures from while actually running a
            | malicious binary?
        
             | dialup_sounds wrote:
             | The client will only talk to servers that can prove they're
             | running the same software as the published signatures.
             | 
             | https://security.apple.com/documentation/private-cloud-
             | compu...
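              | 
              | A minimal sketch of that gating logic (hypothetical
              | names and values; the real protocol checks a signed
              | attestation from the server's Secure Enclave against
              | measurements published in a transparency log):
              | 
              |   import hashlib
              | 
              |   # Software measurements the client has seen published
              |   # in the transparency log (hypothetical digest).
              |   PUBLISHED = {"3cab7f19...": "PCC release 1.0"}
              | 
              |   def accept_server(image: bytes, sig_ok: bool) -> bool:
              |       # Measurement = hash of the exact software image
              |       # the server attests to be running.
              |       m = hashlib.sha256(image).hexdigest()
              |       # Refuse unless the attestation signature is valid
              |       # AND the measurement is publicly known.
              |       return sig_ok and m in PUBLISHED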
        
               | warkdarrior wrote:
               | And the servers prove that by relying on a key stored in
               | secure hardware. And that secure hardware is designed by
               | Apple, who has a specific interest in convincing users of
               | that attestation/proof. Do you see the conflict of
               | interest now?
        
           | HeatrayEnjoyer wrote:
           | How do you know the root enclave key isn't retained somewhere
           | before it is written? You're still trusting Apple.
           | 
           | Key extraction is difficult but not impossible.
        
             | jsheard wrote:
             | > Key extraction is difficult but not impossible.
             | 
              | Refer to the never-ending clown show that is Intel's
              | SGX enclave for examples of this.
             | 
             | https://en.wikipedia.org/wiki/Software_Guard_Extensions#Lis
             | t...
        
             | yalogin wrote:
             | Can you clarify what you mean by retained and written?
        
           | hnaccount_rng wrote:
           | But since they also control the phone's operating system they
           | can just make it lie to you!
           | 
            | That doesn't make PCC useless, by the way. It clearly
            | establishes that Apple misled customers if there is any
            | intentionality in a breach, or that Apple was negligent
            | if they do not immediately provide remedies on
            | notification of a breach. But that's much more a
            | "raising the cost" kind of thing than a technical
            | exclusion. If Apple, as an organisation, wants to get at
            | your data, and you use an iPhone, they absolutely can.
        
         | ant_li0n wrote:
          | Hey, can you help me understand what you mean? There's an
          | entry about "Hardware Root of Trust" in that document, but
          | I don't see how that means Apple is avoiding stating "we
          | can't access your data" - the doc says the key is not
          | exportable.
         | 
         | "Explain it like I'm a lowly web dev"
        
           | jolan wrote:
           | https://x.com/_saagarjha/status/1804130898482466923
           | 
           | https://x.com/frogandtoadbook/status/1734575421792920018
        
         | wutwutwat wrote:
         | every entity you hand data to other than yourself is a "trust
         | us" situation
        
           | fsflover wrote:
           | Unless it's encrypted.
        
             | wutwutwat wrote:
             | you trust more than I do
        
         | SheinhardtWigCo wrote:
         | It was always "trust us". They make the silicon, and you have
         | no hope of meaningfully reverse engineering it. Plus, iOS and
         | macOS have silent software update mechanisms, and no update
         | transparency.
        
       | m3kw9 wrote:
        | I will just use it; it's Apple, and all I need is to see the
        | verifiable privacy thing and have researchers flag any red
        | flags for me. You go on Copilot, it says your code is
        | private? Good luck.
        
         | danparsonson wrote:
         | I've got a fully private LLM that's pretty good at coding built
         | right into my head - I'll stick with that, thanks.
        
         | z3ncyberpunk wrote:
          | Apple has handed your data over to PRISM since 2012.
        
       | solarkraft wrote:
        | Sibling comments point out (and I believe them; corrections
        | are welcome) that all that theater is still no protection
        | against Apple themselves, should they want to subvert the
        | system in an organized way. They're still fully in control.
        | As far as I understand it, there is still plenty of attack
        | surface for them to run different software than they say
        | they do.
       | 
        | What they are doing, of course, is making any kind of
        | subversion a hell of a lot harder, and I welcome that. It
        | serves as a strong signal that they _want_ to protect my
        | data. To me this definitely makes them the most trusted AI
        | vendor at the moment, by far.
        
         | patmorgan23 wrote:
          | Yep. If you don't trust Apple with your data, don't buy a
          | device that runs Apple's operating system.
        
           | yndoendo wrote:
            | That is good in theory. In reality, anyone you engage
            | with who uses an Apple device has leaked your content /
            | information to Apple. High confidence that Apple could
            | easily build profiles on people who do not use their
            | devices, via the indirect action of having to communicate
            | with owners of Apple devices.
            | 
            | The statement above also applies to Google. There is no
            | way to prevent indirect data sharing with Apple or
            | Google.
        
             | hnaccount_rng wrote:
              | Yes, if your threat model includes the provider of your
             | operating system, then you cannot win. It's really that
             | simple. You fundamentally need to trust your operating
             | system because it can just lie to you
        
               | fsflover wrote:
               | This is false. With FLOSS and reproducible builds, you
               | can rely on the community for verification.
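                | 
                | Roughly, the check looks like this (a sketch; the
                | file name and published digest are hypothetical):
                | 
                |   import hashlib
                | 
                |   def digest(path):
                |       with open(path, "rb") as f:
                |           return hashlib.sha256(f.read()).hexdigest()
                | 
                |   # Digest that independent builders of the same
                |   # source tree published (hypothetical value).
                |   published = "9f2e..."
                |   mine = digest("my-reproduced-build.img")
                |   print("ok" if mine == published else "MISMATCH")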
        
               | philjohn wrote:
               | Not unless your entire stack down to the bare silicon is
               | also FLOSS, and the community is able to verify.
               | 
               | There is a lot of navel gazing in these comments about
               | "the perfect solution", but we all know (or should know)
               | that perfect is the enemy of good enough.
        
               | threeseed wrote:
               | We've seen countless examples of relatively minor
               | libraries being exploited which then cause havoc because
               | of a spider web of transitive dependencies.
        
               | hulitu wrote:
               | > You fundamentally need to trust your operating system
               | because it can just lie to you
               | 
               | Trust us, we are liars. /s
        
             | dialup_sounds wrote:
             | Define "content / information".
        
             | afh1 wrote:
             | Depending on your social circle such exposure is not so
             | hard to avoid. Maybe you cannot avoid it entirely but it
             | may be low enough that it doesn't matter. I have older
             | relatives with basically zero online presence.
        
           | isodev wrote:
            | That really is not a valid argument, since Apple has
            | grown to be "the phone".
            | 
            | Also, many are unaware of, or unable to determine, who or
            | what will own their data before purchasing a device. One
            | only accepts the privacy policy after one taps sign in...
            | and is it really practical to expect people to work this
            | out by themselves when buying a phone? That's why
            | regulation needs to step in and enforce that the right
            | defaults are present.
        
           | mossTechnician wrote:
            | But if you don't trust Google with your data, you _can_
            | buy a device that runs Google's operating system, from
            | Google, and flash somebody else's operating system onto
            | it.
            | 
            | Or, if you prefer, you can just look at Google's code and
            | verify that the operating system you put on your phone is
            | made with the code you looked at.
        
           | threeseed wrote:
            | And if you don't trust Apple with your data, you
            | shouldn't use a phone or the internet at all.
           | 
           | Because as someone who has worked at a few telcos I can
           | assure you that your phone and triangulated location data is
            | stored, analysed and provided to intelligence agencies.
            | The same applies to ISPs.
        
         | tw04 wrote:
         | As soon as you start going down the rabbit hole of state
         | sponsored supply chain alteration, you might as well just stop
         | the conversation. There's literally NOTHING you can do to stop
         | that specific attack vector.
         | 
          | History has shown, at least to date, that Apple has been a
          | good steward. They're as good a vendor to trust as anyone.
          | Given that a huge portion of their brand has been built on
          | "we don't spy on you", the second they do, they lose all
          | credibility, so they have a financial incentive to keep
          | protecting your data.
        
           | talldayo wrote:
           | ...in certain places: https://support.apple.com/en-us/111754
           | 
           | Just make _absolutely sure_ you trust your government when
           | using an iDevice.
        
             | jayrot wrote:
             | >Just make absolutely sure you trust your government
             | 
             | This sentence stings right now. :-(
        
             | spondyl wrote:
             | When it comes to China, it's not entirely fair to single
             | out Apple here given that non-Chinese companies are not
             | allowed to run their own compute in China directly.
             | 
              | It always has to be operated by a sponsor in the state
              | who holds the encryption keys and does the actual
              | deployments, etc.
             | 
             | The same applies to Azure/AWS/Google Cloud's China regions
             | and any other compute services you might think of.
        
               | talldayo wrote:
               | It's entirely fair. Apple had the choice to stop pursuing
               | business in China if they felt it conflicted with values
               | they prioritized. Evidently it doesn't, which should tell
               | you a lot about how accepting Apple is of this behavior
               | worldwide.
        
               | musictubes wrote:
                | You don't have to use iCloud. Customers in China can
                | still make encrypted backups on their own computers.
                | I also believe, but please correct me if I'm wrong,
                | that you can still enable encrypted iCloud backups in
                | China if you want.
               | 
               | All the pearl clutching about Apple doing business in
               | China is ridiculous. Who would be better off if Apple
               | withdrew from China? Sure, talldayo would sleep better
               | knowing that Apple had passed their purity test, I guess
               | that's worth a lot right? God knows consumers in China
               | would be much better off without the option to use
               | iPhones or any other Apple devices. Their privacy and
               | security are better protected by domestic phones I'm
               | sure.
               | 
               | Seriously, what exactly is the problem?
        
               | astrange wrote:
               | iCloud E2E encryption (advanced data protection) works in
               | China.
               | 
                | There are other, less nefarious reasons for
                | in-country storage laws like this. One is to stop
                | other countries from subpoenaing the data.
               | 
               | But it's also so China gets the technical skills from
               | helping you run it.
        
           | afh1 wrote:
           | > There's literally NOTHING you can do to stop that specific
           | attack vector.
           | 
           | E2E. Might not be applicable for remote execution of AI
           | payloads, but it is applicable for most everything else, from
           | messaging to storage.
           | 
            | Even if the client hardware and/or software is also an
            | actor in your threat model, that can be eliminated or at
            | least mitigated with at least one verifiably trusted
            | piece of equipment. Open hardware is an alternative, and
            | some states build their entire hardware stack to
            | eliminate such threats. If you have at least one piece
            | of trusted equipment, mitigations are possible (e.g. an
            | external network filter).
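            | 
            | For storage, the client-side shape is simple enough (a
            | sketch using the third-party pyca/cryptography package;
            | key management, the hard part, is left out):
            | 
            |   from os import urandom
            |   from cryptography.hazmat.primitives.ciphers.aead \
            |       import AESGCM
            | 
            |   key = AESGCM.generate_key(bit_length=256)  # client-only
            |   nonce = urandom(12)
            |   blob = AESGCM(key).encrypt(nonce, b"file contents", None)
            |   # Only nonce + blob are uploaded; the provider stores
            |   # opaque bytes it cannot read.
            |   assert AESGCM(key).decrypt(nonce, blob, None) == \
            |       b"file contents"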
        
             | warkdarrior wrote:
             | E2E does not protect metadata, at least not without
             | significant overheads and system redesigns. And metadata is
             | as important as data in messaging and storage.
        
               | afh1 wrote:
               | > And metadata is as important as data in messaging and
               | storage.
               | 
               | Is it? I guess this really depends. For E2E storage (e.g.
               | as offered by Proton with openpgpjs), what metadata would
               | be of concern? File size? File type cannot be inferred,
               | and file names could be encrypted if that's a threat in
               | your model.
        
               | mbauman wrote:
                | The most valuable "metadata" in this context is
                | typically _with whom_ you're communicating or
                | collaborating, and _when_, and _from where_. It's so
                | valuable it should just be called data.
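                | 
                | E.g., with nothing but (from, to) records, you can
                | already rank someone's closest contacts (toy data):
                | 
                |   from collections import Counter
                | 
                |   meta = [("alice", "bob"), ("alice", "bob"),
                |           ("alice", "carol")]
                |   print(Counter(to for _, to in meta).most_common())
                |   # [('bob', 2), ('carol', 1)]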
        
               | fsflover wrote:
               | How is this relevant to the private cloud storage?
        
               | Jerrrrrrry wrote:
               | No point in storing data if it is never shared with
               | anyone else.
               | 
                | Whom it is shared with can reveal the intent of the
                | data.
        
               | fsflover wrote:
               | Backups?
        
               | Jerrrrrrry wrote:
               | yes, got me there.
               | 
               | but i feel in the context (communication/meta-data
               | inference) that is missing the trees for the forest
        
           | ferbivore wrote:
           | Apple have name/address/credit-card/IMEI/IMSI tuples stored
           | for every single Apple device. iMessage and FaceTime leak
           | numbers, so they know who you talk to. They have real-time
           | location data. They get constant pings when you do anything
           | on your device. Their applications bypass firewalls and VPNs.
           | If you don't opt out, they have full unencrypted device
           | backups, chat logs, photos and files. They made a big fuss
           | about protecting you from Facebook and Google, then built
           | their own targeted ad network. Opting out of all tracking
           | doesn't really do that. And even if you trust them despite
           | all of this, they've repeatedly failed to protect users even
           | from external threats. The endless parade of iMessage zero-
           | click exploits was ridiculous and preventable, CKV only
           | shipped this year and isn't even on by default, and so on.
           | 
           | Apple have _never_ been punished by the market for any of
           | these things. The idea that they will  "lose credibility" if
           | they livestream your AI interactions to the NSA is
           | ridiculous.
        
             | lurking_swe wrote:
             | > They made a big fuss about protecting you from Facebook
             | and Google, then built their own targeted ad network.
             | 
              | What kind of targeted advertising am I getting from
              | Apple as a user of their products? Genuinely curious.
              | I'll wait.
              | 
              | The rest of your comment may be factually accurate, but
              | it isn't relevant for "normal" users, only those hyper-
              | aware of their privacy. Don't get me wrong, I
              | appreciate knowing this detail, but you need to realize
              | that there are degrees of privacy.
        
               | talldayo wrote:
                | > What kind of targeted advertising am I getting
                | from Apple as a user of their products?
               | 
               | https://searchads.apple.com/
               | 
                | https://support.apple.com/guide/iphone/control-how-apple-
                | del...
                | 
                |   In the App Store and Apple News, your search and
                |   download history may be used to serve you relevant
                |   search ads. In Apple News and Stocks, ads are
                |   served based partly on what you read or follow.
                |   This includes publishers you've enabled
                |   notifications for and the type of publishing
                |   subscription you have.
        
             | Tagbert wrote:
             | They have not been punished because they have not abused
             | their access to that data.
        
               | sunnybeetroot wrote:
               | Some might call this abuse:
               | https://news.ycombinator.com/item?id=42069588
        
             | commandersaki wrote:
             | > If you don't opt out, they have full unencrypted device
             | backups, chat logs, photos and files.
             | 
              | Also, full-disk encryption is opt-in for macOS. But the
              | answer isn't that Apple wants you to be insecure; they
              | probably just want to make it easier for users to
              | recover data if they forget a login password or a
              | backup password they set years ago.
             | 
             | > real-time location data
             | 
             | Locations are end to end encrypted.
        
             | threeseed wrote:
             | It's disingenuous to compare Apple's advertising to
             | Facebook and Google.
             | 
             | Apple does first party advertising for two relatively
             | minuscule apps.
             | 
             | Facebook and Google power the majority of the world's
             | online advertising, have multiple data sharing agreements,
             | widely deployed tracking pixels, allow for browser
             | fingerprinting and are deeply integrated into almost all
             | ecommerce platforms and sites.
        
           | natch wrote:
           | As to the trust loss, we seem to be already past that. It
           | seems to me they are now in the stage of faking it.
        
           | hulitu wrote:
           | > History has shown, at least to date, Apple has been a good
           | steward.
           | 
            | *cough* HW backdoor in iPhone *cough*
        
             | evgen wrote:
             | _cough_ bullshit _cough_
             | 
             | Don't try to be subtle. If you are going to lie, go for a
             | big lie.
        
           | vlovich123 wrote:
           | Strictly speaking there's homomorphic encryption. It's still
           | horribly slow and expensive but it literally lets you run
           | compute on untrusted hardware in a way that's mathematically
           | provable.
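            | 
            | A toy of the additively homomorphic flavor, using the
            | third-party python-paillier package ("pip install phe");
            | an illustration only, and far weaker than the fully
            | homomorphic schemes needed for real AI payloads:
            | 
            |   from phe import paillier
            | 
            |   pub, priv = paillier.generate_paillier_keypair()
            |   a, b = pub.encrypt(3), pub.encrypt(4)
            |   # The server computes on ciphertexts it cannot read:
            |   total, scaled = a + b, a * 10
            |   print(priv.decrypt(total))   # 7
            |   print(priv.decrypt(scaled))  # 30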
        
             | commandersaki wrote:
             | Yeah the impetus for PCC was that homomorphic encryption
             | wasn't feasible and this was the best realistic
             | alternative.
        
             | romac wrote:
             | And they are pushing in that direction:
             | https://machinelearning.apple.com/research/homomorphic-
             | encry...
        
           | sunnybeetroot wrote:
            | Didn't Edward Snowden reveal that Apple provides the NSA
            | direct access for mass surveillance?
           | 
           | > allows officials to collect material including search
           | history, the content of emails, file transfers and live chats
           | 
           | > The program facilitates extensive, in-depth surveillance on
           | live communications and stored information. The law allows
           | for the targeting of any customers of participating firms who
           | live outside the US, or those Americans whose communications
           | include people outside the US.
           | 
           | > It was followed by Yahoo in 2008; Google, Facebook and
           | PalTalk in 2009; YouTube in 2010; Skype and AOL in 2011; and
           | finally Apple, which joined the program in 2012. The program
           | is continuing to expand, with other providers due to come
           | online.
           | 
           | https://www.theguardian.com/world/2013/jun/06/us-tech-
           | giants...
        
             | theturtletalks wrote:
              | Didn't Apple famously refuse the FBI's request to
              | unlock the San Bernardino attacker's iPhone? The FBI
              | ended up hiring an Australian company, which used a
              | Mozilla bug that allowed unlimited password guesses
              | without the phone wiping.
             | 
             | If the NSA had that info, why go through the trouble?
        
               | talldayo wrote:
               | > If the NSA had that info, why go through the trouble?
               | 
               | To defend the optics of a backdoor that they actively
               | rely on?
               | 
                | If Apple and the NSA are in cahoots, it's not hard to
                | imagine them anticipating this kind of event and
                | leveraging it for plausible deniability. I'm not
                | saying this is necessarily what happened, but we'd
                | need more evidence than the first-party admission of
                | two parties that stand to gain from privacy theater.
        
             | astrange wrote:
             | That seemed to be puffery about a database used to store
             | subpoena requests. You have "direct access" to a service if
             | it has a webpage you can submit subpoenas to.
        
         | chadsix wrote:
          | Exactly. You can only trust yourself [1] and should self-host.
         | 
         | [1] https://www.youtube.com/watch?v=g_JyDvBbZ6Q
        
           | 9dev wrote:
            | That is an answer for an incredibly tiny fraction of the
            | population. I'm not so much concerned about myself as
            | about society in general, and self-hosting just is not a
            | viable solution to the problem at hand.
        
             | chadsix wrote:
              | To be fair, it's much easier than one can imagine (try
              | ollama on macOS, for example). In the end, Apple wrote
              | a lot of long-winded text, but the summary is "you have
              | to trust us."
             | 
             | I don't trust Apple - in fact, even the people we trust the
             | most have told us soft lies here and there. Trust is a
             | concept like an integral - you can only get to "almost" and
             | almost is 0.
             | 
             | So you can only trust yourself. Period.
        
               | dotancohen wrote:
               | I don't even trust myself, I know that I'm going to mess
               | up at some point or another.
        
               | lukev wrote:
               | The odds that I make a mistake in my security
               | configuration are much higher than the odds that Apple is
               | maliciously backdooring themselves.
               | 
               | The PCC model doesn't guarantee they can't backdoor
               | themselves, but it does make it more difficult for them.
        
               | astrange wrote:
               | You also don't have a security team and Apple does have
               | one.
        
               | killjoywashere wrote:
               | There are multiple threat models where you can't trust
               | yourself.
               | 
               | Your future self definitely can't trust your past self.
               | And vice versa. If your future self has a stroke
               | tomorrow, did your past self remember to write a living
               | will? And renew it regularly? Will your future self
               | remember that password? What if the kid pukes on the
               | carpet before your past self writes it down?
               | 
               | Your current self is not statistically reliable. Andrej
               | Karpathy administered an imagenet challenge to himself,
               | his brain as the machine: he got about 95%.
               | 
               | I'm sure there are other classes of self-failure.
        
               | martinsnow wrote:
                | Given the code quality of projects like Nextcloud,
                | suggestions like this make my head and the table
                | transmogrify into magnets.
        
               | commandersaki wrote:
               | > "you have to trust us."
               | 
               | You have fundamentally misunderstood PCC.
        
             | talldayo wrote:
             | Nobody promised you that real solutions would work for
             | everyone. Performing CPR to save a life is something "an
             | incredibly tiny fraction of the population" is trained on,
             | but it _does_ work when circumstances call for it.
             | 
             | It sucks, but what are you going to do for society? Tell
             | them all to sell their iPhones, punk out the NSA like
             | you're Snowden incarnate? Sometimes saving yourself _is_
             | the only option, unfortunately.
        
           | remram wrote:
           | Can you trust the hardware?
        
             | killjoywashere wrote:
             | There's a niche industry that works on that problem:
             | looking for evidence of tampering down to the semiconductor
             | level.
        
               | sourcepluck wrote:
               | Notably
               | https://www.bunniestudios.com/blog/2020/introducing-
               | precurso...
        
             | blitzar wrote:
              | If you make your own silicon, can you trust that the
              | sand hasn't been tampered with to breach your security?
        
         | stavros wrote:
         | > that all that theater is still no protection against Apple
         | themselves
         | 
          | There is such a thing as threat modeling. The fact that
          | your model only stops _some_ threats, and not _all_
          | threats, doesn't mean that it's theater.
        
           | hulitu wrote:
           | > The fact that your model only stops some threats, and not
           | all threats, doesn't mean that it's theater.
           | 
            | Well, to be honest, theater is a pretentious word in this
            | context. A better word would be shitshow.
            | 
            | (I never heard of a firewall that claims it filters
            | _some_ packets, or an antivirus that claims it protects
            | against _some_ viruses.)
        
             | stavros wrote:
             | Really? Please show me an antivirus that claims that it
             | protects against all viruses. A firewall that filters all
             | packets is a pair of scissors.
        
         | halJordan wrote:
          | It's not that they couldn't; it's that they couldn't
          | without a watcher knowing. And frankly this tradeoff is
          | not new, nor is it unacceptable in anything other than
          | "Muh Apple".
        
         | isodev wrote:
          | Indeed, the attestation process, as described by the
          | article, is geared more towards unauthorized exfiltration
          | of information or injection of malicious code. However,
          | "authorized" activities are fully supported, where
          | "authorized" means signed by Apple. So, ultimately, users
          | need to trust that Apple is doing the right thing, just
          | like with any other company. And yes, it means Apple can
          | be forced (by law) not to do the right thing.
        
         | natch wrote:
         | You're getting taken in by a misdirection.
         | 
         | >for them to run different software than they say they do.
         | 
         | They don't even need to do that. They don't need to do anything
         | different than they say.
         | 
         | They already are saying only that the data is kept private from
         | <insert very limited subset of relevant people here>.
         | 
         | That opens the door wide for them to share the data with anyone
         | outside of that very limited subset. You just have to read what
         | they say, and also read between the lines. They aren't going to
         | say who they share with, apparently, but they are going to
         | carefully craft what they say so that some people get
         | misdirected.
        
           | astrange wrote:
            | They're not doing that because it's obviously illegal.
            | GDPR forbids sharing data with undisclosed third parties.
        
         | 1vuio0pswjnm7 wrote:
         | "Sibling comments point out (and I believe, corrections are
         | welcome) that all that theater is still no protection against
         | Apple themselves, should they want to subvert the system in an
         | organized way. They're still fully in control."
         | 
         | It stands to reason that that control is a prerequisite for
         | "security".
         | 
         | Apple does not delegate its own "security" to someone else, a
         | "steward". Hmmm.
         | 
         | Yet it expects computer users to delegate control to Apple.
         | 
         | Apple is not alone in this regard. It's common for "Big Tech",
         | "security researchers" and HN commenters to advocate for the
         | computer user to delegate control to someone else.
        
         | derefr wrote:
         | The "we've given this code to a third party to host and run"
         | part _can_ be a 100% effective stop to any Apple-internal
         | shenanigans. It depends entirely on what the third party is
          | legally obligated to do for them. (Or more specifically,
          | what they're legally obligated to _not_ do for them.)
         | 
          | A simple example of the sort of legal agreement I'm talking
          | about is a trust. A trust isn't _just_ a legal entity that
          | takes custody of some assets and doles them out to you on a
          | set schedule; it's more specifically a legal entity
          | _established by_ legal contract, and _executed by_ some
          | particular law firm acting as its custodian, that obligates
          | _that law firm_ as executor to provide only a certain "API"
          | for the contract's subjects/beneficiaries to interact
          | with/manage those assets -- a more restrictive one than
          | they would have otherwise had a legal right to.
         | 
         | With trusts, this is done because that restrictive API (the
         | "you can't withdraw the assets all at once" part especially) is
         | what makes the trust a trust, legally; and therefore what makes
         | the legal (mostly tax-related) benefits of trusts apply,
         | instead of the trust just being a regular holding company.
         | 
         | But you don't need any particular legal impetus in order to
         | create this kind of "hold onto it and don't listen to me if I
         | ask for it back" contract. You can just... write a contract
         | that has terms like that; and then ask a law firm to execute
         | that contract for you.
         | 
         | Insofar as Apple have engaged with some law firm to in turn
         | engage with a hosting company; where the hosting company has
         | obligations _to the law firm_ to provide a secure environment
         | for _the law firm_ to deploy software images, and to report
          | accurate trusted-compute metrics _to the law firm_; and where
         | _the law firm_ is legally obligated to get any image-updates
         | Apple hands over to them independently audited, and only accept
         | "justifiable" changes (per some predefined contractual
         | definition of "justifiable") -- then I would say that this _is_
         | a trustworthy arrangement. Just like a trust is a trust-worthy
         | arrangement.
        
           | neongreen wrote:
           | This actually sounds like a very neat idea. Do you know any
           | services / software companies that operate like that?
        
         | commandersaki wrote:
         | > _They're still fully in control. There is, for example, as
         | far as I understand it, still plenty of attack surface for them
         | to run different software than they say they do._
         | 
          | But any such software must be publicly verifiable,
          | otherwise it cannot be deemed secure. That's why they
          | publish each version in a transparency log, which is
          | verified by the client and _handwavily_ verified by a
          | public brains trust.
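          | 
          | The transparency-log part boils down to Merkle-tree
          | inclusion proofs. A minimal sketch of the verification
          | idea (not Apple's actual log format):
          | 
          |   import hashlib
          | 
          |   def h_leaf(data):  # domain-separated leaf hash
          |       return hashlib.sha256(b"\x00" + data).digest()
          | 
          |   def h_node(l, r):  # domain-separated interior hash
          |       return hashlib.sha256(b"\x01" + l + r).digest()
          | 
          |   def verify_inclusion(leaf, proof, root):
          |       # proof: [(side, sibling_hash)] from leaf to root
          |       acc = h_leaf(leaf)
          |       for side, sib in proof:
          |           if side == "L":
          |               acc = h_node(sib, acc)
          |           else:
          |               acc = h_node(acc, sib)
          |       return acc == root
          | 
          |   # Two-entry log of releases r1, r2:
          |   r1, r2 = b"pcc-release-1", b"pcc-release-2"
          |   root = h_node(h_leaf(r1), h_leaf(r2))
          |   assert verify_inclusion(r2, [("L", h_leaf(r1))], root)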
         | 
          | This is also just a tired take. The same thing could be
          | said about passcodes on their mobile products or full-disk
          | encryption keys for the Mac line. There'd be a massive loss
          | of goodwill and legal liability if they subverted the
          | technologies that they claim make their devices secure.
        
       | _boffin_ wrote:
        | I really don't care at all about this, as the only
        | interactions I'd have would be speech-to-text, which sends
        | all transcripts to Apple without the ability to opt out.
        
         | lukev wrote:
         | Settings > Privacy and Security > Analytics and Improvements
        
         | astrange wrote:
         | IIRC that uses servers on HomePods but not anything else.
        
       | max_ wrote:
       | Please don't fall for the cheap "Apple is pro privacy" veneer.
       | 
        | They cannot be trusted anymore. These "Private Compute"
        | schemes are blatant lies. Maybe even scams at this point.
       | 
       | Learn more -- https://sneak.berlin/20201112/your-computer-isnt-
       | yours/
        
         | jasongill wrote:
         | The core of this article, if I understand it correctly, is that
         | macOS pings Apple to make sure that apps you open are safe
         | before opening them. This check contains some sort of unique
         | string about the app being opened, and then there is a big leap
         | to "this could be used by the government"
         | 
         | Is this the ideal situation? No, probably not. Should Apple do
         | a better job of communicating that this is happening to users?
         | Yes, probably so.
         | 
         | Does Apple already go overboard to explain their privacy
         | settings during setup of a new device (the pages with the blue
         | "handshake" icon)? Yes. Does Apple do a far better job of this
         | than Google or Microsoft (in my opinion)? Yes.
         | 
         | I don't think anyone here is claiming that Apple is the best
         | thing to ever happen to privacy, but when viewed via the lens
         | of "the world we live in today", it's hard to see how Apple's
         | privacy stance is a "scam". It seems to me to be one of the
         | best or most reasonable stances for privacy among all large-cap
         | businesses in the world.
        
           | max_ wrote:
           | Have you read the linked article?
        
             | jasongill wrote:
              | Yes, that's why I commented: the article's core
              | complaint is that the OS's Gatekeeper feature does an
              | OCSP certificate validation whenever an app is
              | launched, that there's no way to disable it, and that
              | this supposed calling home could leak data about your
              | computer use over the wire.
             | 
             | However, it also has a LOT of speculation, with statements
             | like "It seems this is part of Apple's anti-malware (and
             | perhaps anti-piracy)" and "allowing anyone on the network
             | (which includes the US military intelligence community) to
             | see what apps you're launching" and "Your computer now
             | serves a remote master, who has decided that they are
             | entitled to spy on you."
             | 
              | However, without this feature (which seems pretty
              | benign to me), wouldn't the average macOS user actually
              | be exposed to _more_ potential harm by being able to
              | run untrusted or modified binaries without any
              | warnings?
        
             | pertymcpert wrote:
             | Did you?
        
           | astrange wrote:
           | > This check contains some sort of unique string about the
           | app being opened,
           | 
           | It's not unique to the app, the article is just wrong. It's
           | unique to the /developer/, which is much less specific.
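            | 
            | Toy illustration of the difference (hypothetical
            | values): keying the check on the developer certificate
            | means every app from one developer yields the same
            | identifier.
            | 
            |   import hashlib
            | 
            |   dev_cert = b"Developer ID Application: Example (TEAM1)"
            |   for app in ("Editor.app", "Games.app"):
            |       # Same digest for both apps: the lookup identifies
            |       # the developer, not the individual app.
            |       print(app, hashlib.sha256(dev_cert).hexdigest()[:12])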
        
       | gigel82 wrote:
        | I'm glad that more and more people are starting to see
        | through the thick Apple BS (in these comments). I don't
        | expect them to back down from this, but I hope there is
        | enough pushback that they'll be forced to add a big opt-out
        | for all cloud compute, however "private" they make it out to
        | be.
        
       | h1fra wrote:
        | Love this, but as an engineer I would hate to get a bug
        | report in that prod environment: 100% "doesn't work on my
        | machine" and 0% reproducibility.
        
         | slashdave wrote:
         | That's a strange point of view. Clearly one shouldn't use
         | private information for testing in any production environment.
        
           | ericlewis wrote:
           | As a person who works on this kinda stuff I know what they
           | mean. It's very hard to debug things totally blind.
        
         | pjmlp wrote:
         | Usually quite common when doing contract work, where externals
         | have no access to anything besides a sandbox to play around
         | with their contribution to the whole enterprise software
         | jigsaw.
        
       | lxgr wrote:
        | This is probably the best way to do cloud computation
        | offloading, _if one chooses to do it at all_.
       | 
       | What's desperately missing on the client side is a switch to
        | _turn this off_. It's really unclear at the moment which
        | Apple Intelligence requests are processed locally and which
        | are sent to the cloud.
       | 
       | The only sure way to know/prevent it a priori is to... enter
       | flight mode, as far as I can tell?
       | 
        | Retroactively, there's a request log in the privacy section
        | of System Preferences, but it's really convoluted to read
        | (due to all of the cryptographic proofs, which I have
        | absolutely no tools to verify at the moment, and honestly no
        | interest in).
        
       | curt15 wrote:
       | For the experts out there, how does this compare with AWS Nitro?
        
         | bobbiechen wrote:
         | AWS Nitro (and Nitro Enclaves) are general computing platforms,
         | so it's different. You'd need to write a PCC-like
         | system/application on top of AWS Nitro Enclaves to make a
         | direct comparison. A breakdown of those 5 core requirements
         | from Apple:
         | 
         | 1. Stateless computation on personal user data - a property of
         | the application
         | 
         | 2. Enforceable guarantees - a property of the application;
         | Nitro Enclaves attestation helps here
         | 
         | 3. No privileged runtime access - maps directly to the no
         | administrative API access in the AWS Nitro System platform
         | 
         | 4. Non-targetability - a property of the application
         | 
         | 5. Verifiable transparency - a mix of the application and the
         | platform; Nitro Enclaves attestation helps here
         | 
         | To be a little more concrete: (1 stateless) You could write an
         | app that statelessly processes user data, and build it into a
         | Nitro Enclave. This has a particular software measurement
         | (PCR0) and can be code-signed (PCR8) and verified at runtime (2
         | enforceable) using Nitro Enclave Attestation. This also
         | provides integrity protection. You get (3 no access) for "free"
         | by running it in Nitro to begin with (from AWS - you also need
         | to ensure there is no application-level admin access). You
         | would need to design (4 non-targetable) as part of your
         | application. For (5 transparency), you could provide your code
         | to researchers as Apple is doing.
         | 
         | (I work with AWS Nitro Enclaves for various security/privacy
         | use cases at Anjuna. Some of these resemble PCC and I hope we
         | can share more details about the customer use cases
         | eventually.)
         | 
         | Some sources:
         | 
         | - NCC Group Audit on the Nitro System
         | https://www.nccgroup.com/us/research-blog/public-report-aws-...
         | 
         | - Nitro Enclaves attestation process:
         | https://github.com/aws/aws-nitro-enclaves-nsm-api/blob/main/...
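          | 
          | As a sketch of the enforcement step (2 above): after
          | parsing a Nitro attestation document (CBOR/COSE, per the
          | nsm-api link), the client releases data only if the
          | enclave's PCRs match measurements it expects. The parsed
          | document is mocked as a dict here, and all values are
          | hypothetical:
          | 
          |   EXPECTED = {
          |       0: "a1b2...",  # PCR0: enclave image measurement
          |       8: "c3d4...",  # PCR8: signing certificate
          |   }
          | 
          |   def pcrs_match(attestation_doc):
          |       pcrs = attestation_doc["pcrs"]
          |       return all(pcrs.get(i) == v
          |                  for i, v in EXPECTED.items())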
        
       | natch wrote:
       | >No privileged runtime access: PCC must not contain privileged
       | interfaces that might enable Apple site reliability staff to
       | bypass PCC privacy guarantees.
       | 
       | What about other staff and partners and other entities? Why do
       | they always insert qualifiers?
       | 
       | Edit: Yeah, we know why. But my point is they should spell it
       | out, not use wording that is on its face misleading or outright
       | deceptive.
        
         | pertymcpert wrote:
          | Apple are running the data centers... this seems like an
          | extreme nitpick of language.
        
         | astrange wrote:
         | There aren't any other staff or partners.
        
       ___________________________________________________________________
       (page generated 2024-11-06 23:00 UTC)