[HN Gopher] Private Cloud Compute: A new frontier for AI privacy...
       ___________________________________________________________________
        
       Private Cloud Compute: A new frontier for AI privacy in the cloud
        
       Author : serhack_
       Score  : 25 points
       Date   : 2024-06-10 21:53 UTC (1 hour ago)
        
 (HTM) web link (security.apple.com)
 (TXT) w3m dump (security.apple.com)
        
       | ethbr1 wrote:
       | This entire platform is the first time I've strategically
       | considered realigning the majority of my use to Apple.
       | 
       | Airtag anonymity was pretty cool, technically speaking, but a
       | peripheral use case for me.
       | 
       | To me, PCC is a well-reasoned, surprisingly customer-centric
       | response to the fact that due to (processing, storage, battery)
       | limitations not all useful models can be run on-device.
       | 
       | And they tried to build a privacy architecture _before_ widely
       | deploying it, instead of post-hoc bolting it on.
       | 
       | >> _4. Non-targetability. An attacker should not be able to
       | attempt to compromise personal data that belongs to specific,
       | targeted Private Cloud Compute users without attempting a broad
       | compromise of the entire PCC system. This must hold true even for
       | exceptionally sophisticated attackers who can attempt physical
       | attacks on PCC nodes in the supply chain or attempt to obtain
       | malicious access to PCC data centers._
       | 
       | Oof. That's a pretty damn specific (literally) attacker, and it's
       | impressive that it made it into their threat model.
       | 
       | And interesting use of onion-style encryption to expose the bare
       | minimum necessary for routing, before the request reaches its
       | target node. Also [0].
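The onion-style layering described above can be sketched as follows. This is purely illustrative: it is not Apple's protocol, and real Oblivious HTTP encapsulation uses HPKE (RFC 9180); the XOR "cipher" and hop keys here are toy stand-ins so the example stays stdlib-only.

```python
# Toy sketch of onion-style layering: each hop peels exactly one layer,
# so no intermediary sees more than it needs for routing. NOT real
# crypto -- a SHA-256-based keystream XOR is used as a stand-in cipher.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from key (toy stream cipher)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def wrap(payload: bytes, hop_keys: list[bytes]) -> bytes:
    """Encrypt innermost-first, so the first hop peels the outermost layer."""
    for key in reversed(hop_keys):
        payload = xor(payload, key)
    return payload

def peel(blob: bytes, key: bytes) -> bytes:
    """Each hop removes exactly one layer with its own key."""
    return xor(blob, key)

hop_keys = [b"relay-key", b"gateway-key", b"node-key"]  # invented names
msg = b"user prompt for the PCC node"
blob = wrap(msg, hop_keys)

# A single intermediate layer does not reveal the plaintext...
assert peel(blob, hop_keys[0]) != msg
# ...but peeling every layer in turn recovers it at the target node.
for k in hop_keys:
    blob = peel(blob, k)
assert blob == msg
```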
       | 
       | >> _For example, the [PCC node OS] doesn't even include a
       | general-purpose logging mechanism. Instead, only pre-specified,
       | structured, and audited logs and metrics can leave the node, and
       | multiple independent layers of review help prevent user data from
       | accidentally being exposed through these mechanisms._
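The "no general-purpose logging" guarantee can be approximated in a few lines: instead of a free-form logger, a node may only emit records matching a pre-declared, audited schema. The event names and fields below are invented for illustration, not Apple's actual telemetry.

```python
# Sketch: an allowlist of audited log schemas replaces a general-purpose
# logger, so arbitrary strings (which could carry user data) cannot
# leave the node. Schemas here are hypothetical examples.
ALLOWED_SCHEMAS = {
    "request_completed": {"duration_ms": int, "model": str, "status": str},
    "node_health": {"cpu_pct": float, "mem_pct": float},
}

def emit(event: str, **fields):
    """Emit a record only if it matches a pre-specified, audited schema."""
    schema = ALLOWED_SCHEMAS.get(event)
    if schema is None:
        raise ValueError(f"unaudited event type: {event!r}")
    if set(fields) != set(schema):
        raise ValueError(f"fields {set(fields)} != audited {set(schema)}")
    for name, value in fields.items():
        if not isinstance(value, schema[name]):
            raise TypeError(f"{name} must be {schema[name].__name__}")
    return {"event": event, **fields}  # would be shipped off-node

# Structured, pre-specified metric: allowed.
rec = emit("request_completed", duration_ms=412, model="some-model", status="ok")

# Free-form debug logging (the kind that could leak a prompt): rejected.
try:
    emit("debug", message="user asked: <sensitive prompt>")
except ValueError as e:
    print("rejected:", e)
```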
       | 
       | My condolences to Apple SREs, between this and the other privacy
       | guarantees.
       | 
       | >> _Our commitment to verifiable transparency includes: (1)
       | Publishing the measurements of all code running on PCC in an
       | append-only and cryptographically tamper-proof transparency log.
       | (2) Making the log and associated binary software images publicly
       | available for inspection and validation by privacy and security
       | experts. (3) Publishing and maintaining an official set of tools
       | for researchers analyzing PCC node software. (4) Rewarding
       | important research findings through the Apple Security Bounty
       | program._
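The append-only, tamper-evident log in point (1) can be sketched with a simple hash chain. Real transparency logs (e.g. Certificate Transparency, RFC 6962) use Merkle trees for efficient inclusion proofs; this minimal version only demonstrates the tamper-evidence property.

```python
# Minimal hash-chained log: each entry's head commits to all prior
# entries, so rewriting history changes the published head. A toy
# stand-in for a Merkle-tree transparency log, not Apple's design.
import hashlib

class TransparencyLog:
    def __init__(self):
        self.entries: list[tuple[bytes, bytes]] = []  # (measurement, chain head)

    def append(self, measurement: bytes) -> bytes:
        prev = self.entries[-1][1] if self.entries else b"\x00" * 32
        head = hashlib.sha256(prev + measurement).digest()
        self.entries.append((measurement, head))
        return head  # publishing this head pins the whole history

    def verify(self) -> bool:
        prev = b"\x00" * 32
        for measurement, head in self.entries:
            if hashlib.sha256(prev + measurement).digest() != head:
                return False
            prev = head
        return True

log = TransparencyLog()
log.append(b"sha256:os-image-v1")   # hypothetical measurements
log.append(b"sha256:os-image-v2")
assert log.verify()

# Silently swapping an earlier measurement is detectable.
log.entries[0] = (b"sha256:malicious-image", log.entries[0][1])
assert not log.verify()
```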
       | 
       | [0] Oblivious HTTP, https://www.rfc-editor.org/rfc/rfc9458
        
       | thomasahle wrote:
       | Did Apple say anything about what training data they used for
       | their generative image models?
        
       | jeffbee wrote:
       | A lot of this sounds like Apple has been 10-20 years behind the
       | state of the art and now wants to tell you that they partially
       | caught up. Verifiable hardware roots of trust and end-to-end
       | software supply chain integrity are things that have existed for
       | a while. The interesting part doesn't come until the end where
       | they promise to publish system images for inspection.
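The client-side consequence of published images is roughly this check: a device refuses to send data unless the node's attested software measurement appears in the public log. The attestation plumbing is omitted and the measurements are invented; Apple's actual mechanism is hardware-backed attestation, not a bare set lookup.

```python
# Sketch: gate requests on the node's attested measurement being present
# in the published transparency log. Measurements are hypothetical.
import hashlib

published_log = {
    hashlib.sha256(b"pcc-os-image-v1").hexdigest(),
    hashlib.sha256(b"pcc-os-image-v2").hexdigest(),
}

def trust_node(attested_measurement: str) -> bool:
    """Only talk to nodes running software the public can inspect."""
    return attested_measurement in published_log

assert trust_node(hashlib.sha256(b"pcc-os-image-v2").hexdigest())
assert not trust_node(hashlib.sha256(b"backdoored-image").hexdigest())
```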
        
       ___________________________________________________________________
       (page generated 2024-06-10 23:00 UTC)