[HN Gopher] Apple's child protection features spark concern with...
       ___________________________________________________________________
        
       Apple's child protection features spark concern within its own
       ranks: sources
        
       Author : spenvo
       Score  : 1586 points
        Date   : 2021-08-12 22:26 UTC (1 day ago)
        
 (HTM) web link (www.reuters.com)
 (TXT) w3m dump (www.reuters.com)
        
       | istingray wrote:
       | How would Apple roll this back?
       | 
        | Seems unlikely, but stepping into "what if", what are people's
        | thoughts on what it would look like for Apple? Does it mean Cook
        | resigning, or firing a team? How do we get from here to there?
        | Is CSAM scanning on iPhone the next AirPower, where it just gets
        | cancelled?
        
       | RIMR wrote:
       | I keep hearing the same argument from people, and I'm mostly in
       | agreement, but to summarize the argument:
       | 
       | > _" I am not afraid of personally being found with child
       | pornography, but I don't want Apple searching my files looking
       | for criminal intent"_
       | 
       | But I would argue that you SHOULD be afraid of personally being
       | found with child pornography.
       | 
       | First and foremost, this feature could be exploited for harm. If
       | anyone ever wants to hurt you, all they have to do is get access
        | to your unlocked Apple device long enough to drop CSAM into a
       | backed-up folder, and you get reported to the feds.
       | 
       | Or, way more realistically, maybe you downloaded some amateur
       | porno that you didn't know had underage participants in it, and
       | you saved it where it got backed up to the cloud, and two years
       | later the government has added the video to their known CSAM
       | database and you get reported to the feds for having it.
       | 
       | Given how the mere accusation of child sex crimes is enough to
       | destroy careers, marriages, and lives, I see absolutely no reason
       | anyone should trust their Apple devices with anything should this
       | become normal practice.
       | 
        | And if this becomes normal practice, it follows that CSAM won't
       | be the only thing they end up looking for.
        
       | hanselot wrote:
        | Why won't you buy the new iPhone, Billy? Are you a pedophile?
        | Guilty until proven innocent.
        
       | DarkmSparks wrote:
       | There are two things that make me think this will be walked back.
       | 
        | Firstly, and most importantly, this kind of backdoor is the kind
        | of thing that makes big corps prohibit the use/purchase of
        | devices.
       | 
        | Secondly, it seems rife for abuse: don't like someone who uses
        | an iOS device? Message them some CP and destroy their entire
        | life.
        
         | swiley wrote:
          | Apple's choices are to do this or to give people admin rights
          | on their devices. Keep in mind there are iOS features for
          | doing admin on the device that are kept internal, so that
          | wouldn't require much (if any) new software.
         | 
         | One makes them look bad and the other makes them lose money.
         | Which do you think they will choose?
        
         | faeriechangling wrote:
         | It seems easy as hell for them to just change the system to do
         | server-side scanning instead of client-side scanning and it
         | would probably be enough to calm the horde.
         | 
          | I think people can understand that Apple can't have certain
          | content on its servers. People have a much harder time
          | understanding that Apple needs to make sure you don't have
          | certain content on your phone.
        
           | stetrain wrote:
            | My understanding is they already do this server-side. But if
            | they relied on server-side scanning alone, then trying to
            | end-to-end encrypt your data on the server would put them in
            | political/legal crosshairs.
        
           | nrabulinski wrote:
            | If iCloud content were encrypted (as it should be, given
            | what Apple says about itself and privacy), they wouldn't
            | and couldn't care what the data on their servers contains,
            | since, ideally, they'd have no way of decrypting that data.
            | No need for any privacy invasion!
           | 
            | > Apple needs to make sure you don't have certain content on
            | your phone.
            | 
            | Excuse me, what? Why does Apple care what _I_ have on _my_
            | device?
        
             | vineyardmike wrote:
              | > they wouldn't and couldn't care what the data on their
              | servers contains
             | 
              | Except they're legally forced to care. That's why this is
             | happening at all. They have a legal obligation to care,
             | encryption be damned. They chose to preserve encryption
             | instead of preserve trust.
        
               | stjohnswarts wrote:
                | They really don't, though. No one can force them to
                | scan a user's files for anything; however, if they do
                | scan and spot CP, then obviously they have a duty to
                | report it. Just like you have a duty to report child
                | abuse if you're a teacher. No scans, no CP, no issues.
                | I think this has been forced on them by some means,
                | either by the US government or the CCP.
        
               | vineyardmike wrote:
                | It sounds like (from discussions and links on HN and in
                | the media) they actually have an obligation to scan any
                | data they host.
               | 
               | IANAL and could be wrong.
        
               | ipv6ipv4 wrote:
                | No, they do not. I linked to the text of the law itself
                | [1]. In particular, note the section titled "Protection
                | of privacy", which states:
               | 
               | (f) Protection of Privacy. --Nothing in this section
               | shall be construed to require a provider to-- (1) monitor
               | any user, subscriber, or customer of that provider; (2)
               | monitor the content of any communication of any person
               | described in paragraph (1); or (3) affirmatively search,
               | screen, or scan for facts or circumstances described in
               | sections (a) and (b).
               | 
               | [1] https://www.law.cornell.edu/uscode/text/18/2258A
        
         | decebalus1 wrote:
         | > this kind of backdoor is the kind of thing that makes big
         | corps prohibit the use/purchases of devices.
         | 
         | I wouldn't be too hopeful about this. They can just disable
         | this feature if the device is enrolled in MDM. For enterprise
         | devices, the company itself is responsible for what's on them.
         | And depending on the company, they may already scan all content
         | for a bunch of stuff (like exfiltrating trade secrets or
         | whatever).
        
         | resfirestar wrote:
          | > don't like someone who uses an iOS device? Message them
          | some CP and destroy their entire life.
         | 
         | For this to work one would have to send multiple photos and
         | convince the other person to save the photos to their library.
        
           | Tsiklon wrote:
            | WhatsApp on iOS, for example, will by default save images
            | that people send to you to your photo gallery.
        
             | FabHK wrote:
             | What about WhatsApp on Android? If it also stores to the
             | local photo library, and then syncs to the cloud, then it
             | would have been subject to this "vulnerability" for a long
             | time, no?
        
         | azinman2 wrote:
          | > Secondly, it seems rife for abuse: don't like someone who
          | uses an iOS device? Message them some CP and destroy their
          | entire life.
         | 
          | I see this a lot. First of all, if you even have CP in the
          | first place, that seems quite bad, and you've made yourself
          | vulnerable. Messaging someone will also reveal yourself as the
          | sender. But my bigger question about this hypothetical issue
          | is that Facebook, Google, Dropbox, Microsoft, etc. have been
          | scanning for CP all this time on their servers. So it's not
          | like this is somehow a new proposition. Thus wouldn't this
          | already be a big issue through the many, many more people that
          | use all of these other companies' cloud products?
        
           | skinkestek wrote:
           | I think a lot of people overestimate how hard it is to stay
           | hidden for small jobs.
           | 
            | Furthermore, there is a lack of imagination here:
           | 
            | - work from a hacked account (I've seen Gmail accounts that
            | were "shared" with an attacker long enough for the attacker
            | to have conversations with the victim's bank, i.e. the
            | attacker used the account but stayed hidden until the bank
            | called to verify details.)
           | 
           | - VPNs exist.
           | 
            | - Russia exists.
           | 
           | - nasty images can be embedded into details of innocent high
           | resolution photos.
           | 
            | - very soon they'll have to scan PDFs, PDF attachments and
            | zip files; otherwise this is just a joke.
           | 
           | - at this time it becomes even easier to frame someone
           | 
           | - etc
           | 
           | Remember, the goal is probably not to get someone convicted,
           | just to mess up their lives by getting business contacts,
           | family and neighbors to wonder if they were really innocent
           | or if they just got off the hook.
           | 
           | That said, the bigger problem with having a secret database
           | of objects to scan for is that it is secret: anything can go
           | into it, allowing all kinds of fishing expeditions.
           | 
            | I mean: if you have such a wonderful system, it would be a
            | shame not to use it to stop copyright infringement in the
            | West, blasphemy (Christian or atheist literature) in Muslim
            | countries, or lack of respect for the dear leader (Winnie-
            | the-Pooh memes) in China.
        
           | DarkmSparks wrote:
           | >Messaging someone will also reveal yourself as the sender
           | 
            | Sending a link to 4chan /b/ would probably be enough. Or a
            | hidden iframe in a forum post.
            | 
            | It's very, very easy to get images onto someone's device
            | without their consent; getting them to upload them to a 3rd
            | party is completely different.
        
             | stjohnswarts wrote:
              | Currently it would not catch that 4chan thing, as they
              | only scan things that are uploaded to iCloud Photos. So
              | you'd have to click on it and save it to Photos. This is
              | the current implementation. I would guess that once they
              | have their hooks in, though, all files with an image
              | extension/header will be scanned and reported.
        
               | corobo wrote:
                | Send it via WhatsApp, where it gets added to Photos and
                | later iCloud by default, if I recall correctly.
        
               | stjohnswarts wrote:
                | Yeah, I'm not sure. I do know it's a separate step to
                | move a photo from WhatsApp to "general" photo access. I
                | would guess it is roped off, or they wouldn't need that
                | step.
        
               | ninkendo wrote:
               | Citation desperately needed.
        
               | corobo wrote:
               | Try it? I'm not Wikipedia, first party research is
               | allowed. Go for it
        
               | Karunamon wrote:
               | In general, any time you save a photo on iOS, it goes
               | into the camera roll, which _is_ automatically backed up
               | into iCloud.
        
               | ninkendo wrote:
               | That wasn't the claim though, the claim was that if I
               | text someone a photo via WhatsApp it will do all of this
               | automatically without them choosing to save it.
               | 
               | If WhatsApp does this by default then, ho boy is that a
               | terrible implementation.
        
             | spicybright wrote:
              | Idk, on most phones you just save an image and a cloud
              | service auto-syncs.
        
             | azinman2 wrote:
             | So again, how is this different from Google Photos,
             | Facebook messenger, etc etc?
             | 
              | If this really were such an issue we would have hit it
              | long ago. CSAM scanning is going to occur only for photos
              | that would be uploaded to iCloud.
        
               | DarkmSparks wrote:
                | Because it's scanning the content of the device, not
                | what you did on the device.
                | 
                | An iframe full of CP downloaded from 4chan that fills
                | the browser cache with CP just by visiting a harmless-
                | looking site would not go anywhere near Google Photos,
                | Facebook Messenger, etc.
                | 
                | It would trigger a scanner checking the browser cache
                | for CP images.
                | 
                | Those are just the obvious ways; by "rife" I mean there
                | is any number of ways to get CP onto someone's device
                | without them knowing. Unless the device itself is being
                | scanned, nothing would be achieved by this.
               | 
                | And if you don't think people are that messed up, check:
               | https://en.wikipedia.org/wiki/Swatting
        
               | resfirestar wrote:
               | Good thing this system doesn't scan browser cache, then.
               | Unless you're programmatically extracting photos from
               | your browser cache and uploading them to iCloud, which
               | would be a pretty impressive way to make sure you're
               | using up all your storage.
        
               | skinkestek wrote:
                | Ok, here is something that happened that could have
                | triggered this:
               | 
               | Back when I used WhatsApp I got a lot of nice photos from
               | friends and family there.
               | 
                | WhatsApp doesn't (or at least didn't) have a way to
                | export them, but I could find them on disk, and I had
                | an app that copied that folder to a cloud service. The
                | problem is this folder contained all images from all
                | chats.
               | 
               | Now I'm not very edgy so most of my pictures are
               | completely non-problematic (in fact I don't know of any
               | problematic ones), but once in a while someone will join
               | one of your groups and post shock pr0n.
               | 
                | I've long since stopped backing up everything, since I
                | use Telegram now and it allows me to selectively back
                | up the chats I want.
                | 
                | But I wanted to mention just one very simple way where
                | doing something completely reasonable could result in
                | an investigation of an innocent person.
        
               | azinman2 wrote:
                | But that's not the proposed design. The browser cache
                | isn't getting scanned. It's photos uploaded to iCloud.
                | 
                | I'm not asking if people do bad things. I'm saying that
                | over and over again this is getting coverage on HN and
                | people are pointing to this hypothetical issue -- yet
                | this hypothetical issue has been possible for years on
                | many more devices.
        
               | DarkmSparks wrote:
               | As far as I have read its scanning any images and
               | messages on the device, as well as text entered into
               | Siri.
               | 
               | So accidentally stick a w in your siri teen porn search
               | and you'll be seeing https://www.apple.com/v/child-
               | safety/a/images/guidance-img__...
               | 
               | If it was photos taken on the device there would be no
               | existing hash for the image to match.
        
               | wutbrodo wrote:
                | Totally tangential, but I didn't realize that there
                | were well-advertised "anonymous helplines for at-risk
                | thoughts". I'm kind of curious about it as a pathology
                | (what does "help" look like?), but I'm uneasy about
                | even getting that in my search history.
        
               | azinman2 wrote:
               | https://www.apple.com/child-
               | safety/pdf/Expanded_Protections_...
               | 
               | "Does this mean Apple is going to scan all the photos
               | stored on my iPhone?
               | 
               | No. By design, this feature only applies to photos that
               | the user chooses to upload to iCloud Photos, and even
               | then Apple only learns about accounts that are storing
               | collections of known CSAM images, and only the images
                | that match to known CSAM. The system does not work for
                | users who have iCloud Photos disabled. This feature
                | does not work on your private iPhone photo library on
                | the device."
        
       | beckman466 wrote:
       | I don't get it, I thought my data was safely in the clouds...
       | 
       | You're not telling me they can access the clouds right? /s
        
       | augstein wrote:
       | I'm still baffled by this whole issue.
       | 
        | Snooping around on users' devices goes against everything
        | regarding privacy they seemed to stand for. It makes me sick to
        | think this might even be legal. It is also sad they completely
        | ignore any valid concerns about this intrusion.
        | 
        | What's the next step if we as a society accept this behavior? A
        | HomePod listening in until hashes of conversation snippets match
        | some prehashed terms and sounds potentially related to child
        | abuse, and then alerting the authorities?
        
       | notquitehuman wrote:
       | This was such a weird way for them to announce that they're no
       | longer pursuing privacy as a differentiation strategy.
        
         | benhurmarcel wrote:
          | On the contrary: the whole reason they came up with that
          | complicated on-device scan, instead of simply scanning their
          | servers, is that they are trying to push E2EE.
        
         | colordrops wrote:
            | And that's all it ever was: perceived differentiation. They
            | were never truly concerned with privacy.
        
           | notquitehuman wrote:
           | It was easier to dismiss in the past because the evidence to
           | the contrary has been largely circumstantial, of sufficiently
           | dubious origin, or not direct enough for people to accept as
           | proof.
           | 
            | They publicly announced a backdoor this time. They can't
            | un-invent the capability. You can now prove that Apple is
            | lying about its concern for privacy by citing their own
            | website.
        
           | stjohnswarts wrote:
            | Not really. You were essentially immune to a lot of the ad
            | spying and the individual databases that Google built up on
            | you to sell to 3rd parties that advertised to you. Also,
            | apps were given much less freedom to spy on you or access
            | your on-device files. Apple just threw all the goodwill
            | they built up over Google down the shitter, and they try to
            | convince us otherwise with "think of the children".
        
             | colordrops wrote:
             | They've had a backdoor for the FBI even before they started
             | positioning themselves as the privacy company.
        
         | busymom0 wrote:
          | What makes it even weirder is that just a couple of months
          | ago, everyone was praising Apple for their whole "Privacy"
          | ads when they announced iOS 15 with the do-not-track defaults.
          | 
          | I even think that Apple would have faced less backlash over
          | this CSAM stuff if they hadn't used the "privacy" shtick just
          | a couple of months ago.
        
       | trhway wrote:
        | A bit of stretched speculation - maybe Apple is just ahead of
        | the curve, getting ready (and thus naturally bringing it closer
        | - dialectics, though) for a future where "common carrier"
        | protections would get eroded even more. We already have FOSTA
        | laws against websites, and it isn't a big stretch to imagine
        | that Apple-style scanning would become "reasonable care/etc."
        | under FOSTA-style laws.
        
       | Tsiklon wrote:
       | "The ark of the covenant is open and now all your faces are going
       | to melt" - to paraphrase "The Thick of It".
       | 
        | Prior to this, when a government tapped Apple on the shoulder
        | asking it to scan for what they (the government) deem illicit
        | material, Apple could have feasibly said "we cannot do this;
        | the technology does not exist".
        | 
        | Now the box is open; the technology does exist, publicly and on
        | the record. We are now but a national security letter, or a
        | court order with an attached super-injunction, away from our
        | computers and phones ratting us out for thoughtcrime, government
        | critique, and embarrassing the establishment. At first it's the
        | photos, but what about messages and browsing history, or the
        | legion of other things that people use their devices for? When
        | do they follow?
       | 
       | CSAM is beyond abhorrent, morally reprehensible and absolutely
       | should be stopped. This is not in question. And I have no reason
       | to doubt that there is sincerity in their goal to reduce and
       | eliminate it from Apple's perspective.
       | 
        | But they have root; this system and its implementation are a
        | fundamental betrayal of the implicit trust that their users have
        | placed in them (however foolish it was, in hindsight). It just
        | is not possible to regain that trust now that it is gone, when
        | the level of control is so one-sided and the systems involved
        | are so opaque.
        
         | darwingr wrote:
          | So even if they go back on this, the announcement served as a
          | strong signal of capability.
        
         | FabHK wrote:
         | > the technology does exist, publicly and on the record
         | 
          | And not only that, but with a little footnote saying
          | "* Features available in the U.S." - in other words, publicly
          | and on the record: this can be switched on and off per
          | jurisdiction. If another jurisdiction comes and says "here's
          | my list of hashes of illegal images", how can Apple say
          | "sorry, can't do..."?
        
         | jjcon wrote:
          | Well stated. I am one of those who trusted Apple, and though
          | it seems totally obvious now (the problems present with not
          | having full control of one's device), for whatever reason it
          | took all this for me to realize how privacy truly requires
          | that control.
          | 
          | You are also right that it would take a lot for me to trust
          | Apple again, and they are unlikely to do the things required
          | for that, since transparency and openness are not typically
          | how they operate.
        
         | feanaro wrote:
         | > CSAM is beyond abhorrent, morally reprehensible and
         | absolutely should be stopped. This is not in question.
         | 
         | Agreed. And the same holds for this subversion of user control
         | of devices. It too is beyond abhorrent and morally
         | reprehensible.
        
           | Tsiklon wrote:
           | I agree with you, however they are different kinds of
           | reprehensible, I don't believe they are equivalent.
           | 
           | One stokes an intimate and personal fear or revulsion,
           | something that any parent or anyone with empathy will feel.
           | 
            | The second allows for quiet enforcement of the subversion of
            | the relationship the state has with its citizens.
           | 
           | Both are awful, but in their own incomparable way.
        
             | sneak wrote:
             | CSAM is actually pretty rare in society. Most people aren't
             | sexually attracted to children.
             | 
             | iPhones running current iOS are not rare at all. Much of
             | our society carries and uses them.
             | 
             | They are not equivalent in scale.
             | 
             | Subverting every iPhone in the world (or country) to spy
             | for the pigs is a much worse crime than all of the CSAM in
             | the world (or country).
        
       | sebow wrote:
        | CSAM scanning, like anything else premised on "protecting the
        | children", is not mainly about actually protecting the children.
        
       | HomeDeLaPot wrote:
       | I feel like this is a good thing given the circumstances we're
       | already in. It might catch some predators while still keeping
       | everyone's photos private. Sure, it's ripe for abuse, but when
       | has it not been that way? Haven't we always had to trust Apple to
       | do the right thing with our iDevices?
       | 
       | This could marginally reduce Apple's leverage against evil
       | governments seeking to pry into iPhones since "if you can do it
       | for CP, you can do it for tank man". But with Apple already being
       | the dictator of all OS-level code running on your device, this
       | capability isn't anything fundamentally new. Do we have any
       | guarantee that certain devices in China haven't received a
       | "special" version of some software update that introduces a
       | backdoor?
       | 
       | Personally, I have an Android and hope that my next phone will
       | run some flavor of Linux.
        
       | scumcity wrote:
        | Why on earth is Apple, a good tech company with a culture of
        | secrecy, using Slack? Can it not build its own secure version of
        | Slack?
        
         | beckman466 wrote:
          | > Why on earth is Apple, a good tech company with a culture of
          | secrecy, using Slack? Can it not build its own secure version
          | of Slack?
         | 
         | What else are they going to do with all that cash they're
         | sitting on?
         | 
         | Also in this way they are being patriotic and supporting
         | 'American innovation'; i.e. the American Intellectual Property
         | regime [1].
         | 
         | [1] https://tribunemag.co.uk/2019/01/abolish-silicon-valley
        
       | rootusrootus wrote:
       | I think Apple actually thought they were doing the right thing.
       | Maybe they are, and we're overreacting. Things have to calm down
       | a bit first before that will be clear. But either way, it was
       | [yet another] PR miscalculation. A very Apple thing to do...
        
       | poidos wrote:
       | I sure am happy I don't work at their retail stores any more; I
       | shudder to imagine the abuse the folks in the stores are getting
       | over this.
        
       | [deleted]
        
       | andrewmcwatters wrote:
       | Stay tuned! On the next episode:
       | 
       | "Apple bravely scans your text messages. Users greeted by
       | 'Messages noticed this website contains misleading content, are
       | you sure you want to share it?' in iOS 16." -MacRumors (2022)
        
       | FabHK wrote:
        | 0. I was initially ambivalent about, or even supportive of,
        | Apple on this, because much of the initial criticism was
        | uninformed and easily refuted, or hysterical. But I'm coming
        | around to this being a momentous blunder.
       | 
        | 1. I understand Apple doesn't like to discuss its plans
        | publicly, but rather to present the final product when it is
        | ready to ship. But if anything was a good candidate for one of
        | the rare exceptions, it was this, no?
       | 
        | 2. Why announce unambiguously that this feature is "available"
        | in the US only at first - in other words, that it can be
        | enabled/disabled/configured per region? What is the plan when
        | country X comes and says "fine feature you've built there;
        | here's our list of perceptual hashes of illegal imagery, and
        | the contact data of our competent authority to notify in case
        | of matches"? Replying "ah, sorry, we can't" is not going to fly
        | anymore, ever (!).
       | 
       | Even if they walk this back, what defence will they have left in
       | the future if any government requires them to implement this
       | odious feature and scan all images on-device and notify some
       | authority if anything untoward (in that jurisdiction) is found?
        
       | jpxw wrote:
       | Question for Android users: do you have Google Photos backup
       | enabled?
       | 
       | I'd say most people have iCloud Photos enabled, so I'm trying to
       | gauge whether that's true of Google Photos too.
       | 
        | Google Photos also does CSAM scanning, I believe.
        
         | vineyardmike wrote:
          | It's not about CSAM scanning; it's about where the scanning
          | is done!
          | 
          | > I'd say most people have iCloud Photos enabled, so I'm
          | trying to gauge whether that's true of Google Photos too.
          | 
          | I'd say most people don't have CSAM either way.
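          | 
          | A minimal sketch of that distinction (hypothetical paths, and
          | a plain SHA-256 standing in for the real perceptual hash):
          | 
          |   import hashlib, pathlib
          | 
          |   DB = {bytes.fromhex("00" * 32)}  # pretend known-bad list
          | 
          |   def server_side(uploaded) -> int:
          |       # the provider only sees what the user chose to upload
          |       return sum(hashlib.sha256(b).digest() in DB
          |                  for b in uploaded)
          | 
          |   def client_side(photo_dir: str) -> int:
          |       # the scanner runs on the user's own hardware and can,
          |       # in principle, be pointed at any local file
          |       hits = 0
          |       for p in pathlib.Path(photo_dir).glob("*.jpg"):
          |           h = hashlib.sha256(p.read_bytes()).digest()
          |           hits += h in DB
          |       return hits
          | 
          |   print(client_side("/tmp"))  # 0 matches on a photo-less dir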
        
         | coolspot wrote:
         | On both platforms those are enabled by default.
         | 
          | 96% of users keep defaults[1].
         | 
         | [1] - My estimate
        
       | amelius wrote:
       | Now I know what the "i" in iPhone stands for: intelligence.
        
         | joshxyz wrote:
         | and the s in iPhone 4s stands for signals, hahaha
        
       | sharken wrote:
        | Message to Apple, should they care:
        | 
        | The next Apple iPhone upgrade is on hold until all plans for
        | on-device CSAM scanning are cancelled.
       | 
       | Privacy matters.
        
       | manmal wrote:
       | > But Apple was surprised its stance then was not more popular,
       | and the global tide since then has been toward more monitoring of
       | private communication.
       | 
        | Personally speaking, Apple refusing to break into a phone in
        | 2016 was indeed one of the reasons I believed them when they
        | said they stand up for privacy. This is now developing a foul
        | aftertaste - when it _really_ counts, they don't publicly stand
        | up against governments.
        
       | mensetmanusman wrote:
       | Our only option is minority screech.
        
         | istingray wrote:
         | Pikachu chooses "minority screech":
         | https://i.imgflip.com/5jfpjo.jpg
        
       | blub wrote:
        | Those outraged about this decision haven't yet accepted that
        | one can't solve legal issues around encryption with technology
        | - not even with open source.
       | 
       | The UK for example can throw people in prison if they don't hand
       | over their passwords/keys.
       | 
        | Admitting this, however, would by necessity lead to admitting
        | that citizens of the US/EU have almost zero say in how their
        | governments legislate encryption. All these countries have
        | wanted to subvert encryption for decades already, and they have
        | all the time in the world, unlimited budgets, and every other
        | possible tool in their arsenal.
        
       | tempestcoder wrote:
        | Apple is already scanning photos for faces and other objects.
        | Just search in the Photos app for something like "dog" or
        | "baby" and you'll see (assuming you've taken a photo of a dog
        | or baby before).
       | 
       | How is this much different?
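        | 
        | One way to sketch the difference (a toy, with names of my own
        | invention): the Photos search index is computed and queried
        | locally, while the CSAM check compares against an opaque
        | external list, and a hit is escalated off-device rather than
        | shown to the user.
        | 
        |   def photos_search(labels_by_photo, query):
        |       # on-device ML labels; results never leave the phone
        |       return [p for p, labels in labels_by_photo.items()
        |               if query in labels]
        | 
        |   def csam_check(photo_hash, opaque_db):
        |       # match against a list the user can't inspect; a hit is
        |       # reported upstream, not shown to the user
        |       return photo_hash in opaque_db
        | 
        |   library = {"IMG_1.jpg": {"dog", "beach"},
        |              "IMG_2.jpg": {"baby"}}
        |   print(photos_search(library, "dog"))    # ['IMG_1.jpg']
        |   print(csam_check(b"\x01" * 32, set()))  # False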
        
       | rz2k wrote:
       | I stopped my Apple Music subscription and downgraded iCloud to
       | free, then donated a year of those fees to the EFF.
       | 
        | $13/month is nothing to Apple, and I really regret it, since
        | Photos in iCloud is really convenient. However, it feels
        | unethical to contribute to the Apple services ecosystem when it
        | will inevitably be used to hurt people like dissidents and
        | whistleblowers.
       | 
       | It is arrogant to blow off security and human rights experts
       | outside of Apple as simply confused or mistaken, when there are
       | sufficiently informed and imaginative people within Apple to
       | understand the practical implications of preemptively
       | investigating users. Such contempt for users is also a sign of
       | worrisome complacency.
       | 
        | On the plus side, Apple had seemed to crowd out interest in
        | third-party OS user interfaces. Knowing that Apple believes
        | you're a pedophile until it has run image-analysis software on
        | your stuff is one way to motivate people. Maybe 2021 will be
        | the year of Linux on the desktop.
       | 
       | Anyway I hope that more users sacrifice the convenience of Apple
       | services now that its hostility is apparent, and I hope that more
       | people tell their political representatives that it is time for
       | meaningful updates to anti-trust legislation. I know that I was
       | complacent when I thought that having a choice, where at least
       | one of the companies didn't place ads in the start menu, was a
       | tenable state of the market.
        
         | hypothesis wrote:
         | I totally agree that those actions individually are not much.
         | However, with enough people doing those small actions, it will
         | add up in the long run. Plus we're going to be better off with
         | open alternatives.
         | 
          | They just lost a few thousand from me alone by my not
          | upgrading all my iDevices for the next model year. The
          | savings go to privacy advocacy organizations.
        
         | swiley wrote:
          | I formatted my iPhone and drained the battery.
        
       | icf80 wrote:
        | People will just go back to using normal cameras for photos,
        | where there is no scanning; the bad guys will use those anyway.
        | It makes no sense for Apple to do this, unless they are
        | preparing for something bigger.
        
         | eqtn wrote:
          | What if there is a chat app (like WhatsApp) that saves photos
          | to the gallery? Someone can send such a photo, Apple scans it
          | and notifies the authorities.
        
       | mensetmanusman wrote:
        | I'm more mad that they made a tool that will clearly be abused
        | by totalitarian governments that need to squash protests in HK,
        | for example.
        
         | stjohnswarts wrote:
          | Certainly this can be used just as easily to analyze documents
          | on the phone for anti-government sentiment, and I think Apple
          | will eventually use it for that because "the Chinese
          | government demanded it".
        
       | sathio wrote:
        | I understand they are going to compute perceptual hashes of
        | images, but what are they going to do with videos?
        | 
        | Did they not see Fight Club, where frames were spliced into
        | videos? Are they going to hash each frame of videos as well?
        | All 60fps 4K videos? I'm sure that a lot of people saw that
        | movie, and some of them might be pedophiles. It really looks
        | like a half-assed solution.
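        | 
        | Nothing published says Apple hashes video frames, but a naive
        | per-frame scan (sketched here with OpenCV and a hypothetical
        | clip.mp4) shows the cost:
        | 
        |   import cv2  # pip install opencv-python
        | 
        |   cap = cv2.VideoCapture("clip.mp4")
        |   frames, hashes = 0, set()
        |   while True:
        |       ok, frame = cap.read()
        |       if not ok:
        |           break
        |       frames += 1
        |       gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        |       small = cv2.resize(gray, (8, 8))
        |       hashes.add((small > small.mean()).tobytes())  # toy hash
        |   cap.release()
        |   print(frames, "frames ->", len(hashes), "distinct hashes")
        | 
        | A 10-minute 4K/60fps clip is 36,000 frames, so even this toy
        | hash means tens of thousands of database lookups per video.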
        
       | neximo64 wrote:
        | The only reason Apple would be so cavalier about CSAM scanning
        | is knowing that the data is already accessible by other means.
        | I guess that's why they disregard the privacy aspect of it.
        
       | gibsonf1 wrote:
        | It's a terrible move for their business; I am already looking
        | for an alternative.
        
         | istingray wrote:
         | Nice, what have you found so far? I just started a Nextcloud
         | today, 2GB free from most providers listed here:
         | http://nextcloud.com
        
           | gibsonf1 wrote:
           | Ahh, for a cloud, https://trinpod.us :)
        
       | jack_pp wrote:
        | CSAM is such a blatant excuse to me; it blows my mind that so
        | many people on here believe this is actually about child
        | protection, or that it is a valid argument. Since this is
        | public information, do you think pedophiles will just get
        | caught by this? Will this curb their appetite?
        | 
        | If somehow I found myself in an authoritarian state where porn
        | was illegal just on Apple devices, do you think I wouldn't just
        | switch to a different OS?
        | 
        | This is a bullshit excuse and we've seen it countless times. It
        | is Manipulation 101: get the victim to agree to a small request
        | first, then ask for more and more.
       | 
       | I'm glad I never used apple devices and I'm not caught in their
       | ecosystem.
        
       | WhyNotHugo wrote:
        | A bit off topic, but I'm amazed at the huge media attention
        | this is getting, while a far more serious precedent is being
        | set in the EU, completely banning all private communications
        | between individuals:
       | 
       | https://www.patrick-breyer.de/en/posts/message-screening/?la...
       | 
        | The short version: all communication providers may include back
        | doors and analyse users' messages under the same excuse. The
        | follow-up is to make this mandatory. In practice, this will be
        | a ban on any end-to-end encrypted communication.
        
       | post_break wrote:
        | Cancelled my Apple TV+ and iCloud. In the process of selling my
        | iPhone and Apple Watch. I know it seems crazy, but I feel
        | betrayed, and this is the only way I can protest this. Will I
        | have less privacy on Android? Yes.
        
         | leucineleprec0n wrote:
          | It's arguable that Android, even stock, is better for privacy.
        
         | 6yyyyyy wrote:
         | I've had a pretty good experience with LineageOS for microG:
         | https://lineage.microg.org/
        
         | istingray wrote:
          | Taking action ftw. Any alternatives you're purchasing or
          | thinking about? I'm quite interested in http://puri.sm. Just
          | set up a Nextcloud today; they have free 2GB hosting plans:
          | http://nextcloud.com
        
           | post_break wrote:
            | I have a Synology, and I think they have an app to push
            | photos directly to my home NAS.
        
           | 6AA4FD wrote:
            | Syncthing will also do the trick, with perhaps more
            | configuration involved.
        
       | thevagrant wrote:
        | I think this was developed by Apple attempting to balance a
        | complex issue: e.g. a government request asking them to report
        | with certainty that their cloud storage is not used for illegal
        | activity (CP in this particular case, but each country/govt may
        | end up with different requirements).
        | 
        | Apple thought they found a clever way to satisfy a government
        | request of "Can you guarantee your storage is not used to house
        | XYZ?" Apple then continues to be able to advertise 'privacy'
        | while retaining E2E encryption (or at least working toward E2E
        | across all services eventually).
        | 
        | What they didn't foresee is the slippery-slope backlash that
        | the IT community has become concerned about.
        | 
        | The scanning feature could be used in more nefarious ways (if
        | modified in the future). For example, the hash checking could
        | be altered to hash and check each photo metadata field instead
        | of the photo itself (sketched below).
       | 
       | Now we can find who took photos in a particular location at a
       | particular point in history while still retaining E2E!
       | 
       | Would it go that far? Or can we trust Apple to stop moving the
       | line in the sand?
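        | 
        | To make that worry concrete (purely hypothetical - there is no
        | evidence Apple does this), the same match-against-a-list
        | machinery pointed at EXIF GPS data instead of pixels would look
        | something like:
        | 
        |   import hashlib
        |   from PIL import Image
        | 
        |   def gps_hash(path):
        |       gps = Image.open(path).getexif().get_ifd(0x8825)
        |       if not gps:
        |           return None
        |       # tags 2/4 hold latitude/longitude as (deg, min, sec);
        |       # round to a coarse grid so nearby photos collide
        |       # (hemisphere signs ignored for brevity)
        |       d, m = gps[2][0], gps[2][1]
        |       lat = round(float(d) + float(m) / 60, 2)
        |       d, m = gps[4][0], gps[4][1]
        |       lon = round(float(d) + float(m) / 60, 2)
        |       return hashlib.sha256(f"{lat},{lon}".encode()).digest()
        | 
        |   watchlist = {gps_hash("reference_photo.jpg")}
        |   if gps_hash("my_photo.jpg") in watchlist:
        |       print("owner photographed a location of interest")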
        
       | wonderwonder wrote:
        | There is no way this doesn't get out of hand. This program is
        | now the number one hacking target for every nation on earth.
        | Currently it scans for CSAM, but can 'Free Hong Kong' images
        | get added? What about Q or anti-Putin literature? What if the
        | US government believes incredibly sensitive documents were
        | photographed and lives depend on finding out who did it? Is
        | Apple going to start reporting teens for selfies? This is a
        | stone's throw from escalating to Minority Report. I know this
        | seems a little far-fetched, but it is very rare for governments
        | or companies to willingly become less intrusive; it always
        | escalates.
        
         | [deleted]
        
       | ineedasername wrote:
        | The problem is that, as disturbing as this capability is, Apple
        | chose its first target well: it's difficult to be too vocal
        | about this without the risk of getting labelled weak on
        | underage sex trafficking and kiddie porn.
       | 
       | So, no matter the criticism, Apple won't seriously be pressured
       | to reverse course, and after that the door is open on incremental
       | increases in content monitoring.
       | 
       | (Not to mention the problems that could arise from false
       | positives.)
        
       | pdx6 wrote:
        | Has someone built a guide to self-hosting on iOS/macOS so the
        | CSAM issue is moot? Or what are the other platform options,
        | like on Android?
        | 
        | I'm concerned that Apple's stance on security has been
        | compromised and it is time to dump the platform or find an
        | acceptable modification.
        | 
        | I'm surprised there's no gist linked on HN about how to get
        | people bootstrapped with secure device backups as an
        | alternative to iCloud (and without a jailbreak).
        
       | enlyth wrote:
        | If it's on-device, is it possible to reverse engineer the
        | hashing algorithm? I'm assuming some clever people can then
        | brute-force some matching images with nonsense in them that
        | test positive.
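        | 
        | For a weak perceptual hash, that brute force is nearly trivial.
        | A hill-climbing sketch against a toy 8x8 average-hash
        | (NeuralHash would be harder to attack, but the principle is
        | the same):
        | 
        |   import numpy as np
        | 
        |   def ahash(img):  # 8x8 average-hash, as 64 bits
        |       return (img > img.mean()).flatten()
        | 
        |   rng = np.random.default_rng(0)
        |   target = ahash(rng.random((8, 8)))  # hash to forge
        | 
        |   img = rng.random((8, 8))            # unrelated noise
        |   score = (ahash(img) == target).sum()
        |   for _ in range(500_000):
        |       if score == 64:                 # full collision
        |           break
        |       trial = img.copy()
        |       trial[rng.integers(8), rng.integers(8)] = rng.random()
        |       s = (ahash(trial) == target).sum()
        |       if s >= score:                  # greedy hill climb
        |           img, score = trial, s
        |   print("matching bits:", score, "of 64")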
        
       | justinzollars wrote:
        | I have advice for Apple employees: Think Different.
        | 
        | You've become Microsoft. I might as well buy a Facebook phone.
        
         | echelon wrote:
          | Part of the reason this might succeed is the Apple fans.
          | They're defending Apple yet again, despite the betrayals of
          | our privacy being committed.
        
       | underseacables wrote:
        | What worries me most is that hackers and others with ulterior
        | motives will engineer a situation creating a false positive, or
        | worse. If you want to absolutely destroy someone, accuse them
        | of possessing pornographic material of children. Even if it
        | turns out to be absolutely false, the stain is still there; the
        | bell cannot be unrung. The damage, both short- and long-term,
        | to reputations, careers, and lives will be devastating. None of
        | us are safe from this.
       | 
       | "yeah but there was that one time Apple thought she had kid porn
       | on her phone.. she said it was a false positive, police
       | investigated never prosecuted etc., but...better to not hire,
       | etc..you know..just in case"
        
       | akersten wrote:
        | As it should be! Apple is asking its engineers to create
        | spyware and deploy it on millions of devices. "Concern" is
        | majorly understating my feelings right now.
        
         | [deleted]
        
       | amaajemyfren wrote:
        | I'm not an expert here, but it may be time for white-box,
        | no-name mobile phones that come with nothing and that we have
        | to set up ourselves. My second computer was the first time I
        | downloaded from both kernel and kerneli and rolled my first
        | Linux-kernel-based GNU system...
        
       | brandon272 wrote:
        | As I see people straining to defend Apple, coming up with all
        | kinds of reasons why on-device image checking will not be
        | abused or expanded, and why people shouldn't really expect not
        | to be scanned, I can't help but think of Apple's own words and
        | current public position on privacy.
       | 
       | Here's their privacy page: https://www.apple.com/privacy/
       | 
       | First sentence:
       | 
       |  _" Privacy is a fundamental human right."_
       | 
       | Apple has adopted this as a "core value". Their executives use
       | this exact phrasing.
       | 
       | It is an extraordinary claim. They don't say, "We think privacy
       | is important". They claim it as a fundamental human right (i.e.
       | applying to everyone, unconditionally) and wrap themselves in the
       | banner of it.
       | 
        | How anyone can square that statement, that self-described
        | "core value", with their recent on-device image checking
        | announcement, I really have no idea.
        
         | fjabre wrote:
         | They can't.
         | 
         | It will hurt their sales. It will hurt Apple.
         | 
         | They will have to walk this back or face devaluation.
         | 
         | They've already done a ton of damage to their brand. It's
         | really hard to believe they would sacrifice their brand like
         | this. This is Apple.
         | 
         | They made their reputation on privacy.
         | 
         | You are 100% right. You cannot square the two.
        
       | ahnick wrote:
        | How is Apple's CSAM implementation not a violation of the
        | Fourth Amendment?
        
         | dragonwriter wrote:
         | Apple isn't the government.
        
           | ahnick wrote:
           | They are acting on behalf of the government.
        
             | dragonwriter wrote:
              | AFAICT, they are not. But if you have evidence that this
              | is _government directed_ (reporting results to the
              | government isn't sufficient for the private party's
              | action to be "on behalf of" the government for
              | constitutional purposes), please present it.
        
               | busymom0 wrote:
                | The problem is that this is very hard to prove, since
                | the government is a black box.
               | 
                | Take this example, where Pelosi, one of the richest
                | members of Congress, has personal calls with Tim Cook
                | while also holding her most valuable stock, at 17%:
               | https://greenwald.substack.com/p/nancy-and-paul-pelosi-
               | makin...
        
             | nullc wrote:
             | If they are acting on behalf of the government there is a
             | massive secret conspiracy to illegally violate the
             | constitutional rights of hundreds of millions of Americans.
             | 
             | It wouldn't be the first time... but it shouldn't be our
             | first assumption.
             | 
              | Instead, in litigation arising from this scanning, the
              | companies have testified that they conduct the scanning
              | of their own free will, with no coercion or incentive
              | from the government, simply because they don't want
              | their brand being connected to the distribution of child
              | porn.
              | 
              | Implicitly, they'd rather be associated with
              | unaccountable mass surveillance -- with products that
              | violate human rights and that, with the push of a
              | button, could be used to enable genocide.
              | 
              | They aren't stupid, so you have to assume that they think
              | the latter is less of a hit on their bottom line than
              | the former. Let's prove them wrong.
        
             | justinzollars wrote:
              | We don't have any direct evidence of this, or any way of
              | proving it, but it is suspicious.
        
           | justinzollars wrote:
            | Why did Apple invest so much engineering effort into
            | something so complex without being told to do so? This move
            | doesn't benefit their brand, or the years of advertising
            | they have done to differentiate themselves from Google and
            | Facebook as a privacy-focused company.
        
             | dragonwriter wrote:
              | > This move doesn't benefit their brand, or the years of
              | advertising they have done to differentiate themselves
              | from Google and Facebook as a privacy-focused company.
             | 
             | They probably don't want "safe haven for child porn" to be
             | part of their brand distinction from other online firms.
        
               | colordrops wrote:
               | Who is pushing this message though? I've never heard
               | Apple accused of this.
        
       | rdiddly wrote:
        | Can we talk about the irony that the NCMEC's database of CSAM
        | (two abbreviations I didn't know before this week) is literally
        | a huge child porn collection, or is that too immature? The
        | whole story sounds like bad science fiction to me, I gotta say.
        | But sadly I think we have to beware of anything that's packaged
        | as anti-child-porn, just like anything that's anti-terrorism,
        | anti-crime, etc. You should be thinking: "What is the thing
        | that's slightly less heinous than child porn that needed to be
        | wrapped in the magic cloak of comparative heinousness for
        | anyone to be horrified enough to accept it?"
        
         | throwaway93832 wrote:
          | Only the high priests are sacred enough to handle the
          | artifacts, to administer their truths to the gentiles.
         | 
         | https://www.latimes.com/california/story/2021-05-05/lapd-off...
        
       | mlindner wrote:
       | > A fundamental problem with Apple's new plan on scanning child
       | abuse images, critics said, is that the company is making
       | cautious policy decisions that it can be forced to change, now
       | that the capability is there, in exactly the same way it warned
       | would happen if it broke into the terrorism suspect's phone.
       | 
        | This is the most important part, and it seems to be missing
        | from every assurance that Apple won't work with governments to
        | spy on various types of content. _It's not Apple's choice to
        | make._ They have no control over whether they're forced to or
        | not.
        
       | icf80 wrote:
       | Apple 2019: 'what happens on your iPhone, stays on your iPhone'
       | 
       | https://9to5mac.com/2019/01/05/apple-privacy-billboard-vegas...
       | 
       | Apple 2021: but we will scan your photos anyway
       | 
        | I don't understand something: they are scanning "using a
        | database of known CSAM image hashes".
        | 
        | This means new photos will not be in that database, so what is
        | the point? Scanning for old, already-known photos will not
        | prevent new crimes!
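        | 
        | A toy version of the matching step (a simple average-hash here;
        | Apple's announced NeuralHash is designed to survive resizing
        | and recompression, but the lookup logic is analogous) shows
        | why: only near-copies of already-catalogued images can match.
        | 
        |   from PIL import Image
        | 
        |   def ahash(path, size=8):
        |       img = Image.open(path).convert("L").resize((size, size))
        |       px = list(img.getdata())
        |       avg = sum(px) / len(px)
        |       return sum(1 << i for i, p in enumerate(px) if p > avg)
        | 
        |   def hamming(a, b):
        |       return bin(a ^ b).count("1")
        | 
        |   known = {0x818181FF818181FF}  # made-up database entry
        |   h = ahash("new_photo.jpg")
        |   # a near-duplicate of a known image lands within a few bits;
        |   # a brand-new photo almost never does
        |   print(any(hamming(h, k) <= 5 for k in known))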
        
       | tibbydudeza wrote:
        | I'd prefer server-side to local device scanning.
        | 
        | What prevents the CSAM database on the device from being
        | compromised by a rogue state actor to smear and false-flag
        | opponents?
        
       | ZFH wrote:
       | "Apple says it will scan only in the United States and other
       | countries to be added one by one, only when images are set to be
       | uploaded to iCloud, and only for images that have been identified
       | by the National Center for Exploited and Missing Children and a
       | small number of other groups."
       | 
        | This is the first reference I've read to adding other countries
        | as a done deal, and especially to an incredibly opaque "small
        | number of other groups" being involved as well.
        | 
        | Man, after all they've said about being obsessed with privacy,
        | they're really not doing themselves any favors here. What a
        | tragedy.
        
         | BikiniPrince wrote:
         | The children trope is a really big red herring. Anything to get
         | the foot in the door.
        
           | calvinmorrison wrote:
            | The Four Horsemen of the Privacy Apocalypse: CSAM,
            | terrorists, piracy and drugs!
        
             | flixic wrote:
             | I'd say swap piracy and drugs, and you have the escalation
             | order.
        
       | [deleted]
        
       | mmeinesz wrote:
        | I happened to speak to an employee of the police here in
        | Germany. He is in charge of backing up data from suspects'
        | devices. It is mostly drug-related, but the second most common:
        | child porn. The amount of data he has worked through in the
        | last week alone is really frightening. And that is only for a
        | county of approx. 200k inhabitants.
        
       | righttoolforjob wrote:
       | This is the ultimate lock-in strategy. If you switch from iPhone
       | now, everyone will suspect you of being a pedo and you'll have to
       | endure the social consequences.
        
         | stjohnswarts wrote:
          | People switch back and forth between iPhone and Android all
          | the time.
        
           | righttoolforjob wrote:
           | Apologies for my joke.
        
       | akiselev wrote:
        | _> It's a complete change of narrative and there's no easy way
        | to explain it and still defend Apple's privacy narrative,
        | without doing extreme mental gymnastics._
       | 
       | Everyone who took Apple at their word was already doing extreme
       | mental gymnastics because Apple's privacy stance was a farce on
       | borrowed time to begin with. Now it's just blatantly obvious to
       | everyone.
        
         | dang wrote:
         | Please don't take HN threads further into flamewar. It makes
         | discussion shallower, more tedious, and nastier.
         | 
         | We detached this subthread from
         | https://news.ycombinator.com/item?id=28163326.
        
           | akiselev wrote:
           | What criteria do you use to evaluate whether a post takes a
           | thread further into a flamewar or not? In what way did my
           | reply make the rest of the discussion "shallower, more
           | tedious, and nastier"? I felt that this sparked a lively
           | (albeit short) debate about a blindspot that a great many
           | readers seem to have.
           | 
            | It took a six-hour rollercoaster ride before flags killed
            | it - not even a fair shake for anyone to vouch, despite no
            | flagged children.
        
             | dang wrote:
             | Two issues:
             | 
             | 1. Your comment didn't add any information--it was just
             | grandiose, inflammatory claims ("extreme mental
             | gymnastics", "farce to begin with" and "blatantly obvious")
             | 
             | 2. Replies like
             | https://news.ycombinator.com/item?id=28163531 are to be
             | expected to such comments. This is the way that discussion
             | degrades. (That was the top reply to your comment before I
             | downweighted it, so the effect was a lot more obvious
             | before that.)
             | 
             | Ok, three issues:
             | 
             | 3. This entire subthread is way more generic than the
             | better parts of the discussion. That's to be expected from
             | inflammatory comments that don't add information. Generic
             | threads are much more predictable and much less interesting
             | than specific ones: https://hn.algolia.com/?dateRange=all&p
             | age=0&prefix=true&sor....
        
               | akiselev wrote:
                | _> 1. Your comment didn't add any information--it was
               | just grandiose, inflammatory claims ("extreme mental
               | gymnastics", "farce to begin with" and "blatantly
               | obvious")_
               | 
               | "Extreme metal gymnastics" was the only inflammatory
               | claim and it was the same language used by the person I
               | was replying to. "Farce" has no more charitable a synonym
               | [1] and "to begin with" does not make it any more or less
               | inflammatory. "Blatantly obvious" - really? It was.
               | Plenty of replies agree based on the merits of the
               | argument. This isn't a flamewar
               | 
               |  _> 2. Replies like
               | https://news.ycombinator.com/item?id=28163531 are to be
               | expected to such comments. This is the way that
               | discussion degrades. (That was the top reply to your
               | comment before I downweighted it, so the effect was a lot
               | more obvious before that.)_
               | 
               | People get defensive sometimes, so what? Take the
               | charitable interpretation. (and this is a deflection - I
               | was asking about my comment).
               | 
               | "Please stop insulting... Don't act superior..." - how in
               | the world did that get flagged!? How does this level of
               | genuine politeness degrade the discussion?
               | 
               | "No one thinks you're any smarter by saying "I told you
               | so"." - ...really? By HN snark standards thats praise.
               | 
               |  _> 3. This entire subthread is way more generic than the
                | better parts of the discussion. That's to be expected
               | from inflammatory comments that don't add information.
               | Generic threads are much more predictable and much less
               | interesting than specific ones: ..._
               | 
                | That's a self-fulfilling judgement (see above).
               | 
               | Worst case I'm guilty of extreme snark. Call it a coping
               | mechanism.
               | 
               | [1] https://www.thesaurus.com/browse/farce
               | https://www.thesaurus.com/browse/facade
        
               | dang wrote:
               | > Worst case I'm guilty of extreme snark.
               | 
               | That's a really bad worst case, far below the line the
               | guidelines draw. Would you mind reviewing them
               | (https://news.ycombinator.com/newsguidelines.html) and
               | taking the intended spirit of the site more to heart? We
               | want thoughtful, curious conversation here, not snark and
               | fulmination.
               | 
               | > People get defensive sometimes, so what?
               | 
               | The problem is that it evokes worse from others and leads
               | to a degenerative spiral, ultimately to flamewars, and in
               | the long run to the site burning itself to a crisp.
               | Remember that this has traditionally been the fate of
               | internet forums and HN was started as a conscious
               | experiment in trying to avoid that
               | (https://news.ycombinator.com/newswelcome.html). Scorched
               | earth is not interesting (https://hn.algolia.com/?dateRan
               | ge=all&page=0&prefix=false&qu...).
               | 
               | It's possible to learn not to provoke this kind of thing,
               | and that's what we're asking users here to do. Of course
               | one can't predict specifically how others will react, but
               | one can definitely play the odds. Since the odds are
               | roughly knowable in advance, we want users to post
               | comments with a positive expected value, so to speak: htt
               | ps://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor
               | ...
        
               | akiselev wrote:
                | _> That's a really bad worst case, far below the line
               | the guidelines draw. Would you mind reviewing them
               | (https://news.ycombinator.com/newsguidelines.html) and
               | taking the intended spirit of the site more to heart? We
               | want thoughtful, curious conversation here, not snark and
               | fulmination._
               | 
               | Please give me the benefit of the doubt that after a
                | decade on HN I have read the guidelines many times, and
               | reply to the meat of my comment instead of the knee-jerk
               | rhetoric (aka joke) at the end.
               | 
                | _> The problem is that it evokes worse from others and
                | leads to a degenerative spiral, ultimately to flamewars,
                | and in the long run to the site burning itself to a
                | crisp._
               | 
               | IMO it demonstrably didn't.
        
               | dang wrote:
               | It did enough to make the point; but in any case, what
               | matters is not the outcome in any one particular case,
               | but the statistical outcome in the long run.
        
         | naeeqn wrote:
          | I agree. I wanted to believe that despite being a greedy
          | corporation - as they all are - and despite the hidden
          | compromises in security they make (like not encrypting backups
          | because the FBI asked them to) they would at least try their
          | best to avoid being scumbags.
         | 
         | Too bad. Scumbags they are.
        
         | busymom0 wrote:
         | While I knew that the "privacy" stance from Apple in prior
         | years was always a marketing facade to sell more devices, I
         | still continued to believe that they would keep sticking up for
         | privacy as much as possible.
         | 
          | It's kind of funny how I am very suspicious of governments and
          | politicians, but when it came to a trillion dollar corporation,
          | I was giving them more benefit of the doubt. I was wrong.
        
           | istingray wrote:
           | I appreciate you saying this, I felt the same way. I was
           | delighted when Apple started selling privacy hard, and hoped
           | it would persist. Their "Mind your own business" [1] ad was a
           | delight - only 2 months ago. Now my question is, which of the
           | annoying characters in the ad is Apple?
           | 
           | [1] https://www.youtube.com/watch?v=8w4qPUSG17Y
        
             | rossjudson wrote:
             | So few people realize this!
             | 
             | Say, what is Apple's advertising policy? Hmm. It looks very
             | familiar, doesn't it?
             | 
             | https://support.apple.com/en-us/HT205223
        
           | heavyset_go wrote:
           | > _I still continued to believe that they would keep sticking
           | up for privacy as much as possible._
           | 
           | Considering how often Apple gives up customer data without a
           | fight when requested to by the government, I don't really
           | believe that.
           | 
           | According to Apple, they respond to government data requests
           | with customers' data ~85% of the time, and 92% in cases of
           | "emergencies"[1].
           | 
           | Apple gave customer data from over 31,000 users/accounts
           | based on FISA requests and National Security Letter requests
           | in the first half of 2020 alone[1].
           | 
           | During that same 6 month period, Apple provided customers'
           | data to the government's data requests (not FISC related)
           | about 9,000 times[1].
           | 
           | [1] https://www.apple.com/legal/transparency/us.html
        
             | wyre wrote:
              | I don't know if the first half of 2020 is a good time frame
              | for sampling, as this is when the country had the largest
              | number of protests in its history and the police were
              | likely making a lot of requests in retaliation.
             | 
             | Doesn't make Apple look any better though. I'm curious
             | about how often Google and Facebook hand over information.
        
               | heavyset_go wrote:
               | There's a drop down menu that you can use to see other
               | time periods. The numbers and rates in the July to
               | December 2019 period are similar to 2020's.
        
             | heavyset_go wrote:
              | > _During that same 6 month period, Apple provided
              | customers' data to the government's data requests (not
              | FISC related) about 9,000 times_
             | 
             | To clarify: The 9,000 figure is the number of individual
             | data requests, but the requests themselves asked for data
             | from a total of ~120,000 different users/accounts. Multiple
             | users' data can be requested in a single request.
        
           | 7sidedmarble wrote:
           | Have you ever considered that the profit motive might be
           | responsible for leading _all_ companies to act unethically?
            | In the future, it's a good reason never to trust them.
        
           | akiselev wrote:
           | I don't necessarily think you were wrong to believe in Apple,
           | perhaps just a bit naive about how little "as much as
           | possible" is good for in this context.
           | 
           | Apple is a public corporation at the mercy of several
           | superpowers. Economic incentives and the people with the guns
           | make the rules. Full stop. Apple's privacy stance was always
            | one desperate/ambitious business unit or bureaucrat away
           | from complete collapse.
        
         | stjohnswarts wrote:
         | Please stop insulting people because they chose less privacy
         | than you. Don't act superior. I would posit that just about
         | everyone who reads hackernews knows that privacy and security
         | are on a spectrum. People complaining about this as the last
         | straw have every right to do so and this is a huge leap from
         | Apple's previous position. No one thinks you're any smarter by
         | saying "I told you so".
        
           | istingray wrote:
           | Important adjustment -- privacy and surveillance are what's
           | on a spectrum. Security is not the inverse of privacy.
        
             | NikolaNovak wrote:
             | I was going to say; I'm aware security is always a
             | compromise (vs speed or ease of use or budget etc), but
              | I've never seen it claimed that privacy and security are
              | natural opposites / inverses on the same spectrum.
        
               | istingray wrote:
               | "Security vs privacy" is unfortunately the most common
               | mistake made about privacy discussions.
        
           | akiselev wrote:
           | My daily drivers are an Android and an iPhone while my
           | PinePhone gathers dust, so I don't understand what this has
           | to do with my privacy choices. Nor did I say it to act
           | superior or imply that no one had a right to complain.
           | 
           | This is merely a reminder that Apple was founded in and most
           | HN readers live in the United States, a nation founded on the
           | idea that, without checks and balances, any power that can be
           | abused will be abused. Without something legally binding and
           | someone holding Apple to their word, their privacy stance was
           | never going to be anything more than a temporary ploy in
           | their ultimate goal: to make more money.
        
           | [deleted]
        
           | AuthorizedCust wrote:
           | I think what he is saying is that Apple has consistently not
           | acted in the interest of consumers. Therefore, why should we
           | take their privacy claims seriously? Remember when they
           | attacked journalists, for example? How about the corporate
           | culture of secrecy? That is far from consumer friendly.
        
           | b0tzzzzzzman wrote:
           | I have told people so?
        
         | btgeekboy wrote:
         | It's only a "farce on borrowed time" because you have the
         | benefit of hindsight.
        
           | GauntletWizard wrote:
           | I think you'll find that a few of us, above poster included,
           | have been saying all along it's a farce - that their stances
           | have been missing the point for a long time. In that case,
           | it's called foresight. Sometimes just luck, but repeated
           | lucky guesses are indistinguishable from predictive power.
        
           | tehjoker wrote:
            | No, this is obvious to anyone who understands that companies
            | have one master: profit. As soon as it becomes convenient to
            | discard a "principled" stand, a company will do it. Apple was
            | using privacy as a wedge to attack the Android consumer base,
            | or because they wanted their customers to believe they were
            | special to reinforce their Veblen brand, not because they
            | have some deeply held belief. In this case, perhaps Apple saw
            | a way to do a favor for the government in exchange for some
            | policy that will advantage it.
        
           | Valkhyr wrote:
           | Not true, unless by "having the benefit of hindsight" you
           | mean "having watched Apple's actions for the last few years".
           | 
           | Yes, they do a lot for privacy. But when it comes to their
           | bottom line, they also have a record of compromising on
           | privacy (and consumer rights in general) to preserve business
           | with less than freedom-loving countries such as China, the
           | UAE, Russia.
           | 
           | It is sometimes difficult to criticize them for this because
           | the financial loss to them would be huge if, say, China
           | kicked them out of their market (and utterly devastating if
           | they kicked them out of manufacturing in China), and because
           | people like to make the argument that iOS is (probably)
           | "still the the bets option for privacy in China" - but it
           | doesn't change the fact that in Apple's hierarchy of
           | priorities privacy ranks lower than making money.
           | 
           | To give a concrete example: if Apple allowed sideloading of
           | apps as Android does, Apple would no longer be in the
           | position to remove VPN apps on behest of China - but at the
           | cost of opening up app distribution outside their own store,
           | which means no free rent-seeking income from that anymore.
           | They'd now actually need to compete on providing the _best_
           | store for developers, which is obviously going to be more
           | work and cost for them. So, instead they choose the  "lesser
           | evil" of putting themselves in a position where they are the
            | only thing that stands (or rather: drops dead like a wet
            | sack)
           | between an authoritarian state and people trying to
           | circumvent that state's surveillance.
           | 
           | It's a good thing Apple shows more Courage(TM) when it truly
           | counts, for instance when it comes to ridding us all of that
           | terrible scourge of human existence, the 3.5mm jack.
        
             | OrvalWintermute wrote:
             | Was recently thinking that Apple may pull a Coke, and bring
             | back the 3.5mm for a number of reasons.
             | 
              | Except that they are making money on overpriced headsets
              | and acquisitions like Beats.
             | 
             | I guess they have always compromised on privacy to make
             | more $$, whether it is the crummy protections for browsing
             | on iOS, selling out to Google, or the nearly required
             | bluetooth. Now, I am sure there are a lot more things to
             | add to this list..
             | 
              | Two final thoughts:
             | 
             | From a business perspective this is a serious impairment on
             | Goodwill.
             | 
             | Loss of Privacy is going to negatively impact Apple
             | services.
        
           | swiley wrote:
           | Many of us saw this coming years ago and got into pretty
           | repetitive arguments with those who need the hindsight to see
           | this.
        
       | stephc_int13 wrote:
       | I want to own the devices I buy, hardware and software. Apple has
       | no business doing what is the role of a state.
       | 
       | I, quite simply, don't trust them.
        
         | Hackbraten wrote:
         | Me neither. I used to trust them to do the right thing for
         | their customers but not anymore.
        
           | stephc_int13 wrote:
           | I don't understand what they are thinking.
           | 
           | This project is probably going to cost millions, they won't
           | catch any pedophiles, they will only scare them so they don't
           | use Apple devices.
           | 
           | Nobody asked them to do it.
           | 
           | This is none of their business.
           | 
            | It can only create new vulnerabilities, privacy issues and
            | general annoyance.
        
             | blisterpeanuts wrote:
             | Unfortunately, what with all the negative publicity this is
             | causing, Apple now has a huge incentive to catch somebody,
             | anybody, just to justify the project. The thing about
             | pedophiles is, all it really takes is an accusation; the
             | public will presume guilt, the target's job and home and
             | life are taken away, and Apple + NCMEC can say, "See? It
             | works." Even if the target is later exonerated, the damage
             | is done. The teams that vet the candidate images might even
             | have quotas to fill. "How many did you catch last month?"
             | Your innocent baby bath pictures might catch you up in a
             | net that destroys your life, just so some low wage clown
             | can claim they made quota.
        
             | himaraya wrote:
             | Just think of it as a form of lobbying.
        
               | stephc_int13 wrote:
               | What's the end-game here? I don't follow.
        
               | blisterpeanuts wrote:
               | It might make the Chinese government look more kindly on
               | Apple.
        
               | himaraya wrote:
               | Idk, avoiding regulation in general? Getting into the
               | government's good graces never hurt anyone
        
               | hoppyhoppy2 wrote:
               | I see it as Apple taking voluntary steps to prevent the
                | EARN IT Act (or something similar) from becoming law in
               | the US.
               | 
               | https://en.wikipedia.org/wiki/EARN_IT_Act_of_2020
               | 
               | Big tech companies are under a lot of US govt pressure
               | right now to crack down on CSAM, most especially Apple
               | because of their very low number of reports compared to
               | most other tech giants. I think Apple saw this as a way
               | to ease some of that government pressure while not
               | jeopardizing their ability to use end-to-end encryption,
               | which something like the EARN IT Act could effectively
               | make illegal by requiring a government backdoor for all
               | encrypted cloud services that operate in the US.
               | 
               | Apple probably saw the on-device CSAM scanning as a
               | small, widely-acceptable concession to make that could
               | prevent much bigger crackdowns, but maybe didn't
               | anticipate the level of blowback from people seeing the
               | CSAM scanning itself as an unacceptable government
               | backdoor on their own device.
        
               | natch wrote:
               | End game (being charitable to them here) is they can now
               | start encrypting iCloud photos, and iCloud backups for
               | that matter, with a key that they do not have any way to
               | access, while getting the FBI off their backs on this one
               | hot button issue.
               | 
               | That by itself makes it look ok.
               | 
               | Until you consider... there are a couple
               | counterarguments.
               | 
               | One, Apple has not actually enabled such private
               | encryption with the keys out of reach to Apple for iCloud
               | backups.
               | 
               | Two, that child protection in other countries will
               | sometimes be defined in such repugnant terms that it will
               | compromise Apple fully to scan for the hashes provided by
               | "child protection" organizations in those countries.
               | 
               | "Child protection" is in quotes not because I think
               | countries will get away with shoehorning, say, terrorist
               | content hashes in as purported child pornography hashes.
               | It's in quotes because the concept of child protection
               | can be so wildly bizarrely corrupted in some countries
               | for religious or ideological reasons. Who decides what is
               | off limits for children, from having gay parents, to
               | having friends of the opposite sex, to having an
                | un-Islamic head covering? Well, each random government,
                | of
               | course, with Apple as the enabler, and individual and
               | human rights be damned.
               | 
               | So while the most charitably viewed end game may be good,
               | they seem to be papering over the real impacts this could
               | have.
        
             | mr_toad wrote:
             | > This project is probably going to cost millions, they
             | won't catch any pedophiles, they will only scare them so
             | they don't use Apple devices.
             | 
             | Mission accomplished? They can argue that encrypted iPhones
             | aren't being abused by pedophiles, and they can argue that
             | alternative App Stores will be avenues for illegal
             | material.
        
       | [deleted]
        
       | JumpCrisscross wrote:
       | > _Apple was surprised its stance then was not more popular, and
       | the global tide since then has been toward more monitoring of
       | private communication_
       | 
       | Huh. That's a big problem. If the American public doesn't support
       | privacy, we won't have it.
        
         | colinmhayes wrote:
         | The American public broadly does not care about privacy.
         | Facebook is more than enough proof of that.
        
           | nhumrich wrote:
            | Facebook does significantly more business outside the US than
            | inside of it.
        
             | colinmhayes wrote:
             | No one else cares about privacy either.
        
       | unityByFreedom wrote:
       | Good. I would be furious if I worked there. After the San
       | Bernardino case, I viewed them as the paragon of privacy and
       | security, to the extent I ignored most criticisms, including
       | their lack of support for right-to-repair and concerns over App
       | Store rejections. All of that is back on the table for me after
       | this decision.
       | 
        | It _is_ out-of-step with everything they've been doing up to
       | this point, and it makes me wonder who has something over the
       | head of Apple that we aren't hearing about. The stated benefits
       | of this tech are far outweighed by the potential for harm, in my
       | view.
       | 
       | If Apple pushes forward on this, I want to hear a lot more from
       | them, and I want it to be a continuing conversation that they
       | bear until we all understand what's going on.
        
         | sneak wrote:
         | Apple turned over all of the San Bernardino iPhone data, as
         | iCloud Backup information is not e2e encrypted and is readable
         | in full by Apple without ever touching or decrypting the phone.
         | 
         | The Apple vs FBI narrative was a coordinated marketing move
         | (coordinated between the FBI and Apple) to push that "paragon
         | of privacy and security" brand message. It's false.
         | 
         | Apple explicitly preserves a non-e2e backdoor in their
         | cryptography for the FBI:
         | 
         | https://mobile.reuters.com/article/amp/idUSKBN1ZK1CT
         | 
         | It's not out of step at all. Apple actively cooperates with the
         | FBI, both operationally and in product planning/design phases.
        
         | polishdude20 wrote:
         | What's fishy to me is the fact that they are letting this be
         | known. Like, if you're trying to capture people who share CSAM
         | or have it on their phone, why would you go around saying that
         | you can now scan for it?
         | 
          | Apple must have known this announcement would cause a major
          | outrage, so why say anything? I'm wondering if this is a cover
          | for something else: get people talking about this in the news
          | and getting outraged about it while something else goes
          | unnoticed.
        
           | unityByFreedom wrote:
           | Announcing it isn't fishy. If they didn't announce it, they'd
           | risk a whistleblower telling the story for them. In fact, the
           | story did break a day before Apple could officially announce
           | it. That may already have hurt them more because they weren't
           | even part of the discussion at that point. What you want is
           | to tell the story on your terms. If someone beats you to it
           | you can't do that.
           | 
           | Unrelated to Apple, but that's one of the big problems with
           | misinformation. You can't get ahead of it because it's all
           | made up.
        
           | tsjq wrote:
           | > I'm wondering if this is a cover up for something else. Get
           | people talking about this in the news and getting outraged
           | about it while something else goes unnoticed.
           | 
           | that's my thought, too.
        
           | juniperplant wrote:
           | Also, I think it would have been a matter of time before some
           | researcher analyzing network traffic would have discovered
           | something.
        
       | AnonC wrote:
        | What's going to be sad to note over the next few months is yet
        | another record-breaking year and another record-breaking first
        | quarter for Apple (as per Apple's financial calendar).
        | 
        | All these concerns and outrage don't seem to matter to most
        | people, and a few boycotts here for Apple, a few more boycotts
        | there for Facebook, and some more for <name-another-for-profit-
        | bad-company> aren't making any difference to their bottom line.
       | 
       | To say I'm seriously disappointed and frustrated is putting it
       | very mildly.
        
       | literallyaduck wrote:
        | States need to put laws on the books, with fines, to prevent
        | faux encryption and warrantless scans that the device owner has
        | not authorized; consent must not be buried in further terms, nor
        | may electronics suppliers lease or license equipment to
        | circumvent these protections.
        
       | joelbondurant wrote:
       | I'm putting the golden gate bridge up for sale.
        
       | frankfrankfrank wrote:
       | Is there really anyone who does not see this for what it is, a
       | really transparent effort to introduce surveillance
       | infrastructure under the guise of "saving the children"?
       | 
       | If so, I would love to hear what assures you so much about this.
       | 
       | A couple reasons I don't believe a single word:
       | 
        | * After years of the surveillance state begging and pleading
        | about compromising encryption, they went silent after talks with
        | Apple and Google, over roughly the time it could reasonably be
        | expected to take to develop this "feature".
       | 
        | * The number of pedos this could potentially even catch is
        | supposedly so small that it is a compromise of everyone's rights
        | ... an inherent contradiction of not only Constitutional law,
        | but the very concept of innocence until proven guilty. Not to
        | mention that we have long been told there are no pedo rings,
        | even though, without this "feature", the last administration
        | broke up more of these pedo rings than ever before. Why do we
        | need this "feature" now, then?
       | 
        | * Any actual pedo rings will easily work around this feature
        | simply by altering images until the hashes no longer match, in
        | ways too numerous to list right now (see the sketch after this
        | list).
       | 
        | * Innocent people could easily be swept up in this if they happen
        | to have images that hash to something the feds are comparing
        | against, within the threshold of tolerance. What happens then? A
        | secret court provides a secret court order to penetrate your
        | network and devices to confirm whether you are a pedo, because
        | you took pictures of your grandchildren in a bathing suit?
       | 
        | * Oh, and did I mention the mountain of history of nothing but
        | abuse and lies, and lies and abuse, in every single way possible,
        | even to the point that the people exposing the not just illegal
        | but evil acts are deliberately targeted by the state?
       | 
       | Why anyone would believe Apple, let alone the government that is
       | clearly suspiciously standing in the background, looking away,
       | whistling into the sky trying to act as if it has nothing to do
       | with this, is beyond me. It's all just lies, front to back, top
       | to bottom, left to right and in every other dimension.
       | 
        | What this clearly will be misused for is to identify or
        | fingerprint dissidents and wrongthinkers of any kind, including
        | those who think they are now rightthinkers, who will find
        | themselves on the other side of the divide once they've outlived
        | their usefulness.
        
       | arvindamirtaa wrote:
       | This is what Apple's policy USED to sound like. -
       | https://www.youtube.com/watch?v=BZmeZyDGkQ0
       | 
       |  _"...You can 't have a backdoor that's only for the good guys.
       | That any back door is something that bad guys can exploit."_
       | 
       |  _" No one should have to choose between privacy and security. We
       | should be smart enough to do both."_
       | 
       |  _" You're assuming they're all good guys. You're saying they're
       | good, and it's okay for them to look. But that's not the reality
       | of today."_
       | 
       |  _" If someone can get into data, it's subject to great abuse"._
       | 
       |  _" If there was a way to expose only bad people.. that would be
       | a great thing. But this is not the world. .... It's in everyone's
       | best interest that everybody is blacked out."_
       | 
       |  _" You're making the assumption that the only way to security is
       | to have a back door.....to be able to view this data. And I
       | wouldn't be so quick to make that judgement."_
       | 
       | No matter the mental gymnastics now, it's still a pretty drastic
       | departure from what it used to be.
        
       | fortran77 wrote:
        | If they're going to be scanning on the client, wouldn't it be
        | better to scan when someone is downloading something---the way
        | viruses and malware are scanned for? This way they can -prevent-
        | crime.
        
       | rbrbr wrote:
        | It is the first time that I have a reason to leave the Apple
        | ecosystem. And I will if this goes through, even though I do not
        | live in the US.
        
       | m3kw9 wrote:
       | On the contrary, it would be weird if everyone inside unanimously
       | agreed.
        
       | jmull wrote:
       | I just do not want Apple scanning my phone for the purpose of
       | finding something they can send to the police.
       | 
       | I'm not even talking about any "slippery slope" scenarios and
       | I'll never have any of the material they are looking for. And
       | right or wrong, I don't really fear a false identification, so
       | this isn't about a worry that they will actually turn me in. I
       | just don't want them scanning my phone for the purpose of turning
       | me in. I definitely do not want to take an OS update or new phone
       | that does this.
       | 
       | (False positives are the huge flaw in this system, though. There
       | are human judgements by anonymous people using opaque processes
       | without appeal throughout this. A lot of damage can be done
       | before anything goes to court, not to mention courts themselves
       | are susceptible to the illusion of infallible technology. Others
       | have well laid out the scenarios that will result in serious
       | consequences for innocent people.)
        
         | sunshineforever wrote:
          | This is exactly how I feel, to a T. It's hard to put into
          | words, but I'm not a criminal and I'm not on probation. Apple
          | has zero business scanning my phone for anything, and I don't
          | want that done.
        
         | sbr464 wrote:
         | Franz Kafka.
        
         | samstave wrote:
         | Wait until you hear about judges who are actively on the take
         | to send people to jail - like the judge who was taking money
         | from for-profit juvenile detention centers to send kids to
         | their facilities on ridiculous charges.
         | 
         | Yeah - this is an architecture designed to be abused.
        
         | istingray wrote:
         | I like this take. It shortcuts around all the "but people
         | misunderstand the technology" back and forth and gets to the
         | root of it. "Don't scan my phone for stuff you can send to the
         | police."
        
           | laurent92 wrote:
           | Should we consider the phone:
           | 
           | - as part of the home?
           | 
           | - as part of the human body?
           | 
            | In either case, we're allowed to commit crimes at home as
            | long as no one knows.
        
             | ycombinete wrote:
             | I don't think that's how crime works
        
             | aj3 wrote:
             | No, you're not allowed to do crime even if you kill all
             | witnesses and destroy all evidence.
        
               | z3ncyberpunk wrote:
               | Funny, the CIA says otherwise
        
           | 6AA4FD wrote:
            | Yup. We don't need to demand privacy as a means to an end;
            | it is just respect I want from my belongings.
        
         | zepto wrote:
         | This is the most honest take on this that I've seen. The
         | slippery slope arguments don't make sense, nor does the risk of
         | false positives.
         | 
         | But the idea of being suspected even in this abstract way,
         | because of something _other people do_ , is at the very least
         | distasteful, bordering on offensive.
        
           | Karunamon wrote:
           | How do the slippery slope arguments not make sense? This is a
           | capability that:
           | 
           | 1. Did not exist prior (scanning stuff on the endpoint)
           | 
           | 2. Has a plausible abuse case (the same system, applied to
           | stuff that isn't CSAM)
           | 
           | I find this very compelling, especially in the shittier
           | regimes (China...) that Apple operates in.
        
             | zepto wrote:
              | Neither 1 nor 2 is in fact true. Spotlight indexes all
              | kinds of metadata, as does photos search. Adding an agent
              | to upload data from these is easier than extending the CSAM
              | mechanism, and the CSAM mechanism as it stands is not all
              | that plausible to abuse, either technically or socially,
              | given how clear Apple's promises are.
        
               | Karunamon wrote:
               | Why should Apple's promises be taken at face value?
               | Doubly so in a world where they can be compelled legally
               | to break those promises and say nothing (especially in
               | the more totalitarian countries they willingly operate in
               | and willingly subvert some of their other privacy
               | guarantees)?
               | 
               | What stops Apple from altering the deal further?
               | 
               | And if you have an answer for that, what makes you
               | believe that 10 years in the future, with someone else at
               | the helm, that they won't?
               | 
               | Your device is now proactively acting as a stool pigeon
               | for the government where it wasn't prior. This is a new
               | capacity.
               | 
               | And it's for CSAM. _For now_. The technological leap from
               | locally scanning just the stuff you upload, to scanning
               | everything, is a _very simple_ change. The leap from
               | scanning for CSAM to scanning for some other perceptual
               | hashes the government wants matched is _very simple_.
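                | 
                | A minimal sketch of how small that leap is -- every name
                | here is invented, not Apple's code:
                | 
                |     def queued_for_icloud(photo):  # stand-in
                |         return photo.endswith(".icloud")
                | 
                |     def match_hash_list(photo):    # stand-in
                |         print("scanned", photo)
                | 
                |     library = ["a.jpg", "b.icloud"]
                | 
                |     # Today's stated scope: photos queued for
                |     # iCloud upload only.
                |     for photo in library:
                |         if queued_for_icloud(photo):
                |             match_hash_list(photo)
                | 
                |     # The "leap": drop one condition and every
                |     # local photo is matched instead.
                |     for photo in library:
                |         match_hash_list(photo)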
        
               | zepto wrote:
               | What stops them from altering the deal is that they have
               | said they won't, and there really is no evidence to
               | believe otherwise.
               | 
               | If they do, then they do. There are never any guarantees
                | about anyone's future behavior. However, just asking
                | 'what stops you from becoming a thief?' does not imply
                | that you will become a thief.
               | 
                | There is no new capability other than what they have
                | said. They already had _far_ more general purpose
                | mechanisms for analysing and searching your files in
                | the operating system.
               | 
               | The leap from doing spotlight indexing to searching for
               | the keywords 'falun gong' or 'proud boys' in your files
               | is also simple. So is the leap from searching your photos
               | locally for 'dogs', to searching locally for 'swastikas'
               | and reporting back when they are found.
               | 
               | If they decide to build some spyware, there is no need
               | for it to be based on this. It's a red herring.
        
               | Aaargh20318 wrote:
               | > the CSAM mechanism as is is not all that plausible to
               | abuse either technically or socially given how clear
               | Apple's promises are.
               | 
                | That's the problem: Apple's promise means nothing,
                | exactly because it's so easy to abuse.
               | 
               | Apple says they will refuse when asked to scan for
               | anything that is not CSAM. That's one of those nice
               | 'technically true' statements lawyers like to include.
               | 
               | Apple will not _have_ to refuse anything, because they
               | won't be asked.
               | 
               | Apple doesn't get a huge pile of CSAM images (obviously),
               | they get a list of hashes from a 3rd party. They have no
               | way of ensuring that these hashes are only for CSAM
               | content. And when 'manually' checking, they aren't
               | actually looking at potential CSAM, they are checking the
               | 'visual derivative' (whatever that means exactly),
               | basically a manual check if the hashes match.
               | 
               | So yes, they would _technically_ refuse if asked to scan
               | for other material, but they will just accept _any_ list
               | of hashes provided to them by NCMEC (an organization
               | created by the US government and funded by the US DoJ) no
               | questions asked. The US government could include any hash
               | they wish into this list without Apple ever noticing.
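                | 
                | The opacity is easy to demonstrate, using SHA-256 as a
                | stand-in for the perceptual hashes actually used:
                | 
                |     import hashlib
                | 
                |     a = hashlib.sha256(b"contraband").hexdigest()
                |     b = hashlib.sha256(b"protest flyer").hexdigest()
                |     # Handed only the digests, the receiver can't
                |     # tell which is which, or recover either source:
                |     print(a)
                |     print(b)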
        
               | zepto wrote:
               | > And when 'manually' checking, they aren't actually
               | looking at potential CSAM, they are checking the 'visual
               | derivative' (whatever that means exactly), basically a
               | manual check if the hashes match.
               | 
               | The visual derivative is enough to tell the difference
               | between a collection of CSAM and say documents, or
               | pictures of protests.
               | 
               | It can't be abused without Apple's involvement.
        
               | Aaargh20318 wrote:
               | > The visual derivative is enough to tell the difference
               | between a collection of CSAM and say documents, or
               | pictures of protests.
               | 
               | How would you know ? They have never given details on
               | what this 'visual derivative' actually is. If it's
               | detailed enough to recognise it as CSAM, then Apple isn't
               | allowed to look at it.
        
               | zepto wrote:
               | That's a fair point.
               | 
               | However this is moot anyway. Using this mechanism to
               | search for things other than CSAM would be
               | unconstitutional under the 4th amendment.
               | 
               | https://www.economist.com/united-
               | states/2021/08/12/a-38-year...
        
               | mkmk wrote:
               | Apple very recently promised that "what happens on iPhone
               | stays on iPhone". At literally the same time, they were
               | building a system that was designed to break that
               | promise. Why should we believe their new promises when
               | their last promise was disingenuous?
               | 
               | https://9to5mac.com/2019/01/05/apple-privacy-billboard-
               | vegas...
        
               | zepto wrote:
               | How does this break that promise? Nothing leaves the
               | phone unless the user chooses to upload their photos to a
               | cloud service.
        
           | lm28469 wrote:
           | > The slippery slope arguments don't make sense, nor does the
           | risk of false positives.
           | 
            | Yeah, no: there are about 1B iPhones in use, so even a
            | ridiculously low false-positive rate would be a major pain in
            | the ass for a lot of people, regularly.
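            | 
            | Back-of-envelope, where every input is an assumption rather
            | than Apple's published numbers:
            | 
            |     devices = 1_000_000_000  # ~1B iPhones in use
            |     photos = 1_000           # assumed scans/device/year
            |     fp_rate = 1e-9           # assumed per-image error rate
            |     print(devices * photos * fp_rate)  # 1000.0
            | 
            | Even a one-in-a-billion per-image error rate works out to
            | roughly a thousand wrongly flagged images a year.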
           | 
            | The slippery slope argument is totally valid: whatever legal
            | safeguards you have today might not exist tomorrow, but the
            | tech will stay here forever.
           | 
            | I actually fail to see how your second sentence supports the
            | first. Treating everyone as a potential suspect of a crime
            | (which isn't even widespread in the first place) is a major
            | slippery slope. Today it's "for the children and only in the
            | US"; tomorrow it might be "find me pro-Palestinian pictures
            | from people living in Israel". Nobody can predict what
            | Apple's stance will be in 25+ years.
        
           | feanaro wrote:
           | I agree with this take, but the slippery slope arguments _do_
           | make sense, as exemplified by any spying technology ever.
            | Once a capability exists, it's too enticing not to continue
           | expanding the scope.
        
             | somedude895 wrote:
             | Exactly.
             | 
             | Selling dev consoles on ebay or drugs on the darkweb? We'll
             | take the picture in the listing and find out who took it.
             | 
             | Can't find any matches? This is too ineffective. We'll need
             | Apple to store all hashes on all devices for 6 months.
             | 
             | The slippery slope argument makes a LOT of sense.
        
               | zepto wrote:
               | Nope. This would be an illegal search under the 4th
               | Amendment.
               | 
               | https://www.economist.com/united-
               | states/2021/08/12/a-38-year...
        
           | thevagrant wrote:
           | I would not say the slippery slope take doesn't make sense.
           | It is perfectly possible that had no one cried out about this
           | change then the next change would have been:
           | 
           | Large government to Apple: "Please now also create a hash on
           | each photo metadata field including date/time and location.
           | Let us query these. AND BTW, this change must be kept secret.
           | Thanks."
        
             | zepto wrote:
             | > Large government to Apple: "Please now also create a hash
             | on each photo metadata field including date/time and
             | location. Let us query these. AND BTW, this change must be
             | kept secret. Thanks."
             | 
             | How do you know this hasn't already happened? What does it
             | have to do with the CSAM technology?
        
               | feanaro wrote:
               | How do you know the invisible pink unicorn doesn't exist?
               | You don't, but going after the known bad things is the
               | most effective action you can take. You certainly don't
               | _ignore_ known bad things just because there may be even
               | worse unknown things.
        
               | zepto wrote:
               | Right, but this _isn't_ a bad thing. The bad things
               | people are claiming are things they imagine could be done
               | in the future.
               | 
               | Exactly like an Invisible Pink Unicorn.
               | 
               | If you go after imaginary bad things, you will never
               | stop. That is the problem with paranoia.
        
             | [deleted]
        
             | laurent92 wrote:
             | > it is possible that had no-one cried about this change,
             | 
             | The "UK porn filter" has already been extended to all sorts
             | of "<< extremism online >>" (to no effect in the capital of
             | knife attacks, it seems -- as usual invasive police rights
             | do not equal a reduction of criminality) and it's already
             | being proposed to be extended to:
             | 
             | - Online Harms
             | 
             | - Online Safety Bill
             | 
             | https://en.wikipedia.org/wiki/Web_blocking_in_the_United_Ki
             | n...
        
               | zepto wrote:
               | Right. My argument against the slippery slope argument is
               | not that governments won't make demands.
               | 
               | It's that if they do, this technology is irrelevant.
        
             | robertoandred wrote:
             | That's not at all how hashes work here.
        
               | thevagrant wrote:
                | I know that is not how it works currently; I was merely
                | suggesting how something could change.
        
           | drvdevd wrote:
           | > distasteful
           | 
           | This is the word that most captures my feeling not just
           | around the feature but how it was rolled out as well. Reading
           | the PR announcement and then the leaked memo just felt -
           | quite distasteful.
           | 
           | There are a number of things Apple has done and continues to
           | do that fit into that category for me personally - the fact
           | that iOS is so locked down for example.
           | 
           | But this really takes the cake. It's like someone at Apple
           | dialed up the distasteful to 11 and let loose.
           | 
           | The whole industry does distasteful things but this is a
           | harbinger of our inability to trust them at a deeper level.
           | Not just Apple.
        
           | fsflover wrote:
           | > The slippery slope arguments don't make sense
           | 
           | Yes, they do: https://news.ycombinator.com/item?id=28160092
        
             | zepto wrote:
             | No they don't. Searches for things other than CSAM would be
             | inadmissible under the 4th amendment.
             | https://www.economist.com/united-
             | states/2021/08/12/a-38-year...
        
               | psyc wrote:
               | That article is very obviously not about what is being
               | searched for. It's about _who_ it is being reported to.
               | For example, if Apple one day decided to detect
               | unauthorized copyrighted media, they could report it to
               | the RIAA. The RIAA is independent of the government, thus
               | the 4th amendment is not violated.
        
               | zepto wrote:
                | So then you must agree that the fears of government
                | overreach are _unwarranted._ Given that's what most of
                | the complaints here are about, that's a big deal.
               | 
               | > For example, if Apple one day decided to detect
               | unauthorized copyrighted media, they could report it to
               | the RIAA. The RIAA is independent of the government, thus
               | the 4th amendment is not violated.
               | 
               | It's true that Apple could decide to implement any search
               | or scanning mechanism at any time, and report anything
               | they like to anyone they want to. So what? This is true
               | of anyone who writes software that handles user data.
               | 
               | What does that have to do with the CSAM mechanism? It
               | seems like an unrelated fear.
        
               | fsflover wrote:
               | > What does that have to do with the CSAM mechanism? It
               | seems like an unrelated fear.
               | 
                | The CSAM mechanism proves that Apple can and will go
                | down this route, even if users are against it.
        
               | zepto wrote:
               | The CSAM mechanism proves that Apple will send
               | information to the RIAA or another non-government entity?
               | 
               | How exactly does it prove that?
        
         | argvargc wrote:
         | Who wants their printer to scan every document they print, and
         | if it sees something the printer manufacturer thinks is illegal
         | it's reported to the police?
         | 
          | Or their TV to scan every frame displayed and every word said,
          | and if it sees anything the TV manufacturer thinks is illegal
          | it's reported to the police?
          | 
          | Or their blender to have an array of chemical sensors in it,
          | and if it detects a substance the manufacturer thinks is
          | illegal it's reported to the police?
         | 
         | How about smart floorboards that detect illegal dancing by
         | women in Iran? Shall we do that too?
         | 
         | Apple where are you on that one? I can point you to a country
         | filled with eager customers, seems right up your alley.
        
           | burkaman wrote:
           | > Who wants their printer to scan every document they print,
           | and if it sees something the printer manufacturer thinks is
           | illegal it's reported to the police?
           | 
           | FYI most (all?) modern printers will refuse to print anything
           | they think is counterfeit money, and will include a unique
           | code traceable back to you in every printed document. -
           | https://en.wikipedia.org/wiki/Machine_Identification_Code
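            | 
            | A toy version of the idea (the layout here is invented, not
            | any vendor's actual scheme):
            | 
            |     from datetime import datetime
            | 
            |     def dots(serial, ts):
            |         payload = serial.to_bytes(4, "big") + bytes(
            |             [ts.minute, ts.hour, ts.day, ts.month])
            |         # one yellow dot per set bit, at (row, column)
            |         return [(row, col)
            |                 for col, byte in enumerate(payload)
            |                 for row in range(8) if byte >> row & 1]
            | 
            |     print(dots(123456, datetime(2021, 8, 13, 9, 30)))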
        
             | [deleted]
        
             | thescriptkiddie wrote:
             | Well, looks like I'm never buying a new printer.
        
               | Wohlf wrote:
               | >Developed by Xerox and Canon in the mid-1980s
        
             | asperous wrote:
              | Printers may do this, but they don't call the police on
              | you.
        
         | tuatoru wrote:
         | > I just do not want Apple scanning my phone for the purpose of
         | finding something they can send to the police.
         | 
         | Yes. This is called vigilantism.[1] Apple is proposing to
         | become a vigilante.
         | 
         | 1. https://en.wikipedia.org/wiki/Vigilantism
         | 
         | "Vigilantism is the act of enforcement, investigation or
         | punishment of perceived offenses without legal authority."
        
           | otterley wrote:
           | Except that they do have the legal authority to do so. See 18
           | U.S.C. section 2258A(a)(1)(A)(ii).
           | https://www.law.cornell.edu/uscode/text/18/2258A
        
           | [deleted]
        
           | ransom1538 wrote:
           | Exactly this.
        
             | amelius wrote:
             | Except governments applaud it, so, no.
        
           | amelius wrote:
           | "Voyeurism is the act of observing other people without legal
           | authority."
        
         | Svoka wrote:
         | From what I understand, Apple does not want to scan your phone.
         | 
         | They want to have end-to-end encryption and store your photos
         | encrypted. This is why they scan before sending them encrypted.
         | 
         | They won't scan anything which you would not put in the cloud
         | anyway.
         | 
         | This is a pretty clever way to preserve end-to-end encryption
         | and satisfy the requirement not to store anything CP-related on
         | their servers. Probably too clever for journalists to
         | understand :(
        
           | feanaro wrote:
           | > This is pretty clever way to preserve end-to-end encryption
           | and satisfy requirement not to store anything CP related on
           | their servers.
           | 
           | Correction: Anything in the shady government-supplied
           | database, pushed through a private corporation, not CP. The
           | official _purpose_ today is CP (well, CSAM, which is already
           | more vague than CP), but the distinction is important. Also,
           | not only to _prevent_ storing that content on their servers,
           | but to _report you_ to the authorities.
        
           | isaac21259 wrote:
           | What? Scanning your phone is exactly what Apple is doing.
           | They say they won't scan anything not put on the cloud but
           | you have no control over that and it is subject to change at
            | any time. They've already weakened their systems at the
           | request of the government so it wouldn't be at all unexpected
           | for them to scan photos not uploaded to the cloud. End-to-end
           | encryption is completely useless if you have spyware
           | installed on one of those ends.
        
         | BikiniPrince wrote:
         | You don't have anything they want today.
         | 
         | How difficult would it be to transform it into looking for
         | images from political dissenters? Now, you have a magical list
         | to find targets to probe further.
         | 
         | Maybe you want to identify behaviors you disagree with.
         | 
         | Images tell a story and maybe not to the people who are looking
         | out for your best interests.
         | 
          | Wrongthink always changes.
        
           | calvinmorrison wrote:
            | Sure, it's "just" CSAM for Americans, but for Chinese
            | citizens - or any other sizable market for Apple - it won't
            | be.
        
         | borplk wrote:
         | Not to mention how this kind of stuff can be abused. Think
         | about the digital version of the cop that drops a small bag
         | containing a white powder in the back of your car. Good luck
         | proving that it wasn't you.
        
         | e40 wrote:
         | Whether it's likely or not, I'm worried about false
         | identification of images.
         | 
         | Let's also remember that malware would easily drop illegal
         | images as a way to mess with the system and specific people.
        
         | cucumb3rrelish wrote:
         | It's not scanning your device, but your iCloud uploads, and if
          | configured, the iMessages of your children
        
           | M4v3R wrote:
           | Please do not spread misinformation. The system DOES scan
           | your local device for photos that match CSAM hashes that are
           | downloaded to the device. What you probably meant is that
           | _currently_ it is only enabled if you have iCloud Photos
           | enabled (which almost everyone has), but now that the
           | mechanism for client-side scanning is in place there's
            | nothing preventing them from turning it on later regardless of
           | your user settings.
        
             | hypothesis wrote:
             | I don't like that system, but according to their summary
             | document, CSAM detection system is only (as described
             | today) processing images that are uploaded to iCloud.
             | 
             | [1] https://www.apple.com/child-
             | safety/pdf/Expanded_Protections_...
        
               | pdabbadabba wrote:
               | Please look more carefully at that document. It says, on
               | page 5: "Instead of scanning images in the cloud, the
               | system performs on-device matching using a database of
               | known CSAM image hashes provided by NCMEC and other child
               | safety organizations."
               | 
               | And look at the diagram. The matching of image hashes
               | with user photos occurs "on device."
        
               | hypothesis wrote:
               | I know that scanning is done by one's device.
               | 
               | I just think that there are better arguments against it
               | without unnecessary exaggeration. So I wanted to be more
               | specific on which images on your device are scanned
               | according to that paper.
               | 
                | Obviously I should have phrased it better, like "about to
               | be uploaded to iCloud"...
        
               | pdabbadabba wrote:
               | I see. I didn't realize that was the distinction you were
               | making, but now I see what you meant.
               | 
               | Just to spell this out (since I think this thread could
               | still be a little confusing):
               | 
               | 1. The photo scanning does occur on the device, not in
               | the cloud. However,
               | 
               | 2. Apple's explanation indicates that this device-based
               | scanning will only occur on photos that may be uploaded
               | to iCloud.
        
               | cucumb3rrelish wrote:
               | On device != Scanning your device. It scans your iCloud
                | uploads and, if configured, iMessages (which you didn't
                | mean, because that doesn't work with hashes)
        
               | shishy wrote:
               | The general worry is that the fact that it happens on the
               | device means in the future it could conceivably scan your
               | device, even if there are some software checks in place
               | to only scan things being synced to iCloud.
        
               | notquitehuman wrote:
               | I'll need to see their source before I believe that.
               | Apple ("privacy is a human right") lost their presumption
               | of good faith when they announced that they will
               | automatically notify law enforcement if their algorithms
               | and processes suspect you're doing something naughty on
               | your device.
        
       | jijji wrote:
       | How long before journalists/activists in different countries get
       | jailed/killed for taking pictures of some event that the
       | government did not like, after it requests CSAM hashes from
       | Apple that are unrelated to "child-safety"?
        
       | hvis wrote:
       | Imagine some state decides to focus the "war on crime" efforts on
       | scanning people's phones for selfies with handguns and/or white
       | powder.
       | 
       | That's what the CSAM initiatives sound like to me.
        
       | spenvo wrote:
       | Many people take the following maximalist stance: "Apple can't
       | save users' privacy in China, nor can it push back against laws,
       | so therefore it's unreasonable to ask them to hold the privacy
       | line." But for every China there is a Brazil, Hungary, or India.
       | 
       | For every "completely authoritarian" regime, you have handfuls of
       | (historically democratic) governments on their way toward
       | authoritarianism. In these countries, it's often not the _laws_
       | calling the shots but the politicians that happen to be in charge
       | at that moment.
       | 
       | Even in democracies with laws as guardrails, presidents will push
       | for data on political opposition - at times threatening techies
       | along the way (see: Brazil jailing WhatsApp execs for access to
       | data [0], Twitter/India raids, or, in the USA: Trump DOJ asking
       | Apple to turn over info on House Dems [1])
       | 
       | Simply "doing what you're told" around the world (no-questions-
       | asked, aka "being in the arena" as Tim Cook says) will turn you
       | into an enabler of authoritarianism, and not just in totally
       | FUBAR countries like China but in many nations with a lot of
       | freedom at stake.
       | 
       | 0 - https://washingtonpost.com/world/national-security/senior-
       | fa...
       | 
       | 1 - https://www.techmeme.com/210610/p51#a210610p51
       | 
       | From:
       | https://twitter.com/SpencerDailey/status/1426008754143776771
        
         | brandon272 wrote:
         | Also bear in mind that Apple makes the spectacular statement
          | that they consider privacy a "fundamental human right" and that
         | it's one of their core values. [0]
         | 
         | It's mind boggling to imagine that a company comes up with that
         | stance and advertises it so proudly without ever having
         | considered that it is at odds with what governments around the
         | world will want.
         | 
         | "I know we said privacy is a fundamental human right but that
         | was before the U.S. government asked us to give them your data"
         | 
         | 0. https://apple.com/privacy
        
       | wonderwonder wrote:
       | I assume this is some sort of machine learning algorithm. How did
       | they get the data set to train it on? That seems really strange.
       | How did they test it? Seems like any engineers involved on this
       | would come out with PTSD. Whatever they used to test it and train
       | it with is illegal to possess and could only be used with some
       | sort of massive government cooperation.
        
         | arn wrote:
         | It's not. It's a specific database of image hashes they are
         | comparing against. Not AI, Not ML.
        
           | alfalfasprout wrote:
           | That's not... entirely clear actually. Well rather, what
           | exactly the hash is based on. They call it "neural hash"
           | which implies there's maybe some kind of embedding being
           | used?
           | 
           | It's one thing to compute a simple hash. It's another
           | entirely if they're using some type of locality sensitive
           | hash since that very much has the possibility of false
           | positives and would be a major invasion of privacy (which it
           | already is).
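            | 
            | To illustrate the idea, here is a toy "average hash" in
            | Python - a classic locality-sensitive perceptual hash, not
            | Apple's NeuralHash, which reportedly uses a neural
            | embedding - showing why visually similar images can
            | collide:
            | 
            |     # Toy aHash: similar-looking images produce nearby
            |     # bit patterns, so matching is fuzzy by design.
            |     from PIL import Image
            | 
            |     def average_hash(path):
            |         img = Image.open(path).convert("L").resize((8, 8))
            |         px = list(img.getdata())
            |         mean = sum(px) / len(px)
            |         bits = 0
            |         for p in px:
            |             bits = (bits << 1) | (p > mean)
            |         return bits
            | 
            |     def hamming(a, b):
            |         # differing bits; small distance = "same" image
            |         return bin(a ^ b).count("1")
            | 
            |     # Two unrelated photos landing within the match
            |     # threshold would be a false positive:
            |     # hamming(average_hash("a.jpg"),
            |     #         average_hash("b.jpg")) <= 5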
        
         | jl2718 wrote:
         | Closer to a grayscale thumbnail. I was expecting something a
         | lot more sophisticated.
        
       | mrwww wrote:
       | They must comply with Chinese law. So they put iCloud data in
       | China. So users with harmful images of "Free Hong Kong" and
       | Winnie the Pooh should be reported to the authorities. Apple
       | complies with local laws, right?
        
       | systemvoltage wrote:
       | Regarding smartphones, I wonder if we've just lost a grip on what
       | it means to have privacy. Even our expectations for privacy have
       | degraded and eroded over time. We just carry this device with our
       | _life_ encoded in a flash chip with keys to cloud access. _Life_.
       | Everything from emails to our walking gait, photos, text,
       | history, health records, etc is stored on this one device.
       | Dictators of the past would have a field day with this
       | infrastructure.
       | 
       | "Wouldn't it be wonderful if the people of future would willingly
       | carry a surveillance device that they depend on everyday to
       | communicate to others and carry with them everywhere?"
       | 
       | We need to move towards OSS and OSH phones with extreme urgency.
       | We also need to get more people involved in Ham Radio.
        
         | brokenmachine wrote:
         | Governments have wilfully _not_ kept up with the technology, so
         | they can spy on it all.
         | 
         | Imagine if the government suddenly decided that they would open
         | and scan everyone's physical mail into a permanent database.
         | Why should email be any different?
        
         | hughrr wrote:
         | Why ham radio? That doesn't make sense. There's no privacy
         | there by law. It is by nature a completely non anonymous
         | medium. Not only that if a country shuts down its comms the
         | hams are all on a nice list for their shit to be confiscated.
         | 
          | I'm not a ham because of some romanticised dystopia avoidance.
         | It's just fun to make things that you can talk to people on :)
        
           | systemvoltage wrote:
           | Ham Radio does not depend on internet backbone. In the
           | extreme, you can communicate across the world with a few
           | basic electronic components which can be built at home. Sure,
           | government can listen to it but it's trivial to encrypt
           | (illegal on Ham bands) if you're oppressed by the government.
           | 
           | > nice list for their shit to be confiscated.
           | 
           | I am not sure if this would go well in the USA.
        
             | hughrr wrote:
             | It'd go really well for the government. You'd just end up
             | with a lot of dead hams. Disorganised morons with AR15's vs
             | professional army. I know who wins that one.
        
       | hackinthebochs wrote:
       | Personally I don't see on device scanning as significantly
       | different than cloud scanning. I think the widespread acceptance
       | of scanning personal data stored on the cloud is a serious
       | mistake. Cloud storage services are acting as agents of the user
       | and so should not be doing any scanning or interpreting of data
       | not explicitly for providing the service to the end user.
       | Scanning/interpreting should only happen when data is shared or
       | disseminated, as that is a non-personal action. But that cat is
       | out of the bag already.
       | 
       | Sure, scanning a personal device vs. data stored in the cloud
       | feels different because you supposedly own the device. But you
       | don't own the software that runs on it, so the distinction seems
       | artificial to me.
        
         | intricatedetail wrote:
         | Would you like to have a landlord coming to your flat as they
         | please to look for drugs? It's the same thing. They make you a
         | suspect by default.
        
           | pixl97 wrote:
           | Heck, this is worse.
           | 
           | More like do you want the bank coming into your house to look
           | for drugs because you have your loan with them.
        
         | istingray wrote:
         | Can you expand upon what you mean by "feels different because
         | you supposedly own the device"? Just want to get a clearer idea
         | of what you're saying. How does it feel?
        
           | hackinthebochs wrote:
           | I just mean that there's an intuitive sense of ownership over
           | a physical device that you don't have over data stored in the
           | cloud. My point is that this distinction is artificial. You
           | don't really own the (software on the) device, but you should
           | own the data the company is processing on your behalf.
           | Whether this processing happens on the physical device or in
           | the cloud seems like an irrelevant distinction.
        
             | istingray wrote:
             | Appreciate the context. I like this take as well, it brings
             | up questions about what "ownership" really means. Where do
             | you land on privacy then, do you go radically open or
             | radically closed, something else?
        
               | hackinthebochs wrote:
               | To me ownership means I have exclusive control over who
               | gets to use something and for what purpose. If I own my
               | data, someone processing this data on my behalf has no
               | right or obligation to scan it for illegal content. The
               | fact that this data sometimes sits on hard drives owned
               | by another party just isn't a relevant factor. Presumably
               | I still own my car when it sits in the garage at the
               | shop. They have no right to use it outside of purposes
               | explicitly agreed upon. I don't see abstract data as any
               | different.
               | 
               | Personally, I strongly favor privacy. I avoid all cloud
               | services that don't involve local encryption. I have an
               | old model rooted android phone with two firewalls. All
               | apps have internet access blocked by default. I stay as
               | anonymous as possible on social media. And so on.
               | Recently my group at work moved our repositories to
               | Github Enterprise. It made me uncomfortable. I voiced my
               | concerns but they fell on deaf ears.
        
               | istingray wrote:
               | "Presumably I still own my car when it sits in the garage
               | at the shop."
               | 
               | Brilliant, and illustrates how far off we are. I'll be
               | scaling my level of surveillance following your lead.
        
       | xhruso00 wrote:
       | Can I take photos with 3rd party camera app + save it within its
       | app sandbox (not to camera roll) to avoid scanning?
        
       | nbzso wrote:
       | Can you imagine the beautiful New World we are building under the
       | wise leadership of corporations and politicians motivated by
       | unstoppable greed, control and power?
       | 
       | Can you imagine "the screeching voices of minorities with
       | critical thinking"?
       | 
       | The big picture of hyperconnected future in which automated
       | systems will decide your fate is in place and it's running well.
       | It is not perfect. Yet. But it will be. Soon.
       | 
       | You will find a way to cope. As usual. As a collective. As an
       | "exemplary citizen with nothing to hide".
       | 
       | "A blood black nothingness began to spin. Began to spin. Let's
       | move on to system. System."
       | 
       | The system will grow. Powered by emotional reasoning created by
       | the social engineers, paid well to package and sell "The New
       | Faith" for the masses.
       | 
       | You are not consumers anymore. You are not the product anymore.
       | You are the fuel. Your thoughts and dreams, emotions and work are
       | the blood of the system. You will be consumed.
       | 
       | "Do you get pleasure out of being a part of the system? System.
       | Have they created you to be a part of the system? System."
       | 
       | Yes. The pleasure of collective truism. In the name of the
       | "common good" - the big, the bold and the brave brush, build to
       | remove any form of suspicion or oversight.
       | 
       | "A blood black nothingness. A system of cells. Within cells
       | interlinked. Interlinked"
       | 
       | Wake up, Neo...
        
       | anonymouswacker wrote:
       | If you recently purchased an iPhone with your Credit Card, now is
       | the time to file a dispute. I called my company (the American
       | one), and the person I was connected to was instantly aware when
       | I mentioned my Apple phone was now spying on me. 5 minutes later
       | I have a dispute and some semblance of hope. YMMV of course, but
       | what other choice do we have than to hit them in their pocket
       | book on our way out of their walled garden?
        
       | karakot wrote:
       | The damage is already done.
        
       | Wo-Ho wrote:
       | Imho, Apple is not so privacy-focused.
       | 
       | From my network logs, my iPhone is always connected to Apple
       | servers.
       | 
       | Moving the locked iPhone a little bit on the table, very often it
       | establishes a connection to Apple. Why?
       | 
       | If I block all Apple servers, my iPhone continuously tries to
       | connect to Apple servers - which seems to drain the battery
       | very fast.
       | 
       | Some years ago, I asked Apple whether it is possible to switch
       | off these 'connections'. Answer: No, I have to trust Apple.
       | 
       | WH
        
       | m3kw9 wrote:
       | I'm a little uncomfortable about my iCloud photos being scanned
       | but ok with Apple doing it because I will give them benefit of
       | the doubt they can make it work and, if not, reverse it. If
       | Google were to do that, that's a hard no
        
       | franknstein wrote:
       | Seems to me like an instance of the 'bundling problem'. What the
       | government actually wants is a yes/no answer to the question
       | 'does this photo contain child abuse?', but with this 'solution'
       | they get much more information than a simple yes/no answer,
       | making abuse of power possible.
       | 
       | Is it possible to just scan photos locally for child pornography,
       | encrypt those photos, and upload those encrypted photos and the
       | 'yes/no' answer that the scan returns plus some cryptographic
       | proof that the actual scan (some calculation) was performed on
       | the uploaded encrypted photo?
       | 
       | If this was possible it would enable privacy and also help
       | prevent child abuse (or at least prevent child pornography from
       | being stored on iCloud).
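       | 
       | Apple's published design actually gestures at this: private set
       | intersection plus threshold secret sharing, so the server
       | supposedly learns nothing until enough matches accumulate. A toy
       | sketch of just the threshold part (Shamir sharing in Python;
       | illustrative only, not Apple's actual code):
       | 
       |     # Each matching photo would reveal one share of a
       |     # per-account key; below `t` shares, the key (and the
       |     # vouchers it protects) stays unrecoverable.
       |     import random
       | 
       |     P = 2**127 - 1  # prime modulus for toy field math
       | 
       |     def shares(secret, t, n):
       |         cs = [secret] + [random.randrange(P)
       |                          for _ in range(t - 1)]
       |         f = lambda x: sum(c * pow(x, i, P)
       |                           for i, c in enumerate(cs)) % P
       |         return [(x, f(x)) for x in range(1, n + 1)]
       | 
       |     def rebuild(pts):
       |         # Lagrange interpolation at x=0 recovers the secret
       |         s = 0
       |         for i, (xi, yi) in enumerate(pts):
       |             num = den = 1
       |             for j, (xj, _) in enumerate(pts):
       |                 if i != j:
       |                     num = num * -xj % P
       |                     den = den * (xi - xj) % P
       |             s = (s + yi * num * pow(den, -1, P)) % P
       |         return s
       | 
       |     key = 42
       |     sh = shares(key, t=10, n=30)
       |     assert rebuild(sh[:10]) == key  # 10 matches: readable
       |     assert rebuild(sh[:9]) != key   # 9: opaque (w.h.p.)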
        
       | endisneigh wrote:
       | Don't use cloud services or closed-source services if you want
       | your stuff to be safe and you want your privacy to be maintained.
        
         | geofft wrote:
         | Er, do you have a recommendation for non-cloud services that
         | provide me with automatic backups of everything I do if
         | something happens to my apartment, or open-source services that
         | are free of all security flaws allowing hackers to compromise
         | your privacy?
         | 
         | Like I wholeheartedly get where you're coming from, but I'm not
         | sure what realistic alternatives look like.
        
           | endisneigh wrote:
           | Use a photo client on an android phone, encrypt the photos
           | and use NextCloud (self-hosted). Make sure your certificates
           | are up-to-date. Reverse this process on another client to see
           | the photos.
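            | 
            | A minimal sketch of the encrypt-before-upload step
            | (hostname, user, and filenames are placeholders; NextCloud
            | exposes user files over WebDAV):
            | 
            |     from cryptography.fernet import Fernet
            |     import requests
            | 
            |     key = Fernet.generate_key()  # keep this key offline
            |     box = Fernet(key)
            | 
            |     with open("photo.jpg", "rb") as fh:
            |         blob = box.encrypt(fh.read())
            | 
            |     # PUT the ciphertext to the self-hosted server:
            |     requests.put(
            |         "https://cloud.example.com/remote.php/dav"
            |         "/files/alice/photo.jpg.enc",
            |         data=blob,
            |         auth=("alice", "app-password"),
            |     )
            |     # To view elsewhere: download, then box.decrypt(blob)
            |     # with the same key.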
        
             | busymom0 wrote:
             | Unfortunately since iOS doesn't allow changing default
             | Camera app, using a third party photo client is not that
             | easy.
        
             | geofft wrote:
             | What sort of Android? Normal Google-provided Android, which
             | contains closed-source components? Or do you have an open-
             | source alternative in mind?
             | 
             | Where do I self-host NextCloud?
        
               | endisneigh wrote:
               | you can self-host NextCloud on your laptop or desktop.
               | 
               | you can use whichever flavor of Android you'd like - a
               | lot of people like GrapheneOS
        
               | jjulius wrote:
               | So, lemme preface this by saying I also use my own NAS at
               | home and wholly support your push for self-hosting. That
               | said, I'm gonna play devil's advocate here, because this
               | is an area I admittedly haven't put much thought into.
               | 
               | GP said:
               | 
               | >... non-cloud services that provide me with automatic
               | backups of everything I do if something happens to my
               | apartment...
               | 
               | And my first concern is that a laptop or desktop (or my
               | NAS) won't fare well if an apartment is broken into and
               | the devices are stolen, or they get severely damaged in a
               | natural disaster. What might a good mitigation be for
               | that that doesn't involve spinning up a VPS via DO, etc.?
        
               | octopoc wrote:
               | How about whole disk encryption with Duplicati backing up
               | to Sia[1]? This will let you do automatic backups to a
               | decentralized version of S3 where you pay cryptocurrency
               | (Siacoin) for the storage.
               | 
               | That way here's what you need to protect:
               | 
               | - Your whole-disk encryption keys / passwords
               | 
               | - Your Sia key
               | 
               | - Your password for Duplicati backups
               | 
               | [1]
               | https://duplicati.readthedocs.io/en/latest/05-storage-
               | provid...
        
               | endisneigh wrote:
               | > And my first concern is that a laptop or desktop (or my
               | NAS) won't fare well if an apartment is broken into and
               | the devices are stolen, or they get severely damaged in a
               | natural disaster. What might a good mitigation be for
               | that that doesn't involve spinning up a VPS via DO, etc.?
               | 
               | occasional offsite backups of all of your data
        
           | fgygjifdscg wrote:
           | Synology has a phone app that automatically backs up your
            | photos. Then a pointy-clicky setup to back up your backups
            | to Glacier
        
           | gpm wrote:
            | Since we're on HN: use something like restic and a cron job
            | backing up to something like Backblaze.
           | 
           | Yes, it's the cloud, but they don't control the backup
           | software (it's open source, not developed by them, using
           | public APIs) or the encryption keys (protected by a
           | passphrase client side, use a good one).
           | 
           | There may be less programmery equivalents of that, which have
           | scheduling built into the backup tool and so on. Just
           | recommending what I've used.
           | 
           | (This advice applies to general purpose computers, not sure
           | if there's an equivalent for android or iOS)
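            | 
            | Roughly what that looks like as a cron-able script (the
            | bucket, paths, and credentials below are placeholders):
            | 
            |     import os, subprocess
            | 
            |     env = dict(
            |         os.environ,
            |         RESTIC_PASSWORD="long-passphrase",  # client key
            |         B2_ACCOUNT_ID="placeholder",
            |         B2_ACCOUNT_KEY="placeholder",
            |     )
            |     # restic encrypts locally, so B2 stores ciphertext
            |     subprocess.run(
            |         ["restic", "-r", "b2:my-bucket:backups",
            |          "backup", os.path.expanduser("~/photos")],
            |         env=env, check=True,
            |     )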
        
           | nonbirithm wrote:
            | What I _wouldn't_ recommend is using a cloud service like
           | Flickr if you value the integrity of your data. Recently they
           | charged me with a "Community Guideline Warning" because I
           | accidentally included a folder filled with non-photo images,
           | and threatened to delete my account within three days. There
           | is no way to appeal or know which content is specifically in
           | violation. All they left was a curt, automated email. You can
           | request all your data from Flickr to save everything, but
           | they state that the process can take up to several weeks
           | before you're given a download link, longer than the three
           | days it takes for them to arbitrarily terminate your account.
           | 
           | Even with services like MEGA that tout keywords like
           | "encryption" and "privacy" without end, look at their Terms
           | of Service and there will be a clause stating that they will
           | still give all your data to any law enforcement agency that
           | puts in a request, _including encrypted files._ And they have
            | no choice but to do so, because of the very same issue that's
            | forcing the personal privacy debate with Apple into the
           | spotlight - hosting any kind of illegal data, which is for
           | all intents and purposes synonymous with CSAM, makes your
           | service criminally liable in dozens of different
           | jurisdictions. A public cloud provider that allows arbitrary
           | file uploads and has absolute privacy _cannot exist_ because
           | of the necessity to allow law enforcement to investigate
           | reports of the illegal hosting of CSAM in dozens of different
           | countries.
           | 
           | Any expectation of privacy is a facade when dealing with the
           | cloud. Even beyond the issue of privacy, you're still at the
           | whims of the cloud provider in the end. Fail to play by their
           | rules, and you're done for.
        
       | mmaunder wrote:
       | Perhaps the fundamental flaw in thinking behind this tactical
       | error is that Apple thinks of your iPhone as their iPhone.
        
         | musesum wrote:
         | Same goes for MacOS, try deleting the News App on Big Sur.
        
         | stjohnswarts wrote:
         | They see ownership more as "guidelines" than actual rules :)
        
       | jl2718 wrote:
       | NCMEC is a private charity. Why isn't this being handled by law
       | enforcement?
       | 
       | Is this the beginning of the age of vigilante justice?
        
         | dagmx wrote:
         | I think you should look into the relationship between the NCMEC
         | and the government. It's not just some arbitrary charity
         | running it. They work with the government, with government
          | regulations, while providing a layer of separation that,
          | theoretically, should prevent government abuse.
        
         | zionic wrote:
         | NCMEC is literally just the feds, with a few civilian
         | employees.
        
       | 5etho wrote:
       | Good, I hope Apple will die. Not so much of value will be lost
        
       | jonplackett wrote:
       | In their attempt to make this extra private by scanning 'on
       | device', I think they've managed to make it feel worse.
       | 
       | If they scan my iCloud photos in iCloud, well lots of companies
       | scan stuff when you upload it. It's on their servers, they're
       | responsible for it. They don't want to be hosting CSAM.
       | 
       | It feels much worse them turning your own, trusty iPhone against
       | you.
       | 
       | I know that isn't how you should look at it, but that's still how
       | it feels.
        
         | nottorp wrote:
         | That _is_ how you should look at it.
         | 
         | They have no business scanning the phones for anything, period.
         | 
         | I'll refer you to the poem beginning with "First they came for
         | the Jews...".
        
           | [deleted]
        
           | petermcneeley wrote:
           | The poem begins with "First they came for the socialists.."
           | 
           | https://en.wikipedia.org/wiki/First_they_came_...
        
             | tpush wrote:
             | 'Communists' instead of 'socialists' actually; the US
             | version is inaccurate due to anti-communist propaganda.
        
               | stjohnswarts wrote:
               | Do you have a source for that?
        
               | FabHK wrote:
                | The original:
                | 
                |     Als die Nazis die Kommunisten holten,
                |     habe ich geschwiegen; ich war ja kein Kommunist.
                |     Als sie die Sozialdemokraten einsperrten,
                |     habe ich geschwiegen; ich war ja kein Sozialdemokrat.
                |     Als sie die Gewerkschafter holten,
                |     habe ich geschwiegen; ich war ja kein Gewerkschafter.
                |     Als sie die Juden holten,
                |     habe ich geschwiegen; ich war ja kein Jude.
                |     Als sie mich holten,
                |     gab es keinen mehr, der protestieren konnte.
                | 
                | (When the Nazis took the communists, I kept silent; I
                | was not a communist. When they locked up the social
                | democrats, I kept silent; I was not a social democrat.
                | When they took the trade unionists, I kept silent; I
                | was not a trade unionist. When they took the Jews, I
                | kept silent; I was not a Jew. When they took me, there
                | was no one left who could protest.)
               | 
               | https://de.wikipedia.org/wiki/Martin_Niemoller#Habe_ich_g
               | esc...
        
           | userbinator wrote:
           | "First they came for the pedophiles..."
        
         | jamesholden wrote:
         | Oh hey.. Roomba scanning for illicit drugs on your coffee
         | table. It sounds ridiculous, but that thing is a literal camera
         | roaming around your entire house, with machine learning.. so..
        
         | jpxw wrote:
         | In reality, the fact that they do the scanning on-device is
         | actually better privacy-wise. But I can see how instinctually
         | it "feels" wrong.
        
           | stjohnswarts wrote:
           | Turning your own device against you is better? There is no
           | way that is better for you as a user. It's great for
           | governments and police forces but it doesn't do anything for
           | you as a user but spy on you without getting a warrant.
        
           | eCa wrote:
           | Can you explain how it is better?
           | 
            | If they only scanned iCloud I can choose to not use that.
           | 
           | If they scan on my iphone my choice would be to stop using
           | that and get a gphone instead.
        
             | fossuser wrote:
             | It only runs if you have iCloud photos enabled.
             | 
             | Then it only runs client side and alerts if there are
             | matches to CSAM hashes above some threshold.
             | 
             | The other way is it runs against unencrypted images in
             | iCloud and they'd know of any match.
             | 
             | The new method also paves the way for the iCloud photos to
             | be encrypted (they're currently not).
        
               | AlexandrB wrote:
               | > It only runs if you have iCloud photos enabled.
               | 
               | The detailed specifics of this would be interesting to
               | know. If I accidentally flip iCloud photos on and then
               | turn it off within a few minutes, would that give the
               | system license to scan all my photos once? What if my
               | phone is in airplane mode and I turn iCloud photos on?
               | Will the system scan photos and create vouchers offline
               | for later upload?
               | 
               | Edit: On a related note, will turning on iCloud photos
               | now come with a modal prompt and privacy warning so
               | you're fully informed and can't do it "by accident"?
               | Pardon my ignorance if this happens already. I've never
               | turned mine on and am not about to start.
        
           | rowanG077 wrote:
           | No it's not. Not scanning any content on device at all is
           | better privacy wise.
        
             | treesprite82 wrote:
             | Often, despite all its disadvantages, on-device scanning
              | would at least let users see what was being scanned for.
             | 
             | But Apple have circumvented that with NeuralHash. There's
             | no way for the user to verify that only CSAM is being
             | detected - which could be partially accomplished by having
             | iOS only accept hashes signed by multiple independent child
             | protection organisations (with some outside of FVEY).
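              | 
              | A sketch of that quorum check (the keys - Ed25519 public
              | keys from the `cryptography` library - and the "3 orgs"
              | rule are hypothetical, not anything iOS implements):
              | 
              |     from cryptography.exceptions import InvalidSignature
              | 
              |     def quorum_ok(db_bytes, sigs, trusted_keys, need=3):
              |         # count distinct trusted orgs whose Ed25519
              |         # signature verifies over these exact bytes
              |         good = 0
              |         for org, key in trusted_keys.items():
              |             try:
              |                 key.verify(sigs[org], db_bytes)
              |                 good += 1
              |             except (KeyError, InvalidSignature):
              |                 pass
              |         return good >= need  # e.g. 3, not all in FVEY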
        
         | ksec wrote:
          | I think what you describe, and what is shown by Apple, is
          | "the" concrete piece of evidence for what I have long
          | suspected: there is no longer any product person running the
          | company. It used to be Steve Jobs.
          | 
          | You have people arguing over the technical definition of
          | privacy, or over on-device vs. off-device terminology. It
          | doesn't matter. The user doesn't understand any of it. They
          | just feel exactly what you wrote: an intrusion into their
          | private space by a company that has been telling them how
          | much it values their privacy for the better part of a decade.
          | 
          | These sorts of technical explanations used to be a Tech
          | Companies / Google / Microsoft thing. They absolutely make
          | sense to nerds, but not much to a layman. Now Apple is doing
          | the same. If you watch their Keynotes, you should notice the
          | increased usage of technical jargon and performance numbers
          | over the past 5 years - something Steve Jobs's keynotes used
          | to keep to a minimum.
        
         | skybrian wrote:
         | They didn't do themselves any favors by blurring the line
         | between apps and the OS. If it's just Apple's Photos app doing
         | this, you can install a different app and not use that one.
        
           | stjohnswarts wrote:
           | That's true for now but once it gets normalized it will be
           | extended laterally across the whole phone.
        
         | ursugardaddy wrote:
         | I wouldn't be surprised if there was some weird telemetry tied
         | to this too
        
         | hamburgerwah wrote:
         | Also if you remotely think the scanning is going to stop at
         | CSAM I have a bridge in brooklyn to sell you. Copyrighted
         | material, national security, thought crime, are already inside
         | this trojan horse waiting to jump out.
        
         | dangerface wrote:
          | I think it's just more visible. It's easy to upload to the
          | internet and think "I'm uploading to the internet"; you don't
          | think "I'm sharing this with law enforcement and Apple's
          | third parties."
        
         | pbasista wrote:
         | > your own, trusty iPhone
         | 
         | iPhone runs software which is in full control of Apple. They
          | take active measures to stop the general public from auditing
         | behavior (i.e. no source code, no API specifications, etc).
         | 
         | In my opinion, such a device can never be considered as
         | belonging to their users or "trusted" in any reasonable way.
         | The reason is that its root functionality always remains in
         | control of a different entity.
         | 
         | I believe that such a relationship can more accurately be
         | described as a lease.
         | 
         | Instead of providing specifications and source code, which
         | would help establish the fundamental trust in their products
         | and their behavior, Apple seems to increase their reliance on
         | the "trust me bro" promises, which their current user base has
         | accepted.
         | 
         | So far.
        
         | thefourthchime wrote:
         | I agree. While the difference is slight, one feels almost
         | expected, the other is creepy and invasive.
        
         | farmerstan wrote:
         | In a few years, all the pedophiles will stop using iPhone, and
         | then only innocent people will be scanned in perpetuity.
         | Remember, if someone makes new CSAM, it won't match any hashes
         | so even "new" pedophiles won't get caught by this.
         | 
         | So really, the steady state is just us regular folks getting
         | scanned. As the years go on, what is defined as CSAM will morph
         | into "objectionable material", and will then include
         | "misinformation" like today. They say they won't right now, but
         | where will things be in a few years?
        
           | stjohnswarts wrote:
            | Apple just announced it. I would posit it will be a few
            | weeks, not a few years. This news is everywhere.
        
           | rootusrootus wrote:
           | > if someone makes new CSAM, it won't match any hashes so
           | even "new" pedophiles won't get caught by this.
           | 
           | That assumes the hashes aren't recorded permanently.
        
           | benhurmarcel wrote:
           | > all the pedophiles will stop using iPhone
           | 
           | Sounds like "mission accomplished" for Apple then
        
         | Wowfunhappy wrote:
         | The practical difference is that with on-device scanning, the
         | system is just a few bit flips away from scanning _every_ photo
         | on your device, instead of just the ones that are about to be
          | uploaded. With server-side scanning, the separation is clear:
          | what's on Apple's server can be scanned, and what's only on
         | your phone cannot.
         | 
         | This all plays so perfectly into my long-time fears about the
         | locked down nature of the iPhone. Sure, it's more secure _if_
         | Apple is a good steward, but no one outside of Apple can
          | inspect or audit what it's doing!
        
           | jonplackett wrote:
           | Yes. This is it. When I wrote the above I couldn't quite put
           | my finger on why I felt like this but this is the reason.
           | It's the slippery slope of mine / yours.
        
           | FooHentai wrote:
           | >the system is just a few bit flips away from scanning every
           | photo on your device
           | 
           | I prefer to think of it as being just one national security
           | letter away from that happening.
           | 
           | Which is of course a schroedinger's cat kinda thing. It's
           | already been sent. Or it hasn't. But why would the national
           | security apparatus _not_ take advantage of this obvious
           | opportunity?
           | 
           | Anyone giving benefit of the doubt to this kind of stuff
           | nowadays is IMO very naive or just poorly informed of recent
           | digital history.
        
             | jdavis703 wrote:
             | How would the agency issuing an NSL be able to generate a
             | hash of a photo they're looking for? Presumably if they
             | already had a photo to derive the hash they'd already have
             | whatever it is they're searching for.
        
               | barsonme wrote:
               | It's fuzzy hashing, so they don't need the bit-for-bit
               | same image.
               | 
               | Outside the US and other western countries, this could be
               | a government (Russia? Chechnya?) forcing Apple to look
               | for LGBT pride flags.
        
               | silverpepsi wrote:
               | I think you already got one great reply, I have just one
               | thing to add to it: your post literally presupposes a 1.0
               | version of this software that never has its feature set
               | expanded. I don't think that's a reasonable assumption.
               | After all, with 1.0 the goal of catching this class of
               | person is barely achieved. They'll likely arrest people,
               | 99% of whom are just CSAM jpeg collectors who get an
               | extra kick out of viewing a class of image that is super
               | illegal. And nothing more.
               | 
                | Then for version 2.0, they'll realize INTERPOL, the FBI,
                | or whoever can provide them a combo hash plus one string of
               | text that can nail a known active producer. The holy
               | grail. This small feature add to get so much closer to
               | the original "moral goal" will prove too appealing to
               | pass up. Now all the code is in place for the govt to
               | pass over requests for data pinpointing mere common
               | criminals.
        
               | gpm wrote:
               | Because you're searching for people, not documents.
               | 
               | For a normal law enforcement context, say you bust 1 drug
               | dealer, and are trying to find more. Maybe they have an
               | image of a "contract" (document outlining terms of a drug
               | deal) on their phone. You could find the other parties to
               | that contract by searching everyone's phones for that
               | image.
               | 
               | For a national security context, you could imagine
               | equivalents to the above, you could also imagine
               | searching for classified documents that are believed to
               | have been leaked, or maybe searching for documents you
                | stole from an adversary that you believe their spies are
               | likely to be storing on their phones.
               | 
               | I'm saying documents here instead of images, but plenty
               | of documents are just images, and I have little doubt
               | that they could get this program to expand to include
               | "scan PDFs, because you could make a CSAM into a pdf" (if
               | it doesn't already?).
        
               | unionpivo wrote:
                | The program has the capability to upload questionable
                | photos for review.
                | 
                | Just make it match the equivalent of .* - all photos on
                | the device. It would be hard to argue that's more
                | difficult than scanning for a specific hash.
                | 
                | And there is nothing specific about images. Extending
                | this to scan arbitrary data is probably not that much
                | code, depending on how it's programmed; it may be
                | configurable.
        
           | nitrogen wrote:
           | _With server-side scanning, the separation is clear_
           | 
           | In all these threads everyone is coming close to the crux of
           | the issue, but I want to restate it in clearer terms:
           | 
           | There is a sacrosanct line between "public" and "private,"
           | "mine" and "yours." That line cannot be crossed by Western
           | governments without a warrant. Cloud computing has
           | deliberately blurred this line over time. This on-device
           | scanning implementation blows right past that line.
           | 
           | Our tools before the computing revolution, and our devices
           | after, become a part of us. Our proprioception extends to
           | include them as part of "self." A personal device -- a tool
           | that should be wholly owned and wholly dependable, like pen
           | and paper -- that betrays its user, is a self that betrays
           | itself.
        
             | inkyoto wrote:
             | > There is a sacrosanct line between "public" and
             | "private," "mine" and "yours." That line cannot be crossed
             | by Western governments without a warrant.
             | 
             | This is a self-delusion, I am afraid. The line has been
             | crossed more than once, and it will be crossed again. UK
             | and Australian governments are just two prime examples of
             | waving terrorism and pedobear banners as a pretext to get
             | invasive with each new legislation, and the Oz government
             | already has a new legislation draft to make it a crime to
              | refuse cooperation with law enforcement services when they
              | request access to encrypted content (think Signal
             | messages). Also, refer to the Witness K case to see how
             | cases that are unfavourable to the standing government
             | completely bypass a <<trustworthy>> Western judicial
             | system, including the Minister of Justice.
             | 
              | CSAM scanning is guaranteed to be abused under whatever new
              | pretext politicians can come up with, and we won't even
              | know that Apple has been quietly subjugated to comply with
              | it in those jurisdictions. There will be even less
              | transparency and even more abuse of CSAM scanning in
              | <<non-Western>> countries.
             | That is the actual worry.
        
               | adolph wrote:
               | And all the while:
               | 
                |  _Sir James Wilson Vincent Savile OBE KCSG (/ˈsævɪl/;
                | 31 October 1926 - 29 October 2011) was an English DJ,
                | television and radio personality who hosted BBC shows
                | including Top of the Pops and Jim'll Fix It. He raised an
                | estimated £40 million for charities and, during his
                | lifetime, was widely praised for his personal qualities
                | and as a fund-raiser. After his death, hundreds of
                | allegations of sexual abuse were made against him,
                | leading the police to conclude that Savile had been a
                | predatory sex offender - possibly one of Britain's most
                | prolific. There had been allegations during his lifetime,
                | but they were dismissed and accusers ignored or
                | disbelieved; Savile took legal action against some
                | accusers._
               | 
               | https://en.wikipedia.org/wiki/Jimmy_Savile
        
               | Wowfunhappy wrote:
               | Respectfully, I don't understand where you're going with
               | this at all. I could point to it and say, _" Wow, we need
               | to make sure nothing like that ever happens again, no
               | matter what the cost to personal liberty!"_
        
               | adolph wrote:
               | I can see that interpretation for sure. My impression is
                | that folks like Savile will always be protected from CSAM
                | scanning by the same powers that be calling for it. I'm not
               | certain if that viewpoint is realism or cynicism, or if
               | there is a difference.
        
               | dmix wrote:
               | There's never been any indication that there has been any
               | more difficulty in catching these guys with better
               | smartphone privacy features. It's easier than ever
               | largely because the police have become better at
               | investigations involving the internet.
               | 
               | Asking the police to put some investigative effort in
               | rather than a dragnet sweep of every photo on every
               | persons device is not too much to ask.
               | 
               | Importantly this should be the hard-capped precedent for
               | every sort of crime. There's always a ton of other ways
               | these people get caught without handing over preemptive
               | access to every innocent persons phone. Same with
               | terrorism, murder, etc.
        
               | [deleted]
        
               | adolph wrote:
               | Maybe I should add more text in support of the parent
               | comment's point. When I read "UK and Australian
               | governments are just two prime examples of waving
               | terrorism and pedobear banners," my first thought was:
               | all the while _protecting_ monsters like Savile.
               | 
                | I see the push for CSAM scanning as an alignment of powers
                | in the form of "Bootleggers and Baptists." The well-meaning
                | moral campaigners are a tool for folks who seek to surveil
                | and control.
               | 
               | https://en.wikipedia.org/wiki/Bootleggers_and_Baptists
        
               | joe_the_user wrote:
               | The line between private and public has been crossed many
               | times but we're still better off having that line
               | officially there.
               | 
               | Even if the cops and private security along with TLAs are
               | actively spying on people on an extremely regular basis,
               | we're better off if officially they aren't supposed to
               | be, if some court cases can get tossed for this, that
               | they have to retreat sometimes, etc.
               | 
               | This is why having Apple overtly spying on the "private
               | side" is still a big thing even with all the other
               | revelations.
        
               | mlindner wrote:
               | In the US however this division remains, even if it has
               | been lost elsewhere. (This loss is one of the reasons I
                | have never wanted to live in the UK or Australia.)
        
               | feanaro wrote:
               | The line is not a delusion just because it happened to be
               | crossed at some point. The line is there because humans
               | intuitively consider it to be there.
               | 
               | The fact that the line is there gives us a reason and a
               | foothold to fight back when someone attempts crossing it.
        
             | N00bN00b wrote:
             | > that betrays its user, is a self that betrays itself.
             | 
              | And cloud computing is corporations taking your self away
              | from you.
             | 
             | And that was always going to happen. And people have been
             | fighting that idea in the shadows for a while.
             | 
             | This issue is just bringing it to the foreground in a most
             | intense way. "Think of the children."
             | 
             | They are coming for your self. They will take it. There's
              | no doubt. There never was. Your self is _too_ valuable and
              | you've just been squandering it anyway.
             | 
             | Eventually you become Apple. Unless you start resisting,
             | but that's _hard_. Just slooooowly warm the water. This
              | froggy is fine. Nice and warm. Just don't touch the bottom
             | of the pot.
        
           | zepto wrote:
           | How is it any more 'bit flips' away from scanning every photo
           | on your device than it was before?
        
             | Wowfunhappy wrote:
              | Because
              | 
              |     if (willBeUploaded) {
              |         scanPhoto();
              |     }
              | 
              | can become
              | 
              |     if (true) {
              |         scanPhoto();
              |     }
             | 
             | Obviously, this is stupidly oversimplified, I have no idea
             | how Apple has structured their code. But the fact of the
             | matter is, if the scanning routine is already on the phone,
             | and the photos are on the phone, all anyone has to do is
             | change which photos get scanned by the routine...
        
               | zepto wrote:
               | Right, but the CSAM scanning routine is absurdly narrow
               | and difficult to use for detecting anything but CSAM,
               | whereas a general purpose hash matching algorithm really
               | is just a few lines of code.
               | 
               | This whole 'now that they have a scanner it's easier'
               | reasoning doesn't make sense. iOS already had numerous
               | better matching algorithms if your goal is to do general
               | spying.
        
               | zionic wrote:
               | This is false information, the scanning system is highly
               | generalized to finding any type of photo via fuzzy
               | matched perceptual hashes.
               | 
               | It can target a pride flag just as easily as CP.
        
               | zepto wrote:
               | No, what you have said is bullshit. The perceptual hashes
               | are not generalized. They do not match 'child porn' or
               | 'pride flag'. They match specific photos from a database.
               | 
               | You might argue that that database could contain a
               | specific photo of a pride flag. You would be right, but
               | there are safeguards against that. The system will not
               | trigger with just one match. A set of multiple images
               | must match. So no, the system won't match a pride flag,
               | and no the system won't match a specific picture of a
               | pride flag.
               | 
               | The next argument would be that a whole load of pictures
               | of pride flags could be uploaded and then someone who has
               | enough of those pictures would trigger a match.
               | 
               | This is true from a technical standpoint, but there are
               | two more safeguards. The database is maintained by NCMEC,
               | which is a non profit dedicated to tackling child abuse,
               | _and_ matches are visually confirmed by Apple.
               | 
               | So, to match someone who just happens to have a pride
               | flag in a picture is not possible. To match someone who
               | has a set of specific pictures of pride flags is only
               | possible if both Apple and NCMEC collude.
               | 
               | So no - it can't target a pride flag at all, much less as
               | easily as CP.
        
               | mdoms wrote:
               | This is pure speculation.
        
               | zepto wrote:
               | Not one line of it is speculation. This is taken from
               | Apple's public docs.
        
               | cf499 wrote:
               | "(...) but there are safeguards against that."
               | 
               | Hah! Nice one :D
        
               | zepto wrote:
               | Apple have described the safeguards. If you want to claim
               | they are lying, by all means do so.
        
               | simion314 wrote:
               | >A set of multiple images must match.
               | 
               | How many, exactly? The threshold is secret; it could be 1
               | or 2, or maybe it is dynamic and depends on things like
               | your identity.
        
               | zepto wrote:
               | It could be 1 or 2, but the claim is that it is set to
               | only detect people who are actually building a collection
               | of CSAM images because Apple doesn't want this system to
               | flag people who are unlikely to be actual criminals. I.e.
               | they don't just want no false positives, they don't want
               | positives that aren't highly likely to be criminal. I'm
               | guessing it's a lot more than 2.
        
               | simion314 wrote:
               | > but the claim is that it is set to only detect people
               | who are actually building a collection of CSAM
               | 
               | From what I read, the threshold serves a different
               | purpose: there are too many false positives, so the
               | threshold was introduced to cut down the number of
               | reports. It is a workaround for the false positives, not
               | a way to catch people with big collections.
        
               | zepto wrote:
               | I have seen both in comments from Apple. Certainly
               | spokespeople have talked about wanting any reports to
               | NCMEC to be actionable.
               | 
               | I think a lot hinges on what you call a 'false positive'.
               | It could mean 'an image that is not CSAM' or it could
               | mean 'an account that has some apparent CSAM, but not
               | enough to be evidence of anything'.
        
               | simion314 wrote:
               | I think it is clear that Apple will have no choice: if
               | they find even 1 CSAM image for sure, they can't say "it
               | is just 1 image, so it is not a big collection..."; they
               | have to send it to whoever handles it.
               | 
               | What would help me with this kind of thing is more
               | transparency: knowing the algorithm, the threshold, the
               | real false-positive numbers, whether the hashes can be
               | set per country or per individual, and whether
               | independent people can look inside iOS and confirm that
               | the code works as described...
               | 
               | What would also help is knowing for sure how many
               | children were actually saved by this kind of system.
        
               | zepto wrote:
               | > I think it is clear that Apple will have no choice: if
               | they find even 1 CSAM image for sure, they can't say "it
               | is just 1 image, so it is not a big collection..."; they
               | have to send it to whoever handles it.
               | 
               | This is a misunderstanding of how it works. The threshold
               | has to be met _before_ a detection occurs. Unless the
               | threshold is met Apple doesn't get to know about _any_
               | images.
               | 
               | As for independent auditing and transparent statistics, I
               | fully agree.
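               | 
               | The rough shape of the threshold idea, as a toy Shamir
               | secret-sharing sketch in Swift (an illustration only;
               | Apple's actual threshold scheme and PSI protocol are far
               | more involved):
               | 
               |     // Toy Shamir secret sharing over GF(251).
               |     let p = 251
               | 
               |     func modPow(_ base: Int, _ exp: Int, _ m: Int) -> Int {
               |         var r = 1, b = base % m, e = exp
               |         while e > 0 {
               |             if e & 1 == 1 { r = r * b % m }
               |             b = b * b % m
               |             e >>= 1
               |         }
               |         return r
               |     }
               | 
               |     // Each matching image hands the server one share.
               |     func makeShares(secret: Int, threshold t: Int,
               |                     count n: Int) -> [(x: Int, y: Int)] {
               |         let coeffs = [secret] + (1..<t).map { _ in Int.random(in: 0..<p) }
               |         return (1...n).map { x in
               |             (x: x, y: coeffs.reversed().reduce(0) { ($0 * x + $1) % p })
               |         }
               |     }
               | 
               |     // Lagrange interpolation at x = 0; only meaningful
               |     // with at least `threshold` distinct shares.
               |     func recover(_ shares: [(x: Int, y: Int)]) -> Int {
               |         var secret = 0
               |         for (i, s) in shares.enumerated() {
               |             var num = 1, den = 1
               |             for (j, t) in shares.enumerated() where j != i {
               |                 num = num * (p - t.x) % p
               |                 den = den * ((s.x - t.x + p) % p) % p
               |             }
               |             secret = (secret + s.y * num % p * modPow(den, p - 2, p)) % p
               |         }
               |         return secret
               |     }
               | 
               |     let shares = makeShares(secret: 42, threshold: 10, count: 30)
               |     print(recover(Array(shares.prefix(10))))  // 42
               |     print(recover(Array(shares.prefix(9))))   // unrelated value
               | 
               | Below the threshold the server holds shares it
               | mathematically cannot combine into the decryption key.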
        
               | Wowfunhappy wrote:
               | Isn't the scanning routine searching for fingerprints?
               | What happens if someone adds a fingerprint to the
               | database that matches something other than CSAM?
        
               | zepto wrote:
               | No.
               | 
               | It's way more complex than that, and has been engineered
               | to prevent exactly that scenario.
               | 
               | I recommend you actually check out some of Apple's
               | material on this.
               | 
               | The idea that it's just a list of fingerprints or hashes
               | that can easily be repurposed is simply wrong, but it is
               | the root of almost all of the complaints.
        
               | Wowfunhappy wrote:
               | I've looked through Apple's paper, what are you referring
               | to?
        
               | zepto wrote:
               | So you know it's not just a fingerprint. Did you notice
               | the parts about the safety vouchers and the visual
               | derivatives?
        
               | dannyw wrote:
               | Yes, I did. You do realise that CSAM is just a policy
               | control, and there is literally nothing technical
               | stopping Apple from adding non-CSAM content to the same
               | policy and systems?
               | 
               | I feel like you may not have actually read the paper in
               | depth. The paper clearly shows this is a generalised
               | solution for detecting content that perceptually matches
               | a database. The "what" is CSAM today, but nothing about
               | the technicals require it to be CSAM.
               | 
               | To answer your specific response of safety voucher and
               | visual derivative:
               | 
               | - The 'safety voucher' is an encrypted packet of
               | information containing a part of the user's decryption
               | keys, as well as a low resolution, grayscale version of
               | the image (visual derivative). It is better described as
               | "backdoor voucher".
               | 
               | - The 'visual derivative' is just the image but smaller
               | and grayscale.
               | 
               | Neither of those two technologies has anything to do with
               | CSAM. Apple's statement that they will only add CSAM to
               | the database is a policy statement. Apple can literally
               | change it overnight to detect copyrighted content, or
               | Snowden documents, or whatever.
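               | 
               | In struct form, roughly (field names are my reading of
               | the summary, not Apple's actual definitions):
               | 
               |     import Foundation
               | 
               |     struct SafetyVoucher {
               |         let encryptedKeyShare: Data  // fragment of the account's key
               |         let visualDerivative: Data   // low-res grayscale copy
               |         let matchMetadata: Data      // opaque below the threshold
               |     }
               | 
               | Swap the database behind it and the same envelope carries
               | anything.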
        
               | zepto wrote:
               | You are ignoring how the parts of the technology are used
               | to separate responsibilities.
               | 
               | > Apple can literally change it overnight to detect
               | copyrighted content, or Snowden documents, or whatever.
               | 
               | No, Apple doesn't run the database. NCMEC does.
        
               | Wowfunhappy wrote:
               | It doesn't matter where the fingerprints are coming from,
               | they're all just fingerprints. Today they come from
               | NCMEC. Tomorrow they could be from the Chinese Communist
               | Party.
        
               | zepto wrote:
               | It's hard to take seriously the complaint that this
               | mechanism would give the Chinese Communist party more
               | power.
               | 
               | Also, in the US, the 4th Amendment protects against the
               | abuse you are talking about:
               | 
               | https://www.economist.com/united-
               | states/2021/08/12/a-38-year...
        
               | jkcxn wrote:
               | By the way, they already scan photos that aren't uploaded
               | to iCloud. I've never used iCloud, and I can go into the
               | Photos app and search for "food", for example.
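               | 
               | That search runs on-device. The public Vision framework
               | exposes the same kind of classifier; a sketch (I can't
               | say Photos uses exactly this API internally):
               | 
               |     import CoreGraphics
               |     import Vision
               | 
               |     func labels(for image: CGImage) throws -> [String] {
               |         let request = VNClassifyImageRequest()
               |         try VNImageRequestHandler(cgImage: image, options: [:])
               |             .perform([request])
               |         return (request.results ?? [])
               |             .filter { $0.confidence > 0.5 }  // confident labels only
               |             .map { $0.identifier }           // e.g. "food", "dog"
               |     }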
        
               | zepto wrote:
               | Exactly. _They already analysed photos on your phone for
               | search_. They don't in fact do CSAM detection on the
               | photos on your phone.
               | 
               | But why are people only objecting now when they were
               | already doing on-device analysis?
               | 
               | CSAM detection is a complete red herring. It is _less_ of
               | a general purpose scanning mechanism than the search
               | function that was already there.
        
               | Wowfunhappy wrote:
               | They're doing on device analysis, but there isn't any
               | process by which an image of a hotdog will be sent to
               | someone else without my knowledge.
        
               | zepto wrote:
               | Right, but neither is the CSAM code being used to detect
               | anything but CSAM.
               | 
               | If a bit can be flipped to make the CSAM detector go
               | evil, surely it can be flipped to make photo search go
               | evil, or spotlight start reporting files that match
               | keywords for that matter.
               | 
               | There is nothing special about this CSAM detector except
               | that it's narrower and harder to repurpose than most of
               | the rest of the system.
        
               | pritambaral wrote:
               | > ... to make photo search go evil, or spotlight start
               | reporting files that match keywords for that matter.
               | 
               | As your immediate parent has already said: there isn't
               | any process by which an image of a hotdog will be sent to
               | someone else without my knowledge.
               | 
               | Neither "photo search" nor "spotlight" report anything to
               | third parties currently.
        
               | zepto wrote:
               | No, but a 'bit flip' could make them do so just as easily
               | as a 'bit flip' could make the CSAM detector do something
               | nefarious.
        
               | pritambaral wrote:
               | How are you still not getting this?: "search" and
               | "spotlight" _do not have the code_ to report anything to
               | anyone.
        
               | ninkendo wrote:
               | Right, the difference is that now, you can be reported to
               | the authorities for a photo that the CSAM algorithm
               | mistook for child pornography, whereas before the image
               | classifying was purely for your own use.
        
               | zepto wrote:
               | This is simply a complete falsehood, because photos that
               | are not uploaded are not scanned for CSAM.
               | 
               | Photos for your own use that you do not upload to Apple's
               | cloud are completely unaffected.
        
           | slg wrote:
           | There is also an argument that cloud scanning is a few
           | bit flips away from allowing random Apple employees to
           | look through all my photos, while on-device scanning
           | means they have to be specifically looking for certain
           | content. On-device scanning is probably preferable if
           | one's fear is someone stealing their nudes, which is one
           | of the more common fears about photo privacy and one that
           | Apple has already shown is a problem for them.
        
             | grishka wrote:
             | When you choose to use a cloud (aka someone else's
             | computer), you trust that someone else, because you're no
             | longer in control of what happens to your data once it
             | leaves your device. That's a sane expectation, anyway. But
             | you don't expect your own device to behave against your
             | interests.
        
             | Wowfunhappy wrote:
             | Sure, but I don't particularly care because I don't upload
             | sensitive photos to iCloud. As I'd recommend to everyone
             | else. If it's on someone else's computer, it's not yours.
             | 
             | But, I suppose my iPhone was never really mine either. I
             | knew that, I just never quite put two and two together...
        
               | slg wrote:
               | >Sure, but I don't particularly care because I don't
               | upload sensitive photos to iCloud
               | 
               | So you are saying you have nothing to fear because you
               | aren't hiding anything?
               | 
               | Is this a tacit admission that you don't want them
               | scanning your phone for CSAM because they will find
               | something?
               | 
               | I'm obviously not seriously accusing you of anything,
               | just pointing out how your line of argument applies
               | equally to privacy whether on the cloud or on your
               | device.
        
               | Wowfunhappy wrote:
               | > So you are saying you have nothing to fear because you
               | aren't hiding anything?
               | 
               | No, I'm saying that if you're hiding something, maybe
               | don't give it to someone else.
        
               | slg wrote:
               | "I'm not giving my photos to anyone. I am just taking
               | photos on my phone and Apple is automatically backing
               | them up to iCloud. I have no idea how that feature was
               | turned on or how to turn it off." ~ most normal people.
               | 
               | People here too often only think of these issues from the
               | perspective of the type of person who browses HN. Apple
               | is thinking of this from the perspective of an average
               | iPhone users.
        
               | jdavis703 wrote:
               | You have to configure iCloud backups when setting up a
               | phone for the first time. Someone who is non-technical
               | but privacy-conscious isn't going to do this.
        
               | slg wrote:
               | I don't know why technical people assume non-technical
               | people fully understand the privacy implications of
               | certain technology.
        
               | Wowfunhappy wrote:
               | Or, perhaps many regular users _do_ understand the
               | separation, and that 's why this issue is getting so much
               | attention.
        
               | slg wrote:
               | Is this issue getting much attention outside of tech
               | circles? I have seen a few news stories here or there,
               | but nothing major. I have not heard a word about it from
               | any of my non-tech friends or from the non-tech people I
               | follow on social media. Meanwhile people on HN are acting
               | as if this is one of the biggest stories of the year with
               | multiple posts on it every day.
               | 
               | There is a huge separation in the importance that the
               | tech community puts on the general concept of privacy
               | while the average person rarely worries about it.
        
               | Wowfunhappy wrote:
               | I mean, absent a Gallup poll there's really no way to
               | know, but we were discussing this at my office today, and
               | I work at a graphic design studio, not a software
               | development firm.
        
           | throaway46546 wrote:
           | I agree. Our phones tend to hold all of our most intimate
           | data. Violating them is akin to violating our homes. It would
           | be nice if the laws saw it this way.
        
           | tmdg wrote:
           | I feel like this is a false sense of security. Even
           | before this change, they could easily access and scan
           | photos on your device; any on-device post-processing of
           | images means they already do.
        
             | shuckles wrote:
             | Apple already scans your photos for faces and syncs found
             | faces through iCloud. I'd imagine updating that machine
             | learning model is at least as straightforward as this one.
        
               | Wowfunhappy wrote:
               | They're searching for different things though. To my
               | knowledge, before now iOS has never scanned for
               | fingerprints of specific photographs. It would be so darn
               | easy to replace the CSAM database with fingerprints of
               | known Tiananmen Square photos...
        
               | samstave wrote:
               | So you have User A - they upload a pic with User A and
               | people B, C, D, E, Z.
               | 
               | iCloud scans for those faces,
               | 
               | finds those faces and ties them to other ID accounts via
               | face - then via fingerprint recognition to a device, and
               | to a location based on IMEI etc.
               | 
               | Apple's platform is literally the foundation for the most
               | dystopian digital tool-set in history...
               | 
               | Once the government is able to crack the Apple war chest,
               | everything is fucked.
        
               | samstave wrote:
               | --MBS has entered the chat.
               | 
               | Go fuck yourself.
        
               | shuckles wrote:
               | That is a distinction without a difference. I'm sure you
               | could put together quite a good tank man classifier
               | (proof: Google Reverse Image Search works quite well),
               | and it'd catch variations which a perceptual hash
               | wouldn't.
               | 
               | The only difference is intent. The technical risk has not
               | changed at all.
        
               | dpedu wrote:
               | That is to say, face scanning is just as insidious as
               | the new feature?
        
               | shuckles wrote:
               | The technical risk to user privacy - if your threat model
               | is a coerced Apple building surveillance features for
               | nation state actors - is exactly the same between CSAM
               | detection and Photos intelligence, which syncs results
               | through iCloud. In fact, the latter is more
               | generalizable, has no threshold protections, and so is
               | likely worse.
        
               | int_19h wrote:
               | It's the _legal_ risk that is the biggest problem here.
               | Now that every politician out there knows that this can
               | be done for child porn, there'll be plenty demanding the
               | same for other stuff. And this puts Apple in a rather
               | difficult position, since, with every such demand, they
               | have to either accede, or explain why it's not "important
               | enough" - which then is easily weaponized to bash them.
               | 
               | And not just Apple. Once technical feasibility is proven,
               | I can easily see governments _mandating_ this scheme for
               | all devices sold. At that point, it can get even more
               | ugly, since e.g. custom ROMs and such could be seen as a
               | loophole, and cracked down upon.
        
               | shuckles wrote:
               | This hypothetical lacks an explanation for why every
               | politician has not demanded Apple (or say Google) do this
               | scope creep already for photos stored in the cloud where
               | the technical feasibility and legal precedent has already
               | been established by existing CSAM scanning solutions
               | deployed at scale.
        
               | int_19h wrote:
               | I have to note that one of those solutions deployed at
               | scale _is_ Google's. But the big difference is that when
               | those were originally rolled out, they didn't make quite
               | that big of a splash, especially outside of tech circles.
               | 
               | I will also note that, while it may be a hypothetical in
               | this particular instance as yet, the EU already went from
               | passing a law that allows companies to do something
               | similar _voluntarily_ (previously, they'd be running
               | afoul of privacy regulations), to a proposed bill making
               | it _mandatory_ - in less than a year's time. I don't see
               | why the US would be any different in that regard.
        
               | shuckles wrote:
               | Ok but now you've said that the precedent established by
               | Google and others already moved the legislation to
               | require terrible invasions of privacy far along. You
               | started by saying Apple's technology (and, in particular,
               | its framing of the technology) has brought new legal
               | risk. What I'm instead hearing is that the risk would be
               | present even in a counterfactual world where nothing was
               | announced last week.
               | 
               | At this point of the discussion, people usually pivot to
               | scope creep: the on-device scanning could scan all your
               | device data, instead of just the data you put on the
               | cloud. This claim assumes that legislators are too dumb
               | to connect the fact that if their phone can search for
               | dogs with "on-device processing," then it could also
               | search for contraband. I doubt it. And even if they are,
               | the national security apparatus will surely discover this
               | argument for them, aided by the Andurils and NSOs of the
               | world.
               | 
               | As I have repeatedly said: the reaction to this
               | announcement sounds more like a collective reckoning of
               | where we are as humans and not any particular new risk
               | introduced by Apple. In the Apple vs. FBI letter, Tim
               | urged us to have a discussion about encryption, when we
               | want it, why we want it, and to what extent we should
               | protect it. Instead, we elected Trump.
        
               | int_19h wrote:
               | The precedent established by Google et al is that it's
               | okay to scan things that are physically in their data
               | centers. It's far from ideal, but at least it's somewhat
               | common sense in that if you give your data to strangers,
               | they can do unsavory things with it.
               | 
               | The precedent now established by Apple is that it's okay
               | to scan things that are physically in possession of the
               | user. Furthermore, they claim that they can do it without
               | actually violating privacy (which is false, given that
               | there's a manual verification step).
        
               | shuckles wrote:
               | The precedent established by Apple, narrowly read, is
               | it's ok to scan data that the user is choosing to store
               | in your data center. As you pointed out, this is at least
               | partly a legal matter, and I'm sure their lawyers - the
               | same ones who wrote their response in Apple vs. FBI I'd
               | imagine - enumerated the scope or lack thereof.
               | 
               | Apple's claim, further, is that this approach is more
               | privacy-preserving than one which requires your cloud
               | provider to run undisclosed algorithms on your plaintext
               | photo library. They don't say this is not "violating
               | privacy," nor would that be a well-defined claim without
               | a lot of additional nuance.
        
               | zepto wrote:
               | Exactly this. The whole thing is a red herring. If Apple
               | wanted to go evil, they can easily do so, and this very
               | complex CSAM mechanism is the last thing that will help
               | them.
        
               | shuckles wrote:
               | I've read your comments, and they are a glass of cold
               | water in the hell of this discourse. This announcement
               | should force people to think about how they are governed
               | - to the extent they can influence it - and double down
               | on Free Software alternatives to the vendor locked
               | reality we live in.
               | 
               | Instead, a forum of presumably technically savvy people
               | are reduced to hysterics over implausible futures and a
               | letter to ask Apple to roll back a change that is barely
               | different from, and arguably better than, the status quo.
        
               | zepto wrote:
               | Thanks. I couldn't agree more.
               | 
               | We need both - develop free software alternatives (which
               | means to stop pretending the alternatives are good
               | enough), and to get real about supporting legal and
               | governance principles that would protect against abuses.
               | 
               | If people want to do something about this, these are the
               | only protections.
        
               | mlindner wrote:
               | Nonsense. Building an entire system as opposed to adding
               | a single image to a database is a substantially different
               | level of effort. In the US at least this was used
               | successfully as a defense. The US cannot coerce companies
               | build new things on their behalf because it would
               | effectively create "forced speech" which is forbidden by
               | the US Constitution. However they can be coerced if there
               | is minimal effort like adding a single hash to a
               | database.
        
               | shuckles wrote:
               | Photos intelligence already exists, and if people are
               | really going to cite the legal arguments in Apple vs.
               | FBI, then it's important to remember the "forced speech"
               | Apple argued it could not be compelled to make was
               | changing a rate limit constant on passcode retries.
        
               | fraa-orolo wrote:
               | A false positive in matching faces results in a click to
               | fix it or a wrongly categorized photo. A false positive
               | in this new thing may land you in jail or have your life
               | destroyed. Even an allegation of something so heinous is
               | enough to ruin a life.
               | 
               | The "one in trillion" chance of false positives is
               | Apple's invention. They haven't scanned trillions of
               | photos and it's a guess. And you need multiple false
               | positives, yet no one says how many, so it could be a low
               | number. Either way, even with how small the chance of it
               | being wrong is, the consequences for the individual are
               | catastrophic. No one sane should accept that kind of
               | risk/reward ratio.
               | 
               | "Oh, and one more thing, and we think you'll love it. You
               | can back up your entire camera roll for just $10 a month
               | and a really infinitesimally minuscule chance that you
               | and your family will be completely disgraced in the
               | public eye, and you'll get raped and murdered in prison
               | for nothing."
        
               | systoll wrote:
               | Ok.
               | 
               | So iCloud Photos circa 2020 [and Google Photos and
               | Facebook and Dropbox and OneDrive] aren't a risk you
               | should be willing to take.
               | 
               | This feature doesn't change anything in that regard; the
               | scanning was already happening.
        
               | XorNot wrote:
               | I literally do not take that risk in 2021. What I do,
               | currently, is make the reasoned judgment that the
               | computational overhead of pushing changes down to my
               | phone, plus the scrutiny of the international security
               | community, keeps me approximately abreast of whether my
               | private device is actively spying on me (short answer: it
               | definitely is; longer answer: but to what specific
               | intent?)
               | 
               | Apple's new policy is: "of course your phone is scanning
               | and flagging your private files to our server - that's
               | normal behavior! Don't worry about it".
        
             | lifty wrote:
             | It's not a false sense of security; it's a clear
             | delimitation between theirs and mine. Debian package
             | maintainers could also slip a scanner onto your machine,
             | but doing that on purpose, and without notifying the
             | user, is a big line to cross.
        
               | totetsu wrote:
               | But with a Debian package you can choose not to accept
               | the upgrade, and you can see any funny business in the
               | released source code.
        
               | gpm wrote:
               | Somewhere along the line someone is producing and signing
               | the binaries that find their way onto my computer; they
               | could produce those binaries from different source code
               | and I would be none the wiser.
               | 
               | Debian tries to be reproducible, so to avoid being caught
               | they might need to control the mirror so that they could
               | send the malicious build only to me. I.e. if I'm lucky it
               | would take a total of 2 people to put malicious binaries
               | on my computer (1 with a signing key, 1 with access to
               | the mirror I download things from).
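               | 
               | Concretely, the check is just a digest comparison; a
               | sketch in Swift with hypothetical paths:
               | 
               |     import CryptoKit
               |     import Foundation
               | 
               |     func sha256Hex(_ path: String) throws -> String {
               |         let data = try Data(contentsOf: URL(fileURLWithPath: path))
               |         return SHA256.hash(data: data)
               |             .map { String(format: "%02x", $0) }
               |             .joined()
               |     }
               | 
               |     // Built from source vs. downloaded from the mirror.
               |     let mine = try sha256Hex("build/openssl.deb")
               |     let theirs = try sha256Hex("mirror/openssl.deb")
               |     print(mine == theirs ? "reproducible"
               |                          : "the mirror served something else")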
        
               | least wrote:
               | That is technically true, but in a very real, practical
               | sense everyone here using OSS absolutely is trusting a
               | third party, because they are not auditing every bit of
               | code they run. For less technical people there is
               | effectively zero difference between open and closed
               | software.
               | 
               | It's really disingenuous to suggest that open source
               | isn't dependent on trust, you just change who you are
               | trusting. Even if the case is someone else is auditing
               | that code, you're trusting that person instead of the
               | repository owners.
               | 
               | I'll concede that at least that possibility to audit
               | exists but personally I do have to trust to a certain
               | extent that third parties aren't trying to fuck me over.
        
               | [deleted]
        
               | totetsu wrote:
               | Thinking about this.. I guess my trust, is that someone
               | smarter than I will notice it, cause a fuss, and the
               | community will raise pitch forks and.. git forks. My
               | trust is in the community, I hope it can stay healthy and
               | diverse for all time.
        
               | m4rtink wrote:
               | Maybe if you drink from the NPM/PyPI firehose without
               | checking (as too many unfortunately do).
               | 
               | For a regular Linux distribution there are maintainers
               | updating packages from upstream source who can spot
               | malicious changes slipped in upstream. And if the
               | maintainers in one distro don't notice, it is likely
               | some in another distro will.
               | 
               | And there are LTS/enterprise distros where upstream
               | changes take much longer to get in and the distro does
               | not change much after release, making it even less likely
               | that a sudden malicious change will get in unnoticed.
        
               | jancsika wrote:
               | > Even if the case is someone else is auditing that code,
               | you're trusting that person instead of the repository
               | owners.
               | 
               | Suppose Debian's dev process happened at monthly in-
               | person meetings where minutes were taken and a new
               | snapshot of the OS (without any specific attribution)
               | released.
               | 
               | If that were the case, I'd rankly speculate that Debian
               | devs would have misrepresented what happened in the
               | openssl debacle. A claim would have been made that some
               | openssl dev was present and signed off on the change.
               | That dev would have then made a counterclaim that regular
               | procedure wasn't followed, to which another dev would
               | claim it was the openssl representative's responsibility
               | to call for a review of relevant changes in the breakout
               | session of day three before the second vote for the
               | fourth day's schedule of changes to be finalized.
               | 
               | Instead, there is a very public history of events that
               | led up to the debacle that anyone can consult. That
               | distinction is important-- it means that once trust is in
               | question, _anyone_ -- including me-- can pile on and view
               | a museum of the debacle to determine exactly how awful
               | Debian's policy was wrt security-related changes.
               | 
               | There is no such museum for proprietary software, and
               | that is a big deal.
        
               | least wrote:
               | That's certainly true, and it is a strong 'selling
               | point,' so to speak, for open software. But openness is
               | just one feature of many that people use for making
               | considerations about the sort of software they run and
               | frankly, for an average consumer, it probably weighs
               | extremely low on their scale, because in either case it's
               | effectively a black box, where having access to that
               | information doesn't actually make them more informed, nor
               | do they necessarily care to be informed.
               | 
               | Most people don't care to follow the controversies of
               | tech unless it becomes a tremendously big issue, but even
               | then, as we've seen here, there are plenty of people that
               | simply don't have the technical acumen to really do any
               | meaningful analysis of what's being presented to them and
               | are depending on others to form their opinion, whether
               | that be a friend/family member or some tech pundit
               | writing an article on a major news organization's
               | website.
               | 
               | Trusting Apple presents a risk to consumers but I'd argue
               | that for many consumers, this has been a reasonable risk
               | to take to date. This recent announcement is changing
               | that risk factor significantly, though in the end it may
               | still end up being a worthwhile one for a lot of people.
               | Open Source isn't the be all end all solution to this, as
               | great as that'd be.
        
               | ipaddr wrote:
               | Trusting a group of people like you to cover areas you
               | might not is the benefit of open source and a healthy
               | community.
               | 
               | With Apple you have to trust them, and trust that they
               | don't get a national security order.
               | 
               | I trust that if everyone with the ability to audit OSS
               | got a national security order, it would leak; serving
               | such orders would also be impossible for the many
               | auditors who live in other nations.
        
             | stjohnswarts wrote:
             | The only way to escape that with any networked device
             | is to go down to the river, throw it in, and never use a
             | digital device again. It's not a false sense of
             | security; it's a calculated position on what security
             | you will accept. Moving the spying from the server to
             | the phone was the last straw for a lot of people.
        
         | unityByFreedom wrote:
         | > I know that isn't how you should look at it
         | 
         | Why not? The argument seems perfectly reasonable to me.
        
         | guerrilla wrote:
         | > I know that isn't how you should look at it, but that's still
         | how it feels.
         | 
         | Why do you think you know that? You didn't say. It seems like
         | that is how you should see it since they are using your
         | processing power on your device potentially against you.
        
         | Kim_Bruning wrote:
         | I wonder how much of this is shifting the Overton window. I'd
         | really rather not have people scanning my data at all.
         | 
         | "If you give me six lines written by the hand of the most
         | honest of men, I will find something in them which will hang
         | him," said Cardinal Richelieu in the 17th century.
         | 
         | I'm pretty sure my phone and online accounts hold a lot more
         | data than just six lines. How do I know there won't be
         | accidents, or 'accidents'?
        
         | rowanG077 wrote:
         | Why is this not how you should look at it? This is exactly how
         | I look at it. It's your device; no company should be able to
         | scan shit on a device that they have sold to someone. That's
         | like keeping the key of a house after you have sold it and then
         | checking sneakily every week whether the new owners are doing
         | illegal stuff.
        
         | AceJohnny2 wrote:
         | Wild thought: could on-device scanning be considered a
         | violation of the 3rd Amendment (quartering soldiers [enforcing
         | agents of the government] in peoples homes [property]) ?
        
           | COGlory wrote:
           | My understanding is that there isn't a whole lot of
           | established anything re: the 3rd Amendment. So... maybe,
           | we can dream?
           | 
           | I think warrantless search is a much stronger argument,
           | though.
        
         | vineyardmike wrote:
         | > I know that isn't how you should look at it, but that's still
         | how it feels.
         | 
         | This is exactly how you should look at it. They turned your
         | device against you.
        
           | chrischattin wrote:
           | I downvoted this comment because it's hyperbole.
           | 
           | There are plenty of other very real concerns with this. For
           | example, say some activist intern at Apple decides they don't
           | like your politics and hits the report button, next thing you
           | know, the police show up to your door and are confiscating
           | all your devices. Just one of many potential abuses.
        
         | matt123456789 wrote:
         | Don't second-guess yourself. The viewpoint you express is
         | completely valid. Other child comments have pointed this out,
         | but corporate messaging does not get to tell you how _you_
         | choose to look at things.
        
         | pehtis wrote:
         | Yes, that's exactly how I feel. I'd still hate it if my iCloud
         | uploads are scanned but I'm already assuming that anyway.
         | 
         | But the fact that my iOS device can potentially report me to
         | any authorities, for whatever reason, is crossing a line that
         | makes it impossible to ever own an iPhone again. I bought my
         | first one in 2007 so I'm not saying this lightly.
         | 
         | Does anybody know if this policy will extend to macOS too?
        
           | stephc_int13 wrote:
           | It has already been announced: this is not only iOS, but
           | all Apple devices.
        
             | FabHK wrote:
             | Looks like CSAM detection is iOS/iPadOS only (for now?).
             | From Apple:
             | 
             | - Communication safety in Messages
             | 
             | [...] This feature is coming in an update later this year
             | to accounts set up as families in iCloud for iOS 15, iPadOS
             | 15, and macOS Monterey.
             | 
             | - CSAM detection
             | 
             | [...] To help address this, new technology in iOS and
             | iPadOS will allow Apple to detect known CSAM images stored
             | in iCloud Photos.
             | 
             | - Expanding guidance in Siri and Search
             | 
             | [...] These updates to Siri and Search are coming later
             | this year in an update to iOS 15, iPadOS 15, watchOS 8, and
             | macOS Monterey.
             | 
             | https://www.apple.com/child-safety/
        
             | blisterpeanuts wrote:
             | Great. I recently invested $3800 in an excellent, new iMac.
             | Now I'm starting to wonder if I should have spent a couple
             | thousand less for a barebones PC and installed my favorite
             | Linux distro. It would have done 75% of what I needed, and
             | the other 25% (music and video work)... well, that's the
             | tradeoff.
             | 
             | If anyone in my circle of family, friends, and social
             | network asks my advice, the formerly easy answer "Get a
             | Mac, get an iPhone; you won't regret it!" is probably going
             | to be replaced with something more nuanced ("Buy Apple, but
             | know what you're getting into; here's a few articles on
             | privacy....").
        
               | selsta wrote:
               | The CSAM detection technical summary [1] only mentions
               | iOS and iPadOS.
               | 
               | If it does come to macOS it will be part of Photos.app,
               | as that's the only way to interact with iCloud Photos. I
               | would recommend you to avoid that app and cloud in
               | general if you care about privacy.
               | 
               | [1] https://www.apple.com/child-
               | safety/pdf/CSAM_Detection_Techni...
        
               | stjohnswarts wrote:
               | That's not what I read in multiple sources. They include
               | macOS as part of it. And why wouldn't they? The software
               | will be easy to compile into the Apple Photos app.
               | https://www.zdnet.com/article/apple-child-abuse-
               | material-sca...
        
               | pehtis wrote:
               | If it stays like that then it's manageable, I guess.
        
           | stjohnswarts wrote:
           | Yes, it will come with the next "+1" version due out in late
           | fall (winter?). It will be integral to the iCloud Photos
           | platform. Supposedly, if you turn off iCloud Photos backup,
           | it gets turned off as well.
        
           | xoa wrote:
           | > _Does anybody know if this policy will extend to macOS
           | too?_
           | 
           | I don't see how it could be in the same way, or at least it
           | only could be on brand new ARM Macs right? The thing is that
           | at least on Intel Macs Apple simply doesn't control the whole
           | stack. I mean, obviously, while it takes more effort, it's
           | perfectly possible to get x86 macOS running virtualized or
           | directly on non-Apple hardware entirely. So any shenanigans
           | they try to
           | pull there can get subverted or outright patched out. Without
           | the whole hardware trust chain and heavy lock down like they
           | have on iOS this sort of effort, even if attempted, becomes
           | less threatening. I guess we'll see what they go for though
           | :(. They could certainly make it pretty nasty anyway so as
           | much as anything it's more of a forlorn hope they'll mainly
           | focus on low hanging fruit.
        
           | perardi wrote:
           | I think you're going to have to go to a flip phone, then.
           | 
           | Not to be, uh, flippant. Because this feels like a rather
           | obvious slippery slope that Google will be compelled to slide
           | down as well.
           | 
           | Feels like the only way to avoid such a thing would be dumb
           | phone + Linux laptop/desktop.
        
             | pehtis wrote:
             | I wouldn't go that far. I need something with a decent
             | browser at the very least.
        
             | alabamacadabra wrote:
             | They already do. Look up "federated learning". Lmao.
        
             | istingray wrote:
             | Thankfully flip phones aren't necessary! Startups like
             | Purism are at the cutting edge of building privacy-
             | respecting devices; check it out here: https://puri.sm
        
               | asddubs wrote:
               | and pinephone: https://www.pine64.org/pinephone/
               | 
               | I plan to get one as my next phone, although it should be
               | said that both of these offerings are still a bit rough
               | around the edges currently. But they're getting there.
        
         | feanaro wrote:
         | > I know that isn't how you should look at it, but that's still
         | how it feels
         | 
         | It's _definitely_ how you should look at it because it's
         | right. Once a device is compromised in this way, there's no
         | going back and the erosion of privacy and expansion of scope
         | will become neverending. If a capability exists, it will just
         | be too hard for spooks to keep their fingers out of the cookie
         | jar.
        
         | noobermin wrote:
         | The only thing I disagree with: it _is_ how you should look at
         | it, and that's why it feels wrong. Because it is wrong.
        
       | _carbyau_ wrote:
       | So it boils down to this: companies make most of the OSs for
       | devices.
       | 
       | Governments can compel companies to do stuff.
       | 
       | The only option for the truly privacy conscious is to not use a
       | company provided OS. The future is in buying a phone and flashing
       | the ROM you want.
       | 
       | Until _even owning_ a non-standard phone becomes illegal.
        
         | userbinator wrote:
         | _Until even owning a non-standard phone becomes illegal._
         | 
         | A similar scenario was found in Stallman's disturbingly-
         | prescient story 24 years ago:
         | 
         | https://www.gnu.org/philosophy/right-to-read.en.html
        
       | maerF0x0 wrote:
       | I maintain that I actually think Apple got told to do this by
       | the Gov't, the same way AT&T was told -- and gagged -- to spy
       | on Americans.
       | 
       | see also:
       | 
       | https://www.wired.com/2006/01/att-sued-over-nsa-eavesdroppin...
       | 
       | https://www.theverge.com/2013/7/17/4517480/nsa-spying-prism-...
       | 
       | https://www.courthousenews.com/companies-staying-mum-on-fbi-...
        
       | [deleted]
        
       | dqpb wrote:
       | Look at it from the point of view of the majority of non-abusive
       | parents: This can only increase the risk that their children will
       | be taken away from them. This includes Apple employees.
        
       | renewiltord wrote:
       | I actually really enjoy the loop of "we can trust them to police
       | us, they are trustworthy" to "oh no, they abused that power; why
       | did they do that" that society goes through.
       | 
       | All this HN rebellion is a storm in a teacup. Some innocent dude
       | will get in trouble and everyone will talk about how "tech
       | companies need to be regulated" and then they'll go back to "we
       | should trust the government to police us" after a few weeks of
       | burning through their outrage. Always exceptions. Always
       | surprised. Always trusting.
        
       | thatsabadidea wrote:
       | First this and then it'll be gradually expanded to offensive
       | memes against official political narratives.
        
       | Throwaway12349 wrote:
       | One thing I don't understand about this debate is that one of the
       | bigger concerns folks who are against the measure have is that
       | Apple might one day add non-CSAM photos to the set of photos they
       | scan for.
       | 
       | As far as I understand it, the CSAM hash database is part of the
       | OS, which Apple can update in any way they like, including to
       | read your messages or surreptitiously compromise the encryption
       | of your photos (and they can force your device to install this
       | code via a security update). We trust them not to do these things
       | (they already have a track record of resisting the creation of
       | backdoors), so I'm not sure why we don't trust them to also
       | use this capability only for CSAM.
       | 
       | Sure, it would be technically easier for them to add the hash of
       | the tank man photo (an example of something an oppressive
       | government might be interested in) to their database after
       | something like this is implemented, but it's also not very hard
       | for them to add scanning-for-the-tank-man-photo to their OS as it
       | currently exists. Indeed, if the database of hashes lives on your
       | device it makes it easier for researchers to verify that
       | politically sensitive content is not present in that database.
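       | 
       | For instance, anyone could check that devices in different
       | regions carry byte-identical databases. A sketch (the file name
       | is hypothetical, and the shipped database is blinded, but a
       | byte-level fork would still show):
       | 
       |     import CryptoKit
       |     import Foundation
       | 
       |     func digest(_ path: String) throws -> String {
       |         let data = try Data(contentsOf: URL(fileURLWithPath: path))
       |         return SHA256.hash(data: data)
       |             .map { String(format: "%02x", $0) }
       |             .joined()
       |     }
       | 
       |     let us = try digest("ios15-us/hashdb.bin")
       |     let cn = try digest("ios15-cn/hashdb.bin")
       |     print(us == cn ? "same database everywhere"
       |                    : "regions diverge")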
        
       | cletus wrote:
       | Apple's track record on user protection compared to Google or
       | Samsung has been very good. For example, resisting wide net
       | Federal "because terrorism" warrants as much as they legally can.
       | 
       | There's a reason why Apple 0 day exploits sell for way more on
       | the black market than Android exploits.
       | 
       | Trust is hard to earn, easy to lose and even harder to regain.
       | User trust is such a huge part of Apple's business and success,
       | it's shocking to me they'd damage it so much and so fast.
       | 
       | A system that exists but has "protections" can and will be
       | abused. A system that doesn't exist can't be. It's really that
       | simple.
       | 
       | Here's the big question though: did Apple executives not realize
       | there would be a backlash against this? Or did they just not
       | care? Either is hard to fathom. Apple's board and shareholders
       | should be asking some pretty tough questions of Tim Cook at this
       | point.
        
         | nicce wrote:
         | I believe this chaos was mostly caused by the leaked
         | information, which was presented in a somewhat misleading way.
         | 
         | The fire was spreading, and Apple went along and announced it
         | publicly. By then it was too late to ask people to read the
         | materials properly; in-device scanning was on everyone's mind.
         | 
         | Some have argued that the leak was even intentional, to lower
         | the pressure for new regulations. Or it was just
         | whistleblowing.
         | 
         | But there is a lot of evidence that Apple is moving closer to
         | full E2EE with this PSI system, and the original goal might
         | have been to announce everything at the upcoming iPhone event.
         | That might even have landed in a positive light.
        
           | srswtf123 wrote:
           | Where can I find some of this evidence? I haven't been
           | following this closely
        
         | gundmc wrote:
         | > There's a reason why Apple 0 day exploits sell for way more
         | on the black market than Android exploits.
         | 
         | FWIW, every article I see is about iOS exploits being cheaper
         | than Android because there are so many of them. Those articles
         | all seem to be from around 2019, though; it seems like before
         | 2019 what you said was true. I'm not sure what prices have
         | been like within the last two years.
         | 
         | https://www.google.com/search?q=android+vs+ios+exploit+cost
        
         | FabHK wrote:
         | That strikes me as a much more nuanced and sensible take than
         | all the facile "Apple's privacy stance has always just been a
         | marketing ploy to maximise profits, you sheeple".
        
         | swiley wrote:
         | >There's a reason why Apple 0 day exploits sell for way more on
         | the black market than Android exploits.
         | 
         | The opposite is true.
         | 
         | They're both _shit_ though. Use FOSS operating systems you
         | can administrate, or don't bother with smartphones.
        
         | whatgoodisaroad wrote:
         | And yet, the way Apple openly operates iCloud in China is
         | infinitely worse than Google's Dragonfly ever could have been.
         | Apple's reputation for privacy is entirely undeserved.
        
       | [deleted]
        
       | technick wrote:
       | The road to hell is paved with good intentions.
       | 
       | I'm going to start rallying for my employer to dump apple as a
       | vendor if they stay the course.
        
       | [deleted]
        
       | travoc wrote:
       | Can someone explain how Apple being coaxed or coerced into
       | searching all of our personal devices for illegal files by
       | federal law enforcement is not an unconstitutional warrantless
       | search?
        
         | schaefer wrote:
         | I am not a lawyer, but my understanding is:
         | 
         | The wording of the CSAM law is that content should be scanned
         | when uploaded. That "upon upload" condition triggers the
         | third-party doctrine.
         | 
         | Apple has gone above and beyond here, not the actual US gov.
         | So the Bill of Rights doesn't apply to Apple's decision to
         | scan content on device, just before it is uploaded.
        
           | stjohnswarts wrote:
            | Apple hasn't gone above and beyond anything here. The only
            | way they can save face is to laugh at the government and
            | not give in to their demands to set up a backdoor tool to
            | scan every iPhone user.
        
           | nullc wrote:
            | There is no law that requires providers to scan users'
            | private documents, even uploaded ones.
            | 
            | However, if they do review the material and see child
            | porn, they're obligated to report it.
        
             | istingray wrote:
              | Any chance you have a good link explaining this? I saw
              | the parent comment earlier and wanted to add this. But I
              | only recently learned that the US federal government
              | cannot require or incentivize providers to scan users'
              | private documents, so I didn't want to post without
              | clear sources.
        
               | 0x426577617265 wrote:
                | It's covered in 18 U.S. Code § 2258A - Reporting
                | requirements of providers
        
         | michaelbjames wrote:
          | As I understand it, they're not acting as an agent of the
          | government; it's a private company acting on its own. So the
          | 4th Amendment's protection against unreasonable searches _by
          | the government_ does not apply.
        
           | himaraya wrote:
           | No, the Fourth Amendment applies to private actors acting on
           | behalf of the government.
        
             | dragonwriter wrote:
             | > No, the Fourth Amendment applies to private actors acting
             | on behalf of the government.
             | 
              | It applies to private actors acting _as agents of the
              | government_; it doesn't apply to private actors who, for
              | private reasons _not_ directed by the government,
              | conduct searches and report suspicious results to the
              | government (other property and privacy laws might apply,
              | though).
        
               | simondotau wrote:
               | So the 4th Amendment doesn't cover Apple voluntarily
               | inducing their devices to check for CSAM prior to cloud
               | upload.
               | 
               | But the 4th Amendment would cover Apple being forced to
               | modify their system to scan for anything else the
               | Government has told them to.
        
               | 0x426577617265 wrote:
               | That's correct. If the government asked them to do it,
               | then it would require a warrant to conduct the search.
        
           | syshum wrote:
            | Acting as an agent used to mean, anyway, that the 4th
            | Amendment attached to the company.
            | 
            | If they were NOT acting as an agent, then the 4th would
            | not apply. I think the way they get around that is to
            | report not to the FBI but instead to NCMEC, a
            | "non-profit".
        
             | ahnick wrote:
             | who in turn reports to the government?
        
               | syshum wrote:
                | Yes, but this is what happens when you make the 4th
                | Amendment into Swiss cheese with exception after
                | exception.
                | 
                | You do not want criminals to escape justice, do you????
                | /s
        
             | nullc wrote:
              | > I think the way they get around that is to report not
              | to the FBI but instead to NCMEC, a "non-profit"
              | 
              | No. The courts have already explicitly rejected the idea
              | that NCMEC is a private entity. NCMEC can only handle
              | child porn via special legislative permission and is 99%
              | funded by the government.
              | 
              | The searches here are lawful because Apple searches out
              | of its own free will and commercial interest, and when
              | something is detected their employees search your
              | private communications (which the EULA permits).
              | 
              | This is also why Apple must review the matches before
              | reporting them -- if they just matched an NCMEC database
              | and blindly forwarded them to NCMEC, then it would be
              | NCMEC conducting the search and a warrant would be
              | required.
        
               | [deleted]
        
             | dragonwriter wrote:
              | Reporting to the government doesn't make you a
              | government agent; doing the search _at the direction of
              | the government_ does.
              | 
              | If a private citizen, on their own initiative, searches
              | your house, finds contraband, and reports it to the
              | government, they may be guilty of a variety of torts and
              | crimes (both civil and criminal trespass, among others,
              | are possibilities), but there is no Fourth Amendment
              | violation.
              | 
              | If a police officer asks them to do it, though, that is
              | a different story.
        
           | mc32 wrote:
            | It's that third-party doctrine that is so convenient for
            | acquiring information that would otherwise require a
            | warrant to obtain.
        
             | throwaway0a5e wrote:
             | It's right up there with Wickard v. Filburn in terms of
             | SCOTUS screw-ups.
        
           | slapfrog wrote:
           | Why stop there? Instead of getting pesky warrants to search
           | apartments, the government could just contract the landlord
           | to do the search for them. After all, the landlord owns the
           | property and ownership trumps every other consideration in
           | libertarian fantasy-land.
        
         | syshum wrote:
         | Because for at least the last 80 years the constitution has
         | not been seen as a document limiting government power, but
         | instead as a document limiting the people's rights.
         | 
         | It has literally been inverted from its original purpose.
        
         | derwiki wrote:
         | IANAL but I presume I consented to this in their EULA.
        
           | ahnick wrote:
           | A EULA cannot force you to give up your constitutional
           | rights.
        
             | C19is20 wrote:
              | What about all the people not in the USA?
        
               | vageli wrote:
               | The USA is not the only country with a constitution.
        
               | int_19h wrote:
               | It's not, but it's one of the few that doesn't have some
               | verbiage along the lines of UDHR:
               | 
               | "In the exercise of his rights and freedoms, everyone
               | shall be subject only to such limitations as are
               | determined by law solely for the purpose of securing due
               | recognition and respect for the rights and freedoms of
               | others and of _meeting the just requirements of morality,
               | public order and the general welfare_ in a democratic
               | society. "
        
             | greyface- wrote:
             | Constitutional rights can be voluntarily waived.
        
               | dragonwriter wrote:
               | > Constitutional rights can be voluntarily waived.
               | 
                | That...varies. The right to a speedy trial can be
                | waived. The right against enslavement cannot. You can
                | consent to a warrantless or otherwise unreasonable
                | search given adequate specificity, but outside of a
                | condition for release from an otherwise-constitutional
                | more severe deprivation of liberty (e.g., parole from
                | prison), I don't think it can be _generally_ waived in
                | advance -- only searches specifically and immediately
                | consented to when being executed. I don't have any
                | particular cases in mind, though, and that
                | understanding could be wrong. But "constitutional
                | rights can be waived" is definitely way too broad to
                | be a useful guide for resolving specific issues.
        
               | nonameiguess wrote:
               | You can definitely consent to giving up the right to not
               | be subject to warrantless search and seizure. Everyone
               | who ever joins the military under UCMJ allows their
               | commander to conduct random inspections of personal
               | property at any unannounced time.
               | 
               | There are limitations, of course. If you live off-post in
               | private housing, your commander has no legal authority to
               | inspect it. They can only go through your stuff if you
               | live in government-owned housing.
        
               | dragonwriter wrote:
               | > Everyone who ever joins the military
               | 
                | That's not a waiver; that's a space where warrantless
                | searches are reasonable under the Fourth Amendment and
                | Congress's Art. I, Sec. 8 powers with regard to the
                | military, etc. Otherwise, when we had conscription,
                | conscripts, who did not freely consent, would have
                | been immune.
        
         | tartoran wrote:
         | They're neither coaxed nor coerced.
        
           | paulie4542 wrote:
            | No reasonable company would do something as unreasonable
            | as this without being pressured.
        
           | stjohnswarts wrote:
            | You can bet they were by the TLA intel agencies, but you
            | will not find that documented anywhere other than "Met
            | with FBI/NSA/XXX for liaison purposes". I'm sure they were
            | given an ultimatum and they folded.
        
         | mc32 wrote:
         | Probably the same way people say that when Twitter moderates
         | speech on their platform it's not censorship.
        
           | amznthrwaway wrote:
           | And it's not censorship when HN bans some posts.
        
           | throwaway0a5e wrote:
           | Or when the NSA feeds your texts through their ML and into
           | their DB it's not a "search" because humans haven't gone
           | looking for it.
        
             | corobo wrote:
             | Ooh the copilot defence
        
         | mzs wrote:
         | Because it's only a hash that matches and then a court order is
         | needed to get the actual contents.
        
         | breck wrote:
          | My money is on this secretly being about nuclear, bio, chemical or
         | other WMD or "dangerous" material. I can't imagine that Apple
         | would be so stupid to open such a Pandora's box to an Orwellian
         | state and crimes against _all_ children in the name of making
         | no statistical difference in the porn problem. There has got to
         | be a hidden story here and severe pressure on top executives,
         | otherwise I can't see how the math makes sense. Please keep
         | digging, journalists!
        
           | cucumb3rrelish wrote:
            | Something tells me people with access to a nuclear arsenal
            | are not going to take pictures of it with an iPhone and
            | share them via iMessage.
        
             | tibyat wrote:
              | Don't know where you have been the past four years;
              | frankly, I'm jealous. The US president literally did this:
             | https://www.wired.com/story/trump-tweeted-a-sensitive-
             | photo-...
        
             | 6AA4FD wrote:
              | You mean like American soldiers wouldn't store sensitive
              | nuclear info on public Quizlet decks? Hmm
        
           | akira2501 wrote:
           | > the porn problem.
           | 
            | Is the pornography the problem, or is it the abuse of
            | children?
        
         | Componica wrote:
          | It's most likely a demand by China so that they can create an
          | infrastructure to locate political dissidents. Oh look, a
          | Winnie the Pooh Xi meme ended up in your gallery/inbox. Why
          | is there a knock at the door? I'm pretty sure that's the
          | real reason.
        
           | endisneigh wrote:
           | what's this take based on - have a link or anything?
        
             | kurofune wrote:
             | >what's this take based on
             | 
             | Racism and ignorance.
        
               | mmastrac wrote:
               | In fairness, this is literally happening on Chinese
               | messenger services today so I wouldn't call it either of
               | those
        
               | XorNot wrote:
                | Apple isn't being pressured by China, but criticizing
                | the CCP as the brutal dictatorship it is is _not_
                | racist (and that idea _is_ verbatim CCP propaganda).
        
               | kurofune wrote:
                | Randomly suggesting that every authoritarian decision
                | taken unilaterally by an American company was made to
                | please the evil Chinese government, while offering no
                | substantial proof, is not "criticizing the CCP"; it is
                | just shoehorning the US far-right's talking points
                | into a thread that has nothing to do with them.
        
             | someperson wrote:
             | I don't know about the suggestion that it's the government
             | of China pushing for the feature itself, but the fact the
             | feature now exists and WILL be used by authoritarian
             | regimes to scan for political content is clearly understood
             | by Apple employees. From the article:
             | 
             | > Apple employees have flooded an Apple internal Slack
             | channel with more than 800 messages on the plan announced a
             | week ago, workers who asked not to be identified told
             | Reuters. Many expressed worries that the feature could be
             | exploited by repressive governments looking to find other
             | material for censorship or arrests, according to workers
             | who saw the days-long thread.
        
               | endisneigh wrote:
               | your quote contradicts your statement. it seems like the
               | employees are worried that it _could_ be exploited, not
               | that it necessarily _will_ be.
        
               | [deleted]
        
           | thombee wrote:
           | Why then release this feature in the US? Why not just release
           | it in China and avoid all this negative press? Bad take is
           | bad
        
           | trainsplanes wrote:
           | It's silly to think the US government hasn't been twisting
           | their arm to do this for years.
        
             | Componica wrote:
              | My ears perk up whenever I hear a "Just think of the
              | children" argument, because after Sandy Hook I'm pretty
              | certain the US couldn't care less about children.
              | There's a real reason behind this.
        
               | null0pointer wrote:
               | Whenever someone makes a "think of the children" argument
               | it has absolutely nothing to do with whether they
               | actually care about children. They just want to make it
               | extremely difficult to counter-argue without being
               | labelled as pedophile adjacent. It is a completely
               | disingenuous argument 99% of the time it is used.
        
               | Jcowell wrote:
                | I feel like there's a fallacy for this but I'm not
                | sure. Either way, it doesn't matter: the logic here
                | isn't that the US couldn't care less about children,
                | it's that the US cares more about gun rights than it
                | does about children. But that doesn't say anything
                | about the minimum level of care they have, only the
                | maximum.
        
             | adrr wrote:
              | My bet is it was the US government. The predicted
              | fallout doesn't seem like something Apple would bring on
              | themselves with no monetary gain.
        
         | ajsnigrutin wrote:
          | Technically it's not Apple searching; it's your phone
          | searching itself. Sadly, they'll probably put it somewhere
          | on page 89 of their ToS.
        
           | bobsterman wrote:
            | If you do not understand what the software on your phone
            | is doing, you cannot control it, and you do not own it.
            | Apple owns it.
        
         | alfiedotwtf wrote:
         | "The third-party doctrine is a United States legal doctrine
         | that holds that people who voluntarily give information to
         | third parties--such as banks, phone companies, internet service
         | providers (ISPs), and e-mail servers--have "no reasonable
         | expectation of privacy.""
         | 
         | https://en.m.wikipedia.org/wiki/Third-party_doctrine
        
         | alpaca128 wrote:
         | National Security Letters for example. They wouldn't even be
         | allowed to talk about it.
        
         | stjohnswarts wrote:
          | Because it's a contract between you and Apple. You don't
          | have to use their phones, and they're a corporation, not the
          | government. Personally I see it as an underhanded attempt
          | for the government to use Apple as a police force in
          | everything but actually deputizing them. They're using a
          | loophole in the 4th Amendment to spy on you without a
          | warrant.
        
         | gbrown wrote:
         | Simple - our laws haven't evolved with technology. That's why
         | your mail has about a million times more protection than your
         | email.
        
           | shuckles wrote:
           | USPS regularly X-rays mail, and the only protection against
           | abuse is policy (it's not admissible evidence in the United
           | States).
        
             | twobitshifter wrote:
             | Not only that, but they log all metadata on the envelope of
             | every letter sent.
        
               | 0x426577617265 wrote:
               | Law Enforcement can also get access to that metadata --
               | it's called a mail cover. See U.S. v. Choate.
               | 
               | "A mail cover is a surveillance of an addressee's mail
               | conducted by postal employees at the request of law
               | enforcement officials. While not expressly permitted by
               | federal statute, a mail cover is authorized by postal
               | regulations in the interest of national security and
               | crime prevention, and permits the recording of all
               | information appearing on the outside cover of all classes
               | of mail."
        
               | enriquec wrote:
               | and my building's mail gets routinely stolen
        
           | gjs278 wrote:
           | they used to use candles to unseal mail all the time
        
           | vkou wrote:
           | This cuts both ways. Our laws not evolving with technology is
           | also the reason why tapping and intercepting secured digital
           | communication is orders of magnitude more difficult than
           | tapping someone's analog phone line, or why decrypting a hard
           | drive is far more difficult for the police than sawing apart
           | a safe.
        
             | sgustard wrote:
             | Agreed. Technology has allowed abusive images to thrive and
             | reach new consumers. Any pressure to "evolve laws with
             | technology" will surely lead to more draconian anti-privacy
             | measures.
        
           | kilroy123 wrote:
            | Yup. That's what happens when the country is run by a
            | bunch of old lawyers.
        
             | Barrin92 wrote:
             | probably depends more on the kind of old lawyer. I've
             | actually noticed that in general older people tend to be
             | more privacy conscious than younger ones. Old people at
             | least still know what secrecy of correspondence even is.
             | It's not even a concept that exists any more for a lot of
             | gen z.
             | 
              | In particular, American Supreme Court arguments often
              | stand out to me in how clearly the judges manage to
              | connect old principles to newer technologies.
        
             | stjohnswarts wrote:
             | It has nothing to do with them being "old". Saying that is
             | a very ageist thing to say. I know a bunch of old techies
             | who are completely against this. It's about lack of
             | technical knowledge and how this is being twisted to seem
             | okay. Your average millennial doesn't give a damn about it
             | either.
        
             | EpicEng wrote:
             | >old lawyers
             | 
             | Meanwhile, it's predominantly young people who freely hand
             | their information to mega-corps to be disseminated and
             | distributed. Outside your (our) bubble young people don't
             | give two craps about privacy.
        
               | ink404 wrote:
               | Outside this bubble most people young or not don't give
               | two craps about privacy
        
         | nullc wrote:
          | They aren't being coaxed or coerced. If they were forced --
          | or even incentivized -- by the government to perform these
          | searches, then they'd be subject to the Fourth Amendment's
          | protection against warrantless searches.
         | 
          | Apple is engaging in this activity of their own free will
          | for the sake of their own commercial gain and is not being
          | incentivized or coerced by the government in any way.
         | 
         | As such, the warrantless search of the customer data is lawful
         | by virtue of the customer's agreements with Apple.
        
           | 3grdlurker wrote:
           | > Apple is engaging in this activity of their own free will
           | for sake of their own commercial gain and are not being
           | incentivized or coerced by the government in any way.
           | 
           | Honest question: how do you _know_ this, for sure? Or is your
           | comment supposed to be read as an allegation phrased as fact?
        
           | azinman2 wrote:
           | > Apple is engaging in this activity of their own free will
           | for sake of their own commercial gain
           | 
           | What's the commercial gain?
        
             | sethammons wrote:
             | More access to markets subject to totalitarian regimes
        
               | azinman2 wrote:
               | I'm assuming you're kidding here, but if you look at the
               | list of countries already selling/supporting iPhones,
               | what's not on the US export ban list that's a
               | totalitarian regime?
               | 
               | https://www.apple.com/iphone/cellular/
        
               | stjohnswarts wrote:
               | China is obviously the big one as is Russia.
        
               | azinman2 wrote:
               | But they're already in both countries.
        
               | thombee wrote:
               | Why can't they provide this feature to these regimes? Why
               | do they implement it in the USA then? Seems like a
               | roundabout way to go about it lol
        
             | dannyw wrote:
             | The commercial gain can be "don't break us up under
             | antitrust law".
             | 
             | Apple is basically saying to governments: see our walled
             | garden? The more you allow us to wall it, the more we can
             | help you oppress your citizens.
        
             | nullc wrote:
             | Not being identified as the go-to platform for kiddy porn
             | collectors.
        
             | slg wrote:
             | This will likely be paired with the rollout of full
             | encryption for iCloud and they will sell the entire thing
             | as a pro-privacy move.
        
               | ninkendo wrote:
               | That would appear to have backfired.
        
       | whoisjuan wrote:
       | This CSAM prevention initiative by Apple is a 180-degree change
       | from their general message around privacy. Imagine investing
       | hundreds of millions of dollars in pro-privacy programs, privacy
       | features, privacy marketing, etc... just to pull this reverse
       | card.
       | 
       | Of course this is going to spark concern within their own ranks.
       | It's like working for a food company that claims to use organic,
       | non-processed, fair-trade ingredients and then in a single day
       | deciding to switch to industrial farming sourcing and
       | ultra-processed ingredients.
       | 
       | It's a complete change of narrative, and there's no easy way to
       | explain it and still defend Apple's privacy narrative without
       | doing extreme mental gymnastics.
        
         | intricatedetail wrote:
         | Power 101: preach one thing, do the opposite.
        
           | dang wrote:
           | Please don't post flamebait and/or unsubstantive comments.
           | We're trying for something a bit different from internet
           | default here.
           | 
           | https://news.ycombinator.com/newsguidelines.html
        
         | marticode wrote:
         | Apple's privacy focus was always opportunistic: they couldn't
         | get a foothold in online advertising, so they decided to make
         | that weakness a selling point instead.
         | 
         | They'll ditch privacy in a second if they can sell you enough
         | ads, just as they ditched privacy for Chinese customers when
         | the Chinese government asked them to.
        
           | the_other wrote:
           | > Apple privacy focus was always opportunistic: they couldn't
           | get a foothold into online advertising so they decided to
           | make that weakness a selling point instead.
           | 
            | You could spin that the other way: Apple's foray into
            | advertising was opportunistic, but didn't sit well with
            | their focus on privacy. You'd need internal memos to prove
            | it went either way.
        
             | marticode wrote:
              | You could, but it seems more likely that a mega
              | corporation gave up on a huge market because it failed
              | to succeed there rather than over moral principles. It's
              | not like Apple has shown a lot of moral standing when
              | dealing with the CCP, or isn't ruthlessly crushing
              | competition when it gets the chance.
        
               | TimTheTinker wrote:
               | I have a friend who knows Apple's head of user privacy.
               | He vouches that the guy is deeply serious about
               | security/privacy.
               | 
               | Between that and Apple's principled stance against the
                | FBI, I'm inclined to believe that _at that time_, they
               | were making principled choices out of a real concern for
               | user privacy.
               | 
               | I suspect this time around they have a real concern for
               | dealing with the growing child exploitation/abuse
               | problem, but failed to prioritize system design
               | principles in their zeal to pursue positive change in the
               | world.
               | 
               | I don't envy them: in their position, they have an
               | incredible amount of pressure and the stakes are
               | incredibly high.
               | 
               | May cool heads and rational, principled thinking prevail.
        
             | dangerface wrote:
              | The focus on privacy was new, and came only after their
              | push into advertising failed.
        
         | ChrisRR wrote:
         | Apple is heavy on their advertising. If there's a way to spin
         | it, they will
        
           | notquitehuman wrote:
           | They're certainly trying. But the video of Craig trying to
           | use a novel definition of "law enforcement backdoor" and
           | accusing people of being confused is contemptibly dishonest.
        
         | amcoastal wrote:
         | So who is responsible for this decision? Is it Tim Cook?
        
         | EastSmith wrote:
         | I was going to buy my first iPhone ever this year, because of
         | the privacy promises. Now I am not sure anymore.
         | 
         | Once they open the door they will not be able to close it
         | ever.
        
           | bodge5000 wrote:
           | For me the only thing stopping me was the lack of type-c and
           | a headphone jack, which at the time was an annoyance but now
           | is turning out to be a blessing in disguise
        
         | throw2345678 wrote:
         | That is not an accurate analogy at all. Apple DID design this
         | with privacy in mind. What would fit your analogy is if Apple
         | suddenly started reading everything of yours in plaintext,
         | directly violating your privacy.
         | 
         | As it stands, the system is designed so that the phone
         | snitches you out if you have CSAM on it matching certain
         | perceptual hashes (and plenty of it), but if you do not have
         | CSAM above the threshold amount then your phone never even
         | phones home about it. It is in fact a privacy-preserving
         | mechanism in the sense that Apple never sees your photos.
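         | 
         | To make the threshold idea concrete, here is a minimal
         | hypothetical sketch in Python (stand-in names and numbers; in
         | the real design the matches are hidden inside cryptographic
         | safety vouchers, so the device itself never learns the count):
         | 
         |     # Hypothetical sketch of threshold matching -- not Apple's code.
         |     KNOWN_HASHES = {0x1A2B, 0x3C4D}  # stand-in known-CSAM hashes
         |     THRESHOLD = 30                   # stand-in match threshold
         | 
         |     def count_matches(photo_hashes):
         |         """Count photos whose perceptual hash is in the database."""
         |         return sum(1 for ph in photo_hashes if ph in KNOWN_HASHES)
         | 
         |     def flag_for_review(photo_hashes):
         |         # Below the threshold, nothing is surfaced for human review.
         |         return count_matches(photo_hashes) >= THRESHOLD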
        
         | ransom1538 wrote:
          | If a grandmother takes a photo of her grandchild in a bathtub,
          | this is technically against federal law, 18 U.S.C. § 2252.
          | There is no "I am grandma" exception. This is a serious felony
          | with serious consequences. If you use an algorithm without any
          | context, I think you are looking at a ton of new 18 U.S.C. §
          | 2252 (child pornography) charges through these scans. These
         | charges are notoriously difficult to defend against and even
         | the charge itself will ruin lives and careers [1]. Getting
         | custody back of your child will also be impossible [found
         | guilty or not]. I am getting my popcorn ready. Apple has no
         | idea what they have unleashed.
         | 
         | https://www.nbcnews.com/id/wbna8161067
         | 
         | * edit: hahaha downvotes thanks apple employees!
        
         | baggy_trough wrote:
         | Hey it's still private... unless you have content that matches
         | an opaque, government controlled list of forbidden content...
         | what could go wrong?
        
           | dang wrote:
           | Please don't post flamebait and/or unsubstantive comments.
           | We're trying for something a bit different from internet
           | default here.
           | 
           | https://news.ycombinator.com/newsguidelines.html
        
           | Covzire wrote:
           | Antifa or BLM or Burger King will never demand that they add
           | certain content to the burn list. Don't worry. Be happy.
        
             | dang wrote:
             | Please don't post flamebait and/or unsubstantive comments.
             | We're trying for something a bit different from internet
             | default here.
             | 
             | https://news.ycombinator.com/newsguidelines.html
        
         | foobiekr wrote:
          | To me the issue is that it makes no sense: if it's just
          | scanning iCloud photos, _why does this feature need to exist
          | at all_?
          | 
          | Apple is saying this: "we need to scan your phone because we
          | need to look for this material, and by the way, we're only
          | scanning things _we've been scanning for years already_".
          | 
          | Basically, everything else aside, in the current state of
          | things this feature makes no sense. They are not scanning
          | anything they weren't scanning already (literally, everything
          | on iCloud), so why did they bother?
         | 
         | A bunch of people are making speculative claims that Apple is
         | doing this to enable encryption of data in iCloud, but there's
         | no reason to believe that's true. Apple certainly hasn't
         | mentioned it.
         | 
          | So... why?
        
           | spuz wrote:
           | > "we need to scan your phone because we need to look for
           | this material, and by the way, we're only scanning things
           | _we've been scanning for years already_"
           | 
            | As far as we know, Apple is not already using their
            | decryption keys to decrypt images uploaded to iCloud and
            | compare them with known images. That's according to this
            | blog post, which claims Apple makes very few reports to
            | the NCMEC:
           | 
           | https://www.hackerfactor.com/blog/index.php?/archives/929-On.
           | ..
        
           | AnonC wrote:
           | > A bunch of people are making speculative claims that Apple
           | is doing this to enable encryption of data in iCloud, but
           | there's no reason to believe that's true. Apple certainly
           | hasn't mentioned it.
           | 
           | There is nothing surprising here. Apple never mentions things
           | unless it's ready for release or is just about to be
           | released. It's just how Apple works.
        
           | Ceiling wrote:
            | Here's my hope: E2E encryption. Maybe the government is
            | throwing a fit about Apple harboring potential CP on its
            | servers if it's E2E encrypted, so Apple says, "fine, we'll
            | review it prior to upload, and then E2E encrypt the upload
            | itself."
            | 
            | I can hope, I guess...
        
           | browningstreet wrote:
            | And it's absurdly easy to defeat. Which means there'll be
            | an escalation. Whether in v2 or beyond, this will get
            | worse / more intrusive.
        
         | [deleted]
        
         | hn_throwaway_99 wrote:
         | I actually think something else happened, and to be honest I
         | think many at Apple behind this decision are likely pretty
         | surprised by the blowback.
         | 
         | That is, it seems like Apple really wanted to preserve "end-to-
         | end" encryption, but they needed to do something to address the
         | CSAM issue lest governments come down on them hard. Thus, my
         | guess is, at least at the beginning, they saw this as a strong
          | _win_ for privacy. As the project progressed, however, it's
         | like they missed "the forest for the trees" and didn't quite
         | realize that their plan to protect end-to-end encryption
         | doesn't mean that much if one "end" is now forced to run tech
         | that can be abused by governments.
         | 
         | This kind of thing isn't that uncommon with large companies. To
         | me it seems very similar to the Netflix Qwikster debacle of a
         | decade ago. Netflix could very clearly see the writing on the
         | wall, that the DVD-by-mail business needed to morph into a
         | streaming business. But they fumbled the implementation, and
         | what largely came across to the public was "you're raising
         | prices by 40%."
         | 
         | I give Netflix huge props for actually reversing the decision
         | pretty quickly - I find that exceptionally rare among large
         | corporations. Remains to be seen if Apple will do the same.
        
           | heavyset_go wrote:
           | > _That is, it seems like Apple really wanted to preserve
           | "end-to-end" encryption, but they needed to do something to
           | address the CSAM issue lest governments come down on them
           | hard._
           | 
           | This seems like an incredible reach to me.
        
             | hn_throwaway_99 wrote:
             | Why? It's already been well-reported that Apple makes a
             | teeny fraction of the NCMEC reports that other large cloud
             | providers make, and that the reason they hold the keys to
             | iCloud backups was from some requests/pressure from the
             | FBI.
        
               | rossjudson wrote:
               | What other "large cloud providers"? Ones that are similar
               | to Apple? Or general cloud companies?
               | 
               | And is "pressure from the FBI" really the only reason why
               | Apple might hold backup keys? Please sketch out your
               | "lost device, need to restore from backup" user journey.
               | For a real person, who has only an iPhone.
        
               | hn_throwaway_99 wrote:
               | > Please sketch out your "lost device, need to restore
               | from backup" user journey. For a real person, who has
               | only an iPhone.
               | 
               | How about "enter your master passphrase to access your
               | photos". Like literally all syncing password managers
               | work, or things like Authy work to back up your one-time
               | password seeds. It's not complicated.
        
               | heavyset_go wrote:
               | Because the government doesn't just care about CSAM, they
               | care about terrorism, drug and human trafficking, gangs
               | etc. Scanning for CSAM won't change the government's
               | opposition to E2EE, and Apple knows this because,
               | according to their transparency reports, they respond
               | with customers' data to NSL and FISA data requests about
               | 60,000 times a year.
               | 
               | Occam's razor also says they aren't playing 11th
               | dimensional chess, and that they just fucked up.
        
               | spuz wrote:
               | The point is that with this technology, Apple can now
               | please both the government and Apple users that want
               | their data in the cloud to be fully encrypted.
               | 
               | First they enable this technology to prove to the
               | government that they can still scan for bad stuff, then
               | they enable true E2E. I don't know if this is really
               | their motivation but it sounds plausible to me.
        
               | glogla wrote:
               | If they scan on device unencrypted and report what they
               | found (even if it's child porn now, we know that is not
               | going to last), they made the e2e effectivelly useless.
               | The point of end to end enceyption is to make the data
               | accessible to just one or two people - and that is not
               | fulfilled here.
               | 
               | EDIT: though you are right - they can please governments
               | by making encryption useless while lying to people they
               | still have it. Not sure that was your message though.
        
               | spuz wrote:
               | The difference between Apple being able to scan data on
               | your phone vs on their servers comes down to who is able
               | to access your data. In both cases, Apple has access. But
               | by storing both your data and your keys somewhere in
               | their cloud servers, that data becomes available to
               | governments with the appropriate subpoena, rogue Apple
               | employees and hackers. None of that would be possible
               | with true E2E encryption.
               | 
               | It seems plausible to me that Apple might believe some of
               | their privacy conscious users would prefer this situation
               | and hence "privacy" can be considered a plausible
               | explanation for their actions here.
        
               | pseudalopex wrote:
               | True E2E means the middle can't decrypt anything. You
               | meant false E2E.
        
               | spuz wrote:
               | No, I meant true E2E encryption. Unless I'm missing
               | something in your comment.
        
               | pseudalopex wrote:
               | Apple's system allows the middle to decrypt certain
               | images or "visual derivatives" at least. True E2E
               | doesn't.
        
               | spuz wrote:
               | I'm suggesting that the client-side CSAM scanning
               | technology could allow Apple to turn on true E2E
               | encryption and still satisfy the government's requirement
               | to report on CSAM which as you point out is not something
               | that Apple currently implements.
        
               | pseudalopex wrote:
               | The "client side" CSAM detection involves a literal
               | person in the middle examining suspected matches. The
               | combination of that and whatever you think true E2E means
               | isn't E2E by definition.
        
           | randallsquared wrote:
           | > _I think many at Apple behind this decision are likely
           | pretty surprised by the blowback._
           | 
           | Maybe they would have been surprised where it ended up, but
           | _someone_ in charge there is definitely on board with a
           | reversal on the previous privacy stance, given the doubling-
           | down with an email signal-boosting the NCMEC calling privacy
           | concerns the  "screeching of the minority".
        
           | etempleton wrote:
            | Something probably did happen. Apple no doubt became
            | increasingly aware of just how legally culpable they are
            | for photos uploaded to iCloud.
            | 
            | For content that is pirated, Apple would receive a slap on
            | the wrist, but for content that shows the exploitation of
            | children? And that Apple is hosting on their own servers?
            | There is no doubt every government will throw their full
            | legal weight behind that case.
            | 
            | Now they are stuck between a rock and a hard place. Do
            | they or do they not look at iCloud Photos? Well, not in
            | the cloud, I guess, but right before you upload them. Is
            | that any better? I don't know? Probably not. They really
            | have no choice though. They either scan in the cloud or
            | scan on the device. Scanning in the cloud would at least
            | let people know where their content truly stands, but then
            | they can't advertise end-to-end encryption, and on paper
            | it may have felt like a bigger reversal.
            | 
            | The long and short of it: Apple cannot really legally
            | protect your privacy when it comes to data stored on their
            | servers. The end.
        
             | bostonsre wrote:
              | Yeah, I agree. I see so many comments where people view
              | the issue as black and white and think Apple is clearly
              | in the wrong. Yes, privacy is a great ideal. But there
              | are absolute monsters in this world, and there is zero
              | chance that no Apple employees have brushed up against
              | those monsters. I would ask that every single developer
              | step back and think about putting themselves into a
              | situation where software that you write is being used by
              | monsters to horribly scar children around the world, and
              | ask yourself what you would do. Saying the moral and
              | right thing to do is to protect the privacy of all of
              | your users, including those monsters, just cannot be
              | done if you are not a sociopath who only cares about
              | profits, once you have glimpsed the horror that is out
              | there. Fuck if I would write any code that would make it
              | easier for those monsters to exist or turn a blind eye
              | towards it. It may be a slippery privacy slope, but I
              | don't see how people who have glimpsed the horror would
              | be able to sleep at night without stepping out on that
              | slope.
        
               | salawat wrote:
                | Easy.
                | 
                | When I was young, I was not effectively spied on. My
                | activity and digital trail were ephemeral, and hard to
                | pin down. I wasn't the subject of constant analysis
                | and scrutiny. I had the privilege of being invited
                | into the engine of an idling train to share an
                | engineer's drinks along with my parents, and no one
                | batted an eye.
                | 
                | There were pedophiles then too. Guess what? We didn't
                | sacrifice everyone's right to privacy. We didn't lock
                | more and more rights out of the reach of minors. You
                | didn't need a damn college degree and an insane amount
                | of idealistic commitment to opt out of the
                | surveillance machine. I could fill most prescriptions,
                | even out of state.
                | 
                | We didn't _fear_ the monsters. They were hunted with
                | everything we could justify mustering, but you weren't
                | treated as one by default.
                | 
                | I imagine that maybe, somehow, tomorrow's youth may
                | get to experience that. Not living in fear, but in
                | appreciation of the fact that we too looked into
                | things absolutely worth being afraid of, but resisted
                | the temptation to surrender yet more liberties for the
                | promise of a safety that will never manifest, just
                | change its stripes.
                | 
                | Those are the best nights.
        
               | specialist wrote:
                | > _We didn't sacrifice everyone's right to privacy. We
                | didn't lock more and more rights out of the reach of
                | minors._
               | 
               | No. My elders and society just denied the monsters
               | existed. And punished any one who talked about it.
        
               | drunkpotato wrote:
               | Actually, as long as the monsters are powerful and
               | connected, they are well-protected, then and now.
        
               | californical wrote:
                | I think there will always be a small number of bad
                | people in the world, who will always find a way to be
                | bad.
               | 
               | I know they're out there in the world, but I leave my
               | home anyways, because living my entire life curled up in
               | my bed in fear would be a sad and short life.
               | 
               | I'm more afraid that people will lose the ability to be
               | happy because they're so full of fear and skepticism that
               | they don't form meaningful relationships, relax,
               | experience life.
               | 
               | I'm also more afraid of what can happen when a small
               | group of people are allowed to dictate and control the
               | lives of many, justifying it by pushing fear. Then it'll
               | be more than a small group of abusers harming a small
               | group of people (in this case, child abusers). It becomes
               | a different group of abusers (dictators) harming the
               | entire population through their dystopian politics.
               | 
               | How many people, including children, died when a group of
               | dictators used fear & skepticism in the 1930s to push an
               | agenda that ultimately led to WWII and concentration
               | camps?
               | 
               | We need to be extraordinarily careful that we don't try
               | to save a very small number of lives at the expense of
               | many lives in the future, as a result of policies that we
               | put in place today.
        
             | unstatusthequo wrote:
             | Then grow a pair and point at government influence. Leak
             | DOJ demand letters. Do something to indicate this was
             | forced on them. Or is a gag order in place? That's my bet
             | right now.
        
             | calvinmorrison wrote:
              | I doubt a judge can find Apple liable for hosting
              | encrypted data even if it's illegal. That would mean
              | every E2E storage solution, or even just encrypting and
              | storing files on an S3 bucket host, would be liable.
              | 
              | Apple should use its gigantic legal arm to set good
              | precedents for privacy in court.
        
               | closeparen wrote:
               | First, there is nothing stopping a full legislative
               | assault on end to end encryption, various such proposals
               | have had traction recently in both the US and EU.
               | 
               | The lawyerly phrase is "knew or should have known." It
               | could be argued that now that this technology is
               | feasible, failure to apply it is complicity.
               | 
               | Think about GDPR. Eventually homomorphic encryption is
               | going to be at a point where "we need your data in the
               | clear so we can do computations with it on the server"
               | will cross from "failure to be on the bleeding edge of
               | privacy research" to "careless and backwards."
        
               | netheril96 wrote:
               | > First, there is nothing stopping a full legislative
               | assault on end to end encryption, various such proposals
               | have had traction recently in both the US and EU.
               | 
               | And nothing stops Apple from fighting against the
               | legislative assault. They have made themselves a champion
               | of user privacy and have deep pockets to fight the fight,
               | yet they just surrender at first attempt.
        
               | gotbeans wrote:
               | > champion of user privacy
               | 
               | > surrender at first attempt
               | 
               | The irony
        
               | calvinmorrison wrote:
                | The way to win that situation is to not play.
                | 
                | Pull all Apple products from the shelves for a month
                | and launch a billion-dollar ad campaign about what the
                | hell our government is up to.
                | 
                | At least try to stick to one government. Tech
                | companies are entering tricky waters by trying to
                | follow the laws of many nations (which conflict)
                | simply because their customers are there.
        
             | Unklejoe wrote:
              | They could just keep scanning the data stored on their
              | servers like everyone else does (Google, etc.) and leave
              | it at that.
        
           | GuB-42 wrote:
            | Adding to the confusion, they announced both the AI-powered
            | parental control feature for iMessage and the hash-based
            | scanning of iCloud pictures.
            | 
            | The iMessage thing is actual spyware, by design: it is a
            | way for parents to spy on their children, which is fine as
            | long as it is a choice and the parents are in control. And
            | that's the case here.
            | 
            | The iCloud thing is much more limited: it compares hashes
            | of pictures before you send them to Apple's servers. For
            | some reason, they do it client-side and not server-side.
            | AFAIK, iCloud photos are not E2E encrypted, so I don't
            | really know why they are doing it that way when everyone
            | else does it server-side.
            | 
            | But combine the two announcements and it sounds like Apple
            | is using AI to scan all your pictures. Which is not true.
            | Unless you are a child still under your parents' authority,
            | it will only compare hashes of pictures you send to the
            | cloud, no fancy AI here, and it is kind of understandable
            | for Apple not to want to host child porn.
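            | 
            | Note that "comparing hashes" here means perceptual hashes,
            | which tolerate small edits to an image. A toy Python
            | illustration (hypothetical; NeuralHash's internals differ):
            | 
            |     def hamming(a: int, b: int) -> int:
            |         """Number of differing bits between two hashes."""
            |         return bin(a ^ b).count("1")
            | 
            |     def is_match(image_hash, known_hash, max_dist=0):
            |         # Near-duplicate images yield hashes that differ in a
            |         # few bits; a cryptographic hash would change entirely.
            |         return hamming(image_hash, known_hash) <= max_dist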
        
             | sneak wrote:
              | They're doing it client-side so that they can do it for
              | images and files that aren't ever sent off the device.
             | 
             | I am increasingly convinced this is a backroom deal, just
             | like the iCloud Backup e2e feature (designed and, as I
             | understand it, partially implemented) being nixed on demand
             | (but not legal compulsion) from the FBI.
             | 
             | If you get big enough, the 1A doesn't apply to you anymore.
        
           | xiphias2 wrote:
            | "Netflix could very clearly see the writing on the wall,
            | that the DVD-by-mail business needed to morph into a
            | streaming business."
            | 
            | I don't argue with your point, but actually Reed Hastings
            | originally created Netflix, the DVD rental company, so
            | that he could turn it into a movie streaming company. He
            | had to wait for the internet to be fast enough. His
            | interview with Reid Hoffman is pretty cool:
           | 
           | https://youtu.be/jYhP08uuffs
        
           | AnonC wrote:
           | > ...to be honest I think many at Apple behind this decision
           | are likely pretty surprised by the blowback.
           | 
           | If Apple is surprised by the blowback, it's entirely Apple's
           | fault because it didn't engage with many outside experts on
           | the privacy front.
           | 
           | I recall reading either in Alex Stamos's tweets or Matthew
           | Green's tweets that Apple was invited before for discussions
           | related to handling CSAM, but it didn't participate.
           | 
           | Secretiveness is helpful to keep an advantage, but I can't
           | believe that Apple is incapable of engaging with
           | knowledgeable people outside with legal agreements to keep
           | things under wraps.
           | 
           | This blowback is well deserved for the way Apple operates.
           | And I'm writing this as a disappointed Apple ecosystem user.
        
             | shuckles wrote:
             | Just because Apple didn't attend Alex's mini conference for
             | his contacts does not mean they didn't engage any privacy
             | experts. Furthermore, Alex is a security expert and not a
             | privacy expert. Finally, if that conference was really
             | pushing the envelope on privacy, where are the innovations
             | from those who did attend? The status quo of CSAM scanning
             | is at least as dangerous as an announcement of an
             | alternative.
        
           | jensensbutton wrote:
           | > didn't quite realize that their plan to protect end-to-end
           | encryption doesn't mean that much if one "end" is now forced
           | to run tech that can be abused by governments
           | 
           | Not just one end, but specifically the end that their users
           | thought they had control over.
        
           | joe_the_user wrote:
           | _Thus, my guess is, at least at the beginning, they saw this
           | as a strong win for privacy. As the project progressed,
           | however, it's like they missed "the forest for the trees"
           | and didn't quite realize that their plan to protect end-to-
           | end encryption doesn't mean that much if one "end" is now
           | forced to run tech that can be abused by governments._
           | 
           | Sincere or not, the "the only way we could do true X is by
           | making X utterly useless and meaningless" thing certainly
           | seems like bureaucracy speaking. It's a testament to "the
           | reality distortion field" or "drinking the Koolaid" but it
           | shows that the "fall-off" of such fields can be steep.
        
             | onedognight wrote:
             | What has changed again? The practical state is that photos
             | are uploaded in the clear to iCloud today, and Apple can
             | scan them if they want or more importantly if the US
             | Government asks and with any valid subpoena must hand them
             | over.
             | 
             | Consider this new hypothetical future state:
             | 
             |   1) photos are encrypted on iCloud
             |   2) photos are hash scanned before being encrypted
             |      and uploaded.
             | 
             | In this new state no valid subpoena can get access to the
             | encrypted photos on iCloud without also getting the key on
             | your device. What is your metric where this is not a better
             | state than the current state where every photo is available
             | in the clear to any government that asks and must be
             | preserved when given a valid order to do so?
             | 
             | If you don't upload photos to iCloud there is no change.
             | 
             | In all scenarios you have to trust Apple to write software
             | that does what they say and that they won't write software
             | in response to a subpoena to target you specifically.
             | 
             | The attack with the hash scanning is someone gets a hash
             | for something in your photos specifically (that doesn't
             | match a ton of people) and through some means gets it into
             | the database (plausible), but can't get anything without
             | going through this route. Specifically, subpoenas can't just
             | get all your photos without going through this route.
             | 
             | My conclusion is that you are more protected in the
             | hash-scan-and-encrypt state than in the current state.
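             | 
             | To make state (2) concrete, here is a minimal sketch of
             | the client-side flow (hypothetical names; SHA-256 stands
             | in for the perceptual hash a real system would use):
             | 
             |   import CryptoKit
             |   import Foundation
             | 
             |   // Stub: a real system would ship this with the OS.
             |   func loadHashDatabase() -> Set<Data> { [] }
             | 
             |   let knownBadHashes = loadHashDatabase()
             | 
             |   // Hash-check the photo locally, then encrypt it
             |   // regardless; the server receives only ciphertext.
             |   func prepareForUpload(_ photo: Data, key: SymmetricKey)
             |       throws -> (ciphertext: Data, matched: Bool) {
             |       let digest = Data(SHA256.hash(data: photo))
             |       let matched = knownBadHashes.contains(digest)
             |       let sealed = try AES.GCM.seal(photo, using: key)
             |       return (sealed.combined!, matched)
             |   }
             | 
             | In the real design the match result isn't returned in
             | the clear like this; it's hidden inside the safety
             | voucher attached to the upload.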
        
           | emodendroket wrote:
           | I agree with your analysis -- I also think it was probably
           | meant to mollify law enforcement without sacrificing
           | encryption. The same thought occurred to me independently
           | right when it was announced.
        
           | bobiny wrote:
           | I'm not a techie, but I think iOS already has similar
           | technology: Camera detects faces, Photos can recognize
           | people and objects (or cats) -- all with the help of AI
           | (or ML). I think they are just expanding this and
           | combining it with "share".
        
           | hereme888 wrote:
           | G
        
             | TriStateFarmer wrote:
             | You have misunderstood the entire tech story.
             | 
             | Apple are scanning the images only when they are uploaded
             | to iCloud, because they don't want the CSAM pictures on
             | their servers.
             | 
             | And you're not alone. Reddit, Facebook, Twitter etc are
             | full of shouting angry people who haven't understood the
             | story at all.
             | 
             | It's almost become a Rorschach test between those who
             | understand technology and those who like to shout at
             | things.
        
           | neop1x wrote:
           | They announced CSAM scanning but would you trust there is no
           | scanning if they said they reversed their decision and
           | eventually gave you E2E? Considering how big the firmware
           | size is and that all images are processed by their APIs and
           | they control both SW and HW they could hide such scanning in
           | the firmware or even hardware if they wanted or were forced
           | to. They could add a new separate microcontroller similar to
           | Secure Enclave which would (in addition to "AI-enhancing" the
           | picture) compute and compare CSAM hash. On a positive match,
           | the main CPU would decrypt and run hidden reporting code
           | which would report the image over a TLS-secured channel
           | with Secure Enclave-backed certificate pinning to prevent
           | any MITM. No one would notice because it would be
           | decrypted only
           | after a positive match and no one could see the image being
           | sent out on a positive match because of TLS. It is not just
           | Apple, though. We currently don't have enough control over
           | our devices, especially smartphones. There are some vendor
           | binary blobs on Android too. Now or in the future, these
           | proprietary, very capable devices are ideal for many types of
           | spying. Stallman warned about this many years ago.
        
           | shakezula wrote:
           | There are methods for dealing with CP that would maintain
           | privacy, not subject humans to child pornography, and
           | would jibe with their end-to-end encryption goals. But
           | we'll never know what they're doing under the hood, and I
           | think that's the concerning part.
        
             | new_realist wrote:
             | Please elaborate on these magical methods.
        
               | shakezula wrote:
               | They're not magical. They're just smart use of
               | cryptography.
               | 
               | Ashton Kutcher's foundation Thorn
               | https://en.wikipedia.org/wiki/Thorn_(organization) does a
               | lot of work in this space. They have hashed content
               | databases, they have suspected and watchlisted IP
               | addresses, etc...
               | 
               | I don't know why I'm being downvoted. This is an active
               | area of cryptography research, and I've applied these
               | types of content checks in real life. You can maintain
               | good privacy and still help combat child pornography.
               | 
               | Sometimes HN takes digital cynicism a step too far.
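               | 
               | As one concrete (illustrative) example of the genre,
               | a k-anonymity lookup in the style of the
               | HaveIBeenPwned password API: the client reveals only
               | a short hash prefix to the server, never the content,
               | and the final comparison happens locally.
               | 
               |   import CryptoKit
               |   import Foundation
               | 
               |   // fetchBucket: asks the server for all known-bad
               |   // hash suffixes sharing a 5-hex-char prefix.
               |   func isKnownBad(_ data: Data,
               |                   fetchBucket: (String) -> [String]) -> Bool {
               |       let hex = SHA256.hash(data: data)
               |           .map { String(format: "%02x", $0) }
               |           .joined()
               |       let prefix = String(hex.prefix(5))   // coarse bucket
               |       let bucket = fetchBucket(prefix)     // server sees only this
               |       return bucket.contains(String(hex.dropFirst(5)))
               |   }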
        
           | [deleted]
        
           | shp0ngle wrote:
           | It all seems so weird.
           | 
           | It's actually _easier_ to do the scanning on the backend
           | than to scan users' phones.
           | 
           | My opinion, maybe entirely wrong, is that it's related to
           | how Apple operates in China. In China, Apple does not
           | actually operate their backends; their BE is operated by a
           | government-owned business, which Apple "leases".
           | 
           | Maybe they don't want to enable scanning on the BE for
           | that reason, and over-engineered local device scanning,
           | which really doesn't make much sense if you think about it
           | twice?
        
           | [deleted]
        
           | Unklejoe wrote:
           | That would be the best case scenario, but then why wouldn't
           | they come out and say that?
           | 
           | They could have announced this all in one shot and saved
           | themselves a lot of blowback.
           | 
           | I believe they have to understand that, which makes me think
           | the theory of them planning to add E2E may not be true.
        
             | spuz wrote:
             | They would not talk about this in their public messaging
             | because it would require them to admit that their privacy
             | policies are influenced not just by their own company
             | morals but also by the government.
             | 
             | They would have to say something like "We know some users
             | want to fully encrypt their cloud data. We also want to
             | encrypt their data. But the government won't allow us to do
             | it until we protect their interests in stopping child abuse
             | and terrorism". That would contradict their marketing image
             | of having their interests aligned with their users'.
        
           | mdoms wrote:
           | > there's no easy way to explain it and still defend Apple's
           | Privacy narrative, without doing extreme mental gymnastics.
           | 
           | > > Apple really wanted to preserve "end-to-end" encryption,
           | but they needed to do something to address the CSAM issue
           | 
           | There we are.
        
           | roody15 wrote:
           | That is, it seems like Apple really wanted to preserve
           | "end-to-end" encryption... except they still have not
           | mentioned anything about E2E encryption... and they
           | currently don't encrypt iCloud backups.
           | 
           | You would think Apple would get ahead of this story and
           | mention ... or maybe they don't have any E2E plans at all.
        
             | hn_throwaway_99 wrote:
             | This blog post got a lot of commentary on HN a couple days
             | ago: https://news.ycombinator.com/item?id=28118350.
             | 
             | iMessages are already E2E encrypted, but you are correct,
             | iCloud backups are decryptable with a warrant (and that was
             | reportedly added at the FBI's request). But I agree with
             | Ben Thompson's point in that blog post, that it's OK to not
             | have strong, unbreakable encryption be the default, and
             | that it's still possible to use an iPhone without iCloud
             | and get full E2E.
             | 
             | But with Apple's CSAM proposal it is NOT possible to
             | have an iPhone that Apple isn't continuously scanning.
        
               | sneak wrote:
               | iCloud Backup is a backdoor in the iMessage e2e, and
               | Apple can decrypt all the iMessages. No warrant is
               | required for Apple to do so.
               | 
               | Even with iCloud off, the other endpoint (the phones
               | you're iMessaging with) will be leaking your
               | conversations to Apple because they will still have
               | iCloud Backup defaulted to on.
               | 
               | iMessage is no longer an e2e messenger, and Apple
               | intentionally preserves this backdoor for the FBI and IC.
               | 
               | They access 30,000+ users' data per year without
               | warrants.
        
               | sillysaurusx wrote:
               | > But with Apple's CSAM proposal it is NOT possible
               | to have an iPhone that Apple isn't continuously
               | scanning.
               | 
               | This is mistaken. If you turn off iCloud sync, Apple
               | isn't continuously scanning your phone.
        
               | mtgx wrote:
               | > iMessages are already E2E encrypted
               | 
               | Well, actually... I don't know if it was this post or
               | another, but I remember something about Apple being
               | able to invisibly add "someone else" to the E2E
               | conversation if they really wanted to. But at the very
               | least, this post mentions that most "E2E iMessages"
               | are auto-backed up to iCloud, which is NOT E2E.
               | 
               | https://blog.cryptographyengineering.com/2013/06/26/can-appl...
        
               | websites2023 wrote:
               | > But with Apple's CSAM proposal it is NOT possible
               | to have an iPhone that Apple isn't continuously
               | scanning
               | 
               | This is not accurate. Scanning doesn't happen without
               | iCloud photos enabled.
        
               | shbooms wrote:
               | >iMessages are already E2E encrypted, but you are
               | correct, iCloud backups are decryptable with a warrant
               | (and that was reportedly added at the FBI's request).
               | 
               | Not to be nit-picky but IIRC, this isn't exactly how
               | it went down. iCloud backups have always been
               | decryptable, and Apple had announced they were
               | planning on changing this and making them fully
               | encrypted, but when pressured by the FBI, they
               | scrapped those plans and left them the way they are.
        
               | js2 wrote:
               | > But with Apple's CSAM proposal it is NOT possible
               | to have an iPhone that Apple isn't continuously
               | scanning.
               | 
               | As currently implemented, iOS will only scan photos to be
               | uploaded to iCloud Photos. If iCloud Photos is not
               | enabled, then Apple isn't scanning the phone.
        
               | tremon wrote:
               | > As currently implemented
               | 
               | We don't actually know what is implemented. We only know
               | what Apple's PR machine is saying about it, or do we have
               | the source code somewhere?
        
               | hn_throwaway_99 wrote:
               | Woops, you're right, thanks for the correction.
               | 
               | I think the issue is that, as Ben Thompson pointed out,
               | the only thing preventing them from scanning other stuff
               | now is just _policy_ , not _capability_ , and now that
               | Pandora's box is open it's going to be much more
               | difficult to resist when a government comes to them and
               | says "we want you to scan messages for subversive
               | content".
        
               | js2 wrote:
               | Yes, of course that is true. I use iCloud Photos and find
               | this terribly creepy. If Apple must scan my photos, I'd
               | rather they do it on their servers.
               | 
               | I could maybe understand the new implementation if Apple
               | had announced they'd also be enabling E2E encryption of
               | everything in iCloud, and explained it as "this is the
               | only way we can prevent CSAM from being stored on our
               | servers."
        
               | AnthonyMouse wrote:
               | > "this is the only way we can prevent CSAM from being
               | stored on our servers."
               | 
               | Why is that supposed to be their responsibility?
               | 
               | Given the scale of their business, there is effectively a
               | certainty that people are transmitting CSAM via Comcast's
               | network. That's no excuse for them to ban end to end
               | encryption or start pushing out code to scan client
               | devices.
               | 
               | When you hail a taxi, they don't search you for drugs
               | before letting you inside.
               | 
               | Because they're not the police. It isn't their role to
               | enforce the law.
        
               | emodendroket wrote:
               | If you become known as a hub for this stuff Congress
               | probably won't see it that way.
        
               | alwillis wrote:
               | _Why is that supposed to be their responsibility?_
               | 
               | They (and Google, Microsoft, Facebook, etc.) are
               | essentially mandated reporters; if CSAM is on their
               | servers, they're required to report it.
               | 
               | It's like how a doctor is a mandated reporter regarding
               | physical abuse and other issues.
               | 
               | _Because they're not the police. It isn't their role
               | to enforce the law._
               | 
               | They're not enforcing the law; if the CSAM reaches a
               | certain threshold, they check it out, and if it's the
               | real deal, they report it to the National Center for
               | Missing and Exploited Children (NCMEC), which gets law
               | enforcement involved if necessary.
        
               | kelnos wrote:
               | > _They (and Google, Microsoft, Facebook, etc.) are
               | essentially mandated reporters; if CSAM is on their
               | servers, they're required to report it._
               | 
               | I don't think that's correct. My understanding is that if
               | they _find_ CSAM, they're obligated to report it (just
               | like anyone is). I don't believe they are legally
               | obligated to proactively look for it. (It would be a PR
               | nightmare for them to have an unchecked CSAM problem on
               | their services, so they do look for it and report it.)
               | 
               | Consider that Apple likely does have CSAM on their
               | servers, because they apparently don't scan iCloud
               | backups right now. I don't believe they're breaking any
               | laws, at least until and unless they find any of it and
               | (hypothetically) don't report it.
        
               | kmeisthax wrote:
               | Last year Congress mulled over a bill (EARN IT Act) that
               | would explicitly require online services to proactively
               | search for CSAM, by allowing services to be sued by
               | governments for failing to do so. It would also allow
               | services to be sued for failing to provide law-
               | enforcement back doors to encrypted data. There's also
               | SESTA/FOSTA, already in law, that rescinded CDA 230
               | protection for cases involving sex trafficking.
               | 
               | Quite honestly, my opinion is that any service not
               | scanning for CSAM is living on borrowed time.
        
               | alwillis wrote:
               | _If Apple must scan my photos, I'd rather they do it on
               | their servers._
               | 
               | Scanning on their servers is much less
               | privacy-preserving for users, but don't fret:
               | Dropbox, Facebook, Microsoft, etc. already scan
               | stored photos.
        
               | lifty wrote:
               | Not necessarily. The scanning implementation can be
               | similar to what they plan on doing on your device. I
               | don't want them to do the scanning on my phone. If shit
               | hits the fan and I become a dissident, I would prefer to
               | have the option to stop using iCloud, Dropbox or any
               | other service that might enable my government or the
               | secret police to suppress me.
        
               | Razengan wrote:
               | Woops, yet another example of uninformed outrage. People
               | leading pitchfork parties without knowing the full story.
        
               | tshaddox wrote:
               | Given that they obviously already have the capability to
               | send software updates that add new on-device capability,
               | this seems like a meaningless distinction. It's already
               | "just policy" preventing them from sending any
               | conceivable software update to their phones.
        
               | kelnos wrote:
               | There's a difference -- often a big one -- between being
               | theoretically able to build some feature into a phone
               | (plus the server-side infra and staffing to support it),
               | and actually having built it. That's the _capability_
               | angle.
               | 
               | However, the difference between having a feature enabled
               | based on the value of a seemingly-unrelated user-
               | controlled setting, versus having that feature enabled
               | all the time... is basically zero. Additionally,
               | extending this feature to encompass other kinds of
               | content is a matter of data entry, not engineering.
               | That's the _policy_ angle.
               | 
               | When you don't yet have a capability, it might take a lot
               | of work and commitment to develop it. By contrast,
               | policy can be changed with the flip of a
               | switch.
        
               | Dylan16807 wrote:
               | Making a custom OS update is pretty different from adding
               | an entry into a database, and could also be mitigated
               | against.
        
               | hn_throwaway_99 wrote:
               | I disagree, strongly. Let's say you're an
               | authoritarian government, EvilGov. Before this
               | announcement, if you went
               | to Apple and said "we want you to push this spyware to
               | your iPhones", Apple would and could have easily pushed
               | back both in the court of public opinion and the court of
               | law.
               | 
               | Now though, Apple is already saying "We'll take this
               | database of illegal image hashes _provided by the
               | government_ and use it to scan your phone." It's now
               | quite trivial for a government to say "We don't have a
               | special database of just CSAM, and a different database
               | of just Winnie the Pooh memes. We just have one big DB of
               | 'illegal images', and that's what we want you to use to
               | scan the phones."
        
               | OneLeggedCat wrote:
               | > just Winnie the Pooh memes
               | 
               | I think it's hilarious that one of the world's great
               | nations would scan for exactly this.
        
               | spacedcowboy wrote:
               | There is so much disinformation about this, with
               | people not being informed. Apple have put out clear
               | documentation; it would be a good idea to read it
               | before fear-mongering.
               | 
               | 1) The list of CSAM hashes is that provided by the
               | relevant US government agency, that is it.
               | 
               | 2) The project is not targeted to roll out anywhere other
               | than the USA.
               | 
               | 3) The hashes are baked into the OS, there is no
               | capability to update them other than on signed and
               | delivered Apple OS updates.
               | 
               | 4) Apple has exactly the same leverage with EvilGov as at
               | any other time. They can refuse to do as asked, and risk
               | being ejected from the country. Their stated claim is
               | that this is what will happen. We will see.
        
               | strogonoff wrote:
               | > Apple is already saying "We'll take this database of
               | illegal image hashes provided by the government and use
               | it to scan your phone."
               | 
               | This is incorrect.
               | 
               | Apple has been saying--since 2019![0]--that they can
               | scan for _any potentially illegal content_, not just
               | images, not just CSAM, and not even strictly illegal.
               | 
               | That's what should be opposed. CSAM is a red herring.
               | 
               | [0] https://www.macobserver.com/analysis/apple-scans-uploaded-co...
        
               | noduerme wrote:
               | The difference is that's scanning in the cloud, on their
               | servers, of things people choose to upload. It's not
               | scanning on the user's private device, and currently they
               | have no way to scan what's on a private device.
        
               | strogonoff wrote:
               | The phrase is "pre-screening" of uploaded content, which
               | is what is happening. I'm pretty sure this change in ToS
               | was made to enable this CSAM feature.
        
               | new_realist wrote:
               | Authoritarian governments have never needed preexisting
               | technical gizmos in order to abuse their citizens. Look
               | at what Russia and the Nazis did with no magic tech at
               | all. Privacy stems from valuing human rights, not easily
               | disproven lies about it being difficult to actually
               | implement the spying.
        
               | Razengan wrote:
               | > _Look at what Russia and the Nazis did_
               | 
               | Something strikes me as odd; Why do you say "Russia" but
               | not "Germany"? Why not say "Soviets" or something
               | similar?
               | 
               | Is it because Russia is still portrayed as the bogeyman
               | in the Anglosphere whereas the Germans are cool now?
        
               | vimacs2 wrote:
               | "easily disproven lies", what lies exactly?
               | 
               | Even a Jew stuck in a ghetto in Nazi Germany enjoyed
               | significantly better privacy at home from both the state
               | and capital than a wealthy citizen in a typical western
               | liberal nation today.
               | 
               | Soviet Russia and Nazi Germany in fact prove the exact
               | opposite of what point you think you're making. They were
               | foremost in using the "magic tech" of their day -
               | telecommunications and motorised vehicles.
               | 
               | Even then, they had several rebellions and uprisings that
               | were able to develop thanks to lacking the degree of
               | surveillance tech found today.
        
               | sharikone wrote:
               | True. You can have an authoritarian regime with little
               | technology.
               | 
               | But what people are afraid of is how little it takes when
               | the technology is there. And how Western governments seem
               | to morph towards that. And how some bastions of freedom
               | are falling.
        
               | [deleted]
        
               | kragen wrote:
               | On the contrary, the reason Stalin and Hitler could enact
               | repressive measures unprecedented in human history was
               | precisely new technological developments like the
               | telephone, the battle tank, and IBM cards. Bad people
               | will never value human rights; getting privacy from bad
               | people requires you to make it actually difficult to
               | implement the spying, rather than just lie about it.
               | That's why the slogan says, "cypherpunks write code."
        
               | Grustaf wrote:
               | No, they could enact repressive measures because they had
               | almost absolute power. The form that oppression took
               | would obviously rely on the then current technology, but
               | the fundamental enabler of oppression was power, not
               | technology.
        
               | wizzwizz4 wrote:
               | > _but the fundamental enabler of oppression was power,
               | not technology._
               | 
               | But bad people will always have power.
        
               | vimacs2 wrote:
               | Power which was extended through the use of said
               | technology. I'm fine with focusing on regressive
               | hierarchical systems as the fundamental root of
               | oppression but pretending like the information and
               | mechanical technology introduced in the late 19th century
               | to early 20th century did not massively increase the
               | reach of the state is just being ridiculous.
               | 
               | Prior to motorised vehicles and the telegraph for
               | example, borders were almost impossible to enforce on the
               | general population without significant leakage.
               | 
               | Ironically in spite of powered flight, freedom of
               | movement is in a sense significantly less than even 200
               | years ago when it comes to movement between countries
               | that do not already have diplomatic ties to one another
               | allowing for state authorized travel.
               | 
               | Incidentally, this is part of why many punishments in the
               | medieval age were so extreme - the actual ability of the
               | state to enforce the law was so pathetically limited by
               | today's standards that they needed a strong element of
               | fear to attempt to control the population.
        
               | kmeisthax wrote:
               | There's a flipside to this, too: technology doesn't just
               | improve the ability of the state to enforce law (or
               | social mores), it also improves the ability of its
               | subjects to violate it with impunity.
               | 
               | Borders in medieval Europe weren't nearly as necessary
               | because the physical ability to travel was limited.
               | Forget traveling from Germany to France - just going to
               | the next town over was a life-threatening affair without
               | assistance and wealth. Legally speaking, most peasant
               | farmers were property of their lord's manor. But
               | practically speaking, leaving the manor would be quite
               | difficult on your own.
               | 
               | Many churches railed against motor vehicles because the
               | extra freedom of movement they made possible also broke
               | sexual mores - someone might use that car to engage in
               | prostitution! You see similar arguments made today about
               | birth control or abortion.
               | 
               | Prior to the Internet, American mass communication was
               | effectively censored by the government under a series of
               | legally odd excuses about public interest in efficiently-
               | allocated spectrum. In other words, this was a technical
               | limitation of radio, weaponized to make an end-run around
               | the 1st Amendment. Getting rid of that technical
               | limitation increased freedom. Even today, getting banned
               | from Facebook for spouting too much right-wing nonsense
               | isn't as censorious as, say, the FCC fining you millions
               | of dollars for an accidental swear word.
               | 
               | Whether or not a technology actually winds up increasing
               | or reducing freedom depends more on how it's distributed
               | than on just how much of it there is. Technology doesn't
               | care one way or the other about your freedom. However,
               | freedom is intimately tied to another thing: equity.
               | Systems that economically enfranchise their subjects, have
               | low inequality, and keep their hierarchies in check, will
               | see the "technology dividend" of increased freedom go to
               | their subjects. Systems that impoverish people, have high
               | inequality, and let their hierarchies grow without bound,
               | will instead squander that dividend for themselves.
               | 
               | This is a feedback effect, too. Most technology is
               | invented in systems with freedom and equality, and then
               | that technology goes on to reinforce those freedoms.
               | Unequal systems squander their subjects' ability to
               | invent new technologies. We didn't see this with Nazi
               | Germany, because the Allies wiped them off the map too
               | quickly; but the Soviet Union lost their technological
               | edge over time. The political hierarchy they had
               | established to replace the prior capitalist one made
               | technological innovation impractical. So, the more you
               | use technology to restrict, censor, or oppress people,
               | the more likely that your country falls behind
               | economically and stagnates. The elites at the top of any
               | hierarchical system - aside from the harshest
               | dictatorships - absolutely do not want that happening.
        
               | emodendroket wrote:
               | Could they though? An authoritarian government could just
               | as easily say "if you do not implement this feature, we
               | will not allow you to operate in this country" and refuse
               | to entertain legal challenges.
        
               | adrianN wrote:
               | Apple is a fairly large company; that gives them some
               | monetary leverage over governments. It's not as simple as
               | you make it seem, I think.
        
               | ElFitz wrote:
               | China just hung _its_ biggest tech companies out to
               | dry.
               | 
               | Do you really think they would care in the slightest
               | about banning iPhone?
               | 
               | Or that Putin cares more about people not being able to
               | buy what amounts to a luxury item, than being able to
               | hunt down opposition supporters?
        
               | adrianN wrote:
               | Yeah I'm pretty sure that this wasn't an easy decision
               | for the Chinese government. Even in dictatorships you
               | can't afford to piss off the population _too_ much.
        
               | emodendroket wrote:
               | I'm not sure a move to soak rich guys getting a little
               | too big for their britches is that unpopular with the
               | general population.
        
               | emodendroket wrote:
               | Google is pretty big too but they implemented the
               | filtering in China as requested. Then when they decided
               | they didn't want to do that anymore there went their
               | site.
        
               | pjerem wrote:
               | Should this fact be reassuring, or even more
               | frightening?
               | 
               | It's one thing to push back against governments by
               | saying "I can't". But it's a societal issue if a
               | corporation can say "I don't want to" to a government.
               | 
               | They're really not the same thing, and Apple just
               | burned their "I can't" card: they are implying they
               | can simply say "no" to governments. Which is an even
               | more dystopian thing.
        
               | spacedcowboy wrote:
               | They can say "no" but they have to cope with the fallout
               | from that.
               | 
               | The statement from Apple is that they will say no.
               | Whether that actually ever happens is something we will
               | see, one way or another.
        
               | pseudalopex wrote:
               | Apple won't say no to a court order.
        
               | emodendroket wrote:
               | Of course they won't. The idea of a company refusing to
               | comply with the laws of the countries they operate in
               | purely out of some grand sense of principle just seems
               | naive. Especially if it's the country where HQ is.
        
               | laserlight wrote:
               | Is there a guarantee that Apple will retain their
               | monetary status, or even that money will remain a
               | lever over governments in the future?
        
               | brandon272 wrote:
               | Who wants to be the government who shuts down Apple?
               | These are politicians we are talking about.
        
               | ElFitz wrote:
               | Those already doing far worse than banning a consumer
               | electronics company's products?
               | 
               | And, in all honesty, there's a lot of those.
        
               | brandon272 wrote:
               | I meant in the context of U.S. governments. What Democrat
               | administration wants to ban Apple? What Republican
               | administration wants to ban Apple?
               | 
               | Sounds like political suicide on either side: government
               | interference with the country's largest company, an
               | iconic brand, for no reason other than to hopefully spy
               | more on your citizens.
        
               | emodendroket wrote:
               | The concern being raised is about some unnamed
               | authoritarian government without US legal norms, though.
        
               | nl wrote:
               | Who wants to be the government that bans Winnie the Pooh?
               | 
               | Ohh... oops.
        
               | loyukfai wrote:
               | Chinese Apple users already have their data treated
               | differently.
               | 
               | Any chance this is already operational there?
        
               | zarzavat wrote:
               | Authoritarian countries are the least of the worries.
               | There's a large class of illegal imagery that is policed
               | in democratic countries: copyrighted media. We are only
               | two steps away from having your phone rat you out to
               | the MPAA because you downloaded a movie and stored it
               | on your phone.
               | 
               | I can guarantee that some industry bigwigs are salivating
               | at the prospect of this tech. Imagine YouTube
               | copyright strikes but _local to your device_.
        
               | kmeisthax wrote:
               | This would have been a huge fear of mine if this was 10
               | or 15 years ago and legitimate publishers were still
               | engaging in copyright litigation for individual
               | noncommercial pirates. The RIAA stopped doing individual
               | lawsuits because it was surprisingly expensive to ruin
               | people's lives that way over a handful of songs. This
               | largely also applies to the MPAA as well. The only people
               | still doing this in bulk are copyright trolling outfits -
               | companies like Prenda Law and Malibu Media. They make
               | fake porn, covertly share it and lie to the judge about
               | it, sue you, and then offer a quick and easy settlement
               | route.
               | 
               | (Yes, I know, it sounds like I'm splitting hairs
               | distinguishing between copyright maximalism and copyright
               | trolling. Please, humor me.)
               | 
               | What copyright maximalists really want is an end to the
               | legal protections for online platforms (DMCA 512 safe-
               | harbor) so they can go after notorious markets directly.
               | They already partially accomplished this in the EU. It's
               | way more efficient from their end to have one throat to
               | choke. Once the maximalists get what they want, platforms
               | that want to be legitimate would be legally compelled to
               | implement filters, ala YouTube Content ID. But those
               | filters would apply to content that's published on the
               | platform, not just things on your device. Remember:
               | they're not interested in your local storage. They want
               | to keep their movies and music off of YouTube, Vimeo,
               | Twitter, and Facebook.
               | 
               | Furthermore, they largely _already_ have the scanning
               | regime they want. Public video sites make absolutely no
               | sense to E2E encrypt; and most sites that deal in video
               | already have a content fingerprinting solution. It turns
               | out it was already easy enough to withhold content
               | licenses to platforms that accept user uploads, unless
               | they agreed to also start scanning those uploads.
               | 
               | (Side note: I genuinely think that the entire concept of
               | safe harbors for online platforms is going to go away,
               | just as soon as the proposals to do so stop being
               | transparently partisan end-runs around the 1st Amendment.
               | Pre-Internet, the idea that an entity could broadcast
               | speech without having been considered to be the publisher
               | of it didn't really exist. Publishers were publishers and
               | that was that.)
        
               | noduerme wrote:
               | One reason I haven't used iTunes in years, and don't
               | use iOS, is that I have a huge collection of mp3s.
               | It's almost all music that I bought and ripped from
               | CDs over decades, and it's become just stupidly
               | difficult to organize your own mp3 collection via
               | Apple's players. Even back in the iPod days, you
               | could put your mp3s on an iPod but you needed
               | piracy-ware to rip them back off. I can easily see
               | this leading to people like me getting piracy notices
               | and having to explain that no, I bought this music,
               | since we're now in a world where not that many people
               | have their own unlocked music collection anymore.
        
               | FridayoLeary wrote:
               | You could access the music no problem, just some metadata
               | was messed up.
        
               | emodendroket wrote:
               | They still sell iTunes Match subscriptions, a service
               | entirely premised on you having your own MP3s.
        
               | ElFitz wrote:
               | > Authoritarian countries are the least of the worries.
               | 
               | That really depends. And is quite a strange, perhaps
               | worrying, take.
               | 
               | Because, for many people, copyright strikes are the least
               | of their problems.
               | 
               | Not that I disagree on it being an issue.
        
               | zarzavat wrote:
               | The only authoritarian country that holds any sway over
               | Apple is China and China can easily just ban iPhones if
               | they feel like it, especially with Xi Jinping in charge.
               | 
               | Democratic countries are much more of a danger because
               | Apple has offices in them and cannot easily pull out of
               | the market if they feel the laws are unconscionable.
        
               | emodendroket wrote:
               | I think Apple is pretty tied to China both as a market
               | for their goods and as the place they get manufacturing
               | done.
        
               | shantara wrote:
               | Russia already forced Apple to allow pre-installing
               | government-approved apps on local iPhones:
               | https://www.theverge.com/2021/3/16/22334641/apple-follows-ru...
        
               | emodendroket wrote:
               | All these years I have heard of people getting in trouble
               | for sharing copyrighted materials with others but never
               | for possessing them or downloading them (BitTorrent
               | somewhat blurs the line here). And there's nothing
               | illegal about ripping your own CDs. I find this scenario
               | far-fetched. iTunes Match would seem to be a
               | long-running honeypot if you take this seriously.
        
               | [deleted]
        
               | ribosometronome wrote:
               | Why do you think essentially no one is complaining about
               | using ML to understand the content of photos, then
               | (especially in comparison to this rather targeted CSAM
               | feature)? My impression is that both Apple and Google
               | have already been doing that since what? 2016? Earlier?
               | There's been no need for a database of photos; either
               | company could silently update those algorithms to ping
               | on guns, drugs, Winnie the Pooh memes, etc.
        
               | alwillis wrote:
               | _Why do you think essentially no one is complaining about
               | using ML to understand the content of photos..._
               | 
               | At least in Apple's case, ML is used only for a minor
               | child (less than 13 years old) on a Family account
               | where the parent/guardian has opted in to being
               | alerted if potentially bad content is either sent or
               | received using the Messages app.
        
               | ribosometronome wrote:
               | Google and Apple both use ML to understand the content of
               | photos already. Go into your app and you can search for
               | things like "dog", "beer", "birdhouse", and "revolver".
               | Given that it can do all that (and already knows a
               | difference between type of guns), it doesn't seem like a
               | stretch to think it could, if Apple or Google wanted,
               | understand "cocaine", "Winnie the Pooh memes", or
               | whatever repressive-government style worries are had. And
               | it's existed for years, without slippery sloping into
               | anything other than this Messages feature for children.
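               | 
               | And this isn't hidden functionality; it's exposed
               | through public APIs. A rough sketch with Apple's
               | Vision framework (recent SDKs; the 0.6 cutoff is
               | arbitrary):
               | 
               |   import Foundation
               |   import Vision
               | 
               |   // On-device classification, roughly what powers
               |   // searching Photos for "dog" or "revolver".
               |   func labels(forImageAt url: URL) throws -> [String] {
               |       let request = VNClassifyImageRequest()
               |       try VNImageRequestHandler(url: url).perform([request])
               |       return (request.results ?? [])
               |           .filter { $0.confidence > 0.6 }
               |           .map { $0.identifier }
               |   }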
        
               | ChrisKnott wrote:
               | By default iPhones autocategorise your photos, don't
               | they?
        
               | katbyte wrote:
               | They sure do, and now extract text from images.
        
               | salawat wrote:
               | _raises hand_
               | 
               | Have been whining about big data and ML recognition
               | systems for years.
        
               | callmeal wrote:
               | >As currently implemented, iOS will only scan photos to
               | be uploaded to iCloud Photos. If iCloud Photos is not
               | enabled, then Apple isn't scanning the phone.
               | 
               | Except for the "oops, due to an unexpected bug in our
               | code, every image, document, and message on your device
               | was being continuously scanned" mea culpa we will see a
               | few months after this goes live.
        
               | resfirestar wrote:
               | The potential for this is overblown and I think a lot
               | of people haven't taken the time to understand how the
               | system is set up. iOS is not scanning for hash matches
               | and phoning home every time it sees one; it's
               | attaching a cryptographic voucher to every photo
               | uploaded to iCloud, and only once some number of
               | photos are hash matches (the fact of a match is not
               | revealed until the threshold number is reached) can
               | Apple decrypt the photos that match. This is not
               | something where "oops, a bug got non-iCloud photos
               | too" or "a court told us to flip the switch to scan
               | the other photos" would make any sense; some
               | significant modification would be required.
               | Which I agree is a risk especially with governments like
               | China/EU that like to set conditions for market access,
               | just not a very immediate one.
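               | 
               | The threshold part is the interesting bit. A toy
               | version of the underlying idea - Shamir secret sharing
               | over a small prime field; Apple's actual construction
               | layers this with private set intersection, so this is
               | only the flavor - shows why fewer than `threshold`
               | matches reveal nothing at all:
               | 
               |   // Toy Shamir split/reconstruct mod p = 2^31 - 1.
               |   let p: UInt64 = 2_147_483_647
               | 
               |   func modpow(_ b0: UInt64, _ e0: UInt64) -> UInt64 {
               |       var (b, e, r) = (b0 % p, e0, UInt64(1))
               |       while e > 0 {
               |           if e & 1 == 1 { r = r * b % p }
               |           b = b * b % p
               |           e >>= 1
               |       }
               |       return r
               |   }
               | 
               |   // Split `secret` into `count` shares; any
               |   // `threshold` of them reconstruct it.
               |   func makeShares(secret: UInt64, threshold: Int,
               |                   count: Int) -> [(x: UInt64, y: UInt64)] {
               |       let coeffs = [secret] + (1..<threshold)
               |           .map { _ in UInt64.random(in: 0..<p) }
               |       return (1...count).map { i in
               |           let x = UInt64(i)
               |           var y: UInt64 = 0
               |           for c in coeffs.reversed() { y = (y * x + c) % p }
               |           return (x, y)
               |       }
               |   }
               | 
               |   // Lagrange interpolation at x = 0 recovers the secret.
               |   func reconstruct(_ shares: [(x: UInt64, y: UInt64)]) -> UInt64 {
               |       var s: UInt64 = 0
               |       for (i, a) in shares.enumerated() {
               |           var num: UInt64 = 1, den: UInt64 = 1
               |           for (j, b) in shares.enumerated() where i != j {
               |               num = num * (p - b.x) % p
               |               den = den * ((a.x + p - b.x) % p) % p
               |           }
               |           s = (s + a.y * (num * modpow(den, p - 2) % p) % p) % p
               |       }
               |       return s
               |   }
               | 
               | With a threshold of 30, any 29 shares are
               | information-theoretically useless; the 30th suddenly
               | makes all matched photos decryptable at once.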
        
               | FireBeyond wrote:
               | > it's attaching a cryptographic voucher to every photo
               | uploaded to iCloud that, if some number of photos
               | represent hash matches
               | 
               | I see this number very quickly getting set to '1',
               | because the spin in the opposite direction is "What, so
               | you're saying, people get X freebies of CSAM that they
               | can store in iCloud that Apple will never tell anybody
               | about?"
               | 
               | _That_ is a whole other PR disaster.
        
               | spsful wrote:
               | I hear you, 100%, but as a longtime Apple and iCloud user
               | I have been through the absolute wringer when it comes to
               | iCloud and have little faith in it operating correctly.
               | 
               | 1. About 3 years ago, there were empty files taking up
               | storage on my iCloud account, and they weren't visible on
               | the website so I couldn't delete them. All it showed was
               | that an excessive amount of storage was taken up. Apple
               | advisors had no idea what was going on and my entire
               | account had to be sent to iCloud engineers to resolve the
               | issue. They never followed up, but it took months for
               | these random ghost files to stop taking up my iCloud
               | storage.
               | 
               | 2. Sometimes flipping iCloud settings on and off a lot
               | causes delays in the OS, and then you have to kill the
               | Settings app and reopen it to see if they're actually
               | enabled. Back when
               | ATT was just being implemented, I noticed it was grayed
               | out on my phone every time I was signed in to iCloud, but
               | was completely operational when I was signed out. Many
               | others had this issue and there was literally no setting
               | to change within iCloud; it was a bug that engineering
               | literally acknowledged to me in an email (they
               | acknowledged that it happened for some users and fixed it
               | in a software update).
               | 
               | Screwups happen in an increasingly complex OS and I just
               | feel that there will be a day when this type of bug
               | surfaces in addition to everything that already happens
               | to us end users.
        
               | kazoomonger wrote:
               | > it's attaching a cryptographic voucher to every photo
               | uploaded to iCloud that, if some number of photos
               | represent hash matches
               | 
               | OK, but what if Apple silently pushes out an update (or
               | has existing code that gets activated) that
               | "accidentally" sets that number to zero? Or otherwise
               | targets a cryptographic weakness that they know about
               | because they engineered it? That wouldn't require
               | "significant modification".
               | 
               | Fundamentally, it's closed software and hardware. You
               | can't and shouldn't trust it. Even if they did do some
               | "significant modification" how are you going to
               | notice/prove it?
        
               | hypothesis wrote:
               | Apple can't even store settings values correctly.
               | 
               | After they introduced AirTags I went and disabled item
               | detection on all my devices. Yesterday I went into
               | settings and guess what, that feature is enabled again.
               | On all my devices! I'm not sure when that happened, but
               | my guess would be that latest iOS update caused that...
               | 
               | My point is, you will only have things working to the
               | extent you have testing for them, and test coverage is
               | likely less robust for less popular features or where
               | perception tells developers that "no one is going to
               | turn that mega-feature off".
        
               | m4rtink wrote:
               | Well, it's an unauditable, locked-down proprietary
               | black box.
        
               | jq-r wrote:
               | Just like how Facebook and its Messenger application
               | "forget" that you had your in-app sounds turned off
               | and turn them back on now and then.
        
               | alwillis wrote:
               | _Except for the "oops, due to an unexpected bug in our
               | code, every image, document, and message on your device
               | was being continuously scanned" mea culpa we will see a
               | few months after this goes live._
               | 
               | Only images that match the hashes of the database of CSAM
               | held by the National Center for Missing and Exploited
               | Children (NCMEC) that are uploaded to iCloud Photos are
               | checked.
               | 
               | Based on their technical documents, it's not even
               | possible for anything else to be checked; even if they
               | could, they learn nothing from documents that aren't
               | CSAM.
        
               | noduerme wrote:
               | >>> Only images that match the hashes of the database of
               | CSAM held by the National Center for Missing and
               | Exploited Children (NCMEC) that are uploaded to iCloud
               | Photos are checked.
               | 
               | Incorrect. All files on the phone will be checked against
               | a hash database _before_ being uploaded to iCloud. Any
               | time before, which means all the time, if you have iCloud
               | enabled.
        
               | pseudalopex wrote:
               | Nothing limits Apple to shipping just the hashes provided
               | by NCMEC. And people who worked with NCMEC's database say
               | it contains documents that aren't CSAM.
        
               | whoisjuan wrote:
               | That's wrong. They are also enabling on-device detection
               | for Messages.
        
               | js2 wrote:
               | Communication safety in Messages is a different feature
               | than CSAM detection in iCloud Photos. The two features
               | are not the same and do not use the same technology. In
               | addition:
               | 
               |  _Communication safety in Messages is only available for
               | accounts set up as families in iCloud. Parent /guardian
               | accounts must opt in to turn on the feature for their
               | family group. Parental notifications can only be enabled
               | by parents/guardians for child accounts age 12 or
               | younger._
               | 
               | https://www.apple.com/child-
               | safety/pdf/Expanded_Protections_...
        
               | b0tzzzzzzman wrote:
                | Yeah, for now this is what is stated. It should be
                | alarming how much they pivoted and changed without any
                | background statement. Something more is at play.
        
               | m4rtink wrote:
                | That's the thing - with an unauditable black box like
                | iOS you have to trust the vendor that it actually does
                | what they say.
                | 
                | If they suddenly start to break your trust, who knows
                | what they will push to that black box next time? You
                | have no way to tell or prevent it, other than leaving
                | the platform altogether.
        
               | browningstreet wrote:
               | For now, in v1.
        
               | alwillis wrote:
                | _But with Apple's CSAM proposal it is NOT possible to
                | have an iPhone that Apple isn't continuously scanning._
               | 
               | Sure it is--just don't use iCloud Photos. They've been
               | quite clear about this.
        
               | dmix wrote:
                | iMessage isn't true E2E encryption. Apple could easily
               | decode messages for a 3rd party if they wanted to. This
               | has been widely discussed on HN previously. All they have
               | to do is insert an additional private key in the two
               | party chat, as the connection is ultimately handled
               | centrally.
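                | 
                | A toy Swift sketch of why that works: the sender
                | encrypts to whatever device keys the central directory
                | returns, so a silently appended extra key reads
                | everything. (All names here are hypothetical, not
                | Apple's actual implementation.)
                | 
                |     import CryptoKit
                |     import Foundation
                | 
                |     // One ciphertext per device key the directory
                |     // hands back; the sender cannot tell a real
                |     // device from an inserted eavesdropper key.
                |     func fanOut(
                |         _ msg: Data,
                |         to keys: [Curve25519.KeyAgreement.PublicKey],
                |         from me: Curve25519.KeyAgreement.PrivateKey
                |     ) throws -> [Data] {
                |         try keys.map { k in
                |             let secret =
                |                 try me.sharedSecretFromKeyAgreement(with: k)
                |             let key = secret.hkdfDerivedSymmetricKey(
                |                 using: SHA256.self, salt: Data(),
                |                 sharedInfo: Data(), outputByteCount: 32)
                |             // .combined is non-nil for the default
                |             // 12-byte nonce.
                |             return try AES.GCM.seal(msg, using: key)
                |                 .combined!
                |         }
                |     }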
               | 
               | There's a very good reason Signal (and previously
               | Whatsapp) were recommended over iMessage.
               | 
               | Not to mention the inability to swap iMessage off for SMS
               | which has made it a favourite exploit target for hacking
               | firms like NSO.
        
               | jakemauer wrote:
               | You can turn off iMessages and force the phone to use
               | SMS, however you can't do it on a per-contact or per-
               | conversation basis.
        
               | alwillis wrote:
               | _Apple could easily decode messages for a 3rd party if
               | they wanted to._
               | 
               | They don't have the keys, so how exactly would they
               | decrypt these messages?
        
               | dmix wrote:
               | They would be able to decrypt messages after the point of
               | interception, not retroactively.
        
               | trauco wrote:
               | "iMessages are already E2E encrypted,"
               | 
               | Unless you use iCloud Backup for iMessage, in which case
               | Apple holds the keys to decrypt them.
        
               | raxxorrax wrote:
                | They are not E2E encrypted if Apple can decrypt them.
                | We should use the correct terminology here, especially
                | so we don't give users a false sense of security.
        
               | prophesi wrote:
               | > But I agree with Ben Thompson's point in that blog
               | post, that it's OK to not have strong, unbreakable
               | encryption be the default, and that it's still possible
               | to use an iPhone without iCloud and get full E2E.
               | 
               | I disagree completely on this. For one, users aren't
               | aware that using iCloud means that Apple has your
               | decryption key and can thereby read and share all of your
               | phone's data.
               | 
               | And two, opt-out is a dark pattern. Particularly if you
               | surround it with other hurdles, like long click-wrap and
               | confusing UI, it's no longer a fair choice, but
               | psychological manipulation to have users concede and
               | accept.
               | 
                | Third, as hinted above, smarter criminals, the ones the
                | FBI should actually worry about, will know to opt out. So
               | instead the vast majority of innocent users have their
               | privacy violated in order to help authorities catch
               | "dumb" criminals who could have very well been caught
               | through countless other avenues.
               | 
               | disclaimer: I just read Bruce Schneier's "Click Here To
               | Kill Everybody" after the pipeline ransomware attack, and
               | these points come straight from it.
        
               | speleding wrote:
                | As someone who frequently has to help older family
                | members recover from lost passwords and broken phones, I
                | sincerely hope they leave strong encryption of backups
                | as something you explicitly opt in to.
               | 
               | It's fine if you know you want to be the sole person in
               | the world who can get to your data and have the
               | discipline to store master passwords, and you accept
               | Apple support cannot help you even if you have a purchase
               | receipt for the device. But for the average older person
               | that's not a good default.
        
               | londons_explore wrote:
               | > help older family members recover from lost passwords
               | and broken phones,
               | 
               | Apple just needs to design good UX to help these people.
               | 
               | Perhaps a large cardboard 'key' comes with every phone.
               | When first setting up the phone, you scan a QR code on it
               | to encrypt your data with the key, and then you tell the
                | owner that if they ever lose the key to their phone, they
               | won't be able to get in again.
               | 
               | People understand that - it's like losing the only key to
               | a safe.
               | 
               | From time to time you require them to rescan the key to
               | verify they haven't lost it, and if they have, let them
               | generate a new one.
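                | 
                | A minimal CryptoKit sketch of that scheme, assuming
                | the QR code simply carries a random 256-bit key (names
                | and flow are illustrative, not any actual Apple
                | design):
                | 
                |     import CryptoKit
                |     import Foundation
                | 
                |     // The scanned QR payload is just a 256-bit key
                |     // (here we mint one; in the cardboard scheme it
                |     // would come pre-printed in the box).
                |     let recoveryKey = SymmetricKey(size: .bits256)
                |     let qrPayload = recoveryKey.withUnsafeBytes {
                |         Data($0).base64EncodedString()
                |     }
                | 
                |     // Setup: seal the backup under the scanned key.
                |     let backup = Data("backup contents".utf8)
                |     let sealed = try AES.GCM.seal(backup,
                |                                   using: recoveryKey)
                | 
                |     // Recovery: rescan the QR code, rebuild the key,
                |     // decrypt. Lose the cardboard and this is gone.
                |     let rescanned = SymmetricKey(
                |         data: Data(base64Encoded: qrPayload)!)
                |     let restored = try AES.GCM.open(sealed,
                |                                     using: rescanned)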
        
               | cameronbrown wrote:
                | Far easier to develop a culture of using something
                | physical like YubiKeys than flimsy cardboard QR codes.
                | The key metaphor makes a lot more sense that way, too.
        
               | WastingMyTime89 wrote:
                | So you are suggesting making everyone's life more
                | cumbersome, with the very real potential of losing all
                | the data on your phone because you lost a piece of
                | cardboard or forgot a key, for which benefit exactly?
                | Being hypothetically safe from government scrutiny of
                | information they could get a dozen other ways?
                | 
                | I really like E2E encryption of messages with proper
                | backups on my end. It's good to know the carrier of my
                | messages can't scan them for advertising purposes. But
                | for my phone, if I know there is no way to easily get
                | the data back if I have an issue, that seems like a
                | major hassle with very little upside.
        
               | GekkePrutser wrote:
                | You don't want that key to come in the box, because if
                | Apple printed it, they know it.
                | 
                | It's better if the key is generated once you already
                | have the phone.
        
               | y7 wrote:
               | I agree. If Apple wants to go this route, they should
               | abstain from pushing the user towards using iCloud as
               | they currently do, and instead just present a clear opt-
               | in choice.
               | 
               | "Do you want to enable iCloud photos? Your private photos
               | will be uploaded encrypted to Apple's servers, so that
               | only you can access them. Before uploading, your photos
               | will be locally scanned on your device for any illegal
               | content. Apple will only be notified once a sufficient
               | volume of illegal content is detected."
        
               | oarsinsync wrote:
               | > _Your private photos will be uploaded encrypted to
               | Apple 's servers, so that only you can access them._
               | 
               | This would be a large change from the way it works today,
               | where Apple can view your iCloud photos. They've not made
               | any statements indicating that is going to change.
        
               | bbarnett wrote:
               | Yes, but...
               | 
               | An option like this, after the lawyers get at it, will be
               | 14 pages long.
               | 
               | "We can change this at any time", "except by court
               | order", "no guarantee", "not liable", and on and on and
               | on...
        
               | cheschire wrote:
               | I dunno, when I read the choices in iTunes about backups,
                | I didn't feel particularly manipulated. It seemed
                | straightforward. The trade-offs were clear to me.
               | 
               | I guess some people just see dark patterns where I don't.
        
               | ElFitz wrote:
                | The least of the biases at play here is change
                | aversion.
                | 
                | Making things "opt-out", without even making opting
                | out hard, dramatically increases the number of people
                | who stay enrolled. It can be used for many purposes.
                | 
                | For example, the UK switched its organ donation laws
                | from opt-in to opt-out.
                | 
                | Another example: my government has decided that
                | selling its citizens' personal information to garages,
                | insurance companies, etc., when we buy cars should
                | also be opt-out.
               | 
               | So they freely sell the car make, acquisition date,
               | buyer's name,... [1][2]
               | 
               | Unless it has changed with GDPR. I've never bought a car.
               | 
               | While we're on the topic of GDPR, that is exactly why it
               | insists on making cookies, tracking, and the "sharing of
               | information with partners" opt-in.
               | 
               | And that is _without_ hiding what you're doing, making it
               | unclear or confusing, or requiring multiple actions to
               | change it.
               | 
               | (Links in French, sorry)
               | 
               | [1]: https://mobile.interieur.gouv.fr/Repertoire-des-
               | informations...
               | 
               | [2]: https://www.carte-grise.org/actus/2014/05-12-L-Etat-
               | vends-le...
        
               | jjgreen wrote:
               | _For example, the UK switched its organ donation laws
               | from opt-in to opt-out._
               | 
               | We have, yet oddly, it's still called "organ donation".
        
               | grkvlt wrote:
               | don't be disingenuous - fairly obviously it is a donation
               | because it is optional, voluntary and non-compensated.
        
               | kirillzubovsky wrote:
               | Indeed, up until reading these comments I had no idea
               | that iCloud wasn't encrypted.
               | 
                | Everything about Apple's messaging makes you believe
                | otherwise. That seems pretty disingenuous.
                | 
                | Then again, 99% of consumers have very little choice
                | that doesn't involve hurdles and complicated setup. We
                | all get the same goo, just under different brands.
                | 
                | Good, bad, or just a fact of life?
        
               | porpoisemonkey wrote:
               | > Indeed, up until reading these comments I had no idea
               | that iCloud wasn't encrypted.
               | 
               | iCloud data is encrypted at rest (edit: except for Mail
               | apparently). The type of encryption (service or end-to-
               | end [E2E]) is specified here:
               | https://support.apple.com/en-us/HT202303
               | 
               | It can be argued that from a user's viewpoint not having
               | E2E encryption is tantamount to not having encryption at
               | all, but from a technical standpoint the data is
               | encrypted.
        
               | heavyset_go wrote:
               | It's encrypted at rest, but Apple has the decryption
               | keys, and will give up your customer data when asked to
               | by the government[1]. Also, iCloud photos are not
               | encrypted[2].
               | 
               | [1] https://www.apple.com/legal/transparency/us.html
               | 
               | [2] https://support.apple.com/en-us/HT202303
        
               | porpoisemonkey wrote:
               | > Also, iCloud photos are not encrypted[2].
               | 
               | According to the table on the second link iCloud Photos
               | are encrypted on the server (at rest). Am I missing
               | something?
        
               | prophesi wrote:
                | This is more from Schneier's book, but I would say the
                | most important reason E2E encryption should be the
                | default is that in the event of a data breach, nothing
                | would be exposed. Without E2E, if a company's servers
                | are hacked, the attackers have access to the symmetric
                | encryption keys, and therefore all of the data. E2E
                | also ensures that the company can't be selling/sharing
                | your data, as they don't have access to it in the
                | first place.
               | 
               | Edit: I also meant iCloud backups in my original post and
               | how Apple can decrypt your E2E encrypted iMessages with
               | the key the backups contain. But I posted it last night
               | and couldn't edit it once I caught the error. It would be
               | amazing for other iCloud services to have E2E encryption
               | so long as the implications of iCloud backups having your
               | encryption keys is stated front and center when choosing
               | to opt-in.
        
             | chrisfinazzo wrote:
             | CSAM scanning is probably the easiest thing they could do
             | to satisfy this demand, and if you don't use iCloud Photos,
             | you're not affected by it at all.
             | 
             | As far as encrypted backups go, it's an open question
             | whether they want to deal with the legal and support
              | headaches that such a change would bring. If they
              | continued to do nothing, Congress might force their hand
              | by legislatively outlawing stronger encryption - it was
              | shit or get off the pot.
             | 
             | For users, if you enable this feature, but then lose your
             | password, you are entirely screwed and Apple can't help
             | you. Encrypting "In-transit" as a middle ground is likely
             | good enough for most people, until researchers manage to
             | come up with a better solution.
        
               | mapgrep wrote:
               | > For users, if you enable this feature, but then lose
               | your password, you are entirely screwed and Apple can't
               | help you.
               | 
               | Exactly the same with the phone if you forget your
               | PIN/passcode. So they already do this.
        
               | raxxorrax wrote:
               | Especially on Apple devices. You cannot just reset the
               | device if it was ever associated with an account and you
               | don't know the password anymore. You must contact Apple
               | support. Some people still want to employ Apple devices
               | in a business environment.
               | 
                | The amount of money that is spent on their devices is
                | just insane.
        
             | [deleted]
        
             | nicce wrote:
             | Well, iCloud Photos are now "kinda" E2E encrypted.
             | 
              | The iOS 15 beta has a new recovery-by-secret option
              | (which would only be useful with E2EE?)
              | 
              | And Child Safety was most likely announced because of
              | the leaks. The motive behind the leaks is unknown. They
              | might be waiting for September now.
        
             | gidam wrote:
              | What better way "to preserve end-to-end encryption" than
              | shipping a technology that can bypass it at any time?
        
           | adam_ellsworth wrote:
           | What follows is conjecture:
           | 
           | >... to address the CSAM issue lest governments come down on
           | them hard.
           | 
           | And I think that is the springboard of why this is happening.
           | 
           | It's no secret that Apple has lobbyists which by turns have a
           | "pulse" on the trajectory of Bills in the House and Senate.
           | 
           | What immediately came to mind to me was H.R.485 of 2021's
           | House Session:
           | 
           | https://www.congress.gov/bill/117th-congress/house-
           | bill/485/...
           | 
            | My (layperson) estimation is that Apple knows a rewording
            | of it is liable to pass both House and Senate in some form
            | in the 2022 or 2023 sessions, and is attempting to get
            | ahead of the issue (most likely as a "rider" bill at this
            | point).
           | 
           | That's my thought process on what may have precipitated
           | Apple's move toward "CSAM"; and I welcome it in some ways
           | (unequivocally, children should not be subjected to sexual
           | abuse) and am wary of it in others (ML is in its infancy in
            | too many respects, and if a false positive arises, who's
            | to say what's liable to happen at the law-enforcement
            | level).
           | 
           | There are infinitely many concerns to be discussed on both
           | sides of this argument, and I want both to succeed in their
           | way. CSAM, however, feels like an attempt to throw a blanket
           | over Apple's domains-of-future-interests while playing an
           | Altruism Card.
           | 
           | One thing is certain, though: they've opened the floodgates
           | to these discussions on an international level...
        
             | cronix wrote:
             | > It's no secret that Apple has lobbyists which by turns
             | have a "pulse" on the trajectory of Bills in the House and
             | Senate.
             | 
              | Great, let the bills pass and then they get to publicly
              | blame Congress for being "forced to legally comply"
              | instead of jumping the gun and doing an about-face on
              | 80% of the reputation they've built over the last 20+
              | years.
             | 
             | There is 0 reason to do this before being legally forced
             | to. It's not a gamble with a possible upside somewhere.
             | It's directly shooting yourself in the foot.
        
               | unityByFreedom wrote:
               | Yes, 100% agree, make congress go through with passing
               | such a law. Make it a public discussion. Make them
               | consider if they would like their own devices to be built
               | insecurely. They're among the biggest targets of people
               | who would like to subvert US policy.
        
               | adrianN wrote:
                | Your lobbyists might get some extra favors if you don't
                | pin the blame on Congress.
        
           | dukeofdoom wrote:
            | Apple knows what they are doing. Don't fool yourself by
            | giving them plausible excuses. The further away you are
            | from your phone, the more privacy you will have.
            | 
            | Almost everyone at the Jan 6th insurrection was traced,
            | because unlike antifa they took their phones with them,
            | making them easy for the government to trace. Now they
            | are in solitary confinement in prison while the world
            | laughs at their expectations of privacy from their
            | phones.
           | 
           | If they get a covid passport implemented, you won't be able
           | to go out in public without your phone to prove you've been
            | vaccinated. So leaving your phone behind will not be much
            | of an option if that happens. Metadata and contact
            | tracing will keep track of everyone you're near. This
            | dystopian future is
           | possibly just a few weeks away.
        
           | fidesomnes wrote:
           | The Apple employee throwaway justifying how much they suck.
        
           | volta83 wrote:
           | > but they needed to do something to address the CSAM issue
           | lest governments come down on them hard.
           | 
           | If Apple does not have decryption keys for iCloud content,
           | they can't decrypt it, only the user can. For Apple, it's not
           | technically feasible, and the law supports that.
           | 
           | Apple has now demonstrated that it is technically feasible to
           | overcome this challenge by doing things "pre-encryption" on
           | the user's device.
           | 
            | From a user's point of view, Apple's privacy promise is
            | worth nothing. It makes no sense for them to encrypt
            | anything if they leak all the data before encryption.
            | Children today, terrorism tomorrow. All your messages,
            | photos, and audio can be leaked.
           | 
           | There are thousands of people at Apple involved in the
           | release of any single feature, from devs and testers, to
           | managers all the way up to VPs.
           | 
            | The fact that this feature made it to production proves
            | that Apple is an organization that I can't trust with my
            | privacy.
            | 
            | Their privacy team must be pulling their hair out. They
            | had better start looking for new jobs.
        
             | unityByFreedom wrote:
             | > Apple has now demonstrated that it is technically
             | feasible to overcome this challenge by doing things "pre-
             | encryption" on the user's device.
             | 
             | > From a legal stand point, they are now screwed, because
             | now they _must_ do it.
             | 
             | That's not the right takeaway. They're only required to
             | turn over data to which they have access. If they continue
             | writing software that makes it so they don't have access to
             | the data, then they do _not_ need to turn it over because
             | they can 't. The fact that you can write software to store
             | or transmit unencrypted data is irrelevant.
        
               | mrweasel wrote:
                | Because Apple writes the software, there will never be
                | a time when they do not, at some point, have access to
                | the data.
        
               | noduerme wrote:
               | Right, but there is a big difference between them having
               | the software in place already to steal files off a phone
               | for a government, versus a government telling them "you
               | must deploy this new software." In the past Apple has
               | said no to writing any new spyware for the government.
               | They would not be able to say no very easily if the
               | software is already on the devices.
        
               | shuckles wrote:
               | Apple has the software to "steal files off a phone"
               | through iCloud Backup. Whether they do this for
               | governments is a policy matter.
        
           | krumpet wrote:
           | I sold all the Netflix stock I had purchased when they
           | announced Qwikster. Woe is me!
        
         | TroisM wrote:
          | > This CSAM Prevention initiative by Apple is a 180 degree
          | change from their general message around privacy.
         | 
         | yes, but at least they are telling the truth now...
        
         | [deleted]
        
         | nextlevelwizard wrote:
          | I don't think this is a good metaphor. You still get the
          | same privacy you did before. Only your iCloud images are
          | now being scanned for visual matches. If you are already
          | trusting Apple with the rest of your private life, this is
          | hardly a complete 180-degree turn.
          | 
          | I don't quite understand why people are so against this.
          | What if some algorithm scans your images? If you are using
          | any cloud service to share your images, then your images
          | are already being scanned, and far more invasively. Is it
          | the storing of hashes that concerns you? Your passwords are
          | already stored as hashes, and we trust that they can't be
          | reversed back to plain text, so why wouldn't we trust that
          | these image hashes can't be reversed back into images?
          | 
          | I've seen some notion that this is a backdoor, and that
          | governments and other Lovecraftian entities could use it
          | to suppress freedom of speech by censoring certain images,
          | but that would require them to first have the image in
          | question, then create a hash of it, and then insert it
          | into Apple's database, and even then it would only be
          | flagged for human verification. Again, assuming you trust
          | Apple to handle the rest of your digital life (such as
          | iMessages), why wouldn't you trust Apple with your images?
          | If some entity has its hooks so deep in Apple that they
          | start to censor images, why wouldn't they just stop all of
          | your messages from being sent?
          | 
          | Please correct me if I'm missing something obvious.
        
           | goldenkey wrote:
           | > why wouldn't we trust that these image hashes can't be
           | reversed back into images?
           | 
           | You should avoid making political or moral arguments when
           | your technical knowledge is 10 feet below your ego. An image
            | is orders of magnitude larger than a password, meaning
            | many more collisions. Also, because it's a perceptual hash,
           | even more collisions. One does not simply reverse a
           | perceptual hash. Keep your hat on ;-)
        
             | nextlevelwizard wrote:
             | >You should avoid making political or moral arguments
             | 
             | I am not being political or moral. Why would you think
             | that? Is this one of those "since you don't immediately
             | agree with me it means you are the enemy and disagree on
             | everything" type things?
             | 
             | >An image is orders of magnitude larger than a password,
             | meaning that many more collisions.
             | 
              | The whole point of hashing algorithms is that they
              | produce unique hashes for different inputs. Wouldn't a
              | larger input value (the bits of the image) make it
              | harder to have collisions?
             | 
             | >One does not simply reverse a perceptual hash
             | 
              | So what actually is the privacy concern here? There is
              | no way to get the original image from the hash; at best
              | you can get an approximation of it.
             | 
             | >Keep your hat on ;-)
             | 
              | I guess this is supposed to be some kind of dig at me. No
             | need to be an asshole even if you don't agree with someone.
        
               | goldenkey wrote:
                | > The whole point of hashing algorithms is that they
                | produce unique hashes for different inputs. Wouldn't a
                | larger input value (the bits of the image) make it
                | harder to have collisions?
               | 
               | Standard cryptographic hashes should have certain
               | properties like
               | https://en.wikipedia.org/wiki/Avalanche_effect
               | 
               | However, a perceptual hash desires the opposite. Due to
               | the pigeonhole principle, a larger space mapping to a
               | smaller space, involves more collisions. In fact, all
               | hashes have infinite collisions, one just tries to design
               | hashes that don't have many collisions on smaller length
                | inputs. Ultimately though, if the hash has an n-bit
                | output, there will be on the order of 2^(m-n) inputs
                | colliding on each hash value once inputs are m bits
                | long. You can easily calculate this by comparing the
                | size of the input space to the size of the output
                | space.
               | 
               | https://en.wikipedia.org/wiki/Pigeonhole_principle
               | 
               | Perceptual hashes are essentially designed to collide on
               | similar data. The fine details are lost. An ideal
               | perceptual hash algorithm would quantize as many
               | alterable properties of an image as possible. Contrast,
               | brightness, edges, hue, fine details, etc. In the end,
               | you have a bunch of splotches in a certain composition
               | that form the low dimensional eigenbasis of the hash.
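                | 
                | A toy Swift sketch of one such scheme, the classic
                | 64-bit "average hash" (purely illustrative; Apple's
                | NeuralHash is a learned embedding, not this):
                | 
                |     // Assumes the image was already downsampled to an
                |     // 8x8 grayscale grid. Each bit records whether a
                |     // pixel is above the mean, so fine detail is
                |     // discarded by construction and near-duplicate
                |     // images collide.
                |     func averageHash(_ gray: [[UInt8]]) -> UInt64 {
                |         precondition(gray.count == 8 &&
                |             gray.allSatisfy { $0.count == 8 })
                |         let pixels = gray.flatMap { $0 }
                |         let mean = pixels.map(Int.init)
                |             .reduce(0, +) / pixels.count
                |         var hash: UInt64 = 0
                |         for (i, p) in pixels.enumerated()
                |             where Int(p) >= mean {
                |             hash |= 1 << UInt64(i)
                |         }
                |         return hash
                |     }
                | 
                |     // Similarity is Hamming distance, not equality.
                |     func distance(_ a: UInt64, _ b: UInt64) -> Int {
                |         (a ^ b).nonzeroBitCount
                |     }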
        
               | nextlevelwizard wrote:
                | Thanks for that. I still don't get why this is such a
                | big privacy concern for everyone.
        
         | afro88 wrote:
         | > It's like working for a food company that claims to use
         | organic, non-processed, fair-trade ingredients and in just a
         | day deciding that you're going to switch to industrial farming
         | sourcing and ultra-processed ingredients.
         | 
         | No it's not, it's more like in just a day they started
         | including a chemical preservative amongst all the organic
         | ingredients. All the good stuff is still there, it's just
         | pointless now for people who want to eat pure organic food.
         | 
         | I agree with the sentiment though.
        
         | swiley wrote:
         | Apple only had pro privacy PR. They were always working this
         | way. What you're witnessing is a PR change and not a technical
         | one.
        
           | fastball wrote:
           | Aren't a number of things E2EE with Apple?
        
             | swiley wrote:
              | The few things that are E2EE get backed up in
              | cleartext. Also, E2EE doesn't mean jack shit if someone
              | else owns the ends.
        
         | EGreg wrote:
         | Apple was always pro "think of the children", not just privacy.
        
         | beezischillin wrote:
          | If I were to speculate: I've seen lots of references to
          | how Apple reports far fewer of these cases to the
          | appropriate authorities than the rest of their competition.
          | It does kinda sound to me like this is the consequence of
          | external pressure. I can't prove it; it's only speculation,
          | of course.
         | 
         | I just don't think the messaging they had around this is
         | reassuring at all. While they had all sorts of technical
         | explanations as to how trustworthy this will all be because
         | they are in charge, the whole thing quickly went from "this is
         | only for iCloud photos" and only in the US to "3rd party
         | integration can happen" and "we're expanding it to other
         | countries". Which I guess is a logical next step but it is a
         | hint at expansion.
         | 
         | The walled garden becomes much less tempting after all of this,
         | especially right after a scandal like the Pegasus one.
        
         | masto wrote:
         | Apple was never pro-privacy. Apple has always been pro-Apple.
         | They will do whatever benefits them the most at any time. It
         | just so happens that with the failure of their own attempts to
         | start an advertising network, they saw an opportunity to spin
         | themselves as the privacy company and take digs at their
         | competitors. Nothing lasts forever. Maybe it's time for a new
         | spin.
        
           | m12k wrote:
           | Sure, but even if it was an illusion, they spent a lot of
           | time and energy building it up, just to burst it like this.
        
       ___________________________________________________________________
       (page generated 2021-08-13 23:01 UTC)