[HN Gopher] EFF Joins Global Coalition Asking Apple CEO Tim Cook...
       ___________________________________________________________________
        
       EFF Joins Global Coalition Asking Apple CEO Tim Cook to Stop Phone-
       Scanning
        
       Author : DiabloD3
       Score  : 330 points
       Date   : 2021-08-21 18:23 UTC (4 hours ago)
        
 (HTM) web link (www.eff.org)
 (TXT) w3m dump (www.eff.org)
        
       | bequanna wrote:
       | Does Tim Cook have a choice here?
       | 
        | I would be surprised to hear that the genesis of this idea was
        | inside Apple, rather than one or more governments pressuring
        | Apple to add this functionality for them.
       | 
       | It is also likely they even suggested that Apple should market
       | this as anti-pedo tech to receive the least pushback from users.
        
         | lrvick wrote:
         | Sure there is a choice.
         | 
          | They could open-source the OS, allowing independent auditing
          | and independent privacy-focused builds to emerge so people
          | have a way to opt out, as they have on Android via projects
          | like CalyxOS.
         | 
          | That is, of course, if privacy were actually a serious Apple
          | objective.
        
         | pmontra wrote:
         | Apple doesn't have a history of complying with every request
         | from governments and police. They care more about their bottom
         | line IMHO. So I was surprised by this "feature". I don't see
         | how it sells more phones for them. Actually it could scare away
         | some customers because of false positives. Everybody with small
          | kids risks a match on some of their children's pictures.
         | 
         | If Android also implements something like that I could end up
         | with a Linux phone as my main phone and an Android one at home
         | for the 2FA of banks and other mandatory apps. No WhatsApp but
         | I'll manage.
        
         | hoppyhoppy2 wrote:
         | The latest episode of _The Daily_ podcast [0] from The New York
         | Times said that Apple executives were told by members of
          | Congress at a hearing that if they didn't do something about
         | CSAM on their platform the federal government would force them
         | through legislation. And it's not a completely idle threat;
         | just look at the proposed EARN-IT Act of 2020 [1], which would
         | pretty much outlaw end-to-end encrypted services without a law
         | enforcement backdoor.
         | 
         | [0] https://www.nytimes.com/2021/08/20/podcasts/the-
         | daily/apple-...
         | 
         | [1] https://en.m.wikipedia.org/wiki/EARN_IT_Act_of_2020
        
           | sharken wrote:
            | Be that as it may, no proof has been presented that Congress
            | would dictate client-side scanning on users' devices.
           | 
            | But it still doesn't change the fact that Apple needs to
            | stop on-device scanning.
        
         | nullc wrote:
          | If Apple is searching users' private data due to pressure or
          | incentive from the government, that would make Apple an agent
          | of the government from the perspective of the Fourth
          | Amendment. As such, these warrantless searches would be
          | unlawful.
         | 
          | If what you suggest were true, we should be even more angry
          | with Apple: it would mean that rather than just lawfully
          | invading their users' privacy, they were participants in a
          | conspiracy to violate the constitutional rights of hundreds of
          | millions of Americans.
        
           | politelemon wrote:
            | The time to be angry with Apple was years ago when they
           | launched their false marketing campaign claiming privacy on
           | their closed devices. A lot of people fell for it, and were
            | happy to believe whatever they said. All the while they have
            | been two-face-timing: turning over user data to governments
            | (including the US) and putting user data on Chinese servers,
            | with among the highest data-turnover rates of anyone.
            | Everyone was happy to turn a blind eye to these happenings
            | as long as it didn't affect them.
           | 
            | We should be angry with _ourselves_. What's happening now is
            | that this has hit closer to home for a lot more people, who
            | are now dissecting every detail, blaming others, performing
            | mental gymnastics, and launching 'open letters' so that
            | their brand identity and perceptions aren't proven wrong.
            | Convincing them to roll back their recent changes will not
            | somehow make Apple devices private, when they were never
            | private to begin with.
        
             | SCUSKU wrote:
             | What can I, as an individual, do to protect my digital
             | privacy then? Would that mean ditching the Apple ecosystem,
             | and going full linux laptop & phone? As much as I would
             | love to buy a pinephone, they just don't seem like a truly
             | viable alternative...
        
           | bequanna wrote:
           | Americans, Chinese, Europeans, everyone. This is an equal-
           | opportunity privacy rights violation. Investigatory and spy
           | agencies around the world must be thrilled that this is
           | moving forward.
           | 
           | Of course, we'll accept it and the dullest of us will even
           | cheer it as "doing the right thing" while stating that they
           | "have nothing to hide".
        
       | slownews45 wrote:
        | Here is a quote from the EFF complaint about Apple alerting
        | parents to kids under 13 being sent porn:
       | 
       | "The recipient's parents will be informed of the content without
       | the sender consenting to their involvement."
       | 
       | Are parents really outraged by this? Are people sending porn
       | upset about this?
       | 
        | My own view, as a parent, is that if you send porn to my kids,
        | I shouldn't NEED your consent to be alerted to this. I paid for
        | the device, so it should do what I want, and if I turn this
        | feature on then it should alert me.
        
         | shuckles wrote:
         | For ideological consistency, I can only hope the EFF has a
          | lecture prepared about two-party consent for kids who show
         | their phones to their parents if they receive a dick pic.
        
           | slownews45 wrote:
           | I think they are coming at this from the view that because
           | iMessage was one of the first actual E2E encrypted products,
            | showing the pic to the parent (after the picture has reached
            | the end-user device, been decrypted, and is ready to
            | display) should require consent. I.e., the encryption has
            | been "broken".
           | 
            | That said - I don't find it very compelling. Parent paid for
            | the device. Parent is responsible for the child. Parent
            | should be allowed to control the device (including setting
            | up a kid's account instead of an adult's, which triggers
            | this, etc.). So I want to preserve MY right to both the
            | device and to take care of my child. EFF seems to be
            | focusing oddly on the rights of someone who DIDN'T pay for
            | the device.
        
             | shuckles wrote:
             | The picture is not sent to the parent. They are notified
             | that the child viewed a sensitive photograph, after warning
             | the child, and the parent can view the photograph on the
             | child's device.
        
       | mrtksn wrote:
       | The logical argument against this tech is solid but I despise it
       | because it materializes the theological idea of "always being
       | watched".
       | 
        | There's always the option to be part of the government and be
        | on the side that takes advantage of this tech to take out
        | rivals, make your job easier, etc., but the implementation of
        | always-watching devices is a hill to die on.
        
       | OzzyB wrote:
       | Just EFF'ing sign it.
        
       | tpush wrote:
       | > [...] and the parental notification system is a shift away from
       | strong end-to-end encryption.
       | 
       | That particular statement doesn't make much sense to me. The
       | parental notification system is just a frontend action (one of
       | many, like link previews and such). What does that have to do
       | with iMessage's encryption?
       | 
       | I can see an argument about a shift away from _privacy_ (though
       | it only pertains to minors under 13 receiving sexually explicit
        | images). But I think it's misleading to say that it relates to
       | iMessage's encryption in any way.
        
         | shuckles wrote:
         | EFF would probably argue that technologies like safe browsing
         | are also a shift away from E2E encryption. They were strongly
         | against email spam protection in the 90s for this reason.
        
           | aaomidi wrote:
           | You mean when they said blanket banning email lists is bad
           | and shouldn't be done? Because yeah that's still true.
        
             | shuckles wrote:
              | That was in the 2000s, a different issue, and not opposed
              | on privacy grounds; there they argued they were defending
              | freedom of speech. In the 90s the argument was against
              | early probabilistic models.
        
           | vimacs2 wrote:
           | Framing the EFF's position as being against browser safety or
           | spam protection is disingenuous.
        
             | shuckles wrote:
              | I didn't say they were against either. I said their
              | opposition to iMessage parental control features as a
              | change to E2E encryption translates directly to other
              | messaging-client safety features that may reveal the
              | content of communication to a third party. (Nobody in the
              | discourse seems to take their FUD about iMessage
              | seriously, given that the focus has largely been on known
              | CSAM detection.)
             | 
             | In particular, you can be concerned about E2E encryption
             | being compromised while still believing parents should have
             | some transparency over whether their kids are receiving
             | explicit images. Not clear EFF offers a solution that meets
             | both, but I did not say they are opposed to the latter.
        
         | GuB-42 wrote:
         | It is not about the encryption, it is about what an "end" is.
         | 
         | Generally, we consider the "end" to be the end user. If someone
         | else can see the message along the way, it is not end to end
         | anymore from a user perspective, even if it is from a network
         | perspective. And Apple has complete control over your device
          | through software updates. So whether the leak is from your
          | device or from the network is a mostly meaningless technical
          | detail.
        
           | simondotau wrote:
           | As a parent, I consider myself the "end user" of my child's
           | device, not my child. That I might be looped in on any
           | messages sent or received by this device is not at all a
           | leak, it's a convenience--much like how I can receive
           | messages sent to me on my phone and my laptop.
        
             | GuB-42 wrote:
             | When someone sends a message to your child, the message is
             | for your child, not for you. Your child is the "end point"
              | and you are an eavesdropper.
             | 
             | This is a case where I think it is justified, as long as
              | your child is a minor and you are their legal guardian. But
             | as acceptable as it is, you are still a spy and the app is
             | spyware.
             | 
              | The fear, justified or not, is that the same feature that
             | can be used for parental control can also be used on adults
             | without their consent.
             | 
              | Personally, I think that right now the fears are
              | overblown, but I also think that Apple got the backlash
              | they deserved. Privacy is not to be taken lightly; it is
              | something that both protects freedom and helps criminals,
              | and it is a strong political stance. It is not just a
              | marketing tool against Google.
        
           | simondotau wrote:
           | > _And Apple has complete control over your device through
           | software updates._
           | 
           | This has been true of all operating systems with integrated
           | software updates since the advent of software updates. In
           | this respect, nothing has changed for over a decade.
        
           | tpush wrote:
           | But the message isn't being sent anywhere. The parental
           | notification stuff doesn't do that.
           | 
           | And obviously both the sender and receiver software processes
           | the unencrypted messages in a variety of ways.
        
         | dannyobrien wrote:
         | They went into this in more detail in 2019:
         | https://www.eff.org/deeplinks/2019/11/why-adding-client-side...
        
       | robertoandred wrote:
       | After the EFF lied about the system in the first place, does
       | anyone care what they're asking for now? What they're really
       | asking for is donations.
        
         | insomniacity wrote:
         | What did the EFF lie about? Missed that...
        
           | shuckles wrote:
            | 2/3rds of their original letter was spent describing a
            | parental control for sensitive-content detection in iMessage
            | as an end-to-end encryption backdoor. At best, that is
            | highly cynical. At worst, it is an intentional conflation of
            | different features to raise a false alarm.
        
             | dannyobrien wrote:
             | It's not cynical at all. It's what EFF has been warning
             | about since 2019 and before. (Disclosure: I worked at EFF
             | during this period. We were extremely concerned about the
             | potential role of client-side scanners and the concerted
             | push being made at that time by the intelligence community
             | to present this as a "solution" to end-to-end encryption's
             | privacy features.)
             | https://www.eff.org/deeplinks/2019/11/why-adding-client-
             | side...
             | 
             | It's also the consensus position of a very large number of
             | other digital rights groups and infosec experts. Do you
             | believe they are also cynical and raising a false alarm?
        
               | shuckles wrote:
               | Your link and most of the concern is about known CSAM
               | detection announced for iCloud Photo Library, yet, again,
               | 2/3rds of the original letter was about iMessage. Point
               | me to the expert consensus that the parental control
                | features announced were a threat to end-to-end
               | encryption.
               | 
                | The iMessage feature is a parental control where kids
                | over 13 who receive images classified on-device as
                | sensitive have to click through to unblur them, and kids
                | under 13 will do the same and also have their parents
                | receive a notification that such an action was taken.
                | The parent in any case does not receive a copy of the
                | image.
               | The EFF described it as such:
               | 
               | "Whatever Apple Calls It, It's No Longer Secure
               | Messaging"
               | 
               | and
               | 
               | "Apple is planning to build a backdoor into...its
               | messaging system."
               | 
                | The Center for Democracy and Technology, who wrote the
                | letter the EFF co-signed, said:
               | 
               | "Plan to replace its industry-standard encrypted
               | messaging system creates new risks in US and globally"
               | 
                | I, respectfully, don't see much evidence that these are
                | consensus views. Furthermore, I don't see how you can
                | characterize this feature as a backdoor without
                | believing safe browsing checks on links received in
                | iMessage are an encryption backdoor.
        
       | soziawa wrote:
       | This is based on pure speculation, but could it be that Apple
       | believes the covid tracking functionality has shifted power from
       | governments to Apple?
       | 
        | Many governments (e.g. Germany) wanted location-based instead
        | of token-based tracking, with central storage of location data,
        | pretty much up until the point where Apple and Google said that
        | it won't happen*.
        | 
        | * This is based on my perception of German tech media coverage
        | of the issue.
        
         | tgsovlerkhgsel wrote:
         | > the covid tracking functionality has shifted power from
         | governments to Apple?
         | 
         | IMO this is off topic, but I think it absolutely has done that,
         | and has demonstrated it. I'm surprised governments haven't
         | screamed about it more loudly, but maybe they didn't want to do
         | that in an example where they were clearly on the wrong side
         | (pushing for privacy-violating approaches).
        
       | nullc wrote:
       | I wrote a message explaining why I won't be posting any more
       | neuralhash preimages and an index/summary of my posts on the
       | subject:
       | 
       | https://news.ycombinator.com/item?id=28261147
       | 
       | Unfortunately it got flagged after getting 41 upvotes. :(
        
       | slownews45 wrote:
        | I feel like this story keeps on showing up on HN.
       | 
       | Does anyone have a link to previous discussion on this topic?
       | Might help avoid endless re-hashes of the same arguments and
       | misinformation.
        
         | systemvoltage wrote:
          | I've come to abhor the term `misinformation`. I know your
          | intent, but Twitter/FB have been using it in nefarious ways.
          | Anytime I hear it, it triggers censorship-related thoughts
          | for me.
        
           | slownews45 wrote:
           | I agree! It's been misused to mean things someone doesn't
           | agree with too often :)
           | 
            | In this case we have been getting tons of claims, with
            | almost no foundation, that what Apple is doing will see
            | people charged with child porn felonies, etc.
            | 
            | We've also had lots of claims based on a failure to actually
            | read about what Apple is doing (i.e., that it will scan all
            | content, not just what is scheduled for upload to iCloud,
            | etc.).
        
             | zepto wrote:
             | Sure, but as someone who agrees with you, I'm not sure we
             | can call that 'misinformation'. It's just poor
             | understanding and bad arguments.
             | 
             | I have seen a little misinformation relating to this
             | subject - direct lies, fake news, fake links etc, but a
             | negligible amount.
        
               | slownews45 wrote:
                | All good points. Yeah, we should stop using the term, I
                | think, given it's lost a lot of its meaning. But a fair
               | bit of the discussion of this topic has been badly
               | misinformed.
        
               | zepto wrote:
               | I agree, but that is a feature of this particular
               | problem.
               | 
               | For example, I see a lot of people who are simply wrong
               | about how the hash matching works. One way to be wrong is
               | to think it's a cryptographic hash. Another way to be
               | wrong is to think it's just a perceptual hash match.
               | 
               | The problem is that these are both not crazy. The actual
               | solution is far more complex and non-obvious than most
               | people would suspect.
               | 
               | I think this is a genuine problem with the system. It is
               | hard for almost anyone to imagine how it _could_ be
               | trustworthy.
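                | 
                | To illustrate the two wrong mental models with a toy
                | (this is SHA-256 plus an 8x8 average hash, NOT Apple's
                | actual algorithm; every name here is a stand-in):
                | 
                |   import hashlib
                |   import numpy as np
                | 
                |   def crypto_hash(data: bytes) -> str:
                |       # Cryptographic: one flipped bit changes the
                |       # whole digest.
                |       return hashlib.sha256(data).hexdigest()
                | 
                |   def average_hash(px, hs=8):
                |       # Perceptual (aHash-style): average hs x hs
                |       # blocks, threshold at the mean. Similar images
                |       # give similar bits.
                |       h, w = px.shape
                |       bh, bw = h // hs, w // hs
                |       m = px[:bh*hs, :bw*hs].reshape(hs, bh, hs, bw)
                |       m = m.mean(axis=(1, 3))
                |       return (m > m.mean()).flatten()
                | 
                |   rng = np.random.default_rng(0)
                |   img = rng.integers(0, 200, (64, 64)).astype(float)
                |   brighter = img + 10.0  # visually near-identical
                | 
                |   # Digests totally different:
                |   print(crypto_hash(img.tobytes()) ==
                |         crypto_hash(brighter.tobytes()))  # False
                |   # Perceptual bits identical:
                |   print((average_hash(img) !=
                |          average_hash(brighter)).sum())   # 0
                | 
                | Apple's system is neither of these alone: the match is
                | done against a blinded database with the result hidden
                | from the device, which is exactly the part that is hard
                | for anyone to audit.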
        
               | slownews45 wrote:
               | Apple have been pretty clear that a human will review
               | things. That is (I hope) a feature of any system that
                | leads to child porn charges. If not, it should be - AI
               | gets things wrong and can be tricked (as can humans but
               | in different ways usually).
               | 
                | But agreed, the technical bits may not be obvious
                | (though Apple released what I thought was a pretty darn
                | complete paper on how it all works).
        
               | zepto wrote:
               | Sure - but who wants a human reviewing their private
               | photos? I don't.
               | 
                | Unless you understand the technical bits you have no way
                | of knowing how rarely this is likely to happen in
                | practice.
        
               | slownews45 wrote:
                | Not me, but if they only review after a flag, that's the
                | best you can do, I think? Facebook works this way too.
                | Users flag photos and someone looks at them and deals
                | with them.
        
           | [deleted]
        
         | dessant wrote:
         | This is a new development, please don't try to suppress news.
        
           | slownews45 wrote:
           | I'm not trying to suppress news.
           | 
           | There have been a ton of letters, calls to action, complaints
           | and outrage about apple's actions.
           | 
           | Many of the conversations are HIGHLY repetitive.
           | 
           | I would strongly urge folks to review past discussions before
           | re-hashing things (again) or focus on new elements.
           | 
           | And these headlines "Global Coalition"? I hope folks realize
           | there is also a "global coalition" of governments that are
           | working to do much more to get access to your information.
            | Australia, China, the EU (yes, go on about GDPR, but actual
            | privacy - forget about it), the UK, etc.
           | 
           | There may also be a "global coalition" of users who like
           | apple's products.
           | 
            | A better headline would be "EFF writes open letter to Tim
            | Cook with 90 other civil liberties groups"?
           | 
           |  _Edited to list prior topics on HN about this:_
           | 
           | Apple's plan to "think different" about encryption opens a
           | backdoor to your life
           | 
           | EFF: https://www.eff.org/deeplinks/2021/08/apples-plan-think-
           | diff... 2260 points|bbatsell|16 days ago|824 comments
           | 
           | Apple Has Opened the Backdoor to Increased Surveillance and
           | Censorship
           | 
           | EFF: https://www.eff.org/deeplinks/2021/08/if-you-build-it-
           | they-w...
           | 
           | 618 points|taxyovio|10 days ago|303 comments
           | 
           | Tell Apple: Don't Scan Our Phones
           | 
           | EFF: https://act.eff.org/action/tell-apple-don-t-scan-our-
           | phones
           | 
           | 161 points|sunrise54|4 days ago|30 comments
           | 
           | Is EFF opposition really new news?
           | 
            | I've also noticed this weird formulation more often: "Please
            | stop xxxx" rather than an actual discussion of a topic.
            | Isn't it ironic to try and suppress a conversation by
            | claiming someone else's comments are suppression?
        
             | stingraycharles wrote:
             | I think it's important to distinguish the comments from the
             | submission. I agree the comments are mostly repetitive, but
             | the submissions do, in fact, cover newsworthy facts that
             | should not be dismissed.
        
               | slownews45 wrote:
                | This is at least the fourth piece about the EFF
                | objecting to Apple's work.
               | 
               | I was just mentioning - maybe it's worth reading the
               | comments on the FOUR prior articles about this EXACT
               | entity complaining about Apple (ignoring the other 30+ HN
               | posts on this) for this specific issue.
               | 
               | This is not "suppressing the news".
        
             | mlosgoodat wrote:
             | If the human element to this story was "cat gifs", you
             | might have a point.
             | 
             | The human element here is "world governments and
             | multinational corps beholden to nobody want to insert the
             | first of possibly many back doors into your emotional life"
             | 
             | Sit down. You're just looking for attention for your
             | ability to notice a basic pattern.
        
               | slownews45 wrote:
               | I find it very ironic that the people complaining about
               | "suppressing news" or "violation of rights" are writing
               | comments along the lines of sit down and shut up.
        
               | Dylan16807 wrote:
               | They're not telling you to stop any meaningful discussion
               | about the issues.
        
               | mlosgoodat wrote:
               | "Person on internet can perform basic arithmetic,
               | demonstrates by counting instances of repeat content." is
               | a headline that:
               | 
               | 1) we've seen over and over
               | 
               | 2) adds little to the context at hand
               | 
                | You're not wrong. You're late to the game spotting
                | patterns of reposting (it happens; deal with it
                | yourself) and undermining the context (putting it on
                | others to do better by your expectations).
               | 
               | You have a body and mind that can be trained to find and
               | reflect on more interesting patterns and ideas. Have at
                | it. Don't bemoan all of reality not conforming to your
                | current state of cognitive awareness.
        
             | jdjd6 wrote:
              | Please don't point out that by using the word "please"
              | someone can pseudo-politely insinuate that their opinion
              | is correct and anything you say is wrong.
        
         | rootusrootus wrote:
         | You can't stop it. Once a topic becomes popular on HN, any
         | plausibly related submission will instantly garner enough
         | upvotes to put it on the front page. Eventually it will fade
         | somewhat. Same thing happens with Boeing-related topics, for
         | example.
        
         | wskinner wrote:
         | https://hn.algolia.com/?dateRange=pastMonth&page=0&prefix=fa...
        
           | slownews45 wrote:
           | Wow - 11 stories with tons of upvotes (lots more with fewer).
           | 
           | A fair number are other "open letters" as well.
           | 
           | Helpful.
        
       | azinman2 wrote:
       | > As we've explained in Deeplinks blog posts, Apple's planned
       | phone-scanning system opens the door to broader abuses. It
       | decreases privacy for all iCloud photo users, and the parental
       | notification system is a shift away from strong end-to-end
       | encryption. It will tempt liberal democratic regimes to increase
        | surveillance, and likely bring even greater pressures from regimes
       | that already have online censorship ensconced in law.
       | 
        | What's clear is the potential for future abuse mandated by
        | various governments (though that could easily be the same even
        | without all these CSAM measures... not a new threat). However,
        | the other things are erroneously lumped in with it. It only
        | weakens privacy for iCloud photos if you have CP or photos in a
        | CP database, and it doesn't prevent E2E with texting... it just
        | gives an additional parental control on top of the numerous
        | existing parental controls (under which kids have no privacy
        | already), and is totally unrelated to the CSAM effort.
       | 
       | I admire the EFF in many ways, but I wish they'd be more exact
       | here given their access to expert knowledge.
        
         | sennight wrote:
         | > ...if you have CP or photos in a CP database...
         | 
         | Which database? I get the impression that people think there is
         | a singular repository for thoroughly vetted and highly
         | controlled CP evidence submission. No such thing exists.
        
           | azinman2 wrote:
            | It's an intersection between a US db and a not-yet-chosen
            | non-US db, and a human reviewer will then verify it is CP
            | before sending it off to the authorities.
        
             | sennight wrote:
             | > What more could one ask for?
             | 
              | An independent audit for both the secret secondary
              | perceptual hashing algorithm and the chain-of-custody
              | policies/compliance for the "US db" and the
              | disconcertingly open-ended "not yet chosen non-US db"?
        
               | Grustaf wrote:
               | What's the point of that? If you don't trust Apple, why
               | would you use Photos.app in the first place? They already
               | have 100% control over that, and can spy as much as they
               | want to. No need to go by way of the CSAM database, that
               | would be absurd.
        
               | sennight wrote:
               | I've never been a customer of Apple but I'll try and
               | imagine the experience... I might trust them to assemble
               | hardware and write software for my consumer needs - but
               | that doesn't mean I trust them to competently reason
               | about me potentially being a pedo. That is only a small
               | part of a much larger point, but it is reason enough
               | alone.
        
               | azinman2 wrote:
               | Do you ask for the same audit at Facebook, Google,
               | Microsoft, Dropbox, and countless others who are already
               | doing this and have been for years?
               | 
                | I do not share your concern about some abused db
                | _today_.
        
               | sennight wrote:
               | Sure, but I don't expect it from third party cloud
               | platforms - in the same way I wouldn't expect
               | accountability from a garbage man who reports to the
                | police after finding evidence of a crime in my garbage.
               | Apple is, for some insane reason, trying to establish the
               | precedent that the contents of your Apple product are now
               | part of the public space - where expectation of privacy
               | isn't a thing.
        
               | azinman2 wrote:
                | But that isn't true. This only applies to photos
                | uploaded to iCloud.
        
               | sennight wrote:
               | Why would you lie about something so easily disproven?
               | 
               | "Instead of scanning images in the cloud, the system
               | performs on-device matching..."
               | 
               | https://www.apple.com/child-safety/
        
               | azinman2 wrote:
               | Lie? I don't take kindly to such words, because you're
               | ascribing malicious intent where there is none. Please
               | check your tone... HN comments are about assuming the
               | best in everyone.
               | 
                | This only applies to photos uploaded to iCloud. Every
               | single thing talks exactly about that, including the
               | technical details: https://www.apple.com/child-
               | safety/pdf/CSAM_Detection_Techni...
               | 
               | The hash matching is occurring on device, but only for
               | iCloud photo images:
               | 
               | > Before an image is stored in iCloud Photos, an on-
               | device matching process is performed for that image
               | against the database of known CSAM hashes. This matching
               | process is powered by a cryptographic technology called
               | private set intersection, which determines whether there
               | is a match without revealing the result. The device
               | creates a cryptographic safety voucher that encodes the
               | match result. It also encrypts the image's NeuralHash and
               | a visual derivative. This voucher is uploaded to iCloud
               | Photos along with the image.
               | 
               | Read that PDF. You'll see everything in it is designed
               | for iCloud photos only.
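                | 
                | Schematically, the flow that PDF describes looks like
                | the sketch below. This is a runnable toy, not Apple's
                | construction: real PSI hides the match result
                | cryptographically from both sides until the threshold;
                | here a keyed server-side lookup merely stands in for
                | that, and every name is a placeholder.
                | 
                |   import hashlib
                | 
                |   def toy_neural_hash(image: bytes) -> bytes:
                |       # Stand-in for NeuralHash (which is perceptual,
                |       # not cryptographic).
                |       return hashlib.md5(image).digest()
                | 
                |   def blind(h: bytes) -> bytes:
                |       return hashlib.sha256(b"srv-key" + h).digest()
                | 
                |   KNOWN_BAD = [b"bad-1", b"bad-2"]
                |   SERVER_DB = {blind(toy_neural_hash(x))
                |                for x in KNOWN_BAD}
                | 
                |   def device_voucher(image: bytes) -> dict:
                |       # On device, only for images queued for iCloud.
                |       # Without SERVER_DB the device can't tell whether
                |       # its own hash matched.
                |       return {"blinded": blind(toy_neural_hash(image)),
                |               "derivative": image[:16]}
                | 
                |   def server_review(vouchers, threshold=30):
                |       hits = [v for v in vouchers
                |               if v["blinded"] in SERVER_DB]
                |       # Below the threshold, nothing is surfaced; past
                |       # it, derivatives would go to human review.
                |       return hits if len(hits) >= threshold else []
                | 
                |   vouchers = [device_voucher(b"bad-1")] * 30
                |   print(len(server_review(vouchers)))  # 30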
        
               | sennight wrote:
               | > Lie? I don't take kindly to such words, because you're
               | ascribing malicious intent where there is none.
               | 
               | Howdy partner, it seems my tone communicated the
               | sentiment as intended. It is unreasonable to ascribe
               | anything but malice in this case, because the rationale
               | strains credulity. Do you know at what point Apple is
               | determining your intent to sync photos? What processes,
               | exactly, are they hooking into in order to trigger the
               | local hash functionality? Is it the resource hogging
               | Photos-Agent frequently complained about? Is this
               | thresholding functionality restricted to cloud content,
               | or is the local database mentioned involved? Does that
               | local database only relate to images uploaded to the
               | cloud, and how would the deletion of local and/or cloud
               | content influence that?
               | 
               | These are all questions that are either unaddressed,
               | provide no assurance beyond "trust us", or hint at
               | logical conflicts between the stated purpose of local
               | scanning and the actual implementation details. With all
                | that in mind, granting Apple and its defenders the
                | benefit of the doubt is laughably foolish.
        
               | simondotau wrote:
               | It is not a lie. The scanning is done on device, but
               | photos are not scanned unless they are going to be
               | uploaded to iCloud. Apple has explicitly stated this.
        
               | sennight wrote:
               | Oh, well if Apple says... I'm sure their statement
               | somehow completely aligns with all the potentially
               | conflicting interpretations one can draw from their PR,
               | their stated objectives, and the implementation details
               | observed, and it always will - forever.
        
               | simondotau wrote:
               | Apple has released fairly detailed technical summaries of
               | their system, far beyond what could be hidden behind
               | "conflicting interpretations" of material written by a PR
               | department. Have you read them? Are you claiming that
               | Apple is lying?
               | 
               | If your contention is that Apple is lying now, then you
               | have no reason to think Apple--or any other corporation
               | for that matter--hasn't been lying about your data
               | security for the past decade. Who knows, maybe Google
               | Chrome is sending everyone's passwords in plain text to
               | the NSA.
               | 
               | If your contention is that Apple might turn evil in the
                | future, that charge could be levied against any other
                | company at any time. It's functionally
               | unfalsifiable.
        
               | amadeuspagel wrote:
               | Neither Google nor Microsoft scan pictures people have on
               | their devices running Android or Windows. I'm not sure
               | how that's even applicable to Facebook and Dropbox.
        
               | azinman2 wrote:
               | You're asking for an auditing chain presumably due to
               | concerns about governments putting in photos of things
               | that aren't CP. Apple is only doing this for photos that
               | get uploaded to iCloud, with this neural hash that gets
               | uploaded with it. The actual verification of a CP match
               | occurs on the server due to the hashes being blinded. So
               | in many ways, it's very similar to what these other cloud
               | providers effectively do -- search for CP matches on
               | uploaded data.
               | 
                | If you're concerned about non-CP being scanned for, then
                | you should already be concerned about that with everyone
                | else. Thus if you're asking for auditing of Apple, you
                | should widen your request. If you do, then sure, I can
                | understand that. If you're not concerned about the
                | existing system, then I think you're being inconsistent.
               | 
                | Most people in the comments to me seem to be
                | inconsistent, and are being overly knee-jerk about
                | this... myself included initially.
        
               | feanaro wrote:
               | We should definitely start. Also, do those companies
               | implement such subversive technology in devices they sell
               | you?
        
         | kurthr wrote:
         | "it just gives an additional parental control in addition to
         | the many numerous parental controls (with them kids have no
         | privacy already)"
         | 
         | Wait, sending data (of matching CP hashes) to law enforcement
         | is parental control?
        
           | aesh2Xa1 wrote:
            | Yours is a very fair negative reaction. The information in
            | the EFF's letter includes a portion where it seems to be
            | concerned only about alerting parents [1]. I think many
            | parents would find that reasonable. However, the fact that
            | the information
           | will also be sent to the government [2] is just plainly an
           | abuse of privacy, goes outside of the relationship between
           | parent and child, and I do not imagine that parents would
           | find that reasonable.
           | 
           | > [1] Moreover, the system Apple has developed assumes that
           | the "parent" and "child" accounts involved actually belong to
           | an adult who is the parent of a child, and that those
           | individuals have a healthy relationship. This may not always
           | be the case; an abusive adult may be the organiser of the
           | account, and the consequences of parental notification could
           | threaten the child's safety and wellbeing. LGBTQ+ youths on
           | family accounts with unsympathetic parents are particularly
           | at risk. As a result of this change, iMessages will no longer
           | provide confidentiality and privacy to those users through an
           | end-to-end encrypted messaging system in which only the
           | sender and intended recipients have access to the information
           | sent.
           | 
           | > [2] When a preset threshold number of matches is met, it
           | will disable the account and report the user and those images
           | to authorities.
        
           | snowwrestler wrote:
           | Apple announced two separate things in one press release: a
           | CSAM-scanning system, and a parental control that uses AI to
           | attempt to detect nude pictures in iMessages and alert the
           | parents. The latter system does not send any info to Apple or
           | any authorities.
        
         | lamontcg wrote:
         | > It only weakens privacy for iCloud photos if you have CP or
         | photos in a CP database
         | 
         | Or if someone hacks your device and uploads one of those photos
         | to your iCloud.
         | 
         | (I still have no idea why people aren't pointing this out more
         | aggressively -- phones get hacked probably every minute of
         | every day, and acquiring those actual photos isn't that
         | difficult once you're willing to commit felonies -- hash
         | collisions are a distraction)
        
           | azinman2 wrote:
           | Because that then would already be a problem for Facebook,
           | Google, Microsoft, etc that host photos that hacked phones
           | could be uploading today.
           | 
           | And we're just not seeing that being the case. Because all
           | these providers have been doing this for so many years,
           | including the nearly 17 million photos identified by Facebook
           | last year, you'd figure there would be a lot more noise if
           | this was really going on.
           | 
              | In fact, I would venture to say it's far easier to hack a
              | Facebook account than an iCloud account, which has many
              | on-device security protections that a cloud login without
              | mandatory 2FA (often SMS-based when it is used) lacks.
        
             | lamontcg wrote:
             | We had SWAT teams for a long time before SWATing became
             | popular. The publicity that this has gotten is only going
             | to increase the chances that all these services start
             | getting abused. And who is to say that it hasn't happened
             | already and been entirely successful, but nobody believed
             | the victim.
        
               | azinman2 wrote:
                | So you think we're going to see a rise in people
                | uploading CP to other people's cloud accounts?
        
               | lamontcg wrote:
               | Yes. I would bet the large majority of people here who
               | are now quick to point out that other cloud providers
               | have been doing this for years didn't know that fact a
               | month ago. We're now well armed with that information due
                | to arguing about this Apple issue. The fact that you're
                | so quick to inform me of the facts is precisely why I
                | think it's more likely to happen.
        
               | tgsovlerkhgsel wrote:
               | I'm honestly surprised that's not a much more common way
               | of griefing (either specific or random targets) or
               | disabling accounts to deny a victim access to their
               | e-mail after an attacker has gotten access, used it to
               | reset passwords, and is now abusing the linked accounts
               | (with the added benefit that the victim may be too busy
               | being arrested to deal with the fraud).
               | 
               | Probably because the group that has/is willing to handle
                | CSAM is small enough that it doesn't overlap much with
                | the other groups (e.g. account hijackers, or people who
                | want to grief a specific person and have the necessary
               | technical skills and the patience to actually pull it
               | off). For criminals it may not be worth the extra heat it
               | would bring, but from what I've heard about 4chan, I'm
               | surprised there is not a bigger overlap among "for the
               | lulz" griefers.
        
         | jtbayly wrote:
         | > It only weakens privacy for iCloud photos if you have CP or
         | photos in a CP database
         | 
         | Or people who have photos that hash the same as CP.
        
           | smoldesu wrote:
           | Or possess tampered photos that were engineered to be a hash
           | collision.
        
             | azinman2 wrote:
             | You'd have to not only have over 30 hash collisions, but
             | also have it collide with another secret hash function, and
             | then also have a human look at it and agree it's CP.
             | 
             | So what's the actual realistic issue here? This keeps
             | getting thrown around as if it's likely, yet not only are
             | there numerous steps against this in the Apple chain, this
             | would already be a huge issue with Dropbox, Facebook,
             | Microsoft, Google, etc who do CP scanning according to all
             | of the comments on HN.
        
               | nullc wrote:
               | > You'd have to not only have over 30 hash collisions
               | 
               | That's trivial. If the attacker can get one image onto
               | your device they can get several.
               | 
               | It's very easy to construct preimages for Apple's neural
                | hash function, including fairly good looking ones (e.g.
                | https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...
                | )
               | 
               | > collide with another secret hash function
               | 
               | The supposed other 'secret' hash function cannot be
               | secret from the state actors generating the databases.
               | 
               | Also, if it has a similar structure/training, it's not
               | that unlikely that the same images would collide by
               | chance.
               | 
               | > also have a human look at it and agree it's CP
               | 
                | That's straightforward: simply use nude or pornographic
                | images which look like they could be of children, or
                | ones where without context you can't tell. It's a felony
                | for them to fail to report child porn if they see it in
                | review, and the NCMEC guidance tells people to report
                | when in doubt.
               | 
               | Besides, once someone else has looked at your pictures
               | your privacy has been violated.
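                | 
                | To make the "preimages are cheap" point concrete
                | without touching NeuralHash at all, here's the shape of
                | the attack against a toy 64-bit average hash (far
                | weaker than NeuralHash, which needs gradient tricks
                | instead of this greedy loop, but the economics are
                | similar):
                | 
                |   import numpy as np
                | 
                |   def average_hash(img, hs=8):
                |       h, w = img.shape
                |       bh, bw = h // hs, w // hs
                |       m = img[:bh*hs, :bw*hs].reshape(hs, bh, hs, bw)
                |       m = m.mean(axis=(1, 3))
                |       return (m > m.mean()).flatten()
                | 
                |   def forge_preimage(src, target, hs=8, step=2.0):
                |       # Nudge each block's brightness toward the side
                |       # of the mean its target bit demands, until the
                |       # hashes match. The result still resembles src.
                |       img = src.astype(float).copy()
                |       h, w = img.shape
                |       bh, bw = h // hs, w // hs
                |       for _ in range(500):
                |           bits = average_hash(img, hs)
                |           if np.array_equal(bits, target):
                |               break
                |           for k in np.flatnonzero(bits != target):
                |               i, j = divmod(k, hs)
                |               d = step if target[k] else -step
                |               img[i*bh:(i+1)*bh,
                |                   j*bw:(j+1)*bw] += d
                |           img = img.clip(0, 255)
                |       return img
                | 
                |   rng = np.random.default_rng(1)
                |   target = average_hash(
                |       rng.integers(0, 256, (64, 64)).astype(float))
                |   src = rng.integers(0, 256, (64, 64)).astype(float)
                |   forged = forge_preimage(src, target)
                |   # True: greedy search converges on this toy hash.
                |   print(np.array_equal(average_hash(forged), target))
                |   print(np.abs(forged - src).mean())  # modest change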
        
               | azinman2 wrote:
               | If this really was such a problem, then as I said, we'd
               | have been getting reports of this over the past 10+ years
               | it's already been in place at big cloud providers. So
               | where is all of this ruining of peoples lives by
               | uploading CP on their devices?
               | 
               | Also if you're a gov actor trying to frame someone, why
               | bother with a pre-image when you could put the real
               | images on it?
               | 
               | None of that is new today -- all that's new is Apple is
               | joining the effort to scan for CSAM, and instead of doing
               | it on server they're doing it on device right before you
               | upload in a way that attempts to be more secure and
               | private than other efforts.
        
               | feanaro wrote:
               | > So where is all of this ruining of peoples lives by
               | uploading CP on their devices?
               | 
               | Once the capability is in place on everyone's devices,
               | how are we supposed to guarantee it will never be used
               | maliciously? Just say no to the capability.
               | 
               | > Also if you're a gov actor trying to frame someone, why
               | bother with a pre-image when you could put the real
               | images on it?
               | 
               | Because the capability for this is now built-in in
               | everyone's phones.
        
               | jliptzin wrote:
               | What do you mean? People are arrested all the time for
               | having CP on their machines, I see it in the news
               | frequently. Impossible to know how many of them could
               | have just been framed, no one is giving the benefit of
               | the doubt to an accused pedo. And it never goes to trial
               | due to the possibility of enormous prison sentences. If
               | you're innocent would you risk 100 years in federal
               | prison going to trial or plead guilty and only face a few
               | years?
        
               | nullc wrote:
               | Many people seem to miss that the automated scanning
               | makes framing much more effective.
               | 
                | Say I sneak (pseudo-)child porn onto your device. How do
                | I get authorities to search you without potentially
               | implicating myself? An anonymous tipline call is not
               | likely to actually trigger a search.
               | 
               | With automated mass scanning that problem is solved: All
               | users will be searched.
        
               | smoldesu wrote:
               | > So where is all of this ruining of peoples lives by
               | uploading CP on their devices?
               | 
               | It's already happening. Except we just choose to SWAT
                | people instead, since it's faster, easier, and there's
                | effectively no liability on the part of the caller.
        
               | tzs wrote:
               | > That's trivial. If the attacker can get one image onto
               | your device they can get several.
               | 
               | At which point everything you brought up about attacks on
               | the hash function is completely irrelevant because the
               | attacker can put _actual_ child porn from the database on
               | your device.
        
               | nullc wrote:
               | The apple system is a dangerous surveillance apparatus at
               | many levels. The fact that I pointed out one element was
               | broken in a post doesn't mean that I don't consider
               | others broken.
               | 
               | My primary concern about its ethics has always been the
               | breach of your devices obligation to act faithfully as
               | your agent. My secondary concern was the use of strong
               | cryptography to protect Apple and its sources from
               | accountability. Unfortunately, the broken hash function
               | means that even if they weren't using crypto to conceal
               | the database, it wouldn't create accountability.
               | 
               | Attacks on the hash-function are still relevant because:
               | 
                | 1. the weak hash function allows state actors to
                | deniably include non-child-porn images in their database
                | and even get non-cooperating states to include those
                | hashes too.
               | 
                | 2. The attack is lower risk for the attacker if they
                | never need to handle unlawful images themselves. E.g.
                | they make a bunch of porn images into matches; if they
                | get caught with them they just point to the lawful
                | origin of the images, while the victim won't know where
                | they came from.
        
         | relax88 wrote:
         | > It only weakens privacy for iCloud photos if you have CP or
         | photos in a CP database
         | 
         | Who controls what is in the database? What independent
         | oversight ensures that it's only CSAM images? The public
         | certainly can't audit it.
         | 
         | What is stopping the CCP from putting pro-Uighur or Xi Winnie
         | the Pooh images into the database? Or from the US using this to
         | locate images that are interesting from an intelligence
          | perspective (say, for example, pictures of Iranian uranium
          | centrifuges)? Apple says they will only add images that more
          | than one government requests? All it would take is a few
          | G-men to show up at Apple with a secret court order to do
          | this, no?
         | 
         | So... China and Hong Kong? The Five Eyes nations?
        
           | nullc wrote:
            | Their use of a highly vulnerable[1] "neural" perceptual hash
            | function makes the database unauditable: An abusive state
            | actor could obtain child porn images and invisibly alter
            | them to match the hashes of the ideological or ethnically
            | related images they really want to match. If challenged,
            | they could produce child porn images matching their
            | database, and they could hand these images to other
            | governments to include, unknowingly or plausibly deniably.
           | 
           | ...but they don't have to do anything that elaborate because
           | Apple is using powerful cryptography against their users to
           | protect themselves and their data sources from any
           | accountability for the content of the database: The hashes in
           | the database are hidden from everyone who isn't Apple or a
           | state agent. There is no opportunity to learn, much less
           | challenge the content of the database.
           | 
              | [1] https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...
        
             | azinman2 wrote:
              | They have to come from the intersection of two databases
              | from two jurisdictions, so that alone rules out what you
              | suggest. Then you'd have to match _nearly exact photos_,
              | which isn't a vector for general photos of some random
              | minority. Then you'd need 30 such specific photos, a
              | match with another secret hash, and then a human reviewer
             | at Apple has to say yes it's CP before anything else
             | happens.
             | 
             | I think there are plenty of reasons to be concerned about
             | future laws and future implementations, but let's be honest
             | about the real risks of this today as it's currently
             | implemented.
        
               | nullc wrote:
                | Every step you've described is unfalsifiable: You just
                | have to blindly trust that Apple is doing these things,
                | and that e.g. authoritarian regimes haven't compromised
                | Apple staff with access to the data.
               | 
               | > They have to come from the intersection of two
               | databases from two jurisdictions.
               | 
               | My message directly answered that. A state actor can
                | modify an apparent child porn image to match an
                | arbitrary hash and hand that image to other agencies
               | who will dutifully include it in their database.
               | 
               | > Then you'd have to match _nearly exact photos_
               | 
               | It's unclear what you mean here. It's easy to construct
               | completely different images that share a NeuralHash.
               | Apple also has no access to the original "child porn" (in
               | quotes because it may not be that), as it would be
               | unlawful to provide it to them.
               | 
               | > but let's be honest about the real risks
               | 
               | Yes. Let's be honest: Apple has made a decision to
               | reprogram devices owned by their customers to act against
               | their users' best interests. They assure us that they
               | will be taking steps to mitigate harm, but they have used
               | powerful cryptography to conceal their actions, and most
               | of their supposed protections are unfalsifiable. You're
               | just supposed to take the word of a party that is already
               | admittedly acting against your best interest. Finally, at
               | _best_ their protections are only moderate. Almost every
               | computer security vulnerability could be dismissed as
               | requiring an impossible series of coincidences, and yet
               | attacks exist.
        
               | simondotau wrote:
               | > _A state actor can modify an apparent childporn image
               | to match an arbitrarily hash and hand that image to other
               | agencies who will dutifully include it in their
               | database._
               | 
               | Even if a state actor constructs an image that is a
               | NeuralHash collision for the material they wish to find,
               | that only gets them through one of the three barriers
               | Apple has erected between your device and the images
               | being reported to a third party. They also need to cause
               | 30 image matches in order to pass threshold secret
               | sharing, and they need to pass human review of these 30
               | images.
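               | 
               | ("Threshold secret sharing" here means the key that
               | unseals an account's match vouchers is split so that any
               | 30 shares reconstruct it while 29 or fewer reveal
               | nothing. Below is a toy Shamir sketch of the idea, with a
               | made-up field size; Apple's real construction wraps this
               | in private set intersection.)
               | 
               |     import random
               |
               |     P = 2**127 - 1  # prime modulus for the toy field
               |
               |     def make_shares(secret, k=30, n=1000):
               |         # Random degree-(k-1) polynomial, f(0) = secret.
               |         cs = [secret] + [random.randrange(P)
               |                          for _ in range(k - 1)]
               |         f = lambda x: sum(c * pow(x, i, P)
               |                           for i, c in enumerate(cs)) % P
               |         # One share is released per matching image.
               |         return [(x, f(x)) for x in range(1, n + 1)]
               |
               |     def reconstruct(shares):
               |         # Lagrange interpolation at x = 0; needs >= k
               |         # shares, anything fewer yields only noise.
               |         s = 0
               |         for xi, yi in shares:
               |             num = den = 1
               |             for xj, _ in shares:
               |                 if xj != xi:
               |                     num = num * -xj % P
               |                     den = den * (xi - xj) % P
               |             s = (s + yi * num * pow(den, -1, P)) % P
               |         return s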
               | 
               | Arguably investigation by NCMEC represents a fourth
               | barrier, but I'll ignore that because it's beyond Apple's
               | control.
               | 
               | > _You just have to blindly trust that Apple_
               | 
               | This has been true of all closed source operating systems
               | since forever. Functionally, nothing has changed. And
               | whatever you think of the decision Apple has made, you
               | can't argue that they tried to do it in secret.
        
               | nullc wrote:
               | The invocation of 30 images as if it were a barrier
               | confuses me. I created a bunch of preimages and posted
               | them on GitHub; I could easily create 30 or 3000, but at
               | this point all I'd be doing is helping Apple cover up
               | their bad hash algorithm [1].
               | 
               | I pointed out above that the attacker could use legal
               | pornography images selected to look like child porn.
               | This isn't hard, and doing it 30 times is no particular
               | challenge. Out of good taste I didn't use pornographic
               | images in the examples I created, not because it would
               | have been any harder.
               | 
               | [1] I've made a statement that I won't be posting any
               | more preimages for now for that reason: https://github.co
               | m/AsuharietYgvar/AppleNeuralHash2ONNX/issue...
        
               | simondotau wrote:
               | If someone is trying to frame a known individual, the 30
               | image threshold may not be a significant barrier, I'll
               | grant you that. But if you're enlisting Apple's algorithm
               | to perform a dragnet search of the citizenry (e.g. leaked
               | state secrets) then this mechanism cannot be effective
               | unless the material in question is comprised of at least
               | 30 photographs.
        
               | nullc wrote:
               | I'll grant you that!
               | 
               | I have some residual nitpicks on that point: many leaked
               | data troves are much larger than that, though the
               | threshold is a material restriction.
               | 
               | The 30 threshold isn't leakless. Even if you have only
               | one hit, a voucher still gets uploaded to Apple. The
               | software also emits a small rate of "chaff" (fake hits)
               | to help obscure the sub-threshold real counts. But the
               | system could still be used to produce a list of possible
               | matches: anyone with targeted material plus the people
               | who emitted fake matches. That list is much smaller than
               | the whole population, which makes it a useful set of
               | targets for enhanced surveillance.
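               | 
               | A toy simulation of that narrowing (every number here is
               | invented for illustration; the real chaff rate isn't
               | public):
               | 
               |     import random
               |
               |     random.seed(0)
               |     POP, TARGETS, CHAFF = 1_000_000, 50, 0.001
               |
               |     targets = set(random.sample(range(POP), TARGETS))
               |     flagged = set()
               |     for user in range(POP):
               |         real = user in targets  # sub-threshold hits
               |         chaff = random.random() < CHAFF  # fake voucher
               |         if real or chaff:
               |             flagged.add(user)
               |
               |     # Roughly 1,050 of 1,000,000 users flagged: a
               |     # ~1000x narrowing, and every real target is in it.
               |     print(len(flagged), targets <= flagged)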
        
           | azinman2 wrote:
           | National Center for Missing and Exploited Children
           | intersected with a not yet determined db in another
           | jurisdiction, and then human moderators at Apple.
           | 
           | So what you're describing is a concern for the future that
           | could exist anyway (and maybe already does at places like
           | Google, which already scan for CP against the same database
           | with _zero_ auditable processing of the data). But let's not
           | pretend that's today.
        
         | carlosdp wrote:
         | > (though that could easily be the same even without all these
         | CSAM measures.. not a new threat)
         | 
         | Apple has historically avoided being pressured by governments
         | to allow this kind of surveillance by arguing they can't be
         | forced to create functionality that doesn't exist, or hand over
         | information they don't have. It's the argument they made in the
         | San Bernardino case.
         | 
         | If they release this, it'll be much harder to avoid giving
         | governments that existing ability for other, non-CSAM uses.
        
           | shuckles wrote:
           | The request from the FBI in the San Bernardino case was to
           | change a passcode limit constant and retry timeouts. Those
           | are about as trivial to implement as any of the convoluted
           | government coercion database attacks against CSAM detection
           | being proposed here.
        
             | simondotau wrote:
             | The passcode limit constant is enforced by the secure
             | enclave. I don't know if it's been proven that the secure
             | enclave component of the device can be changed without the
             | device being unlocked. I'm not even sure it's possible for
             | any operating system update to occur on a device that is
             | locked.
        
               | snowwrestler wrote:
               | The technical feasibility of the FBI's request was never
               | the question, nor the basis of Apple's objection.
               | 
               | Of course it's even easier for Apple to say "no" to the
               | government if they literally cannot do what the
               | government is asking.
               | 
               | That's the basis of the EFF's objection to Apple's plans:
               | they think that by implementing this CSAM system, Apple
               | will turn an impossibility into a possibility.
        
               | shuckles wrote:
               | Not on the iPhone 5C, which was the phone used by the
               | terrorist and did not have a Secure Enclave. Locked
               | iPhones can be updated from DFU mode, but I think SE
               | firmware can't be.
        
             | ajsnigrutin wrote:
             | The difference here is that Apple would have to develop a
             | new feature for them, test it, and spend millions on
             | lawyers to protect themselves from accusations of tampering
             | with the evidence (which a software update definitely is;
             | who knows what the FBI wanted in that software update,
             | maybe even to insert a fake SMS into the SMS database, or
             | many other things a good defense lawyer could bring up to
             | the jury).
             | 
             | Here, it's different: let's say there's a new WikiLeaks
             | with photos of secret documents. Again, first a few
             | journalists get the data and start slowly writing articles;
             | the FBI just adds the hashes to the database, and they can
             | find out who has the photos, with metadata even who had
             | them first, before the journalists, and so they can find
             | the leak.
        
               | azinman2 wrote:
               | But the FBI can't just add the hashes to the db. That's
               | why it's the intersection of two dbs in two
               | jurisdictions... to prevent exactly that kind of attack.
               | Then they need to pass a human reviewer as well.
        
               | xkcd-sucks wrote:
               | _Five_ Eyes
        
               | simondotau wrote:
               | You are suggesting that another country could launder the
               | request on behalf of the USA in order to circumvent the
               | 4th Amendment. Okay then, let's play that out.
               | 
               | Australia contacts Apple and demands they augment CSAM
               | detection so that every iPhone in the USA is now scanning
               | for image hashes supplied by Australia.
               | 
               | Apple says no.
               | 
               | End of hypothetical.
        
               | salawat wrote:
               | You haven't heard of how GCHQ would happily hand over
               | intelligence they had on U.S. citizens?
               | 
               | I know for a fact there are efforts at creating fusion
               | centers across national boundaries. The question you need
               | to ask is not if, but what will make you interesting
               | enough to mobilize against.
               | 
               | Thou shalt not build the effing Panopticon, nor its
               | predecessors. Is that so hard not to do?
        
               | simondotau wrote:
               | Why would Apple comply with a demand from the GCHQ to spy
               | on US citizens?
        
               | azinman2 wrote:
               | It hasn't been announced yet who else's db will be used.
               | If it's not a 'five eyes' member... then what?
        
               | snowwrestler wrote:
               | > FBI just adds the hashes to the database
               | 
               | This is the crux of the argument right here, and I have
               | yet to see a detailed description of how the FBI would go
               | about doing that.
               | 
               | The most detail I've seen is in this thread, which
               | suggests that it would be difficult for the FBI to do it,
               | or at least do it more than once.
               | 
               | https://twitter.com/pwnallthethings/status/14248736290037
               | 022...
               | 
               | Has anyone seen something like this in the other
               | direction? Something that walks through "the FBI would do
               | this, then this, etc. and now they've coopted Apple's
               | system"?
        
           | anonuser123456 wrote:
           | Not really.
           | 
           | Apple simply tells the DOJ that if any non-CSAM content is
           | added surreptitiously to the DB, Apple will drop CSAM
           | scanning. Also, if the DOJ makes any request to scan for non-
           | CSAM material by court order or warrant, Apple will likewise
           | drop support for the technology.
           | 
           | Apple is making a good-faith effort here. If the DOJ acts in
           | bad faith, Apple is in no way required to continue to
           | participate.
        
           | pfranz wrote:
           | Ehh. I'd argue Apple has a mixed record. It's well known that
           | iCloud backups are not end-to-end encrypted and are often
           | handed over to authorities. In Jan 2020 [1], it was reported
           | they planned to encrypt backups but dropped the rollout due
           | to pressure from the FBI. I'm surprised they don't get called
           | out: the difference between this and San Bernardino makes
           | sense to me from a technical standpoint, but from a practical
           | standpoint, and to the layman, it seems hypocritical.
           | 
           | In this case, I actually kind of buy Apple's argument that
           | this will make it harder to kowtow to governments (assuming
           | they encrypt iCloud photos at some point). Right now they can
           | scan iCloud data and hand over photos and accounts without
           | users' knowledge (as they currently do with backups). With
           | this in place, the database and logic ship with the OS: Apple
           | would have to implement and ship any changes world-wide, and
           | users would have to install that update, whereas today Apple
           | can silently make a server-side change affecting specific
           | customers or countries. With that said, I do understand
           | people's concern over the change.
           | 
           | [1] https://www.reuters.com/article/us-apple-fbi-icloud-
           | exclusiv...
        
       ___________________________________________________________________
       (page generated 2021-08-21 23:00 UTC)