[HN Gopher] Apple defends anti-child abuse imagery tech after cl...
       ___________________________________________________________________
        
       Apple defends anti-child abuse imagery tech after claims of 'hash
       collisions'
        
       Author : nickthegreek
       Score  : 222 points
       Date   : 2021-08-18 19:01 UTC (4 hours ago)
        
 (HTM) web link (www.vice.com)
 (TXT) w3m dump (www.vice.com)
        
       | m3kw9 wrote:
        | Hashes can always collide, but how likely is it for NeuralHash?
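        | 
        | For intuition, a toy perceptual hash (random projection +
        | sign, in Python; NOT the real NeuralHash) shows why a
        | "collision" here is about nearness rather than exact byte
        | equality:
        | 
        |     import numpy as np
        | 
        |     rng = np.random.default_rng(0)
        |     W = rng.standard_normal((96, 1024))
        | 
        |     def toy_hash(img):
        |         # stand-in for a learned perceptual hash
        |         return (W @ img.ravel() > 0).astype(np.uint8)
        | 
        |     def hamming(a, b):
        |         return int(np.count_nonzero(a != b))
        | 
        |     img = rng.random((32, 32))
        |     tweak = img + rng.normal(0, 0.01, (32, 32))  # slight edit
        |     other = rng.random((32, 32))
        | 
        |     print(hamming(toy_hash(img), toy_hash(tweak)))  # small
        |     print(hamming(toy_hash(img), toy_hash(other)))  # ~48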
        
       | firebaze wrote:
       | I think we're up for a new Apple CEO in a few days.
        
         | csilverman wrote:
         | The turmoil of Apple abruptly booting the CEO who's overseen
         | the most obscenely prosperous stretch in its history would cost
         | the company vastly more than the current controversy about
         | CSAM.
         | 
         | I'm not saying this as a fan of either Cook or their anti-CSAM
         | measures; I'm neither, and if anyone is ever wrongfully
         | arrested because Apple's system made a mistake, Cook may well
         | wind up in disgrace depending on how much blame he can/can't
         | shuffle off to subordinates. I don't think we're there yet,
         | though.
        
           | atonse wrote:
           | He's not Ballmer but Ballmer also presided over record
           | profits even though MS barely did anything remotely
           | interesting under his watch.
           | 
           | Just saying that money hides problems.
        
       | bruce343434 wrote:
        | Oh man. They really want to die on this hill, don't they?
        
       | _trampeltier wrote:
        | Apple does check it just before you upload it to iCloud. How
        | much money do they save if it's checked client-side? How much
        | would it cost for Apple to do it on their own servers, like
        | everybody else does?
        
         | 63 wrote:
          | IIRC the rationale for doing it client-side isn't saving
          | money, but maintaining encryption. If it's checked
          | client-side, Apple should never get unencrypted uploads
          | unless they're suspected to be CSAM.
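          | 
          | Very roughly, the pitch is something like this (a heavily
          | simplified Python sketch, not Apple's real protocol:
          | SHA-256, XOR and a plain set stand in for NeuralHash, real
          | encryption and the blinded database, and the "match" field
          | is only there to show where each step runs - in the real
          | design neither the device nor Apple learns it until the
          | threshold is crossed):
          | 
          |     import hashlib, os
          | 
          |     KNOWN = {hashlib.sha256(b"known-bad").hexdigest()}
          | 
          |     def toy_encrypt(data, key):
          |         # placeholder cipher, not real crypto
          |         return bytes(b ^ key[i % len(key)]
          |                      for i, b in enumerate(data))
          | 
          |     def client_upload(photo, key):
          |         h = hashlib.sha256(photo).hexdigest()  # on device
          |         blob = toy_encrypt(photo, key)         # ciphertext
          |         voucher = {"hash": h, "match": h in KNOWN}
          |         return blob, voucher   # plaintext never leaves
          | 
          |     blob, v = client_upload(b"cat.jpg", os.urandom(32))
          |     print(v["match"])          # False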
        
       | endisneigh wrote:
       | Am I the only one who finds no issue with this? Personally I'd
        | prefer this to no encryption and scanning in the cloud, which
       | is already possible.
       | 
       | All of the slippery slope arguments have already been possible
       | for nearly a decade now with the cloud.
       | 
        | Can someone illustrate something wrong with this that's not
        | already possible today?
        | 
        | Fundamentally, unless you audited the client and the server
        | yourself, either client-side or backend scanning is possible,
        | and therefore so is this "problem".
       | 
       | Where's the issue?
        
         | fritzw wrote:
         | > All of the slippery slope arguments have already been
         | possible for nearly a decade now with the cloud.
         | 
         | That doesn't make it ok. The fact that you and my mom have
         | normalized bad behavior doesn't make it any less offensive to
         | my civil liberties.
         | 
         | > Am I the only one who finds no issue with this?
         | 
          | Only if you think about the first-order effects. On that
          | level it's great: we catch a bunch of pedophiles. The
          | second-order effects are much more dire. "Slippery slope,"
          | sure... but false arrests; lives ruined by investigations
          | that reveal nothing; an expanding role for government
          | peering into our personal lives; the resulting suicides;
          | corporations launching these programs, then defending them,
          | then expanding them under government pressure (that part is
          | *guaranteed*); and more PR and propaganda from corporations
          | and government pushing further invasion of our personal
          | lives on the back of the marketed/claimed success of these
          | programs. The FBI has been putting massive pressure on Apple
          | for years, due to the dominance and security measures of
          | iOS.
          | 
          | Say what you want about the death penalty, but many, many
          | innocent people have died in a truly horrific way, with some
          | actually being tortured while being executed. That is a
          | perfect example of _second order effects_ of something most
          | of us, without any further information, would agree on
          | (ending murderous villains is a good thing). So many death
          | row inmates have been exonerated and vindicated.
         | 
         | edit:
         | https://en.wikipedia.org/wiki/List_of_exonerated_death_row_i...
        
         | xadhominemx wrote:
         | > All of the slippery slope arguments have already been
         | possible for nearly a decade now with the cloud.
         | 
         | Not only has it been possible for a decade, it's been happening
         | for a decade. Every major social media company already scans
         | for and reports child pornography to the feds. Facebook submits
         | millions of reports per year.
        
           | endisneigh wrote:
           | Yes, exactly. This is honestly nothing new.
        
             | mrkstu wrote:
              | This is new for Apple's customers. And it's new in that
              | the device you bought and paid for is nosing through
              | your files.
             | 
              | Apple is introducing a reverse 'Little Snitch': instead
              | of an app warning you about what other apps are doing
              | on the network, the OS is scanning your photos.
              | Introducing a fifth columnist into a device that you've
              | bought and paid for is a huge philosophical jump from
              | Apple's previous stances on privacy, where they'd gone
              | as far as fighting the Feds over breaking into
              | terrorists' iPhones.
        
               | endisneigh wrote:
               | Luckily for their customers it can be turned off. So,
               | what's the issue?
        
               | mrkstu wrote:
                | A huge part of Apple's value proposition is iCloud.
                | If I have to turn that off to keep the spy out of my
                | OS, its value to me is dramatically diminished.
        
               | endisneigh wrote:
               | Since you presumably don't trust Apple to scan your
               | photos, it sounds like Apple might not be for you, then.
               | Who will you move to?
        
               | mrkstu wrote:
               | In the past I trusted Apple to resist government efforts
               | to spy on me to the best the law allowed. Now they are
               | _innovating_ ways to help a government spy literally live
               | in my pocket.
               | 
               | Trust is fragile and Apple has taken what in the past I
               | believed it understood to be a strategic advantage and
               | stomped it into little pieces.
        
         | robertoandred wrote:
         | Never let a good Apple controversy go to waste. Clickbait blogs
         | are thrilled to have something like this.
        
         | creddit wrote:
         | Why prefer your owned device tattling on you to Apple looking
         | at data you give them on their servers?
         | 
         | The reason people don't like this, as opposed to, for example,
         | Dropbox scanning your synced files on their servers, is that a
         | compute tool you ostensibly own is now turned completely
         | against you. Today, that is for CSAM, tomorrow, what else?
        
           | endisneigh wrote:
            | There's no difference - all cloud services that aren't
            | encrypting your data are subject to the same thing.
            | Dropbox could do the same thing tomorrow. If we're talking
            | about hypotheticals, any vendor that handles your
            | unencrypted files can do this now.
        
             | creddit wrote:
             | You're not understanding. Dropbox already does scan your
              | files and I'm comparing Apple's actions explicitly to that
             | fact. I'm pointing out that people don't care about that
             | because you're literally, voluntarily, giving Dropbox your
             | files. Here, Apple is controlling your phone to tattle on
             | you at an OS level (yes, I know, Apple says you need to
             | turn on iCloud Photos for this to run but the precedent is
             | the problem).
        
             | stevenalowe wrote:
             | big difference: you're now paying for the compute load,
             | instead of the cloud providers, for something that offers
             | no direct benefit to you and will most likely betray you in
             | unforeseen ways in the future (especially in less 'free'
             | countries)
        
           | mavhc wrote:
           | Why did you think you owned a closed source device?
        
             | creddit wrote:
             | Open Source fanatics are the worst. Even if all the
             | software on your phone was Open Source, you wouldn't have
             | the mental bandwidth to ever verify you trust it. You're
             | just farming out your trust to a different set of people
             | and hoping they're more trustworthy than those with closed
             | source. At best this MAY be reasonable because of incentive
             | alignment but that's super weak.
             | 
              | Not to mention that the meaning of ownership is
              | completely unrelated to the ability to view the designs
              | of a given object. I think I own the fan currently
              | blowing air at me just fine, without ever having seen a
              | schematic for its control circuitry, and everyone for
              | all of history has pretty much felt the same.
        
               | mavhc wrote:
               | That's because your fan is hardware and your phone is
               | software. The hardware in your phone is only 1% of what
               | your phone is.
               | 
                | The point is that it's hard to hide your evil plans
                | in daylight. Either you trust Apple or you don't.
                | Same with Microsoft's telemetry: they wrote the whole
                | OS, and if they were evil they'd have 10,000 easier
                | ways to do it.
                | 
                | All Apple has to do is lie; you'll never know.
        
         | the8472 wrote:
          | > Personally I'd prefer this to no encryption and scanning in
         | the cloud, which is already possible.
         | 
         | What is also possible: No scanning on your device and encrypted
         | cloud storage. E.g. borg + rsync.net, mega, proton drive.
        
         | fuzzer37 wrote:
          | > Personally I'd prefer this to no encryption and scanning in
         | the cloud, which is already possible.
         | 
         | Why is that the alternative? How about everything is encrypted
         | and nothing is scanned.
        
           | saynay wrote:
            | Realistically, because if they don't have some way to
            | scan for it, countries / the EU are going to require them
            | to provide a backdoor to scan it. There are already
            | proposals in the EU to that effect.
        
           | endisneigh wrote:
            | This is already possible if you self-host your stuff. I'm
            | talking about iCloud specifically here - backups are not
            | end-to-end encrypted, so it's either scan in the backend
            | or scan on the client.
           | 
            | If you don't want to be scanned you can turn it off. I
            | honestly don't see the issue. It seems the only things
            | people can point to are hypothetical situations here.
        
         | LatteLazy wrote:
          | >Personally I'd prefer this to no encryption and scanning in
         | the cloud
         | 
          | How about neither? Just let people have their privacy. Some
          | will misuse it. That's life.
        
         | petersellers wrote:
         | Personally, I'd prefer no privacy intrusions at all.
         | 
          | The issue is that Apple previously was not intruding into
          | their users' privacy (at least publicly), but now they are.
         | 
         | It sounds like your argument is that Apple could have been
         | doing this all along and just not telling us. I find that
         | unlikely mainly because they've marketed themselves as a
         | privacy-focused company up until now.
        
           | endisneigh wrote:
            | In your situation, can't you just turn off the scanning?
            | What's the issue?
        
             | petersellers wrote:
             | There are two different "features" being implemented -
             | Messages scanning for minors on a family plan (which can be
             | turned off) and iCloud Photo scanning (which can't be
             | turned off, as far as I know). So no, you can't just turn
             | it off.
        
               | endisneigh wrote:
               | But you can tho...
        
               | petersellers wrote:
               | Please show evidence where the feature can be turned off
               | (without having to completely disable iCloud photos).
        
               | endisneigh wrote:
               | You turn it off by turning off iCloud photos, I never
               | claimed otherwise.
               | 
               | If you don't trust Apple why would you use iCloud anyway?
               | Makes no sense.
        
               | petersellers wrote:
               | > You turn it off by turning off iCloud photos, I never
               | claimed otherwise.
               | 
               | I just have to point out the ridiculousness of this
               | statement. With your logic any feature in any product can
               | be "turned off" by not using the entire product at all.
               | For example, the radio in my car sounds like shit, I
               | guess I should just stop driving completely to avoid
               | having to hear it.
               | 
               | In reality, this new "feature" will be a requirement of
               | using iCloud Photos. The feature itself cannot be turned
               | off. If your answer is to stop using iCloud Photos, that
               | is no help for the millions of people who currently use
               | iCloud Photos.
               | 
               | > If you don't trust Apple why would you use iCloud
               | anyway? Makes no sense.
               | 
               | I've trusted Apple for a long time because I felt like
               | they were one of the only companies that cared about
               | consumer privacy. After these actions I am less convinced
               | of that. I'm not sure why that stance is so surprising.
        
               | endisneigh wrote:
               | > In reality, this new "feature" will be a requirement of
               | using iCloud Photos. The feature itself cannot be turned
               | off. If your answer is to stop using iCloud Photos, that
               | is no help for the millions of people who currently use
               | iCloud Photos.
               | 
               | iCloud Photos can be used on Windows, for example. This
               | scanning only applies to iOS devices. You could use
               | iCloud Photos and not be subject to the scanning.
               | 
               | > I've trusted Apple for a long time because I felt like
               | they were one of the only companies that cared about
               | consumer privacy. After these actions I am less convinced
               | of that. I'm not sure why that stance is so surprising.
               | 
               | Nothing about what they're doing is contradictory with
               | privacy beyond what they're already doing. The only
               | reason they're even implementing it this way is because
               | they do care about privacy. They could just not encrypt
               | and scan on the server like Google, Microsoft, Dropbox,
               | Box.com and more.
        
               | CubsFan1060 wrote:
               | Yes, it can be turned off.
               | https://www.macrumors.com/2021/08/05/apple-csam-
               | detection-di...
        
               | petersellers wrote:
               | Only if you disable iCloud photos completely. So no, you
               | cannot turn off the feature unless you stop using iCloud
               | photos.
        
               | CubsFan1060 wrote:
               | Correct. I'm not sure how that's a distinction from "you
               | can't turn it off" though. I believe they've been doing
               | this scan server-side for quite some time already.
        
               | petersellers wrote:
               | Here's the distinction: I used to be able to use iCloud
               | photos without having my photos scanned, and now I can't.
               | So I have to make a choice of either dropping iCloud
               | photos completely or submit to having all of my photos
               | scanned.
               | 
               | I don't think they have been doing server-side scanning
               | until now, hence the publicity. Do you have any evidence
               | that shows they've been doing this before?
        
               | CubsFan1060 wrote:
               | https://www.dailymail.co.uk/sciencetech/article-7865979/A
               | ppl...
               | 
               | I wasn't able to find the whole video though.
               | 
                | No time to watch it all, but:
               | https://www.ces.tech/Videos/2020/Chief-Privacy-Officer-
               | Round...
        
               | petersellers wrote:
               | The article you linked makes a surprising claim, and it
               | contradicts what the EFF said recently about this here -
               | https://www.eff.org/deeplinks/2021/08/apples-plan-think-
               | diff...
               | 
               | > Currently, although Apple holds the keys to view Photos
               | stored in iCloud Photos, it does not scan these images
               | 
               | It also seems weird that the EFF wouldn't have complained
               | about this before if Apple was known to be doing server-
               | side scanning for some time now.
               | 
               | I also am not going to watch through an hour video, but I
               | scanned the transcript and I didn't see anything that
               | said that Apple currently (at that time) scanned content.
        
               | CubsFan1060 wrote:
               | I can't find any articles about eff complaining about
               | gmail, onedrive, or discord doing similar scanning
               | either. All of those services (and more) do similar
               | scans.
        
           | shapefrog wrote:
           | > The issue is that Apple previously was not intruding into
           | their user's privacy
           | 
           | Apple reserves the right at all times to determine whether
           | Content is appropriate and in compliance with this Agreement,
           | and may screen, move, refuse, modify and/or remove Content at
           | any time, without prior notice and in its sole discretion, if
           | such Content is found to be in violation of this Agreement or
           | is otherwise objectionable.
           | 
           | You must not have been paying attention for the last 20
           | years.
        
       | mzs wrote:
       | >... Apple also said that after a user passes the 30 match
       | threshold, a second non-public algorithm that runs on Apple's
       | servers will check the results.
       | 
       | >"This independent hash is chosen to reject the unlikely
       | possibility that the match threshold was exceeded due to non-CSAM
       | images that were adversarially perturbed to cause false
       | NeuralHash matches against the on-device encrypted CSAM
       | database," ...
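        | 
        | Roughly, the described pipeline looks like this (toy Python
        | sketch; both hash functions are arbitrary stand-ins rather
        | than Apple's, and 30 is the threshold mentioned above):
        | 
        |     import hashlib
        | 
        |     THRESHOLD = 30
        | 
        |     def server_hash(img):
        |         # stand-in for the second, non-public hash;
        |         # device-side matching already used "NeuralHash"
        |         return hashlib.blake2b(img).hexdigest()
        | 
        |     def review(matched_imgs, second_db):
        |         if len(matched_imgs) < THRESHOLD:
        |             return "nothing visible to Apple yet"
        |         # the independent hash filters out adversarial
        |         # NeuralHash collisions before any human looks
        |         real = [i for i in matched_imgs
        |                 if server_hash(i) in second_db]
        |         if len(real) < THRESHOLD:
        |             return "rejected as false positives"
        |         return "human review"
        | 
        |     db = {server_hash(b"known-bad")}
        |     print(review([b"cat.jpg"] * 31, db))     # rejected
        |     print(review([b"known-bad"] * 31, db))   # human review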
        
       | throwawaymanbot wrote:
        | This was never about protecting children, it seems. This was
        | about the ability to create and install the infrastructure
        | for the mass surveillance of people by smart device.
       | 
        | Hashes can be created for anything on a phone. And hash
        | collisions enable the "near match" of hashes (similar items).
        | 
        | Let's pretend you use your face to log in to an iPhone, and
        | there is a notice out for a certain person. If your face
        | matches the hash, will you be part of the scan? You betcha.
        
       | eurasiantiger wrote:
       | "the analyzed code is not the final implementation that will be
       | used with the CSAM system itself and is instead a generic
       | version"
       | 
        | So they have already been running a generic version of this
        | system since iOS 14.3?
        
       | epistasis wrote:
       | What's shocking to me is how little Apple management understood
       | of what their actions looked like. Really stunning.
       | 
       | For a company that marketed itself as one of the few digital
       | service providers that consumers could trust, I just don't
       | understand how they acted this way at all.
       | 
        | Either heads will roll in management, or Apple will take a
        | permanent hit to consumer trust.
        
         | whoknowswhat11 wrote:
         | We will see. I've heard lots of these predictions and I don't
         | buy them AT ALL.
         | 
          | What I have seen is a selling point for Apple products.
          | 
          | I'd encourage folks to get out of the HN bubble on this -
          | talk to a wife, a family, especially those with kids.
        
           | __blockcipher__ wrote:
           | Yeah, talk to people that know nothing on the issue besides
           | that they want to "protect the kids".
           | 
           | Why stop there? Get out of the HN bubble on the patriot act,
           | instead ask your neighbor's wife her thoughts on it. Get out
           | of the HN bubble on immigration, go ask a stereotypical
           | boomer conservative about it.
           | 
            | I think my sarcasm already made it overtly obvious, but this
           | is horrible advice you are giving and the fact that you don't
           | seem to be aware that pedophilia and terrorism are the two
           | most classic "this gives us an excuse to exert totalitarian
           | control" topics betrays your own ignorance (or, worse, you
           | are aware and just don't care).
        
         | ravenstine wrote:
         | Nah, their stock is up which means they'll continue on this
         | trajectory.
        
         | GeekyBear wrote:
         | The thing that's shocking to me is that Google, Microsoft and
         | all the big names in tech have been scanning everything in your
         | account (email, cloud drive, photos, etc) for the past decade,
         | without any noticeable uproar.
         | 
         | Apple announces that it is going to start scanning iCloud
         | Photos only, and that their system is set to ignore anything
         | below a threshold of ~30 positives before triggering a human
         | review, and people lose their minds.
        
           | rblatz wrote:
            | Because it isn't about the CSAM scanning. Apple keeps
            | reframing the argument back to that. Most people expect
            | Google and Microsoft to scan files on their servers for
            | CSAM; people object to Apple turning your phone against
            | its owner, and once the capability to do this is built
            | out, only policy prevents it from being abused.
        
           | GuB-42 wrote:
           | Google is open about the fact they scan and track everything,
           | they even make it a point of convenience "hey, we looked at
           | your email and found a plane ticket, based on where you are,
           | you should leave in 2 hours if you want to be at the airport
           | on time".
           | 
           | Facebook, even more so, they are explicitly anti-privacy to
           | the point of being insulting.
           | 
           | Microsoft will happily show you everything they may send when
           | you install Windows, you can sometimes refuse, but not
           | always. They are a bit less explicit than Google, but privacy
           | is rarely on the menu.
           | 
           | As for Amazon, their cloud offers are mostly for businesses,
           | different market, but still, for consumers, they don't really
           | insist on privacy either.
           | 
            | So if any of these companies scan your pictures for child
            | porn, it won't shock anyone, because we know that is what
            | they do.
           | 
           | But Apple claims privacy as a core value, half of their ads
           | are along the lines of "we are not like the others, we
           | respect your privacy, everything on your device stays on your
           | device, etc...", they announce every (often legitimate)
            | privacy feature with great fanfare, etc... So much so
            | that people start to believe it. But now people realize
            | that Apple is not so different from the others after all,
            | and if they bought an overpriced device based on that
            | promise, I understand why they are pissed off.
        
           | brandon272 wrote:
           | Here's the reason for the sudden uproar: The scanning of
           | things on third party servers was just barely tolerated by a
           | lot of people, whether it's for advertising purposes or
           | government intrusion. People accept it because it's
           | considered reasonable for these third parties to scan data in
           | exchange for providing a service for free (e.g. Google), or
           | because they need to have some degree of accountability for
           | what is on their servers.
           | 
           | When that scanning gets moved from the cloud to being _on
           | your device_ , a boundary is violated.
           | 
           | When that boundary is violated by a company who makes extreme
           | privacy claims like saying that privacy is a "fundamental
           | human right"[1], yes, people will "lose their minds" over it.
           | This shouldn't be shocking at all.
           | 
           | [1] https://apple.com/privacy
        
             | GeekyBear wrote:
             | When the scanning gets moved from the cloud to being on
             | device, Apple itself cannot see the results of the scan
             | until the risk that the result is only a false positive is
             | greatly reduced.
             | 
             | You would have to have 30 false positives before Apple can
             | see anything, which is unlikely, but the next step is still
             | a human review, since it's not impossible.
        
               | OnlineGladiator wrote:
               | I don't care. It's my device (at least that's how Apple
               | used to advertise it) and I disagree with Apple's policy,
               | full stop. I don't care about "think of the children"
               | (especially since they will be scanning pictures of my
               | own children), and furthermore I don't trust Apple not to
               | change the policy in the future. They've created a
               | backdoor and now the device is compromised and privacy is
               | broken. If you want to trust Apple go ahead. They have
               | eroded my trust and I doubt they could regain it within a
               | decade.
        
               | GeekyBear wrote:
               | >I don't care
               | 
               | If anything, you should be outraged that Google and
               | Microsoft have been scanning much more of your data, and
               | doing so in a much more intrusive way.
               | 
               | Apple only scans iCloud Photos and they do so in a way
               | that they can't see the results until they can be
               | reasonably sure it's not just a false positive.
        
               | brandon272 wrote:
               | Whether or not Apple can see the results is completely
               | irrelevant, to me at least. Automated surveillance is
               | still surveillance.
        
           | mightybyte wrote:
           | I think the difference here is that it's ON YOUR DEVICE. I
           | think there's a pretty clear understanding that if you upload
           | stuff to a cloud provider they can do whatever they want with
           | it. This is different. This is reaching into what has up
           | until now mostly been considered a private place. Law
           | enforcement often has to get warrants to search this kind of
           | thing.
           | 
           | This is the difference between putting CSAM on a sign in your
           | front yard (maybe not quite front yard but I can't come up
           | with quite the same physical equivalent to a cloud provider)
           | and keeping it in a password protected vault in your
           | basement. One of those things is protected in the U.S. by
           | laws against unlawful search and seizure. Cloud and on your
           | device are two very different things and consumers are right
           | to be alarmed.
           | 
           | I'll say it again, if you are concerned with this privacy
           | violation, sell your Apple stock and categorically refuse to
           | purchase Apple devices. Also go to
           | https://www.nospyphone.com/ and make your voice heard there.
        
             | GeekyBear wrote:
             | ON YOUR DEVICE (where it's encrypted in a way that Apple
             | can't read until the 30 image threshold is crossed) is more
                | private than doing the same scan on a server where a single
             | false positive can be misused by anyone who can get a
             | subpoena.
        
               | heavyset_go wrote:
               | > _where it 's encrypted in a way that Apple can't read_
               | 
               | This doesn't matter because Apple can read iCloud data,
               | including iCloud Photos. They hold the encryption keys,
               | and they hand over customers' data for about 150,000
               | users/accounts a year in response to requests from the
               | government[1].
               | 
               | [1] https://www.apple.com/legal/transparency/us.html
        
               | mightybyte wrote:
               | > ON YOUR DEVICE (where it's encrypted in a way that
               | Apple can't read until the 30 image threshold is crossed)
               | 
               | First of all, you have to be able to read it to do the
               | comparison that can increment the counter to 30. So
               | regardless of whether it is or is not encrypted there,
               | they're accessing the unencrypted plaintext to calculate
               | the hash.
               | 
               | And yes, on my device is definitively more private than
               | on someone else's server--just like in my bedside drawer
               | is more private than in an office I rent in a co-working
               | space.
        
               | pasquinelli wrote:
               | what kind of encryption can't be decrypted until some
               | threshold has been crossed?
               | 
               | maybe you mean to say that apple says they won't read it
               | until that threshold has been crossed.
        
               | zepto wrote:
               | > what kind of encryption can't be decrypted until some
               | threshold has been crossed?
               | 
               | The kind Apple has built. You should read the docs. This
               | is literally how it works.
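                | 
                | (For reference, the standard building block for "only
                | readable once enough pieces exist" is threshold
                | secret sharing. A toy Python sketch of the idea, not
                | Apple's exact construction:)
                | 
                |     import random
                | 
                |     P = 2**127 - 1      # prime field
                |     T = 30              # threshold
                | 
                |     def split(secret, n, t=T):
                |         coef = [secret] + [random.randrange(P)
                |                            for _ in range(t - 1)]
                |         def f(x):
                |             s = 0
                |             for i, c in enumerate(coef):
                |                 s = (s + c * pow(x, i, P)) % P
                |             return s
                |         return [(x, f(x)) for x in range(1, n + 1)]
                | 
                |     def recover(shares):
                |         # Lagrange interpolation at x = 0
                |         total = 0
                |         for xi, yi in shares:
                |             num = den = 1
                |             for xj, _ in shares:
                |                 if xj != xi:
                |                     num = num * (-xj) % P
                |                     den = den * (xi - xj) % P
                |             total += yi * num * pow(den, -1, P)
                |         return total % P
                | 
                |     key = 1234567890
                |     shares = split(key, 100)
                |     print(recover(shares[:T]) == key)      # True
                |     print(recover(shares[:T - 1]) == key)  # False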
        
               | mightybyte wrote:
               | This is the same company that saved the disk encryption
               | password _as_ the password hint. You really trust them to
               | not screw this up when the stakes are that it could ruin
                | your life and/or land you in jail? I'm simply not ok
               | with that.
        
               | GeekyBear wrote:
               | No, I'm saying they designed the system so that they
                | don't have the key to decrypt the scan result
               | until after the device tells them that the 30 image
               | threshold is crossed.
               | 
               | >Apple is unable to process individual vouchers; instead,
               | all the properties of our system mean that it's only once
               | an account has accumulated a collection of vouchers
               | associated with illegal, known CSAM images that we are
               | able to learn anything about the user's account.
               | 
                | Now, the reason to do it this way is that, as you
                | said, it provides that detection capability while
                | preserving user privacy.
               | 
               | https://techcrunch.com/2021/08/10/interview-apples-head-
               | of-p...
               | 
                | Meanwhile, a single false positive from an on-server
                | scan is open to malicious use by anyone who can get a
                | subpoena.
        
               | telside wrote:
               | The lady doth protest too much, methinks
               | 
               | Just going to respond to every post on here with these
               | absurd points? K apple guy.
        
           | websites2023 wrote:
           | BigCos, take note: you're better off doing nefarious shit
           | without telling anyone. Because, if you come clean, you'll
           | only invite an endless parade of bloggers who will
           | misconstrue your technology to make you look bad.
        
             | __blockcipher__ wrote:
             | No misconstrual needed. This technology is genuinely bad.
             | It scans images against an arbitrary government-owned
             | black-box database. There's no guarantee that it's only
             | CSAM.
        
               | OrvalWintermute wrote:
                | I have a serious problem imagining that Apple will
                | not be willing to redeploy this technology in novel
                | and nefarious ways in exchange for _continued_ market
                | access when billions are on the line.
                | 
                | Mainland China will probably be the first domino to
                | fall. Can't imagine the Ministry of State Security
                | not actively licking their lips, waiting for this
                | functionality to arrive.
        
               | websites2023 wrote:
                | NCMEC isn't owned by the government and the database
                | of hashes is available from Apple.
        
               | tjfl wrote:
               | They're not owned by the federal government, but they do
               | get a lot of federal government money.
               | 
               | > The National Center for Missing & Exploited Children(r)
               | was established in 1984 as a private, nonprofit 501(c)(3)
               | organization. Today, NCMEC performs the following 15
               | specific programs of work, funded in part by federal
               | grants (34 U.S.C. SS 11293): Source:
               | https://www.missingkids.org/footer/about
               | 
               | US DOJ OJJDP lists recent grants totaling $84,446,366 in
               | FY19 and FY20. Source: https://ojjdp.ojp.gov/funding/awar
               | ds/list?awardee=NATIONAL%2...
        
               | __blockcipher__ wrote:
               | And don't forget, it's way more than just money:
               | 
               | https://www.law.cornell.edu/uscode/text/18/2258A
               | 
               | You _must_ report to them and only them.
               | 
               | For the GP to claim they're not government "owned" is a
               | rhetorical trick at best and outright ignorant absurdity
               | at worst.
        
               | commoner wrote:
                | > NCMEC isn't owned by the government
               | 
               | Even though NCMEC describes itself as "private", it was
               | established by and has been heavily funded by the U.S.
               | government.
               | 
               | From an archive of NCMEC's own history page, cited on
               | Wikipedia (https://web.archive.org/web/20121029010231/htt
               | p://www.missin...):
               | 
               | > In 1984, the U.S. Congress passed the Missing
               | Children's Assistance Act which established a National
               | Resource Center and Clearinghouse on Missing and
               | Exploited Children. The National Center for Missing &
               | Exploited Children was designated to fulfill this role.
               | 
               | > On June 13, 1984, the National Center for Missing &
               | Exploited Children was opened by President Ronald Reagan
               | in a White House Ceremony. The national 24-hour toll-free
               | missing children's hotline 1-800-THE-LOST opened as well.
               | 
               | $40 million/year of U.S. government funding from a 2013
               | bill (https://en.wikipedia.org/wiki/Missing_Children%27s_
               | Assistanc...):
               | 
               | > The Missing Children's Assistance Reauthorization Act
               | of 2013 (H.R. 3092) is a bill that was introduced into
               | the United States House of Representatives during the
               | 113th United States Congress. The Missing Children's
               | Assistance Reauthorization Act of 2013 reauthorizes the
               | Missing Children's Assistance Act and authorizes $40
               | million a year to fund the National Center for Missing
               | and Exploited Children.
        
               | __blockcipher__ wrote:
                | That's like saying the Federal Reserve is "private".
                | No, NCMEC is not a private entity. Not only was it
                | heavily funded/created by the government, but more
                | importantly it is granted special legal status. You
                | and I can't just spin up our own CSAM database. Nor
                | do we have any laws that say that any company aware
                | of CSAM must send it to us and only us.
        
             | justin_oaks wrote:
             | It's important to keep nefarious stuff on the server side
             | because eventually someone will reverse engineer what's on
             | the client side.
             | 
             | Imagine if Apple had done this on the client side without
              | telling anyone, and later it was discovered. I think
              | things would be a whole lot worse for Apple in that
              | case.
        
               | websites2023 wrote:
               | That being the case, why do it client side at all, when
               | presumably every claim they make is verifiable?
        
           | OnlineGladiator wrote:
           | The other companies didn't advertise and pride themselves on
           | being privacy focused. Part of the appeal of Apple was that
           | you could avoid that issue and they touted it regularly. Now
           | they're telling their customers to go fuck themselves (so
           | long as they're 18 or older).
        
             | GeekyBear wrote:
              | Conducting the scan on the user's device instead of on
              | the company's server is more private.
             | 
             | Apple can't decrypt the results of the scan until the ~30
             | image threshold is crossed and a human review is triggered.
             | 
             | Given Google's reluctance to hire humans when a poorly
             | performing algorithm is cheaper, are they turning over
             | every single false positive without a human review?
        
               | [deleted]
        
               | lifty wrote:
                | I guess people had a sense of ownership of these
                | devices and they feel that it's doing something they
                | don't want. If ownership means control, maybe in the
                | digital age full ownership/control is not really
                | possible on a managed device like the iPhone. There
                | are other examples, like the John Deere tractor
                | issues.
        
               | GeekyBear wrote:
               | I have a sense of ownership over my non-publicly-
               | accessible data when it's backed up to a cloud drive.
               | 
               | Apple isn't scanning that.
               | 
               | Google and Microsoft are.
        
               | lifty wrote:
                | Sure, you own the data, but you don't have any
                | control over what's happening to it. Just like iPhone
                | users: we own the physical device but we only control
                | certain aspects of what it's doing, and the
                | boundaries are not set by us, but by Apple.
        
           | heavyset_go wrote:
           | This seems like a common deflection, but get back to me when
           | either company puts programs in my pocket that scan my data
           | for crimes and snitch on me to authorities.
        
             | bobthepanda wrote:
             | At least Google does.
             | 
             | https://protectingchildren.google/intl/en/
             | 
             | > CSAI Match is our proprietary technology, developed by
             | the YouTube team, for combating child sexual abuse imagery
             | (CSAI) in video content online. It was the first technology
             | to use hash-matching to identify known violative videos and
             | allows us to identify this type of violative content amid a
             | high volume of non-violative video content. When a match of
             | violative content is found, it is then flagged to partners
             | to responsibly report in accordance to local laws and
             | regulations. Through YouTube, we make CSAI Match available
             | for free to NGOs and industry partners like Adobe, Reddit,
             | and Tumblr, who use it to counter the spread of online
             | child exploitation videos on their platforms as well.
             | 
             | > We devote significant resources--technology, people, and
             | time--to detecting, deterring, removing, and reporting
             | child sexual exploitation content and behavior. Since 2008,
             | we've used "hashing" technology, which creates a unique
             | digital ID for each known child sexual abuse image, to
             | identify copies of images on our services that may exist
             | elsewhere.
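              | 
              | Mechanically, that kind of server-side matching boils
              | down to a set lookup (toy Python sketch; real systems
              | like PhotoDNA or CSAI Match use perceptual hashes so
              | that re-encoded copies still match, not SHA-256):
              | 
              |     import hashlib
              | 
              |     KNOWN = {hashlib.sha256(b"known-bad").hexdigest()}
              | 
              |     def scan_upload(data):
              |         digest = hashlib.sha256(data).hexdigest()
              |         return digest in KNOWN   # True -> report
              | 
              |     print(scan_upload(b"vacation photo"))  # False
              |     print(scan_upload(b"known-bad"))       # True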
        
               | arsome wrote:
               | Note that they're not pushing it to devices that
               | consumers have purchased, just running it on their cloud
               | services. I think it's an important distinction here with
               | what Apple is doing.
        
               | heavyset_go wrote:
               | No, that happens on Google's servers and isn't an agent
               | in my pocket that runs on and invades my personal
               | property. It's also only for videos.
        
               | bobthepanda wrote:
               | The images being scanned are the ones that go to iCloud.
               | 
               | The Google page has a section later down that also says
               | they use hashing of images.
        
               | [deleted]
        
               | GeekyBear wrote:
               | When the scan is carried out on device, Apple can't read
               | the results of the scan until the 30 image threshold is
               | reached.
               | 
               | When Google scans on server, a single false positive
               | result can be abused by anyone who can get a warrant.
               | 
               | >Innocent man, 23, sues Arizona police for $1.5million
               | after being arrested for murder and jailed for six days
               | when Google's GPS tracker wrongly placed him at the scene
               | of the 2018 crime
               | 
               | https://www.dailymail.co.uk/news/article-7897319/Police-
               | arre...
               | 
               | Apple's method is more private.
        
             | GeekyBear wrote:
             | >a man [was] arrested on child pornography charges, after
             | Google tipped off authorities about illegal images found in
             | the Houston suspect's Gmail account
             | 
             | https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-
             | le...
             | 
             | You don't consider the contents of your email account or
             | the files you mirror to a cloud drive to be your own
             | private data?
        
               | WA wrote:
                | Not OP, but yes. I expect that my emails can be read and
               | that my files on a cloud service can be accessed. And I
               | don't even use Gmail.
               | 
               | I expect that my ISP tracks and stores my DNS resolutions
               | (if I use their DNS) and has a good understanding of the
               | websites I visit.
               | 
               | I expect that an app that I grant access to my contacts
               | uploads as much data as it can to their servers.
               | 
               | I expect WhatsApp and similar apps to collect and upload
               | meta data of my entire photo library such as GPS info the
               | second I give them access.
               | 
               | Hence, I don't give access. And hence, it's a problem if
               | there is no opt-out of local file scanning in the future.
        
               | heavyset_go wrote:
               | No, that happens on Google's servers and is not an agent
               | in my pocket that scans and invades my personal property
               | for evidence of crimes and snitches on me to authorities.
        
               | _trampeltier wrote:
                | There is a world of difference between the two. One
                | case is about pictures on your device. The other case
                | is about pictures on Google's "devices" or network.
                | You had to upload them to Google. Email is also
                | nothing like a letter anyway. It's a postcard.
                | Everybody can read it.
        
               | GeekyBear wrote:
               | No, one case is about photos you upload to iCloud Photos
               | only, and the other case is about every single thing in
               | your Google account.
               | 
               | >So if iCloud Photos is disabled, the system does not
               | work, which is the public language in the FAQ. I just
               | wanted to ask specifically, when you disable iCloud
               | Photos, does this system continue to create hashes of
               | your photos on device, or is it completely inactive at
               | that point?
               | 
               | If users are not using iCloud Photos, NeuralHash will not
               | run
               | 
               | https://techcrunch.com/2021/08/10/interview-apples-head-
               | of-p...
        
           | Copernicron wrote:
           | Most people don't pay attention to whether or not their stuff
           | is being scanned. Those of us who do pay attention have known
           | for a long time that it's being scanned. Especially by Google
           | and Facebook. Basically anyone whose business model is based
           | off of advertising. My default assumption is anything I
           | upload is scanned.
        
         | helen___keller wrote:
         | > I just don't understand how they acted this way at all.
         | 
          | There's a simple answer to this, right? Despite everyone's
          | reaction, Apple genuinely believes this is a novel and unique
         | method to catch CSAM without invading people's privacy. And if
         | you look at it from Apple's point of view that's correct: other
         | major cloud providers catch CSAM content on their platform by
         | inspecting every file uploaded, i.e. total invasion of privacy.
         | Apple found a way to preserve that privacy but still catch the
         | bad people doing very bad things to children.
        
           | mox1 wrote:
            | But who was complaining about Google and Microsoft doing
            | the cloud scanning?
            | 
            | I don't mind my OneDrive being scanned for "bad stuff"; I
            | very much mind my personally owned data stores being
            | scanned, with no opt-out.
        
             | GeekyBear wrote:
             | The only thing Apple scans are files you upload to iCloud
             | Photos.
             | 
             | If you turn off iCloud Photos, nothing is scanned.
             | 
             | Microsoft scans everything.
             | 
             | >The system that scans cloud drives for illegal images was
             | created by Microsoft and Dartmouth College and donated to
             | NCMEC. The organization creates signatures of the worst
             | known images of child pornography, approximately 16,000
             | files at present. These file signatures are given to
             | service providers who then try to match them to user files
             | in order to prevent further distribution of the images
             | themselves, a Microsoft spokesperson told NBC News.
             | (Microsoft implemented image-matching technology in its own
             | services, such as Bing and SkyDrive.)
             | 
             | https://www.nbcnews.com/technolog/your-cloud-drive-really-
             | pr...
        
             | Spivak wrote:
              | Where's the line though? Would you be upset if the
              | Dropbox client scanned things before uploading them?
              | What about the Google Drive client JS?
        
             | nine_k wrote:
             | That's the point: catching the bad guy foolish enough to
             | keep _known_ CSAM images on their phone, while not
              | technically invading the privacy of any good guys.
              | Anyway, if Apple wanted to _covertly_ invade their
              | users' privacy, they'd have no technical problem doing
              | so.
             | 
              | What it takes to accept this is the "nothing to hide"
              | mentality: your files are safe to scan (locally) because
              | they can't be known CSAM files. You have to trust the
              | scanner. You allow the scanner to touch your sensitive
              | files because you're not the bad guy, and you want the
              | bad guy to be caught (or at least forced off the
              | platform).
             | 
             | And this is, to my mind, the part Apple wasn't very
             | successful at communicating. The whole thing should have
             | started with an educational campaign well ahead of time.
             | The privacy advantage should have been explained again and
             | again: "unlike every other vendor, we won't siphon your
              | files unencrypted for checking; we do everything locally and
             | are unable to compromise your sensitive bits". Getting
             | one's files scanned should have become a badge of honor
             | among the users.
             | 
             | But, for some reason, they tried to do it in a low-key and
              | somewhat hasty manner.
        
               | ravenstine wrote:
               | I mostly agree, though I will argue for the other side;
               | child predators don't exactly have a track record for
               | intelligence. Pedophiles still to this day get caught
               | trading CP on Facebook of all places. I think engaging in
               | that kind of activity requires a certain number of brain
               | cells misfiring. Watch some old episodes of To Catch A
               | Predator. Most of those guys were borderline retarded or
               | had obvious personality disorders.
        
           | mschuster91 wrote:
           | > Apple found a way to preserve that privacy but still catch
           | the bad people doing very bad things to children.
           | 
            | They're not going to find predators; all they are going to
            | find is people incompetent enough to have years-old
           | (otherwise how would it end up at NCMEC?) CSAM on their
           | iPhones _and_ dumb enough to have iCloud backup turned on. To
           | catch _actual_ predators they would need to run AI scanning
           | on phones for all photos and risk a lot of false positives by
           | parents taking photos of their children on a beach.
           | 
           | Plus a load of people who will inadvertently have stuff that
           | "looks" like CSAM on their devices because some 4chan trolls
           | will inevitably create colliding material and spread it in
           | ads etc. so it ends up downloaded in browser caches.
           | 
           | All of this "let's use technology to counter pedos" is utter,
           | _utter_ crap that is not going to save one single child and
           | will only serve to tear down our freedoms, one law at a time.
            | Want to fight against pedos? Take photos of your Airbnb,
           | hotel and motel rooms to help identify abuse recording
           | locations and timeframes, and teach children from an early
           | age about their bodies, sexuality and consent so that they
           | actually have the words to tell you that they are being
           | molested.
        
             | zepto wrote:
             | > people incompetent enough to have years old (otherwise
             | how would it end up at NCMEC?) CSAM on their iPhones
             | 
             | Who do you think has 30 or more CSAM images on their phone?
        
               | atq2119 wrote:
               | Let's be real, of course many of those people will be
               | child abusers.
               | 
               | But let's also be real about something else. Think of the
               | Venn diagram of child abusers and people who share CSAM
               | online. Those circles are not identical. Not everybody
               | with CSAM is necessarily a child abuser. Worse, how many
               | child abusers don't share CSAM online? Those can't ever
               | be found by Apple-style invasions of privacy, so one has
               | to wonder if we're being asked to give up significant
               | privacy for a crime fighting strategy that may not even
               | be all that effective.
               | 
               | The cynic in me is also wondering what fraction of people
               | working at places like NCMEC are pedophiles. It'd be a
               | rather convenient job for them. And after all, there's a
               | long history of ostensibly child-serving institutions
               | harboring the worst kind of offenders.
        
               | zepto wrote:
               | All CSAM is produced by child abuse. Everyone who shares
               | 30 images online is a participant in child abuse, even if
               | only by creating a demand for images.
               | 
               | > a crime fighting strategy
               | 
               | Is it a crime fighting strategy? I thought it was just
               | about not making their devices and services into _aids_
               | for these crimes.
               | 
               | > The cynic in me is also wondering what fraction of
               | people working at places like NCMEC are pedophiles.
               | 
               | Good question. What has that to do with this?
        
               | atq2119 wrote:
               | That's a tedious debate that has been rehashed to death.
               | Finding somebody who has CSAM without being a child
               | abuser doesn't save even a single child. Finding
               | _everybody_ who has CSAM without being a child abuser
               | would likely save some children (drying out commercial
               | operations, perhaps?), but is also basically impossible.
               | 
               | And how many children can really be helped in this way?
               | Surely the majority of child abuse happens independently
               | of the online sharing of CSAM.
               | 
               | And is that really worth the suspension of important
               | civil rights? That is really what this discussion is
               | about. Nobody is arguing against fighting child abuse,
               | the question is about how, and about what means can be
               | justified.
        
           | mirker wrote:
           | > Apple genuinely believe this is a novel and unique method
           | to catch CSAM without invading people's privacy
           | 
           | What's novel about this? The technique and issues with it are
           | fairly obvious to anyone with experience in computer vision.
           | It seems not too different from research from 1993 https://pr
           | oceedings.neurips.cc/paper/1993/file/288cc0ff02287...
           | 
           | The issues are also well known and encompass the whole
           | subfield of adversarial examples.
        
             | shuckles wrote:
             | Presumably the system is novel enough that a very similar
             | proposal which lacked one important insight (using
             | perceptual hashes to turn image similarity problems into
             | set intersection problems) was accepted to USENIX this
             | year: https://www.usenix.org/conference/usenixsecurity21/pr
             | esentat...
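             | 
             | The reduction itself can be shown with a toy, deliberately
             | non-private snippet (the hash values below are made up); the
             | point of the cited work and of Apple's protocol is doing this
             | intersection privately:
             | 
             |     # With every image reduced to a perceptual hash,
             |     # matching becomes plain set intersection.
             |     device_hashes = {0x3A91C2, 0x77EE01, 0x1B2C3D}
             |     blocklist = {0x77EE01, 0xDEADBE}
             |     matches = device_hashes & blocklist   # {0x77EE01}
             |     print(len(matches))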
        
           | noapologies wrote:
           | It's really interesting to see the mental gymnastics people
           | are willing to go through to defend their favorite trillion
           | dollar corporations.
           | 
           | > other major cloud providers catch CSAM content on their
           | platform by inspecting every file uploaded, i.e. total
           | invasion of privacy.
           | 
           | > Apple found a way to preserve that privacy ...
           | 
           | So scanning for CSAM in a third-party cloud is "total
           | invasion of privacy", while scanning your own personal device
           | is "preserving privacy"?
           | 
           | The third-party clouds can only scan what one explicitly
           | chooses to share with the third-party, while on-device
           | scanning is a short slippery slope away from having its scope
           | significantly expanded (to include non-shared and non-CSAM
           | content).
        
             | helen___keller wrote:
             | > defend their favorite trillion dollar corporations
             | 
             | > is a short slippery slope away
             | 
             | Obviously people who trust Apple aren't concerned about
             | slippery slopes. What's the point of your post?
        
           | carnitas wrote:
           | Apple inspects every file on the local device before it's
           | uploaded. It's just a pinky promise that files are only matched
           | against the on-device database when an upload is intended.
        
             | slg wrote:
             | Apple controls the hardware, software, and cloud service.
             | It was always a pinky promise that they wouldn't look at
             | your files. I don't know why we should doubt that pinky
             | promise less today than we did a month ago.
        
               | codeecan wrote:
               | They don't control the database used; any country can,
               | through legal means, attach additional hashes for search
               | and reporting.
               | 
               | Apple has already proven it will concede to China's
               | demands.
               | 
               | They are building the world's most pervasive surveillance
               | system, and when the world's governments come knocking to
               | use it... they will throw their hands up and feed you the
               | "Apple complies with all local laws" line.
        
               | bsql wrote:
               | Do any of the other companies control the database of
               | hashes they use? A government could have already done
               | what you're suggesting but I can't find a source where
               | this has been the case.
        
               | GeekyBear wrote:
               | > They don't control the database used
               | 
               | They control what goes into the on-device database that
               | is used.
               | 
               | >The on-device encrypted child abuse database contains
               | only data independently submitted by two or more child
               | safety organizations, located in separate jurisdictions,
               | and thus not under the control of the same government
               | 
               | https://www.techwarrant.com/apple-will-only-scan-abuse-
               | image...
        
               | ravenstine wrote:
               | Why do people expect anything different? Every corporate
               | promise is subject to change. When you hand your
               | belongings to someone else, those things are liable to be
               | tampered with.
        
               | fsflover wrote:
                | Because Apple has now confirmed themselves that this
                | promise is not kept.
        
               | slg wrote:
               | What is "this promise"? Because I would consider it "we
               | will only scan files that you upload to iCloud". That was
               | true a month ago and that would be true under this new
               | system. The only part that is changing is that the
               | scanning happens on your device before upload rather than
               | on an Apple server after upload. I don't view that as a
               | material difference when Apple already controls the
               | hardware and software on both ends. If we can't trust
               | Apple to follow their promise, their products should
               | already have been considered compromised before this
               | change was announced.
        
               | fsflover wrote:
               | > The only part that is changing is that the scanning
               | happens on your device before upload
               | 
               | This is the key point.
               | 
               | 1. What if I change my mind and decide not to upload the
               | picture?
               | 
               | 2. This is a new mechanism for scanning private pictures
               | on the device. What could go wrong?
               | 
               | > If we can't trust Apple to follow their promise, their
               | products should already have been considered compromised
               | before this change was announced.
               | 
               | Many people did trust Apple to keep their files private
               | until now.
        
               | slg wrote:
               | How are those two examples different than before? You
               | can't unupload a photo under either the old or new
               | system. I don't know why we would expect that the
               | scanning feature will be more prone to accidentally scan
               | too many photos compared to the uploading feature
               | accidentally uploading too many photos.
               | 
               | >Many people did trust Apple to keep their files private
               | until now.
               | 
               | And that was my original point. If a pinky promise from
               | Apple is not enough to trust them, then Apple should have
               | never been trusted.
        
               | fsflover wrote:
               | > You can't unupload a photo under either the old or new
               | system.
               | 
               | You can choose to upload many pictures. They will start
               | uploading. Then, you change your mind. Some pictures were
               | not uploaded yet. But they _were_ scanned by the new
               | algorithm.
        
               | zepto wrote:
               | > This is a new mechanism for scanning private pictures
               | on the device.
               | 
               | No it isn't. It's a mechanism for scanning pictures as
               | they are uploaded to iCloud Photo Library.
               | 
               | Private pictures on the device are not scanned.
        
               | fsflover wrote:
               | Pictures not uploaded yet _are_ private.
        
               | zepto wrote:
               | Not if you have opted to have them uploaded.
        
               | CoolGuySteve wrote:
               | Should they even be doing that though? It seems like a
               | matter of time before it's possible to SWAT somebody by
               | sending them a series of hash colliding image files given
               | how not cryptographically secure the hash algorithm is.
               | 
               | I think I'm not the only one who'd rather not have my
               | devices call the cops on me in a country where the cops
               | are already way too violent.
        
               | slg wrote:
               | But this doesn't change any of that. It only changes
               | whether the scanning happens on your device as part of
               | the upload process instead of on the server after the
               | upload.
        
               | CoolGuySteve wrote:
               | Yeah that's right and I don't think either method is
               | ethical.
        
           | unyttigfjelltol wrote:
           | Maybe Apple was focused on keeping CSAM _off_ a cloud
           | resource that _really_ is Apple's in both a practical and
           | technical sense. Merely scanning after upload maybe doesn't
           | accomplish that because the material already has arrived in
           | an Apple resource and someone might define that as a failure.
           | Viewed from a perspective of iCloud compliance it sorta makes
           | sense. No one ought to feel a lot of (legal) security
           | deploying a system that could be used to host CSAM, from the
           | Blockchain to whatever, it sounds like a don't-wizz-on-the-
           | electric-fence kind of risk.
           | 
           | Saddened by the privacy-averse functionality on handsets, but
           | Apple seems to be hitting the nail on the head that poor
           | _communication_ principally is driving outrage.
        
             | Syonyk wrote:
             | With the new system, it still arrives on the server, and
             | they can't even know about it until some threshold has been
             | passed - and then the servers still have to scan it.
             | 
             | So it's not even accomplishing "keeping it off the
             | servers."
        
           | speleding wrote:
           | > catch the bad people doing very bad things to children
           | 
           | You may catch a few perverts looking at the stuff, but I'm
           | not convinced this will lead to catching the producers. How
           | would that happen?
        
             | shuckles wrote:
             | Known CSAM detection was one of three features they
             | announced under the banner of child safety. Certainly they
             | understand it is by itself an incomplete intervention for a
             | much larger problem.
        
           | lixtra wrote:
           | > other major cloud providers catch CSAM content on their
           | platform by inspecting every file uploaded
           | 
           | That is very unlikely. Most likely they compare some hash
           | against a database - just like Apple.
        
             | jowsie wrote:
             | That's what they're saying; the key difference is that
             | Apple's happens on your device, not theirs.
        
               | GiorgioG wrote:
               | That's not really better or different in any meaningful
               | way.
        
               | nine_k wrote:
               | It is indeed meaningfully different: your sensitive data
               | never leaves your device in order to be scanned. There
               | can't be a crack in Apple servers that would expose files
               | of millions of users uploaded for scanning.
               | 
               | Eventually it could lead to the Apple platform not having
               | any CSAM content, or any known legally objectionable
               | content, because any sane perpetrator would migrate to
               | other platforms, and the less sane would be caught.
               | 
               | This, of course, requires the user base to overwhelmingly
               | agree that keeping legally objectionable content is a bad
               | thing that should be exposed to police, and to trust
               | whatever body defines the hashes of objectionable content.
               | I'm afraid this order is not as tall as we might imagine.
        
               | heavyset_go wrote:
               | > _your sensitive data never leaves your device in order
                | to be scanned. There can't be a crack in Apple servers
               | that would expose files of millions of users uploaded for
               | scanning._
               | 
               | And yet photos that get scanned are still uploaded to
               | iCloud Photos, so they do end up on Apple's servers.
        
               | GeekyBear wrote:
               | Photos that users choose to upload to iCloud Photos do
               | indeed get uploaded to iCloud Photos?
               | 
               | I'm failing to see the issue.
        
               | heavyset_go wrote:
               | I think you might be tilting at windmills, now. The issue
               | is that the post I'm replying to claims that data never
               | ends up on Apple's servers even though it does. Thanks
               | for confirming that it does, though.
        
               | Spivak wrote:
               | Then why the outrage if they're the same?
        
               | JimBlackwood wrote:
               | Because this is Apple and people are very passionate
               | about both hating and liking Apple.
        
               | GeekyBear wrote:
               | Apple keeps the scan results encrypted with a key they
               | don't have until the device informs them that the
               | threshold of 30 images that match known kiddie porn has
               | been reached.
               | 
               | After that, they get the decryption key and trigger a
               | human review to make sure there haven't been 30 false
               | positives at once.
               | 
               | That is better, and more private, in a very meaningful
               | way.
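                | 
                | The threshold behaviour can be sketched with plain
                | Shamir secret sharing; this is only an illustration
                | of the general idea (Apple's construction differs in
                | detail, and the values here are toys):
                | 
                |     # Toy t-of-n threshold: the secret (think:
                |     # decryption key) is unrecoverable until t
                |     # shares exist.
                |     import random
                | 
                |     P = 2**127 - 1  # prime field for the demo
                | 
                |     def make_shares(secret, t, n):
                |         # random degree t-1 polynomial with
                |         # constant term = secret
                |         c = [secret] + [random.randrange(P)
                |                         for _ in range(t - 1)]
                |         f = lambda x: sum(a * pow(x, i, P)
                |                           for i, a in enumerate(c)) % P
                |         return [(x, f(x)) for x in range(1, n + 1)]
                | 
                |     def recover(shares):
                |         # Lagrange interpolation at x = 0
                |         s = 0
                |         for i, (xi, yi) in enumerate(shares):
                |             num = den = 1
                |             for j, (xj, _) in enumerate(shares):
                |                 if i != j:
                |                     num = num * -xj % P
                |                     den = den * (xi - xj) % P
                |             s = (s + yi * num * pow(den, -1, P)) % P
                |         return s
                | 
                |     shares = make_shares(secret=42, t=30, n=1000)
                |     assert recover(shares[:30]) == 42  # 30 suffice
                |     # recover(shares[:29]) yields an unrelated value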
               | 
               | False positive scan data sitting on the server is open to
               | malicious misuse by anyone who can get a warrant.
               | 
               | >Innocent man, 23, sues Arizona police for $1.5million
               | after being arrested for murder and jailed for six days
               | when Google's GPS tracker wrongly placed him at the scene
               | of the 2018 crime
               | 
               | https://www.dailymail.co.uk/news/article-7897319/Police-
               | arre...
               | 
                | Also, I sincerely doubt that Google has ended its
               | practice of refusing to hire a human being to check for
               | false positives when a poorly performing algorithm is
               | cheaper.
        
               | orangeoxidation wrote:
               | Making it obviously and unquestionably more invasive.
        
               | Spivak wrote:
               | I just don't get this. Say you're given two options when
               | going through the TSA.
               | 
               | 1. The TSA agent opens your luggage and searches
               | everything for banned items.
               | 
               | 2. The TSA agent hands you a scanner for you to wave over
               | your luggage in private, it prints out a receipt of
               | banned items it saw, and you present that receipt to the
               | agent.
               | 
               | Which one is more invasive?
        
               | rfd4sgmk8u wrote:
                | 1. Cops on the street.
                | 
                | 2. Cops in your house.
               | 
               | Which one is more invasive?
               | 
                | I don't think there is much to not understand. Apple is
                | proposing putting a cop in your house. On-device scanning
                | is vastly, vastly different from server-side scanning. One
                | is over Apple's property, the other is over YOUR property.
                | It's a big deal, and a huge change in policy. And as myself
                | and others have indicated from years of observation, it
                | will grow and grow...
        
               | cma wrote:
               | The TSA installs this scanner in your house so it can
               | detect before you even get to the airport. They give a
               | pinky promise they will only run it when you book your
               | ticket and intend on going to the airport where you would
               | be expected to be scanned anyway.
               | 
               | The scanner technology can also be used for detecting
               | dissident journalism, but they say it won't be, it is
               | just for preventing terrorism, but still they say, they
               | _really_ need to install it in your house even though it
               | is only for scanning when you intend on going to the
               | airport.
        
               | bsql wrote:
               | > The scanner technology can also be used for detecting
               | dissident journalism, but they say it won't be, it is
               | just for preventing terrorism
               | 
               | This has always been possible with server side scanning.
               | Yeah we have to trust Apple to not enable it for users
                | who don't use iCloud Photos, but we've always had to trust
                | Apple given their closed-source system.
        
               | Aaargh20318 wrote:
                | Apple's solution is more like the TSA agent showing up at
                | your house and inspecting your luggage before you even
                | leave for the airport, and he pinky promises not to report
                | on anything else he might see while in your home.
        
               | shuckles wrote:
               | This version is incorrect because Apple's solution by
               | design does not reveal the non-suspect contents of your
               | luggage.
        
               | smnrchrds wrote:
                | 3. TSA installs scanners in all homes, and promises to
                | only use them to scan your luggage as you head to the
                | airport and nothing else.
        
               | [deleted]
        
         | camillomiller wrote:
         | Apple's management can also be prone to hubris. This is also
         | very much a case where the engineers were left unbridled
         | without any proper check from marketing and comms, I suspect
         | because of the extreme complexity of the problem and the sheer
         | impossibility of putting it into layman terms effectively.
        
           | stingraycharles wrote:
           | Maybe the implementation is difficult to put into layman's
           | terms, but the high-level goal - scanning the devices for
           | child porn - most definitely is not.
           | 
           | I'm not buying the "engineers were left unbridled" argument,
           | I think there just have been a level of obliviousness in much
           | wider a part of the organization for something like this to
           | happen.
        
           | bhawks wrote:
           | Was this a pure engineering driven exercise? I hadn't heard
           | that before and it doesn't match up with what I've heard
           | about Apple's internal culture. The level of defensive
           | pushback really makes me feel that they are very bought into
           | the system.
           | 
           | Definitely would appreciate a link to anything substantial
           | indicating that this was a bunch of eng in over their heads.
        
             | emptysongglass wrote:
             | Also have to agree: I don't see how this could originate
             | from engineers. Every engineer I've spoken with at the
             | company I work for has been mortified by this insanity.
        
               | sorrytim wrote:
               | It's really unpopular inside the fruit company. I've not
               | spoken with a single engineer in Cupertino who thinks
               | this is a good path. There has been radio silence from
               | management on how to address this. It almost feels like
               | they want this to fail. In the past management has given
               | us resources and talking points we can use with our
               | friends and family. Not this time.
        
               | orangeoxidation wrote:
               | > In the past management has given us resources and
               | talking points we can use with our friends and family.
               | 
               | Wow. That feels like an overreach. 'We don't just buy
               | your labor, but also your private opinion and tell you
               | how to talk to your family'.
        
               | dylan604 wrote:
               | This doesn't hold much water. As a counter example, we
               | have all of the engineers that designed Facebook, Google,
               | etc. We have people working in ad tech. We have people
                | working in lots of places that would mortify you and the
                | people you talk with directly, but there are countless
                | others who need a paycheck and do whatever pointy-haired
                | bosses tell them to do.
        
         | nojito wrote:
          | Why are hash collisions relevant?
          | 
          | There are at least 2-3 further checks to account for this.
        
           | foobiekr wrote:
           | your trust in the system is charming
        
           | fraa-orolo wrote:
            | Because it's not a cryptographic hash, where a one-bit
            | difference results in a completely different hash. It's a
            | perceptual hash that operates on a smaller bitmap derived
            | from the image, so it's plausible that some innocuous images
            | might result in similar derivations; and there might be
            | intentionally crafted innocent-looking images that result
            | in an offensive derivative.
           | 
           | Salvador Dali could do something similar by hand in 1973 in
           | _Gala Contemplating the Mediterranean Sea_ [1]
           | 
           | [1] https://en.wikipedia.org/wiki/Lincoln_in_Dalivision
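            | 
            | For intuition, here is a toy example using a classic "average
            | hash" rather than Apple's NeuralHash (the file names are
            | placeholders); it shows why a perceptual hash survives small
            | edits while a cryptographic hash does not:
            | 
            |     # Classic aHash: shrink to an 8x8 grayscale bitmap and
            |     # threshold each pixel against the mean. Near-duplicates
            |     # land within a few bits of each other.
            |     from PIL import Image
            |     import hashlib
            | 
            |     def average_hash(path, size=8):
            |         img = Image.open(path).convert("L")
            |         px = list(img.resize((size, size)).getdata())
            |         avg = sum(px) / len(px)
            |         bits = "".join("1" if p > avg else "0" for p in px)
            |         return int(bits, 2)
            | 
            |     def hamming(a, b):
            |         return bin(a ^ b).count("1")
            | 
            |     h1 = average_hash("cat.jpg")             # placeholder files
            |     h2 = average_hash("cat_watermarked.jpg")
            |     print(hamming(h1, h2))        # small for near-duplicates
            |     # ...whereas the cryptographic digest changes completely:
            |     with open("cat.jpg", "rb") as f:
            |         print(hashlib.sha256(f.read()).hexdigest())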
        
             | __blockcipher__ wrote:
             | This is a great answer but that's not actually the GP's
             | contention. Their argument is essentially "so what if
             | there's a collision, the human review will catch it". And
             | to that I'd say that the same is supposed to occur for the
             | no-fly list and we all know how that works in practice.
             | 
              | The mere accusation of possessing CSAM can be life-ruining
              | if it gets to that stage. More importantly, a
             | collision will effectively allow warrantless searches, at
             | least of the collided images.
        
               | fraa-orolo wrote:
               | Indeed, I touched on that in another comment:
               | https://news.ycombinator.com/item?id=28227141
        
         | justapassenger wrote:
         | > For a company that marketed itself as one of the few digital
         | service providers that consumers could trust, I just don't
         | understand how they acted this way at all.
         | 
          | Because the privacy stance is mostly PR to differentiate from
          | Google. And while there are invalid reasons to get users' data,
          | there are also valid ones (at least from a legal-requirement
          | point of view - let's not get into the weeds about personal
          | freedom here and whether the laws and their implementations
          | need to be changed).
          | 
          | Their PR was just writing checks they cannot cash without
          | going to war with governments.
        
           | istingray wrote:
           | In retrospect this makes sense to me. I don't like it, but I
           | get it now. When Apple said "privacy" what they meant was "we
           | don't like tracking cookies or hackers but everything else is
           | fine".
        
             | websites2023 wrote:
             | Privacy means "you pay for the device so we don't sell your
             | attention to advertisers." That's it. There's no protection
             | against state level actors (Pegasus, CSAM scanning, etc.).
             | 
             | If your threat model includes being the target of someone
             | who will plant child pornography on your phone, you are
             | already fucked. And no, Apple isn't suddenly going to scan
             | Chinese iPhones for Winnie the Pooh memes. They don't have
             | to. China already has the 50 cent party to do that for
             | them, on WeChat.
             | 
             | Basically everything everyone seems to think is just around
             | the corner has already been possible for years.
        
               | shuckles wrote:
               | Is there a consumer computing device that's more secure
               | than iPhone against state actors? A Chromebook, maybe?
        
               | websites2023 wrote:
               | There is no consumer grade device that is secure against
               | state actors.
        
           | n8cpdx wrote:
           | It was obvious from the beginning that the privacy nonsense
           | was a convenient excuse to cover for their poor (especially
           | at the time) cloud offerings compared to competitors like
           | Google.
           | 
           | I assumed they pivoted to focus on privacy, but clearly it
           | was just a marketing department innovation rather than a core
           | value (as their marketing department claimed).
        
         | FabHK wrote:
         | > What's shocking to me is how little Apple management
         | understood of what their actions looked like. Really stunning.
         | 
         | Maybe because they underestimated people's ignorance.
         | 
         | I think they saw (and still do see) it as a better, more
         | privacy preserving technique than what everyone else is doing.
        
           | fraa-orolo wrote:
           | The thing is that this _is_ a better and more privacy
           | preserving technique.
           | 
            | Their hubris is in not seeing the kinds of abuse that the
            | mere existence of this system will invite, or in thinking
            | that they will be able to stand up to them; their hubris is
            | also in thinking that they will be able to manage and
            | oversee, perfectly and without mistakes, a system making
            | accusations so heinous that even the mere act of accusing
            | destroys people's lives and livelihoods.
        
             | shuckles wrote:
             | What abuses apply to CSAM scanning in this hybrid pipeline
             | which don't apply to iCloud Backup? If they scanned on the
             | server, why couldn't governments "slippery slope" abuse the
             | system by requiring that all files on iOS end up in iCloud
             | Backup, where they can be scanned by the system?
             | 
             | Since the announcement, I can think of a dozen ways Apple
             | could be easily forced into scanning all the contents of
             | your device by assembling features they've already shipped.
             | Yet they haven't. At some point, people need to produce
             | evidence that Apple cannot hold the line they've said they
             | will.
        
               | candiodari wrote:
               | Exactly. Also people are strongly objecting that their
               | own device is being used to report them to law
               | enforcement. Surely someone at Apple noticed this
               | beforehand ...
        
         | duxup wrote:
         | I'm kinda amazed.
         | 
         | I mentioned I was thinking of moving from an Android phone to
         | Apple soon, somewhat privacy related.
         | 
         | My friends lectured me on "they're scanning your photos" ...
         | meanwhile they share their google photos albums with me and
         | marvel about how easy they are to search ...
         | 
         | Maybe we (humans) only get outraged based on more specific
         | narratives and not so much the general topics / issues?
         | 
         | I don't know but they didn't seem to notice the conflict.
        
           | kzrdude wrote:
            | Isn't there a small difference between these?
            | 
            | A) They scan everything I have released to Google Photos.
            | B) They scan everything that exists on my device.
           | 
           | Psychologically, you'll feel a difference in what you accept
           | between the two, I think
        
             | shapefrog wrote:
             | > B) They scan everything that exists on my device
             | 
             | No ... They scan everything that I have released to apple
             | photos that exists on my device.
             | 
             | Same scan - different place.
        
               | __blockcipher__ wrote:
               | The issue is that as soon as you set that precedent, it's
               | only a matter of time before it extends beyond iCloud.
               | That's the problem with doing any device-level scanning.
               | This is dystopian and scary. And yes I understand the
               | technology in its current iteration. The current form has
               | problems (weaponizing collisions etc) but the real issue
               | comes with future developments.
        
               | shapefrog wrote:
               | > They scan everything that exists on my device
               | 
               | I get to select the issue and it was in response to the
               | previous claim that 'they' are scanning everything that
               | exists on the device right now.
               | 
               | A year ago this was a possible 'future development'. 10
               | years from now I could be living on Mars. This is all
               | hypothetical.
        
               | heavyset_go wrote:
               | > _This is all hypothetical._
               | 
               | Exactly. However, people don't seem to have an issue when
               | hypothesizing that CSAM detection is good actually,
               | because Apple might implement E2EE, despite no evidence
               | of such intentions.
               | 
               | For some reason, though, people take issue when others
               | hypothesize that CSAM detection is bad actually, because
               | Apple might expand it further and violate users' privacy
               | even more. And there's actually precedent for this
               | hypothesis, given Apple's actions here and their own
               | words[1]:
               | 
               | > _This program is ambitious, and protecting children is
               | an important responsibility. These efforts will evolve
               | and expand over time._
               | 
               | [1] https://www.apple.com/child-safety/
        
               | robertoandred wrote:
               | Was scanning server side not a precedent?
        
               | shapefrog wrote:
               | Furthermore, was uploading (backing up) the entire
                | contents of your phone to a server to be freely scanned
               | and distributed (with warrant of course) server side not
               | also a precedent?
               | 
                | Must have been absolutely nuts round here back in 2011
               | when they announced the particular slippery slope that is
               | iCloud backup.
        
               | Multicomp wrote:
               | It was but it was one we were not in a position to resist
               | at the time. Surrendered to almighty convenience.
               | 
                | But this appears to be beyond the pale, such that people
                | are saying "this far, no farther!"
        
               | __blockcipher__ wrote:
               | It set the precedent for scanning server side. Which I
               | don't _like_ but there's really no way to avoid it
               | anyway.
               | 
               | Now we have clientside scanning, and actually being led
               | by the [note the scare quotes] "pro-privacy" two ton
               | gorilla. It's a whole different ballgame.
        
             | gmueckl wrote:
             | Isn't the default on Android these days that all images get
             | uploaded ("backed up") to Photos? And how many users are
             | actually altering defaults?
        
             | whoknowswhat11 wrote:
              | As has been repeated over and over, Apple only scans photos
              | that are part of iCloud Photos (i.e., uploaded).
              | 
              | Don't want your photos scanned? Don't sync them to iCloud.
             | Seriously! Please include the actual system when discussing
             | this system, not your bogeyman system.
             | 
             | "To help address this, new technology in iOS and iPadOS*
             | will allow Apple to detect known CSAM images stored in
             | iCloud Photos."
             | 
             | To increase privacy - they perform the scan on device prior
             | to upload.
        
               | Johnny555 wrote:
                |  _As has been repeated over and over, Apple only scans
                | photos that are part of iCloud Photos (i.e., uploaded)._
               | 
               | "for now"
               | 
               | Which is the part most people have a problem with -- they
               | say that they are only scanning iCloud uploads now, but
               | it's simple extension of the scanner to scan all files.
               | 
               | I don't care if Apple scans my iCloud uploads on iCloud
               | servers, I don't want them scanning photos on my device.
        
               | smnrchrds wrote:
               | I will believe them if they put their money where their
                | mouth is: add a clause to the iOS user agreement saying
                | that if they ever use this functionality for anything
                | other than CSAM, or on anything other than iCloud Photos,
                | every person who was subjected to this scan will be paid
                | 100 million dollars by Apple. I will believe them if they
                | put this clause in, and I will know they have changed
                | their plans when they remove the clause. No more
               | pinky swears, let's add some stakes for breaking the
               | promise.
        
               | Johnny555 wrote:
               | I'd be satisfied with a money back guarantee -- if they
               | change the policy then I can return the phone for a full
               | refund.
        
               | kcb wrote:
               | Until one day a box pops up. It says "We've updated our
               | terms. See this 10,000 line document here. Please accept
               | to continue using your device." Then your clause is gone.
        
               | smnrchrds wrote:
               | I'm fine with that. Apple is a big company and changing
               | its TOS will be instantly reported on. It will act as a
               | canary of sorts to know when they turn evil and we know
               | to stop using their products.
        
             | duxup wrote:
             | Google does everything they can to backup your photos ...
             | and do it automatically.
             | 
             | I'm not sure there's a real difference unless you want to
             | watch your settings all the time. In google land they tend
             | to reset ... and really that happens a lot of places.
             | 
             | I think for most people if you use google, you're in their
             | cloud.
        
         | zepto wrote:
          | Consumers are probably in favor of this, or don't care.
         | 
         | The only people who are bothered are people claiming this is
         | going to be misused by authoritarian governments.
        
         | daxuak wrote:
          | They are very likely aware of the backlash, but since this is
          | an easily defendable hill with a very slippery slope down the
          | way, it is in their interest to push for it IMHO.
        
           | floatingatoll wrote:
           | Apple has declared in interviews that the slope shall not be
           | slipped, but you're indicating that they chose this
           | _specifically_ because they can slip that slope.
           | 
           | How did you determine that their intentions contradict their
           | words? Please share the framework for your belief, so that
           | we're able to understand how you arrived at that belief and
           | to evaluate your evidence with an open mind.
           | 
           | (Or, if your claim is unsupported conjecture, please don't
           | misrepresent your opinions and beliefs as facts here at HN.)
        
             | monadgonad wrote:
             | "IMHO" means "in my humble opinion"
        
               | floatingatoll wrote:
               | I'm glad to see that there now, thanks!
        
         | floatingatoll wrote:
         | Right now, it seems like there are two specific groups of
         | people that are upset with Apple: Freedom evangelists (e.g.
         | EFF) and tech futurists (e.g. HN). They're saying, essentially:
         | 
         | "Apple does not have my permission to use my device to scan my
         | iCloud uploads for CSAM"
         | 
         | and
         | 
         | "This is a slippery slope that could result in Apple enforcing
         | thoughtcrimes"
         | 
          | Neither of these viewpoints is particularly agreeable to the
         | general public in the US, as far as I can determine from my
         | non-tech farming city. Once the fuss in tech dies down, I
         | expect Apple will see a net _increase_ in iCloud adoption --
          | all the fuss we're generating is free advertising for their
         | efforts to stop child porn, and the objections raised are too
         | domain-specific to matter.
         | 
         | It's impossible to say for certain _which_ of your outcomes
          | will occur, but there are definitely two missing from your list.
         | Corrected, it reads:
         | 
         | "Either there will be heads rolling at management, or Apple
         | takes a permanent hit to consumer trust, or Apple sees no
         | effect whatsoever on consumer trust, or Apple sees a permanent
         | boost in consumer trust."
         | 
         | I expect it'll be "no effect", but if I had to pick a second
         | guess, it would be "permanent boost", well offsetting any
         | losses among the tech/free/lib crowd.
        
       | ALittleLight wrote:
       | Would this work as an attack?
       | 
       | 1. Get a pornographic picture involving young though legal actors
       | and actresses.
       | 
        | 2. Encode a nonce into the image. Hash it, checking for CSAM
        | collisions. If you've found a collision, go on to the next step;
        | if not, update the nonce and try again.
       | 
       | 3. You now have an image that, to visual inspection will appear
       | plausibly like CSAM, and to automated detection will appear like
       | CSAM. Though, presumably, it is not illegal for you to have this
       | image as it is, in fact, legal pornography. You can now text this
       | to anyone with an iPhone who will be referred by Apple to law
       | enforcement.
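        | 
        | A rough sketch of the search loop in step 2, only to show the
        | shape of the attack. Here `neural_hash` and `target_hashes` are
        | hypothetical stand-ins (neither the model nor the hash list is
        | supposed to be available to an attacker), and the appended-bytes
        | "nonce" is a simplification, since a perceptual hash works on
        | decoded pixels rather than raw bytes:
        | 
        |     import secrets
        | 
        |     def embed_nonce(image_bytes: bytes, nonce: bytes) -> bytes:
        |         # Simplification: appending bytes won't move a perceptual
        |         # hash; a real attempt would perturb pixels instead.
        |         return image_bytes + nonce
        | 
        |     def find_collision(image_bytes, target_hashes, neural_hash,
        |                        max_tries=10**6):
        |         # neural_hash: hypothetical callable returning the
        |         # perceptual hash of an encoded image.
        |         for _ in range(max_tries):
        |             cand = embed_nonce(image_bytes, secrets.token_bytes(8))
        |             if neural_hash(cand) in target_hashes:
        |                 return cand
        |         return None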
        
         | shapefrog wrote:
          | Your hash will match to, say, image 105 of the dataset. Upon
         | visual inspection, your 'legal porn' is going to have to at
         | least look passably like image 105 of the dataset to get
         | anywhere.
         | 
         | So at this point we have an image that computers think is CSAM
         | and people think is CSAM, and when held up next to the original
          | verified horrific image, everyone agrees it is the same image. At
         | this point, someone is going to ask, rightly so, where that
         | came from.
         | 
         | In order to generate this attack, you have had to go out and
         | procure, deliberately, known CSAM. Ignoring that it would be
         | easier just to send that to the target, rather than hiring
         | talent to recreate the pose of a specific piece of child porn
         | (or 30 pieces to trigger the reporting levels), the most likely
         | person by orders of magnitude to be prosecuted in this scenario
         | is the attacker.
        
           | rlpb wrote:
           | > Upon visual inspection, your 'legal porn' is going to have
           | to at least look passably like image 105 of the dataset to
           | get anywhere.
           | 
           | Define "get anywhere". Why won't you get raided by the police
           | and have all your devices seized first?
        
             | shapefrog wrote:
              | No, it probably wouldn't even get to the desk of the police.
              | The end of the road would be NCMEC, to whom Apple would
              | refer hash-matching images that pass human verification as
              | being close enough to porn.
             | 
              | If your 30 or so hash-matching images matched their
              | corresponding known CSAM, then that goes on to the police,
              | and then they knock on your door.
        
         | Ashanmaril wrote:
         | If you're gonna go through that much effort, you might as well
         | just text someone an actually illegal image from something
         | other than an iPhone
         | 
         | You're already attempting to frame someone for a crime, might
         | as well commit another crime while you're at it
        
         | arn wrote:
         | How would you know if it's a CSAM collision? I don't believe
         | that database is publicly available anywhere, for obvious
         | reasons.
        
           | arsome wrote:
           | Apple's client clearly has some revealing information on
           | that...
        
           | ALittleLight wrote:
           | Good point. I can offer three possibilities to how you might
           | know. First, a data leak of the material. Second, law
           | enforcement presumably gives hashes of known CSAM to service
           | operators so they can detect and report it. You could pose as
           | or be the operator of such a service and get the hashes that
           | way. Third, if you were a government operator you may have
           | access to the hashes that way. (Although I guess corrupt
           | government agents would have other easier ways of getting to
           | you)
        
           | FabHK wrote:
           | Yes. That, plus it's unclear whether you can create a
           | collision with a nonce. It's a perceptual hash, not a
           | cryptographic one. Lastly, to compute that collision might be
           | so expensive that you could instead compute small SHA-256
            | hashes, i.e. mine BTC, and use the money to obtain your
           | sinister goals via other means.
        
         | robertoandred wrote:
         | Why would you save to your personal photo library some porn
         | that a creepy rando sent to you in a text?
        
       | Barrin92 wrote:
        | It's kind of silly how this is treated like some deeply technical
        | issue. Apple's 'one in a trillion' claim is either true and the
        | software is useless, because I'm pretty sure pedophiles can
        | figure out what a watermark or a Gaussian blur in Paint.NET is,
        | or it's imprecise and actually detects things, which is the very
        | thing that makes it dangerous to the public.
        | 
        | It's a direct trade-off, and the error tolerance of any such
        | filter is the only thing that makes it useful, so we can basically
        | stop arguing about the depths of implementation details or how
        | high the collision rate of the hashing algorithm is, etc. If this
        | thing is supposed to catch anyone, it _needs_ to be orders of
        | magnitude more lenient than any of those minor faults.
        
         | 734129837261 wrote:
         | Exactly right. The tech Apple uses can be one of two things:
         | 
          | 1. It requires a perfect 1:1 match (their documentation says
          | this is not the case); or
          | 
          | 2. It has some freedom in detecting a match, probably
          | accepting anything above a certain similarity threshold.
         | 
         | If it's the former, it's completely useless. A watermark or a
         | randomly chosen pixel with a slightly different hue and the
         | hash would be completely different.
         | 
         | So, it's not #1. It's going to be #2. And that's where it
         | becomes dangerous. The government of the USA is going to look
         | for child predators. The government of Saudi Arabia is going to
         | track down known memes shared by atheists, and they will be put
         | to death; heresy is a capital offence over there. And China
         | will probably do their best to track down Uyghurs so they can
         | make the process of elimination even easier.
         | 
         | It's not like Apple hasn't given in to dictatorships in the
         | past. This tech is absolutely going to kill people.
        
           | endisneigh wrote:
           | What? Why would memes be on the CSAM list?
        
             | visarga wrote:
              | They have to scan for whatever local law requires them to
              | scan.
        
         | Ashanmaril wrote:
         | > I'm pretty sure pedophiles can figure out what a watermark or
          | a Gaussian blur in Paint.NET is, or it's imprecise and actually
         | detects things which is the very thing that makes it dangerous
         | to the public
         | 
         | Or better yet, they can just not store their stuff on an
          | iPhone. Meanwhile, millions of innocent people have their
          | photos scanned and risk being reported as pedophiles.
        
       | trident5000 wrote:
       | Erosion of privacy and freedom always starts with "but think of
       | the children"
        
       | zakember wrote:
       | Is no one going to talk about how Apple is implementing this?
       | 
       | If Apple is training a neural network to detect this kind of
       | imagery, I would imagine there to be thousands, if not millions
       | of child pornography images on Apple's servers that are being
       | used by their own engineers to train this system
        
         | zimpenfish wrote:
         | > If Apple is training a neural network to detect this kind of
         | imagery
         | 
         | NCMEC generate the hashes using their CSAM corpus.
        
         | arn wrote:
         | It's not a neural network. It's not AI. It's not trying to
         | interpret image content. It's trying to identify known specific
         | images, but is using a fuzzy fingerprinting match.
        
       | dang wrote:
       | Ongoing related threads:
       | 
       |  _Hash collision in Apple NeuralHash model_ -
       | https://news.ycombinator.com/item?id=28219068 - Aug 2021 (542
       | comments)
       | 
       |  _Convert Apple NeuralHash model for CSAM Detection to ONNX_ -
       | https://news.ycombinator.com/item?id=28218391 - Aug 2021 (155
       | comments)
        
       | stevenalowe wrote:
       | Apple wants to enforce one of their corporate policies at my
       | expense: battery, CPU, and unknown risks if/when they're wrong.
       | What could possibly go wrong?
        
       | kevin_thibedeau wrote:
       | How long until thishashcollisionisnotporn.com is a thing?
        
         | istingray wrote:
         | sounds like a future NFT market
        
           | eurasiantiger wrote:
           | Sounds like hipster T-shirts with hash collision images.
        
         | mono-bob wrote:
         | Sounds like a fun weekend project.
        
       | 1vuio0pswjnm7 wrote:
        | Perhaps I have missed this in all the discussions of Apple's
        | latest move, but has anyone considered the following questions.
       | 
       | Does Apple's solution only stop people from uploading illegal
       | files to Apple's servers or does it stop them from uploading the
       | files to any server.
       | 
       | If Apple intends to control the operation of a computer purchased
       | from Apple after the owner begins using it, does Apple have a
       | duty to report illegal files found on that computer and stop them
       | from being shared (anywhere, not just through Apple's
       | datacenters).
       | 
       | To me, this is why there is a serious distinction between a
       | company detecting and policing what files are stored on their
       | computers (i.e., how other companies approach this problem) and a
       | company detecting and policing what files someone else's computer
       | is storing and can transfer over the internet (in this case,
       | unless I am mistaken, only to Apple's computers).
       | 
       | Mind you, I am not familiar with the details of exactly how
       | Apple's solution works nor the applicable criminal laws so these
       | questions might be irrelevant. However I was thinking that if
       | Apple really wanted to prevent the trafficking of ostensibly
       | illegal files then wouldn't Apple seek to prevent their transfer
       | not only to Apple's computers but to _any_ computer (and also
       | report them to the proper authorities). What duty does Apple have
       | if they can  "see into the owner's computer" and they detect
       | illegal activity. If Apple is in remote control of the computer,
       | e.g., they can detect the presence/absence of files remotely and
       | allow or disallow full user control through the OS, then does
       | Apple have a duty to take action.
        
         | CubsFan1060 wrote:
         | What has been stated is that this all _only_ happens as photos
          | are about to be uploaded to iCloud Photos (not to be confused
          | with iCloud Backup, a totally separate thing). This is
          | currently done server-side on iCloud Photos. This appears to
          | move it client-side.
        
         | tejohnso wrote:
         | > Does Apple's solution only stop people from uploading illegal
         | files to Apple's servers or does it stop them from uploading
         | the files to any server.
         | 
          | Perhaps they're trying to prevent being in "possession" of
         | illegal images by having them on their own servers, rather than
         | preventing copying to arbitrary destinations.
        
         | ysavir wrote:
         | > does Apple have a duty to report illegal images found on that
         | computer and stop them from being shared
         | 
         | Duty? No, that's the secondary question. The primary question
         | is whether they have the right.
        
         | judge2020 wrote:
         | > Does Apple's solution only stop people from uploading illegal
         | images to Apple's servers or does it stop them from uploading
         | the images to any server.
         | 
         | Only applies to iCloud Photos uploads, but the photos are still
         | uploaded: when there's a match, the photo and a 'ticket' are
         | uploaded and Apple's servers (after the servers themselves
         | verify the match[0]) send the image to human reviewers to
         | verify the CSAM before submitting it to police as evidence.
         | 
         | 0: https://twitter.com/fayfiftynine/status/1427899951120490497
         | and https://www.apple.com/child-
         | safety/pdf/CSAM_Detection_Techni...
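          | 
          | The client-side half of that flow looks roughly like the
          | sketch below. Every name is a placeholder and the plain dict
          | stands in for what is really an encrypted "safety voucher";
          | it only shows that the photo and the ticket travel together:
          | 
          |     def upload_with_voucher(photo, blocklist, phash,
          |                             upload, send_voucher):
          |         # phash/upload/send_voucher: placeholder callables
          |         h = phash(photo)            # computed on device
          |         voucher = {"hash": h, "match": h in blocklist}
          |         upload(photo)               # photo uploads either way
          |         send_voucher(voucher)       # server re-verifies; human
          |                                     # review only starts past
          |                                     # the match threshold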
        
       | guerrilla wrote:
       | Something just occurred to me. If this goes forward, and then
       | Microsoft and/or Google copy it, then we might see lawmakers
       | expecting it. If that becomes the case, then it'd be yet another
       | barrier to entry (in addition to a cop on every phone.)
       | 
        | Isn't that where we already are with things like Article 17 of
        | the EU's Copyright Directive?
        
         | shuckles wrote:
         | This is already the world we live in, with vendors like Thorn
         | building SaaS solutions for people who accept UGC but don't
         | have the clout to integrate with NCMEC and Microsoft PhotoDNA
         | directly.
        
       | almostdigital wrote:
       | It's clear now that Apple is never going to back down. It's
       | bittersweet, I'm sad to see them go but also excited for what I
       | think will be a FOSS renaissance.
        
       | yeldarb wrote:
       | I created a proof of concept showing how OpenAI's CLIP model can
       | function as a "sanity check" similar to how Apple says their
       | server-side model works.
       | 
       | In order for a collision to get through to the human checkers,
       | the same image would have to fool both networks independently:
       | 
       | https://blog.roboflow.com/apples-csam-neuralhash-collision/
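        | 
        | The gist of the check, with placeholder callables (`phash` and
        | `second_model` are not real APIs; the latter stands in for the
        | CLIP-based scorer from the post):
        | 
        |     def needs_human_review(image, blocklist, phash,
        |                            second_model, threshold=0.9):
        |         # phash: perceptual hash; second_model: independent
        |         # network returning a similarity score in [0, 1].
        |         if phash(image) not in blocklist:
        |             return False          # no hash match at all
        |         # a crafted collision must also fool the second,
        |         # unrelated model before a reviewer ever sees it
        |         return second_model(image) >= threshold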
        
         | mrits wrote:
         | Cool project. Wouldn't padding all generated images with a few
         | items that match CLIP get around this though?
        
           | yeldarb wrote:
           | Doing that would mean the NeuralHash would change though. And
           | you'd have to not only get CLIP to identify CSAM in the
           | generated image but also negate the parts of the generated
           | image that are causing CLIP to label it as "generated" (while
           | still colliding with the target NeuralHash).
           | 
           | Unclear how hard this would actually be in practice (if I
           | were going to attempt it, the first thing I'd try is to
           | evolve a colliding image with something like CLIP+VQGAN) but
           | certainly harder than finding a collision alone.
        
             | salawat wrote:
             | Take 2 CSAM pictures, known. Combine into one image.
             | 
             | Swing and a miss. Not in the CSAM dataset. Take two images.
             | Encode alternating pixels. Decode to get the original image
             | back. Convert to different encodings, print to PDF or
             | PostScript. Encode as base64 representations of the image
             | file...
             | 
             | Who are we trying to fool here? This is kiddie stuff.
             | 
             | This is more about trying to implant scanning capabilities
             | on client devices.
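             | 
             | Purely to illustrate how trivial that kind of re-encoding
             | is, here is a minimal sketch (file names are placeholders,
             | and it assumes two equally sized images saved in a
             | lossless format like PNG):
             | 
             |   import numpy as np
             |   from PIL import Image
             | 
             |   def interleave(path_a, path_b, out_path):
             |       a = np.asarray(Image.open(path_a).convert("RGB"))
             |       b = np.asarray(Image.open(path_b).convert("RGB"))
             |       assert a.shape == b.shape
             |       h, w, c = a.shape
             |       mixed = np.empty((h, w * 2, c), dtype=a.dtype)
             |       mixed[:, 0::2] = a  # even columns: image A
             |       mixed[:, 1::2] = b  # odd columns: image B
             |       Image.fromarray(mixed).save(out_path)
             | 
             |   def deinterleave(path, out_a, out_b):
             |       m = np.asarray(Image.open(path).convert("RGB"))
             |       Image.fromarray(m[:, 0::2]).save(out_a)
             |       Image.fromarray(m[:, 1::2]).save(out_b)
             | 
             | The combined file fingerprints nothing like either source,
             | yet both come back untouched with the second function.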
        
               | rootusrootus wrote:
               | > This is more about trying to implant scanning
               | capabilities on client devices.
               | 
               | Did we think they didn't already have that ability?
        
               | salawat wrote:
               | No. The brazen part of this is Apple trying to sell the
               | public on this being okay, effective to the point of
                | mitigating the abuse potential. The false negative AND
                | false positive rates have to counterweight the existence
                | of the vector for abuse. That's what you're measuring
                | against.
               | You can't even guarantee that. This isn't even close to a
               | safe or effective move. The risk management on this is
               | hosed.
        
               | yeldarb wrote:
               | Those are examples of triggering false negatives (CSAM
                | that gets missed by the system); this collision attack is
               | about triggering false positives (non-CSAM that triggers
               | human intervention).
        
       | aborsy wrote:
       | With the rise of end-to-end encrypted data and messaging, such as
       | Signal and WhatsApp, it makes sense for governments to shift the
       | search to users' devices and operating systems rather than the
       | cloud, before stuff gets encrypted.
        
       | gjsman-1000 wrote:
       | For everyone upset about Apple's CSAM scanning, I think we all
       | forgot about the EARN IT Act. It nearly passed last year, but the
       | congressional session ended before it could be voted on. It had
       | bipartisan support and would've virtually banned E2E of any kind.
       | And it would have required scanning everywhere according to the
       | recommendations of a 19-member board of NGOs and unelected
       | experts. The reason for this mandatory backdoor was child
       | safety... even though AG Barr admitted that reducing the use of
       | encryption had useful side effects.
       | 
       | If you are Apple, even though EARN IT failed... you know where
       | Washington's heart lies. Is CSAM scanning a "better alternative",
       | a concession, an appeasement, a lesser evil, in the hope this
       | prevents EARN IT from coming back?
       | 
       | Also, many people forgot about the Lawful Access to Encrypted
       | Data Act of 2020, or LAED, which would have _unilaterally banned_
       | E2E encryption _in its entirety_ and required that all devices
       | featuring encryption _be unlockable_ by the manufacturer. That
       | also was on the table.
        
         | samename wrote:
         | Why can't we have the alternative we already have, where we
         | don't have our phones scanned and E2EE exists without
         | compromise? Why do we have to accept it any other way?
         | 
         | If you're trying to frame this as "we need to prevent Congress
         | from ever passing something like the EARN IT act", I agree.
         | Apple and other tech companies already lobby Congress. Why
         | aren't they lobbying for encryption?
        
           | gjsman-1000 wrote:
           | They did: they lobbied _heavily_ against EARN IT, and so did
           | groups like the ACLU and EFF. However, it didn't stop
           | Congress members from moving it along through the process -
           | they were only stopped because Congress just ran out of time.
           | It was clear that Congress was not interested in listening at
           | all to the lobbyists on the issue.
           | 
           | It's clear that EARN IT could literally be revived any day if
           | Apple didn't do something to say "we don't need it because
           | we've already satisfied your requirements."
        
             | orangecat wrote:
             | _It's clear that EARN IT could literally be revived any
             | day if Apple didn't do something to say "we don't need it
             | because we've already satisfied your requirements."_
             | 
             | Alternatively, "Apple has shown that it's possible to do
             | without undue hardship, so we should make everyone else do
             | it too".
        
               | gjsman-1000 wrote:
               | Congress doesn't give a rip about "undue hardship."
               | Otherwise why would we pay income tax?
               | 
               | They were going to legally _mandate_ that everything be
               | scanned through methods less private than the ones Apple
               | has developed here, through EARN IT and potentially LAED
               | (which would have banned E2E in all circumstances and any
               | device that could not be unlocked by the manufacturer).
               | While that crisis was temporarily averted, the risk of it
               | coming back was and is very real.
               | 
               | Apple decided to get ahead of it with a better solution,
               | even though that solution is still bad. It's a lesser
               | evil to prevent the return of something worse.
        
             | cwizou wrote:
             | While I broadly agree that this (and other attempts in
             | Australia and rumblings about it in UK) may have been the
             | impetus for developing the technology, I'm not sure it
             | explains why they released it on Photos, where Apple holds
             | the keys anyway (and that protection doesn't go very far,
             | since you can look at your entire photo roll on icloud.com
             | with just your username and password).
             | 
             | That tech is not being deployed on iMessage which is the
             | only e2ee(ish) service from Apple (with Keychain) and is
             | what those legislative attempts are usually targeting. One
             | could argue it would have made sense (technically) there
             | though, sure.
             | 
             | Was it a reason to release it preventively, on something
             | unrelated, to be in the good graces of legislators? I'm
             | not sure it's a good calculation, and it doesn't cover
             | other platforms like Signal and Telegram that would still
             | be seen as a problem by those legislators and require them
             | to legislate anyway.
        
         | alerighi wrote:
         | You can't ban encryption; it's practically impossible, like
         | banning math.
        
           | vineyardmike wrote:
           | You can ban its use; you can't ban its existence. You can't
           | mandate that math work differently, you just mandate that
           | people not use math.
           | 
           | It's amazing, since this would have decimated the American
           | tech sector in many unknown ways.
        
           | gjsman-1000 wrote:
           | If you can get jailed for speech, you can get jailed for
           | encryption.
        
             | jchw wrote:
             | Technically in the real world you can get jailed for
             | anything as long as corruption exists and humans remain
             | imperfect.
             | 
             | But you _shouldn't_ get jailed for protected speech and you
             | _shouldn't_ get jailed for preserving your privacy (via
             | encryption or otherwise.) As cynical as people may get,
             | this is one thing that we have to agree on if we want to
             | live in a free society.
             | 
             | And above all, most _certainly_, we shouldn't allow being
             | jailed over encryption to become codified as law, and if it
             | does, we _certainly_ must fight it and not become
             | complacent.
             | 
             | Apathy over politics, especially these days, is
             | understandable with the flood of terrible news and highly
             | divisive topics, but we shouldn't let the fight for privacy
             | become a victim to apathy. (And yes, I realize big tech
             | surveillance creep is a fear, but IMO we're starting to get
             | into more direct worst cases now.)
        
           | bitwize wrote:
           | The ban won't make encryption disappear. You can still have
           | it. But if you're found using it, it's a felony.
        
           | rolph wrote:
           | In a loose sense, mathematics and its associated notations
           | are a form of encryption.
        
           | eurasiantiger wrote:
           | Well, we are already banning numbers (see: DeCSS), what's the
           | big difference?
           | 
           | In some countries even discussing the application of certain
           | numbers is unlawful.
        
           | Unklejoe wrote:
           | All they have to do is force Apple and Google to ban it and
           | any apps that use it from their app store and they've
           | effectively banned encryption for like 99% of people.
        
         | charcircuit wrote:
         | >which would have unilaterally banned E2E encryption in its
         | entirety
         | 
         | It did not do this. The bill was essentially asking for search
         | warrants to become a part of the protocol. If your only
         | solution to allowing search warrants to work is to stop
         | encrypting data, I feel you are intentionally ignoring other
         | options to make this seem worse than it is.
        
           | stormbrew wrote:
           | I am _very_ intrigued about what you think the other options
           | are that preserve E2E encryption. How do you make search
           | warrants part of the process without some kind of escrow or
           | third party involvement, at which point it is no longer  "end
           | to end"?
        
         | heavyset_go wrote:
         | This seems like speculation with no evidence. The government
         | cares about more than just CSAM, they care about terrorism,
         | human and drug trafficking, organized crime, gangs, fraud, drug
         | manufacturing etc.
         | 
         | This would only make sense if Apple intends to expand their
         | CSAM detection and reporting system to detect and report those
         | other things, as well.
        
           | gjsman-1000 wrote:
           | The EARN IT Act would have basically legally mandated a
           | backdoor in all services, with shifting recommendations, in
           | the name of preventing the spread of CSAM. Sound familiar?
           | 
           | Also, there is another reason why there is the CSAM detection
           | and reporting system. With Apple's CSAM scan, that big
           | "excuse" Congress was planning to use through EARN IT to ban
           | E2E is defused, meaning now Apple has the potential to add E2E
           | to their iCloud service before Congress can figure out a
           | different excuse.
        
         | jedmeyers wrote:
         | > If you are Apple, even though EARN IT failed... you know
         | where Washington's heart lies.
         | 
         | If you are IG Farben, you know where Berlin's heart lies...
        
         | falcolas wrote:
         | CSAM scanning is only _one_ type of scanning which is performed
         | at the behest of governments and corporations around the world.
         | This alone will not allow for end-to-end encryption of items in
         | the cloud.
         | 
         | There would need to be end-device scanning for arbitrary
         | objects, including full text search for strings including
         | 'Taiwan', 'Tiananmen Square', '09 F9', and so forth to even
         | begin looking at e2e encryption of your items in the cloud.
         | 
         | At which point... what's the point?
        
           | gjsman-1000 wrote:
           | Not really. E2E iMessage is still available in China if you
           | disable the iCloud Backup. It's also the only messenger in
           | China with an E2E option.
        
       | jl6 wrote:
       | > Anti-Child Abuse Imagery Tech
       | 
       | Room for improvement in the headline.
        
       | joelbondurant wrote:
       | ALL the dozens of people I know at Apple would do anything to
       | create a job looking at kiddie porn.
        
       | geoah wrote:
       | > The system relies on a database of hashes--cryptographic
       | representations of images--of known CSAM photos provided by
       | National Center for Missing & Exploited Children (NCMEC) and
       | other child protection organizations.
       | 
       | "Cryptographic representations of images". That's not the case
       | though right? These are "neuralhashes" afaik which are nowhere
       | close to cryptographic hashes but rather locality sensitive
       | hashes which is a fancy speak for "the closer two images look
       | like, the more similar the hash".
       | 
       | Vice and others keeps calling the cryptographic. Am I missing
       | something here?
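       | 
       | A minimal sketch of what "locality sensitive" means in practice,
       | using the simple "average hash" technique as a stand-in (this is
       | not Apple's NeuralHash, just an illustration of the property):
       | 
       |   from PIL import Image
       | 
       |   def average_hash(path, hash_size=8):
       |       # Shrink to 8x8 grayscale, then set each bit to
       |       # "pixel brighter than the mean".
       |       img = Image.open(path).convert("L")
       |       img = img.resize((hash_size, hash_size))
       |       pixels = list(img.getdata())
       |       mean = sum(pixels) / len(pixels)
       |       bits = "".join("1" if p >= mean else "0" for p in pixels)
       |       return int(bits, 2)  # 64-bit fingerprint
       | 
       | Re-encoding, mild resizing or small edits leave most of those 64
       | bits unchanged, whereas a cryptographic hash flips roughly half
       | its output bits on any change at all.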
        
         | whoknowswhat11 wrote:
         | No - the reporting is absolutely terrible here.
         | 
         | 1) These are more "shares similar visual features" fingerprints
         | than crypto hashes.
         | 
         | 2) HN posters have been claiming that Apple reviewing flagged
         | photos is a felony -> because HN commentators are claiming
         | flagged photos are somehow "known" CSAM - this is also likely
         | totally false. The images may not be CSAM, and the idea that a
         | moderation queue results in felony charges is near ridiculous.
         | 
         | 3) This illustrates why Apple's approach here (manual review
         | after 30 images or so flagged) is not unreasonable. The push to
         | say that this review is unnecessary is totally misguided.
         | 
         | 4) They use words like "hash collision" for something that is
         | not a hash. In fact, different devices will calculate DIFFERENT
         | hashes for the SAME image at times.
         | 
         | One request I have - before folks cite legal opinions - those
         | opinions should have the name of a lawyer on them. Not this "I
         | talked to a lawyer" because we have no idea if you described
         | things accurately.
        
           | sandworm101 wrote:
           | >> those opinions should have the name of a lawyer on them.
           | 
           | Not going to happen. Lawyers in the US have issues with
           | offering unsolicited advice, and other problems with issuing
           | advice into states where they are not admitted. So likely
           | none of the US lawyers (and the great many more law students)
           | here will ever put their real name to a comment.
        
             | torstenvl wrote:
             | This is incorrectly applied. Offering a legal opinion is
             | fundamentally different from offering legal advice. We
             | publish legal opinions in academic and professional
             | publications all the time. That doesn't mean we have an
             | attorney-client relationship with anyone who reads those
             | opinions, or that we would advise that someone act in
             | accordance with such an opinion, particularly if no court
             | has adopted our position yet.
        
             | whoknowswhat11 wrote:
             | Sure - but if you are going to write blog posts / articles,
             | and all you can say is "based on the lawyers I talked to,
             | Apple is committing child porn felonies," that is just
             | unacceptable.
             | 
             | At least HN should flag these and get them taken down.
             | Over and over the legal analysis is either trash or it's
             | clear the article author didn't understand something (so
             | how can a lawyer give good advice?).
             | 
             | These conversations become so uninteresting when people
             | take these extreme positions: Apple's brand is destroyed,
             | Apple is committing child porn felonies.
             | 
             | I would rather have just had a link to the Apple technical
             | paper and a discussion, personally, vs the over-the-top
             | random article feed with all sorts of misunderstandings.
             | 
             | And in contract law there are LOTS of legal articles online
             | - with folks' names on them! They are useful! I read them
             | and enjoy them. Can we ask for that here, where it matters
             | maybe more?
        
               | sandworm101 wrote:
               | >> there are LOTS of legal articles online - with folks
               | name on them
               | 
               | Articles are not legal advice. They are opinions on the
               | law applicable generally, rather than fact-based advice
                | to specific clients. Saying whether Apple is doing
               | something illegal or not in this case, with a lawyer's
               | name stamped on that opinion, is very different.
        
             | dylan604 wrote:
             | Right. That's why there's never been an amicus brief or a
             | friends of the court type of document drafted and signed by
             | lawyers.
        
               | sandworm101 wrote:
               | Those are addressed to a court or government agency in a
               | specific place. The advice offered isn't for a bunch of
                | rando people on the internet across any number of
               | jurisdictions. And it is only general advice about an
               | area of law, normally at the appellate level, not
               | specific fact-dependent advice to a real flesh-and-blood
               | person. Amicus is also normally an opinion to sway the
               | court on broad policy, not a determination of facts in a
               | specific case.
        
             | agbell wrote:
             | This. Also try contacting a lawyer who knows this area and
             | asking to pay for a legal opinion brief so that you can
             | post it online to be debated by legions of software
             | developers.
             | 
             | Lawyers I know would politely decline that.
        
               | whoknowswhat11 wrote:
               | No - this is actually done supposedly as part of biz dev.
               | 
               | So your own firm may cover some costs if you have
               | something to say. If you found someone to pay for you to
               | do an analysis or offer your thoughts - you'd be in
               | heaven!
        
             | zie wrote:
             | But there are legal opinions that law firms and various
             | orgs (political or not) write on occasion: actual lawyers
             | sharing a legal opinion, publicly (or not).
             | 
             | But it's the law; it's fuzzy at best, much like your HR
             | department. It's only after a court decision has been
             | reached on your particular issue that it's anywhere near
             | "settled" case law, and even that's up for possible change
             | tomorrow.
        
           | fuzzer37 wrote:
           | Or how about they just don't scan my phone at all.
        
             | shapefrog wrote:
             | feel free to go ahead and opt out.
        
           | rootusrootus wrote:
           | > The images may not be CSAM, and the idea that a moderation
           | queue results in felony charges is near ridiculous.
           | 
           | Agree, and I think this is backed up by real world
           | experience. Has Facebook or anyone working on their behalf
           | ever been charged for possession of CSAM? I guarantee they've
           | seen some. Probably a lot, in fact. That's why we have
           | recurring discussions about the workers and the compensation
           | they get (or not) for the really horrid work they are tasked
           | with.
        
             | tetha wrote:
             | More of an anecdote, but our software allows users to
             | upload files, documents, images and such. This in turn
             | means, technically speaking, our platform could be used to
             | distribute malware or other illegal content. I've asked:
             | Well, what happens if that occurs and we as a company
             | become aware?
             | 
             | Overall, we're advised to take roughly these steps. First
             | off, report it. Second, remove all access for the customer,
             | terminate the accounts, lock them out asap. Third, prevent
             | access to the content without touching it. For example, if
             | it sits on a file system and a web server could serve it,
             | blacklist the URLs on a loadbalancer. Fourth, if necessary,
             | begin archiving and securing evidence. But if possible in
             | any way, disable content deletion mechanisms and wait for
             | legal advice, or for law enforcement to tell you how to
             | gather the data.
             | 
             | But overall, you're not immediately guilty for someone
             | abusing your service, and no one is instantly guilty for
             | detecting someone is abusing your service.
        
             | hpoe wrote:
             | The problem isn't the idea of CSAM scanning; it's what
             | happens when they start scanning for "misinformation", or
             | whatever else they want to come up with at that point.
        
               | whoknowswhat11 wrote:
                | Then why not wait until that actual issue comes up? The
               | look is so bad / awkward with this big protest over
               | something that many people are not going to find at all
               | objectionable.
               | 
               | Do you really think Apple's brand has been "destroyed"
               | over this?
        
               | rootusrootus wrote:
               | IMO the risk is that the reaction will be viewed amongst
               | the general population as an overreaction, leading to it
               | being ignored on the next go around. Many people are
               | going to listen to what Apple says, look at their piles
               | of technical documentation, see the word CSAM, and
               | conclude that the opposing side is crazy.
        
             | judge2020 wrote:
             | > Probably a lot, in fact.
             | 
             | > In 2020, FotoForensics received 931,466 pictures and
             | submitted 523 reports to NCMEC; that's 0.056%. During the
             | same year, Facebook submitted 20,307,216 reports to NCMEC
             | 
             | https://www.hackerfactor.com/blog/index.php?/archives/929-O
             | n....
        
             | Xamayon wrote:
             | From what I understand it's even more cut and dry than
             | that. If you submit a report to the NCMEC you are generally
             | required to (securely) keep a copy of the image(s) being
             | reported and any related info for a period of time. That's
             | part of the rules. You are also only compelled to report
             | things that you know are bad, so verification before
             | reporting makes sense. The idea that they would be charged
             | with a crime for doing what is essentially required by law
             | is flat out wrong. Unless they are storing the material
             | insecurely once confirmed bad or otherwise mishandling the
             | process, they seem to be following the law as I understand
             | it. IANAL but I have an account with the NCMEC for a
             | service I run, so I have looked through their documentation
             | and relevant laws to try to understand the requirements
             | placed on me as a service provider.
        
               | whoknowswhat11 wrote:
               | Thank you for a first hand report. All that makes sense -
               | you clearly can't and shouldn't "pass around images" even
               | in the office - so fair that it is super strict once
               | something is confirmed as X.
        
         | andrewmcwatters wrote:
         | Edit: "The main purpose of the hash is to ensure that identical
         | and visually similar images result in the same hash, and images
         | that are different from one another result in different
         | hashes."[1]
         | 
         | Apple isn't using a "similar image, similar hash" system.
         | They're using a "similar image, same hash" system.
         | 
         | [1]: https://www.apple.com/child-
         | safety/pdf/CSAM_Detection_Techni...
        
           | heavyset_go wrote:
           | > _There really is no sound concept of "the more similar the
           | hash."_
           | 
           | Perceptual hashes are not cryptographic hashes. Perceptual
           | hashing systems do compare hashes using a distance metric
           | like the Hamming distance.
           | 
           | If two images have similar hashes, then they look kind of
           | similar to one another. That's the point of perceptual
           | hashing.
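           | 
           | A minimal sketch of that comparison (the 64-bit values and
           | the threshold of 10 are arbitrary for illustration, not any
           | system's actual parameters):
           | 
           |   def hamming_distance(h1: int, h2: int) -> int:
           |       # number of bit positions where the hashes differ
           |       return bin(h1 ^ h2).count("1")
           | 
           |   def probably_same_image(h1, h2, max_distance=10):
           |       return hamming_distance(h1, h2) <= max_distance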
        
             | andrewmcwatters wrote:
             | This isn't what Apple is doing, regardless of the fact they
             | are using the same terminology. See my edit above.
        
             | salawat wrote:
             | Except when you're not measuring the similarity you think
             | you are, because NeuralNets are f&#@ing weird.
             | 
             | You know, let me put it this way. You know that one really
             | weird family member I'm pretty sure everyone either has or
             | is?
             | 
             | Guess what? They're a neural net too.
             | 
             | This is what Apple is asking you to trust.
        
               | heavyset_go wrote:
               | I agree, I'm just talking about the difference between
               | cryptographic hashes and other types of hashes.
        
             | [deleted]
        
           | function_seven wrote:
           | EDIT: Parent edited his comment to clarify. I understand the
           | point now. I'm wrong about similar images needing to have
           | "similar" hashes. Those hashes either need to match exactly,
           | or else not be considered at all.
           | 
           | IGNORE THIS: I think that's the parent comment's point. These
           | are definitely _not_ cryptographic hashes, since they--by
           | design and necessity--need to mirror hash similarity to the
           | perceptual similarity of the input images.
        
             | whoknowswhat11 wrote:
             | They are almost the opposite. The programs create different
             | hashes (though very similar) depending on the platform you
             | run on, even with the SAME input. No crypto hash works
             | anything like this.
        
             | [deleted]
        
             | [deleted]
        
         | Someone wrote:
         | The neural hash itself isn't cryptographic, but there's
         | cryptography involved in the process.
         | 
         | They use "private set intersection"
         | (https://en.wikipedia.org/wiki/Private_set_intersection) to
         | compute a value that itself doesn't say whether an image is in
         | the forbidden list, yet when combined with sufficiently many
         | other such values can be used to do that.
         | 
         | They also encrypt the "NeuralHash and a visual derivative" on
         | iCloud in such a way that Apple can only decrypt that if they
         | got sufficiently many matching images (using
         | https://en.wikipedia.org/wiki/Secret_sharing)
         | 
         | (For details and, possibly, corrections on my interpretation,
         | see Apple's technical summary at https://www.apple.com/child-
         | safety/pdf/CSAM_Detection_Techni... and
         | https://www.apple.com/child-
         | safety/pdf/Apple_PSI_System_Secu...)
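         | 
         | A minimal sketch of the threshold idea alone, using Shamir
         | secret sharing (this is just the generic primitive, not
         | Apple's actual PSI construction; the prime, the threshold of
         | 30 and the share count are arbitrary for illustration):
         | 
         |   import random
         | 
         |   P = 2**127 - 1  # prime field modulus
         | 
         |   def make_shares(secret, threshold, n_shares):
         |       # random polynomial of degree threshold-1, f(0)=secret
         |       coeffs = [secret] + [random.randrange(P)
         |                            for _ in range(threshold - 1)]
         |       f = lambda x: sum(c * pow(x, i, P)
         |                         for i, c in enumerate(coeffs)) % P
         |       return [(x, f(x)) for x in range(1, n_shares + 1)]
         | 
         |   def reconstruct(shares):
         |       # Lagrange interpolation at x = 0
         |       secret = 0
         |       for xi, yi in shares:
         |           num, den = 1, 1
         |           for xj, _ in shares:
         |               if xj != xi:
         |                   num = num * (-xj) % P
         |                   den = den * (xi - xj) % P
         |           secret = (secret + yi * num * pow(den, -1, P)) % P
         |       return secret
         | 
         |   key = random.randrange(P)  # per-account decryption key
         |   shares = make_shares(key, 30, 100)  # one share per match
         |   assert reconstruct(shares[:30]) == key  # key recovered
         |   assert reconstruct(shares[:29]) != key  # almost surely not
         | 
         | Each flagged photo effectively releases one share; below the
         | threshold the server holds shares that reveal nothing, and at
         | the threshold the key (and with it the flagged content)
         | becomes recoverable.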
        
         | robertoandred wrote:
         | The CSAM hashes are encrypted and blinded; possibly Vice and
         | others conflated and/or combined steps.
        
         | atonse wrote:
         | Yeah, by reading about this whole scandal I learned about
         | perceptual hashes; the whole time I kept thinking "can't they
         | just alter one pixel and entirely change the hash?" because I
         | was only familiar with cryptographic hashes.
         | 
         | It's a huge, huge, huge distinction.
        
         | jdavis703 wrote:
         | In a certain sense it is cryptographic, because the original
         | CSAM images remain secret. While the hash is not useful for
         | maintaining integrity, it still provides confidentiality.
         | 
         | Edit: this is apparently not true as demonstrated by
         | researchers.
        
           | judge2020 wrote:
           | This has technically been proven false:
           | 
           | > Microsoft says that the "PhotoDNA hash is not reversible".
           | That's not true. PhotoDNA hashes can be projected into a
           | 26x26 grayscale image that is only a little blurry. 26x26 is
           | larger than most desktop icons; it's enough detail to
           | recognize people and objects. Reversing a PhotoDNA hash is no
           | more complicated than solving a 26x26 Sudoku puzzle; a task
           | well-suited for computers.
           | 
           | https://www.hackerfactor.com/blog/index.php?/archives/929-On.
           | ..
        
             | robertoandred wrote:
             | PhotoDNA and NeuralHash are not the same thing.
        
               | judge2020 wrote:
               | Apple is using a private perceptual hash on the backend,
               | likely to further filter out bad/spam tickets.
               | 
               | https://twitter.com/fayfiftynine/status/14278999511204904
               | 97?...
               | 
                | Given NeuralHash is a hash of a hash, I imagine they're
                | running PhotoDNA and not some custom solution, which
                | would require Apple ingesting and hashing all of the
                | images themselves using another custom perceptual hash
                | system.
               | 
                | > Instead of scanning images in the cloud, the system
               | performs on-device matching using a database of known
               | CSAM image hashes provided by NCMEC and other child-
               | safety organizations. Apple further transforms this
               | database into an unreadable set of hashes, which is
               | securely stored on users' devices.
               | 
               | https://www.apple.com/child-
               | safety/pdf/CSAM_Detection_Techni...
        
               | robertoandred wrote:
               | My reading of that is NCMEC provides NeuralHash hashes of
               | their CSAM library to Apple, and Apple encrypts and
               | blinds that database before storing it on devices.
        
         | crooked-v wrote:
         | Something to note here is that in the hash collision that was
         | discovered, the two images look nothing alike. One is a picture
         | of a dog, the other is blobby grey static.
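         | 
         | A minimal sketch of why that is unsurprising: random
         | hill-climbing on a noise image until a toy 64-bit average
         | hash (not NeuralHash) matches the hash of a target photo.
         | The image sizes and step count are arbitrary, and it may
         | take many iterations to reach an exact match:
         | 
         |   import random
         |   from PIL import Image
         | 
         |   def ahash_bits(img, size=8):
         |       small = img.convert("L").resize((size, size))
         |       px = list(small.getdata())
         |       mean = sum(px) / len(px)
         |       return [p >= mean for p in px]
         | 
         |   def hamming(a, b):
         |       return sum(x != y for x, y in zip(a, b))
         | 
         |   def forge(target_path, w=64, h=64, steps=200000):
         |       target = ahash_bits(Image.open(target_path))
         |       img = Image.effect_noise((w, h), 64).convert("RGB")
         |       best = hamming(ahash_bits(img), target)
         |       for _ in range(steps):
         |           if best == 0:
         |               break  # hashes now match exactly
         |           x, y = random.randrange(w), random.randrange(h)
         |           old = img.getpixel((x, y))
         |           v = random.randrange(256)
         |           img.putpixel((x, y), (v, v, v))
         |           d = hamming(ahash_bits(img), target)
         |           if d <= best:
         |               best = d  # keep the change
         |           else:
         |               img.putpixel((x, y), old)  # revert
         |       return img  # static that hashes like the photo
         | 
         | The result stays visually meaningless static while sharing
         | the photo's perceptual hash, which is why the replies below
         | keep coming back to the server-side check and human review.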
        
           | smoldesu wrote:
           | Furthermore, it may well be possible to combine that blobby
           | grey static with another image, manipulating its visual hash
           | to create a "sleeper" positive. If this was possible in _a
           | week_, then it's going to be very interesting to watch the
           | technology change/evolve over the next few years.
        
             | floatingatoll wrote:
             | Doing so would create a positive that still doesn't pass
             | Apple's human visual check against the (blurred) CSAM
             | content associated with that checksum, and if it somehow
             | did, it would still then also have to occur 30 or more
             | times, _and_ they'd have to pass a human visual check
             | against the (unblurred) CSAM content by one of the agencies
             | in possession of it. It's not possible to spoof that final
             | test _unless_ you possess real CSAM content, at which point
             | you don't need to spoof that final test _and_ you'll end
             | up arrested for possession.
        
               | treesprite82 wrote:
               | There are concerns without the images needing to make it
               | all the way through their CSAM process. Apple is a US
               | company with obligations to report certain crimes they
               | become aware of, and NCMEC is a US government-tied
               | organisation that dismissed privacy concerns as "the
               | screeching voices of the minority".
               | 
               | Consider if the honeypot images (manipulated to match
               | CSAM hashes) are terrorist recruitment material for
               | example.
        
               | [deleted]
        
               | laverya wrote:
                | And if you make the second image actual 18+ porn?
               | Will Apple be able to tell the difference between that
               | and CSAM after the blur is applied?
               | 
               | Bonus points if you match poses, coloration, background,
               | etc.
        
               | floatingatoll wrote:
                | The only way to match poses, coloration, background,
                | etc. is to possess illegal CSAM content. If you do so
                | successfully, your crafted image will pass the blur
                | check and reach the agency that possesses the original
                | image for final verification, where it will immediately
                | fail because it is obviously a replica. You will then
                | prompt that agency to lead law enforcement to the
                | _creator_ of the image, so that they can arrest whoever
                | possesses CSAM content in order to model replicas after
                | it.
               | 
               | I think that's a perfectly acceptable outcome, since
               | anyone with the hubris to both possess CSAM content _and_
               | create replicas of it especially deserves to be arrested.
               | Do you see a more problematic outcome here?
        
               | laverya wrote:
               | This assumes that it's impossible to reverse the
               | perceptual hashes used here in such a way that you could
               | determine poses and coloration, for one.
               | 
               | And in retrospect, you don't need to match that - you
               | just need it to appear obviously pornographic after the
               | blur is applied in order to get past Apple's reviewers.
               | After that, the lucky individual's life is in the hands
               | of the police/prosecutors. (I have to imagine that both
               | real and faked cases will look pretty much like "Your
               | honor/members of the jury, this person's device contained
               | numerous photos matching known CSAM. No, we won't be
               | showing you the pictures. No, the defence can't see them
               | either." Can you imagine a "tough on crime" prosecutor
               | taking the faked case to trial too? Would the police and
               | prosecutors even know it was faked?)
        
               | floatingatoll wrote:
               | Apple's reviewers are comparing blurred originals to
               | blurred matches. It needs to look like the blurred
               | original associated with the checksum that matched. It is
               | irrelevant whether the blurred match looks pornographic
               | or not.
        
           | 0x5f3759df-i wrote:
           | Take a look at this collision https://twitter.com/SarahJamieL
           | ewis/status/14280837442802565...
        
           | bawolff wrote:
           | I don't really think it'd be a valid second pre-image for
           | this type of hash if they did look similar.
        
           | __blockcipher__ wrote:
           | They actually do look alike to my eye, but in a "the way the
           | algorithm sees it" kind of way. I can see the obvious
           | similarity. But to your point it's not like it's two very
           | slightly different photos of dogs.
        
       | balozi wrote:
       | They are slowly turning what is a philosophical argument into a
       | technical question. Next step is for them to conjure up a
       | technical answer before claiming victory. Most people are already
       | bored by all the technical talk.
       | 
       | Neural hash this: It's about Trust. It's about Privacy. It's
       | about Boundaries between me and corporations/governments/etc.
        
       ___________________________________________________________________
       (page generated 2021-08-18 23:02 UTC)