[HN Gopher] Apple removes references to CSAM  from its child saf...
       ___________________________________________________________________
        
       Apple removes references to CSAM  from its child safety webpage
        
       Author : virgildotcodes
       Score  : 358 points
       Date   : 2021-12-15 11:01 UTC (12 hours ago)
        
 (HTM) web link (www.macrumors.com)
 (TXT) w3m dump (www.macrumors.com)
        
       | null_object wrote:
        | So far all the comments are predictably negative and imply that
        | Apple will, at some point in the future, attempt to implement
        | the feature covertly.
       | 
        | Another take would be that they took on board the extensive
        | public criticism and changed their minds.
        
         | gentleman11 wrote:
         | Somebody above found a quote saying that plans are still to go
         | forward with it eventually
        
         | ItsBob wrote:
         | I don't believe that they'll slip it under the radar at all. I
         | think they'll brand it as something awesome and magical and
         | just add it anyway. People, for the most part, don't scratch
         | the surface EVER, so they'll just think it has a cool scanning
         | feature built-in now.
        
           | disgu wrote:
           | They already scan your photos but the reason for scanning is
           | important. Type 'cat' into the search and they'll show you
           | the pictures you've taken of cats. However, they can't just
           | go through your photos and use it for any purpose they want.
           | They have to give a specific reason.
        
             | ItsBob wrote:
             | I meant that they won't hide what it will do. It will be
             | called something benign and fluffy and it will be "for your
             | protection".
             | 
             | That's what I meant.
        
         | skoskie wrote:
         | I don't think Apple has much choice in the matter, and I really
         | wish more people understood why they did this in the first
         | place.
         | 
         | https://www.eff.org/deeplinks/2019/12/senate-judiciary-commi...
        
           | elliekelly wrote:
           | Apple has stood up to unreasonable government invasions of
           | privacy before. And won. They have the means. They have the
           | skill. They have the incentive (their entire brand is
           | arguably built around privacy) and their users have been
            | pretty clear in voicing their opposition to CSAM scanning.
           | 
           | Some nebulous political references in opposition to
           | encryption aren't the reason Apple did this in the first
           | place. They had and continue to have plenty of choice in the
           | matter.
        
             | coolspot wrote:
             | > Apple has stood up to unreasonable government invasions
             | of privacy before. And won.
             | 
              | Only when they knew they would win, as with the San
              | Bernardino shooter PR spectacle.
              | 
              | They didn't stand up against iCloud CN being hosted by the
              | Chinese government, and they didn't stand up against
              | unencrypted iCloud backups.
        
         | matheusmoreira wrote:
         | It's proprietary software. There's no way to know what it's
         | doing. Better to assume the worst.
        
         | BrazzVuvuzela wrote:
         | If Apple would like us to believe they changed their mind, they
         | might consider telling us they changed their mind. They haven't
         | told us that, so why should we assume it?
        
           | judge2020 wrote:
           | They're still a company with a PR and Legal team. Letting it
            | die quietly means it stays out of the news cycle (as in,
            | there are fewer articles about it being removed than if they
            | publicly announced they were wrong).
        
             | BrazzVuvuzela wrote:
              | > _They're still a company with a PR and Legal team_
             | 
             | All the more reason to _not_ give them the benefit of the
             | doubt with shit like this. If they say something flat out,
             | that has some weight since there could be legal
             | repercussions for lying (in theory anyway.) But not saying
              | anything and letting people infer whatever they want?
              | That's how companies with PR and Legal teams like to cover
             | their asses instead of outright lying. The company says
             | nothing and lets their supporters invent whatever headcanon
             | is most flattering. I don't go in for that.
             | 
             | Edit: The same thing happened when they let Apple fans
             | infer that this feature was preparation for enabling proper
             | E2EE. As far as I'm aware Apple never actually said that
             | and their supporters decided to believe it without
             | evidence, just a lot of wishful thinking.
        
             | pzo wrote:
              | I would have more respect for a company and people that
              | can admit to a mistake, apologize, and commit to not making
              | the same mistake again. In this case Apple simply keeps
              | people in limbo: without any public statement, you cannot
              | be sure what they are planning to do with it. They might as
              | well tweak it a little, rebrand it under a different name,
              | and push it again wrapped in a different package.
        
         | klodolph wrote:
         | Another possibility (or way to look at it) is that it worked
         | poorly enough, and it was poorly received enough, that the
         | internal factions within Apple that opposed it had enough
         | leverage to kill it.
         | 
         | Sometimes, the right way to kill a project (politically) is to
         | let it publicly fail. Killing a project before it fails
         | requires much more of a fight. It could also mean that the very
         | public failure of this project gave Apple leverage over
         | whichever spooks were pushing for it in the first place.
        
           | zionic wrote:
           | >Sometimes, the right way to kill a project (politically) is
           | to let it publicly fail.
           | 
           | I feel this is definitely true most of the time, but in this
           | case the cost to their public image/perception among the tech
           | crowd was so high it was a mistake.
        
             | klodolph wrote:
             | It may have been a mistake for Apple as a company to make
             | this decision, but I think we can understand why the
             | individuals within Apple might choose to let decisions like
             | this play out and fail in public, even if they know it is
             | going to fail.
        
               | ksec wrote:
                | Well, Craig Federighi seems to be happy with it. The
                | software engineering leads for the feature seem to be
                | happy with it. I.e., I don't believe there was
                | significant objection to the feature within Apple, at
                | least not from those in power, and that is why they went
                | with it. The objection obviously came later, after it was
                | revealed to the public.
        
               | vineyardmike wrote:
               | > Craig Federighi seems to be happy with it.
               | 
               | There is a big difference between happy with it and not
               | unhappy enough to quit publicly when part of your job is
               | to spread koolaid and look happy.
               | 
                | If I were against something like this, but not powerful
                | enough to squash it (especially if a gov had a hand in
                | getting it pushed through), then I'd make sure to do a
                | press tour and get it in every newspaper for all the
                | wrong reasons, while making an all-or-nothing stance to
                | torpedo the project.
               | 
               | I can think of no other reason beyond utter incompetence
               | why they'd announce it how they did. Apple is known for
               | being secretive, but why did they do a press interview
               | and say "sure other 3-letter gov agencies may need to add
               | to the illegal photos database in the future" unless you
               | wanted to really make sure the project would fail? Why
               | else announce it in the same press release as scanning
               | messages for similar content? It seems like the rollout
               | was designed to be as suspicious as possible tbh.
        
               | notriddle wrote:
               | Never assume 4D chess when simple incompetence can do the
               | job, because 4D chess is rare, and tends to fail when it
               | confronts the unexpected. If you were trying to sink the
               | project this way, how would it work out if some other,
               | unrelated scandal had popped up and distracted the
               | general public?
        
               | vineyardmike wrote:
               | > If I were trying to sink the project this way
               | 
               | This should be re-written to say
               | 
                | > If [I were not influential enough to sink it
                | internally, so I] were trying to sink the project
                | [through bad publicity]
                | 
                | Therefore, the answer to your question is "darn, it
                | didn't work; that sucks, but at least I tried".
               | 
               | Also...
               | 
               | > Never assume 4D chess when simple incompetence can do
               | the job
               | 
                | In this context, it's very likely that a government is
                | somewhat behind it. We know the FBI and Apple didn't get
                | along. We know Apple has been doing things quid pro quo
                | for China to stay on its good side. So it seems like
                | we're already sitting at a 4D chess board, but no one is
                | sure who's playing.
               | 
                | If a government says you have to do X on the DL, and you
                | don't want to because it's bad, then a logical solution
                | is to get the general population + news media to say "X
                | is bad! We really hope X doesn't happen." Because then
                | it's easy to show the government why you can't do X.
        
               | klodolph wrote:
                | > The objection obviously came later, after it was
                | revealed to the public.
               | 
               | Why is this obvious? Normally, when you have an objection
               | to a company policy, you voice those objections in
               | private to people within the company. This seems
               | "obvious" to me, but to explain--(1) it's against company
               | policy and often considered a fireable offense, (2)
               | employees have an interest in presenting a united front,
               | (3) those who want the project to fail publicly want it
               | to happen without interference from inside the company.
        
           | vineyardmike wrote:
           | > it could also mean that the very public failure of this
           | project gave Apple leverage over whichever spooks were
           | pushing for it in the first place.
           | 
           | This is my guess. I imagine it went something like this.
           | 
           | Congressman: "So tim, FBI talked to me and want some help.
           | Please find a way to check all pictures on iphone"
           | 
           | Tim: "or no?"
           | 
           | C: "We'll make it worth your while"
           | 
           | T: "This goes against our marketing and history, it'll take a
           | really big push. My team says we can manage X but we need to
           | work with CSAM agency to handle optics.
           | 
           | C: "Great!"
           | 
           | T: "I was right, the public hates it and doesn't care about
           | children, we're going back to our privacy stance in marketing
           | because the backlash would out-due the political reward.
           | We'll throw you under the bus if you push again on this. We
           | totally want to help for real and not just because politics
           | is a game and we have to play, but this cost is too high,
           | even you can see this backlash and recognize we can't go
           | threw with it. BTW thanks for giving us excuse to hire a few
           | new expensive ML scientists from google or wherever. They got
           | to write some nice papers."
           | 
           | C:"Gotcha. See you next week for gulf when we try again at
           | something new? There is this israeli company i want you to
           | meet"
        
             | tomc1985 wrote:
             | We all know who it was -- NCMEC was getting on Apple's case
             | because their CSAM reporting numbers were orders of
             | magnitude lower than the other big tech companies. Someone
             | in NCMEC upper management even wrote Apple nauseatingly
             | perky congratulatory letters celebrating Apple for actually
             | trying to save the children, and calling detractors braying
             | mules or somesuch.
        
         | Macha wrote:
         | This is the reaction the tech industry has bred with the only
         | options ever being "Yes, now" or "Later". People are used to
         | the industry not taking no for an answer, and so this pattern
         | matches to every other time the industry has "backed off for
         | now" (see also: MS pushing edge as default browser, Apple
         | boiling the frog with gatekeeper, Whatsapp/Facebook data
         | sharing, Google+ account migrations for YouTube accounts)
        
           | adrr wrote:
            | This isn't a revenue generating initiative. It is different
            | from your examples. Apple is trying to make two groups happy:
            | governments and customers. Governments don't like encryption
            | (assuming Apple starts encrypting iCloud photos with
            | device-based keys) and consumers don't like governments
            | snooping on them. If you were the CEO of Apple, who would you
            | favor, knowing either group could cost you money? Governments
            | could prioritize antitrust initiatives against Apple and
            | consumers could stop buying Apple products.
        
             | denton-scratch wrote:
             | > knowing either group could cost you money
             | 
             | I'm not a businessman, but I believe Apple's main customers
              | are _not_ governments, so I suppose that it's not good
             | business for Apple to ignore their users' preferences.
             | Governments in most of Apple's marketplaces change every
             | few years, after all.
        
               | adrr wrote:
                | Why they pulled the scanning: customer backlash. The
                | fallout from governments is yet to be seen, especially
                | when Apple switches to not having access to the keys to
                | decrypt iCloud photos and documents, and moves iCloud
                | backups to being encrypted. One high-profile crime,
                | especially one involving children, and we'll see
                | governments propose escrowed keys. The UK is already
                | proposing it. I am not saying it's right or wrong. I do
                | believe consumers' anger at Apple is the wrong target. If
                | they want change, get governments to pass laws that
                | guarantee encryption as a "right".
        
               | BrazzVuvuzela wrote:
               | > _Why they pulled the scanning_
               | 
               | They pulled documents about the scanning, but we don't
               | actually know they pulled the scanning itself.
               | 
               | > _when Apple switches to not having access the keys to
               | decrypt iCloud photos and documents._
               | 
               | We don't know that they are/were going to do that either.
               | It's just a popular 'fan theory.'
        
               | adrr wrote:
                | If they weren't going to do device-based keys, why put
                | scanning on the device? Scan on the servers. There would
                | have been zero backlash if they did that.
        
             | Nevermark wrote:
             | You could ship a mechanism that scans your documents but
             | doesn't report anything.
             | 
              | Or do something equally pointless, but just as much of a
              | signal.
             | 
             | I feel like so many of these unstoppable force meets
             | immovable wall situations warp both functionality and
             | discourse (on all sides) on said functionality.
             | 
             | This comes about because everything is getting so
             | centralized. The government wants to be in everything, and
             | Apple (stand in for all big tech) does too. This pits them
             | against each other, with the consumer only acknowledged
             | when there is vast consumer agreement and motivation.
             | 
              | Similar to the App Store: Apple wants total control, then
              | governments use Apple's control to cut off access to
              | particular apps. Consumers who understand the severe loss
              | of freedom this creates (for them and for developers), both
              | immediate and in lost innovation, don't like it. But there
              | isn't a huge consumer uproar on that one ... so freedom
              | wilts.
             | 
             | Bundling these basic features (cloud backups, app stores)
             | with operating systems is creating an unholy
             | government/platform axis that wins every time consumers are
             | unable to drum up the motivation to speak as coherent
             | allies pushing back.
        
           | pzo wrote:
            | I have literally been spammed in the last month on my iPhone
            | (even though I'm still using iOS 14) with Apple notifications
            | asking me "Do you agree to the new terms of agreement? <Yes>
            | <We will ask you later>" (that's regarding new iCloud terms).
            | I was always careful not to misclick and not to agree. No
            | matter what, a few days later I kept getting the same
            | notification. They stopped for some reason after a few months
            | and now I don't know if:
            | 
            | 1) I agreed by accident
            | 
            | 2) They thankfully stopped nagging me
            | 
            | 3) Or they implemented it buggily and, even though I kept
            | cancelling after a few tries, they assumed I agreed.
        
             | AlexandrB wrote:
             | I'm still tapping "later" on WhatsApp's ToS changes. I
             | wonder how much longer I can get away with it.
        
               | najqh wrote:
               | A few weeks ago I was forced to accept or I couldn't keep
               | using the app at all.
        
               | skinkestek wrote:
               | What have you switched to now?
        
               | najqh wrote:
               | I haven't switched, I accepted them because I don't
               | really care. I just wanted to see for how long I could
               | push it.
        
               | skinkestek wrote:
                | The moment Facebook bought them I started researching
                | alternatives, and I am happy to say that a couple of
                | hundred friends and family of mine have switched; not a
                | single person had tried to reach me on WhatsApp for
                | months by the time I uninstalled it.
        
             | Someone wrote:
              | I don't think cloud services can realistically do anything
              | other than force all users to accept the latest license.
             | 
             | If they don't, they end up having data from different
             | licenses. If you have many of those, figuring out what you
             | can and cannot do with data becomes a nightmare.
             | 
             | You could get questions such as "Can you make a backup in
             | country C for user U? He's under the 2019 terms, so in
             | principle, yes, but alas, user V has shared content with U,
             | and V still is under the 2017 terms. Also, U has been in
             | the EU for the past weeks, so EU regulations apply, at
             | least as far as we catered for them in the 2019 terms"
             | 
              | Not changing the terms isn't an option, either, as they
              | have to be changed to cater for changes in laws. If you
              | operate in 100+ countries, such changes come up constantly.
        
               | nkrisc wrote:
               | > If they don't, they end up having data from different
               | licenses. If you have many of those, figuring out what
               | you can and cannot do with data becomes a nightmare.
               | 
               | That's their problem to figure out. If they can't then
               | they should terminate customers who don't agree and ship
               | them their data.
               | 
               | Obviously they'll never do that since it would be a
               | terrible look and they'd lose customers, because they
               | want to have their cake and eat it, too.
               | 
               | I am not sympathetic to how hard it would be for a
               | corporation to do something. "It's too hard" is not an
               | excuse, not least of all if they're a multi-billion
               | dollar corporation.
        
         | DenisM wrote:
         | It's easy to tell which - is E2E still there? Then scanning is
         | still there too.
        
           | judge2020 wrote:
           | Well, it's mostly easy to tell since we still have
           | researchers decompiling and scrubbing through the OS to see
           | what's in it and what it does[0].
           | 
           | https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
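            | 
            | For anyone curious, the core of what that repo lets you check
            | boils down to comparing fixed-length perceptual hashes. A toy
            | Python sketch (made-up hash values, not Apple's actual
            | parameters or code):
            | 
            |     def hamming_distance(hex_a: str, hex_b: str) -> int:
            |         # Bits that differ between two equal-length hex hashes.
            |         return bin(int(hex_a, 16) ^ int(hex_b, 16)).count("1")
            | 
            |     # Two hypothetical 96-bit NeuralHash-style outputs (fake).
            |     h1 = "59a34eabe31910abfb06f308"
            |     h2 = "59a34eabe31910abfb06f309"
            | 
            |     # A perceptual-hash "match" means a small distance, not
            |     # exact equality as with a cryptographic hash.
            |     print(hamming_distance(h1, h2))  # -> 1: near-identical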
        
         | momenti wrote:
          | What good does it do if the creepy intelligence services _are
          | in the hardware_?
        
           | wholinator2 wrote:
            | I'd venture to say ease of use. I mean, AT&T had everyone's
            | emails, but Facebook still makes it much easier to infer,
            | understand and expand surveillance. Getting targeted
            | information might be doable in hardware, but mass
            | surveillance might be much easier if implemented a little
            | higher up, by a dedicated dev team which issues bug fixes and
            | feature extensions, which can turn over a request from you in
            | a fraction of the time, and whose capabilities are far more
            | mutable. Hardware is more capable but it's a lot more work.
        
         | bigyellow wrote:
         | It's closed source software - what makes anyone think it isn't
         | already implemented?
        
         | raxxorrax wrote:
          | I would have thought they'd continue the practice but, because
          | of the bad publicity, remove all mentions of it. Is it
          | confirmed they won't scan user files? Because I would assume
          | that they still do.
          | 
          | As for the negativity: that is based on experience with large
          | tech companies.
        
         | mdoms wrote:
         | You can't be that naive?
         | 
         | > Update: Apple spokesperson Shane Bauer told The Verge that
         | though the CSAM detection feature is no longer mentioned on its
         | website, plans for CSAM detection have not changed since
         | September, which means CSAM detection is still coming in the
         | future.
        
         | kingcharles wrote:
         | This. There is absolutely no requirement for Apple to tell you
         | they are checking the signatures of all your files.
        
       | jetsetgo wrote:
        | The feature is already implemented, now just in secret, pumping
        | all user data straight to the FBI.
        
       | noisy_boy wrote:
       | People like us who do not want this kind of intrusion have other
       | things to do and lives to live. People driving such initiatives
       | _do_ such things to make their living. They will keep pushing for
        | it and eventually we will be too busy/forgetful/tired to stop
       | it. Unfortunately these issues are never prominent points in the
       | political discussion and get sidelined either by vested interests
       | or people's ignorance/lack of information or combination of such
       | factors. I wish I had the optimism to see hope in these matters.
        
       | itronitron wrote:
       | >> Apple Removes All References to...
       | 
       | First rule of scanning feature is don't talk about scanning
       | feature.
        
         | drexlspivey wrote:
         | Apple Removes All References to [undefined]
        
       | Traubenfuchs wrote:
        | I still don't understand: what's the goal here? It certainly
        | isn't about catching child abuse. Any predator who isn't
        | completely stupid will have switched to Android already. Who is
        | paying Apple to develop this feature that has zero (or
        | negative!) value to the end user?
       | 
       | Is it governments that want to scan end user devices for content
       | declared forbidden by them?
        
       | throwawaymanbot wrote:
       | Removing all reference does not mean their surveillance infra is
       | off your phone.
        
       | roenxi wrote:
       | If Apple kills the program they deserve some positive attention,
       | but realistically the governments of the world will want this
       | capability to scan people's data for things and will get it
       | developed one way or the other.
       | 
       | It has gotten so cheap to run dragnets that the only rational
       | responses are pessimism or demand for privacy by design.
        
         | threeseed wrote:
         | > but realistically the governments of the world will want this
         | capability to scan people's data for things and will get it
         | developed one way or the other
         | 
         | Sorry to burst your bubble.
         | 
         | But all of the photo/cloud providers including Apple have been
         | doing CSAM scanning for many years. And will continue to do so
         | for the foreseeable future.
         | 
         | Apple actually tried to give users more privacy.
        
           | roenxi wrote:
           | The US government, and probably others, scan through
           | everything on the internet. That has been a generally
           | accepted matter of fact since around 2015. I doubt there are
           | any secrets on iCloud.
           | 
           | I don't like it, but I'm still going to complain glumly if
           | things get worse. And at least I don't own the servers
           | involved.
        
       | bangonkeyboard wrote:
       | Absent an on-record commitment that this surveillance has been
       | abandoned, I don't put it past Apple to deploy it silently in a
       | minor point-update and then reveal it later, similar to how they
       | secretly converted iOS end-users' filesystems to APFS in versions
       | prior to its official release. They could then point to nobody
       | noticing the deployment as purported evidence that privacy
       | concerns were overblown.
        
       | LinuxBender wrote:
       | Is Apple's CSAM scanning operationally different than Windows 10
       | SmartScreen [1]?
       | 
       | [1] - https://www.windowscentral.com/how-disable-smartscreen-
       | trust...
        
         | kevinmgranger wrote:
         | Yes. One is application binaries, the other is photos.
        
           | LinuxBender wrote:
           | Has anyone verified that it only scans binaries?
        
             | kf6nux wrote:
             | Probably. It's not just about 1 in 8 billion people having
             | taken the time to look, it's also about 1 in (what I'm
             | guessing is) at least 100 people at Microsoft not leaking
             | it. Apple had a leak before it was even released. Secrets
             | are hard to keep.
        
       | kumarsw wrote:
       | This case reminds me a bit of the big stink with the Daraprim
       | price hikes and Martin Shkreli being dumb enough to be the public
       | face of it. That worked out pretty bad for him but also bad for
       | everyone else, because now we had a guy to blame for it, instead
       | of asking the harder questions like "why aren't drug prices
       | regulated?" Utilities must go to the public utility commission to
       | raise rates so why is the same not done for drugs? The federal
       | government stockpiles oil in case of shortages, why is it not
       | responsible for ensuring that drug production is continued?
       | 
       | The maddening thing about the Apple CSAM scanning controversy is
       | that I still have no idea which legislation Apple is complying
       | with, and it's difficult to find out any reporting on that
       | aspect. US? UK? EU? Clearly there is/are some government
       | organization(s) with teeth long enough that Apple is not willing
        | to go to court again as was the case with the San Bernardino
       | shooter. Point is, whether or not Apple has been dishonest about
       | this whole thing, they are still just a distraction.
        
         | lexapro wrote:
         | >That worked out pretty bad for him but also bad for everyone
         | else, because now we had a guy to blame for it, instead of
         | asking the harder questions like "why aren't drug prices
         | regulated?"
         | 
         | I would say it went very well for the pharma industry.
        
         | syshum wrote:
         | >> Utilities must go to the public utility commission to raise
         | rates
         | 
          | Which I have never seen a denial of, so I am not sure that is
          | much of a check.
        
         | young_unixer wrote:
         | > That worked out pretty bad for him
         | 
         | I don't think so. He was incarcerated for unrelated reasons.
        
           | philwelch wrote:
           | "Unrelated reasons"
           | 
           | The average American commits three felonies a day. I don't
           | think he was targeted by the feds for unrelated reasons even
           | if the actual charges were unrelated.
        
         | treis wrote:
         | There's a grand bargain where tech companies are granted
         | immunity from user generated content in exchange for moderating
         | that content. Apple's move is to try and maintain that bargain
         | while moving to E2E encryption.
        
           | dantheman wrote:
            | There is no grand bargain - the bargain is that they can
            | moderate and still be immune from liability for user
            | generated content. The default would be either no moderation
            | (to be immune from liability for user generated content) or
            | moderation with liability.
        
           | Justin_K wrote:
           | "We support E2E encryption and can't unlock your device"...
           | except we put software on your device that scans for anything
           | our algorithm deems illegal and then we route that data to
           | the appropriate authorities.
        
           | yyyk wrote:
            | Given that the CSAM system as documented catches way fewer
            | cases than server-side scanning, in all likelihood Apple
           | can't implement E2E - no government would let Apple use such
           | an inefficient system by itself. The bargain probably
           | requires invasive client scanning _and_ server side scanning,
           | or at least way more invasive client scanning to catch all
           | the cases the current system can 't.
        
           | mindslight wrote:
           | If this were true, the right way to go about it would be to
           | spin out iCloud into a separate company. Being one legal
           | entity is what creates the larger situation where Apple is
           | hosting content that Apple is also able to moderate.
           | Splitting them up, Apple Software would be free to fully
           | represent their users' interests with proper encryption, and
            | iCloud would have zero ability to access the plaintext.
        
           | coolspot wrote:
            | The bargain was likely about the App Store monopoly and
            | antitrust law.
            | 
            | Apple published its CSAM plans exactly when all the news was
            | discussing bad big tech monopolies, Congress held hearings
            | with the CEOs, etc.
        
           | Syonyk wrote:
            | > _Apple's move is to try and maintain that bargain while
            | moving to E2E encryption._
           | 
           | It _could_ be. It makes sense as a precursor to it. However,
            | unless I've missed something, at _no_ point in the entire
            | debacle did Apple say "Yes, we're going to be moving iCloud
            | to E2EE, and this allows us to do it. If you opt into this,
            | the gains are that you can enable full encryption for
            | everything else, _and we literally can't see it on the
            | servers._"
           | 
           | Apple, to the best of my knowledge, never said that, nor even
           | implied it. It was, "We're adding this to what we can do now,
           | take it."
        
             | rgovostes wrote:
             | They did not outright announce it but they dropped several
             | hints that this is the direction they were going,
             | enumerated here:
             | https://news.ycombinator.com/item?id=28249905
        
             | AlexandrB wrote:
             | > Apple, to the best of my knowledge, never said that, nor
             | even implied it. It was, "We're adding this to what we can
             | do now, take it."
             | 
             | Yup. There was a ton of speculation about Apple's motives -
             | whether legislative or related to iCloud E2E encryption -
             | but AFAIK Apple never confirmed anything outside of "we're
             | doing this to protect children". I think if there _was_
             | some other motive, Apple should have communicated it
             | better.
             | 
             | It's interesting how Apple's normal ability to set the tone
             | and framing of new features was undermined in this case by
             | a leak. I wonder if Apple would have been more successful
             | at marketing this if they were able to frame the debate
             | themselves.
        
               | Syonyk wrote:
               | The "slightly paranoid and cynical" hypothesis I have is
               | that the reason the whole set of docs looked like
               | something pushed out at 2AM, and that the "Clarification"
               | documents looked like they were written by people
               | operating on a week of no sleep, is because they were.
               | Apple wasn't planning to really bother documenting the
               | "feature," and someone found something sufficiently
               | objectionable to leak it. Maybe someone wanted to turn it
               | on early or something, or the hash database had some
               | "non-CP data" in it, or... I don't know, and probably
               | will never know.
               | 
               | But they decided that releasing what they had was better
               | than letting the leaks run wild - just, it turns out, the
               | actual system was really that bad. The "clarifications"
               | were a wild run of "Well, OK, how can we answer this
               | objection? And that one? Oh, crap, this one too!" - which
               | is about how the later documents read. "We just came up
               | with this idea that might address your objection!"
               | 
               | I'd love to hear the inside story of it at some point,
               | but accept I likely won't.
               | 
               | However, it's been enough to drive me off Apple products.
               | I can only hope enough other people have done so as well
               | to hurt their profits, though it's doubtful. :(
        
         | kingcharles wrote:
         | Apple appeared to be doing it on their own, not complying with
         | any legal requirement.
        
         | dantheman wrote:
          | The problems with drugs are entirely government created:
          | through the granting of monopolies on drug formulations via
          | patents, and then through FDA licensing that prevents other
          | manufacturers from producing already-approved drugs if they're
          | not exactly the same.
          | 
          | As with COVID, we are seeing that the FDA fails to serve the
          | public and is overly cautious. If we want an FDA, make it
          | voluntary, separate testing for safety from testing for
          | effectiveness, and at the end of the day let each individual
          | do whatever they want to their own body.
        
       | LatteLazy wrote:
        | I don't get why they ever greenlit this "feature". It's
        | unpopular with at least some users, it flies in the face of
        | their big push for privacy, and it increases their liability.
        | Who approves such a request!?
        
       | zibzab wrote:
       | Apple is extremely sensitive to bad PR, this should not surprise
       | anyone.
        
         | fsflover wrote:
         | It worked very well with the iCloud encryption (not):
         | https://news.ycombinator.com/item?id=25777207.
        
       | ItsBob wrote:
        | The problem with something like this is that the cat's out of
        | the bag.
       | 
       | The code has been written, tested and merged etc. The project has
       | reached a point where it's ready to ship.
       | 
       | Apple have done their homework and they will not release ANYTHING
       | unless they think they'll either make money from it or at the
       | very least, not LOSE money from it. It's all about money, nothing
       | else.
       | 
       | It's coming whether we like it or not now :(
       | 
       | And you can be sure that others will follow. It will be in
       | Android at some point and possibly Windows and MacOS too.
       | 
       | I try not to be negative and cynical these days but the tech
       | oligarchs make that incredibly difficult.
        
         | sharklazer wrote:
         | In 15.2, they are scanning text messages for "nudity". Oddly,
         | no post here on HN about that.
         | 
          | My guess is it's already quietly pushed, but the flag needs to
          | be turned on.
         | 
         | https://petapixel.com/2021/12/13/apple-to-release-nudity-det...
        
           | mlindner wrote:
           | It's local only and they already scan your photos on-device
           | anyway. All the images have been processed for content and
           | you can do content aware searches on the images.
        
           | kevinmgranger wrote:
           | Is it local-only? If so, it has far fewer privacy
           | implications than CSAM scanning.
        
             | sharklazer wrote:
             | Yes, local only as far as I understand. But how many
             | additional steps does it take to turn the process into CSAM
             | scanning? It looks really close, even without squinting.
             | Caveat that I have no certain info though.
        
               | JimDabell wrote:
               | > But how many additional steps does it take to turn the
               | process into CSAM scanning? It looks really close, even
               | without squinting.
               | 
               | It's a different feature that works in an entirely
                | different way. There are plenty of descriptions of how
                | both features work out there; there's no need to guess at
                | what they might do.
        
           | jaywalk wrote:
           | There has been talk about it here, but it coincided with the
           | CSAM scanning which got much more coverage.
        
             | sharklazer wrote:
             | Thanks, I guess I missed it. Too many things going on in
             | this modern world to have a complete picture.
        
         | dkdbejwi383 wrote:
         | > It's all about money, nothing else.
         | 
         | Not being funny, but I am not sure why this comes as a
         | surprise. Of course everything Apple (and any other for-profit
         | org) does is to make money.
        
         | [deleted]
        
         | duxup wrote:
         | > The code has been written, tested and merged etc. The project
         | has reached a point where it's ready to ship.
         | 
         | Are we talking about anything that scans photos?
         | 
         | If so that's pretty common... everywhere.
         | 
         | I have some technically capable friends who were up in arms
         | about this situation.
         | 
         | Then they share their "vacation at X with friends " Google
         | photos album with me.
         | 
         | Cat isn't just out of the bag....
        
         | quitit wrote:
         | The "scanning" feature that underpins this has been deployed
         | for years, and I dare say wouldn't even be considered unique
         | for photo library managers. If you can search for "cat" and get
         | photos of cats, then your photo library already has the tools
         | necessary for CSAM detection.
         | 
         | I feel the fears about this implementation are largely
         | overblown - equivalent features are not being misused on Google
         | and Meta/Facebook's properties, despite having a much larger
          | audience and less procedural oversight. Rather, Apple is the
          | outlier here for being not just late, but also for having
          | minimum thresholds that must be passed and an unskippable step
          | of human review.
         | 
          | I'm yet to hear a coherent argument about the potential dangers
          | of the system from someone who understood the implementation,
          | or about the state of image scanning amongst cloud providers -
          | even this thread seems to believe that Apple is the first
          | here. Apple's approach
         | was the most conservative, Google's goes much farther:
         | https://support.google.com/transparencyreport/answer/1033093...
        
           | ItsBob wrote:
           | > The "scanning" feature that underpins this has been
           | deployed for years
           | 
           | Yes, I agree that your phone has the ability to find stuff
           | like this based on ML etc. However, if their algorithm finds
           | something that matches a hash of child porn, you'll be
           | reported (after notification or something, can't remember).
           | 
           | Can you imagine what that would do to people? The risk of
           | fucking this up and destroying someone's life is far too
           | high. It's a stigma that will never go away even if it's
           | proven false.
           | 
           | And what if the government decides that they want to ruin a
           | political adversary? The technology is there to report them,
           | rightly or wrongly, for child porn on their phone and ruin
           | them. Again, proving that it was false will not be enough if
           | it leaked out in the first place.
           | 
           | In my opinion, it's my phone (yes, yes, I know they all dial
           | home to the mothership, I only lease it etc.) and I should
           | have complete say over it and this is one more step away from
           | that goal.
           | 
           | It's on the dystopian highway imo.
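            | 
            | To be concrete about what I mean by "matches a hash": the
            | naive version is just set membership against a database of
            | known hashes. A toy Python sketch (fake hash values; Apple's
            | published design blinds the database and adds a match
            | threshold on top of this, so take it as an illustration
            | only):
            | 
            |     known_bad_hashes = {"d41d8cd98f00b204", "a3f5c2e9b1d07788"}
            | 
            |     def is_flagged(photo_hash: str) -> bool:
            |         # Naive exact-match lookup against the hash database.
            |         return photo_hash in known_bad_hashes
            | 
            |     print(is_flagged("0123456789abcdef"))  # False: no report
            |     print(is_flagged("a3f5c2e9b1d07788"))  # True: flagged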
        
             | sweetheart wrote:
             | > matches a hash of child porn, you'll be reported
             | 
              | After 30 hits (so 30 false positives), the photos are
              | manually reviewed by a human being. How did people never
              | understand this? How on _Earth_ would people's lives be
              | randomly destroyed by this? The reviewer would take one
              | look (or 30) and immediately realize it's not illicit.
              | Crisis averted. But the chances of those 30 false positives
              | actually all being in someone's iCloud were astronomically
              | unlikely.
             | 
             | > And what if the government decides that they want to ruin
             | a political adversary?
             | 
             | Then you're already fucked, and the child porn scanning
             | software won't be what makes it possible to ruin your life.
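              | 
              | The reporting logic being argued about is roughly the
              | following (a toy sketch: the 30-image threshold is from
              | Apple's announcement, everything else is made up, and the
              | documented design uses threshold secret sharing so matches
              | below the threshold aren't even visible server-side):
              | 
              |     REVIEW_THRESHOLD = 30  # matches before any human review
              | 
              |     def escalate_to_human_review(match_count: int) -> bool:
              |         # Only accounts at/above the threshold get looked at.
              |         return match_count >= REVIEW_THRESHOLD
              | 
              |     print(escalate_to_human_review(2))   # False: nothing
              |     print(escalate_to_human_review(31))  # True: reviewed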
        
               | kergonath wrote:
               | > After 30 hits (so 30 false positives), and then the
               | photos are manually reviewed by a human being.
               | 
               | As much as I think the panic is way overblown, this bit
               | is simply unacceptable, and they need to get nailed just
               | for having thought this was a good idea. Nobody is to
               | review personal photos; this is an insane breach of
               | privacy.
        
               | sseagull wrote:
               | I understand this. But am also aware of how processes
               | work out over longer timeframes.
               | 
               | Not too many people get caught, review team gets
               | downsized since it is not a revenue generator. Maybe we
               | can replace it with ML!
               | 
               | How many stories do we have on HN about overworked
               | employees reviewing horrible material? Or about how
               | impossible it is to contact support?
               | 
               | Just because there is a good system in place now doesn't
               | mean it will stay there.
        
               | simion314 wrote:
                | >After 30 hits (so 30 false positives), the photos are
                | manually reviewed by a human being. How did people never
                | understand this? How on _Earth_ would people's lives be
                | randomly destroyed by this?
                | 
                | Don't be naive: the governments will demand that matches
                | on the "special" hashes be reviewed by local government
                | people.
               | 
                | There is a similar feature in the "child safety" package
                | that scans for stuff in chat messages and probably (maybe
                | soon) in documents that are received, so the code is in
                | place to have an unremovable
                | "antivirus/antiCSAM/antipiracy" scanner on your device.
                | You must trust both Apple and your government that it
                | does what is promised, but you definitely can't just
                | delete the executable to have peace of mind.
        
               | kergonath wrote:
               | > Don't be naive, the governments will demand that
               | matches on the "special" hashes will be reviewed by local
               | government people.
               | 
               | Seriously, if a government is after you, they will get
               | you regardless of the content of your phone. In a country
               | with decent laws, just a match is woefully insufficient
               | to demonstrate someone's guilt, and they will need some
               | proof. In other countries, laws do not matter so they can
               | just get you without issue. They have done it for quite a
               | long time and are doing it today.
               | 
               | All of this also points to the complete absurdity of this
               | mass hysteria: it is insane to ruin someone's life just
               | because they have a picture on their device. It is just
               | thoughtcrime, at this point.
        
               | simion314 wrote:
               | >All of this also points to the complete absurdity of
               | this mass hysteria: it is insane to ruin someone's life
               | just because they have a picture on their device. It is
               | just thoughtcrime, at this point.
               | 
                | But it happened: some guy was falsely accused because
                | someone made a typo in an IP address; he lost his job and
                | he was forcibly kept away from his wife and children.
               | 
               | >Seriously, if a government is after you, they will get
               | you regardless of the content of your phone
               | 
                | That is false. For example, before 1989, listening to a
                | certain radio station was illegal here in Romania, but
                | the government could not ask the people who ran the radio
                | station to give them a list of everyone who listened to
                | it. Your claim is that if the government knows my name
                | and what I did, they can fuck me up; that is true, but it
                | implies the government already knows and has the proof.
                | In this case, without the file scanner they won't know I
                | own political/religious materials that are illegal, but
                | after the file scanner runs and reports me, only then am
                | I on the government's list.
               | 
                | I hope this is clear: tech makes the job of a bad
                | government easy. In this case you, as a random user, gain
                | nothing from your device scanning for CSAM, but many lose
                | from it. The question is why, if you gain nothing and
                | this can be done with a simple iCloud scan? (Because for
                | sure it will be extended to private messages and
                | documents.)
        
               | sweetheart wrote:
                | > so in this case you, as a random user, gain nothing
                | from your device scanning for CSAM
               | 
               | This makes it seem like there is no upside at all to the
               | CSAM scanning. There is: we stop more folks peddling
               | child porn, hopefully making a dent in the problem.
        
               | simion314 wrote:
               | >This makes it seem like there is no upside at all to the
               | CSAM scanning. There is: we stop more folks peddling
               | child porn, hopefully making a dent in the problem.
               | 
                | Apple could have scanned iCloud from the start and
                | prevented this problem if it is so big. If they are
                | already scanning, then there is no use for scanning on
                | device only when iCloud is on; and if they are not yet
                | scanning iCloud, then you should ask Tim why he is
                | ignoring the problem. Google and Facebook already
                | reported a lot of abuses; does Tim love CP?
        
               | cle wrote:
               | > After 30 hits (so 30 false positives), and then the
               | photos are manually reviewed by a human being.
               | 
               | Sounds like a great gig for a pedophile.
        
               | ItsBob wrote:
               | And if the government makes it illegal to have a certain
               | document, or a book, what happens then?
               | 
               | You can guarantee Apple will report you. They have to. It
               | will be the law.
               | 
               | It will be used for human rights abuses. Guaranteed.
        
               | sweetheart wrote:
               | This falls under the "you're already fucked" part of my
               | argument. If it becomes illegal to have a file, or a
               | book, or an image, then the entire tech ecosystem we all
               | use everyday is completely untrustworthy.
               | 
               | Is that an issue? Sure! Is it a new issue, or specific to
               | Apple's implementation? Nope.
        
               | oarsinsync wrote:
               | > If it becomes illegal to have a file, or a book, or an
               | image
               | 
               | This is already true in certain circumstances.
               | 
               | > then the entire tech ecosystem we all use everyday is
               | completely untrustworthy.
               | 
               | This depends on what tech you use. The degree to which
               | you cannot trust it, or need to understand that it is
               | actively working against you, varies by platform. Nearly
               | all platforms undermine you in different ways. Some of
               | those platforms do so in a way in which you're ok with.
               | 
               | > Is that an issue? Sure! Is it a new issue, or specific
               | to Apple's implementation? Nope.
               | 
               | Agree to disagree. I don't mind cloud services (someone
               | else's computer) scanning my data and working against me.
               | I do mind my computer scanning my data and working
               | against me.
        
               | krapp wrote:
                | >After 30 hits, the photos are manually reviewed by a
                | human being. How did people never understand this? How on
                | _Earth_ would people's lives be randomly destroyed by
                | this?
               | 
               | Many people here simply assume all governments and
               | corporations are simultaneously that evil and that
               | incompetent. The aggressive cynicism of this forum and
               | its reticence to engage with content before commenting
               | undermines its pretense at informed technical acumen and
               | civil discussion in many ways.
        
               | sweetheart wrote:
               | Good ol' Hacker News, baby!
        
             | threeseed wrote:
             | I really just love these comments.
             | 
              | You do know that Apple is scanning your photos server-side,
              | right?
              | 
              | Just like Google, AWS, Yahoo, etc. - anyone that stores
              | your photos in the cloud.
        
               | IcyClAyMptHe wrote:
               | There's a fundamental difference between you sending
               | files to a company server which are then scanned, and
               | your own hardware constantly monitoring you for illegal
               | activity so that it can report you to the authorities.
        
               | ItsBob wrote:
                | > You do know that Apple is scanning your photos
                | server-side, right?
               | 
               | Yes.
               | 
               | They have every right to. It's their servers.
               | 
               | However, this is MY phone (ok, not mine, I don't use
               | Apple devices but you get my point)
        
           | BrazzVuvuzela wrote:
           | > _equivalent features are not being misused on Google and
           | Meta /Facebook's properties_
           | 
           | The objections come when such systems are to be run not on
           | corporate property, but on end user property. I don't see
           | anybody objecting to corporations scanning content uploaded
           | to the corporation's own computers. Scanning content on the
           | user's computer, which is the user's property, is what
           | unnerves people.
           | 
           | Furthermore, while Apple may have implemented this feature in
           | a relatively safe way (I don't fully understand it so I'm not
           | confident of that, but smart people here think it so I'll
           | assume it for the sake of argument), I believe that the
           | general public does not really understand this nuance and
           | Apple normalizing the practice of scanning personal computers
           | will clear the way for other corporations to implement
           | scanning in shittier ways. If Microsoft were to start scanning
           | all documents on a Windows computer by uploading those
           | documents to their own servers, is that a distinction that
           | would necessarily be made clear to the general public? I
           | don't think so. The rhetoric will be "Apple does it and
           | that's fine, so Microsoft can do it too". The technical
           | differences in the two implementations will be understood in
           | forums like this, but not by the general public at large.
        
             | angulardragon03 wrote:
             | Photo classification takes place on device as well. The
             | models are generated on Apple's side, but the photos on
             | your device are classified by your device while it sleeps
             | and charges. Check the last line on this page:
             | https://support.apple.com/en-us/HT207368
        
             | quitit wrote:
             | I agree that there is a need for such changes to be public
             | and debated - my gripe is with those who make arguments
             | that are neither factual nor grounded in the status quo
             | (thus leaping to fantasy assumptions without basis).
             | 
             | For example Google and Meta both scan all material that is
             | on their services, both of these providers actually go
             | further by utilising AI to hunt imagery which may not be
             | identified yet. So the idea of this compelling others to
             | scan is false: Apple is last here.
             | 
             | Similar to Google and Meta, Apple's CSAM processes only
             | come into play when the material is uploaded to Apple's
             | servers. The minutiae of where the scanning occurs is
             | irrelevant because Apple's implementation is merely a way
             | of performing the function without giving up the user's
             | privacy (by scanning locally rather than sending all of
             | your photos to Apple's servers for review.)
             | 
             | The scanning process is merely the addition of verified
             | hashes to a system which already uses this same technique
             | to categorise one's photos into searchable images
             | ("waterfalls", "dogs", "bicycles", etc.); because this is an
             | existing system, there is no impact on performance or device
             | usage.
        
               | BrazzVuvuzela wrote:
               | > _The minutiae of where the scanning occurs is
               | irrelevant because Apple 's implementation is merely a
               | way of performing the function without giving up the
               | user's privacy (by scanning locally rather than sending
               | all of your photos to Apple's servers for review.)_
               | 
               | This is where you and I part ways. I don't think that's
               | irrelevant minutiae, because it normalizes on-device
               | scanning. This implementation or others may change in the
               | future and opposition to those future implementations
               | will be hampered by "Apple was already doing it"
               | arguments, which will sound fairly plausible to the non-
               | technical public.
               | 
               | Maybe I'm making a 'slippery slope' argument here, but
               | I've become comfortable with such lines of thinking. My
               | cynicism has been rewarded too often before.
        
               | quitit wrote:
               | Yet the on device scanning already exists on every
               | platform. This is how the system makes content rapidly
               | searchable.
        
               | BrazzVuvuzela wrote:
               | The context of this discussion is on-device scanning that
               | reports users to the police. It's tedious to spell this
               | out; you should be able to infer it.
        
             | saagarjha wrote:
             | > I don't see anybody objecting to corporations scanning
             | content uploaded to the corporation's own computers.
             | 
             | That depends on how "BYOD" the company is. There's an awful
             | lot of companies that provide work machines that most of
             | their employees reasonably use for some sort of personal
             | work, be it checking their bank statements to see if they
             | got a deposit or logging into a handful of services so
             | their settings sync. In my opinion it is unreasonable to
             | expect people to _not_ do this, especially given that the
             | alternative is quite painful with modern software.
             | Corporate searches have a problem of not really having
             | clear limits like a legal search might, so going through
             | e.g. your synced web history when trying to find evidence
             | of credential abuse (check your own audit logs!) is common.
        
               | BrazzVuvuzela wrote:
               | My understanding is that the EU has stronger privacy
               | protections for employees using their employer's
               | computers. Personally, I keep all my private business off
               | my company laptop, if I want to check my bank statements
               | I use my phone (which is not enrolled in any company
               | system, I refuse those offers.) I don't sync a single
               | thing between my personal computers and work computer.
               | But yes, I see your point. Maybe there is some merit to
               | stronger privacy protection laws in this circumstance,
               | since so many users of corporate computers lack the
               | inclination, awareness, or discipline to keep their personal
               | stuff off computers they use but don't own.
        
           | simion314 wrote:
           | The difference is that it is only one IF statement or a
           | local/remote config flag away from enabling this scanning on
           | your device even if you don't use the iCloud photo sharing
           | feature.
           | 
           | It is super well documented that Apple ALWAYS follows the
           | laws of the land, so the scenario is that a government adds
           | some new hashes that might not be CSAM as you and I define
           | it but might be political material (and the citizens can't
           | check). Also, now that the code is there, Apple can't refuse
           | to add these hashes or to enable the feature when a legal
           | warrant for that territory is presented.
           | 
           | A few more config changes and it can start scanning for text
           | in documents too.
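           | 
           | To make that concern concrete, here is a minimal
           | hypothetical sketch (Python, with invented names; this is
           | not Apple's actual code) of how a remotely delivered config
           | could widen an on-device scanner's scope without any new
           | code shipping:
           | 
           |     # hypothetical remote config as shipped today
           |     config = {
           |         "scan_only_icloud_uploads": True,
           |         "hash_db": "csam_intersection_v1",
           |     }
           | 
           |     def should_scan(photo, config):
           |         # today's promise: only what is queued for iCloud
           |         if config["scan_only_icloud_uploads"]:
           |             return photo["queued_for_icloud"]
           |         # one flipped flag later: everything on the device
           |         return True
           | 
           |     photo = {"queued_for_icloud": False}
           |     print(should_scan(photo, config))           # False
           |     config["scan_only_icloud_uploads"] = False  # the one flag
           |     print(should_scan(photo, config))           # True
           | 
           | The gate is policy, not architecture.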
        
             | threeseed wrote:
             | Given that Apple wrote OSX, this scenario has been possible
             | for over 20 years.
             | 
             | And it hasn't eventuated. Yet now it is something to
             | panic over ?
        
               | simion314 wrote:
               | Until recently you had a lot of freedom on your OSX device:
               | you could install and uninstall whatever you wanted, and you
               | could have a firewall that blocked applications. On iOS, and
               | slowly on OSX, these freedoms are being lost; already, when
               | you open your computer or open an application, Apple is
               | notified (for your safety). Only a few more steps are needed
               | until you get laptops with iOS-like lockdowns, and maybe an
               | expensive PRO version for the ones that want the key to the
               | gates.
        
               | jeromegv wrote:
               | You can blame them when that happens but it's odd to
               | blame OS X for being locked when clearly OS X is not...
               | locked.
        
               | simion314 wrote:
               | >You can blame them when that happens but it's odd to
               | blame OS X for being locked when clearly OS X is not...
               | locked.
               | 
               | OSX is getting more and more locked down, and iOS is already
               | locked, so why shouldn't people complain when things get
               | worse?
               | 
               | I was just explaining that the excuse "OSX is 20 years old
               | and it was not locked yet, so it will never be locked,
               | q.e.d." is wrong; you can see clearly how things get more
               | locked in steps, and you don't have to wait until the last
               | step to complain.
               | 
               | Keep in mind that you have better keyboards on Apple
               | laptops today because people complained, the CSAM shit
               | was delayed because people complained, the Apple tax was
               | lowered in some cases because developers and Epic
               | complained, so complaining sometimes works.
        
             | quitit wrote:
             | This argument falls exactly into the point I make about not
             | understanding the implementation or the status quo.
             | 
             | Here's what would need to change about the system to fulfil
             | your scenario:
             | 
             | 1. The local government would need to pass a law allowing
             | such a change.
             | 
             | 2. The hashes would then need to come from the government,
             | instead of from the intersection of two independent CSAM
             | databases.
             | 
             | 3. The review process would need to be redesigned and
             | redeployed to provide a lower threshold and to permit the
             | reviewers to see high resolution images. Neither of these
             | changes is trivial.
             | 
             | 4. The reports would then need to go to the government.
             | 
             | 5. What's stopping these same governments from requesting
             | such things from Google and Meta - both of which have
             | comparable systems with lower oversight?
             | 
             | Apple doesn't "ALWAYS follow the laws of the land"; one can
             | read about this in the recent "secret" agreement between
             | Apple and China, which details the various ways that Apple
             | hasn't complied with Chinese requests (e.g. the requests
             | for source code.)
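             | 
             | To picture the safeguard in point 2, here is a toy sketch
             | (Python, made-up hash values; the real entries are
             | NeuralHash digests, not short strings):
             | 
             |     # hashes vouched for by two child-safety orgs in
             |     # separate jurisdictions (toy values)
             |     ncmec_db = {"a1f3", "b2e4", "c9d7", "d0d0"}
             |     second_org_db = {"b2e4", "c9d7", "ffff"}
             | 
             |     # only hashes present in BOTH lists were to be
             |     # shipped to devices
             |     shipped = ncmec_db & second_org_db
             |     print(sorted(shipped))   # ['b2e4', 'c9d7']
             | 
             | So one government slipping an extra hash into one list
             | would not, by itself, put that hash on anyone's phone.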
        
               | amcoastal wrote:
               | Oh, you mean apple didn't bend over for China when it would
               | have meant giving up their source code, but when it didn't
               | involve them losing their IP they did it easily? Great job
               | Apple, a stand-up move that I can trust.
        
               | quitit wrote:
               | There's plenty of other examples - but people hunting
               | strawmen are somehow blind.
        
               | ItsBob wrote:
               | How do you know that there isn't already a secret order
               | in place to look for a particular list of hashes? Perhaps
               | something that has been sealed due to national security?
               | 
               | If it's triggered, send the details to a particular group
               | of individuals with certain clearance to view and verify.
               | 
               | If they have the ability to do this, they will definitely
               | do this!
        
               | jeromegv wrote:
               | Why would apple have released a whole white paper on the
               | feature if the plan was... to do it in secret? Clearly
               | they could have gone the google route, start scanning and
               | not tell anyone about it. It's so odd to blame them for
               | something they haven't done.
               | 
               | Of course they could decide later to implement it! Just
               | like any company could choose to do that! It's the whole
               | point, your phone is closed source. But clearly apple
               | didn't pick that route, they announced it BEFORE shipping
               | and asked for feedback! And you're blaming them for
               | that?!
        
               | BrazzVuvuzela wrote:
               | > _Why would apple have released a whole white paper on
               | the feature if the plan was... to do it in secret?_
               | 
               | Didn't that only happen after a leak?
        
               | simion314 wrote:
               | >Why would apple have released a whole white paper on the
               | feature if the plan was
               | 
               | Isn't it obvious? Some guy using an exploit would install a
               | debugger on his iPhone and find this scanning process, and
               | Apple's PR would take a hit. But Apple could do this and
               | gain PR by presenting it as a safety feature; the guy with
               | the debugger will see that files are scanned and hashes are
               | sent to the server, but he can't prove that the config file
               | won't be updated in the future for everyone or for some,
               | that the hashes won't be pulled from a different database,
               | that the threshold won't change, or that the is-iCloud-
               | enabled flag won't be ignored.
               | 
               | What makes no sense, and what Apple failed to explain and
               | fanboys also failed to convince me of with any evidence, is
               | why the fuck not scan the images on iCloud only, since we
               | are pretending that the images are scanned only if iCloud
               | is enabled. (There is no credible evidence that this is
               | part of any plan to offer some kind of super encryption
               | that Apple has no keys for.)
        
               | ItsBob wrote:
               | I mean that the Gov will come along and say:
               | 
               | "See that hashing thing you have, this secret gov. order
               | that you can't tell anyone about says you need to look
               | for this list of hashes too. If you find them, you have
               | to let these 5 people know and if you tell anyone you'll
               | accidentally shoot yourself three times in the back of
               | the head to commit suicide!"
        
               | simion314 wrote:
               | >1. The local government would need to pass a law allowing
               | such a change
               | 
               | Sure, it is not hard, but existing laws can already apply.
               | Did China have to pass a law like "any company has to update
               | their maps software to conform with our local maps", or did
               | China just show Timmy a graph with profits from China
               | dropping, after which Apple said "yes sir!!!"?
               | 
               | > 2. The hashes would then need to come from the government,
               | instead of from the intersection of two independent CSAM
               | databases
               | 
               | Yep, the government will demand that a local database be
               | used, and since the crimes are local you need to send the
               | reports to the local police, and a local Apple employee will
               | have to check, not some cheap random contractor.
        
               | kmeisthax wrote:
               | Funny that you mention maps, because mapping information
               | in China is heavily censored, and has been since 2002[0].
               | They even wrote their own coordinate-obfuscation
               | algorithm so that you can't mix Chinese and western map
               | data or GPS coordinates and get accurate positions on
               | both sides.
               | 
               | Apple's response to the "what if governments force you to
               | abandon the hidden set intersection stuff you wrote"
               | thing is, "well we'll just say no". Which, unless Apple
               | plans to pull out of hostile jurisdictions entirely or
               | levy their own private military[1], holds no weight.
               | Countries own their internal markets in the same way
               | Apple owns the App Store market.
               | 
               | [0] https://en.wikipedia.org/wiki/Restrictions_on_geograp
               | hic_dat...
               | 
               | [1] which should scare the shit out of you even more than
               | the CSAM scanner
        
               | simion314 wrote:
               | Did you miss the reality where Apple says yes to China all
               | the time? Including maps and storing the iCloud data in
               | China?
               | 
               | The CSAM scanner does not scare me as it is today, but it is
               | a step: next year it will scan more files, then Google will
               | copy Apple (willingly or not), and then most PCs will do the
               | same thing.
               | 
               | Apple just normalized the idea that it is fine to have a
               | proprietary file scanner/"antivirus" installed on your
               | device with these properties:
               | 
               | 1. it can't be stopped or removed
               | 
               | 2. it gets hashes from some government agencies over online
               | updates
               | 
               | 3. matches will be reviewed by someone it says is
               | trustworthy and might get sent to your local police
               | 
               | 4. the stupid scanner has stupid excuses to exist, when it
               | would have been extremely simple and non-controversial to
               | scan the fucking images after the upload and not before.
        
               | dotancohen wrote:
               | I see that you don't live in Turkey. Or Lebanon. Or
               | Egypt, or the United States, or Brazil, or any of tens of
               | other nations where this exact scenario is not only
               | plausible, but is of real concern.
        
               | kergonath wrote:
               | What is the alternative? Companies deciding that the law
               | does not apply, just because?
               | 
               | Reverse this for a moment: would it be acceptable that a
               | Chinese tech behemoth decides US law does not apply to
               | their American operation, because they are convinced that
               | western democratic values are dangerous?
               | 
               | This is also a real concern. If you think that rule of
               | law is a good thing, then companies _must_ comply with
               | local laws. The alternative is not a humanist utopia; it
               | is the rule of unaccountable corporations.
        
               | simion314 wrote:
               | >If you think that rule of law is a good thing, then
               | companies must comply with local laws. The alternative is
               | not a humanist utopia; it is the rule of unaccountable
               | corporations.
               | 
               | In this case there is no law forcing Apple to scan on
               | device; they could scan iCloud only if the law forced them
               | to do that, or scan only iCloud images that are public or
               | shared if the law forced only that.
               | 
               | Before 1989, listening to a specific radio station was
               | illegal, but the good part was that the radio device could
               | not snitch on you. If someone had invented a snitching radio
               | then I am 100% sure only those radio devices would have been
               | legal to sell in my country at that time. So if you create a
               | snitching device, you invite the bad governments to add laws
               | afterwards to profit from it and make their lives easier.
        
               | quitit wrote:
               | Then you should already be gravely concerned because, as
               | I've stated: Apple is last to the game here. Google and
               | Meta each with significantly larger audiences than Apple
               | are already doing this. Since Meta in particular is
               | social media and the main source of casual political
               | dis/information for the masses, this would be the low-
               | hanging fruit that you're worried about.
               | 
               | The vague "this is a serious concern because my
               | government is shit" requires a person to ignore a lot in
               | favour of a little.
        
               | dotancohen wrote:
               | > Then you should already be gravely concerned
               | 
               | I am concerned. I explicitly wrote "...not only
               | plausible, but is of real concern."
        
               | cool_dude85 wrote:
               | Sorry, my meta hardware is scanning client-side for CSAM?
               | And you claim it's the same for google; can you tell me
               | which Pixel is scanning client-side?
        
               | jeromegv wrote:
               | Which apple device is now scanning client side for CSAM?
        
               | simion314 wrote:
               | >Which apple device is now scanning client side for CSAM?
               | 
               | I think you missed the OP's point. The grandparent offered a
               | bad excuse for Apple - that Google and Facebook do the same,
               | so nothing is wrong - but the OP responded correctly that
               | this is false: those companies don't install file scanners
               | on your device, so the excuse is bullshit.
        
               | kergonath wrote:
               | > those companies don't install file scanners on your
               | device so the excuse is bullshit.
               | 
               | They don't install file scanners on iOS because they
               | cannot. They are happy with scanning my contacts and
               | doing who knows what with my photos if I let them,
               | though.
        
               | simion314 wrote:
               | >They don't install file scanners on iOS because they
               | cannot. They are happy with scanning my contacts and
               | doing who knows what with my photos if I let them,
               | though.
               | 
               | Nobody is saying that Google or Facebook are good or saints,
               | but there is a difference: my phone did not have Facebook or
               | WhatsApp installed, so journalists or activists can simply
               | not install this crap.
        
               | BrazzVuvuzela wrote:
               | > _two intersection of two independent CSAM databases._
               | 
               | I don't think that offers meaningful protection. A
               | malicious government could acquire or produce real CSAM,
               | or at least a convincing fake of it, then modify that
               | image subtly to match the hash of the political content
               | they were targeting (I believe such engineered collisions
               | have already been demonstrated.) They would then be able
               | to get that manipulated CSAM into as many independent
               | CSAM databases as they wanted, simply by uploading it to
               | the web then reporting it to all those independent
               | databases.
        
               | quitit wrote:
               | This isn't feasible for a variety of reasons:
               | 
               | 1. Forced hash collisions are fragile in comparison to
               | the genuine material. CSAM hashing is devised in a way to
               | allow change in the image without significant variation
               | to the hash. Forced collisions however don't possess that
               | property as their true hash is different.
               | 
               | 2. It's trivial to detect images with false hashes, and
               | different hashing methods will not produce collisions.
               | 
               | 3. The task you're proposing is, in itself, neither
               | pragmatic nor effective: dissenting political materials are
               | easily created fresh (i.e. no colliding hash) and exist in
               | many forms that go well beyond an image.
               | 
               | 4. Human reviewers would again see the pattern and adapt the
               | system accordingly.
               | 
               | 5. CSAM database holders aren't idiots. They understand
               | the significance of their hash libraries and work against
               | manipulation from CSAM peddlers.
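               | 
               | A toy version of point 2 (Python with numpy; these
               | simplified "average" and "difference" hashes are nothing
               | like NeuralHash's actual construction, they just show why
               | a second, independent hash catches a forged collision in
               | the first one):
               | 
               |     import numpy as np
               | 
               |     rng = np.random.default_rng(0)
               | 
               |     def downsample(img, rows, cols):
               |         # crude box-filter downsample of a 2-D array
               |         h, w = img.shape
               |         img = img[:h - h % rows, :w - w % cols]
               |         return img.reshape(rows, img.shape[0] // rows,
               |                            cols, img.shape[1] // cols
               |                            ).mean(axis=(1, 3))
               | 
               |     def ahash(img):
               |         # toy "average hash": 64 bits vs the mean
               |         s = downsample(img, 8, 8)
               |         return (s > s.mean()).flatten()
               | 
               |     def dhash(img):
               |         # toy "difference hash": cell vs right neighbour
               |         s = downsample(img, 8, 9)
               |         return (s[:, 1:] > s[:, :-1]).flatten()
               | 
               |     target = rng.random((256, 256))  # database image
               |     decoy = rng.random((256, 256))   # attacker's image
               | 
               |     # suppose the attacker perturbed `decoy` until its
               |     # ahash matched ahash(target); the independent
               |     # dhash still disagrees on roughly half its bits
               |     diff = int((dhash(decoy) != dhash(target)).sum())
               |     print(diff, "of 64 bits differ")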
        
               | BrazzVuvuzela wrote:
               | Points 1 and 3 assume a government wouldn't choose to target
               | people who have a specific document; I think this is a poor
               | assumption. Hash fragility doesn't seem relevant if both the
               | hashing algorithms and the target content are known to the
               | government.
               | 
               | Points 2 and 5 I don't understand - could you expand on
               | them? If the government produced manipulated CSAM that
               | didn't exist in any other pre-manipulation form, how would
               | the manipulation be detected? And supposing it were
               | detected, would CSAM database holders choose not to add that
               | image to their databases just because it was manipulated? If
               | that were the case, it would give pedophiles a way to exempt
               | their images from the database too.
               | 
               | Point 4 is neutralized by the government demanding local
               | human reviewers for phones located in their country. China
               | already does this with iCloud; they could demand the same
               | for the CSAM reviewers, then coerce those reviewers or
               | simply select them for party loyalty in the first place.
        
           | codydh wrote:
           | It seems that many of the arguments against this system rely
           | on believing governments (or similarly powerful groups) will
           | strong-arm Apple and fabricate evidence.
           | 
           | I fail to see how this system (which I'm not a big fan of, to
           | be clear) makes it easier if you assume governments can and
           | will fabricate evidence. It doesn't seem you'd need an
           | iPhone, a smartphone, or to have owned or done anything at
           | all in that case.
        
         | m12k wrote:
         | > Apple have done their homework and they will not release
         | ANYTHING unless they think they'll either make money from it or
         | at the very least, not LOSE money from it. It's all about
         | money, nothing else.
         | 
         | > It's coming whether we like it or not now :(
         | 
         | Um, I think the public outcry showed them pretty clearly that
         | they will lose customers and money precisely if they go ahead
         | with this. I think the people championing this inside Apple
         | (some rumors say Cook himself) were expecting this to be some
         | kind of feel-good, "we're thinking of the children" PR piece -
         | not the shitstorm it turned out to be.
        
           | derwiki wrote:
           | Now they know to be quieter about this. It will be a quiet
           | line item in an upcoming dot release. And not enough people
           | will fuss (eventually).
        
             | sharklazer wrote:
             | I have a feeling it already came in 15.2. They are scanning
             | texts for "nudity" in images. Not that they are the same
             | thing. Or being used in the same way. But I have a feeling
             | CSAM scanning just needs its flag turned on to work.
             | 
             | https://petapixel.com/2021/12/13/apple-to-release-nudity-
             | det...
        
               | jeromegv wrote:
               | Thank you for your "feelings". Except your feelings have
               | no basis in reality. Why would you blame a company for
               | something that they clearly haven't done?
        
               | zionic wrote:
               | I see where you're coming from, but Apple brought this on
               | themselves by announcing this invasive spyware garbage.
               | This is what you get when you undermine your users'
               | trust.
        
               | sharklazer wrote:
               | Well, they're gonna push it at some point. Do you think
               | they'll tell you when they do, given the backlash?
               | 
               | Tell you what, put up 20k USD and I'll go through the
               | formal process of verifying binaries and whatnot.
               | Otherwise, I think my "feeling" still contributes to the
               | discussion. I don't need to put in 200 hours to prove
               | something I'm not misrepresenting as fact.
               | 
               | CSAM scanning will come. Apple probably won't tell you.
        
       | ksec wrote:
       | I am glad that, _finally_, a healthy dose of skepticism is
       | being applied to Apple. Finally. Even on sites like
       | Macrumors.com.
        
       | endisneigh wrote:
       | I never understood the big deal about this. If you use most cloud
       | file synching services you are already being scanned, and if you
       | don't, this (CSAM) thing wouldn't have affected you anyway.
       | 
       | The main rebuttal is that people claimed that Apple could expand
       | the scanning to photos you aren't sending to iCloud. If you
       | believe that though, then not using Apple devices (regardless of
       | whether they implement CSAM or not) is clearly the move.
       | 
       | tldr: if you trust apple, this doesn't matter, and if you don't,
       | you shouldn't use apple devices anyway.
        
         | fsflover wrote:
         | https://news.ycombinator.com/item?id=29564794
        
         | mikehall314 wrote:
         | Many people were upset or annoyed at the idea of their own
         | device spying on them. I can understand that.
         | 
         | However, the major issue for me is the database of "forbidden"
         | neural hashes which you are not permitted to have on your
         | phone. Who controls what goes into that database? Right now,
         | it's only CSAM and Apple have said they pinky-swear that if
         | they're told to add other things they will refuse.
         | 
         | And maybe they will. Maybe Tim Cook will fight tooth-and-nail
         | to avoid adding non-CSAM hashes to that list. But then maybe
         | Tim retires in five years' time and his replacement doesn't
         | have the appetite for that fight. Or maybe Apple gets a court
         | order saying "you MUST also report this sort of content too".
         | Maybe
         | that list of forbidden hashes now includes political imagery,
         | the picture from Tiananmen Square, known anti-government memes,
         | etc.
         | 
         | Once the technology is created and deployed, the only thing
         | preventing that sort of over-reach is Apple's resolve against
         | governmental pressure; even if you'd trust Tim Cook with your
         | life, I can completely see why that would give people pause.
        
           | endisneigh wrote:
           | > However, the major issue for me is the database of
           | "forbidden" neural hashes which you are not permitted to have
           | on your phone. Who controls what goes into that database?
           | Right now, it's only CSAM and Apple have said they pinky-
           | swear that if they're told to add other things they will
           | refuse.
           | 
           | This isn't what was being proposed. What was being proposed
           | was that if you had a photo that resolved to a "forbidden"
           | hash, upon being uploaded to iCloud data would be sent to
           | Apple. Presumably after some arbitrary number of times this
           | happens, information would then be sent to the local
           | government.
           | 
           | If you were not using iCloud to store photos then it wouldn't
           | matter either way (according to Apple).
        
             | mikehall314 wrote:
             | I don't think anything I've said contradicts that. Although
             | Apple said they would only scan content which was uploaded
             | to iCloud Photos, scanning still takes place locally on-
             | device.
        
               | endisneigh wrote:
               | I'm not saying you made a contradiction - I'm saying that
               | the files in question would've been scanned either way.
               | The question is simply whether or not you trust and
               | believe Apple. If you don't it doesn't make any
               | difference whether it's being scanned on device or not.
        
               | mikehall314 wrote:
               | I'm sorry - when you said "this isn't what was being
               | proposed" I must have mis-understood what you meant.
        
           | Syonyk wrote:
           | > _However, the major issue for me is the database of
           | "forbidden" neural hashes which you are not permitted to have
           | on your phone. Who controls what goes into that database?
           | Right now, it's only CSAM and Apple have said they pinky-
           | swear that if they're told to add other things they will
           | refuse._
           | 
           | The problem with this is that we already know "National
           | Security Letters with Gag Orders" exist, and we know that the
           | US government, at least, doesn't mind using those.
           | 
           | "You will do this thing and you will tell nobody that you've
           | done this thing."
           | 
           | The Lavabit rabbit hole is a case study here: https://en.wiki
           | pedia.org/wiki/Lavabit#Suspension_and_gag_ord...
        
           | zionic wrote:
           | >Maybe Tim Cook will fight tooth-and-nail to avoid adding
           | non-CSAM hashes to that list.
           | 
           | 1) Tim Cook will have no knowledge of this when the
           | government inevitably expands the hash list
           | 
           | 2) Apple gets a list of hashes, not content, from NCMEC which
           | is an unconstitutional quasi-governmental agency created by
           | congress and funded by the government to bypass the 4th
           | amendment.
           | 
           | 3) Apple will simply get an updated list of hashes and has no
           | reasonable means to verify what the hashes from any given
           | government are.
        
       | matthewdgreen wrote:
       | My prediction is that client-side CSAM scanning is indeed dead,
       | but Apple will make the "easy" decision to perform server-side
       | scanning. The problem is that this decision will lock them out of
       | offering an (increasingly standard) E2EE backup capability. I
       | don't know that the Apple execs have grappled with the
       | implications of this, long term. But at least it's an
       | improvement.
        
         | fossuser wrote:
         | I'm hoping they're just going to announce it alongside e2ee
         | rather than not doing it at all in favor of server side
         | unencrypted scanning.
         | 
         | It was a mistake to announce it in isolation as seen by this
         | blowback because most people don't understand or care about the
         | actual details and just loudly complain.
         | 
         | Similar to the dismissal of the privacy preserving Bluetooth
         | exposure notifications. Dumb knee-jerk dismissals imo that end
         | up with worse outcomes for everyone.
        
         | FabHK wrote:
         | Maybe they can offer a choice?
         | 
         | You can pick
         | 
         | a) no backups
         | 
         | b) iCloud backups with E2EE and privacy preserving CSAM
         | scanning as suggested by Apple (private set intersection, half
         | client side, half server side)
         | 
         | c) iCloud backups, unencrypted, with CSAM scanning server side
         | (like all other cloud services)
         | 
         | I know what I'd choose!
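         | 
         | For context on option (b): the "half client side, half
         | server side" property rests on a threshold scheme - the
         | server holds encrypted safety vouchers it cannot open until
         | it has collected a threshold number of matching ones. A
         | generic Shamir-style sketch of that threshold property in
         | Python (toy field size and made-up numbers, not Apple's
         | actual construction):
         | 
         |     import random
         | 
         |     random.seed(0)
         |     PRIME = 2**61 - 1   # toy prime field
         | 
         |     def make_shares(secret, threshold, n):
         |         # any `threshold` of the n shares recover `secret`
         |         coeffs = [secret] + [random.randrange(PRIME)
         |                              for _ in range(threshold - 1)]
         |         def f(x):
         |             return sum(c * pow(x, i, PRIME)
         |                        for i, c in enumerate(coeffs)) % PRIME
         |         return [(x, f(x)) for x in range(1, n + 1)]
         | 
         |     def recover(shares):
         |         # Lagrange interpolation at x = 0
         |         total = 0
         |         for xi, yi in shares:
         |             num, den = 1, 1
         |             for xj, _ in shares:
         |                 if xj != xi:
         |                     num = num * (-xj) % PRIME
         |                     den = den * (xi - xj) % PRIME
         |             inv = pow(den, PRIME - 2, PRIME)
         |             total = (total + yi * num * inv) % PRIME
         |         return total
         | 
         |     key = 123456789   # unlocks an account's flagged vouchers
         |     shares = make_shares(key, threshold=30, n=100)
         |     print(recover(shares[:30]) == key)   # True
         |     print(recover(shares[:29]) == key)   # False
         | 
         | Below the threshold the server holds points on a random
         | degree-29 polynomial and learns nothing about the key, which
         | is the property people mean when they say the server can't
         | look at anything before ~30 matches.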
        
         | zionic wrote:
         | The only acceptable solution is full E2EE for all Apple
         | services. Nothing else is satisfactory at this point. They need
         | to take this step to regain their users' trust.
        
           | judge2020 wrote:
           | This isn't exactly in Apple's hands.
           | 
           | https://www.bbc.com/news/technology-51207744
           | 
           | https://www.eff.org/deeplinks/2019/12/senate-judiciary-
           | commi...
        
             | zionic wrote:
             | No law is on the books blocking encryption. Apple needs to
             | nut up and ship full E2EE, daring the politicians to do
             | anything about it. Let's see how big they talk when it's
             | their political careers on the line.
        
         | giansegato wrote:
         | They've already been doing server-side CSAM scanning on
         | iCloud pics for a while. Their issue was targeting all media
         | that wasn't uploaded.
        
           | snowwrestler wrote:
           | The system they proposed would only have scanned images in
           | the process of being uploaded--so, the opposite of your 2nd
           | sentence.
           | 
           | There was some reporting that Apple does server-side CSAM
           | scanning, but based on the volume of matches they report, it
           | must be extremely limited.
           | 
           | Apple never said it this way, but many people assumed that
           | Apple designed this client-side scanning system so that they
           | could then encrypt iCloud data with client-side keys, which
           | would provide end to end encryption for customer backups...
           | and prevent any server-side scanning. This is likely what
           | Matthew is referring to.
        
             | zerkten wrote:
             | From a read of the expert opinions on the CSAM topic when
             | this blew up, it seemed like you had to be all in, or keep
             | out of active CSAM scanning/monitoring for legal reasons.
             | They noted the match volume and came to the same conclusion.
             | 
             | With the attention, I don't see how Apple can delay taking
             | some action here. It is a new external liability that can be
             | attacked in different ways. Either they need to level up the
             | CSAM scanning server-side to increase compliance, or they
             | need to reimplement the client-side approach. The latter
             | approach is probably much better from their perspective for
             | various reasons.
             | 
             | Ultimately, they might have been better off giving this a
             | lower profile overall and letting the public attention fall
             | on law enforcement, to provide some cover. The authorities
             | are going to get this in some fashion eventually.
        
             | judge2020 wrote:
             | > but based on the volume of matches they report, it must
             | be extremely limited.
             | 
             | For reference:
             | 
             | > According to NCMEC, I submitted 608 reports to NCMEC in
             | 2019, and 523 reports in 2020. In those same years, Apple
             | submitted 205 and 265 reports (respectively). It isn't that
             | Apple doesn't receive more picture than my service, or that
             | they don't have more CP than I receive. Rather, it's that
             | they don't seem to notice and therefore, don't report.
             | 
             | https://www.hackerfactor.com/blog/index.php?/archives/929-O
             | n...
        
               | snowwrestler wrote:
               | Thanks, that is the reference I was thinking of.
        
         | jamil7 wrote:
         | > The problem is that this decision will lock them out of
         | offering an (increasingly standard) E2EE backup capability
         | 
         | Is this a given?
        
           | willis936 wrote:
           | It's not really encrypted if you can tell what the data is.
           | The two capabilities are entirely mutually exclusive.
        
         | bangonkeyboard wrote:
         | Thanks for raising the initial alarm on this.
        
       | yelskcol wrote:
       | Seems bizarre that this is what people take issue with and yet
       | they're happy otherwise to hand over their data to FAANG
       | companies. If you're concerned about privacy / data misuse you
       | shouldn't be using these services at all and you should have been
       | pointing out the issues years ago.
        
         | fsflover wrote:
         | https://news.ycombinator.com/item?id=27897975
        
         | authed wrote:
         | Data stored in the cloud should be considered public (at this
         | time), but not the data on your device.
        
       | [deleted]
        
       | balozi wrote:
       | _> Apple has quietly nixed all mentions of CSAM from its Child
       | Safety webpage, suggesting its controversial plan to detect child
       | sexual abuse images on iPhones and iPads may hang in the balance
       | following significant criticism of its methods._
       | 
       | OR more likely suggests that Apple prefers to continue its
       | program in secret.
        
       | nbzso wrote:
       | I don't care. I already switched my business to Linux Boxes. The
       | last Mac OS in use is Catalina. All of my critical software runs
       | under Linux. Figma is in the browser. Affinity Designer is
       | running perfectly under Isolated Windows VM. If I ever use Apple
       | computer again it will be with Linux. After 20+ years of using
       | Apple computers I don't find them interesting enough to risk my
       | data and be scanned, categorized and processed by third party
       | criteria. Think different. Use Linux.
        
         | kf6nux wrote:
         | > I don't care. I [switched to Linux]
         | 
         | That's short-sighted and self-centered.
         | 
         | You don't care about whether or not other people's rights are
         | violated?
         | 
         | You don't see how normalizing rights violations could lead to
         | downstream effects on you?
        
           | nbzso wrote:
           | Please, don't insult me and judge me so quickly. Read all my
           | responses. You can check my posts back in time (If you wish).
           | 
           | The whole reason for my HN post activity (reading in the
           | shadows from 2008-10) is CSAM. In the beginning of this issue
           | nobody cared and tons of Apple fanboys down-voted every
           | possible negative reaction.
           | 
           | Apple is an industry leader. Everyone will copy what they
           | create. You can read "I don't care" as an emotional reaction
           | of an individual who had invested tons of money in Apple,
           | used Apple products and evangelized thousands of friends and
           | customers. Only to realize that The Beauty is actually The
           | Beast.
        
           | matheusmoreira wrote:
           | We do care. However, what can we do about it? Can we somehow
           | make the billion dollar corporation do what we want?
           | Unlikely. Better to use systems that we actually control so
           | that we don't have to suffer their tyranny.
        
           | zarzavat wrote:
           | The truth is that nobody has a right to use Apple products.
           | It's good that Apple listens to criticism but if they had
           | decided to steamroller this through, they would have been
           | within their rights to do so.
        
         | donohoe wrote:
         | And your phone is what?
         | 
         | If it's an Android, it's probably got something similar in
         | place.
         | 
         | On an iPhone, all you must do is disable iCloud sync to
         | avoid CSAM scanning.
        
           | fsflover wrote:
           | How about GNU/Linux phones, the Librem 5 and the Pinephone?
           | Not well polished yet, but they run a desktop OS with all
           | its possibilities.
        
           | SXX wrote:
           | Android has plenty of open source distributions without such
           | features. They don't save you from a baseband backdoor, but
           | at least the OS itself isn't your enemy.
        
           | nbzso wrote:
           | Basic phone and iPhone SE (the old version) for banking apps.
           | Never used iCloud. All my backups are done locally. It is not
           | only about CSAM.
           | 
           | CSAM was the last straw. Before this, there was that: echo
           | 127.0.0.1 ocsp.apple.com | sudo tee -a /etc/hosts
        
             | threeseed wrote:
             | In your situation then CSAM scanning would never have been
             | an issue.
             | 
             | It only applies when you have iCloud Photo Library enabled.
        
               | mensetmanusman wrote:
               | No, the specific controversy was about on-device hash
               | scanning which used child porn as cover, when in fact it
               | is exactly the feature that China et al. wants in order
               | to make sure people aren't sharing any political memes
               | etc.
        
               | virtue3 wrote:
               | for now? You can't really guarantee anything with
               | companies when they start rolling things out like this.
               | 
               | The fact that they went so hard for the feature without
               | really considering how people felt about the invasion is
               | important.
        
               | threeseed wrote:
               | Why would they bother ? The PR from actually trying to
               | help users has been negative.
               | 
               | So instead they will just continue to scan your photos
               | server-side.
        
               | nicce wrote:
               | > So instead they will just continue to scan your photos
               | server-side.
               | 
               | Yep, and because of that we are not getting E2E services
               | (Photos, backup), which would have been possible with
               | this new technology.
        
               | jaywalk wrote:
               | E2EE services are absolutely possible without this
               | technology. What are you saying?
        
               | nicce wrote:
               | They are possible, but they will not be implemented on
               | Apple's scale without CSAM features. Politicians are already
               | trying to mandate backdoors.
        
               | jeromegv wrote:
               | Without considering how people felt? They cancelled it!
               | The whole process was about getting feedback on it.
               | 
               | Were you consulted on how google implemented it server
               | side with google photos? You likely didn't know!
               | 
               | Only one company chose transparency here.
        
               | pseudalopex wrote:
               | An Apple VP implied anyone who opposed it didn't
               | understand it. And endorsed an NCMEC director calling
               | them screeching voices of the minority.[1]
               | 
               | [1] https://9to5mac.com/2021/08/06/apple-internal-memo-
               | icloud-ph...
        
               | boppo1 wrote:
               | > They cancelled it!
               | 
               | I doubt it. It will be quietly implemented. If a techie
               | digs in far enough with RE to prove it, general society
               | will treat them like they wear a tinfoil hat.
        
               | jaywalk wrote:
               | > The whole process was about getting feedback on it.
               | 
               | What a load of BS. Apple told us it was going to happen,
               | and then quietly canned it when there was backlash. There
               | was no intention of "consultation" or "getting feedback"
               | at all.
        
               | nbzso wrote:
               | It is not about the implementation either. It is about
               | a hostile attack on user privacy after years of
               | advertising: What Happens on your iPhone, stays on your
               | iPhone. Privacy is King. Privacy. That's iPhone.
        
               | nbzso wrote:
               | >So we lost yet another feature which would have
               | increased the privacy.
               | 
               | With all due respect, there is one small detail: the search
               | criteria are provided by a third-party organisation (funded
               | by the DOD), without any option of public oversight due to
               | the nature of the "sensitive" data set. This data set would
               | be defined by similar "organisations" on a per-country
               | basis.
               | 
               | In my humble opinion, this "privacy related solution" is a
               | Stasi wet dream on steroids.
               | 
               | Some governments liked it so much that they wanted it to
               | be extended.. https://9to5mac.com/2021/09/09/csam-scan-
               | encrypted-messages/
        
               | Retric wrote:
               | The CSAM system didn't allow that; it only included hashes
               | provided by multiple governments.
               | 
               | The backlash was obvious from the outside, but it's clear
               | someone spent a lot of time building something they felt
               | was reasonable. Even if Apple presumably just implemented
               | the same system in iCloud, at least people became aware
               | governments have access to such systems.
        
               | w1nk wrote:
               | I'm not sure I can agree that it's clear someone spent a
               | lot of time building something they felt was reasonable.
               | 
               | The entire technical underpinnings of this solution rely
               | on essentially neural image embeddings, which are very
               | well known to be susceptible to all sorts of clever
               | attacks. Notice how within a day or two of the
               | announcement of this whole system and its symbols,
               | people were already finding lots of 'hash' collisions.
               | 
               | In the space of people and places that can train and
               | implement/serve such embeddings, these issues are pretty
               | widely known, which makes it very non-obvious how this
               | happened IMO. Someone that understood all of these issues
               | seems to have directly ignored them.
        
               | Retric wrote:
               | The government image hashes are supposed to be secret
               | information, any crypto system is vulnerable if you
               | assume the secrets are suddenly public information. I am
               | sure plenty of crypto currency people would object to
               | saying the system is insecure because anyone can post a
               | transaction with your private key.
               | 
               | More importantly hash collisions result in manual review
               | by humans at Apple, hardly a major issue. This is also a
               | safety measure protecting users from political images
               | being tagged in these databases.
        
               | lttlrck wrote:
               | Reasonable and logical: it's a similar idea to a locally
               | hosted virus scanner scanning all your documents...
        
               | ibigb wrote:
               | All cloud platforms scan for CSAM. Apple was going to move
               | the scanning task off their servers, where it is now,
               | onto apple devices only as the devices upload to apple's
               | cloud. If you use cloud services, info does not stay on
               | your phone, does it?
        
               | nbzso wrote:
               | After years of using Little Snitch to defend myself from
               | Apple telemetry, it is not an option for me to "trust"
               | any form of closed source software.
               | 
               | I can ditch my iPhone at any time, but Apple wanted CSAM
               | to be part of Mac OS and this is unacceptable.
               | 
               | Anyway, I am glad that they tried. This was a wake up
               | call with positive financial and technological outcome
               | for me and my business.
        
               | nicce wrote:
               | > I can ditch my iPhone at any time, but Apple wanted
               | CSAM to be part of Mac OS and this is unacceptable.
               | 
               | It is already on MacOS. It sends hashes from every
               | unknown file on "virus protection purposes". Only name
               | and hash database is different. How that changes your
               | privacy?
               | 
               | And to be fair, they never said publicly that it would
               | come to Mac OS.
        
               | nbzso wrote:
               | >These features are coming later this year in updates to
               | iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*
               | 
               | http://web.archive.org/web/20210807010615/https://www.app
               | le....
        
               | zepto wrote:
               | Except that you were just wrong and there was no attack.
               | When people complained that this was over the line, Apple
               | didn't do it.
        
               | nicce wrote:
               | The most people have misunderstood how this system
               | actually worked. I read every paper carefully, and this
               | was very innovative system to protect your privacy IF we
               | compare to existing CSAM solutions. Of course the best
               | option is to not scan at all, but that is not really
               | option.
               | 
               | Only those pictures which were about to be uploaded to
               | iCloud would have been scanned, and Apple would get
               | information about image contents only if an image is
               | flagged. The phone is a black box and it scans your
               | files all the time anyway, sending metadata to Apple,
               | e.g. because of the Spotlight feature and photo albums,
               | or simply when syncing your data to the cloud. There is
               | a massive AI behind that Spotlight feature. Nothing
               | would have changed; just a different kind of information
               | would have been flagged, and this time encrypted.
               | 
               | The major improvement was an E2EE-like system for
               | Photos. Currently they are not E2E encrypted in the
               | cloud; they are plaintext for Apple. The iOS 15 beta
               | also had encryption for backups, but it never reached a
               | public release after CSAM detection was delayed. So we
               | lost yet another feature which would have increased
               | privacy. "But it happens on my device" is not really a
               | valid argument, since most people don't understand what
               | happens in their devices in the first place. Even AV
               | engines send all your data to the cloud, you can't opt
               | out in most cases, and it applies to every file.
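               | 
               | To make the flow concrete, here is a rough sketch of the
               | on-device part. This is my own toy Python with made-up
               | names and a plain set in place of the perceptual hash
               | (NeuralHash) and private set intersection that the real
               | design described, so treat it as an illustration only:
               | 
               |     import hashlib
               | 
               |     THRESHOLD = 30        # roughly the announced value
               |     KNOWN = {"placeholder"}   # stand-in database
               | 
               |     def image_hash(data):
               |         # toy stand-in for a perceptual hash
               |         return hashlib.sha256(data).hexdigest()
               | 
               |     def scan(icloud_queue):
               |         # only photos queued for iCloud are hashed
               |         hits = sum(image_hash(p) in KNOWN
               |                    for p in icloud_queue)
               |         # below the threshold, nothing is reviewable
               |         return hits >= THRESHOLD
               | 
               |     print(scan([b"cat.jpg", b"holiday.jpg"]))  # False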
        
               | tjpnz wrote:
               | An innovative solution to the wrong problem.
               | 
               | What exactly does one achieve in nailing the local pedo
               | for _old_ content? If he's stupid enough to be uploading
               | it to iCloud he's stupid enough to get himself busted in
               | all manner of other ways. The real problem is of course
               | criminals who are currently engaged in the production of
               | CSAM, which Apple's scheme does nothing to address.
        
               | nicce wrote:
               | It is essential to stop the sharing of those files no
               | matter what. It might not help with getting them caught,
               | but those who buy these pictures or find them elsewhere
               | might try to share them as well. They might not be tech
               | guys, so they might try to use some familiar service.
               | And yes, there are many people like that. iCloud would
               | be flooded if no action were taken against sharing.
        
               | hulitu wrote:
               | So you are saying that Apple has access to all your data
               | ( = no privacy ) but this "app" will improve your privacy
               | by scanning all your files.
        
               | nicce wrote:
               | It is not scanning all of your files, only those which
               | will be sent to iCloud. Scanning is part of the pipeline
               | in the sync process, so it can't even scan other files.
               | 
               | But yes, currently Apple can access all your images on
               | the cloud, but with this change they would have been E2E
               | encrypted, hence improving privacy.
        
           | bubblethink wrote:
           | This is such a silly strawman. First, it is not related to
           | the parent's point at all. Second, it is perfectly possible
           | to run AOSP-based OSes not tied to Google.
        
             | donohoe wrote:
             | I disagree.
             | 
             | >> First, it is not related to the parent's point at all
             | 
             | The post was about switching away from Apple computers.
             | My (badly implied) point was that a lot of the effort is
             | wasted if you still use an Apple iPhone...
             | 
             | If you skip using an iPhone, then fine, but the other
             | options out there are likely worse.
             | 
             | >> Second, it is perfectly possible to run AOSP based OSes
             | not tied to google
             | 
             | You can build your own Google-free etc. version, but
             | again most people won't because of the effort and the
             | need to redo it for updates again and again. The parent
             | post author probably has the means to do it, but it's
             | another endless task they have to stay on. All other
             | options (to my limited knowledge - plz share if you know
             | more) mean compromises in apps and functionality - unless
             | you're happy to spend hours each week/month
             | troubleshooting, setting up workarounds, and working with
             | specific compatible hardware.
             | 
             | It can be done. But how much of your life will you want to
             | spend on that? Reading other comments maybe it is a solved
             | problem now - it hadn't been last time I tried.
             | 
             | If I assumed too much from the parent post then sorry.
        
               | bigyellow wrote:
               | LineageOS has near-nightly binary releases, you don't
               | need to build or "redo" anything to keep up to date,
               | other than a 5-10 minute update over adb about once a
               | week.
        
         | theplumber wrote:
         | What about the mobile OS?
        
           | bigyellow wrote:
           | Not OP, but check out LineageOS without Play Services.
        
         | [deleted]
        
         | zepto wrote:
         | Did you switch because of the proposed CSAM detector?
        
           | Syonyk wrote:
           | I can't speak to the person you're asking, but I very much
           | have eliminated Apple products from my daily use as a result
           | of CSAM - though it was on top of a range of other things
           | (mostly China related) that told me I can no longer really
           | trust Apple.
           | 
           | I've scrapped out a perfectly good (and exceedingly nice to
           | use) M1 Mini, I no longer use a 2015 MBP, I've shut down my
           | iPhone (not quite for good, I still have to figure out how to
           | move my Signal root of identity off it), and I no longer use
           | an iPad for work reference PDFs.
           | 
           | I've replaced these with far less functional "But I'm making
           | them work" sort of devices. An AT&T Flip IV for phone stuff
           | (KaiOS, handles basic calls and texts, and not much else,
           | though it does have GPS and maps, and can do email if you let
           | it), a Kobo Elipsa + KOReader for PDF support, and I've spent
           | a good bit of time tweaking a PineBook Pro into my daily
           | driver laptop.
           | 
           | The general view I have of Apple at this point is, "Oh, we
           | know what kind of company you are, now we're just haggling
           | over details."[0]
           | 
           | To whine about Apple doing stuff and continue to use them
           | says, "Yes, it's important enough to complain about online,
           | but not important enough to alter how I use their products -
           | therefore, not really important at all."
           | 
           | [0]: https://quoteinvestigator.com/2012/03/07/haggling/
        
             | hef19898 wrote:
             | It's funny, the Apple CSAM issue got me to de-google and
             | de-MS as much as possible. I didn't use Apple, but I
             | figured the others won't be any better. That worked very
             | well in private life, professionally it is Windows and will
             | be for the foreseeable future. That leaves Windows machines
             | for the kids to run MS Office and Teams for school.
        
             | zepto wrote:
             | > I've scrapped out a perfectly good (and exceedingly nice
             | to use) M1 Mini
             | 
             | Given that the CSAM issue turns out to be moot, I assume
             | you feel regret at doing this.
        
               | Syonyk wrote:
               | Not really. It's forced me to evaluate what I actually
               | need (not much) vs what I'd prefer (fancy, nice,
               | expensive systems because... well, why, of course I need
               | them!).
               | 
               | I'd love an ODroid N2+ with 8GB RAM, though. Twice the
               | CPU performance of a Pi4 but it's pretty RAM choked with
               | only 4GB.
        
               | zepto wrote:
               | I see, so you felt regret for buying a computer you
               | didn't need and claimed it was because of CSAM but that
               | was actually a lie.
               | 
               | Now you are promoting the ODroid N2+ which is completely
               | irrelevant to this thread.
        
               | Syonyk wrote:
               | I've had Mac desktops in my office for about 20 years at
               | this point, to include a mirror door G4, a G5, a
               | sunflower iMac, a 27" iMac, a 2018 Mac Mini (what an
               | absolute turd for anything more complex than a static
               | desktop, the GPU was horrid), and the M1 Mini. At no
               | point did I really evaluate whether my needs had changed
               | over time; I just always had a Mac desktop, and, through
               | most of that time, a Mac laptop. It was my default
               | computing base, plus a range of Linux boxes.
               | 
               | The CSAM thing was the final straw for me with Apple, and
               | as it turns out, my needs have rather significantly
               | decreased over time, as the lower power hardware has
               | increased in capability, though... sadly not as much RAM
               | as I'd find useful.
               | 
               | It turns out that I don't actually _need_ Apple hardware
               | to do that which I now care to do with computers. And one
               | of those replacements was a N2+, which, while a bit sharp
               | around some edges, is actually a tolerably usable little
               | Linux desktop.
               | 
               | You can call it what you want, I'm just describing what
               | I've done in response to the changes in how Apple has
               | chosen to treat their users.
               | 
               | The computing ecosystems have changed around me, such
               | that I chose to make changes. I don't
               | particularly regret the decision any more than I would
               | selling a vehicle that no longer suits my needs when I
               | move.
        
               | zepto wrote:
               | > The CSAM thing was the final straw for me with Apple.
               | 
               | A final straw which _never actually happened_. They in
               | fact listened to users saying how unpopular this would
               | be, and changed course.
               | 
               | > I'm just describing what I've done in response to the
               | changes in how Apple has chosen to treat their users.
               | 
               | No, you are describing what you've done in response to a
               | _demonstrably false belief_ about how Apple has chosen to
               | treat their users.
        
               | smoldesu wrote:
               | > "Update: Apple spokesperson Shane Bauer told The Verge
               | that though the CSAM detection feature is no longer
               | mentioned on its website, plans for CSAM detection have
               | not changed since September, which means CSAM detection
               | is still coming in the future."
               | 
               | It's still coming.
        
               | zepto wrote:
               | Is it? What makes you think that?
               | 
               | Edit:
               | 
               | Please don't retroactively edit your comment to make the
               | comment that follows it look misplaced.
               | 
               | It is a dishonest tactic that makes you look like a bad
               | faith commenter. Who can trust that anything you have
               | written hasn't been altered after the fact?
               | 
               | There is no reason you couldn't have included that quote
               | as a reply to my question instead of silently editing it
               | in to make it look as if it has always been there.
        
               | smoldesu wrote:
               | I'm not sure why you're flying off the handle when I'm
               | quite obviously not the poster above. I have no
               | intentions of re-formatting my comments here, just
               | informing you of the status quo right now.
               | 
               | > Is it? What makes you think that?
               | 
               | The Verge reached out to Apple[0] and Apple confirmed
               | that their CSAM scanning feature is still coming.
               | 
               | > When reached for comment, Apple spokesperson Shane
               | Bauer said that the company's position hasn't changed
               | since September, when it first announced it would be
               | delaying the launch of the CSAM detection. "Based on
               | feedback from customers, advocacy groups, researchers,
               | and others, we have decided to take additional time over
               | the coming months to collect input and make improvements
               | before releasing these critically important child safety
               | features," the company's September statement read.
               | 
               | So yes, they still fully intend to push the update. You
               | should probably take this less personally too, your
               | arguments are only suffering by attacking people who are
               | just trying to chip in their two cents.
               | 
               | [0] https://www.theverge.com/2021/12/15/22837631/apple-
               | csam-dete...
        
               | zepto wrote:
               | > I'm quite obviously not the poster above.
               | 
               | You are in fact the one who dishonestly edited your
               | comment. That's what I was pointing out, and it was you
               | and not the commenter before that I was replying to. _We
               | can see your name._
               | 
               | Trying to pretend otherwise is either delusion, or more
               | dishonesty.
               | 
               | I agree that the update provided by The Verge suggests
               | that Apple has not abandoned this plan yet.
        
               | smoldesu wrote:
               | What are you even talking about? Can you provide me a
               | link to the specific comment that I had modified here?
               | I'm genuinely lost.
               | 
               | In any case, that's not a refutation or response to the
               | link I sent.
        
               | zepto wrote:
               | Here is the comment you edited _after_ I asked the
               | question in the comment that follows it. When you first
               | posted it you didn't provide a quote.
               | 
               | > https://news.ycombinator.com/item?id=29569282
               | 
               | I don't need to refute your response. In fact after
               | seeing the update I agree that CSAM is not yet defeated.
               | 
               | That doesn't change the fact that your commenting tactics
               | are dishonest. _That_ is the problem here. You cannot be
               | trusted not to edit upthread comments.
        
               | smoldesu wrote:
               | At no point did I ever edit that comment. Its contents
               | were, since the time of posting, a quote and simply
               | "It's still coming."
               | 
               | I legitimately have no idea what you're talking about.
               | This is either an extremely confusing attempt to gaslight
               | me or you're mistaking me for someone else. In any case,
               | I don't really know what else to add here. Do you
               | remember what was supposedly the original contents of
               | this comment? We're really getting nowhere here pointing
               | fingers.
        
               | baby-yoda wrote:
               | "moot" - nope, see below.
               | 
               | https://news.ycombinator.com/item?id=29569782
        
         | coldtea wrote:
         | > _I don't care._
         | 
         | OK, so?
        
           | hu3 wrote:
           | Please don't post shallow dismissals.
           | 
           | https://news.ycombinator.com/newsguidelines.html#comments
        
             | arbitrage wrote:
             | Be kind. Don't be snarky.
             | 
             | Comments should get more thoughtful and substantive.
             | 
             | When disagreeing, please reply to the argument instead of
             | calling names.
             | 
             | Eschew flamebait.
             | 
             | Please don't complain about tangential annoyances.
             | 
             | Please don't post shallow dismissals.
             | 
             | https://news.ycombinator.com/newsguidelines.html#comments
        
               | DonHopkins wrote:
               | Beautiful is better than ugly.
               | 
               | Explicit is better than implicit.
               | 
               | Simple is better than complex.
               | 
               | Complex is better than complicated.
               | 
               | Flat is better than nested.
               | 
               | Sparse is better than dense.
               | 
               | Readability counts.
               | 
               | Special cases aren't special enough to break the rules.
               | 
               | Although practicality beats purity.
               | 
               | Errors should never pass silently.
               | 
               | Unless explicitly silenced.
               | 
               | In the face of ambiguity, refuse the temptation to guess.
               | 
               | There should be one-- and preferably only one --obvious
               | way to do it.
               | 
               | Although that way may not be obvious at first unless
               | you're Dutch.
               | 
               | Now is better than never.
               | 
               | Although never is often better than _right_ now.
               | 
               | If the implementation is hard to explain, it's a bad
               | idea.
               | 
               | If the implementation is easy to explain, it may be a
               | good idea.
               | 
               | Namespaces are one honking great idea -- let's do more of
               | those!
               | 
               | https://www.python.org/dev/peps/pep-0020/
               | 
               | Take heart in the bedeepening gloom
               | 
               | That your dog is finally getting enough cheese.
               | 
               | https://www.youtube.com/watch?v=Ey6ugTmCYMk
        
             | coldtea wrote:
             | How is the parent comment not a shallow reaction to the
             | post? That was my point. It was the usual empty response
             | that becomes the top comment and sets the tone for the
             | thread.
        
               | nbzso wrote:
               | I beg your pardon; my responses can be categorized as
               | provocative sometimes, but they are never empty. I
               | always state my point clearly and give factual data.
        
         | Silhouette wrote:
         | _Affinity Designer is running perfectly under Isolated Windows
         | VM._
         | 
         | Could you share more details about how you set that up?
         | Good graphics software, or the relative lack of it, is one
         | of the few things still holding us back from shifting
         | several daily use machines to Linux. Affinity would be OK
         | for that, but the only reports we found suggested it wasn't
         | happy running virtually and performance was poor.
        
           | nbzso wrote:
           | I switched my office workflow to Manjaro.
           | 
           | https://leduccc.medium.com/improving-the-performance-of-a-
           | wi...
           | 
           | https://wiki.archlinux.org/title/PCI_passthrough_via_OVMF#Is.
           | ..
        
         | aunty_helen wrote:
         | Hey, I just want to let you know that you're not alone and
         | there are other sane people out there who aren't just
         | grumbling while the frog keeps boiling.
         | 
         | Even though the new MBPs look really nice, I've got a
         | Framework laptop sitting at home that I'm in the process of
         | migrating to (that's $4k of MBP Max that Apple has missed
         | out on).
         | 
         | And early next year, I'm moving off the iPhone platform onto
         | probably a Pixel and switching the OS to something like
         | GrapheneOS. It's going to be fun and experimental.
         | 
         | I'm also looking to hire some developers next year; the
         | laptops provided will be Linux too.
        
       | xunn0026 wrote:
       | Reminder that Internet Archive has a 2-to-1 Matching Gift
       | campaign for donations: https://archive.org/donate
       | 
       | It's quite important to see what a given page _wrote_.
        
       | citizenpaul wrote:
       | It's incredibly disturbing to me how little much of the HN
       | crowd seems to understand how systems like CSAM scanning
       | actually work. I'm not talking about the tech or programming.
       | 
       | This is a fast-track tool for putting anyone in jail for the
       | most despised crime a person can commit. The tool has no
       | oversight, no transparency, no appeal, no way to clear your
       | name, and no real usage definitions beyond relentless spying
       | justified by the "protect the children" fallacy.
       | 
       | Most people don't understand how life-destroying a false
       | accusation is once the government paperwork is started. At
       | that point, like it or not, you are basically considered
       | guilty by default by all the agents of the government. Their
       | job is to prosecute, not to eliminate false positives. Once
       | you have been on their radar your life will never be the
       | same. And since it is CP-related, you will not even find
       | helpful people willing to touch it with a 10ft pole, for fear
       | they may be implicated. CSAM scanning is a dastardly and
       | effective way to establish totalitarian rule by fear of life
       | in jail, nothing more.
        
         | xeromal wrote:
         | Dude every single comment in the post when this was announced
         | was against it.
        
           | Kye wrote:
           | There were a few "why are you defending predators?" type
           | comments.
        
             | Nginx487 wrote:
             | And "I have nothing to hide!" type comments, from few
             | voyeurists.
        
               | commandlinefan wrote:
               | > from few voyeurists
               | 
               | In this case, wouldn't that be exhibitionists?
        
         | bradgessler wrote:
         | What's disturbing is that nothing has materially changed.
         | For those of us running iOS or Android, we're just an OS
         | update away from CSAM scanning landing on our devices again.
         | It's just a matter of time. The question is what it will be
         | renamed to and under what pretext it will be deployed.
        
           | bradgessler wrote:
           | Eugh, yep. Apple pretty much confirms it's coming in the
           | future:
           | 
           | "Update: Apple spokesperson Shane Bauer told The Verge that
           | though the CSAM detection feature is no longer mentioned on
           | its website, plans for CSAM detection have not changed since
           | September, which means CSAM detection is still coming in the
           | future."
           | 
           | Expect a much quieter roll out.
        
             | speg wrote:
             | My bet is they will settle for a similar solution to other
             | providers and scan the photos on their servers.
        
               | vbezhenar wrote:
               | Do you have any reasons to think that they're not
               | scanning photos on their servers? AFAIK every major
               | storage cloud does that, including Apple.
               | 
               | What probably happened is that Apple planned to introduce
               | E2E encryption for iCloud. But they have to scan photos,
               | so they rolled out client scanning to ensure that they're
               | not storing encrypted child porn on their servers.
        
               | bradgessler wrote:
               | That's still a much better world to live in than "let's
               | make it OK for people to think they can deploy scanners
               | to users devices". My device is mine, their servers are
               | theirs.
        
               | [deleted]
        
             | GekkePrutser wrote:
             | We will not make it quiet. It will be found out, and I'm
             | sure the privacy organisations will make a big deal of it.
             | 
             | I have much less of a problem with server-based scanning.
             | But my device is mine.
        
           | Mountain_Skies wrote:
           | "Medical misinformation" seems to be the trendy pearl
           | clutching excuse of the day.
        
         | GuB-42 wrote:
         | I hate this thing as much as the rest of HN, but I think you
         | are taking it too far.
         | 
         | CSAM detection is just a tool that matches image hashes. It
         | can't make accusations. Matching hashes is not illegal, and
         | even a picture of an abused child is not in itself proof of
         | a crime; no prosecutor will take the case if that is the
         | only thing they have. At most it will turn you into a
         | witness. If there is an investigation, it can turn you into
         | a suspect, but suspicion is not an accusation. I mean, if
         | your car is parked next to the place a crime happened, you
         | can be suspected, but it doesn't mean you are going to jail
         | because of where you parked your car.
         | 
         | If, for some reason, you get wrongly accused because the system
         | recorded a match and nothing else, it is not a problem of the
         | tool, it is the much bigger problem of justice not working,
         | blame your shithole country if it happens, not Apple.
         | 
         | I am not saying that false accusations don't happen, but
         | saying that the police have no other job than to prosecute
         | is a bit much. The police have asked me questions a few
         | times: once for a car that caught fire not far from where I
         | was, once for a missing person, and once they didn't tell me
         | the reason. Was I suspected? Maybe, I was at the "wrong
         | place" after all, and yet I never heard from them again. If
         | prosecution were their only goal, I would be in jail now.
        
           | kobalsky wrote:
           | > CSAM detection is just a tool that matches image hashes.
           | It can't make accusations
           | 
           | Who validates the source content of the hashes? What stops
           | the government from adding one tiny little hash that they
           | know is uniquely present in, let's say, Assange's phone?
        
             | GuB-42 wrote:
             | Again a "what if" scenario.
             | 
             | It is not trivial to identify a picture that is uniquely
             | present in Assange's phone. Without having access to
             | Assange's phone that is. And if you have access to
             | Assange's phone, you probably have way more information
             | than such a trick will give you.
             | 
             | But yes, it is a problem, and a reason why I dislike CSAM.
             | But from "allowing the government to track highly sensitive
             | individuals" to "you, the average citizen, will go to
             | jail", there is a huge gap.
             | 
             | Also, I think one thing many people missed is that a
             | hash match is not enough on its own. The only thing a
             | match does is allow a reviewer at Apple to decrypt a
             | matching file. If the picture turns out to be nothing
             | special, nothing will happen.
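             | 
             | The mechanism behind that, as I understand it, was
             | threshold secret sharing: each matching image carries one
             | share of the key protecting the vouchers, and below the
             | threshold the shares reveal nothing. A toy sketch of the
             | idea, with my own code and made-up numbers, not Apple's
             | implementation:
             | 
             |     import random
             | 
             |     P = 2**61 - 1   # toy prime field
             |     T = 3           # toy threshold (Apple said about 30)
             | 
             |     def make_shares(secret, n):
             |         coeffs = [secret]
             |         coeffs += [random.randrange(P)
             |                    for _ in range(T - 1)]
             |         def f(x):
             |             y = 0
             |             for i, c in enumerate(coeffs):
             |                 y = (y + c * pow(x, i, P)) % P
             |             return y
             |         return [(x, f(x)) for x in range(1, n + 1)]
             | 
             |     def recover(shares):
             |         # Lagrange interpolation at x = 0
             |         s = 0
             |         for xi, yi in shares:
             |             num, den = 1, 1
             |             for xj, _ in shares:
             |                 if xj != xi:
             |                     num = num * -xj % P
             |                     den = den * (xi - xj) % P
             |             s = (s + yi * num * pow(den, -1, P)) % P
             |         return s
             | 
             |     key = 123456789                  # voucher key
             |     shares = make_shares(key, n=10)  # one per match
             |     print(recover(shares[:T]) == key)      # True
             |     print(recover(shares[:T - 1]) == key)  # False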
             | 
             | In fact, the feature is much weaker than people make it
             | out to be; it just puts Apple on par with what others
             | like Google, Microsoft, etc. can do. I think the reason
             | it got so much negative press is that Apple made privacy
             | a big selling point. I think it serves them right - I
             | don't like how Apple treats privacy as just a way to
             | attack Google without really committing to it - but
             | still, on the facts, it is a bit unfair.
        
         | _fat_santa wrote:
         | Stuff like this gives me pause about the tech industry. My
         | brother is in med school, and I always joke with him that
         | the worst thing I can do at work is take a website down and
         | the worst thing he can do is kill someone.
         | 
         | But with tech like this, you can literally kill someone with
         | it. Sure, you're not actually "killing" anyone, but the
         | decisions these systems make can drive someone to end their
         | life, whether or not the accusations by the machine are
         | true. I wouldn't want to be the guy building that software.
        
           | jareklupinski wrote:
           | lest we forget
           | 
           | > Because of concurrent programming errors (also known as
           | race conditions), it sometimes gave its patients radiation
           | doses that were hundreds of times greater than normal,
           | resulting in death or serious injury.
           | 
           | https://en.wikipedia.org/wiki/Therac-25
        
           | commandlinefan wrote:
           | > can drive someone to end their life
           | 
           | Don't underestimate the possibility of somebody else ending
           | their lives for them in a fit of righteous rage.
        
         | jodrellblank wrote:
         | The thing you haven't mentioned is that with no oversight,
         | no transparency, no counter, no way to clear your name, the
         | tyranny which wants to ruin your life doesn't _need_ to
         | build a CSAM scanner to do it. A tip-off to the police, a
         | falsified claim from an "eye witness", an "accident" putting
         | your name on the wrong report to get the machinery rolling,
         | and it will, as you say, roll by itself. See what actually
         | did happen to Sir Cliff Richard[1], where "a complaint to
         | the police" got his house raided and his name dragged
         | through the media. Despite not being arrested, not being
         | charged, the case being dropped, him suing the media and
         | winning, suing the police and winning, and the media
         | apologising, he says his reputation will never be completely
         | cleared - and it probably never will be - and it didn't
         | require any specific tech.
         | 
         | "The System" didn't need to wait patiently for a big tech
         | company to build a complicated system with dastardly moustache-
         | twirling weaknesses before it could affect Sir Cliff, any more
         | than Big Brother in 1984 needed to wait for Winston to be seen
         | taking action on the telescreen in his house before the system
         | crushed him. If "The System" wanted to accuse you, it could
         | accuse your car of having been seen on a nearby road at the
         | time, your likeness being seen on CCTV, and if you get rid of
         | licence plate recognition scanners and CCTV then it would be
         | "an anonymous witness" seeing you in the area at the time or
         | "overhearing you discussing" something.
         | 
         | > " _Most people don 't understand how life destroying a false
         | accusation is_"
         | 
         | Citation needed? Because we were forced to read 1984 and To
         | Kill a Mockingbird in school and I'm pretty sure a lot of other
         | people were as well; we've seen cult TV series like The
         | Prisoner and The Fugitive focusing on the lives of the trapped
         | trying to escape "the system". Blockbuster films have been made
         | about the power of "The State" chasing people such as Enemy of
         | the State with Will Smith. Films in the pattern "wrongly
         | accused must fight to clear his name" and "stalwart good guy
         | fights the system gone bad" are practically genres on their own
         | - The Fugitive as a film with Harrison Ford, Who Framed Roger
         | Rabbit, Captain America: The Winter Soldier, The Shawshank
         | Redemption, and specifically about someone falsely accused by
         | a child, The Hunt with Mads Mikkelsen which is in the IMDB Top
         | 250 films.
         | 
         | Are you also, for example, complaining about the system
         | where doctors and social workers are mandatory reporters if
         | they have any evidence or suspicion of children being
         | abused? That's a "their word against yours" system which
         | pervades the nation, has no oversight, has no objective
         | criteria, no explanation of the workings of the brain behind
         | it, and relies on humans when we know for certain that there
         | are doctors who make mistakes and errors of judgement, and
         | at least some who would look the other way for money.
         | 
         | At least with the tech company CSAM scanners we can
         | _potentially_ have a society wide discussion about how they
         | work in detail, whether that's reasonable, who could and
         | should oversee them and how, what kinds of data cloud companies
         | are obliged to know about their customers and what is
         | reasonable for them to not know, etc. But even in tech circles
         | we can't actually have those discussions, most of the attempts
         | to discuss the details of how it works are drowned out by
         | people who haven't a clue what Apple announced, misunderstand
         | massive parts of it, conflate two or three different systems,
         | don't seem to understand that other tech companies already do
         | this with much less careful design, and cry tyranny and blame
         | China.
         | 
         | Hypothetically, iMessage or any chat system could be built end
         | to end encrypted, any cloud file sync or file storage could be.
         | One end of the spectrum is that they should be. If that's your
         | position then "the system" will probably work around you,
         | that's the "opt out of modern life" position of the "return to
         | the woods" lifestyle. You can't unmake billions of cheap
         | ubiquitous cameras, you can't unopen Pandora's box, and you
         | probably can't stop people emotionally reacting strongly to
         | topics of child protection. If your reaction to that is "not
         | listening, no no no" then it doesn't seem likely to sway "the
         | system" very much at all.
         | 
         | > " _CSAM is a dastardly and effective way to establish a
         | totalitarian rule by fear of life in jail, nothing more._
         | 
         | America has effectively incarcerated a lot of people into for-
         | profit prisons without needing this. China has effectively
         | added a lot of population control by rule of fear without
         | needing this. You don't need a high-tech murder-detector to
         | make people fear being thrown in jail for murder.
         | 
         | [1]
         | https://en.wikipedia.org/wiki/Cliff_Richard#Property_search,...
        
         | junon wrote:
         | > It's incredibly disturbing to me how little much of the HN
         | crowd seems to understand how systems like CSAM scanning
         | actually work.
         | 
         | Dunno. I remember most people on HN were absolutely pissed
         | this was even being considered. My takeaway is that _most_
         | people here understand it to a large enough degree that they
         | don't like it.
        
         | Spivak wrote:
         | You should at least be somewhat charitable to the other side of
         | this issue which is that CSAM is the result of real life sexual
         | abuse and the market for it causes more. Unless you want a cop
         | stationed every 10 feet you can't stop the abuse, you can only
         | take away the incentive to do it in the first place which is
         | the theory behind most laws. The reason most people aren't
         | constantly worried about being murdered isn't because we have
         | super advanced defense systems but because the fear of
         | punishment keeps people from doing it.
         | 
         | I'm aware that the majority of HN want the solution to be "do
         | nothing" and some quote about liberty and security but you
         | would probably change your tune if you or your kids were
         | trafficked, starved, raped, and then had the pictures and videos
         | distributed to fetishists online. CSAM detection is the
         | government's best option to kill the market for it and make it
         | radioactive.
         | 
         | At least listen to the experience of human trafficking
         | survivors before you say it's all a big conspiracy. Those scary
         | signs on buses, trains, in women's bathrooms, and doctor's
         | offices aren't a joke.
        
           | gigglesupstairs wrote:
           | You are making it sound all philosophical when the HN
           | crowd is already philosophically behind finding an
           | appropriate solution to the CP problem. The "problem" is
           | not the problem here; the currently proposed solution is.
           | See it critically and not emotionally.
        
             | Spivak wrote:
             | See this is always how online discussions go.
             | 
             | * "CSAM is -- nothing more -- than a way for governments to
             | establish a totalitarian rule!"
             | 
             | * "Oh come on, that's really uncharitable to the people
             | implementing CSAM detection, at least present the argument
             | for the other side in the way they view the issue."
             | 
             | [loud screeching of goalposts...]
             | 
             | * "Of course we acknowledge the problem of sexual abuse
             | imagery, it's just that we don't like the CSAM detection as
             | a solution... [aaand have no ideas for an effective privacy
             | preserving alternative, and want nothing done in the
             | interim while we search for the unicorn]."
             | 
             | There becomes a point where if you don't want to implement
             | the best idea we've got to address the problem, and don't
             | have any other ideas, then it gets harder to believe you
             | really care about the problem.
        
               | hackinthebochs wrote:
               | >There becomes a point where if you don't want to
               | implement the best idea we've got to address the problem
               | 
               | The assumption here is "something must be done". The fact
               | is that liberty and safety are intrinsically at odds. If
               | we're going to make progress, the question we have to
               | face is how many rescued children is worth how many units
               | of liberty. It's distasteful to present the issue so
               | bluntly, but it's the implicit calculation we do in other
               | cases, e.g. preventing child deaths in car accidents. We
               | all implicitly agree that some amount of death and
               | disfigurement of children is worth the economic and
               | social benefits of cars. Similarly, how much liberty
               | should we give up here?
        
               | Spivak wrote:
               | Of course, but at least call a spade a spade: the units
               | of liberty you're willing to trade to solve the problem
               | are basically the metric of how much you care about it.
               | 
               | It's a fine position to take that the harm from that
               | much invasion of privacy and government involvement in
               | our private lives isn't worth it, but it means precisely
               | that you care about the problem less than someone who is
               | willing to make that trade.
        
               | hackinthebochs wrote:
               | I don't think that follows. One can maximally prioritize
               | children while also believing that all children are
               | better served by a society that protects liberty over
               | immediate safety. How you weigh these issues turns on how
               | you weigh the N-th order effects. It's probably not too
               | controversial to say that eliminating all cars would harm
               | children more and thus children as a whole benefit more
               | from cars than their elimination. But it would be
               | disingenuous to characterize this calculation as caring
               | more about economic benefit than children.
        
           | mindslight wrote:
           | You're conflating viewing CSAM with the physical act of
           | exploitation, through some dubious reference to "incentives"
           | and "the market". I don't think people who abuse children are
           | doing so because it's economically lucrative, but rather
           | because they themselves are dangerously sick in the head. But
           | maybe you know more about this scene than I do. (see how that
           | works?)
           | 
           | Once abuse has occurred and has been "documented", completely
           | preventing the spread of that CSAM requires totalitarian
           | systems of policing communications, which are what is being
           | argued against here. Invoking the original abuse as if the
           | topic at hand would have had some bearing on it is an
           | emotional appeal, not a logical argument.
        
           | xboxnolifes wrote:
           | > The reason most people aren't constantly worried about
           | being murdered isn't because we have super advanced defense
           | systems but because the fear of punishment keeps people from
           | doing it.
           | 
           | In this case, CSAM detection is the "advanced" defense
           | system, not the fear of punishment.
        
             | Spivak wrote:
             | Kinda? Detecting CSAM doesn't stop the victim from getting
             | abused in the first place which is the goal. It's to
             | increase the risk of getting caught and punished, and to
             | make the photos worthless.
        
           | matheusmoreira wrote:
           | Nice job making everyone who opposes totalitarian
           | surveillance systems look like they don't care about abuse.
           | This is exactly why children are the perfect political
           | weapon: it makes people accept anything, and the few of us
           | who resist are lumped in with the abusers.
        
           | ohazi wrote:
           | > The reason most people aren't constantly worried about
           | being murdered isn't because we have super advanced defense
           | systems but because the fear of punishment keeps people from
           | doing it.
           | 
           | No, the _real_ reason most people aren't constantly worried
           | about being murdered is that most of the people we encounter
           | aren't murderers.
        
             | Spivak wrote:
             | > most of the people we encounter aren't murderers
             | 
             | Gods I wish this fallacy had a name; I guess it's a
             | corollary of the Spiders Georg principle. The number of
             | perpetrators is not proportional to the number of
             | victims.
             | 
             | It's not true that 40% of men will sexually assault
             | someone in their lifetime, but it's nonetheless true that
             | 40% of women will be sexually assaulted in their
             | lifetime.
        
               | ohazi wrote:
               | Fine, turn my statement around if you prefer. The reason
               | people aren't afraid of being murdered is because _being
               | a victim of murder is rare_.
               | 
               | But the statistic you're citing here conflicts with your
               | earlier statement (which is the one I take issue with):
               | 
               | > isn't because we have super advanced defense systems
               | but because the fear of punishment keeps people from
               | doing it.
               | 
               | The "fear of punishment" deterrent is clearly doing
               | _absolutely nothing_ for the 40% of women who have been
               | / will be sexually assaulted.
               | 
               | Fear of punishment does not work when "correct" execution
               | of justice is so rare and arbitrary. That's my point.
               | Police are corrupt. Prosecutors are corrupt. Therefore,
               | there is no justice. Therefore, "fear of justice" doesn't
               | work as a deterrent.
               | 
               | Even if the police were better, if all it takes is a few
               | murdering, raping, psychopaths to do all of the damage,
               | how could "fear of punishment" possibly work?
               | 
               | There is, however, fear that the bureaucracy will decide
               | to come after you in a Kafkaesque horror show...
        
               | belval wrote:
               | Jesus, in what country? Even taking some very liberal
               | definition of rape, 25% is insanely high. Do you have
               | some source on that?
               | 
               | EDIT: In the US we have a 2000 study quoting 5% and a
               | 2011 study saying "nearly 20%" but their data include
               | attempted rape which is a pretty important distinction.
               | This is coming from Wikipedia though.
        
               | Spivak wrote:
               | Changed it to sexual assault because you're right, the
               | study I was going off of included attempted rape which
               | made it higher, but also only counted penetration which
               | is also kinda dumb. Switching to sexual assault
               | eliminates some ambiguity.
        
           | FredPret wrote:
           | So do detective work, track down the pedos, kick down their
           | doors, shoot/imprison them, and free the children.
           | 
           | Don't take away the privacy and liberty of hundreds of
           | millions of innocent citizens.
           | 
           | What happened to all the people who visited Epstein's
           | pedophile island? Any consequences? Nothing so far! But the
           | rest of the population better watch out, Big Brother is going
           | through their photo libraries. It's all a massive government
           | overreach.
        
             | Spivak wrote:
             | The whole reason these systems are proposed is that
             | "detective work" is ineffective and the harm is ongoing.
             | You gotta at least meet people where they are. Don't you
             | think the level of policing required to actually find
             | people who possess CSAM would also be a massive
             | overreach? Or are you hoping that "just do more" will
             | come to nothing in practice?
             | 
             | Prosecuting members of the ruling class in this country is
             | a whole separate issue, and one that I'm sure we are in
             | total agreement on sans the "well if rich people are above
             | the law why can't everyone be too" take.
        
               | matheusmoreira wrote:
               | > Don't you think the level of policing required to
               | actually find people who possess CSAM wouldn't also be a
               | massive overreach?
               | 
               | Obviously. Criminalizing possession of data is dumb. It's
               | essentially a declaration that some numbers are illegal.
               | It's the exact same problem faced by copyright. Any
               | attempt to actually enforce these laws at scale will be a
               | massive overreach to the point of tyranny. They are
               | incompatible with the computing freedom we enjoy today.
        
               | Spivak wrote:
               | I don't buy this argument because you can make any law
               | sound silly by reducing it to something absurd. Saying
               | that that the benefit isn't worth the trade in freedom is
               | totally valid but the quip about illegal numbers isn't
               | super persuasive.
               | 
               | "Criminalizing possession of nuclear weapons is dumb,
               | it's essentially a declaration that some molecules are
               | illegal."
               | 
               | "Criminalizing hate speech is dumb, it's essentially a
               | declaration that certain combinations of sound
               | frequencies are illegal."
               | 
               | "Criminalization of murder is dumb, it's essentially a
               | declaration that I certain locations where I store my
               | bullets are illegal."
        
               | matheusmoreira wrote:
               | It sounds absurd because it _is_ absurd. Think about the
               | absolute control that would be necessary in order to stop
               | our computers from working with arbitrary data. Obviously
               | this logic cannot be applied to the physical world.
        
           | bradgessler wrote:
           | Nobody is saying "do nothing" about all the depraved stuff
           | some people do with respect to human trafficking, etc.
           | 
           | What people are asking is to not be treated as if they are
           | human traffickers by Apple. If they deploy CSAM scanning,
           | they are in effect saying, "sorry world, we suspect all of
           | you are trying to traffic humans, so we need to scan your
           | images".
           | 
           | That's not how law enforcement should work in a free and open
           | society. You should be suspected of illegal activity first,
           | then law enforcement can move in and do their job.
           | 
           | Once CSAM scanning is deployed, it is much easier to expand
           | its scope to include other forms of content, and Apple will
           | have a much harder time saying "No".
           | 
           | I was really hoping Apple would use the backlash against
           | CSAM scanning to push back on whoever is asking them to do
           | this: "Sorry <large gov entity>, we can't do that without
           | destroying the decade of goodwill we built up with our
           | customers. You'll have to legislate this for us to do it."
           | That would at least have the advantage of a public debate.
        
       | sixhobbits wrote:
       | @dang
       | 
       | suggested title update as the important bit is cut off
       | 
       | Apple Removes References to CSAM Scanning from Child Safety
       | Webpage
        
       | whatever1 wrote:
       | If it is already implemented, how would we even know?
       | 
       | I believe that since iOS is doing ML scanning locally anyway,
       | it is doing hash checks too.
        
       | kingcharles wrote:
       | Does anyone know if the law enforcement file signature databases
       | use SHA or MD5?
       | 
       | My guess is that they are all MD5 signatures.
       | 
       | So - my question is - can I get a file onto someone's device that
       | will match a CSA image, but is actually not at all CSA? (i.e.
       | basically a more serious "SWAT" attack because it ends with the
       | person potentially spending years in jail and their life ruined)
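       | 
       | My understanding is that the answer depends on which kind of
       | hash is involved. With a plain cryptographic hash (MD5/SHA-1)
       | you would need a second-preimage attack against a hash you
       | don't even hold the source for, which is not practical even
       | for MD5. Perceptual hashes (PhotoDNA, NeuralHash) match on
       | similarity instead, which is why researchers were able to
       | craft innocuous images that collide with NeuralHash. A toy
       | illustration of the difference, my own code, nothing to do
       | with the real databases:
       | 
       |     import hashlib
       | 
       |     def crypto_match(a, b):
       |         # exact match required
       |         return hashlib.md5(a).digest() == hashlib.md5(b).digest()
       | 
       |     def perceptual_match(h1, h2, max_dist=4):
       |         # toy 64-bit hashes compared by Hamming distance
       |         return bin(h1 ^ h2).count("1") <= max_dist
       | 
       |     print(crypto_match(b"cat photo", b"cat photo."))  # False
       |     print(perceptual_match(0xF0F0F0F0F0F0F0F0,
       |                            0xF0F0F0F0F0F0F0F1))       # True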
        
       | kf6nux wrote:
       | The page also no longer references the iMessage forced snitching
       | on children. Was that covered in another news story?
        
       | mrtksn wrote:
       | I strongly believe that this feature could have broken Apple's
       | brand irreparably. 99.9% of us are not pedophiles, but personal
       | devices that snitch on you to the police (with extra steps) are
       | something that most people will not accept.
       | 
       | The proposed database for snitching was CSAM, but once the
       | system is there, the devices become the pocket police that can
       | receive any kind of scan-match-snitch instructions. This would
       | be the point where humanity embraces total control.
       | 
       | That said, an offline version might be workable. Think E2E
       | encrypted devices that can be scanned against a database with
       | physical access to the device, upon approval from the device
       | owner. Imagine a person being investigated through standard
       | police work: it could be used to prove the existence, or lack
       | thereof, of the materials without exposing unrelated and
       | private information, if the person agrees to cooperate.
        
         | BrazzVuvuzela wrote:
         | > are more
         | 
         | Typo of abhor?
        
           | mrtksn wrote:
           | Yep, thanks
        
       | amelius wrote:
       | > Apple Removes All References to Controversial CSAM Scanning
       | Feature from Its...
       | 
       | I was hoping to read "Source Code" next.
        
       | BiteCode_dev wrote:
       | It's removed from marketing material, but is it removed from the
       | code?
       | 
       | Since the OS is proprietary, how do we know they didn't go for it
       | anyway?
        
         | jeromegv wrote:
         | How do you know it's not on android?
        
           | jetsetgo wrote:
           | This post is about Apple; why deflect?
        
           | lordofgibbons wrote:
           | Who cares? We're not talking about Android right now. No
           | need for "what about" deflections of valid criticisms.
        
             | kf6nux wrote:
             | That may have been simple ignorance instead of
             | whataboutism. The person asking it may genuinely not know
             | how much easier it is to inspect Android (both the device
             | and the open source). They may not know you can build
             | custom images of Android either.
        
           | zeusly wrote:
           | Android is open source? At least you have the chance to use
           | it like that, some tinkering required.
        
             | treesknees wrote:
             | The AOSP code is open source. The Android build and
             | related apps running on your latest Pixel 6 are not.
        
             | najqh wrote:
             | The only difference between Android and iOS in this regard
             | is that Android's frameworks are open source. The rest is
             | exactly the same.
        
         | grishka wrote:
         | > Since the OS is proprietary, how do we know they didn't go
         | for it anyway?
         | 
         | Reverse engineering is a thing.
        
           | aembleton wrote:
           | Have you reverse engineered iOS?
        
             | judge2020 wrote:
             | Well, someone did:
             | 
             | https://9to5mac.com/2021/08/18/apples-csam-detection-
             | reverse...
        
             | grishka wrote:
             | I haven't reverse engineered iOS itself because I've never
             | had any reasons to, but I have reverse engineered third-
             | party iOS apps (dumped from a jailbroken device) and some
             | parts of macOS.
        
           | nicce wrote:
            | Good luck staying less than half a year behind the latest
            | release with decent accuracy. That really takes some effort.
        
           | BiteCode_dev wrote:
           | An expensive, hard, long and unreliable thing.
        
       | tester89 wrote:
       | > *Update*: Apple spokesperson Shane Bauer told The Verge that
       | though the CSAM detection feature is no longer mentioned on its
       | website, plans for CSAM detection have not changed since
       | September, which means CSAM detection is still coming in the
       | future.
        
         | matheusmoreira wrote:
         | So it's just damage control.
        
         | irq-1 wrote:
         | https://www.theverge.com/2021/12/15/22837631/apple-csam-dete...
        
       | 1cvmask wrote:
       | Let's give all your data to governments under the guise of
       | protecting children or against terrorists. It is like the John
       | Jonik cartoon.
       | 
       | http://www.libertyclick.org/propaganda-for-government-contro...
       | 
       | http://jonikcartoons.blogspot.com/
        
         | Veen wrote:
         | > Let's give all your data to governments under the guise of
         | protecting children
         | 
         | But that's precisely what Apple's CSAM implementation doesn't
         | do. It compares on-device image hashes with hashes of known CP
         | images. It affords more privacy than other methods, which Apple
         | is probably using anyway and which other cloud services are
         | definitely using.
         | 
         | https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
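         | 
         | A rough sketch of that matching idea in Python (purely
         | illustrative: the real design used Apple's NeuralHash
         | perceptual hash plus a blinded database and threshold
         | secret sharing, so the device itself can't even tell
         | whether a match occurred; the plain set lookup below is
         | just to keep the example self-contained):
         | 
         |   import hashlib
         | 
         |   # Opaque blocklist supplied by the hash provider
         |   # (placeholder values, not real hashes).
         |   KNOWN_HASHES = {
         |       "3f79bb7b435b05321651daefd374cd21",
         |       "b2c49c4a4cde4fd2f3cfd837a1a5953f",
         |   }
         | 
         |   def photo_hash(image_bytes: bytes) -> str:
         |       # Stand-in for a perceptual hash of the image.
         |       return hashlib.md5(image_bytes).hexdigest()
         | 
         |   def matches_blocklist(image_bytes: bytes) -> bool:
         |       # Only match / no-match is derived; the device
         |       # never sees the images behind the hashes.
         |       return photo_hash(image_bytes) in KNOWN_HASHES
         | 
         |   print(matches_blocklist(b"holiday photo"))  # False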
        
           | elliekelly wrote:
           | Why do you not consider hashes of my photos to be my data?
        
             | Veen wrote:
             | That's an interesting question. I have no idea who "owns"
             | hashes. Does the copyright on a photo transfer to hashes of
             | the photo? Is a hash a derivative work? Regardless, I am
             | sure that somewhere in the terms and conditions you give
             | Apple the right to copy, move, and process your photos and
             | probably hashes of them too.
        
           | zionic wrote:
           | >It compares on-device image hashes with hashes of known CP
           | images.
           | 
           | No, it compares against hashes supplied by a government
           | agency (NCMEC). Apple has no way to verify the hashes are in
           | fact CP, as it is only re-hashing hashes.
        
             | kingcharles wrote:
              | Also, can we create images that are not CSAM but whose
              | hashes match known CSAM?
             | 
             | https://www.theverge.com/2017/4/12/15271874/ai-
             | adversarial-i...
             | 
              | If we can, then hypothetically someone only needs to get
              | such non-CSAM images onto important people's iPhones to
              | get them arrested, jailed for years, and have their lives
              | ruined by Apple.
             | 
             | Disclaimer: I buy Apple products.
        
             | kingcharles wrote:
             | I have another problem with this. In a lot of jurisdictions
             | virtual CSA images are legal. i.e. cartoons and images
             | created entirely in CG.
             | 
             | These images can be 100% indistinguishable from the real
             | thing. Without knowing the source of the images that they
             | are putting in the database, how do they know the images
             | are actually illegal?
        
             | treis wrote:
             | There was a manual review step if the hashes triggered.
        
             | treesknees wrote:
              | But once it reaches some undefined threshold, it's Apple
              | who reviews the content, not the government. They don't
              | have a way to verify the hashes, but they would have a way
              | to verify the content which matches the hashes.
              | 
              | Presumably this stage is where malicious hashes would be
              | detected and removed from the database.
        
               | zionic wrote:
               | > it's Apple who reviews the content not the government.
               | 
               | An unaccountable "Apple Employee" who is likely (in the
               | US and other countries) to be a LEO themselves will see a
               | "visual derivative" aka a 50x50px greyscale copy of your
               | content.
               | 
               | There is no mechanism to prevent said "employee" from
               | hitting report 100% of the time, and no recourse if they
               | falsely accuse you. The system is RIPE for abuse.
               | 
               | >Presumably at this stage is where malicious hashes would
               | be detected and removed from the database.
               | 
               | Collision attacks have already been demonstrated. I could
               | produce a large amount of false positives by modifying
               | legal adult porn to collide with neural hashes. Anyone
               | could spread these images on adult sites. Apple
               | "employees" that "review" the "image derivatives" will
               | then, even when acting honestly, forward you to
               | prosecution.
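                | 
                | A toy illustration of that collision idea (not
                | NeuralHash -- just a deliberately weak 16-bit
                | hash so the search finishes instantly; the real
                | demos perturbed images until their NeuralHash
                | matched a chosen target):
                | 
                |   import hashlib, random
                | 
                |   def weak_hash(data: bytes) -> bytes:
                |       # 16-bit toy stand-in for a low-entropy
                |       # perceptual hash
                |       return hashlib.sha256(data).digest()[:2]
                | 
                |   target = weak_hash(b"image on a blocklist")
                |   benign = bytearray(b"perfectly legal image")
                | 
                |   while weak_hash(bytes(benign)) != target:
                |       # nudge one byte, as an attacker would
                |       # imperceptibly perturb pixels
                |       i = random.randrange(len(benign))
                |       benign[i] = random.randrange(256)
                | 
                |   print("benign bytes now match the target")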
        
               | Veen wrote:
               | > and no recourse if they falsely accuse you
               | 
               | Of course there is. The judicial system.
               | 
                | (Although, to be clear, I don't live in America and I
                | might be more worried about this if I did.)
        
               | akyoan wrote:
               | Yes but no one wants the finger pointed at themselves.
               | Even if innocence is proven, someone will go through your
               | files and you will have to deal with the law.
               | 
               | The recourse should be _before_ this reaches the law.
        
               | zionic wrote:
               | Doesn't that put you in the position of suing apple after
               | you've:
               | 
               | 1) Spent who knows how long in jail
               | 
               | 2) Lost your job
               | 
               | 3) Defaulted on your mortgage
               | 
               | 4) Been divorced
               | 
                | 5) Had your reputation ruined
               | 
               | Money can't fix everything, and trusting the courts to
               | make you whole years after the fact is a foolish
               | strategy.
        
               | aunty_helen wrote:
               | No thanks, I'll give up my iphone before it gets to this
               | stage. Not an experiment that I'm willing to partake in.
        
               | junon wrote:
               | > Presumably at this stage is where malicious hashes
               | would be detected and removed from the database.
               | 
               | How, if 1) the original content is never provided to
               | Apple, and 2) the offending content on consumer devices
               | is never uploaded to Apple?
        
               | kemayo wrote:
               | You're misunderstanding the proposed system. The entire
               | system as-described only runs on content that's in the
               | process of being uploaded to Apple -- it's part of the
               | iCloud Photos upload system. Apple stores all this
               | content encrypted on their servers, but it's not E2E so
               | they have a key.
               | 
               | This entire system was a way for Apple to avoid
               | decrypting your photos on their servers and scanning them
               | there.
               | 
               | Hypothetically, if Apple implemented this system _and_
                | switched to E2E for the photo storage, you'd be more
               | private overall because Apple would be incapable of
               | seeing anything about your photos until you tripped these
               | hash matches, as opposed to the status quo where they can
               | look at any of your photos whenever they feel like it.
               | (And the hash matches only include a "visual derivative"
               | which we assume means a low res thumbnail.) I say
               | hypothetically because Apple never said this was their
               | plan.
               | 
               | You can argue about whether or not Apple _should_ be
               | doing this. But it does seem to be fairly standard in the
               | cloud file storage industry.
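                | 
                | As a rough sketch of that flow (my own
                | simplification in Python, not Apple's code --
                | names like make_voucher are invented, and the
                | real design hides the match bit from the client
                | behind threshold secret sharing):
                | 
                |   import hashlib
                | 
                |   THRESHOLD = 30  # illustrative number only
                | 
                |   def neural_hash(photo: bytes) -> str:
                |       # stand-in for the perceptual hash
                |       return hashlib.sha256(photo).hexdigest()
                | 
                |   def make_voucher(photo: bytes, blocked: set):
                |       # attached to every iCloud Photos upload
                |       return {
                |           "thumb": photo[:64],  # "derivative"
                |           "match": neural_hash(photo) in blocked,
                |       }
                | 
                |   def server_side(vouchers):
                |       hits = [v for v in vouchers if v["match"]]
                |       if len(hits) < THRESHOLD:
                |           return []  # below threshold: no access
                |       return [v["thumb"] for v in hits]  # review
                | 
                | The point of the threshold is that, in the E2E
                | variant described above, Apple would see nothing
                | at all until enough matches accumulate.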
        
             | [deleted]
        
             | specialist wrote:
             | Exactly. Apple doesn't want to be in the unenviable
             | position of making that judgement (is a hotdog or not).
             | 
             | As a taxpayer and customer, I concur. I'm glad _someone_ is
              | doing that job. But I don't want it to be a corporation.
        
               | nomel wrote:
               | A third party reviewer would confirm the images weren't
               | just hash conflicts, then file a police report, according
               | to the documentation.
        
               | zionic wrote:
               | >third party reviewer
               | 
               | A "third party" paid by apple who is totally-not-a-cop
                | who sees a 50x50px grayscale "image derivative" is in
               | charge of hitting "is CP" or "Is not CP".
               | 
               | I don't understand how anyone can have faith in such a
               | design.
        
               | nomel wrote:
               | That's incorrect. That's the Apple reviewer. After that,
               | it goes to further review by NCMEC, where it's verified.
               | NCMEC is the only one legally capable of verifying it,
               | fully, and they're the ones that file the police report.
               | 
                | So, to get flagged, you need _many_ hash collisions
                | destined for iCloud. Then, to get reported, some of them
                | must be passed as false positives in the Apple review,
                | and then the full NCMEC review must somehow get it wrong
                | as well.
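                | 
                | A toy model of that chain (my wording, not the
                | actual spec; every stage has to independently go
                | wrong before anything reaches law enforcement):
                | 
                |   def pipeline(matches, apple_flags, ncmec_flags):
                |       # matches: hash hits destined for iCloud
                |       if len(matches) < 30:   # threshold
                |           return "no action"
                |       if not apple_flags(matches):
                |           return "dropped at Apple review"
                |       if not ncmec_flags(matches):
                |           return "dropped at NCMEC review"
                |       return "NCMEC files a police report"
                | 
                |   # e.g. honest Apple review catches collisions:
                |   print(pipeline(["hit"] * 40,
                |                  lambda m: False,
                |                  lambda m: True))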
        
           | 1cvmask wrote:
            | How about we expand the list of known CP images, under force,
            | to include many non-CP images?
        
             | kingcharles wrote:
             | As I stated in a reply above, I guarantee that this
             | database contains a number of non-real-CP images already
             | that are just CG.
        
             | Grustaf wrote:
             | Why would they want to entrap people with fake child porn
             | images? What would even be the point of getting people
             | flagged? They would never be convicted anyway, since they
             | are not actually storing illegal images.
        
               | Syonyk wrote:
               | It doesn't matter.
               | 
               | What matters is that the person is arrested, that their
               | name is leaked to the mainstream news media, and that
               | they're dragged through the mud - the same as seems to
               | happen in _literally every other child porn case._
               | 
               | The actual guilt or innocence doesn't matter at that
               | point. Their lives have been sufficiently ruined, and, if
               | they are actually innocent, the message has been clearly
               | sent.
        
               | Grustaf wrote:
               | Do you have any examples of that happening, innocent
               | people getting arrested for child porn? Never heard of
               | it.
               | 
               | In any case, the police will not arrest anyone because
               | they have only been flagged by the automatic system. As
               | soon as Apple reviews the images that triggered the
               | flagging, they will see that they aren't actually child
               | porn, and hence they won't even contact the police.
        
               | yunesj wrote:
               | I recently described an example:
               | https://news.ycombinator.com/item?id=29567729
        
               | Grustaf wrote:
               | Seems they went after him for other things
               | 
               | https://nymag.com/intelligencer/amp/2021/08/bitcoin-ian-
               | free...
        
             | sweetheart wrote:
              | Then the myriad other tech companies that scan the images
              | uploaded to their platforms will be alerted to the supposed
              | illicit images being uploaded. This isn't new. No one's
              | lives have been ruined by this and it's been standard
              | practice for years now.
        
               | zionic wrote:
               | >No one's lives have been ruined by this
               | 
               | This is a strong claim with zero evidence. The social
               | costs of defending an accused perpetrator of this nature
               | are insanely high.
               | 
               | While we're making unsubstantiated claims, I assert that
               | intelligence agencies routinely exploit this. My only
               | substantiation is that it's obvious and I would if I
               | worked for them.
        
               | sweetheart wrote:
               | We have literally no reason to think it's an issue. The
                | burden of proof is on those who believe it's a threat, not
               | those who believe it isn't.
        
               | zionic wrote:
               | "Burden of proof"? Seems like it's pretty hard to prove
               | what intelligence agencies are doing.
               | 
               | Also, there's the fun "you can't sue us because you have
               | no standing/can't show harm, and showing harm to prove
               | you have standing would be illegal".
        
               | sweetheart wrote:
               | > "Burden of proof"?
               | 
                | Yes, here you go:
                | https://en.wikipedia.org/wiki/Burden_of_proof_(philosophy)
        
               | yunesj wrote:
                | Ian Freeman is a controversial political activist whose
                | pro-liberty, pro-bitcoin radio station was raided by the
                | FBI in 2016 and had its equipment confiscated. It was
                | widely covered in the news, he was kicked out of at least
                | one community organization, and last I heard, they still
                | didn't have all their equipment back.
               | 
               | No charges were ever filed, and the media outlets didn't
               | bother to update their articles.
               | 
               | I think false accusations happen, especially to
               | controversial figures, and assume most victims just don't
               | want to call much attention to it.
        
               | kemayo wrote:
               | Isn't this a separate issue to the one the grandparent
               | was complaining about? I can't see anything about child
               | porn related to this person, just a lot of talk about
               | money laundering.
               | 
               | Also, are you sure there were no charges filed? Some
               | cursory googling turns up articles including a PDF of the
               | indictment [1] and more-recent articles seem to refer to
               | an ongoing process [2].
               | 
               | [1]: https://manchesterinklink.com/fbi-descends-on-keene-
               | and-surr...
               | 
               | [2]: https://nymag.com/intelligencer/2021/08/bitcoin-ian-
               | freeman-...
        
             | Veen wrote:
             | The same is true of the technology that can find photos of
             | cats on your device. It can be abused, but smartphones and
             | cloud services are packed with technology that could be
             | abused by a motivated bad actor. If you're worried about
             | it, you're better off not using a smartphone at all.
        
       | threeseed wrote:
       | Just so everyone is clear here:
       | 
       | a) CSAM scanning only ever applied to photos being uploaded via
       | iCloud Photo Library. It was never applied to photos that you
       | weren't giving to Apple, nor to any other files.
       | 
       | b) The "but they could expand it to anything else" logic is
       | baseless and frankly stupid. Why? Because they wrote the
       | operating system. They could implement this feature in any of the
       | 100+ daemons that are currently running as root and no one would
       | ever know.
       | 
       | c) It was a better solution for user privacy than what exists
       | today, i.e. Apple secretly scanning your photos server-side,
       | which you gave them permission to do when you agreed to the
       | EULA for iCloud.
        
         | lozenge wrote:
         | So what happens when China sends Apple some signatures of anti-
         | Party memes? "Sorry, you can't view the actual photos to verify
         | they are of abused children, that's illegal".
        
         | Karunamon wrote:
         | > _CSAM scanning only ever applied to photos being uploaded via
         | iCloud Photo Library_
         | 
         | Which is everything in your camera roll, and last I checked,
         | you can't pick and choose which pics. Additionally, saving a
         | photo from the internet, or in some apps, merely receiving one
         | (WhatsApp does this, at least), automatically puts it there.
         | 
         | So let's amend A to reflect reality:
         | 
         | a) CSAM scanning is applied to all photos an average iOS user
         | will interact with if they have iCloud Photos turned on, which
         | is the default
         | 
         | which boils down to:
         | 
         | a) CSAM scanning is applied to all photos an average iOS user
         | will interact with
        
       | nunez wrote:
       | I am impressed and scared by how quickly media coverage and
       | "public outrage" scaled down after the first few weeks following
       | this feature's announcement. While I agree with the technology
       | and its mission, I'm glad that Apple walked it back while they
       | try to improve its messaging and privacy.
        
       | theHIDninja wrote:
       | The real reason behind CSAM is that Tim Cook is a paedophile and
       | he wanted to detect when CP was uploaded to Apple's servers so he
       | could jack off to it.
        
       ___________________________________________________________________
       (page generated 2021-12-15 23:02 UTC)