[HN Gopher] Security Threat Model Review of the Apple Child Safe...
       ___________________________________________________________________
        
       Security Threat Model Review of the Apple Child Safety Features
       [pdf]
        
       Author : sylens
       Score  : 306 points
       Date   : 2021-08-13 19:10 UTC (1 days ago)
        
 (HTM) web link (www.apple.com)
 (TXT) w3m dump (www.apple.com)
        
       | magicloop wrote:
       | I think is this a good document and it also brings to the table
       | the threshold images count (around 30) and the alternate neural
       | hash they will keep private (to guard against adverserial images
       | trying to create a false positive on-device).
       | 
       | FWIW, I actually did an amateur threat model analysis in a
       | comment in separate HN thread. I always thought this was called
       | for because the initial document set was just the mathematics,
       | not the people/process/implementation/policy risks and threat
       | model that was the source of widespread concerns.
        
       | 616c wrote:
       | I have long been interested in what a professional-grade threat
       | model from a large FAANG/SV organization is. Is this a
       | representative model?
       | 
       | Microsoft came up with DREAD and STRIDE and they suggest there
       | threat models are more elaborate.
       | 
       | Would love to see more representative examples!
        
         | jdlshore wrote:
         | This feels more like a marketing document to me than a workaday
         | threat model. It's fairly handwavey, and the goal seems to be
         | to convince rather than to do a hardnosed analysis.
         | 
         | Not that it's not useful--I found it convincing--but I doubt
         | this is what a real threat model looks like. Not that I
         | actually know. I'd be interested in seeing a real one too.
        
       | simondotau wrote:
       | Here's what I don't get: who is importing CSAM into their camera
       | roll in the first place? I for one have never felt the urge to
       | import regular, legal porn into my camera roll. Who the hell is
       | going to be doing that with stuff they know will land them in
       | prison? Who the hell co-mingles their deepest darkest dirtiest
       | secret amongst pictures of their family and last night's dinner?
       | 
       | I can believe that some people might be so technically illiterate
       | that they'd upload CSAM to a private Facebook group thinking it
       | was secure. But as far as I'm aware it's not possible to password
       | protect photo albums using the built-in Photos app.
        
         | simondotau wrote:
         | Actually it occurs to me that perhaps they're noticing a lot of
         | CSAM is being produced with smartphone cameras and are hoping
         | to snag the phone which produced the originals. If they can get
         | new content into their database before the photographer deletes
         | them, they might find the phone which took the originals--and
         | then find the child victim.
         | 
         | Beyond implausible, but then most law enforcement tends to rely
         | on someone eventually doing something stupid.
        
           | phnofive wrote:
           | It is only supposed to detect CSAM already known to NCMEC,
           | not identify new images.
        
             | 0xy wrote:
             | NCMEC's database does not only contain CSAM. It has never
             | been audited. It's full of false positives. They are immune
             | from FOIA requests. They are accountable to nobody. They
             | work so closely with the FBI that there are FBI agents
             | working on the database directly.
             | 
             | NCMEC is essentially an unaccountable, unauditable
             | department of the FBI that also happens to be incompetent
             | (the amount of non-CSAM in the database is large).
        
               | simondotau wrote:
               | You don't know what's in it, but you do know it's full of
               | false positives? I wonder, do you know how many of those
               | false positives are flagged as A1?
        
               | 0xy wrote:
               | I know for a fact that it is full of false positives,
               | there's also public sources making the same claim. [1]
               | 
               | [1] https://www.hackerfactor.com/blog/index.php?/archives
               | /929-On...
        
               | simondotau wrote:
               | I clicked on your link hoping for serious analysis. I
               | would have settled for interesting analysis. I was
               | disappointed. It's just a guy who found one MD5 hash
               | collision. The paragraph was written in a way that makes
               | it unclear whether source of this specific hash was in
               | fact NCMEC or if it was "other law enforcement sources".
               | So which was it? Did this person follow up with the
               | source to confirm whether his hash match was a false
               | positive or a hash collision?
               | 
               | So in short, the source for your claims has evidence of
               | between zero and one false positives.
               | 
               | Unimpressive would be an understatement.
        
               | 0xy wrote:
               | It was not a hash collision. It was a false positive. The
               | way the hashing solution works doesn't really allow for a
               | false positive except in extraordinarily rare
               | circumstances.
               | 
               | It matched a man holding a monkey full clothed as a CSAM
               | image, direct from NCMEC's database.
               | 
               | He encountered a 20% false positive rate while running a
               | moderately popular image service, with an admittedly low
               | sample size. It's still evidence, and given NCMEC is
               | immune from oversight, FOIA and accountability, it's
               | concerning.
               | 
               | Also, the fact I know there are false positives does not
               | stem from that post. I know it independently, but since
               | you asked for a source stronger than "just trust a random
               | internet guy", I gave you one.
               | 
               | He's not the only person making the claim though, others
               | throughout these threads with industry knowledge have
               | confirmed what I already knew. If you're asking me to
               | reveal how I know, I'm afraid you'll be disappointed. I'd
               | rather be accused of lying than elaborate.
        
               | simondotau wrote:
               | > _It was not a hash collision. It was a false positive._
               | 
               | Again, this was an MD5. It's literally impossible to
               | assert that it was a false positive with absolute
               | certainty. A hash collision is not outside the realm of
               | possibility, especially with MD5. Apparently no attempt
               | was made to chase this up. And we still don't know
               | whether it was the NCMEC database or "other law
               | enforcement sources".
               | 
               | You continue to claim that it was "direct from NCMEC's
               | database" but again, that isn't asserted by your source.
               | 
               | > _He encountered a 20% false positive rate_
               | 
               | He encountered one potential false positive. Converting
               | one data point into a percentage is exactly why earlier I
               | described this nonsense as being disingenuous. The fact
               | that you would cite this source and then defend their
               | statistical clown show is, in my opinion, strong positive
               | evidence that your other citation-free assertions are all
               | entirely made up.
        
               | 0xy wrote:
               | What you're missing is the statistical probability of two
               | MD5 hashes colliding, which is astronomically unlikely.
               | 
               | Your argument is essentially that a collision is more
               | likely than a false positive, which would imply a false
               | positive rate of 0.00% in NCMEC's database based on its
               | size.
               | 
               | It's clear that nothing will convince you if you believe
               | that humans managing a database will never, ever make a
               | mistake after over 300,000,000 entries. Because if they
               | make one single mistake, then it supports my argument --
               | a false positive becomes substantially more likely
               | statistically than a hash collision.
               | 
               | You're also providing a pretty large red herring with
               | your suggestion that he could've simply asked NCMEC if it
               | was a false positive or a hash collision. NCMEC would
               | never provide that information, because the database is
               | highly secret.
               | 
               | Given those statistics, I think that source is more than
               | valid.
               | 
               | >in my opinion, strong positive evidence that your other
               | citation-free assertions are all entirely made up
               | 
               | I am happy to let you believe that I'm lying.
               | 
               | One industry insider whose employer works with NCMEC
               | cited a false positive rate of 1 in 1,000. [1] The
               | product he works in is used in conjunction with NCMEC's
               | database. Elsewhere, in press releases, the company cites
               | a failure rate of 1% (presumably both false positives and
               | false negatives) [2]
               | 
               | [1] https://news.ycombinator.com/item?id=21446562
               | 
               | [2] https://www.prnewswire.com/news-releases/thorns-
               | automated-to...
        
               | simondotau wrote:
               | Nowhere have I claimed that the NCMEC databases are
               | entirely devoid of miscategorised data. I am merely
               | pushing back at your evidence-free claim that _" it's
               | full of false positives."_
               | 
               | Once again, you continue to assume that the MD5 collision
               | cited was from a NCMEC corpus and not "other law
               | enforcement sources". You're reading far more into your
               | sources than they are saying.
               | 
               | And now you are conflating claims of false positives in
               | the origin database with rates of false positives in a
               | perceptual hashing algorithm. You are clearly very
               | confused.
        
               | 0xy wrote:
               | >Nowhere have I claimed that the NCMEC databases are
               | entirely devoid of miscategorised data
               | 
               | If NCMEC's databases have a false positive rate of 1 in
               | 1,000, do you realise that means a false positive rate is
               | substantially more likely than a hash collision?
               | 
               | >I am merely pushing back at your evidence-free claim
               | that "it's full of false positives."
               | 
               | 1 in 1,000 is "full of false positives" by my own
               | standards, to which I've posted evidence from an employee
               | whose company works directly with NCMEC.
               | 
               | >you continue to assume that the MD5 collision cited was
               | from a NCMEC corpus and not "other law enforcement
               | sources".
               | 
               | NCMEC and law enforcement are one and the same. FBI
               | employees work directly at NCMEC. [1] Law enforcement
               | have direct access to the database, including for
               | categorisation purposes. To suggest law enforcement's
               | dataset is tainted yet NCMEC's is not doesn't make any
               | sense to me.
               | 
               | >You are clearly very confused
               | 
               | Address the 1 in 1,000 claim. Thorn is an NCMEC partner
               | and a Thorn employee has claimed a false positive rate of
               | 1 in 1,000. In press releases, Thorn is even less sure at
               | 1% failure rates. Thorn uses perceptual hashing.
               | 
               | I can't see how you can simultaneously claim the database
               | isn't "full of false positives" while acknowledging a
               | failure rate as abysmal as 1 in 1,000.
               | 
               | I also didn't conflate anything, both a hash collision
               | and a perceptual hash collision are less likely than a
               | false positive, by an extraordinary margin. Apple claims
               | their algorithm has a collision 1 in 1,000,000,000,000
               | times. Compare that to 1 in 1,000.
               | 
               | The database is full of false positives, and now
               | presumably you'll deny both industry claims and NCMEC
               | partner claims.
               | 
               | [1] https://www.fbi.gov/investigate/violent-crime/cac
        
               | simondotau wrote:
               | Thorn isn't claiming that _" the database is full of
               | false positives."_ Once again you are conflating claims
               | of false positives in the origin database with rates of
               | false positives in a perceptual hashing algorithm. You
               | are so catastrophically confused that ongoing dialogue
               | serves no purpose. Goodbye.
        
             | simondotau wrote:
             | That's my point. If the original photographer doesn't
             | delete them by the time they're known to NCMEC and the hash
             | database is updated, they would match. As I said, I
             | recognise this is beyond implausible.
        
         | jdlshore wrote:
         | Apparently 70 million images have been reported by other
         | providers (if I'm reading [1] correctly, which I'm not sure I
         | am, since the paywall hides it before I can read closely).
         | 
         | Assuming that's correct, the answer is "a lot of people." I
         | find that eminently plausible.
         | 
         | [1] https://www.nytimes.com/2020/02/07/us/online-child-sexual-
         | ab...
        
           | simondotau wrote:
           | It looks like well over 90% of that is Facebook and the bulk
           | of the remainder is Google and Yahoo who crawl the web and
           | run large email services. None of those are analogous to
           | importing CSAM into your smartphone's photo library.
        
       | concinds wrote:
       | I strongly considered switching away from Apple products last
       | weekend; but this document has convinced me otherwise.
       | 
       | The threats people identify have minimal risk. If a total
       | stranger offers you a bottle of water, you may worry about it
       | being spiked, but him having offered the bottle doesn't make it
       | more, or less, likely that he'll stab you after you accept it.
       | They're separate events, no "slippery slope". It's very
       | convincing that any alteration that would "scan" your whole phone
       | would be caught eventually, and even if it's not, the
       | announcement of this feature has no bearing on it. If Apple has
       | evil intent, or is being coerced by NSLs, they would (be forced
       | to) implement the dangerous mechanism whether this Child Safety
       | feature existed or not. The concept of backdoors is hardly
       | foreign to governments, and Apple didn't let any cats out of any
       | bags. This document shows that Apple did go to great lengths to
       | preserve privacy; including the necessity of hashes being
       | verified in two juristictions, the fact that neither the phone
       | nor Apple know if any images have been matched below the
       | threshold; the lack of remote updates of this mechanism; the use
       | of vouchers instead of sending the full image; the use of
       | synthetic vouchers; and on and on.
       | 
       | Furthermore, the risks of the risks of this mechanism are lower
       | than the existing PhotoDNA used by practically every competing
       | service. Those have no thresholds; the human review process is
       | obscure; there is no claim that other pictures won't be looked
       | at.
       | 
       | The controversy, the fact that it uses an on-device component.
       | But PhotoDNA-solutions fail many of Apple's design criterias,
       | which require an on-device solution:
       | 
       | - database update transparency
       | 
       | - matching software correctness
       | 
       | - matching software transparency
       | 
       | - database and software universality
       | 
       | - data access restrictions.
       | 
       | What about the concerns that the hash databases could be altered
       | to contain political images? PhotoDNA could be, too; but would be
       | undetectable unlike with Apple's solutions. Worse: with
       | serverside solutions, the server admin could use an altered hash
       | DB only for specific target users, causing harm without arousing
       | suspicion. Apple's design prevents this since the DB is
       | verifiably universal to all users.
       | 
       | A rational look at every argument I've seen against Apple's
       | solution indicates that it is strictly superior to current
       | solutions, and less of a threat to users. Cryptographically
       | generating vouchers and comparing hashes is categorically not
       | "scanning people's phones" or a "backdoor." I think the more
       | people understand its technical underpinnings, the less they'll
       | see similarities with sinister dystopian cyberpunk aesthetics.
        
         | pseudalopex wrote:
         | > If Apple has evil intent, or is being coerced by NSLs, they
         | would (be forced to) implement the dangerous mechanism whether
         | this Child Safety feature existed or not.
         | 
         | Apple used to fight implementing dangerous mechanisms. And
         | succeeded.[1]
         | 
         | [1] https://en.wikipedia.org/wiki/FBI-Apple_encryption_dispute
        
           | shuckles wrote:
           | Their "you can't compel us to build something" argument was
           | for building a change to the passcode retry logic, which is
           | presumably as simple as a constant change. Certainly building
           | a back door in this system is at least as difficult, so the
           | argument still stands.
        
             | mediumdeviation wrote:
             | In that specific case at least the FBI is asking Apple to
             | produce new firmware that will bypass existing protection
             | on an existing device - basically asking Apple to root a
             | locked down phone which would likely require them breaking
             | their own encryption or finding vulnerabilities in their
             | own firmware. This is not exactly technically trivial since
             | anything of this sort that's already known would be patched
             | out.
             | 
             | In this case it seems only a matter of policy that Apple
             | submits CSAM hashes to devices for scanning. No new
             | software needs to be written, no new vulnerability found.
             | There's no difference between a hash database of CSAM, FBI
             | terrorists photos, or Winnie the Pooh memes. All that's
             | left is Apple's words that their policy is capable of
             | standing up to the legal and political pressures of
             | countries it operates in. That seems like a far lower bar.
        
               | shuckles wrote:
               | No, they were asked to sign a software update which would
               | bypass the application processor enforced passcode retry
               | limit on an iPhone 5C which had no secure element.
        
         | Syonyk wrote:
         | I understand the technologies they're proposing deploying at a
         | decent level (I couldn't implement the crypto with my current
         | skills, but what they're doing in the PSI paper makes a
         | reasonable amount of sense).
         | 
         | The problem is that this "hard" technological core (the crypto)
         | is subject to an awful lot of "soft" policy issues around the
         | edge - and there's nothing but "Well, we won't do that!" in
         | there.
         | 
         | Plus, the whole Threat Model document feels like a 3AM
         | brainstorming session thrown together. "Oh, uh... we'll just
         | use the intersection of multiple governments hashes, and,
         | besides, they can always audit the code!" Seriously, search the
         | threat model document for the phrase "subject to code
         | inspection by security researchers" - it's in there 5x. How,
         | exactly, does one go about getting said code?
         | 
         | Remember, national security letters with gag orders attached
         | exist.
         | 
         | Also, remember, when China and Apple came to a head over iCloud
         | server access, Apple backed down and gave China what they
         | wanted.
         | 
         | Even if _this, alone_ isn 't enough to convince you to move off
         | Apple, are you comfortable with the trends now clearly visible?
        
           | FabHK wrote:
           | > Even if this, alone isn't enough to convince you to move
           | off Apple, are you comfortable with the trends now clearly
           | visible?
           | 
           | Still much better than all but the most esoteric inconvenient
           | alternatives.
        
             | Syonyk wrote:
             | And if those are all that's left that meet your criteria
             | for a non-abusive platform, then... well, that's what
             | you've got to work with. Maybe try to improve those non-
             | abusive platforms.
             | 
             | I'm rapidly heading there. I'm pretty sure I won't run
             | Win11 given the hardware requirements (I prefer keeping
             | older hardware running when it still fits my needs) and the
             | _requirement_ for an online Microsoft account for Win11
             | Home (NO, and it 's pretty well stupid that I have to
             | literally disconnect the network cable to make an offline
             | account on Win10 now, and then disable the damned nag
             | screens).
             | 
             | If Apple is going full in on this whole "Your device is
             | going to work against you" thing they're trying for,
             | well... I'm not OK with that either. That leaves Linux and
             | the BSDs. Unfortunately, Intel isn't really OK in my book
             | either with the fact that they can't reason about their
             | chips anymore (long rant, but L1TF and Plundervolt allowing
             | pillage of the SGX guarantees tells me Intel can't reason
             | about their chips)... well. Hrm. AMD or ARM it is, and
             | probably not with a very good phone either.
             | 
             | At this point, I'm going down that road quite quickly, far
             | sooner than I'd hoped, because I do want to live out what I
             | talk about with regards to computers, and if the whole
             | world goes a direction I'm not OK with, well, OK. I'll find
             | alternatives. I accept that unless things change, I'm
             | probably no more than 5-10 years away from simply
             | abandoning the internet entirely outside work and very
             | basic communications. It'll suck, but if that's what I need
             | to do to live with what I claim I want to live by, that's
             | what I'll do.
             | 
             | "I think this is a terrible idea and I wish Apple wouldn't
             | do it, but I don't care enough about it to stop using Apple
             | products" is a perfectly reasonable stance, but it does
             | mean that Apple now knows they can do more of this sort of
             | thing and get away with it. Good luck with the long term
             | results of allowing this.
        
               | dwaite wrote:
               | > And if those are all that's left that meet your
               | criteria for a non-abusive platform, then... well, that's
               | what you've got to work with. Maybe try to improve those
               | non-abusive platforms.
               | 
               | There's always the potential to work on the underlying
               | laws. Not every problem can be solved by tech alone.
        
             | arvinsim wrote:
             | Then people need to start asking themselves if privacy is
             | really a value they really hold or is just an empty,
             | bandwagon idealism.
             | 
             | Because sacrificing privacy for convenience is why we got
             | to this point.
        
           | [deleted]
        
       | nokya wrote:
       | A threat model without a discussion on threats. Nice.
        
       | gandalfgeek wrote:
       | Sounds like it still comes down to trust. Quoting from the paper:
       | 
       | "Apple will refuse all requests to add non-CSAM images to the
       | perceptual CSAM hash database; third party auditors can confirm
       | this through the process outlined before. Apple will also refuse
       | all requests to instruct human reviewers to file reports for
       | anything other than CSAM materials for accounts that exceed the
       | match threshold."
        
         | 0xy wrote:
         | It's already a lie, considering NCMEC's database already has
         | non-CSAM images. It won't ever be true, since the hundreds and
         | possibly thousands of entities with direct and indirect access
         | to the database can upload SunnyMeadow.jpg labeled as CSAM
         | content, and it's blindly accepted.
         | 
         | If Apple is receiving a hash labeled as "CSAM", how can they
         | possibly make this guarantee? They do not know it's CSAM. It's
         | unverifiable and comes down to "trust us".
         | 
         | Remember, NCMEC is an unaccountable organisation inextricably
         | linked with the FBI and the U.S. government. It is not subject
         | to FOIA requests. Its database has never been audited, and it
         | is known to be a mess already.
        
           | simondotau wrote:
           | > _" hundreds and possibly thousands of entities with direct
           | and indirect access to the database can upload
           | SunnyMeadow.jpg labeled as CSAM content, and it's blindly
           | accepted."_
           | 
           | According to whom?
           | 
           | > _" it is known to be a mess already."_
           | 
           | According to whom? You've posted a number of claims about the
           | CSAM database administration process on HN and the one
           | instance where you've cited a source, it's been a junk
           | source.
        
         | celeritascelery wrote:
         | To me, the fact that they are making such a strong statement is
         | a big deal. If they had plans or thought that they would be
         | forced to add other content they would not be so direct in this
         | statement. I hope I am not wrong. As you said: trust.
        
       | makecheck wrote:
       | I don't like the idea of stuff running on my device, consuming my
       | battery and data, when the _only point_ is to see if _I_ am doing
       | something wrong?
       | 
       | An analogy I can come up with is: the government hires people to
       | visit your house every day, and while they're there they need
       | your resources (say, food, water, and electricity). In other
       | words, they use up some of the stuff you would otherwise be able
       | to use only for yourself -- constantly -- and the only reason
       | they're there is to see if you're doing anything wrong. Why would
       | you put up with this?
        
         | tzs wrote:
         | A better fitting analogy would be you ask FedEx to pick up a
         | package at your house for delivery, and they have a new rule
         | that they won't pick up packages unless they can look inside to
         | make sure that they are packed correctly and do not contain
         | anything that FedEx does not allow.
         | 
         | When they open and look into the package while in your house
         | for the pickup, they are using your light to see and your
         | heating/cooling is keeping the driver comfortable while they
         | inspect the package.
        
           | weiliddat wrote:
           | I'd say the analogy is closer to, they'll x-ray your package
           | to make sure there aren't any prohibited/dangerous items.
        
         | [deleted]
        
         | shuckles wrote:
         | The point of it is to make sure iCloud Photos remains a viable
         | service in light of real and perceived regulatory threats, and
         | possibly leave the door open to end to end encryption in the
         | future.
        
           | gpm wrote:
           | There is no regulatory threat within the US that could
           | require this happen, if this was demanded by the government
           | it would be a blatant violation of the 4th amendment. Apple
           | should have stood their ground if this was in response to
           | perceived government pressure.
        
             | dwaite wrote:
             | IMHO, the relevant regulations are pretty carefully worded
             | to make 4th amendment defenses harder. In this case, Apple
             | has some liability for criminal activity on their platform
             | whether it is was E2E encrypted or not.
        
             | matwood wrote:
             | There are two gov. issues. One is if the FBI comes knocking
             | and the other is new laws. The government could absolutely
             | write a law banning e2ee so that when the FBI does knock
             | with a warrant they can get access.
             | 
             | In the past, Apple has done what they can to stand their
             | ground against the first, but they (like any other company)
             | will have to comply with any laws passed.
             | 
             | Whether the 'if a warrant is obtained the gov. should be
             | granted access' law is a violation of the 4th amendment
             | remains to be seen.
        
               | gpm wrote:
               | When the FBI does knock with a warrant _they already get
               | access_ to data on iCloud. iCloud is not meaningfully
               | encrypted to prevent this.
               | 
               | The FBI is unable to get a warrant to search data _on
               | everyones phones_ , regardless of what laws are passed.
               | They might be able to get a warrant to search all the
               | data on apple's server (I would consider this unlikely,
               | but I don't know of precedent in either direction), but
               | that data is fundamentally not on everyones phones. This
               | isn't a novel legal question, you cannot search everyones
               | devices without probable cause that "everyone" committed
               | a crime.
        
           | jjcon wrote:
           | So they open a huge back door instead of waiting for a threat
           | and fighting that?
        
           | least wrote:
           | This is a false dichotomy that is being pushed _constantly_
           | online. E2EE is possible without this tech and in fact is
           | only useful without tech like this to act as a MITM.
        
         | nearbuy wrote:
         | A lot of things work this way. The government is quite
         | literally taxing some of your income and using it to check if
         | you're doing anything wrong.
         | 
         | When a cop is checking your car for parking violations, they're
         | using some of your taxpayer dollars to see if you're doing
         | something wrong. You don't benefit at all from your car getting
         | checked. But you probably benefit from everyone else's cars
         | being checked, which is why we have this system.
         | 
         | Or similarly, you don't benefit from airport security searching
         | your luggage. It's a waste of your time and invasion of your
         | privacy if you don't have anything forbidden and even worse for
         | you if you do have something forbidden. But you help pay for
         | airport security anyway because it might benefit you when they
         | search other people's luggage.
         | 
         | You don't benefit at all from having your photos scanned. But
         | you (or someone else) might benefit from everyone else's photos
         | being scanned (if it helps catch child abusers or rescue abused
         | children). It's just a question of whether you think the
         | benefit is worth the cost.
        
         | literallyaduck wrote:
         | So 3rd amendment defense? These are digital soldiers being
         | quartered in our digital house.
        
           | dwaite wrote:
           | Probably a clarification of the 4th in order; there are
           | multiple levels of government and private institutions at
           | play to disguise that the government is pressuring private
           | institutions to do search and seizure on their behalf.
        
       | hendersoon wrote:
       | I have no doubt the features work exactly the way Apple said they
       | do. Seriously.
       | 
       | I very much doubt that they will refuse to scan for whatever
       | China asks. I very much doubt they'll risk the PRC shutting down
       | Foxconn factories.
       | 
       | I very much doubt Apple will ultimately be able to resist
       | scanning for whatever the US Government asks in a national
       | security letter attached to a gag order. They will take them to a
       | secret court session which we'll never hear about, the court will
       | rule in the government's favor, and Apple will be forced to
       | comply and not allowed to tell anybody.
       | 
       | This happened to Cloudflare and they were gagged while fighting
       | it for 4 years. They didn't win in court, but the government
       | eventually dropped the case as it was no longer needed. And this
       | started under Obama, not Trump.
       | 
       | Now, will Apple bother to fight this out in court? I think they
       | probably will. But we won't know about that until they either
       | definitively win or the government gives up. If the government
       | wins, on the other hand, we'll NEVER hear about it.
        
         | heavyset_go wrote:
         | > _Now, will Apple bother to fight this out in court? I think
         | they probably will._
         | 
         | Why would they? Each year they give up customer data on over
         | 150,000 users without a fight when the US government simply
         | asks them for it[1].
         | 
         | [1] https://www.apple.com/legal/transparency/us.html
        
       | underseacables wrote:
       | What's not discussed in this Paper is how it won't be used by
       | third-party actors to falsely accuse someone. Does nobody
       | remember the iCloud hack? What about instead of downloading
       | Jennifer Lawrence's photos, a hacker uploaded children to get
       | someone falsely accused?
        
         | thephyber wrote:
         | > Does nobody remember the iCloud hack?
         | 
         | The iCloud hack was a few hundred cases of spear phishing.
         | 
         | How does your concern differ from physical access to the phone
         | by an unauthorized user?
         | 
         | I think the law is the problem. Simple possession of that type
         | of content is criminalized, as opposed to requiring the
         | traditional elements of a crime to prosecute it.
        
         | dogma1138 wrote:
         | Google, Microsoft and a bunch of other services already scan
         | for CSAM materials, yet this hasn't been an issue so far. In
         | fact I can't really find a single case when someone was framed
         | like this, despite how easy it is to fill someone's account
         | with CSAM and call the cops/wait.
         | 
         | I don't like the privacy and property rights issues of this but
         | the whole someone will use this to frame someone is quite BS.
        
           | nonbirithm wrote:
           | Here is one case which was allegedly caused by malware.
           | Several people were arrested and subsequently acquitted.
           | 
           | I would say that the opt-out preference on iPhones that by
           | default uploads all photos to iCloud combined with a remote
           | execution payload like Pegasus would render such an attack
           | plausible.
           | 
           | https://en.wikipedia.org/wiki/United_States_v._Solon?wprov=s.
           | ..
        
           | fuckcensorship wrote:
           | How would someone prove that they were framed by someone
           | planting CSAM? Without evidence, how likely is it that they
           | would be believed?
        
           | underseacables wrote:
           | But the services you mentioned don't do it on the device. The
           | blind trust in Apple that everything will be ok is quite BS,
           | and very naive .
        
           | Renaud wrote:
           | 1) how would you know someone's being framed if it's not
           | detected as such? in this case it just looks like the target
           | had CSAM on their machine/account and they are found guilty,
           | despite their objections, maybe because no-one really looked
           | closely at the evidence, or it was planted well enough.
           | 
           | 2) if it is detected as a plant. Would you know it? Would it
           | fly above our radar? There is evidence this happens [1], some
           | cases are quite high profile [2].
           | 
           | 3) This technique is already in use. Russia is allegedly
           | using it to discredit opponents [3]. Do you think you would
           | hear from it if it was done by secret services of any country
           | to discredit or get rid of foreign targets or even some at
           | home?
           | 
           | 4) it's the ultimate weapon against foes and opponents. It's
           | a dangerous one but I have to wonder how many estranged
           | spouses get revenge planting evidence [4], or how many
           | journalists or troublemakers have been the recipient of some
           | carefully crafted planting. I mean, nearly anyone reading HN
           | could do it sufficiently well to get someone in serious
           | trouble, it's not really a technical issue so it's not hard
           | to imagine it's being done.
           | 
           | [1]: https://www.independent.co.uk/news/uk/crime/handyman-
           | planted...
           | 
           | [2]: https://www.miaminewtimes.com/news/shaq-hit-with-
           | lawsuit-for...
           | 
           | [3]:
           | https://www.nytimes.com/2016/12/09/world/europe/vladimir-
           | put...
           | 
           | [4]: https://www.crimeonline.com/2019/07/04/wife-plants-
           | child-por...
        
           | staticassertion wrote:
           | As I recall someone's neighbor once tried to frame them by
           | downloading CSAM on their network and then calling the police
           | or something like that.
        
         | azinman2 wrote:
         | How is that not already the case for Google, Facebook, Dropbox,
         | Microsoft, and I'm sure countless others? Or are you just
         | suggesting that now Apple gets added to this existing long
         | list, and that's particularly notable?
        
       | throwawaymanbot wrote:
       | Chinafy the population.
        
       | tehnub wrote:
       | Their use of the phrase "This claim is subject to code inspection
       | by security researchers like all other iOS device-side security
       | claims" stood out to me.
       | 
       | Could someone tell me how that inspection works? Are there
       | researchers who are given the source code?
       | 
       | (I posted this on another thread [0] earlier, but it's more
       | relevant here) [0]: https://news.ycombinator.com/item?id=28175619
        
         | skybrian wrote:
         | It's not my field, but my understanding is that security
         | researchers often disassemble and/or debug binaries without
         | having access to the source code.
         | 
         | For example, as soon as an OS update is released, people will
         | take it apart.
         | 
         | Having source might be nice but it's not necessary.
        
           | [deleted]
        
         | eivarv wrote:
         | Could someone tell me how that inspection works? Are there
         | researchers who are given the source code?
         | 
         | It doesn't (with the exception of an exclusive program[0]). Not
         | only do they generally make security work on their devices
         | difficult - they have a history of suing companies that
         | facilitate it [1].
         | 
         | [0]: https://developer.apple.com/programs/security-research-
         | devic...
         | 
         | [1]: https://www.theverge.com/2021/8/11/22620014/apple-
         | corellium-...
        
         | kevinsundar wrote:
         | Apple actually gives special devices to Security Researchers
         | that allow them further access into the device than a normal
         | consumer device: https://developer.apple.com/programs/security-
         | research-devic...
         | 
         | In this way, third party security researchers can verify their
         | claims. It actually works out pretty well for them since third
         | party security researchers often find pretty severe
         | vulnerabilities through this program.
        
           | [deleted]
        
           | robryk wrote:
           | That program, at least when it was introduced, required
           | participants not to report security vulnerabilities publicly
           | until Apple allowed them to do so, with no limits on how long
           | that can be (see
           | https://news.ycombinator.com/item?id=23920454 for a
           | discussion from that time).
           | 
           | That makes this program particularly useless for the purpose
           | of auditing whether Apple is adhering to its promises.
        
             | dwaite wrote:
             | For security issues where the participants basically make
             | their living indirectly by getting credit for security
             | vulnerabilities, this carrot-and-stick potentially
             | motivates them to stay quiet.
             | 
             | Meanwhile, researchers have gotten wise to notary
             | techniques (like publishing document hashes to twitter)
             | which would let them severely and publicly shame Apple
             | should they sit on something that ultimately turns out to
             | be a zero day, with much delight from/participation by the
             | media.
             | 
             | For privacy/societal issues where Apple is a deliberate bad
             | actor, they would presumably either directly be willing to
             | break the terms of the agreement to go public, or would
             | release information indirectly and rely on herd privacy
             | with other researchers.
        
       | TrumpRapedWomen wrote:
       | Why don't they just scan photos service side? Are they trying to
       | save on CPU costs?
        
       | jorgesborges wrote:
       | I keep changing my mind on this. On the one hand I already
       | operate on the assumption that uploading data to a cloud service
       | renders that data non-private, or at least in great risk of
       | becoming non-private in the future. This simply makes my
       | operating assumption explicit. Also this particular
       | implementation and its stated goals aren't egregious.
       | 
       | But then there's the slippery slope we've all been discussing --
       | and the gradual precedents set for it, this being a nail in the
       | coffin. But I sympathize with Apple for making transparent what I
       | assume happens behind closed doors anyway. My optimistic side
       | thinks maybe this is a sign that extensions of this project will
       | also be made explicit. Maybe this is the only way for them to
       | take this fight if they're being pressured. In any case now is
       | the time to speak out. If they extend it to texts and other
       | documents I'm out - even as it stands I'm thinking of
       | alternatives. But the fact we're all discussing it now is at
       | least a point of hope.
        
         | naasking wrote:
         | > But I sympathize with Apple for making transparent what I
         | assume happens behind closed doors anyway.
         | 
         | Using your devices' CPU and battery seems more egregious than
         | doing it on their servers. If they want to help law
         | enforcement, then they should pay for it. Of course they want
         | to help law enforcement by forcing other people to pay the
         | costs.
         | 
         | Imagine if Ford came out with a cannabis sensor in their cars
         | that automatically called the cops on you if it detected
         | cannabis inside the car. The sensor is allegedly only active
         | when you're _renting_ a Ford vehicle, not if you purchase it
         | outright. Not a perfect analogy, but how comfortable would you
         | find this situation?
        
           | Spivak wrote:
           | Let's do a more realistic example. Ford introduces a "driver
           | safety" mechanism where the car requires a clean breathalyzer
           | reading in order to start. If it fails it pops up a message
           | that reminds you that drunk driving is illegal but doesn't
           | actually stop you from starting the engine. It then sends
           | Ford the results in an encrypted payload along with 1/30th of
           | the decryption key.
           | 
           | After 30 reports someone at Ford opens the payloads, looks at
           | the results and, decides whether to contact the police based
           | on how damning the evidence is.
           | 
           | Because all the tests are performed locally you're never
           | giving up any of your privacy unless you actually are driving
           | drunk or are the 1/billion and get 30 false positives. I feel
           | like this is a much stronger than the "if you have nothing to
           | hide" meme because local testing like this lets you can have
           | everything to hide while still revealing specific bad
           | behavior.
           | 
           | Like if the TSA had scanners that somehow could only reveal
           | banned items and nothing else I'm not sure I would even
           | consider it a privacy violation.
        
             | naasking wrote:
             | That's not a realistic analogy because driving is a
             | privilege you must earn, not a human right.
        
       | PaulDavisThe1st wrote:
       | > Apple generates the on-device perceptual CSAM hash database
       | through an intersection of hashes provided by at least two child
       | safety organizations operating in separate sovereign
       | jurisdictions - that is, not under the control of the same
       | government. Any perceptual hashes appearing in only one
       | participating child safety organization's database, or only in
       | databases from multiple agencies in a single sovereign
       | jurisdiction, are discarded by this process, and not included in
       | the encrypted CSAM database that Apple includes in the operating
       | system.
       | 
       | Well, that's quite clever, isn't it.
       | 
       | It's just the sort of thing that someone who had _" never heard
       | of"_ the Five Eyes, or extradition treaties or illegal rendition
       | operations involving multiple nations, might come up with.
       | 
       | Genius!
        
         | zionic wrote:
         | China: here's our database and here is one for Hong Kong.
         | 
         | They're totally separate we promise.
        
           | anticensor wrote:
           | Separate but identical, indeed.
        
         | cmsj wrote:
         | If your threat model includes pervasive spying by multiple
         | nation states, and being grabbed in the night by black
         | helicopters, it seems unlikely you'll be overly concerned about
         | them precisely inserting at least 30 of your photos into
         | multiple CSAM databases and also co-ercing Apple's manual
         | review to get you reported to NCMEC.
        
           | PaulDavisThe1st wrote:
           | I think you have this backwards.
           | 
           |  _Given_ that nations already  "grabbing in the night with
           | black helicopters" (semantically, at least), and do so with
           | impunity, it doesn't seem much of a stretch to imagine they'd
           | potentially set someone up using this much milder sort of
           | approach.
        
           | sylens wrote:
           | I don't think people are worried about multiple nation states
           | framing them with CSAM photos - they're worried about
           | multiple nation states in an intelligence collaboration
           | poisoning both sets of hash lists with non-CSAM material, so
           | that there is an intersection that makes it onto the device.
           | 
           | There is still that Apple human reviewer once the threshold
           | has passed. What I would love to ask Apple is - what happens
           | if/when their reviewers start noticing political material,
           | religious material, etc. is being flagged on a consistent
           | basis, thereby insinuating that the hash list has been
           | poisoned. What's their play at that point?
        
             | jdlshore wrote:
             | The document states that incorrectly flagged items are
             | forwarded to engineering for analysis. Given their target
             | false-positive rate (1 in 3 trillion, was it?) it seems
             | likely that engineering would very carefully analyze a rush
             | of false positives.
        
         | nullc wrote:
         | well crap I just wrote a response lecturing you on the Five
         | Eyes collusion before reading the last two lines of your post.
         | :P
         | 
         | There exists a kind of sufficiently advanced stupidity that it
         | can only be constructed by really smart people. People who are
         | smart enough to rationalize doing something which is
         | transparently wrong.
        
         | dexter89_kp3 wrote:
         | Curious, does any technological system used widely, standup to
         | the threat levels you mentioned?
        
           | PaulDavisThe1st wrote:
           | Probably not.
           | 
           | Does any technological system used widely justify itself by
           | saying "it would take two national jurisdictions cooperating
           | to break this?"
        
             | dexter89_kp3 wrote:
             | Was not defending this measure, my question was out of
             | curiosity
        
       | drodgers wrote:
       | > The second protection [against mis-inclusion of non CSAM
       | hashes] is human review: there is no automated reporting in
       | Apple's system. All positive matches must be visually confirmed
       | by Apple as containing CSAM before Apple will disable the account
       | and file a report with the child safety organization.
       | 
       | I don't understand this at all. As I understand it, part of the
       | problem is that -- in the US -- Apple isn't legally _allowed_ to
       | review, transmit or do anything else with suspected CSAM images,
       | so they can 't have a manual review process (or even check that
       | their neural hashing is working as expected on the real dataset).
       | 
       | Does anyone else have any idea of what this is trying to
       | describe?
       | 
       | If Apple really are somehow reviewing flagged photos to confirm
       | that they're CSAM and not maliciously flagged files before
       | sending any reports, then that does make the system substantially
       | more resilient to Five Eyes abuse (not that I wish that job on
       | anyone).
       | 
       | Edit: There's more context later in the document
       | 
       | > First, as an additional safeguard, the visual derivatives
       | themselves are matched to the known CSAM database by a second,
       | independent perceptual hash. This independent hash is chosen to
       | reject the unlikely possibility that the match threshold was
       | exceeded due to non-CSAM images that were adversarially perturbed
       | to cause false NeuralHash matches against the on-device encrypted
       | CSAM database. If the CSAM finding is confirmed by this
       | independent hash, the visual derivatives are provided to Apple
       | human reviewers for final confirmation.
       | 
       | This is more confusing: visually comparison using a second
       | perceptual hash doesn't actually to provide any protection
       | against mis-inclusion of non-CSAM images: it just double-checks
       | that the image really was a match in the database (ie. protects
       | against hash-collision errors), but it does't check that the
       | database contained actual CSAM.
       | 
       | Apple explicitly says that this process protects against mis-
       | inclusion though, which doesn't make sense to me yet.
        
         | zionic wrote:
         | It's simple, the reviewers will be cops (so viewing CP is
         | legal).
         | 
         | They will pass on/hit report at rates that make a FISA judge
         | blush.
        
         | shuckles wrote:
         | Your understanding is incorrect. Apple can, and is in fact
         | required to, verify that they have actual CSAM before
         | forwarding it to the Cyber Tip line. At that point, they must
         | delete the information within 60 days.
        
           | heavyset_go wrote:
           | > _Apple can, and is in fact required to, verify that they
           | have actual CSAM_
           | 
           | It's not Apple's job to decide if something is CSAM or not.
           | They're required to report it if they suspect it is, even if
           | they can't confirm it.
        
           | drodgers wrote:
           | Interesting. In that case, do you know why they talk about
           | reviews only seeing "visual derivatives" (from the second
           | perceptual hash)?
           | 
           | Either these 'derivatives' basically contain the original
           | image (so reviewers can verify that it's actual CSAM) and
           | there's no point in using derivatives at all, or they're more
           | abstract (eg. 8x8 pixellated images) in which case the
           | reviewer can't see the actual content (but could confirm a
           | database match).
           | 
           | Edit: I was able to find the answer, they suggest that the
           | 'visual derivative' is something like a 'low-resolution
           | version' of the original image, so the content should still
           | be clearly visible.
        
             | shuckles wrote:
             | Apple claims that "data minimization" is one of their
             | privacy pillars, and this was probably an attempt at that.
             | You could imagine an inverted colors or lower res image
             | gets the point across without subjecting you or Apple to
             | all the high fidelity images you have in your library.
        
         | FabHK wrote:
         | Your phone transmits the images with their security envelope
         | (which was computed on device and contains neural hash and
         | "visual derivative") to the iCloud server. During that process,
         | Apple does not know whether there's any CSAM in it, so they can
         | transmit legally.
         | 
         | Then the server determines whether the number of matches
         | exceeds the threshold. Only if that is the case (by crypto
         | magic) can the security envelope of the flagged images only (by
         | crypto magic) be unlocked, and the "visual derivative" be
         | reviewed.
         | 
         | (Note that if (at a later stage) E2EE is enabled for the
         | photos, the images themselves would never be accessible by
         | Apple or LE, whether flagged or not, if I understand the design
         | correctly).
        
           | dwaite wrote:
           | Your understanding seems correct. After a positive evaluation
           | from Apple, the CyberTipline report is filed to NCMEC, which
           | operates as a clearinghouse and notifies law enforcement.
           | 
           | Law enforcement then gets a court order, which will cause
           | Apple to release requested/available information about that
           | account.
           | 
           | If photos later are outside the key escrow system, Apple
           | would not be able to release the photos encryption keys, and
           | would only be able to share 'public' photo share information
           | which the user has opted to share without encryption to make
           | it web-accessible.
           | 
           | Presumably in this case, the visual derivatives would still
           | be used as evidence, but there would be 5th amendment
           | arguments around forcing access to a broader set of photos.
        
       | makerofthings wrote:
       | My phone is an extension of my brain, it is my most trusted
       | companion. It holds my passwords, my mail, my messages, my
       | photos, my plans and notes, it holds the keys to my bank accounts
       | and my investments. I sleep with it by my bed and I carry it
       | around every day. It is my partner in crime.
       | 
       | Now apple are telling me that my trusted companion is scanning my
       | photos as they are uploaded to iCloud, looking for evidence that
       | apple can pass to the authorities? They made my phone a snitch?
       | 
       | This isn't about security or privacy, I don't care about
       | encryption or hashes here, this is about trust. If I can't trust
       | my phone, I can't use it.
        
         | sneak wrote:
         | This is the thing Apple doesn't realize: our phone's OS vendor
         | has one job: not to betray us.
         | 
         | They have betrayed us.
         | 
         | I have purchased my last iPhone.
        
           | cucumb3rrelish wrote:
           | good luck finding a replacement, you'll end up using a custom
           | rom if you are serious about tracking and privacy, and that
           | custom rom will be... fully dependent on donations.
        
         | tsjq wrote:
         | of all the items in the list, this is what the govt is
         | interested in :
         | 
         | > It is my partner in crime.
        
         | zabatuvajdka wrote:
         | Technically they could have been scanning the photos already to
         | power some AI algorithms or whatever else.
         | 
         | I think this is a nudge for folks to wake up and see the
         | reality of what it means to use the cloud. We are leasing
         | storage space from Apple in this case.
         | 
         | Technically, it's no different than a landlord checking up on
         | their tenants to make sure "everything is okay."
         | 
         | And technically, if you do not like iCloud, don't use it and
         | roll your own custom cloud storage! After all, it's redundancy
         | and access the cloud provides. And Apple provides APIs to build
         | them.
         | 
         | Hell, with the new Files app on iOS i just use a Samba share
         | with my NAS server (just a custom built desktop with RAID 5 and
         | Ubuntu.)
        
           | jbuhbjlnjbn wrote:
           | No they really couldn't, because someone investigating,
           | monitoring and reverse engineering the device traffic might
           | have noticed, it would be leaked by a whistleblower, there
           | are plenty of ways this could ruin Apple.
           | 
           | Not so now, they are in the clear either way because Apple
           | themselfes cannot look into the hashes databank. So the
           | backdoor is there, the responsibility of the crime of spying
           | is forwarded to different actors, which even cannot be
           | monitored by Apple.
           | 
           | This is truly a devilish device they thought up. Make misuse
           | possible, exonerate any responsibility to outside actors, act
           | naive as if hands are clean.
        
           | a012 wrote:
           | > Technically, it's no different than a landlord checking up
           | on their tenants to make sure "everything is okay."
           | 
           | By this logic, you don't own your phone but rent it from
           | Apple?
        
             | zabatuvajdka wrote:
             | Technically I own my iPhone but the software is licensed to
             | me (not a lawyer but that's my understanding).
             | 
             | We probably also sign something that allows Apple to do
             | what they will with our images--albeit we retain the
             | copyrights to them.
             | 
             | Just to reiterate: the landlord metaphor is referring to
             | software which we lease from Apple--not the device itself.
             | 
             | This is yet another case where I want to emphasize the
             | importance of OPEN hardware and software designs. If we
             | truly want ownership, we have to take ownership into our
             | hands (which means a lot of hard work). Most commercial
             | software is licensed so no we don't own anything in whole
             | that runs commercial software.
        
           | makerofthings wrote:
           | They do scan the photos on my device, to provide useful
           | functionality to me. All good. If they scan photos in iCloud,
           | that's up to apple, I can use it or not. No problem. With
           | this new setup, my device can betray me.I think that is
           | different and crosses a line.
        
       | sylens wrote:
       | I think this is the first time they have mentioned that you will
       | be able to compare the hash of the database on your device with a
       | hash published in their KB article. They also detailed that the
       | database is only the intersection of hash lists from two child
       | safety organizations under separate governmental jurisdictions.
       | 
       | My immediate thought is that this could still be poisoned by Five
       | Eyes participants, and that it does not preclude state actors
       | forcing Apple to replicate this functionality for other purposes
       | (which would leave the integrity of the CSAM database alone, thus
       | not triggering the tripwire).
        
         | notJim wrote:
         | > this could still be poisoned by Five Eyes participants, and
         | that it does not preclude state actors forcing Apple to
         | replicate this functionality for other purposes
         | 
         | The thing is, if this is your threat model you're already
         | screwed. Apple has said they comply with laws in jurisdictions
         | where they operate. The state can pass whatever surveillance
         | laws they want, and I do believe Apple has shown they'll fight
         | them to an extent, but at the end of the day they're not going
         | to shut down the company to protect you. This all seems
         | orthogonal to the CSAM scanning.
         | 
         | Additionally, as laid out in the report, the human review
         | process means even if somehow there is a match that isn't CSAM,
         | they don't report it until it has been verified.
        
         | vmception wrote:
         | the opportunity being to add general functions in photo viewing
         | apps that add a little entropy to every image (for this
         | specific purpose), to rotate hashes, rendering the dual
         | databases useless
         | 
         | monetization I guess being to hope for subscribers on github,
         | as this could likely just be a nested dependency that many apps
         | import. a convenient app for this specific purpose might not
         | last long in app stores.
        
           | initplus wrote:
           | Perceptual hashes are specifically designed to be resistant
           | to minor visual alterations.
        
             | vmception wrote:
             | Let the cat and mouse race begin
             | 
             | Any perceptual hash apps to test that theory on?
        
       | semerda wrote:
       | "Human review and reporting" - fact checkers?
       | 
       | Or are we to assume that their AI is 100% effective.
       | 
       | This will not end well..
        
       | lifty wrote:
       | Is there a good argument for doing the scanning on the phone and
       | not on the iCloud servers?
        
         | kemayo wrote:
         | Well, theoretically this could be a step towards e2e encrypting
         | the photos so that Apple _can 't_ see them on their servers.
         | 
         | This entire "secure tickets that unlock a derivative of your
         | photos so we can review them" system would, in this
         | interpretation, be there as a way to neuter law enforcement
         | objections[1] to no longer being able to serve warrants for the
         | contents of your photo library.
         | 
         | Now, will this _actually_ happen? Hard to say.
         | 
         | [1] There was a story last year claiming that Apple wanted to
         | turn on e2e, but the FBI got them to drop it.
         | https://www.popularmechanics.com/technology/security/a306318...
        
         | himaraya wrote:
         | No. Contrary to Craig's claims, Google & Microsoft et al. do
         | similar hash matching on the cloud, instead of 'looking at
         | images'.
        
           | notJim wrote:
           | But people consider that bad! With this system, Apple can
           | never look at your photos in the cloud. They are a pipe for
           | the bits, but they don't examine the contents, except on your
           | device.
           | 
           | I'm not saying I necessarily agree that this is better, all-
           | told (mixed feelings), but I think it's worth acknowledging
           | why a rational actor might consider this superior to Google
           | doing whatever the fuck they want with anything you upload.
        
             | himaraya wrote:
             | I mean, Apple can, as long as iCloud remains unsecured by
             | e2e, the roadmap for which remains MIA.
        
               | notJim wrote:
               | They can, but they say they don't. Of course if you're
               | uploading data to their servers (and to a lesser extent,
               | using their OS), you are trusting them to some degree.
        
         | LexGray wrote:
         | Photos are currently encrypted e2e already so Apple does not
         | have the access server side to create the hash unlike their
         | competition who chews their customers data for machine
         | learning. https://support.apple.com/en-us/HT202303
        
           | kemayo wrote:
           | You've misread that page -- the table just refers to content
           | being stored in an encrypted form, even if Apple still
           | possesses a key. There's a list below it of fully end-to-end
           | encrypted content, which doesn't include Photos.
        
             | LexGray wrote:
             | True enough. Apple does have the key. Well then instead the
             | answer is that they are having you pay the CPU and energy
             | cost instead of wasting twice the energy decrypting and
             | hashing your photos server side.
        
               | dwaite wrote:
               | You were sort of both right.
               | 
               | Apple has the keys in escrow across several HSMs separate
               | from the cloud hosting environment. Nothing within iCloud
               | can actually see the content, but there are processes to
               | get access to a particular account for various reasons
               | (such as government order). There is a separation of
               | permissions as well as hardware-generated audit logs.
               | Reportedly the HSM systems were specifically manipulated
               | to not be extensible via software updates.
               | 
               | So it is E2E with a separate key escrow system, which
               | Apple (wisely) does not attempt to call E2E.
               | 
               | Apple couldn't implement the PhotoDNA-based scanning
               | other providers have done because Apple would need access
               | to the escrowed keys for each user photo uploaded.
        
       | m3kw9 wrote:
       | Setting arguments aside, if ever there is a CSAM scanner I'd let
       | loose in my account is Apples.
        
       | [deleted]
        
       | swiley wrote:
       | Burn it all down.
       | 
       | If you can't understand the theory and reimplement it then you
       | shouldn't even look at it.
       | 
       | The internet is full of pathological wizards and witches. Come
       | here for obscure books but otherwise stay far away.
        
       | istingray wrote:
       | I just found the perfect video clip to represent Apple's response
       | to launching this feature: https://youtu.be/-7XchZICFQU?t=17
        
       | shuckles wrote:
       | It's pretty lonely over here in technical discussion land. Have
       | we considered Reuters's intern's take on this?
        
         | dang wrote:
         | Please don't post unsubstantive comments yourself. If other
         | threads/comments aren't good, options include posting something
         | better, or not posting. Degrading things even further is not a
         | good choice (edit: particularly when the thread is new--threads
         | are extremely sensitive to initial conditions, and a
         | flamebait/unsubstantive comment early on can have a strongly
         | degrading effect).
         | 
         | https://news.ycombinator.com/newsguidelines.html
         | 
         | https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
        
           | shuckles wrote:
           | Sincere question: why doesn't this policy apply to the
           | innumerable number of hot takes which are variations on the
           | exact same, often misinformed, reaction? I'm thinking of
           | comments like, "Apple will send me to prison for taking bath
           | tub pics of my infant!"
           | 
           | I have seen many on here.
        
             | dang wrote:
             | Two answers. First, it does apply. If you see a post that
             | ought to have been moderated but hasn't been, the likeliest
             | explanation is that we didn't see it. We don't come close
             | to seeing everything that gets posted here--there is far
             | too much. You can help by flagging it or emailing us at
             | hn@ycombinator.com.
             | 
             | Second, there are degrees of these things. It's definitely
             | bad for comments to repeat the same shallow things over and
             | over, but the GP was particularly information-free and
             | snarky, and mucking up this thread while claiming to be
             | speaking in favor of technical discussion is particularly
             | counterproductive.
             | 
             | https://hn.algolia.com/?dateRange=all&page=0&prefix=true&so
             | r...
        
               | shuckles wrote:
               | I think the flagging mechanism is rife for abuse. There
               | are clearly more people interested in low information
               | reactions than substantive discussion. If all you act on
               | is reports, you will be biased towards whatever they care
               | about. That is not a good way to moderate for nuance. I'm
               | sure you may have considered this, but I don't think it's
               | coming through in the discussion of this important topic
               | which has been mostly hysterical.
        
               | dang wrote:
               | Reports are not all we act on. We read the threads too.
               | No one can read all of them though.
               | 
               | Discussion on this topic has certainly been mixed, but
               | "mostly hysterical" sounds like an exaggeration to me.
               | People's reactions to these things are conditioned by the
               | cognitive bias that causes us to weight the things we
               | dislike much more strongly than the things we agree with 
               | (https://hn.algolia.com/?dateRange=all&page=0&prefix=true
               | &que...).
        
               | [deleted]
        
               | zepto wrote:
               | Mostly hysterical may technically be an exaggeration, but
               | mostly misinformed, and often dis-informative is not.
               | 
               | This isn't just about people weighing things they dislike
               | more strongly.
               | 
               | It's also about groupthink, confirmation bias, and a lack
               | of curiosity.
               | 
               | HN doesn't have an immune system against straight up
               | misinformation.
        
               | dang wrote:
               | It certainly doesn't! Misinformation and disinformation
               | are terms du jour, but as far as I can tell they are
               | indistinguishable from old-fashioned people-being-wrong-
               | on-the-internet. If you expect an internet forum to be
               | immune from that...well, that's too much to expect. As
               | far as I can tell (and moderating HN for years has really
               | hammered this home), nearly everyone is wrong about
               | nearly everything.
               | 
               | Not only that, but no one (or perhaps we can say for
               | decorum's sake, _nearly_ no one) really cares about the
               | truth. We care about what we like and we want it to win
               | against what we dislike; all the rest is rationalization.
               | Moderating HN has really hammered that one into me as
               | well. Such is human nature, and trying to moderate
               | against it would be futile, not to mention a fast track
               | to burnout. I 've tried, and have the scars.
               | 
               | And at the same time I completely sympathize with the
               | frustration of watching discourse on a major topic being
               | dominated by shallow, indignant repetition that isn't
               | engaging with the specifics of a situation.
               | 
               | From a moderation point of view, when a tsunami of a
               | story washes over HN for a week or more, there's not that
               | much we can do about it. We can prevent it from
               | completely dominating the site; we can ask a few people
               | not to break the site guidelines when we notice that
               | happening; that's about it. Pretending we can do much
               | more than that would be like Canute commanding the waves.
               | 
               | (Btw, maybe I'm mistaking you for someone else but I have
               | the feeling that your comments have gone, over the years,
               | from frequently breaking the site guidelines to doing a
               | pretty good job of respecting them. If that's true,
               | that's amazing and I appreciate it a ton.)
               | 
               | Edit: I could keep adding new paragraphs to this post for
               | almost forever, but here's another factor that moderating
               | HN has hammered into me. It's possible for people to be
               | wrong on both the facts and the arguments and yet for
               | there to be some deeper justice or truth in what they're
               | saying. Oftentimes, conflicts play out like this: someone
               | responds by correcting facts or pointing out flaws in
               | arguments, but they only succeed in provoking an
               | intensified fury and end up drawing it to themselves.
               | That's because they're answering on the level of facts-
               | and-arguments as a way of dismissing, rather than
               | acknowledging, that deeper level of truth or justice that
               | people have strong feelings about. Mostly all this does
               | is increase the amplitude of the conflict. This is very
               | much a co-creation between the conflicting parties--i.e.
               | between the ones who are right (or feel they are) and the
               | other ones who are right (or feel they are). This is the
               | real answer to complaints about groupthink, in my
               | opinion.
        
               | shuckles wrote:
               | Responding to your edit: I think Zepto understands that
               | and tried to direct the conversation that way. For
               | example: https://news.ycombinator.com/item?id=28120649
               | 
               | They even spent time engaging with someone who said CSAM
               | shouldn't matter by arguing that the fact that lots of
               | the general population cares about it makes it a concern
               | worth thinking about.
               | 
               | Even I did, in my own way:
               | https://news.ycombinator.com/item?id=28165116
               | 
               | Perhaps our responses here are in frustration - that we
               | weren't able to engage with the matters at hand with
               | people of similar skill and interests. I suppose one way
               | to read your post is that HN isn't suited for that level
               | of discussion on breaking news with emotional resonance.
               | Maybe another space.
        
               | zepto wrote:
               | Not only that, but no one (or perhaps we can say for
               | decorum's sake, nearly no one) really cares about the
               | truth. > We care about what we like and we want it to win
               | against what we dislike; all the rest is rationalization.
               | 
               | This is a bleak, valueless and ascientific view of the
               | world. It took a while for me to process it. It is
               | consistent with the postmodern post-truth zeitgeist of
               | tribalism and power being the only thing that matters.
               | 
               | > Moderating HN has really hammered that one into me as
               | well. Such is human nature, and trying to moderate
               | against it would be futile, not to mention a fast track
               | to burnout. I've tried, and have the scars.
               | 
               | I can very clearly see why moderating HN would cause you
               | to adopt this view. It's interesting because it really
               | doesn't match the sense of who you are as you come across
               | in your comments.
               | 
               | My own view is that we are not in such a bleak world.
               | Meaning structures are shaped by media and we are only a
               | decade or so into the era where media has become social.
               | This is such a monumental and unprecedented change that
               | it's bound to appear that there is no up or down for a
               | while.
               | 
               | I think it's transient, and doesn't reflect a universal
               | truth about human nature.
               | 
               | For one obvious contradiction - can you think of anyone
               | who actually prefers the idea of a valueless ascientific
               | world? If so, who, and why?
        
               | zepto wrote:
               | I may be the person you are thinking of. I don't think I
               | ever 'frequently' broke the site guidelines, but I
               | definitely don't do it much these days, and it's always a
               | mistake when I do.
               | 
               | With regard to the conflict dynamics - I agree, although
               | the amplification isn't necessarily bad.
               | 
               | Also I think with regard to the 'deeper truth', often
               | it's not so much 'deeper' as simpler.
               | 
               | In this case for example, people strongly dislike being
               | distrusted - it might well be a genetic disposition since
               | it is a crucial part of our social psychology, and Apple
               | is making a mistake by ignoring that. This isn't about
               | people not trusting Apple. It's about people not feeling
               | trusted _by_ Apple.
               | 
               | Perhaps that counts as deeper, but there isn't much to
               | say about it - it's not a loud sounding insight, and so
               | all of the other arguments get built up as a way of
               | amplifying that signal.
        
               | zepto wrote:
               | > but as far as I can tell they are indistinguishable
               | from old-fashioned people-being-wrong-on-the-internet. If
               | you expect an internet forum to be immune from
               | that...well, that's too much to expect.
               | 
               | Also, this is a little too glib. I'm not talking about
               | people being wrong or misinformed. I am talking about
               | people actively spreading misinformation. These _are_
               | distinguishable although I accept that they may be
               | impractical to moderate, hence my claim about no immune
               | system.
        
               | shuckles wrote:
               | I think the fact that no technical document has lasted on
               | the front page but various rumors have is a strong point
               | for my view. An article about internal dissent at Apple
               | has hung on for most of the day, yet almost no comments
               | on it engage with any of the key concepts in the article:
               | what does it mean for an 800 post slack thread to exist?
               | Why does it matter that the security employees don't seem
               | against the idea on the thread?
               | 
               | Similarly, I doubt anyone would be able to reliably match
               | the comments section of any of the dozen articles about
               | this announcement that have reached the front page in the
               | last 8 days with the actual article content.
               | 
               | These are all bad signs about the healthiness of
               | discussion.
        
               | dang wrote:
               | https://news.ycombinator.com/item?id=28173134 is #1 on
               | the front page right now.
               | 
               | Matthew Green and Alex Stamos both wrote things about
               | this that were on HN's front page for a long time. I'm
               | pretty sure there have been other technical threads as
               | well.
        
               | unityByFreedom wrote:
               | You're doing a bang up job man. It's very kind of you to
               | continue responding to these folks when they've taken
               | your attention rather off topic from your original
               | comment above. Other people's bad behavior doesn't excuse
               | your own, which also happens to be the Chinese version of
               | the golden rule: Ji Suo Bu Yu ,Wu Shi Yu Ren  (literally:
               | what you don't want, don't apply to others)
        
       | akomtu wrote:
       | This is Apple's trying to shift the focus onto technicalities.
       | Remotely monitored CCTV cams in your house is never a good idea,
       | no matter what the treat model is.
        
       | xuki wrote:
       | What is the process to make sure the next management will honor
       | these promises?
        
         | notJim wrote:
         | Apple management changing would be a massive threat regardless
         | of these premises. Apple has gone to bat for encryption and
         | privacy to a greater degree than Google, for example, and
         | that's because of their current management. You should not in
         | any way assume that the next group will do the same!
        
         | hypothesis wrote:
         | Heck, what is the process for _current_ management to honor
         | them?
        
           | xuki wrote:
           | Weirdly enough, I trust the current management to do the
           | right thing. But that can be changed at a whim, and in my
           | opinion, the greatest security threat to this whole thing.
        
             | himaraya wrote:
             | I mean, I trusted current management not to pull stunts
             | like this to begin with.
        
       | 1f60c wrote:
       | Having read the document, I am even more confident in Apple's
       | approach than before. I commend Apple for its transparency.
        
       | almostdigital wrote:
       | > Apple will publish a Knowledge Base article containing a root
       | hash of the encrypted CSAM hash database included with each
       | version of every Apple operating system that supports the
       | feature. Additionally, users will be able to inspect the root
       | hash of the en- crypted database present on their device, and
       | compare it to the expected root hash in the Knowledge Base
       | article.
       | 
       | This is just security theater, they already sign the operating
       | system images where the database reside. And there is no way to
       | audit that the database is what they claim it is, doesn't contain
       | multiple databases that can be activated under certain
       | conditions, etc.
       | 
       | > This feature runs exclusively as part of the cloud storage
       | pipeline for images being up- loaded to iCloud Photos and cannot
       | act on any other image content on the device
       | 
       | Until a 1-line code change happens that hooks it into UIImage.
        
         | kemayo wrote:
         | > And there is no way to audit that the database is what they
         | claim it is, doesn't contain multiple databases that can be
         | activated under certain conditions, etc.
         | 
         | Although this is true, the same argument _already_ applies to
         | "your phone might be scanning all your photos and stealthily
         | uploading them" -- Apple having announced this program doesn't
         | seem to have changed the odds of that.
         | 
         | At some point you have to trust your OS vendor.
        
           | majormajor wrote:
           | If you're uploading to the cloud, you have to trust a lot
           | more than just your OS vendor (well, in the default case,
           | your OS vendor often == your cloud vendor, but the access is
           | a lot greater once the data is on the cloud).
           | 
           | And if your phone has the _capability_ to upload to the
           | cloud, then you have to trust your OS vendor to respect your
           | wish if you disable it, etc.
           | 
           | It's curious that _this_ is the particular breaking point on
           | the slope for people.
           | 
           | The "on device" aspect just makes it more immediate feeling,
           | I guess?
        
             | kemayo wrote:
             | Yeah, it's weird. Speaking purely personally, whether the
             | scanning happens immediately-before-upload on my phone or
             | immediately-after-upload in the cloud doesn't really make a
             | difference to me. But this is clearly not a universal
             | opinion.
             | 
             | The most-optimistic take on this I can see is that this
             | program could be the prelude to needing to trust less
             | people. If Apple can turn on e2e encryption for photos,
             | using this program as the PR shield from law enforcement to
             | be able to do it, that'd leave us having to _only_ trust
             | the OS vendor.
        
               | notJim wrote:
               | > Speaking purely personally, whether the scanning
               | happens immediately-before-upload on my phone or
               | immediately-after-upload in the cloud doesn't really make
               | a difference to me.
               | 
               | What I find interesting is that so many people find it
               | _worse_ to do it on device, because of the risk that they
               | do it to photos you don 't intend to upload. This is
               | clearly where Apple got caught off-guard, because to
               | them, on-device = private.
               | 
               | It seems like the issue is really the mixing of on-device
               | and off. People seem to be fine with on-device data that
               | stays on-device, and relatively fine with the idea that
               | Apple gets your content if you upload it to them. But
               | when they analyze the data on-device, and then upload the
               | results to the cloud, that really gets people.
        
               | shuckles wrote:
               | This seems like a necessary discussion to have in
               | preparation for widespread, default end to end
               | encryption.
        
               | ec109685 wrote:
               | Them adding encrypted hashes to photos you don't intend
               | to upload is pointless and not much of a threat given the
               | photo themselves are they. They don't do it, but it
               | doesn't feel like a huge risk.
        
               | rustymonday wrote:
               | Is this really surprising to you? I'm not trying to be
               | rude, but this is an enormous distinction. In today's
               | world, smartphones are basically an appendage of your
               | body. They should not work to potentially incriminate its
               | owner.
        
               | alwillis wrote:
               | _They should not work to potentially incriminate its
               | owner._
               | 
               | But that ship has long sailed, right?
               | 
               | Every packet that leaves a device potentially
               | incriminates its owner. Every access point and router is
               | a potential capture point.
        
               | rustymonday wrote:
               | When I use a web service, I expect my data to be
               | collected by the service, especially if it is free of
               | charge.
               | 
               | A device I own should not be allowed to collect and scan
               | my data without my permission.
        
               | alwillis wrote:
               | _A device I own should not be allowed to collect and scan
               | my data without my permission._
               | 
               | It's not scanning; it's creating a cryptographic safety
               | voucher for each photo you upload to iCloud Photos. And
               | unless you reach a threshold of 30 CSAM images, Apple
               | knows nothing about _any_ of your photos.
        
             | userbinator wrote:
             | Yes, you had to trust Apple, but the huge difference with
             | this new thing is that hiding behind CSAM gives them far
             | more (legally obligated, in fact --- because showing you
             | the images those hashes came from would be illegal)
             | plausible deniability and difficulty of verifying their
             | claims.
             | 
             | In other words, extracting the code and analysing it to
             | determine that it does do what you expect is, although not
             | _easy_ , still legal. But the source, the CSAM itself, is
             | illegal to possess, so you can't do that verification much
             | less publish the results. It is this effective legal moat
             | around those questioning the ultimate targets of this
             | system which people are worried about.
        
               | majormajor wrote:
               | Surely they could do their image matching against all
               | photos in iCloud without telling you in advance, and then
               | you'd be in exactly the same boat? Google was doing this
               | for email as early as 2014, for instance, with the same
               | concerns about its extensibility raised by the ACLU: http
               | s://www.theguardian.com/technology/2014/aug/04/google-
               | ch...
               | 
               | So in a world where Apple pushes you to set up icloud
               | photos by default, and can do whatever they want there,
               | and other platforms have been doing this sort of of thing
               | for years, it's a bit startling that "on device before
               | you upload" vs "on uploaded content" triggers far more
               | discontent?
               | 
               | Maybe it's that Apple announced it at all, vs doing it
               | relatively silently like the others? Apple has always had
               | access to every photo on your device, after all.
        
               | pseudalopex wrote:
               | It isn't startling people trust they can opt out of
               | iCloud photos.
        
               | kemayo wrote:
               | If you trust that you can opt out of iCloud Photos to
               | avoid server-side scanning, trusting that this on-device
               | scanning only happens as part of the iCloud Photos upload
               | process (with the only way it submits the reports being
               | as metadata attached to the photo-upload, as far as I can
               | tell) seems equivalent.
               | 
               | There's certainly a slippery-slope argument, where some
               | future update might change that scanning behavior. But
               | the system-as-currently-presented seems similarly
               | trustable.
        
               | pseudalopex wrote:
               | I trust Apple doesn't upload everyone's photos despite
               | opting out because it would be hard to hide.
        
               | kemayo wrote:
               | I bet it'd take a while. The initial sync for someone
               | with a large library is big, but just turning on upload
               | for new pictures is only a few megabytes a day. Depending
               | on how many pictures you take, of course. And if you're
               | caught, an anodyne "a bug in iCloud Photo sync was
               | causing increased data usage" statement and note in the
               | next iOS patch notes would have you covered.
               | 
               | And that's assuming they weren't actively hiding anything
               | by e.g. splitting them up into chunks that could be
               | slipped into legitimate traffic with Apple's servers.
        
             | himaraya wrote:
             | No, the threat model differs entirely. Local scanning
             | introduces a whole host of single points of failure,
             | including the 'independent auditor' & involuntary scans,
             | that risk the privacy & security of all local files on a
             | device. Cloud scanning largely precludes these potential
             | vulnerabilities.
        
               | majormajor wrote:
               | Your phone threat model should already include "the OS
               | author has full access to do whatever they want to
               | whatever data is on my phone, and can change what they do
               | any time they push out an update."
               | 
               | I don't think anyone's necessarily being too upset or
               | paranoid about THIS, but maybe everyone should also be a
               | little less trusting of every closed OS - macOS, Windows,
               | Android as provided by Google - that has root access too.
        
               | himaraya wrote:
               | Sure, but that doesn't change the fact that the
               | vulnerabilities with local scanning remain a significant
               | superset of cloud scanning's.
               | 
               | Apple has built iOS off user trust & goodwill, unlike
               | most other OSes.
        
               | shuckles wrote:
               | Cloud Scanning vulnerability: no transparency over data
               | use. On the phone, you can always confirm the contents of
               | what's added to the safety voucher's associated data. On
               | the cloud, anything about your photos is fair game.
               | 
               | Where does that fit in your set intersection?
        
               | himaraya wrote:
               | > On the phone, you can always confirm the contents of
               | what's added to the safety voucher's associated data.
               | 
               | ...except you can't? Not sure where these assumptions
               | come from.
        
               | ec109685 wrote:
               | It's code running your device is the point, so while
               | "you" doesn't include everyone, it does include people
               | who will verify this to a greater extent than if done on
               | cloud.
        
               | lttlrck wrote:
               | It differs, but iOS already scans images locally and we
               | really don't know what they do with the meta data, and
               | what "hidden" categories there are.
        
               | himaraya wrote:
               | Yes, exactly why Apple breaching user trust matters.
        
               | shuckles wrote:
               | And how is telling you in great detail about what they're
               | planning to do months before they do it and giving you a
               | way to opt out in advance a breach of trust? What more
               | did you expect from them?
        
               | himaraya wrote:
               | You forgot, 'after it leaked'
        
               | shuckles wrote:
               | It's almost certain the "leak" was from someone they had
               | pre-briefed prior to a launch. You don't put together 80+
               | pages of technical documentation with multiple expert
               | testimony in 16 hours.
        
               | himaraya wrote:
               | 'Almost certain'? Have you heard of contingency planning?
        
               | stale2002 wrote:
               | > What more did you expect from them?
               | 
               | Well they could not do it.
        
               | shuckles wrote:
               | You might prefer that, but it doesn't violate your
               | privacy for them to prefer a different strategy.
        
               | stale2002 wrote:
               | why even ask the question " What more did you expect from
               | them?" if you didn't care about the answer?
               | 
               | I gave a pretty obvious and clear answer to that, and
               | apparently you didn't care about the question in the
               | first place, and have now misdirected to something else.
               | 
               | I am also not sure what possible definition of "privacy"
               | that you could be using, that would not include things
               | such as on device photo scanning, for the purpose of
               | reporting people to the police.
               | 
               | Like, lets say it wasn't Apple doing this. Lets say it
               | was the government. As in, the government required every
               | computer that you own, to be monitored for certain
               | photos, at which point the info would be sent to them,
               | and they would arrest you.
               | 
               | Without a warrant.
               | 
               | Surely, you'd agree that this violates people's privacy?
               | The only difference in this case, is that the government
               | now gets to side step 4th amendment protections, by
               | having a company do it instead.
        
               | shuckles wrote:
               | My question was directed at someone who claimed their
               | privacy was violated, and I asked them to explain how
               | they would've liked their service provider to handle a
               | difference in opinion about what to build in the future.
               | I don't think your comment clarifies that.
        
               | stale2002 wrote:
               | > how they would've liked their service provider to
               | handle a difference in opinion about what to build in the
               | future
               | 
               | And the answer is that they shouldn't implement things
               | that violate people's privacy, such as things that would
               | be illegal for the government to do without a warrant.
               | 
               | That is the answer. If it is something that the
               | government would need a warrant for, then they shouldn't
               | do it, and doing it would violate people's privacy.
        
               | shuckles wrote:
               | What's the difference between hybrid cloud/local scanning
               | "due to a bug" checking all your files and uploading too
               | many safety vouchers and cloud scanning "due to a bug"
               | uploading all your files and checking them there?
        
               | himaraya wrote:
               | ...because cloud uploads require explicit user consent,
               | practically speaking? Apple's system requires none.
        
               | shuckles wrote:
               | I think the mods should consider shadow banning you. Your
               | comments make little sense.
        
               | himaraya wrote:
               | Ditto, too bad you got flagged earlier
        
               | kemayo wrote:
               | Wouldn't both of those scenarios imply that the "bug" is
               | bypassing any normal user consent? They're only
               | practically different in that the "upload them all for
               | cloud-scanning" one would take longer and use more
               | bandwidth, but I suspect very few people would notice.
        
               | [deleted]
        
               | himaraya wrote:
               | I think the difference lies in the visibility of each
               | system in typical use. Apple's local scanning remains
               | invisible to the user, in contrast to cloud uploading.
        
           | Veserv wrote:
           | Yes, they can technically already do so, but that is not the
           | question. The question is what can they _legally_ do and
           | justify with high confidence in the event of a legal
           | challenge.
           | 
           | Changes to binding contractual terms that allow broad
           | readings and provide legal justification for future overreach
           | are dangerous. If they really are serious that they are going
           | to use these new features in a highly limited way then they
           | can put their money where their mouth is and add _legally
           | binding_ contractual terms that limit what they can do with
           | serious consequences if they are found to be in breach. Non-
           | binding marketing PR assurances that they will not abuse
           | their contractually justified powers are no substitute for
           | the iron fist of legal penalty clause.
        
           | m4rtink wrote:
           | What about trust-but-verify ?
           | 
           | If the OS was open source and supported reproducible builds,
           | you would not have to trust them, you could verify what it
           | actually does & make sure the signed binaries they ship you
           | actually correspond to the source code.
           | 
           | Once kinda wonders what they want to hide if they talks so
           | much about user privacy yet don't provide any means for users
           | to verify their claims.
        
           | almostdigital wrote:
           | Yeah that's true, although to do some sort of mass scanning
           | stealthily they would need a system exactly like what they
           | built with this, if they tried to upload everything for
           | scanning the data use would be enormous and give it away.
           | 
           | I guess it comes down to that I don't trust an OS vendor that
           | ships an A.I. based snitch program that they promise will be
           | dormant.
        
             | kemayo wrote:
             | Speaking cynically, I think that them having announced this
             | program like they did makes it less likely that they have
             | any sort of nefarious plans for it. There's a _lot_ of
             | attention being paid to it now, and it 's on everyone's
             | radar going forwards. If they actually wanted to be sneaky,
             | we wouldn't have known about this for ages.
        
               | theonlybutlet wrote:
               | They'd have to be transparent about it as someone would
               | easily figure it out.You have no way of verifying the
               | contents of that hash database. once the infrastructure
               | is in place (i.e. on your phone) it's a lot easier to
               | expand on it. People have short memories and are easily
               | desensitized, after a year or two of this, everyone will
               | forget and we'll be in uproar about it expanding to
               | include this or that...
        
               | smichel17 wrote:
               | You're making the mistake of anthropomorphizing a
               | corporation. Past a certain size, corporations start
               | behaving less like people and more like computers, or
               | maybe profit-maximizing sociopaths. The intent doesn't
               | matter, because 5 or 10 years down the line, it'll likely
               | be a totally different set of people making the decision.
               | If you want to predict a corporation's behavior, you need
               | to look at the constants (or at least, slower-changing
               | things), like incentives, legal/technical limitations,
               | and internal culture/structure of decision-making (e.g.
               | How much agency do individual humans have?).
        
               | kemayo wrote:
               | I feel that I _was_ stating the incentives, though.
               | 
               | This being an area people are paying attention to makes
               | it less likely they'll do unpopular things involving it,
               | from a pure "we like good PR and profits" standpoint.
               | They might sneak these things in elsewhere, but this
               | specific on-device-scanning program has been shown to be
               | a risk even at its current anodyne level.
        
             | shuckles wrote:
             | No they wouldn't need a system like this. They already
             | escrow all your iCloud Backups, and doing the scanning
             | server side allows you to avoid any scrutiny through code
             | or network monitoring.
        
           | g42gregory wrote:
           | > At some point you have to trust your OS vendor.
           | 
           | Yes, and we were trusting Apple. And now this trust is going
           | away.
        
             | rootusrootus wrote:
             | > now this trust is going away
             | 
             | Is it really? There are some very loud voices making their
             | discontent felt. But what does the Venn diagram look like
             | between 'people who are loudly condemning Apple for this'
             | and 'people who were vehemently anti-Apple to begin with'?
             | 
             | My trust was shaken a bit, but the more I hear about the
             | technology they've implemented, the more comfortable I am
             | with it. And frankly, I'm far more worried about gov't
             | policy than I am about the technical details. We can't fix
             | policy with tech.
        
               | matwood wrote:
               | > I'm far more worried about gov't policy than I am about
               | the technical details. We can't fix policy with tech.
               | 
               | Yeah. I don't really understand the tech utopia feeling
               | that Apple could simply turn on e2ee and ignore any
               | future legislation to ban e2ee. The policy winds are
               | clearly blowing towards limiting encryption in some
               | fashion. Maybe this whole event will get people to pay
               | more attention to policy...maybe.
        
           | ianmiers wrote:
           | What happens if someone tries to coerce Apple into writing
           | backdoor code? Engineers at Apple could resist, resign, slow
           | roll the design and engineering process. They could leak it
           | and it would get killed. Things would have to get very very
           | bad for that kind of pressure to work.
           | 
           | On the other hand, once Apple has written a backdoor
           | enthusiastically themselves, it's a lot easier to force
           | someone to change how it can be used. The changes are small
           | and compliance can be immediately verified and refusal
           | punished. To take it to its logical extreme: you cannot
           | really fire or execute people who delay something (especially
           | if you lake the expertise to tell how long it should take).
           | But you can fire or execute people who refuse to flip a
           | switch.
           | 
           | This technology deeply erodes Apple and its engineers'
           | ability to resist future pressure. And the important bit here
           | is there adversary isn't all powerful. It can coerce you to
           | do things in secret, but its power isn't unlimited. See what
           | happened with yahoo.[0]
           | 
           | https://www.reuters.com/article/us-yahoo-nsa-
           | exclusive/exclu...
        
           | slg wrote:
           | Which is why I am confused by a lot of this backlash. Apple
           | already controls the hardware, software, and services. I
           | don't see why it really matters where in that chain the
           | scanning is done when they control the entire system. If
           | Apple can't be trusted with this control today, why did
           | people trust them with this control a week ago?
        
             | conradev wrote:
             | Yeah. I will say, though, I am happy that people are having
             | the uncomfortable realization that they have very little
             | control over what their iPhone does.
        
               | rootusrootus wrote:
               | Now we just need everyone to have that same realization
               | about almost all the software we use on almost all the
               | devices we own. As a practical matter 99.99% of us
               | operate on trust.
        
               | hypothesis wrote:
               | Remember that emission cheating scandal? Where we
               | supposedly had a system in place to detect bad actors and
               | yet it was detected by a rare case of some curious
               | student exploring how things work or some such.
        
             | [deleted]
        
             | XorNot wrote:
             | Because people (HN especially) overestimate how easy it is
             | to develop and deploy "stealth" software that is never
             | detected in a broad way.
             | 
             | The best covert exfiltration is when you can hit individual
             | devices in a crowd, so people already have no reason to be
             | suspicious. But you're still leaving tracks - connections,
             | packet sizes etc. if you actually want to do anything and
             | you only need to get caught _once_ for the game to be up.
             | 
             | This on the other hand is essentially the perfect channel
             | for its type of surveillance...because it _is_ a covert
             | surveillance channel! Everyone is being told to expect it
             | to exist, that it 's normal, and that it will receive
             | frequent updates. No longer is their a danger a security
             | researcher will discover it, its "meant" to be there.
        
             | nabakin wrote:
             | This is the conclusion that I personally arrived at. When I
             | confronted some friends of mine with this, they gave me
             | some good points to the contrary.
             | 
             | Letting Apple scan for government-related material on your
             | device is a slippery slope to government surveillance and a
             | hard line should be drawn here. Today, Apple may only be
             | scanning your device for CSAM material for iCloud.
             | Tomorrow, Apple may implement similar scanning elsewhere,
             | gradually expand the scope of scanning to other types of
             | content and across the entire device, implement similar
             | government-enforcing processes on the device, and so on.
             | It's not a good direction for Apple to be taking,
             | regardless of how it works in this particular case. A
             | user's device is their own personal device and anything
             | that inches toward government surveillance on that device,
             | should be stopped.
             | 
             | Another point made was that government surveillance never
             | happens overnight. It is always very gradual. People don't
             | mean to let government surveillance happen and yet it does
             | because little things like this evolve. It's better to stop
             | potential government surveillance in its tracks right now.
        
             | zwily wrote:
             | That's where I'm at. They could have just started doing
             | this without even saying anything at all.
        
               | jimworm wrote:
               | But in that case they would eventually be caught red-
               | handed and won't get to do the "for the children" spiel
               | and get it swept under the rug like it's about to be.
        
               | bredren wrote:
               | The goal is not for it to be swept under the rug. The
               | goal is for it to deflect concerns over the coming
               | Privacy Relay service.
        
               | heavyset_go wrote:
               | The government cares far more about other things than
               | CSAM, like terrorism, human and drug trafficking,
               | organized crime, and fraud. Unless the CSAM detection
               | system is going to start detecting those other things and
               | report them to authorities, as well, it won't deflect any
               | concerns over encryption or VPNs.
        
               | threatofrain wrote:
               | Their private relay service appears orthogonal to CSAM...
               | it won't make criminals and child abusers easier or
               | harder to catch, and it doesn't affect how people use
               | their iCloud Photos storage.
        
               | bredren wrote:
               | These people are commonly prosecuted using evidence that
               | includes server logs that showing their static IP
               | address.
               | 
               | Read evidence from past trials it is obvious. See also
               | successful and failed attempts to subpoena this info from
               | VPN services.
               | 
               | Only people with iCloud will be using the relay.
               | 
               | It is true on the surface the photos is disconnected from
               | the use. However, Apple only needs a solid answer that
               | handles the bad optics of what you can do with the Tor-
               | like anonymity of iCloud Privacy Relay.
               | 
               | However, if you look more closely, the CSAM service and
               | its implementation are crafted exactly around the
               | introduction of the relay.
        
             | matwood wrote:
             | Agreed. I think this just shined a spotlight for a lot of
             | people who didn't really think about how much they had to
             | trust Apple.
        
             | unityByFreedom wrote:
             | They risk a whistleblower if they don't announce this news
             | while implementing that in a country with a free press.
             | 
             | It's better to be forthright, or you risk your valuation on
             | the whims of a single employee.
        
         | notJim wrote:
         | > Until a 1-line code change happens that hooks it into
         | UIImage.
         | 
         | I really don't understand this view. You are using proprietary
         | software, you are always an N-line change away from someone
         | doing something you don't like. This situation doesn't change
         | this.
         | 
         | If you only use open source software and advocate for others to
         | do the same, I would understand it more.
        
           | ipv6ipv4 wrote:
           | Did you verify all the binaries that you run are from
           | compiled source code that you audited? Your BIOS? What about
           | your CPU and GPU firmware?
           | 
           | There is always a chain of trust that you end up depending
           | on. OSS is not a panacea here.
        
             | simondotau wrote:
             | It's not a panacea but the most implausible the mechanism,
             | the less likely it's going to be used on anyone but the
             | most high value targets.
             | 
             | (And besides, it's far more likely that this nefarious
             | Government agency will just conceal a camera in your room
             | to capture your fingers entering your passwords.)
        
           | almostdigital wrote:
           | > I really don't understand this view. You are using
           | proprietary software, you are always an N-line change away
           | from someone doing something you don't like. This situation
           | doesn't change this.
           | 
           | And I don't understand why it has to be black and white, I
           | think the N is very important in this formula and if it is
           | low that is a cause for concern. Like an enemy building a
           | missile silo on an island just off your coast but promising
           | it's just for defense.
           | 
           | All arguments I see is along the lines of "Apple can
           | technically do anything they want anyways so this doesn't
           | matter". But maybe you're right and moving to FOSS is the
           | only solution long-term, that's what I'm doing if Apple goes
           | through with this.
        
             | fay59 wrote:
             | I'd leave this one out to the lawyers. I'm not one but I
             | don't think that the court will evaluate the number of
             | lines of code required for help.
        
             | notJim wrote:
             | The size of N doesn't really matter. I'm sure Apple ships
             | large PRs in every release, as any software company does.
        
               | almostdigital wrote:
               | Maybe not if you assume Apple is evil but for the case of
               | Apple being good intentioned but having its hand forced,
               | they will have a much harder time resisting a 1 line
               | change than a mandate to spend years to develop a
               | surveillance system
        
         | jolux wrote:
         | > And there is no way to audit that the database is what they
         | claim it is, doesn't contain multiple databases that can be
         | activated under certain conditions, etc.
         | 
         | They describe a process for third parties to audit that the
         | database was produced correctly.
        
           | LexGray wrote:
           | Do we have any idea how the NCMEC database is curated? Are
           | there cartoons from Hustler depicting underage girls in
           | distress? Green text stories stating they are true about
           | illegal sexual acts? CGI images of pre-pubescent looking
           | mythical creatures? Manga/Anime images which are sold on the
           | Apple Store? Legitimate artistic images from books currently
           | sold? Images of Winnie the Pooh the government has declared
           | pornographic? From the amount of material the Feds claim is
           | being generated every year I would have to guess all of this
           | is included. The multi-government clause is completely
           | pointless with the five-eyes cooperation.
           | 
           | The story here is that there is a black box of pictures.
           | Apple will then use their own black box of undeclared rules
           | to pass things along to the feds which they have not shared
           | what would be considered offending in any way shape or form
           | other than "we will know it when we see it". Part of the
           | issue here is that Apple is taking the role of a moral
           | authority. Traditionally Apple has been incredibly anti-
           | pornography and I suspect that anything that managed to get
           | into the database will be something Apple will just pass
           | along.
        
             | jolux wrote:
             | Apple is manually reviewing every case to ensure it's CSAM.
             | You do have to trust them on that.
             | 
             | But if your problem is with NCMEC, you've got a problem
             | with Facebook and Google who are already doing this too.
             | And you can't go to jail for possessing adult pornography.
             | So even if you assume adult porn images are in the
             | database, and Apple's reviewers decide to forward them to
             | NCMEC, you would still not be able to be prosecuted, at
             | least in the US. Ditto for pictures of Winnie the Pooh. But
             | for the rest of what you describe, simulated child
             | pornography is already legally dicey as far as I know, so
             | you can't really blame Apple or NCMEC for that.
        
               | laserlight wrote:
               | > You do have to trust them on that.
               | 
               | If this system didn't exist, nobody would have to trust
               | Apple.
               | 
               | > you would still not be able to be prosecuted
               | 
               | But I wouldn't want to deal with a frivolous lawsuit, or
               | have a record in the social media of being brought CSA
               | charges.
        
               | LexGray wrote:
               | Facebook I completely approve of. You are trafficking
               | data at that point if you are posting it. I just recall
               | the days of Usenet and Napster when I would just download
               | at random and sometimes the evil would mislabel things to
               | cause trauma. I do not download things at random any more
               | but when I was that age it would have been far more
               | appropriate to notify my parents then it would be to
               | notify the government.
               | 
               | In any case it is likely the government would try to
               | negotiate a plea to get you into some predator database
               | to help fill the law enforcement coffers even if they
               | have no lawful case to take it to court once they have
               | your name in their hands.
        
               | jlokier wrote:
               | > Ditto for pictures of Winnie the Pooh.
               | 
               | References to Winnie the Pooh in these discussions are
               | about China, where images of Winnie are deemed to be
               | coded political messages and are censored.
               | 
               | The concern is that Apple are building a system that is
               | ostensibly about CSAM, and that some countries such as
               | China will then leverage their power to force Apple to
               | include whatever political imagery in the database as
               | well. Giving the government _there_ the ability to home
               | in on who is passing around those kinds of images in
               | quantity.
               | 
               | If that seems a long way indeed from CSAM, consider
               | something more likely to fit under that heading by local
               | government standards. There's a country today, you may
               | have heard of, one the USA is busy evacuating its
               | personnel from to leave the population to an awful fate,
               | where "female teenagers in a secret school not wearing a
               | burqa" may be deemed by the new authorities to be
               | sexually titillating, inappropriate and illegal, and if
               | they find out who is sharing those images, punishments
               | are much worse than mere prison. Sadly there are a
               | plethora of countries that are very controlling of
               | females of all ages.
        
               | somebodythere wrote:
               | Drawings are prosecutable in many countries including
               | Canada, the UK, and Australia. Also, iCloud sync is
               | enabled by default when you set up your device, whereas
               | the Facebook app at least is sandboxed and you have to
               | choose to upload your photos.
        
         | shuckles wrote:
         | Apple shipped iCloud Private Relay which is a "1-line code
         | change that hooks into CFNetwork" away from MITMing all your
         | network connections, by this standard.
        
           | almostdigital wrote:
           | For me the standard is that I don't want any 1-line code
           | change between me and near-perfect Orwellian surveillance.
        
             | shuckles wrote:
             | Since your one-liners seem to be immensely dense with
             | functional changes, I can't understand how you trust any
             | software.
        
           | ec109685 wrote:
           | Any connection worth its salt should be TLS protected.
        
             | shuckles wrote:
             | Also in CFNetwork. Probably a one line change to replace
             | all session keys with an Apple generated symmetric key.
        
       | [deleted]
        
       | lucasyvas wrote:
       | If they continue down this road, they will not only have
       | effectively created the smartphone category, but also
       | coincidentally destroyed it.
       | 
       | If our devices are designed to spy on us, we're frankly not even
       | going to use them anymore. I wonder if they forgot that using
       | this thing is optional? You'll see a resurgence of single purpose
       | dumb devices.
        
       | rootsudo wrote:
       | Truly, 1984. Siri now gives you suggestions if you have dangerous
       | thoughts.
       | 
       | "Anonymous helplines and guidance exist for adults with at-risk
       | thoughts and behavior" - "Learn more and get help."
       | 
       | Now a fake AI on your phone that knows your actions and behaviors
       | can sleep and wait for specific triggers and give you feedback
       | and alert.
        
       | Jerry2 wrote:
       | What strikes me about this paper is that there are no names of
       | the people who wrote it on the title page. Why were they afraid
       | to put their name(s) on this?
        
         | celeritascelery wrote:
         | Seeing HN over the last week, I can think of a few reasons...
        
           | stjohnswarts wrote:
           | Yeah seeing as its just a white paper there really isn't much
           | need, and the possibility of retribution is immense on this
           | one. I know a lot of pissed off people, let alone some of the
           | nuts who this is angering and who could dox or do worse to
           | the authors.
        
       | chipotle_coyote wrote:
       | In other HN comments on this subject I've (hopefully) made it
       | clear that I'm not really in favor of this project of Apple's,
       | and that there's a legitimate "slippery slope" argument to be
       | made here. So I hope people will entertain a contrarian question
       | without downvoting me into oblivion. :)
       | 
       | Here's the thing I keep circling around: assume that bad actors,
       | government or otherwise, want to target political dissidents
       | using internet-enabled smartphones. The more we learn about the
       | way Apple actually implemented this technology, the less likely
       | it seems that it would make it radically easier for those bad
       | actors to do so. For instance, the "it only scans photos uploaded
       | to iCloud" element isn't just an arbitrary limitation that can be
       | flipped with one line of code, as some folks seem to think; as
       | Erik Neuenschwander, head of Privacy Engineering at Apple,
       | explained in an interview on TechCrunch[1]:
       | 
       | > Our system involves both an on-device component where the
       | voucher is created, but nothing is learned, and a server-side
       | component, which is where that voucher is sent along with data
       | coming to Apple service and processed across the account to learn
       | if there are collections of illegal CSAM. That means that it is a
       | service feature.
       | 
       | Will this stop those bad actors if they're determined? No, of
       | course not, but there are _so many ways_ they can do it already.
       | They 'll get cloud storage providers to give them access. If they
       | can't, they'll get _network_ providers to give them access. And
       | if those bad actors are, as many people fear, government actors,
       | then they have tools far more potent than code: they have laws.
       | It was illegal to export  "strong encryption" for many years,
       | remember? I've seen multiple reports that European lawmakers are
       | planning to require some kind of scanning for CSAM. If this goes
       | into effect, technology isn't going to block those laws for you.
       | Your Purism phone will either be forced to comply or be illegal.
       | 
       | I wrote in a previous comment on this that one of Silicon
       | Valley's original sins is that we tend to treat all problems as
       | if they're engineering problems. Apple is treating CSAM as an
       | engineering problem. Most of the discussion on HN about how
       | horrible and wrong Apple is here _still_ treats it as an
       | engineering problem, though: well, you can get around this by
       | just turning off iCloud Photos or never using Apple software or
       | throwing your iPhone in the nearest lake and switching to
       | Android, but only the _right_ kind of Android, or maybe just
       | never doing anything with computers again, which admittedly will
       | probably be effective.
       | 
       | Yet at the end of the day, this _isn 't_ an engineering problem.
       | It's a policy problem. It's a governance problem. In the long
       | run, we solve this, at least in liberal democracies, by voting
       | people into office who understand technology, understand the
       | value of personal encryption, and last but certainly not least,
       | understand the value of, well, liberal democracy. I know that's
       | easy to dismiss as Pollyannaism, but "we need to protect
       | ourselves from our own government" has a pretty dismal track
       | record historically. The entire point of having a liberal
       | democracy is that _we are the government,_ and we can pull it
       | back from authoritarianism.
       | 
       | The one thing that Apple is absolutely right about is that
       | expanding what those hashes check for is a policy decision. Maybe
       | where those hashes get checked isn't really what we need to be
       | arguing about.
       | 
       | [1]: https://techcrunch.com/2021/08/10/interview-apples-head-
       | of-p...
        
         | new299 wrote:
         | > there's a legitimate "slippery slope" argument
         | 
         | The slippery slope argument is the only useful argument here.
         | 
         | The fundamental issue with their PSI/CSAM system is that they
         | already were scanning iCloud content [1] and that they're
         | seemingly not removing the ability to do that. If the PSI/CSAM
         | system had been announced along side E2E encryption for iCloud
         | backups, it would be clear that they were attempting to act in
         | their users best interests.
         | 
         | But it wasn't, so as it stands there's no obvious user benefit.
         | It then becomes a question of trust. Apple are clearly willing
         | to add functionality at the request of governments, on device.
         | By doing this they lose user trust (in my opinion).
         | 
         | > In the long run, we solve this, at least in liberal
         | democracies, by voting people into office who understand
         | technology, understand the value of personal encryption
         | 
         | But this is clearly not the way it happens in practice. At
         | least in part, we vote with our wallets and give money to
         | companies willing to push back on governmental over-reach.
         | Until now, Apple was one such company [2].
         | 
         | I realize that Apple likely don't "care" about privacy (it's a
         | company, not a individual human). But in a purely cynical
         | sense, positioning themselves as caring about privacy, and
         | pushing back against governmental over-reach on users behalf
         | was useful. And while it's "just marketing" it benefits users.
         | 
         | By implementing this functionality, they've lost this
         | "marketing benefit". Users can't buy devices believing they're
         | supporting a company willing to defend their privacy.
         | 
         | [1] https://nakedsecurity.sophos.com/2020/01/09/apples-
         | scanning-...
         | 
         | [2] https://epic.org/amicus/crypto/apple/
        
           | dwaite wrote:
           | > The fundamental issue with their PSI/CSAM system is that
           | they already were scanning iCloud content
           | 
           | This scanning was of email attachments being sent through an
           | iCloud-hosted account, not of other iCloud hosted data (which
           | is encrypted during operation.)
        
             | cm2187 wrote:
             | If it's encrypted but apple has the key, it's not encrypted
             | to them.
        
             | BostonEnginerd wrote:
             | I don't think that photos are encrypted as you can view
             | them from the iCloud website.
        
             | new299 wrote:
             | Do you have a public reference for this?
        
           | shuckles wrote:
           | > This is an area we've been looking at for some time,
           | including current state of the art techniques which mostly
           | involves scanning through entire contents of users' libraries
           | on cloud services that -- as you point out -- isn't something
           | that we've ever done; to look through users' iCloud Photos.
           | 
           | https://techcrunch.com/2021/08/10/interview-apples-head-
           | of-p...
           | 
           | > This moment calls for public discussion, and we want our
           | customers and people around the country to understand what is
           | at stake.
           | 
           | - Tim Cook, Apple
           | 
           | At what point does the fact that half a decade has passed
           | since those words were written yet the hacker community has
           | made little contribution to that discourse about the
           | importance of privacy start implicating us in the collective
           | failure to act?
        
             | new299 wrote:
             | > The voucher generation is actually exactly what enables
             | us not to have to begin processing all users' content on
             | our servers, which we've never done for iCloud Photos.
             | 
             | I really dislike this statement. It's likely designed to be
             | "technically true". But it's been reported elsewhere that
             | they do scan iCloud content:
             | 
             | https://nakedsecurity.sophos.com/2020/01/09/apples-
             | scanning-...
             | 
             | Perhaps they scan as the data is being ingested. Perhaps
             | it's scanned on a third party server. But it seems clear
             | that it is being scanned.
        
               | dwaite wrote:
               | https://www.forbes.com/sites/thomasbrewster/2020/02/11/ho
               | w-a...
               | 
               | My interpretation is Sophos got it wrong (they don't give
               | a quote from the Apple Officer involved and manage to
               | have a typo in the headline).
               | 
               | Apple does scanning of data which is not encrypted, such
               | as received and sent email over SMTP. They presumably at
               | that time were using PhotoDNA to scan attachments by
               | hash. This is likely what Apple was actually talking
               | about back at CES 2020.
               | 
               | They may have been also scanning public iCloud photo
               | albums, but I haven't seen anyone discuss that one way or
               | another.
        
               | zepto wrote:
               | That link doesn't confirm that they have already been
               | doing it. Just that they changed an EULA.
        
               | shuckles wrote:
               | My understanding based on piecing together the various
               | poorly cited news stories is that Apple used to scan
               | iCloud Mail for this material, and that's it.
        
               | new299 wrote:
               | If you have references to also help me piece this
               | together I'd find that really helpful.
        
               | selsta wrote:
               | > Last year, for instance, Apple reported 265 cases to
               | the National Center for Missing & Exploited Children,
               | while Facebook reported 20.3 million
               | 
               | According to [1] it does seem like Apple didn't do any
               | wide scale scanning of iCloud Data.
               | 
               | [1] https://www.nytimes.com/2021/08/05/technology/apple-
               | iphones-...
        
               | new299 wrote:
               | If they weren't doing any scanning why would they find
               | any to report? The data is encrypted as rest so... why
               | would they find any to report. This clearly doesn't
               | include search requests [1].
               | 
               | iCloud has perhaps 25% of the users of Facebook. Of that
               | 25% it's not clear how many actively use the platform of
               | backups/photos. iCloud is not a platform for sharing
               | content like Facebook. So how many reports should we
               | expect to see from Apple? It's unclear to me.
               | 
               | So, I'm not saying the number isn't suspiciously low. But
               | it doesn't really clarify what's going on to me...
               | 
               | [1] https://www.apple.com/legal/transparency/pdf/requests
               | -2018-H...
        
               | shuckles wrote:
               | Forbes had the only evidence based reporting where they
               | cited a court case where Apple automatically detected
               | known CSAM in attachments to iCloud Mail: https://www.for
               | bes.com/sites/thomasbrewster/2020/02/11/how-a...
               | 
               | Everyone else is making inferences from a chance to their
               | privacy policy and a statement made by a company lawyer.
        
             | hypothesis wrote:
             | And yet we can go e.g. on Twitter and observe comments from
             | relevant security researchers that appear to describe a
             | _chilling_ atmosphere surrounding privacy research,
             | including resistance to even consider white paper.
             | 
             | That's not my area of expertise and don't know how to fix
             | that, but that should be an important consideration.
        
           | nicce wrote:
           | > If the PSI/CSAM system had been announced along side E2E
           | encryption for iCloud backups, it would be clear that they
           | were attempting to act in their users best interests.
           | 
           | Most likely because the only way this announcement makes
           | sense, is that they tried to respond for misleading leaks.
           | We'll see on September probably some E2EE announcements,
           | since iOS 15 beta supports tokens for backup recovery. At
           | least, let's hope so.
        
           | chipotle_coyote wrote:
           | > If the PSI/CSAM system had been announced along side E2E
           | encryption for iCloud backups, it would be clear that they
           | were attempting to act in their users best interests.
           | 
           | Absolutely. I don't know whether there's a reason for this
           | timing (that is, if they are planning E2E encryption, why
           | they announced this first), but this is probably the biggest
           | PR bungle Apple has had since "you're holding it wrong," if
           | not ever.
           | 
           | > Apple are clearly willing to add functionality at the
           | request of governments, on device.
           | 
           | Maybe? I'm not as willing to state that quite as
           | definitively, given the pushback Apple gave in the San
           | Bernardino shooter case. Some of what their Privacy
           | Engineering head said in that TechCrunch article suggests
           | that Apple has engineered this to be strategically awkward,
           | e.g., generating the hashes by using ML trained on the CSAM
           | data set (so the hashing system isn't as effective on _other_
           | data sets) and making the on-device hashing component part of
           | the operating system itself rather than a separately
           | updatable data set. That in turn suggests to me Apple is
           | still looking for an engineering way to say  "no" if they're
           | asked "hey, can you just add these other images to your data
           | set." (Of course, my contention that this is not ultimately
           | an engineering problem applies here, too: even if I'm right
           | about Apple playing an engineering shell game here, I'm not
           | convinced it's enough if a government is sufficiently
           | insistent.)
           | 
           | A minor interesting tidbit: your linked Sophos story is based
           | on a Telegraph UK story that has this disclaimer at the
           | bottom:
           | 
           | > This story originally said Apple screens photos when they
           | are uploaded to iCloud, Apple's cloud storage service. Ms
           | Horvath and Apple's disclaimer did not mention iCloud, and
           | the company has not specified how it screens material, saying
           | this information could help criminals.
           | 
           | It's hard to say what they were actually doing, but it's
           | reasonable to suspect it's an earlier, perhaps entirely
           | cloud-based rather than partially cloud-based, version of
           | NeuralHash.
        
             | new299 wrote:
             | Right, and in the interview linked [1] above they state:
             | 
             | > The voucher generation is actually exactly what enables
             | us not to have to begin processing all users' content on
             | our servers, which we've never done for iCloud Photos.
             | 
             | But they do appear to do "something" server-side. It's
             | possible that all data in scanned as it is ingested for
             | example. I dislike this statement, because it's probably
             | technically correct but doesn't help clarify the situation
             | in a helpful way. It makes me trust Apple less.
             | 
             | [1] https://techcrunch.com/2021/08/10/interview-apples-
             | head-of-p...
        
               | cm2187 wrote:
               | What I don't understand is that if they announced they
               | would do that scanning server side, the only eyebrows
               | that would be raised is of people who thought they were
               | doing it already. It's not like if those pictures were
               | e2e encrypted. I still haven't seen any convincing
               | argument about why searching client side provide any
               | benefit to end users, while being a massive step in the
               | direction of privacy invasion.
        
               | dwaite wrote:
               | > But they do appear to do "something" server-side. It's
               | possible that all data in scanned as it is ingested for
               | example. I dislike this statement, because it's probably
               | technically correct but doesn't help clarify the
               | situation in a helpful way. It makes me trust Apple less.
               | 
               | The qualifier is "Photos" - different services have
               | different security properties.
               | 
               | Email transport is not E2E encrypted because there are no
               | interoperable technologies for that.
               | 
               | Other systems are encrypted but apple has a separate key
               | escrow system outside the cloud hosting for law
               | enforcement requests and other court orders (such as a
               | heir/estate wanting access).
               | 
               | Some like iCloud Keychain use more E2E approach where
               | access can't be restored if you lose all your devices and
               | paper recovery key.
               | 
               | iCloud Photo Sharing normally only works between AppleID
               | accounts, with the album keys being encrypted to the
               | account. However, you can choose to publicly share an
               | album, at which point it becomes accessible via a browser
               | on icloud.com. I have not heard Apple talking about
               | whether they scan photos today once they are marked
               | public (going forward, there would be no need).
               | 
               | FWIW this is all publicly documented, as well as what
               | information Apple can and can't provide to law
               | enforcement.
        
               | ec109685 wrote:
               | Even without publicly sharing your iCloud photos, they
               | are accessible on iCloud.com (e.g. you can see your
               | camera role there).
        
         | loceng wrote:
         | Could you not potentially infer a lot about a person if they
         | habe X photo content saved - via gathering that data elsewhere?
         | 
         | E.g. a simple one: anyone that may have saved a "winnie the
         | pooh" meme image?
         | 
         | And it's not like Apple's going to publish a list of images
         | that the system searched; though it'd be easy to differentiate
         | between child abuse content vs. other types of images - but
         | would it ever get out of Apple if someone went rogue or was
         | "experimenting" to say "gather stats" of how many people have X
         | image saved?
        
         | unityByFreedom wrote:
         | > Will this stop those bad actors if they're determined? No, of
         | course not, but there are so many ways they can do it already.
         | 
         | This is not a reason to let your guard down on security.
         | Keeping up with securing things against bad actors is a
         | constant battle. Tim Cook put it best [1] and I want to hear
         | how this is not exactly what he described 5 years ago.
         | 
         | [1] https://youtu.be/rQebmygKq7A?t=57
        
         | cannabis_sam wrote:
         | > The more we learn about the way Apple actually implemented
         | this technology, the less likely it seems that it would make it
         | radically easier for those bad actors to do so.
         | 
         | Then why isn't Apple pushing this angle?
        
         | ODILON_SATER wrote:
         | Speaking of mobile OS. I am a bit of a newbie myself in this
         | area. I am an Android user but I want to decouple from Google
         | as much as possible. Is there an mobile OS out there that
         | offers a similar experience to, say, Android in terms of
         | functionalities, apps, etc without the drawback of privacy
         | concerns?
        
           | juniperplant wrote:
           | Your best bets are GrapheneOS or CalyxOS. In both cases be
           | prepared to sacrifice a lot in terms of convenience (more
           | with GrapheneOS).
        
           | gary17the wrote:
           | The problem is that many Android apps require Google services
           | to function properly. You can try two Android derivatives:
           | CalyxOS[1] that implements a privacy-conscious subset of
           | Google services allowing many Android apps to work properly,
           | and GrapheneOS[2] that excludes Google services altogether at
           | the cost of lower app compatibility[3]. Both require using
           | Google Pixel hardware.
           | 
           | [1] https://calyxos.org/ [2] https://grapheneos.org/ [3]
           | "GrapheneOS vs CalyxOS ULTIMATE COMPARISON (Battery & Speed
           | Ft. Stock Android & iPhone)",
           | https://www.youtube.com/watch?v=7iS4leau088
        
             | sphinxcdi wrote:
             | > GrapheneOS[2] that excludes Google services altogether at
             | the cost of lower app compatibility[3].
             | 
             | It now has https://grapheneos.org/usage#sandboxed-play-
             | services providing broader app compatibility.
             | 
             | That video is quite misleading and it's not the best source
             | for accurate information about GrapheneOS.
        
               | gary17the wrote:
               | Thank you for pointing this out, I will start linking to
               | the updated material.
        
         | newbamboo wrote:
         | "never doing anything with computers again, which admittedly
         | will probably be effective."
         | 
         | Increasingly I believe this is the right and probably only
         | answer. How do we collectively accomplish this, that's the real
         | problem. It has to be financially and economically solved.
         | Political, social and asymmetric solutions won't work.
        
         | XorNot wrote:
         | > For instance, the "it only scans photos uploaded to iCloud"
         | element isn't just an arbitrary limitation that can be flipped
         | with one line of code, as some folks seem to think; as Erik
         | Neuenschwander, head of Privacy Engineering at Apple, explained
         | in an interview on TechCrunch[1]:
         | 
         | >> Our system involves both an on-device component where the
         | voucher is created, but nothing is learned, and a server-side
         | component, which is where that voucher is sent along with data
         | coming to Apple service and processed across the account to
         | learn if there are collections of illegal CSAM. That means that
         | it is a service feature.
         | 
         | The first paragraph does not follow from the detail in the
         | second at all. Setting aside how abstract the language is, what
         | about adding more complexity to the system is preventing Apple
         | from scanning other content?
         | 
         | This is all a misdirect: they're saying "look at how complex
         | this system is!" and pretending that, particularly when they
         | built and control the entire system including it's existence,
         | that any of that makes changing how it operates "difficult".
        
           | chipotle_coyote wrote:
           | The system isn't entirely on-device, but relies on uploading
           | data to Apple's servers. Hence, why I said that it's not an
           | arbitrary limitation that it only scans photos uploaded to
           | iCloud. The system literally has to upload enough images with
           | triggering "safety vouchers" to Apple to pass the reporting
           | threshold, and critical parts of that calculation are
           | happening on the server side.
           | 
           | I think what you're arguing is that Apple could still change
           | what's being scanned for, and, well, yes: but that doesn't
           | really affect my original point, which is that this is a
           | policy/legal issue. If you assume governments are bad actors,
           | then yes, they could pressure Apple to change this technology
           | to scan for other things -- but if this technology didn't
           | exist, they could just as easily pressure Apple to do it
           | _all_ on the servers. I think a lot of the anger comes from
           | _they shouldn 't be able to do any part of this work on my
           | device,_ and emotionally, I get that -- but technologically,
           | it's hard for me not to shake the impression that "what
           | amount happens on device vs. what amount happens on server"
           | is a form of bikeshedding.
        
         | [deleted]
        
         | [deleted]
        
         | pseudalopex wrote:
         | Lots of people have made policy arguments. No US law requires
         | client side scanning. No US law forbids E2E encryption. US
         | courts don't let law enforcement agencies just demand
         | everything they want from companies. Apple relied on that 5
         | years ago successfully.[1] And capitulating preemptively is bad
         | strategy usually.
         | 
         | What Neuenschwander said doesn't establish it isn't just an
         | arbitrary limitation.
         | 
         | [1] https://en.wikipedia.org/wiki/FBI-Apple_encryption_dispute
         | 
         | Where the hashes get checked is relevant to the policy problem
         | of what
        
           | Retric wrote:
           | That's different. The FBI can legally require Apple or any
           | other US company to search for specific files it has access
           | to on it's own servers because nothing currently shields
           | backup providers. They could and did force Apple to aid in
           | unlocking iPhones when Apple had that capacity. What they
           | couldn't do was "These orders would compel Apple to write new
           | software that would let the government bypass these devices'
           | security and unlock the phones."
           | 
           | Forcing companies to create back doors in their own is
           | legally a very different situation. As to why iCloud is
           | accessible by Apple, the point is to backup a phone someone
           | lost. Forcing people to keep some sort of key fob with a
           | secure private key safe in order to actually have access to
           | their backups simply isn't tenable.
        
             | mmastrac wrote:
             | Apple is a trillion dollar company with a lot of smart
             | people. You could probably get them to design a system of N
             | of M parts for recovery, or an apple branded key holder
             | that you can store in your bank vault and friends houses.
             | If they wanted to they'd do it.
        
               | sneak wrote:
               | More than that: Apple already designed and partially
               | implemented such a "trust circles" system.
               | 
               | Apple legal killed the feature, because of pressure from
               | the US government.
               | 
               | https://www.reuters.com/article/us-apple-fbi-icloud-
               | exclusiv...
               | 
               | They also run iCloud (mostly not e2e) on CCP-controlled
               | servers for users in China.
               | 
               | They can decrypt ~100% of iMessages in real-time due to
               | the way iCloud Backup (on by default, not e2e) escrows
               | iMessage sync keys.
               | 
               | Apple does not protect your mprivacy from Apple, or, by
               | extension, the governments that ultimately exert control
               | over Apple: China and the USA.
        
               | Retric wrote:
               | That's not what the article says, they simply don't know.
               | 
               | "However, a former Apple employee said it was possible
               | the encryption project was dropped for other reasons,
               | such as concern that more customers would find themselves
               | locked out of their data more often."
        
           | zepto wrote:
           | > And capitulating preemptively is bad strategy usually.
           | 
           | Why do you think they are doing it then?
        
           | heavyset_go wrote:
           | Each year, Apple gives up customer data on over 150,000 users
           | based on US government data requests, and NSL and FISA
           | requests[1].
           | 
           | The idea that Apple would fight this is a farce, as they
           | regularly give up customers' data without a fight when the
           | government requests it.
           | 
           | [1] https://www.apple.com/legal/transparency/us.html
        
             | alwillis wrote:
             | _The idea that Apple would fight this is a farce, as they
             | regularly give up customers ' data without a fight when the
             | government requests it._
             | 
             | There are laws regarding this, so they don't have a choice.
             | If they get a subpoena from a FISA court, there's not much
             | they can do, but that goes for every US-based company.
             | 
             | Whatever fighting is going on is behind the scenes, so we
             | wouldn't know about it.
        
           | chipotle_coyote wrote:
           | None of the laws do _yet._ My observation isn 't about the
           | laws as they necessarily exist now, just as the worry about
           | how this could be abused isn't about Apple's policy as it
           | exists now.
           | 
           | If we trust US courts to stop law enforcement agencies from
           | demanding everything they want from companies, they they can
           | stop law enforcement agencies from demanding Apple add non-
           | CSAM data to the NeuralHash set. If we _don 't_ trust the
           | courts to do that, then we're kind of back at square one,
           | right?
        
             | simondotau wrote:
             | I'm not American, but my understanding is that as soon as
             | Government is _forcing_ Apple to search our devices for
             | something, 4th Amendment protections apply. (Unless they
             | hold a search warrant for that specific person, of course.)
             | Is this not correct?
        
               | kevin_thibedeau wrote:
               | No. The 4A protections don't apply to third parties. This
               | is part of why the US has nearly nonexistent data
               | protection laws.
        
               | simondotau wrote:
               | If Apple was performing scans on their cloud servers,
               | you'd be absolutely right. But if the scanning is being
               | done on the individual's device, I'm not sure it's that
               | straightforward. The third party doctrine surely cannot
               | apply if the scanning is performed _prior to_ the
               | material being in third party hands.
               | 
               | Therefore if the Government forces Apple to change the
               | search parameters _contained within private devices,_ I
               | cannot see how this would work around the 4th Amendment.
               | 
               | If this is correct, it might be possible to argue that
               | Apple's approach has (for Americans) constitutional
               | safeguards which do not exist for on-cloud scanning
               | performed by Google or Microsoft.
        
               | zepto wrote:
               | This point was made in the economist today.
               | 
               | https://www.economist.com/united-
               | states/2021/08/12/a-38-year...
        
               | simondotau wrote:
               | I read the article; I don't think they highlighted this
               | specific point that on-device scanning has a potential,
               | hypothetical constitutional advantage in comparison to
               | Google, Microsoft and Facebook who scan exclusively in
               | the cloud.
        
               | zepto wrote:
               | They don't draw out the comparison, but they do mention
               | the protection.
        
               | heavyset_go wrote:
               | > _The 4A protections don 't apply to third parties._
               | 
               | The government can't pay someone to break into your house
               | and steal evidence they want without a warrant. I mean,
               | they can, but the evidence wouldn't be admissible in
               | court.
        
               | kevin_thibedeau wrote:
               | They don't need a warrant. You gave data to someone else.
               | That someone isn't bound to keep it secret. They can
               | demand a warrant if they are motivated by ethical
               | principles but that is optional and potentially overruled
               | by other laws.
        
               | least wrote:
               | The sort of questions about 4A protections here haven't
               | really been tested. Third party doctrine might not apply
               | in this circumstance and the court is slowly evolving
               | with the times.
        
               | simondotau wrote:
               | e.g.
               | https://en.wikipedia.org/wiki/Carpenter_v._United_States
               | 
               |  _In Carpenter v. United States (2018), the Supreme Court
               | ruled warrants are needed for gathering cell phone
               | tracking information, remarking that cell phones are
               | almost a "feature of human anatomy", "when the Government
               | tracks the location of a cell phone it achieves near
               | perfect surveillance, as if it had attached an ankle
               | monitor to the phone's user"._
               | 
               |  _...[cell-site location information] provides officers
               | with "an all-encompassing record of the holder's
               | whereabouts" and "provides an intimate window into a
               | person's life, revealing not only [an individual's]
               | particular movements, but through them [their] familial,
               | political, professional, religious, and sexual
               | associations."_
        
         | corndoge wrote:
         | What's the contrarion question?
        
           | chipotle_coyote wrote:
           | Hmm. I should perhaps have said "contrarian view," although
           | I'm not sure it's actually even super contrarian in
           | retrospect. Maybe more "maybe we're not asking the right
           | questions."
        
         | Despegar wrote:
         | >Apple is treating CSAM as an engineering problem.
         | 
         | No they're treating it as a political and legal problem (with
         | the UK and the EU being the furthest along on passing
         | legislation). Their implementation is the compromise that
         | preserves end-to-end encryption, given those political winds.
        
           | matwood wrote:
           | Agree. I think a lot of HN is seeing it as strictly an
           | engineering problem while completely ignoring the blowing
           | political winds.
        
         | lsh123 wrote:
         | > ... the less likely it seems that it would make it radically
         | easier for those bad actors to do so
         | 
         | Depends on implementation. I can easily see a possibility that
         | the check is gated by a server side logic that can be changed
         | at any moment without anybody knowing.
        
         | ianmiers wrote:
         | > Our system involves both an on-device component where the
         | voucher is created, but nothing is learned, and a server-side
         | component, which is where that voucher is sent along with data
         | coming to Apple service and processed across the account to
         | learn if there are collections of illegal CSAM. That means that
         | it is a service feature.
         | 
         | Neuenschwander seems to, maybe deliberately, be conflating :
         | "Apple's servers have to be in the loop" and "the code can only
         | look at photos on iCloud".
         | 
         | You are right, the problem is a "slippery slope," but Apple
         | just built roller skates and there are governments trying to
         | push us down it. Apple is in a far better position to resist
         | those efforts if they say " we don't have this code, we will
         | not build it, and there's no way for it to be safe for our
         | users."
         | 
         | I'd say that's a little different than the slippery slope.
         | Something more like (in)defense in depth.
        
         | rootusrootus wrote:
         | This is the refreshing analysis that I hope to see when I come
         | to HN. Thank you for being reasonable.
        
         | arthur_sav wrote:
         | Absolutely.
         | 
         | We've seen this time after time where the 3 letter agencies
         | give companies an ultimatum: either comply or get shut down.
         | 
         | The worse part is that nobody can do anything about it. Under
         | the cloak of secrecy and threats a faceless government is
         | shaping major policy, breaking laws etc...
         | 
         | My only hope is that the biggest companies can speak up since
         | they have a bit of a leverage. They can rally people behind
         | them if the requests are borderline unethical, like spying on
         | everybody.
         | 
         | If Apple can't deal with this, NOBODY else can.
        
           | ClumsyPilot wrote:
           | "either comply or get shut down'
           | 
           | I think if a three letter agency 'shuts down' Apple, the
           | political blowback will be nuclear. The agency will get put
           | on a leash if they pull some shit like tthat
        
             | arthur_sav wrote:
             | That's why i said big companies have some leverage. People
             | will notice.
             | 
             | If they don't speak up then nobody else can.
        
         | mindslight wrote:
         | > _It 's a policy problem. It's a governance problem. In the
         | long run, we solve this, at least in liberal democracies, by
         | voting people into office who understand technology, understand
         | the value of personal encryption_
         | 
         | Yes, it is ultimately policy problem. But the way we get people
         | in office who understand technology is to get technological
         | capabilities in the hands of people before they get into
         | office, as well as the hands of those who will vote for them.
         | Just like "bad facts make bad law", bad engineering makes bad
         | law.
         | 
         | Twenty years ago we forcefully scoffed at the Clipper Chip
         | proposal and the export ban on crypto, because they were so at
         | odds with the actual reality of the digital environment. These
         | days, most people's communications are mediated by large
         | corporations operating on plaintext, and are thus are
         | straightforward to monitor and censor. And especially when
         | companies lead the charge, governments expect to have the same
         | ability.
         | 
         | If I could hole up and rely on Free software to preserve my
         | rights indefinitely, I wouldn't particularly care what the
         | Surveillance Valley crowd was doing with their MITM scheme. But
         | I can't, because Surveillance Valley is teaching governments
         | that communications _can be controlled_ while also fanning the
         | flames and creating glaring examples of why _they need to be
         | controlled_ (cf social media  "engagement" dumpster fire). And
         | once governments expect that technology can be generally
         | controlled, they will rule any software that does not do their
         | bidding as some exceptional circumvention device rather than a
         | natural capability that has always existed. This entire "trust
         | us" cloud culture has been one big vaccination for governments
         | versus the liberating power of technology that we were excited
         | for two decades ago. This end result has been foreseeable since
         | the rise of webapps, but it's hard to get software developers
         | to understand something when their salary relies upon not
         | understanding it.
         | 
         | Apart from my ][gs (and later my secondhand NeXT), I've never
         | been a huge Apple fan. But I had hoped that by taking this
         | recent privacy tack, they would put workable digital rights
         | into the hands of the masses. Design their system to be solidly
         | secure against everyone but Apple, control the app store to
         | prevent trojans, but then stay out of users' business as
         | software developers should. But brazenly modifying their OS,
         | which should be working _for the interests of the user_ , to do
         | scanning _against the interests of the user_ is a disappointing
         | repudiation of the entire concept of digital rights. And so
         | once again we 're back to Free software or bust. At least the
         | Free mobile ecosystem seems to be progressing.
        
         | strogonoff wrote:
         | > For instance, the "it only scans photos uploaded to iCloud"
         | element isn't just an arbitrary limitation that can be flipped
         | with one line of code, as some folks seem to think
         | 
         | It might be a happy incident if their architecture limits this
         | feature to CSAM _now_ , but their ToS are clearly much more
         | general-purpose than that, strategically allowing Apple to pre-
         | screen for _any_ potentially illegal content. If ToS remain
         | phrased this way, surely the implementation will catch up.
        
         | maCDzP wrote:
         | I agree that this is a policy issue. The EU passed a new law
         | just a last month regarding this [0].
         | 
         | I thought this was the implementation for the EU. If it was -
         | it was fast?
         | 
         | [0] https://news.ycombinator.com/item?id=27753727
        
           | Svip wrote:
           | I still see a lot of people thinking the EU law _demands_ the
           | scanning of images for CSAM, but really it _permits_ the
           | scanning of images again. Apparently no one noticed that EU
           | privacy laws actually prohibited the scanning of user images
           | until last year, when companies like Facebook stopped
           | scanning images to avoid fines.
           | 
           | So Apple's move can hardly have been for the EU, but had the
           | European Parliament not passed it, Apple would have had to
           | disable it in the EU.
        
         | dkdk8283 wrote:
         | What apple is doing is fucked up. Full stop. Mental gymnastics
         | are required to go beyond this premise.
        
         | chevill wrote:
         | >The entire point of having a liberal democracy is that we are
         | the government, and we can pull it back from authoritarianism.
         | 
         | We technically can. We also can technically elect people that
         | understand technology. But practically its 100x more likely
         | that I, a person that works with technology for a living,
         | magically finds a way to provide my family with a decent
         | lifestyle doing somethign that doesn't use computers at all.
         | And I view the chances of that happening to be almost non-
         | existent.
        
         | mschuster91 wrote:
         | > The entire point of having a liberal democracy is that we are
         | the government, and we can pull it back from authoritarianism.
         | 
         | The counterpoint: we (as in, the Western-aligned countries)
         | _don 't have_ a true liberal democracy and likely never had.
         | Not with the amount of open and veiled influence that religion
         | (and for what it's worth, money) has in our societies - ranging
         | from openly Christian centrist/center-right parties in Europe
         | to entire communities in the US dominated by religious sects of
         | all denominations.
         | 
         | And all of these tend to run on a "think about the children"
         | mindset, _especially_ regarding anything LGBT.
         | 
         | The result? It is very hard if not outright impossible to
         | prevent or roll back authoritarian measures that were sold as
         | "protect the children", since dominant religious-affiliated
         | people and institutions (not just churches, but also
         | thinktanks, parties and media) will put anyone in their
         | crosshairs. Just look at how almost all blog pieces and many
         | comments on the CSAM scanner debacle have an "I don't like
         | pedophilia" disclaimer...
        
         | xvector wrote:
         | > It was illegal to export "strong encryption" for many years,
         | remember? I've seen multiple reports that European lawmakers
         | are planning to require some kind of scanning for CSAM. If this
         | goes into effect, technology isn't going to block those laws
         | for you. Your Purism phone will either be forced to comply or
         | be illegal.
         | 
         | The point is that with a Purism phone or custom ROM on my
         | Android phone, I could disable these "legally required"
         | features, because the law is fucking dumb, and my rights matter
         | more.
         | 
         | The law can ban E2EE, cryptocurrencies, and privacy, but so
         | long as we have some degree of technical freedom we can and
         | will give it the middle finger.
         | 
         | Apple does not offer this freedom. Here we see the walled
         | garden of iOS getting worse and more freedom-restricting by the
         | year. When the governments of the world demand that Apple
         | become an arm of the dystopia, Apple will comply, and its users
         | will have no choice but to go along with it.
         | 
         | Apple, knowing that it is a private company completely and
         | utterly incapable of resisting serious government demands (ie
         | GCBD in China) should never have developed this capability to
         | begin with.
         | 
         | If Apple is going to open this Pandora's box, they ought to
         | open up their devices too.
        
           | alwillis wrote:
           | _When the governments of the world demand that Apple become
           | an arm of the dystopia, Apple will comply, and its users will
           | have no choice but to go along with it._
           | 
           | I would argue that Apple is creating systems so they _can 't_
           | become an arm of the dystopia.
           | 
           | For example, even if a government somehow forced Apple to
           | include non-CSAM hashes to the database, the system only uses
           | hashes from multiple child protection agencies in different
           | jurisdictions where the CSAM is the same.
           | 
           | So Apple only uses the hashes that are the same between org
           | A, B and C and ignores the rest.
           | 
           | This, along with the audibility Apple recently announced and
           | the other features makes it so there's literally nothing
           | counties can do to force Apple to comply with some dystopian
           | nightmare...
           | 
           | Of course, with potentially more open operating systems, it
           | would be trivial by comparison for state actors to create a
           | popular/custom ROM for Android that's backdoored.
        
             | treis wrote:
             | >I would argue that Apple is creating systems so they can't
             | become an arm of the dystopia.
             | 
             | For the life of me I can't see how catching child molesters
             | is part of a dystopia.
        
           | chipotle_coyote wrote:
           | > The point is that with a Purism phone or custom ROM on my
           | Android phone, I could disable these "legally required"
           | features, because the law is fucking dumb, and my rights
           | matter more.
           | 
           | But if we assume a government is determined to do this, can't
           | they find other ways to do it? If you were using Google
           | Photos with your Purism phone, it doesn't matter what you do
           | on your device. And you can say "well, I wouldn't use that,"
           | but maybe your ISP is convinced (or required) to do packet
           | inspection. And then you can say, "But I'm using encryption,"
           | and the government mandates that they have a back door into
           | all encrypted traffic that goes through their borders.
           | 
           | And I would submit that if we really assume a government is
           | going to extreme lengths, then they'll make it as hard as
           | possible to use an open phone in the first place. They'll
           | make custom ROMs illegal. They'll go after people hosting it.
           | They'll mandate that the phones comply with some kind of
           | decryption standard to connect to cellular data networks. If
           | we assume an authoritarian government bound and determined to
           | spy on you, the assumption that we can be saved by just
           | applying enough open source just seems pretty shaky to me.
           | 
           | So, I certainly don't think that a purely technological
           | solution is _enough,_ in the long run. This is a policy
           | issue. I think hackers and engineers really, really want to
           | believe that math trumps policy, but it doesn 't. By all
           | means, let's fight for strong encryption -- but let's also
           | fight for government policy that supports it, rather than
           | assuming encryption and open source is a guarantee we can
           | circumvent bad policy.
        
             | cinquemb wrote:
             | > And then you can say, "But I'm using encryption," and the
             | government mandates that they have a back door into all
             | encrypted traffic that goes through their borders.
             | 
             | And then one can compromise and infect millions of such
             | backdoored devices and start feeding (much cheaper than the
             | government enforcement implementation) spoofed data into
             | these systems at scale on these backdoored devices that act
             | like "swatting as a service" and completely nullify any
             | meaning they could get from doing this.
             | 
             | I'm personally really interested in router level malware +
             | 0days on devices as distribution vectors rather than the
             | typical c&c setup.
             | 
             | > They'll go after people hosting it.
             | 
             | Not too hard to imagine one being able to distribute such
             | things across millions of ephemeral devices that are
             | networked and incentivized to host it, all across the
             | world, regardless of illegality in any particular
             | jurisdiction. Technology enables this, without such, it
             | wont be possible.
             | 
             | > I think hackers and engineers really, really want to
             | believe that math trumps policy, but it doesn't
             | 
             | I don't think that at all, I think it comes down to
             | incentives. I was listening to a talk the other day where
             | someone mentioned that for the longest time (since at least
             | wwII), governments pretty much had a monopoly on
             | cryptographers and now there are lots of places/systems
             | that are willing to pay more to apply cutting edge
             | research.
             | 
             | > but let's also fight for government policy that supports
             | it, rather than assuming encryption and open source is a
             | guarantee we can circumvent bad policy.
             | 
             | Much more cheaper for an individual, with more immediate
             | feedback mechanisms doing one vs another. One can also
             | scale a lot faster than another esp since one is very much
             | divorced from implementation.
        
           | shapefrog wrote:
           | > Apple, knowing that it is a private company completely and
           | utterly incapable of resisting serious government demands (ie
           | GCBD in China) should never have developed this capability to
           | begin with.
           | 
           | You are basically arguing that an iphone be incablable of
           | doing any function at all.
           | 
           | Dont want the government demanding the sent/recieved data -
           | no data sending and recieving funcitons.
           | 
           | Dont want the government demanding phone call intercepts - no
           | phone call functionality.
           | 
           | Dont want the government demanding the contents of the screen
           | - no screen.
           | 
           | The pandoras box was opened the day someone inseted a radio
           | chip and microphone into a device. It has been open for a
           | very long time, this is not the moment it suddenly opened.
        
           | dwaite wrote:
           | > ... because the law is fucking dumb, and my rights matter
           | more.
           | 
           | If they aren't _everybody's_ rights then they aren't really
           | your rights either. They are at best a privilege, and at
           | worst something you have just been able to get away with (so
           | far).
           | 
           | > The law can ban E2EE, cryptocurrencies, and privacy, but so
           | long as we have some degree of technical freedom we can and
           | will give it the middle finger.
           | 
           | Sure, in that scenario techies can secretly give it the
           | middle finger right up until the authoritarian government
           | they idly watched grow notices them.
           | 
           | If someone is seriously concerned about that happening, they
           | could always consider trying to divert the government away
           | from such disaster by participating.
           | 
           | > When the governments of the world demand that Apple become
           | an arm of the dystopia, Apple will comply, and its users will
           | have no choice but to go along with it.
           | 
           | Government demands of this sort are normally referred to as
           | legal and regulatory compliance. Corporations, which are a
           | legal concept allowed by the government, generally have to
           | conform to continue to exist.
           | 
           | > Apple, knowing that it is a private company completely and
           | utterly incapable of resisting serious government demands (ie
           | GCBD in China) should never have developed this capability to
           | begin with.
           | 
           | IMHO, having some portion of a pre-existing capability
           | doesn't matter when you aren't legally allowed to challenge
           | the request or answer "no".
        
         | kelnos wrote:
         | > _For instance, the "it only scans photos uploaded to iCloud"
         | element isn't just an arbitrary limitation that can be flipped
         | with one line of code, as some folks seem to think; as Erik
         | Neuenschwander, head of Privacy Engineering at Apple, explained
         | in an interview on TechCrunch[1]:_
         | 
         | > > _Our system involves both an on-device component where the
         | voucher is created, but nothing is learned, and a server-side
         | component, which is where that voucher is sent along with data
         | coming to Apple service and processed across the account to
         | learn if there are collections of illegal CSAM. That means that
         | it is a service feature._
         | 
         | I don't see how that proves what Neuenschwander says it proves.
         | Sending the voucher to the server is currently gated on whether
         | or not the associated photo is set to be uploaded to iCloud.
         | There's no reason why that restriction couldn't be removed.
         | 
         | The hard part about building this feature was doing the
         | scanning and detection, and building out the mechanism by which
         | Apple gets notified if anything falls afoul of the scanner.
         | Changing what triggers (or does not trigger) the scanning to
         | happen, or results being uploaded to Apple, is trivial.
         | 
         | The only thing that I think is meaningfully interesting here is
         | that it's possible that getting hold of a phone and looking at
         | the vouchers might not tell you anything; only after they
         | vouchers are sent to Apple and _algorithm magic_ happens will
         | you learn anything. But I don 't think that really matters in
         | practice; if you can get the vouchers off a phone, you can
         | almost certainly get the photos associated with them as well.
         | 
         | I read through the article you linked, and it feels pretty
         | hand-wavy to me. Honestly I don't think I'd believe what Apple
         | is saying here unless they release a detailed whitepaper on how
         | the system works, one that can be vetted by experts in the
         | field. (And then we still have to trust that what they've
         | detailed in the paper is what they've actually implemented.)
         | 
         | The bottom line is that Apple has developed and deployed a
         | method for locally scanning phones for contraband. Even if
         | they've designed things so that the result is obfuscated unless
         | there are many matches and/or photos are actually uploaded,
         | that seems to be a self-imposed limitation that could be
         | removed without too much trouble. The capability is there; not
         | using it is merely a matter of policy.
         | 
         | > _I know that 's easy to dismiss as Pollyannaism, but "we need
         | to protect ourselves from our own government" has a pretty
         | dismal track record historically. The entire point of having a
         | liberal democracy is that we are the government, and we can
         | pull it back from authoritarianism._
         | 
         | Absolutely agree, but I fear we have been losing this battle
         | for many decades now, and I'm a bit pessimistic for our future.
         | It seems most people are very willing to let their leaders
         | scare them into believing that they must give up liberty in
         | exchange for safety and security.
        
           | matwood wrote:
           | > There's no reason why that restriction couldn't be removed.
           | 
           | The problem is there's no reason Apple couldn't do anything.
           | All the ML face data gathered on device would be way more
           | valuable than these hash vouchers.
        
         | tehabe wrote:
         | The issue is, it is all software settings. The iMessage filter
         | is only active for acconts held by minors, that can be changed
         | by Apple. Might not even be a client side setting. Also the
         | code which can scan uploaded images can be used anywhere, scan
         | every picture which is being stored on the device. It is all
         | software and everything which is needed is one update.
        
       | cmsj wrote:
       | This seems like a pretty helpful document, and really shows how
       | carefully the CSAM system has been designed.
       | 
       | Two things that stuck out to me:
       | 
       | 1) The CSAM hash database is encrypted on-device
       | 
       | 2) The device doesn't know the results of hash comparisons
       | 
       | These suggest to me that the hash comparison is going to happen
       | inside the Secure Enclave - give it the encrypted database and
       | the NeuralHash of an image, it gives you back a Security Voucher
       | to upload.
       | 
       | I don't quite get how the hash database can stay secret though -
       | the decryption key has to come from somewhere. AFAIK the hash
       | database has never been made publicly available before, which
       | probably makes it quite a high value target for security
       | researchers and curious parties.
        
         | stjohnswarts wrote:
         | That's fine, but keep it on the server I don't want this crap
         | on my phone. I don't want to be guilty until proven innocent.
        
         | nullc wrote:
         | No secure enclave is needed, the system uses strong
         | cryptography to protect Apple and their database providers from
         | accountability.
         | 
         | I gave a semi-technical description of how private set
         | intersection works here:
         | 
         | https://news.ycombinator.com/item?id=28124716
         | 
         | Assuming they implement it correctly and there are no apple-
         | internal leaks we won't be discovering the content of the
         | database.
        
         | FabHK wrote:
         | > These suggest to me that the hash comparison is going to
         | happen inside the Secure Enclave - give it the encrypted
         | database and the NeuralHash of an image, it gives you back a
         | Security Voucher to upload.
         | 
         | My understanding is that the hash comparison, or at any rate
         | the counting of matches, is done in the cloud then. The way the
         | crypto is set up is that the server has no access to the
         | security envelope and its contents (and, after E2EE in the
         | future, to any images) unless the number of matches exceeds the
         | threshold.
        
         | shuckles wrote:
         | The hash database is blinded on the server before being
         | embedded into iOS. You'd need to access Apple's private key or
         | break cryptography to get access to it from the client. For the
         | less cryptographically savvy, the "Technical Summary" is a good
         | document on their Child Safety page.
        
       | unityByFreedom wrote:
       | > The system is designed so that a user need not trust Apple, any
       | other single entity, or even any set of possibly-colluding
       | entities from the same sovereign jurisdiction (that is, under the
       | control of the same government)
       | 
       | Who's to say authoritarian governments don't team up? We _just_
       | saw this happen over the last decade with the rise of nationalism
       | everywhere.
       | 
       | We already know governments can put significant pressure on
       | private companies. Don't capitulate, Apple. Blow the whistle.
       | You're getting screwed and you have our support if you just stand
       | up for what you said 5 years ago.
        
       | fortran77 wrote:
       | Why don't they scan files as they're downloaded, the way we scan
       | for malware and viruses on PCs? This way, we can warn people
       | _before_ a law is broken!
        
       | ty___ler wrote:
       | If apple can be coerced by governments to scan photos on the
       | serverside then what is the operational purpose of a frontend
       | scan, given what apple is publicly saying they are trying to do?
       | This is the most confusing aspect for me.
        
         | codeecan wrote:
         | Apple will turn it on eventually for non iCloud destined photos
         | (and documents), you don't build a system like this not to use
         | it.
         | 
         | We know Apple concedes to China's demands, and with another
         | Snowden situation, the US would not hesitate to add classified
         | documents to the scan list and identify individuals.
         | 
         | This system will catch no abusers, because they're not
         | Airdropping photos to each other.
         | 
         | And if they're smart enough to use a VPN, they're not storing
         | that on a phone thats logged into the cloud.
         | 
         | And if they're not using a VPN, they can be caught at download
         | time.
        
       | bnj wrote:
       | A curious outcome of the scrutiny that the system design is
       | receiving is that it will make it less effective at the task of
       | catching actual collections of CSAM.
       | 
       | Page eleven promises something I haven't seen before: that Apple
       | will publish the match threshold for each version. Meaning, even
       | if after all this, there are abusers still using iCloud photos to
       | manage their illegal collection of CSAM, they can check back to
       | know what number of images is safe to keep.
       | 
       | A twisted version of the Streisand effect.
        
         | somebodythere wrote:
         | If you knew about the existence of this program and were
         | determined to keep a collection of CSAM on your phone I don't
         | understand why someone would choose to dance around specific
         | thresholds rather than just disable iCloud sync.
        
       | zepto wrote:
       | A key point that needs to be mentioned: we strongly dislike being
       | distrusted.
       | 
       | It might well be a genetic heritage. Being trusted in a tribe is
       | crucial to survival, and so is likely wired deep into our social
       | psychology.
       | 
       | Apple is making a mistake by ignoring that. This isn't about
       | people not trusting Apple. It's about people not feeling trusted
       | _by_ Apple.
       | 
       | Because of this, it doesn't matter how trustworthy the system is
       | or what they do to make it less abusable. It will still represent
       | distrust of the end user, and people will still feel that in
       | their bones.
       | 
       | People argue about the App Store and not being trusted to install
       | their own Apps etc. That isn't the same. We all know we are
       | fallible and a lot of people like the protection of the store,
       | and having someone to 'look after' them.
       | 
       | This is different and deeper than that. Nobody want to be a
       | suspect for something they know they aren't doing. It feels
       | dirty.
        
         | unityByFreedom wrote:
         | The part Federighi's "interview" where he can't understand how
         | people perceive this as a back door [1] seems incredibly out of
         | touch. The back door is what everyone is talking about. Someone
         | at Apple should at least be able to put themselves in their
         | critics' shoes for a moment. I guess we need to wait to hear
         | Tim Cook explain how this is not what he described 5 years ago
         | [2].
         | 
         | [1] https://youtu.be/OQUO1DSwYN0?t=425
         | 
         | [2] https://youtu.be/rQebmygKq7A?t=57
        
           | zepto wrote:
           | It's not a back door in any sense of the word. That's why he
           | is surprised people see it as one.
           | 
           | It really only does what they say it does, and it really is
           | hard to abuse.
           | 
           | But that doesn't matter. The point is that even so, it makes
           | everyone into a suspect, and that _feels wrong_.
        
             | laserlight wrote:
             | I would say that it is a back door, because it would work
             | even if iCloud Photos were E2E encrypted. It may not be a
             | generic one, but one that is specific to a purpose Apple
             | decided is rightful. And there is no guarantee that Apple
             | (or authorities) won't decide that there are other rightful
             | purposes.
        
               | dwaite wrote:
               | > I would say that it is a back door, because it would
               | work even if iCloud Photos were E2E encrypted.
               | 
               | Backdoor is defined by the Oxford Dictionary as "a
               | feature or defect of a computer system that allows
               | surreptitious unauthorized access to data."
               | 
               | The system in question requires you to upload the data to
               | iCloud Photos for the tickets to be meaningful and
               | actionable. Both your phone and iCloud services have EULA
               | which call out and allow for such scanning to take place,
               | and Apple has publicly described how the system works as
               | far as its capabilities and limitations. In the sense
               | that people see this as a change in policy (IIRC the
               | actual license agreement language changed over a year
               | ago) , Apple has also described how to no longer use the
               | iCloud Photos service.
               | 
               | One less standard usage is not about unauthorized access
               | but specifically to private surveillance (e.g. "Clipper
               | chip") - but I would argue that the Clipper chip was a
               | case where the surveillance features were specifically
               | not being talked about, hence it still counting as
               | "unauthorized access".
               | 
               | But with a definition that covers broad surveillance
               | instead of unauthorized access, it would still be
               | difficult to classify this as a back door. Such
               | surveillance arguments would only pertain to the person's
               | phone and not to information the user chose to release to
               | external services like iCloud Photos.
               | 
               | To your original point, it would still work with iCloud
               | Photos did not have the key escrow, albeit with less data
               | being capable of being able to be turned over to law
               | enforcement. However iCloud Photos being an external
               | system would still mean this is an intentional and
               | desired feature (presumably) by the actual system owners
               | (Apple).
        
               | laserlight wrote:
               | I see. My interpretation doesn't hold up given your
               | definitions of back door.
               | 
               | I bet the authorities would be happy with a surveillance
               | mechanism disclosed in the EULA, though. Even if such a
               | system is not technically a back door, I am opposed to it
               | and would prefer Apple to oppose it.
               | 
               | Edit: I just noticed that you had already clarified your
               | argument in other replies. I am sorry to make you repeat
               | it.
        
               | dwaite wrote:
               | It has proven very difficult to oppose laws meant to
               | deter child abuse and exploitation.
               | 
               | Note that while the EFF's mission statement is about
               | defending civil liberties, they posted two detailed
               | articles about Apple's system without talking about the
               | questionable parts of the underlying CSAM laws. There was
               | nothing about how the laws negatively impact civil
               | liberties and what the EFF might champion there.
               | 
               | The problem is that the laws themselves are somewhat
               | uniquely abusable and overreaching, but they are meant to
               | help reduce a really grotesque problem - and reducing
               | aspects like detection and reporting is not going to be
               | as effective against the underlying societal issue.
               | 
               | Apple has basically been fighting this for the 10 years
               | since the introduction of iCloud Photo, saying they
               | didn't have a way to balance the needs to detect CSAM
               | material without impacting the privacy of the rest of
               | users. PhotoDNA was already deployed at Microsoft and
               | being deployed by third parties like Facebook when iCloud
               | Photo launched.
               | 
               | Now it appears that Apple was working a significant
               | portion on that time toward trying to build a system that
               | _did_ attempt to accomplish a balance between
               | social/regulatory responsibility and privacy.
               | 
               | But such a system has to prop technical systems and legal
               | policies against one another to make up the shortcomings
               | of each, which make it a very complex and nuanced system.
        
             | unityByFreedom wrote:
             | > It's not a back door in any sense of the word.
             | 
             | It is. It's a simple matter for two foreign governments to
             | decide they don't want their people to criticize the head
             | of state with memes, and then insert such images into the
             | database Apple uses for scanning.
             | 
             | Apple's previous position on privacy was to make such
             | snooping impossible because they don't have access to the
             | data. Now they are handing over access.
             | 
             | What I and thousands of others online are describing ought
             | to be understandable by anyone at Apple. The fact that an
             | Apple exec can sit there in a prepared interview and look
             | perplexed about how people could see this as a back door is
             | something I don't understand at all. This "closing his
             | ears" attitude may indicate he is full of it.
        
               | [deleted]
        
               | Spivak wrote:
               | How does this attack even work? So some government
               | poisons the database with political dissident memes and
               | suddenly Apple starts getting a bunch of new reports
               | which when reviewed are obviously not CSAM.
               | 
               | If the government can force Apple to also turn over these
               | reports then they could have just made Apple add their
               | political meme database directly and it's already game
               | over.
        
               | unityByFreedom wrote:
               | More like, the government says Apple can't operate there
               | unless they include what _they_ say is illegal.
               | 
               | Apple is run by humans who are subject to influence and
               | bias. Who knows what policy changes will come in Apple's
               | future. Apple's previous stance was to not hand over data
               | because they don't have access to it. This change
               | completely reverses that.
        
               | zepto wrote:
               | Edit: I reconsidered my previous reply.
               | 
               | That really doesn't sound like anything I'd describe as a
               | "back door". A back door implies general purpose access.
               | A system which required the collision of multiple
               | governments and Apple and their child abuse agencies
               | simply is not that.
               | 
               | One of the casualties of this debate is that people are
               | using terms that make things sound worse than they are.
               | If you can't get at my filesystem, you don't have a back
               | door. I understand the motive for stating the case as
               | harshly as possible, but I think it's misguided.
               | 
               | Having said this, I would find it interesting to hear
               | what Federighi would say about this potential abuse case.
        
               | unityByFreedom wrote:
               | > A system which required the collision of multiple
               | governments and Apple and their child abuse agencies
               | simply is not that.
               | 
               | Agree to disagree.
               | 
               | > I would find it interesting to hear what Federighi
               | would say about this potential abuse case.
               | 
               | Personally I would not. That's a political consideration
               | and not something I want to hear a technologist weigh in
               | on while defending their technology. Apple's previous
               | stance, with which I agree, was to not give humans any
               | chance to abuse people's personal data,
               | 
               | https://youtu.be/rQebmygKq7A
        
               | zepto wrote:
               | > Personally I would not. That's a political
               | consideration and not something I want to hear a
               | technologist weigh in on while defending their
               | technology.
               | 
               | It's not. The abuse case flows from their architecture.
               | Perhaps it isn't as 'easy' as getting multiple countries
               | to collude with Apple. If the architecture can be abused
               | the way you think it can, that is a technical problem as
               | well as a political one.
        
               | unityByFreedom wrote:
               | You can't solve the human-bias problem with technology.
               | That's the whole reason Apple didn't want to build in a
               | back door in the first place.
        
               | zepto wrote:
               | You may not be able to solve the bias problem altogether,
               | but you can definitely change the threat model and who
               | you have to trust.
               | 
               | Apple's model has always involved trusting _them_. This
               | model involves trusting other people in narrow ways. The
               | architecture determines what those ways are.
        
               | unityByFreedom wrote:
               | Trusting Apple to not scan my device in the past was easy
               | because as an engineer I know I would speak up if I saw
               | that kind of thing secretly happening, and I know
               | security researchers would speak up if they detected it.
               | 
               | Now Apple _will_ scan the device and we must trust that
               | 3rd parties will not abuse the technology by checking for
               | other kinds of imagery such as memes critical of heads of
               | state.
               | 
               | The proposed change is so much worse than the previous
               | state of things.
        
               | zepto wrote:
               | > checking for other kinds of imagery such as memes
               | critical of heads of state.
               | 
               | Do you live in a country where the head of state wants to
               | check for such memes?
        
               | unityByFreedom wrote:
               | Probably. You underestimate humans if you don't think any
               | of us will try to squash things that make us look bad.
        
               | zepto wrote:
               | Does your state not have protections against such
               | actions?
        
         | shuckles wrote:
         | But the catch is: all the incumbents already treated your data
         | as if you were guilty until proven innocent. Apple's
         | transparency about that change may have led people to
         | internalize that, but it's been the de facto terms of most
         | cloud relationships.
         | 
         | What I personally don't understand is why Apple didn't come out
         | with a different message: we've made your iPhone so secure that
         | we'll let it vouch for your behalf when it sends us data to
         | store. We don't want to see the data, and we won't see any of
         | it unless we find that lots of the photos you send us are
         | fishy. Android could never contemplate this because they can't
         | trust the OS to send vouchers for the same photos it uploads,
         | so instead they snoop on everything you send them.
         | 
         | It seems like a much more win-win framing that emphasizes their
         | strengths.
        
           | zepto wrote:
           | I agree this is vastly better than what anyone else is doing,
           | and you know I understand the technology.
           | 
           | However, I don't think any framing would have improved
           | things. I think it was always going to feel wrong.
           | 
           |  _I_ would prefer they don't do this because it feels bad to
           | be a suspect even in this abstract and in-practice harmless
           | way.
           | 
           | Having said that, having heard from people who have
           | investigated how bad pedophile activity actually is, I can
           | imagine being easily persuaded back in the other direction.
           | 
           | I think thins is about the logic of evolutionary psychology,
           | not the logic of cryptography.
           | 
           | My guess is that between now and the October iPhone release
           | we are going to see more media about the extent of the
           | problem they are trying to solve.
           | 
           | That is how Apple wins this.
        
             | unityByFreedom wrote:
             | > Having said that, having heard from people who have
             | investigated how bad pedophile activity actually is, I can
             | imagine being easily persuaded back in the other direction.
             | 
             | There are terrible things out there that we should seek to
             | solve. They should not be solved by creating 1984 in the
             | literal sense, and certainly not by the company that became
             | famous for an advertisement based on that book [1].
             | 
             | Apple, take your own advice and Think Different [2].
             | 
             | [1] https://www.youtube.com/watch?v=VtvjbmoDx-I
             | 
             | [2] https://www.youtube.com/watch?v=5sMBhDv4sik
        
               | zepto wrote:
               | > creating 1984 in the _literal_ sense
               | 
               | I take it you haven't read 1984.
               | 
               | When Craig Federighi straps a cage full of starving rats
               | to my face, I'll concede this point.
        
               | unityByFreedom wrote:
               | China has tiger chairs. Should we move closer to their
               | big brother systems?
        
               | zepto wrote:
               | No but this has nothing to do with that.
               | 
               | In case you missed it, I think this is probably a bad
               | move.
               | 
               | I just don't think these arguments about back doors and
               | creeping totalitarianism are either accurate or likely to
               | persuade everyday users when they weigh them up against
               | the grotesque nature of child exploitation.
        
               | unityByFreedom wrote:
               | > I just don't think these arguments about back doors and
               | creeping totalitarianism are either accurate or likely to
               | persuade everyday users when they weigh them up against
               | the grotesque nature of child exploitation.
               | 
               | Agree to disagree. This opens the door to something much
               | worse in my opinion.
        
             | shuckles wrote:
             | I don't think I agree. If you think this boils down to
             | instinct, do you think a story about coming together to
             | save the next generation will work well on people cynical
             | enough to see TLAs around every corner? At the very least,
             | I feel like Apple should probably make a concession to the
             | conspiracy minded so that they can bleach any offensive
             | bits from their device and use them as offline-first DEFCON
             | map viewing devices, or some such.
        
               | zepto wrote:
               | > do you think a story about coming together to save the
               | next generation will work well on people cynical enough
               | to see TLAs around every corner?
               | 
               | Absolutely not. I don't think they need to persuade
               | people who are convinced of their iniquity.
               | 
               | What they need is an environment in which those people
               | look like they are arguing over how many dictatorships
               | need to collude to detect anti-government photos and how
               | this constitutes _literal 1984_ , while Apple is
               | announcing a way to deter horrific crimes against
               | American children that they have seen on TV.
        
           | unityByFreedom wrote:
           | > What I personally don't understand is why Apple didn't come
           | out with a different message: we've made your iPhone so
           | secure that we'll let it vouch for your behalf when it sends
           | us data to store. We don't want to see the data, and we won't
           | see any of it unless we find that lots of the photos you send
           | us are fishy.
           | 
           | That is PR speak that would have landed worse in tech forums.
           | I respect them more for not doing this.
           | 
           | The core issue is this performs scans, without your approval,
           | on your local device. Viruses already do that and it's
           | something techies have always feared governments might
           | impose. That private companies are apparently being strong-
           | armed into doing it is concerning because it means the
           | government is trying to circumvent the process of public
           | discussion that is typically facilitated by proposing
           | legislation.
        
             | zepto wrote:
             | > without your approval
             | 
             | This isn't true. You can always turn off iCloud Photo
             | Library and just store the photos locally or use a
             | different cloud provider.
        
               | unityByFreedom wrote:
               | Let's not pretend anyone is really "opting in" on this
               | feature.
        
               | zepto wrote:
               | Anyone who doesn't like it can switch off iCloud Photo
               | Library.
               | 
               | I can imagine many people doing that in response to this
               | news.
        
               | unityByFreedom wrote:
               | The fact that you can opt out for now does not set me at
               | ease.
        
               | zepto wrote:
               | Nor me. I will not opt out because I think there is no
               | threat to me and I like iCloud photos. That doesn't mean
               | I like the presence of this mechanism.
        
               | unityByFreedom wrote:
               | > That doesn't mean I like the presence of this
               | mechanism.
               | 
               | I don't think I ever said you did.
        
               | zepto wrote:
               | No - but you did suggest there was no opting in. I'm
               | pointing out that just because I'm not entirely happy
               | with the choice doesn't mean it isn't a choice.
        
               | upbeat_general wrote:
               | I hate this argument. Pressing yes to the T&C once when
               | you setup an Apple account doesn't exactly constitute my
               | approval imo (even if it does legally).
               | 
               | There's no disable button or even clear indication that
               | it's going on.
        
               | dwaite wrote:
               | > There's no disable button or even clear indication that
               | it's going on.
               | 
               | iCloud Photos is an on-off switch.
               | 
               | In terms of clearly indicating everything that is going
               | on within the service, that is just not possible for most
               | non-tech users. It appears to have been pretty difficult
               | even for those familiar with things like cryptography and
               | E2E systems to understand the nuances and protections in
               | place.
               | 
               | Instead, the expectation should be that using _any_
               | company's cloud hosting or cloud synchronization features
               | is fundamentally sharing your data with them. It should
               | then be any vendor's responsibility give give customers
               | guarantees on which to evaluate trust.
        
         | tzs wrote:
         | That's not at all clear.
         | 
         | A lot of people like strong border controls for instance, even
         | if it means when they return to their country from abroad they
         | have to go through more checks or present more documents to get
         | in. Or consider large gated communities where you have to be
         | checked by a guard to get in.
         | 
         | Many peoples seem fine with being distrusted as long as the
         | distrust is part of a mechanism to weed out those who they feel
         | really are not trustworthy and it is not too annoying for them
         | to prove that they are not one of those people when they
         | encounter a trust check.
        
           | zepto wrote:
           | Your examples are not a good analogy. The distrust is
           | transient and then you are cleared. This is a stare of
           | permanently being a suspect.
           | 
           | However, it may certainly be the case that in the end people
           | in general do accept this as a price worth paying to fight
           | pedophiles.
        
       ___________________________________________________________________
       (page generated 2021-08-14 23:02 UTC)