[HN Gopher] Apple Regrets Confusion over 'iPhone Scanning'
___________________________________________________________________
Apple Regrets Confusion over 'iPhone Scanning'
Author : belter
Score : 301 points
Date : 2021-08-13 18:07 UTC (4 hours ago)
(HTM) web link (www.bbc.com)
(TXT) w3m dump (www.bbc.com)
| yawaworht1978 wrote:
| I think apple regrets that the media coverage and the outcry was
| tremendous.
|
 | Of course, they have fallen victim to the general scepticism
 | towards big tech companies which has gained traction recently.
 | One might ask why they did not broadcast clearer messages and
 | why the "confusion" went unclarified for so long.
| hypothesis wrote:
 | Was it not a Friday night news dump? They were hoping the
 | outrage would die down over the weekend as everyone moved on
 | to the next news topic.
| yawaworht1978 wrote:
 | Of course; everyone who's ever been in a large corporation
 | will know these things aren't just soft-launched. There have
 | been prior reviews with the authorities, lawyers, and the
 | drafters of the terms and conditions, and every possible
 | deployment strategy was A/B/C tested. This is nothing but
 | corporate doublespeak and damage control.
|
 | I wonder, though, why the backtracking on the messaging
 | happened. Reputational damage, or fear it might affect the
 | bottom line?
|
 | With this move, they kind of treat everyone as guilty unless
 | proven otherwise by scanning their data. Apple was the last
 | man standing on telemetry handling, a bit better than the
 | others. Now I am not so sure.
| hypothesis wrote:
 | Yep, all that is true, and they are afraid of both. However,
 | I can't see how they are going to put that broken thing back
 | together again...
 |
 | Earlier, when they talked about allowing people to stay on
 | iOS 14, that was a real head-scratcher which now makes
 | sense.
| [deleted]
| S_A_P wrote:
 | Yeah, I am just not buying this. I think Apple regrets that they
 | are not able to just brush this past the public without blowback.
 | My confidence that they really care about privacy is shaken.
| bborud wrote:
| All you need is to get some set of images onto someone's phone.
| And Apple will take care of the character assassination for you.
| 0x0 wrote:
| I regret ever getting into a software ecosystem that comes
| bundled with snitchware in the sealed system volume provided by
| its first party vendor.
| pcmoney wrote:
 | WTF is he talking about? There is no "confusion": they are
 | scanning your phone for data they have decided is bad. Yes,
 | today it is allegedly CP; tomorrow it is anything.
|
| "The system could only match "exact fingerprints" of specific
| known child sexual abuse images, he said."
|
 | Or whatever they, the US govt, or any govt in a market where
 | they want to make money (such as China) want. Is anyone
 | auditing the blacklist? Is it publicly reviewable? (Since it
 | contains CP, of course not.)
| kemayo wrote:
| It's worth bearing in mind that the human review step does mean
| that a government can't just slip stuff in without securing
| Apple's cooperation (including training their review staff
| about all the political content they have to look for).
| Otherwise the reviewers would presumably just go "huh, that's
| weird, this Winnie the Pooh meme definitely isn't child porn"
| and move on.
|
 | _Can_ a government secure Apple's cooperation in that? I have
| no idea. But it does make a useful subversion of the hash
| database a more complicated thing to accomplish.
| elliekelly wrote:
| In some ways I think human review is even creepier. I don't
 | want an algorithm looking at my private photos but I
 | _definitely_ don't want some rando "reviewer" looking at them!
| But I guess it all comes down to the same thing: I don't want
| _anyone_ looking at my photos unless I've deliberately shared
| my photos with them.
| kemayo wrote:
| I think a lot of that does come down to levels of trust in
| their algorithm. Their claim is that it's _staggeringly_
| unlikely for an account to get flagged without actually
| containing photos which are really in the database of
 | hashes they were provided (one in a trillion[1]). Then the
 | only photos that the reviewers get to view are
 | "derivatives" (they've not said what this actually means)
 | of the photos that actually matched.
|
| Speaking for myself, if Apple is correct about those odds,
| I'm not personally feeling creeped out by it. If they're
| wrong, my opinion could change. I certainly don't have the
| math and security background to actually verify their
| claims from the white paper they posted about the system,
| though.
|
| [1]: https://www.apple.com/child-safety/
| pseudalopex wrote:
| Apple's claim is completely unverifiable. And people who
| worked with NCMEC's database said it contains non CSAM
| images.
| kemayo wrote:
| Well, okay, if you have enough non-CSAM images on your
| phone that are also in the NCMEC database, the reviewers
| will presumably get to see those specific images. They'll
| then go "these are obviously not naked children" and move
| on. If they're in the database and you also have a copy,
| presumably they're something like common memes that
| people save? That seems like it has a lower creepiness
| factor for reviewers to see anyway.
|
| Getting more information about the hashing function
| they're using would be nice. It'd make it much easier to
| see how actually collision-prone this is. I'd be all for
| them getting some external review of it published, much
| like the review of the security-tokens they've published.
| (I appreciate that it's difficult, because providing the
| hashing function itself to experiment with lets awful
| people tune their images to be just-distinct-enough that
| they won't match.)
|
| It's worth bearing in mind that Apple has a fairly strong
| motivation for the hashing to be good. They have to pay
| reviewers to look over these matches, and it's bad PR if
| it turns out that they're massively backlogged.
| notJim wrote:
| > And people who worked with NCMEC's database said it
| contains non CSAM images.
|
| Where did you see that? I tried to find more info about
| it, but I didn't find anything.
| pseudalopex wrote:
| https://www.hackerfactor.com/blog/index.php?/archives/929
| -On...
| notJim wrote:
| I skimmed this, and don't see anything that says the DB
| contains non-CSAM.
| pseudalopex wrote:
| "The false-positive was a fully clothed man holding a
| monkey -- I think it's a rhesus macaque."
| notJim wrote:
| Thank you. This does not support your claim, however.
|
| 1) That is an MD5 hash, not a perceptual hash. Apple is
| not using md5s.
|
| 2) It is a false positive, not bad info in the database.
| All involved acknowledge the possibility of false
| positives.
| pseudalopex wrote:
| You think an MD5 collision is more likely than a wrongly
| classified image?
|
| NCMEC generated the hashes Apple will use by running
| NCMEC's collection of forbidden media through Apple's
| algorithm. And perceptual hashes have more collisions
| than cryptographic hashes.
|
| Several people said the database includes non CSAM seized
| in investigations.[1]
|
| [1] https://news.ycombinator.com/item?id=28069844
| nowherebeen wrote:
 | And when a government wants to scan for "illegal" images, they
| will just fall back to the argument that it's the law there.
| It's a terribly slippery slope.
| 734129837261 wrote:
| What's worse than child pornography in, say, Saudi Arabia?
| Atheism is. They can force Apple to tag accounts that have
| images that are popular in atheist circles (memes, information,
| etc.) and track these people down. The penalty for that in
| Saudi Arabia is death.
|
| China can start finding Uyghurs based on the images they tend
| to share. If we're unlucky (as a world), they can even start
| searching for particular individuals.
|
| "Save the children" is just the classic political ploy to get a
| ruling through that's just a precursor for evil things to come.
|
| I'm absolutely disgusted by Apple.
| kemayo wrote:
| I don't see how this is any different from what Apple could
| already have been forced to do. If the argument is that
| they're going to knuckle under to an abusive request
| involving this system, then they'd presumably have done so
| under the prior status quo which was no more secure.
|
| They _already_ were storing the photos unencrypted (or with
| keys available, at least) on their servers, so any government
| that was able to push them to add a hash to this scanning
| system could have gotten them to scan for something in
| iCloud.
|
| China, in particular, could _definitely_ already be doing
| that, since China made Apple host all iCloud data for Chinese
 | users on servers inside China that are operated by a Chinese
| company. See: https://support.apple.com/en-us/HT208351
| m-ee wrote:
| It's worse because it blurs what was a previously clear
| line. If the photos weren't in iCloud, they couldn't be
| scanned. In this new implementation Apple will only scan
| photos destined for iCloud, but they now have the
| capability to scan photos on your device and all that's
| holding them back is corporate policy.
| kemayo wrote:
| Apple already did scan photos on your device, though.
| They do a massive pass of ML classification over all
| photos on a pseudo-regular basis (I assume "whenever a
| new OS release includes a new ML classifier build") --
| it's what makes things like searching for "cat" in Apple
| Photos work.
|
| If Apple's willing to change the new system to do a full
| scan of all photos on-device and send notifications to
| them outside of the upload-to-iCloud-with-security-
| tickets mechanism, they could just as easily have done
| that with the old system.
| thephyber wrote:
| > they now have the capability
|
| What do you mean by capability here?
| cblconfederate wrote:
| > in the region of 30 matching images before this feature would
| be triggered
|
 | Kind of a funny twist. But what about the core of the issue: that
 | you created a new affordance for spies and malware, legitimate or
 | illegitimate or government-backed? Why not implement the whole
 | thing on your own (few) cloud servers instead of on billions of
 | phones all over the planet?
| system2 wrote:
 | BBC, grow some balls and say it how it is. Even so, normal people
 | reading this article wouldn't understand what's going on. This is
 | the only acceptable time to use a click-bait title to get normies'
 | attention.
| codezero wrote:
| I am disappointed that none of their messaging at all attempts to
| explain how the feature won't be further misused (by governments
| or others, quietly or loudly) in the future.
| alibert wrote:
 | Apple said that they are intersecting multiple databases in
 | different jurisdictions to avoid rogue hashes and that this will
 | be available to audit.
|
| They also said that because it's on device, security
| researchers will be able to check any change to the program.
| (probably via the Apple Security Research Device Program ?)
|
| [1] https://developer.apple.com/programs/security-research-
| devic...
| emko7 wrote:
 | How can you figure out what a neural net is trained to find?
 | Are they releasing the data set to verify? That would be the
 | bad images we are told it is scanning for, and that would be
 | bad... Is there some 3rd party that can do the audit?
 |
 | Also, now that this is a thing, how effective will it be at
 | all? Or are these sick people that dumb? After all this news?
 | I do hope they are that dumb, but who knows.
| spiderice wrote:
| > How can you figure out what a neural net is trained to
| find
|
| See, it's comments like this that clearly illustrate that
| there is confusion, and many people are still outraged over
| things they don't understand. A neural net is not scanning
| your phone for CP. You are conflating two things. Just
| watch the video that you're commenting on before commenting
| on it.
| tehnub wrote:
| The neural net isn't exactly searching for CSAM itself. Its
| role is to extract perceptual features from the image, and
| it is applied to both the CSAM images and your iCloud
| images. If those were the same to start with, then the
| extracted features will be the same.
|
| As for exactly how they'll do the auditing, I'm confused as
| well.
| codezero wrote:
| Thanks. That's a helpful bit of info I wasn't aware of.
| alibert wrote:
| FYI, they also published this today:
| https://www.apple.com/child-
| safety/pdf/Security_Threat_Model...
|
 | With new bits of info too.
| fmakunbound wrote:
| I think they're pretty surprised that "but what about saving the
| children" didn't just slide smoothly by.
| nix23 wrote:
 | It reminds me of 2001 and the Patriot Act: it had the perfect
 | name at the perfect time, so that you could just choose to be a
 | "patriot" or a "pro-terrorist"... nothing in between was
 | acceptable in public.
| system2 wrote:
 | Besides anger, I guess we are all learning very valuable
 | marketing tactics: just give it an extreme name so we cannot
 | choose the other option without it making us look like bad
 | people.
| sneak wrote:
| Who doesn't like progress? Or responsible disclosure?
| [deleted]
| fmajid wrote:
| Ah yes, the studied non-apology apology.
|
| Just as you cannot be partly pregnant, you cannot be partly
| trustworthy on privacy and security. Apple blew all their
| credibility in one stupid decision to appease the unappeasable
| authoritarians.
| pcrh wrote:
| That's my opinion exactly.
|
| I was impressed a few years ago when Apple wouldn't allow
| interference with a device that belonged to an unconvicted
| suspect (I can't remember the details, apologies). But this
| concession to unmonitored surveillance is really disappointing.
| system2 wrote:
 | If you and I can tell this after the decision was made, it's
 | hard to imagine that their ultra highly educated and experienced
 | consultants and development team didn't see this coming. They
 | knew exactly how we would react, but they went with it anyway.
 |
 | Call me a conspiracy theorist, but they must have been forced
 | quite badly to make this kind of a change.
| sneak wrote:
| My theory is that they want to do e2e and this is the only
| way that they wouldn't be punished by the USG for doing so.
|
| This also means that if you get big enough, you lose your 1A
| rights in the USA because the feds will punish you
| extralegally if you do things they don't like or that make
| life harder for them.
|
| Sad state of affairs in the USA.
| system2 wrote:
 | That is true, there is no absolute freedom. But I also
 | think that instead of twisting their arms, the feds simply give
 | them enough incentive to do it. Imagine a company completely
 | supported by the government: there is no shortage of money, no
 | risk of going bankrupt, big contracts with government bodies,
 | even better infrastructure support. How much can they lose by
 | doing this? I don't think much.
| novok wrote:
| Apple does not make significant income from the US
| government where the loss of them as a customer would
| materially affect their revenues.
| fmajid wrote:
| I think this is actually motivated by upcoming EU
| legislation that would mandate CSAM filtering by cloud
| storage providers. They tried hard to engineer a solution
| that would provide some guarantees, but doing the
| processing client-side sets an even more damaging
| precedent. The same guarantees might have been doable
| purely server-side using homomorphic encryption but that
| tech is still very nascent and not deployed at the scale of
| iCloud.
| belter wrote:
| "Apple's Software Chief Explains 'Misunderstood' iPhone Child-
| Protection Features"
|
| https://www.wsj.com/video/series/joanna-stern-personal-techn...
| GekkePrutser wrote:
| I know what it does and how it works. I just don't like being
| considered a potential criminal without any reason.
|
 | For starters, they should exclude photos taken with the phone's
 | own camera. Because it's literally impossible for a just-taken
| photo to appear in this database since that only contains
| already known content found in the wild. And most people's
| photos would be original content. So it would alleviate a lot
| of concern while not harming Apple's goals.
|
| If those goals are indeed what they say they are, of course.
| legrande wrote:
| > it's literally impossible for a just-taken photo to appear
| in this database
|
 | Well, it appears the CSAM scanning algo doesn't have Dost[0]
 | scanning built in, so many people will evade this 'utility'
 | made by Apple.
|
| [0] https://en.wikipedia.org/wiki/Dost_test
| fsflover wrote:
| What if you take a picture of another illegal photo?
| GekkePrutser wrote:
| I don't think this algorithm is meant to capture that
| anyway. It's relying on content staying digital. It can
| deal with cropping according to its developers but I doubt
| it will capture a photo of a photo if it really has a false
| positive chance of one in a trillion.
|
| Also, this is not a viable distribution method anyway.
| Every photo introduces more noise. Like dubbing tapes back
| in the day but worse.
| [deleted]
| heavyset_go wrote:
| Perceptual hashes between a source image and its
| derivatives will be similar if they kind of look similar
| to one another. That's the point of perceptual hashing.
| zepto wrote:
| It's not about looking 'kind of similar'. The point is to
| match images that have been resized or had contrast
| enhanced etc. That's all.
| heavyset_go wrote:
| That's literally how perceptual hashes work. Two images
| that look similar to each other will have the same, or
| similar, hashes[1][2].
|
| [1] https://news.ycombinator.com/item?id=28091750
|
| [2] https://news.ycombinator.com/item?id=28110159
| zepto wrote:
| Yes, it will match two images that look the same even
| after minor transformations.
|
| That isn't the same as saying it will match things that
| look _kind of_ the same.
| heavyset_go wrote:
| That's a distinction with no functional difference. Two
| images that were modified from a source image will
| sometimes kind of look like one another, and so will two
| images that coincidentally look like one another. The
| first two will have similar or the same hashes, as will
| the latter two. That's how you get false positives.
|
 | And it's incorrect. The way perceptual hashing works is
 | that an image is shrunk down to an 8x8 or 26x26 etc. image
 | and then transformations are applied to it to exaggerate
 | features.
 |
 | If two images look kind of the same when shrunken down,
 | they will have the same or similar hashes. If two images
 | kind of look the same when shrunken down, then their
 | parent images will also kind of look the same. (A generic
 | sketch of this follows at the end of this comment.)
|
| Please read the OPs of the two links I posted. They're
| both from people who work in this field. The latter link
| is from someone[2] who invented many perceptual hashing
| methods himself that are used widely across the industry.
| Both articles touch on this subject, and the first[1] one
| includes two photo examples. I have built products using
| these methods, and what is said by these two experts
| matches my experiences.
|
| [1] https://rentafounder.com/the-problem-with-perceptual-
| hashes/
|
| [2] https://www.hackerfactor.com/blog/index.php?/archives
| /929-On...
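 | A minimal sketch of a generic average-hash comparison, just to
 | illustrate the shrink-and-compare idea (this is not Apple's
 | NeuralHash; the Pillow-based code and file names are purely
 | illustrative):
 |
 |     from PIL import Image
 |
 |     def average_hash(path, size=8):
 |         # Shrink to a tiny size x size grayscale thumbnail; this
 |         # keeps only the coarse structure of the image.
 |         img = Image.open(path).convert("L").resize((size, size))
 |         pixels = list(img.getdata())
 |         avg = sum(pixels) / len(pixels)
 |         # One bit per pixel: brighter than average or not.
 |         return [1 if p > avg else 0 for p in pixels]
 |
 |     def hamming(a, b):
 |         return sum(x != y for x, y in zip(a, b))
 |
 |     # A small Hamming distance means the two images look roughly
 |     # the same when shrunken down, which is also exactly where
 |     # false positives come from.
 |     print(hamming(average_hash("a.jpg"), average_hash("b.jpg")))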
| zepto wrote:
| It's not a distinction without a difference. 'Kind of
| like' is going to be read by most people as 'easily
| fooled'. That is a reflection of the false positive rate.
| It all depends on the quality and tuning of the algorithm
| and how it is deployed. If you are going to imply false
| positives are common, then you need to back it up.
|
| Nobody is saying false positives are impossible.
|
| Apple is saying false positives are on the order of one
| in a trillion per user account per year. That doesn't
| sound like something that matches images that are only
| 'kind of similar'. Yes - cryptographic hashes have much
| lower false positives rates even than that, but _that_ is
| a distinction without a difference since both make the
| risk negligible.
|
| > The way perceptual hashing works is that an image is
 | shrunk down to an 8x8 or 26x26 image and then
|
| Which is it for Apple's hashes?
|
| There is no point in reading old articles about
| perceptual hashes if the conclusions don't apply to
| Apple's neuralhash algorithm. If they don't then reading
| about other hashes is just a distraction.
|
 | What can you tell us about the likelihood of _Apple's_
 | hashes creating false positives?
| heavyset_go wrote:
| > _It's not a distinction without a difference. 'Kind of
| like' is going to be read by most people as 'easily
| fooled'._
|
| I'm not really concerned with what you're afraid most
| people will think. Two images that kind of look like one
| another will have the same or similar hashes. There are
| literal examples of this in the links I posted above. And
| it's literally the point of perceptual hashing, to find
| images that look similar to a source image by comparing
| hash similarity.
|
| > _If you are going to imply false positives are common,
| then you need to back it up._
|
| I just did with two links I posted above. Twice.
|
| > _Apple is saying false positives are on the order of
| one in a trillion per user account per year._
|
| Sounds like a claim that wasn't replicated or
| independently verified. Of course Apple is going to say
| their system is nearly perfect, that's what all companies
| do. The onus is on Apple to prove that their marketing
| claims reflect reality.
|
| > _There is no point in reading old articles about
| perceptual hashes if the conclusions don't apply to
| Apple's neuralhash algorithm. If they don't then reading
| about other hashes is just a distraction._
|
| The onus is on Apple to demonstrate that their methods
| are remarkably different from the rest of the science and
| industry.
|
| This is like saying the normal principles of computing
 | don't apply to new Apple products because they might have
 | invented a brand new computing paradigm that isn't
| anything like any classical or quantum computer mentioned
| in scientific literature at all. Yeah, maybe they did,
| but it's unlikely and the onus is on Apple to prove it.
| zepto wrote:
| > I'm not really concerned with what you're afraid most
| people will think.
|
| What matters is not what I think, but whether _you_ care
| about making misleading comments.
|
| > > If you are going to imply false positives are common,
| then you need to back it up.
|
| > I just did with two links I posted above. Twice.
|
| No, you posted some links that are not about Apple's
| system, and you can't explain how they apply presumably
| because you don't understand what Apple is doing.
|
| > Sounds like a claim that wasn't replicated or
| independently verified. Of course Apple is going to say
| their system is nearly perfect, that's what all companies
| do. The onus is on Apple to prove that their marketing
| claims reflect reality.
|
| So this tells us _you_ don't know what algorithm Apple is
| using...
|
| ...And are accusing Apple of lying, when it is clear that
| you haven't read about how they avoid false positives.
|
| I think the onus is on you to prove your accusation.
|
| > This is like saying the normal principles of computing
| don't apply to new Apple products because they might have
| invented a new brand computing paradigm that isn't
| anything like any classical or quantum computer
|
 | That's just silly. It doesn't take breaking the laws of
 | quantum or classical computing to build a system with a
 | low false positive rate.
|
| One obvious way would be to leverage multiple images
| rather than just one. Increasing the sample size of a
| population sample generally reduces the false positive
| rate.
|
| Have you considered that someone could build a system
| this way?
| zekrioca wrote:
| Very patronizing of you. You clearly never learned that
| both classical (like your "sample size" example) and
| advanced statistical models are susceptible to first-
| order evasion attacks (or "Adversarial Examples") that
| fool models at run-time. But go trust Apple's
| 1/1.000.000.000.000 claims.
| zepto wrote:
| Obviously such an attack has nothing to do with the false
| positive rate.
|
| But, setting that aside, can you explain how a first
| order evasion attack can be used against Apple's
| mechanism?
|
| They are a real kind of attack in the lab, but it's not
| obvious how they could be used to exploit Apple's CSAM
| detection.
|
| If you have reason to think they are a real threat, I'm
| sure you can explain.
| shuckles wrote:
| In particular, false positives with perceptual hashes are
| not because they are similar in semantic content but
| because they are similar in whatever features the neural
| network determined stay stable across transformations.
| Your fall colors landscape photo is just as liable to be
| a neural hash match as your college sweetheart's nudes.
| zepto wrote:
| Right - which in this case are protected against by the
| visual derivative.
| teclordphrack2 wrote:
 | They make it sound like what is getting scanned is some subset
 | of the pictures that people take.
|
| In reality, with the way the new phone setup is, most people are
| sending every photo automatically to their iCloud.
|
| On top of that this is a feature that had a lot of hands touching
| it. You know there was an option for an architecture that meant
 | Apple had to spend the money doing the hash on the server side.
| They decided to pass that cost on to the consumer.
| Siira wrote:
| "Confusion" needs scare quotes badly.
| notJim wrote:
| I wouldn't be surprised at all if they actually are confused.
| Consider from Apple's perspective. They were being criticized for
| allowing CSAM to be uploaded to iCloud, since they don't scan for
| it. They see two options:
|
| * Start scanning all images uploaded to iCloud
|
| * Start scanning on-device, but only photos that are to be
| uploaded, and only alerting after some threshold is reached
|
| No matter what HN says, for them, not scanning is not an option.
| If you look at it this way, maybe the latter option looks better
| than the former?
| BoorishBears wrote:
| Can I know why not scanning is not an option?
|
| What happens, DoJ fines Apple? Police walk into Apple HQ and
| arrest Tim Cook?
|
| Not speaking rhetorically here to be clear, I'm actually
| curious why it's not an option.
| notJim wrote:
| Note that I said "for them". In other words, from their PoV.
| This is an assumption on my part. Of course you can disagree.
|
| Apple has been criticized for lax enforcement of anti-CSAM
 | policies. Facebook and Google reported more CSAM images than
 | Apple did, because Apple didn't previously scan iCloud
| images.
| BoorishBears wrote:
| Well I guess what I'm asking is why you think it's not an
| option then.
|
 | Is the criticism that they didn't report as many CSAM matches
 | per year the reason?
|
| Because I don't believe it was nearly as strong as the
| criticism this garnered. In fact at this point it's fairly
 | clear they could roll back the change and get more kudos
 | than criticism.
| notJim wrote:
| Personally, I would not want to be the major cloud photo
| provider who is most friendly to hosting CSAM. For me
| that would be reason enough. As a company, there is also
| PR risk, and risk that laws will be passed requiring
| enforcement. This could be a way to get ahead of those
| laws. But it's really wild to me that people on HN would
| be so comfortable hosting CSAM on their servers.
| BoorishBears wrote:
| This becomes a bad faith argument the moment you start
| trying to browbeat people over "being comfortable hosting
| kiddie porn."
|
| Let's pretend you didn't just do that... you're talking
| about PR risk, but here we're seeing that risk blow up
| into a full blown scandal on the other side.
|
 | Apple already was the host most friendly to CSAM, and
| the bad PR from it was quantifiable and minimal compared
| to the current PR they're getting.
|
| This isn't a new thing, encryption helps bad guys too.
| The same reason Apple was the most "friendly to CSAM" is
| the same reason any E2EE platform would be.
| notJim wrote:
| > This becomes a bad faith argument the moment you start
| trying to browbeat people over "being comfortable hosting
| kiddie porn."
|
| Fair enough, I could have phrased this more carefully.
| But my point stands, so I'll rephrase it.
|
| To put it more carefully, people here are saying they
| would prefer the tradeoff of hosting CSAM compared
| against the tradeoff of the privacy implications of
| scanning users photos when they're uploaded to iCloud. I
| personally would not make that tradeoff, as I do not want
| to host a website that distributes CSAM.
|
| > The same reason Apple was the most "friendly to CSAM"
| is the same reason any E2EE platform would be.
|
| iCloud photos are not E2E encrypted [1]. iCloud photos
| allows you to share photo albums with others and
| publicly. This is the reason I feel strongly about this,
| because if you don't scan for CSAM, iCloud will be used
 | to distribute it.
|
| [1]: https://support.apple.com/en-us/HT202303
| BoorishBears wrote:
| I'm not saying iCloud Photos is using end-to-end
| encryption.
|
 | I'm saying that the same arguments you're making against
 | their at-rest encryption scheme apply to all E2EE
 | communication.
|
| The idea being we already went through the "think of the
| kids" moment for that and now iMessage for example
| doesn't come up as being a defender of illegal content
| (at least not as often)
|
| -
|
| And your refined point isn't much better.
|
 | You're painting people who are against on-device scanning
| as being pro-hosting kiddie porn, and that's a terrible
| base for an argument.
|
| It's like saying people who are against banning matches
| are pro-forest fires.
|
| It doesn't pass a sniff test.
| neolog wrote:
| The quotation marks are on the wrong part of the title.
|
| Apple regrets "confusion" over iPhone scanning.
|
| We are not confused.
| nowherebeen wrote:
 | Apple choosing the word "confusion" is horrible. It's like a
| backhanded diss at their users.
| cwkoss wrote:
| I wonder if the false positive rate of this system is equal among
| races. This depends a lot on the algorithm being used and what's
| in the database, but we know there are significant racial
| discrepancies in image classification.
|
| for example:
|
| Are a black parent's photos of their own children more likely to
| be falsely marked as CSAM than a white parent's photos of their
| own children?
| jmull wrote:
| I believe I understand the distinction between the two features
| perfectly well.
|
| Personally, I don't have a problem with the parental control
| feature in Messages (it's pretty clear what it does and the user
| can decide whether to use it or not, or the parent for younger
| kids -- that's exactly as it should be).
|
| I do have a problem with the feature where they scan images on my
| phone to match against a database of images. To be clear, here's
| a list of things that _don 't_ make me feel better about it: that
| it scans only a certain subset of images on my phone; that the
| technology parts of it are probably good; that NCMEC maintains
| the database of images (is there any particular reason to believe
| the database is near perfect and has all appropriate quality
| controls in place to ensure it remains so?)
|
| There are several issues about this that Apple does not address.
| A big one for me is the indignity and humiliation of them force
| scanning my phone for CP.
|
| Here's a hypothetical for Craig Federighi and Tim Cook to
| consider:
|
| Suppose we know there are people who smuggle drugs on airplanes
| on their person for the purpose of something terrible, like
| addicting children or poisoning people. If I run an airport I
| could say: to stop this, I'm going to subject everyone who flies
| out of my airport to a body-cavity search. Tim, and Craig, are
 | you OK with this? If I can say, "Don't worry! We have created
 | these great robots that ensure the body cavity searches are gentle
| and the minimum needed to check for illegal drugs," does it
| really change anything to make it more acceptable to you?
| OrvalWintermute wrote:
 | I see this as fraught with constitutional issues.
|
| If Congress has to obey the Constitution, then they cannot
| create an organization which they control, and then push for
| that organization to execute functions they cannot perform by
| getting in cahoots with industry.
| kemayo wrote:
| > There are several issues about this that Apple does not
| address. A big one for me is the indignity and humiliation of
| them force scanning my phone for CP.
|
| Are you okay (conceptually, assuming a perfect database and
| hashing function) with them scanning pictures uploaded to
| iCloud for this material if the scanning happens on their
| servers? Or is this a complete "these pictures should never be
| scanned, regardless of where it happens" position?
|
 | If the former, I _personally_ don't feel a distinction between
| "a photo is scanned immediately before upload" and "a photo is
| scanned immediately after upload" is very meaningful. I'd be
| more concerned if there wasn't a clear way to opt-out. I
| acknowledge that there's room to disagree on this, and maybe
| I'm unusual in drawing my boundaries where I do.
|
| If the latter... I think that ship has sailed. Near as I can
| tell, all the major cloud platforms are scanning for this stuff
| post-upload, and Apple was a bit of an outlier in how little
| they were doing before this.
| cft wrote:
| Right, the hubris rained there after Jobs. All they have to do is
 | to explain to and educate their customers that this is fine,
 | because we are just the stupid masses.
| perardi wrote:
| _Right, the hubris rained there after Jobs._
|
| rained -> reigned
|
| It's also a bit of a laugh to suggest Jobs didn't suffer from
| hubris. See: arguably the Mac itself, $10,000 NeXT Cube, the
| Power Mac cube...really anything with cubes.
| underseacables wrote:
 | "I regret" is not the same as corrective behavior. I'm sure Jeff
| Bezos regrets that Amazon drivers have to piss in bottles, but
| that doesn't mean anything is going to change.
| kelnos wrote:
| "Confusion"? No, there's no confusion. I think we know exactly
| what Apple is doing, and we think it's bad. Nothing more
| complicated than that.
| aaaaaaaaaaab wrote:
| _They_ are confused. They thought they could get away with
| this.
| zug_zug wrote:
 | This is one of those failed apologies that's just making us
 | dislike them even more.
|
| I have no idea why they haven't done a 180 yet, this is a bigger
| failure than the butterfly keyboard. They are letting themselves
| become the symbol of technological dystopia in the public
| consciousness. Even an acquaintance who does construction was
| venting to me about how bad apple's policy is and why she is
| getting a pixel.
|
 | Even after entirely removing that feature and committing to
 | fight against that kind of future, I feel like they owe two more
 | apologies to get on my good side - one for screwing up this badly
 | in the first place and one for insulting my intelligence with
 | their handling of the outcry. This isn't 1990; you don't handwave
| a mistake this big.
| ffritz wrote:
| > I have no idea why they haven't done a 180 yet, this is a
| bigger failure than the butterfly keyboard.
|
| Look at the stock. It barely moved (up).
| d6e wrote:
| The stock isn't a like/dislike button. Apparently, the
| stockholders think that, regardless of what happens, Apple
| will still be here tomorrow. And to be fair, it's not like a
| significant portion of Apple customers will throw away their
| phones.
| kragen wrote:
| > I have no idea why they haven't done a 180 yet, this is a
| bigger failure than the butterfly keyboard.
|
| Pressure from governments.
| Fordec wrote:
| Bingo. They're not deliberately pushing this, they're just
| the public face on the initiative. You can complain about
| Apple all you like, but you're not given the choice to
| boycott the CIA.
|
| The only reason we were even told this was being introduced
| in the first place is because it's being run on edge hardware
| (ie, phones). One talk at DEFCON on weird resource/energy
 | spikes on Apple devices and its existence leaks to the public
| domain which is even worse PR. The only difference is that
| historically such government level analysis has been
| conducted behind data center black boxes.
| OneLeggedCat wrote:
| I honestly think Apple could save some face if they simply
| came out and said, "Various U.S. government agencies are
| compelling us to do this behind the scenes, and we feel we
| have no real choice."
| jjcon wrote:
| > Even an acquaintance who does construction was venting to me
| about how bad apple's policy is
|
| I overheard a group of women on the marketing team at my
 | company talking about how creepy it is, and I've started having
 | a lot of people ask me about it - it doesn't seem contained to
 | just techies at this point, but it is concentrated there. I do
 | think it will continue to grow though; Apple has lost control
 | of the narrative around their brand.
| spideymans wrote:
| On TikTok there are plenty of videos now going around saying
| "Apple is scanning your phone to report you to the
| authorities", with little to no nuance.
|
| This is really, really bad for their brand.
| zsmi wrote:
| Regret is not an apology. It means Apple is stating they are
| disappointed by the confusion, and I am pretty sure that's
| true.
| honksillet wrote:
| Why haven't they done a 180?
|
 | I speculate their hand is being forced by one or more
 | governments and, rather than admit that, they tried to sell it
 | as best they could. Just speculation.
| imglorp wrote:
| How will anyone know what the OS is doing on a locked, signed,
| opaque, device? The company can do whatever pleases its masters
| and say anything they want.
| pcdoodle wrote:
| Too late, you lost my trust. You had me at handoff and lost me
| with this.
| throwaway_apple wrote:
| Maybe I'm misinterpreting CF's explanation, but it sounds like
| the scanning does not happen on device. The neural hash is
| generated on device, both the hash and image are uploaded to
| iCloud (if you have that enabled), and the matching happens on
| the server side.
|
| This still isn't great from the perspective that scanning's
| happening, but it seems better than all your images being scanned
| server side (which all the other big cloud storage providers do),
| or all images being scanned on your device.
| bondarchuk wrote:
| So now we know that the threshold is about 30 photos. And we know
| this:
|
| >" _The threshold is set to provide an extremely high level of
| accuracy and ensures less than a one in one trillion chance per
| year of incorrectly flagging a given account._ "
|
| Does that mean they expect about one in every 2.5 people per year
| to have at least a single false positive image match? (2.512^30 =
| a trillion)
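 | A quick check of that arithmetic (plain Python, purely
 | illustrative):
 |
 |     print(2.512 ** 30)       # ~1.0e12, i.e. roughly one trillion
 |     print(1e12 ** (1 / 30))  # ~2.51, the implied per-image factor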
| andrewmcwatters wrote:
| I don't understand why they mention this probability, because
| it's useless to me without other information.
|
| "How fast were you going?"
|
| "30."
|
| "30 what?"
|
| "...Speed."
| notJim wrote:
| You need 30 matches before they do anything, so a single false
| positive wouldn't cause anything to happen.
| tehnub wrote:
| To get the probability of a single false positive match, you
| need to look at the CDF of the binomial distribution [0].
|
| Let X be the number of matches in your iCloud library. Assuming
| each photo's probability of a match is independent of other
| photos in the library (shaky assumption), then X ~ Binomial(n,
| p), where n is the number of photos in the library, and p is
| the probability of match.
|
| The free plan, which gives 5GB, will store up to 2500 photos
| taken on a 5 megapixel camera. Assuming that's the most common
| library size, n = 2500.
|
| So we need to solve for p given P(X >= 30) = 1/trillion and X ~
| Binomial(2500, p). Notice P(X >= 30) = 1 - P(X <= 29), and we
 | can use the CDF formula to get 1 - P(X <= 29) = 1 -
 | sum_{k=0}^{29} (2500 choose k) p^k (1 - p)^(2500 - k).
|
 | Set that equal to 1/trillion and solve for p. I don't have an
 | easy way to compute that by hand, unfortunately (a numeric
 | sketch follows below).
|
| [0]:
| https://en.wikipedia.org/wiki/Binomial_distribution#Cumulati...
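 | A minimal numeric sketch of that last step (assuming SciPy is
 | available; the library size, threshold, and independence
 | assumptions are the shaky ones from above):
 |
 |     from scipy.stats import binom
 |     from scipy.optimize import brentq
 |
 |     N_PHOTOS = 2500    # assumed library size
 |     THRESHOLD = 30     # matches needed before an account is flagged
 |     TARGET = 1e-12     # claimed per-account false-flag probability
 |
 |     # P(X >= 30) for X ~ Binomial(N_PHOTOS, p), as a function of p
 |     def flag_probability(p):
 |         return binom.sf(THRESHOLD - 1, N_PHOTOS, p)
 |
 |     # Find the per-photo false-positive rate p that makes the
 |     # per-account flag probability equal to the target.
 |     p = brentq(lambda p: flag_probability(p) - TARGET, 1e-9, 0.5)
 |     print(p)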
| [deleted]
| zekrioca wrote:
 | He works with software. He probably also knows that filtering
 | out specific files in a filesystem requires "scanning" all files
 | and checking which ones really represent image and video
 | blobs, presumably for CSAM post-processing. To do so, one
 | really needs to scan _everything_. The way I see it, there is
| really no confusion.
| pvarangot wrote:
| This feature runs only on the files that are uploaded to iCloud
| using iCloud Photos.
| xkcd-sucks wrote:
| Right now, Apple says they chose to configure it such that
| the feature runs only on the files that are uploaded to
| iCloud using iCloud Photos, and end users have no way to
| confirm whether this claim is actually true.
| cassianoleal wrote:
| It's also a single bug away from not being true.
|
 | Oh wait, Apple software doesn't have bugs though, right? /s
| zepto wrote:
| What kind of single bug? How can a photo upload process
| suddenly impact random files on the filesystem?
| zekrioca wrote:
| https://www.cvedetails.com/cve/CVE-2007-5037/
| zepto wrote:
| That CVE is not a link to anything relevant to the
| question. Were you aware of that when you posted it?
| shuckles wrote:
| People who sincerely believe this is accidentally
| possible should be constantly freaked out that iCloud
| Backups is "accidentally" uploading all their photos to a
| warrant accessible database.
| notJim wrote:
| It's part of the process that uploads the photos to iCloud.
| If you think through the likely implementation of such a
| thing, it would be more than a simple config change to
| change this.
|
| > end users have no way to confirm whether this claim is
| actually true
|
| This is true of all proprietary software.
| zekrioca wrote:
| It will most likely scan files in the /iCloud directory
| inside iPhone/iPad. In there, the file scan process described
| above will be executed, regardless of what the user stores
| there.
|
| I do not see how this wouldn't be easily extended to all
| mountpoints on the device. And again, one needs to have faith
| and assume the /iCloud binary information on storage is
| really physically isolated from everything else. Sorry, it is
| very unlikely they aren't really scanning, as I said,
| _everything_.
|
| Edit: clarity.
| zepto wrote:
| This is just plain bullshit. The process is well
| documented, and there is no general scan of files inside
| iCloud.
|
| In fact there is no filesystem scan at all. There is only a
| check that takes place during the upload process to iCloud
| Photo Library, which is separate from iCloud Drive.
| zekrioca wrote:
| Assume /iCloud == /iCloud/Photo Library then.
| zepto wrote:
| That makes no sense - iCloud Photo Library is a separate
| service from iCloud Drive. It isn't part of the
| filesystem.
|
| In any case the claim that the system will scan files
| other than photos chosen for upload is just a lie.
| vineyardmike wrote:
| > The process is well documented
|
| The only well documented process in tech is source-
 | available. There are documents and speculation regarding
| this process but we don't actually know the details
| beyond what they claim and they're not saying too much.
| zepto wrote:
| The information about what is checked by this mechanism
| is well documented.
|
| If you want to say that Apple could be lying or mistaken
| about what the code does, that is a different claim from
| whether they have documented what I said they documented.
| pvarangot wrote:
| There's no "most likely", it's already implemented. I just
| saw a USENIX talk from a developer and a cryptographer and
| they said it's a hook when iCloud Photos opens a file to
 | upload it to the cloud.
|
| I don't like it. I wish it had never happened. Fuck the
| government. But you are wrong and blowing this out of
| proportion.
| zekrioca wrote:
| Have you ever read about inotify/inotifywait?
| threatofrain wrote:
| Which I find really puzzling because if you announce this
| fact to people who will basically have their lives ended if
| they get caught with CSAM material... then they will be the
| ones to avoid it. While the rest of the population is being
| scanned.
| notJim wrote:
| The point is to prevent people from using iCloud to
| distribute CSAM.
| neolog wrote:
| Without the implementation, they can respond to government
| demands with "we don't have the capability to report
| dissidents."
|
| Once they have the capability rolled out, it's just a one-
| line config change to enable it.
| oozeofwisdom wrote:
| *For now
| notJim wrote:
| This is incorrect. They scan the photo as part of the iCloud
| upload process. If that process does not run for a given file
| or photo, this scanning does not run, according to the
| interview.
| ballenf wrote:
| I think the "confusion" was 100% intentional. That the two
| features (iMessage scanning & on-device spying pre-upload to
| iCloud) were intentionally released at the same time to make the
| whole thing harder to criticize in a soundbite.
|
| Confusion is the best-case scenario for Apple because people will
| tune it out. If they had released just the on-device spying,
| public outcry and backlash would have been laser targeted on a
| single issue.
| jimbob45 wrote:
| Do you have a source on the iMessage thing? I don't remember
| seeing anything about iMessage but maybe I failed to adequately
| read the press release.
| kemayo wrote:
| It's a feature that only applies to kids under 18 who're in a
| family group, whose parents turn it on. It warns the kid
| before letting them see an image which machine-learning
| thinks is nudity. If the kid is 12 or under, their parents
| can be notified if they choose to see it. It apparently does
| no reporting to anyone apart from that parental notification.
|
| Check the section "WHAT IS APPLE DOING WITH MESSAGES?" in
| this article:
| https://www.theverge.com/2021/8/10/22613225/apple-csam-
| scann...
| jchw wrote:
| Fanatics also have a tendency to try to latch onto whatever
| details may offer a respite from the narrative. The core
| problem here is that Apple is effectively putting code designed
| to inform the government of criminal activity _on the device_.
| It's a bad precedent.
|
| Apple gave its legendary fan base a fair few facts to latch
| onto; the first being that it's a measure against child abuse,
| which can be used to equate detractors to pedophile apologists
| or simply pedophiles (these days, more likely directly to the
| latter.) Thankfully this seems cliche enough to have not been a
| dominant take. Then there's the fact that right now, it only
| runs in certain situations where the data would currently be
| unencrypted anyways. This is extremely interesting because if
| they start using E2EE for these things in the future, it will
| basically be uncharted territory, but what they're doing now is
 | merely lining up the capability to do that and not
| _actually_ doing that. Not to mention, these features have a
| tendency to expand in scope in the longer term. I wouldn't call
| it a slippery slope, it's more like an overton window of how
| much people are OK with a surveillance state. I'd say Americans
| on the whole are actually pretty strongly averse to this,
| despite everything, and it seems like this was too creepy for
| many people. Then there's definitely the confusion; because of
| course, Apple isn't doing anything wrong; everyone is just
| confusing what these features do and their long-term
| implications.
|
 | Here's where I think it backfired: because it runs on the
 | device, psychologically it feels like the phone does not
 | trust you. And because of that, using anti-CSAM
 | measures as a starting point was a terrible misfire, because to
 | users, it just feels like your phone is constantly assuming you
 | could be a pedophile who needs to be monitored. It feels much
| more impersonal when a cloud service does it off into the
| distance for all content.
|
| In practice, the current short-term outcome doesn't matter so
| much as the precedent of what can be done with features like
| this. And it feels like pure hypocrisy coming from a company
| whose CEO once claimed they couldn't build surveillance
| features into their phones because of pressures for it to be
| abused. It was only around 5 years ago. Did something change?
|
| I feel like to Apple it is really important that their
| employees and fans believe they are actually a principled
| company who makes tough decisions with disregard for "haters"
| and luddites. In reality, though, I think it's only fair to
| recognize that this is just too idealistic. Between this, the
| situation with iCloud in China, and the juxtaposition of their
| fight with the U.S. government, one can only conclude that
| Apple is, after all, just another company, though one whose
| direction and public relations resonated with a lot of
| consumers.
|
| A PR misfire from Apple of this size is rare, but I think what
| it means for Apple is big, as it shatters even some of the
| company's most faithful. For Google, this kind of misfire
| would've just been another Tuesday. And I gotta say, between
| this and Safari, I'm definitely not planning on my next phone
| being from Cupertino.
| Krasnol wrote:
| > I'd say Americans on the whole are actually pretty strongly
| averse to this, despite everything, and it seems like this
 | was too creepy for many people.
|
 | You mean that country which doesn't give a damn about privacy
 | because all those fancy corps are giving them toys
 | to play with? You know, those companies which feed on the
 | world's population's data as a business model? The country which
 | has a camera on their front door which films their neighbourhood
 | 24/7? The country which has listening devices all over their
 | homes in useless gadgets?
|
| You have to be joking or that scale you impose here is
| useless.
|
| This whole thing will go by fast and there won't be much
 | damage on the sales side. Apple is the luxury brand. People
 | don't buy it for privacy. Most of the customers probably won't
 | even understand the problem here.
|
| The only thing we might be rid of are those songs of glory in
| technical spheres.
| abecedarius wrote:
| I bought my first iPhone this year, and privacy was the
| reason.
| Krasnol wrote:
| Congratulations.
|
| How did that work out for you?
| tgsovlerkhgsel wrote:
| > Apple is the luxus brand. People don't buy it for
| privacy.
|
| Privacy is the main selling point Apple is pushing in their
| current PR campaigns. They've been slowly building up a
| brand around privacy with new privacy features.
|
| They've just sunk that entire brand/campaign. Instead of
| "iPhone, the phone that keeps all your data private", it's
| "iPhone, the phone that looks through your pictures and
| actively rats you out to police to ruin your life".
| Krasnol wrote:
 | The reason they pushed privacy was the media
 | attention that Android's bad privacy got. Please don't
 | tell me you believe privacy was on the usual consumer's
 | mind when they bought their devices... this is ridiculous,
 | or you don't meet many normal users. It's marketing.
 | They'll find something new. You can fit everything in front of
 | a white background...
| Bud wrote:
| Could we not pretend, please, that the US is the only
| country with a lot of pervasive surveillance. Because
| that's clearly laughable.
| Krasnol wrote:
 | Could we not build a straw man, please.
|
| I never did that.
|
| Americans were the topic here. See quote.
| danudey wrote:
| > The core problem here is that Apple is effectively putting
| code designed to inform the government of criminal activity
| on the device. It's a bad precedent.
|
| This is wildly disingenuous.
|
| Apple is putting code on the device which generates a hash,
| compares hashes, and creates a token out of that comparison.
| That is 100% of what happens on the device.
|
 | Once the images and tokens are uploaded to iCloud Photos,
 | and 30+ of those security tokens show a match, iCloud will
 | alert Apple's team, and they will get access
| to only those 30+ photos. They will manually review those
| photos, and if they then discover that you are indeed
| hoarding known child pornography _then_ they report you to
| the authorities.
|
 | Thus, it would be more accurate to say that Apple is putting
 | on your device code which can detect known child pornographic
 | images. (A simplified sketch of the threshold logic follows at
 | the end of this comment.)
|
| > And it feels like pure hypocrisy coming from a company
| whose CEO once claimed they couldn't build surveillance
| features into their phones because of pressures for it to be
| abused.
|
| This isn't a surveillance feature. If you don't like it,
| disable iCloud Photos. Yes, it could theoretically be abused
| if Apple went to the dark side, but we'll have to see what
| this 'auditability' that he was talking about is all about.
|
| Honestly, with all of the hoops that Apple has jumped through
| to promote privacy, and to call out people who are violating
| privacy, it feels as though we should give Apple the benefit
| of the doubt at least until we have all the facts. At the
| moment, we have very few of the facts.
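 | A simplified sketch of the flag-at-threshold logic described
 | above (this ignores the private set intersection and threshold
 | secret sharing Apple describes, under which neither the device
 | nor the server learns individual match results below the
 | threshold; all names and the stand-in hash are hypothetical):
 |
 |     import hashlib
 |
 |     # Placeholder stand-ins so the sketch runs; a real system
 |     # would use a perceptual hash and real encryption.
 |     def perceptual_hash(photo_bytes):
 |         return hashlib.sha256(photo_bytes).hexdigest()
 |
 |     def encrypted_derivative(photo_bytes):
 |         return b"<encrypted low-res derivative>"
 |
 |     KNOWN_HASHES = {perceptual_hash(b"known-image")}  # server-side DB
 |     THRESHOLD = 30
 |
 |     def make_voucher(photo_bytes):
 |         # On-device: the voucher carries the hash plus an
 |         # encrypted visual derivative of the photo.
 |         return {"hash": perceptual_hash(photo_bytes),
 |                 "derivative": encrypted_derivative(photo_bytes)}
 |
 |     def flagged_for_review(vouchers):
 |         # Server-side: surface an account for human review only
 |         # when at least THRESHOLD uploaded vouchers match.
 |         matches = [v for v in vouchers if v["hash"] in KNOWN_HASHES]
 |         return matches if len(matches) >= THRESHOLD else []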
| mensetmanusman wrote:
| They created a tool that, in principle, lets a government
| ask about certain hash matches that are on the iPhone but
| not necessarily on iCloud, correct?
|
| There is no way to determine whether the hashes are about
| CP or about HK protests.
| insulanus wrote:
| > This isn't a surveillance feature.
|
| > Thus, it would be more accurate to say that apple is
| putting on your device code which can detect known child
| pornographic images
|
| > If you don't like it, disable iCloud Photos.
|
| > Yes, it could theoretically be abused if Apple went to
| the dark side [...]
|
| > [...] it feels as though we should give Apple the benefit
| of the doubt at least until we have all the facts.
|
| No, nobody gets "the benefit of the doubt". The very use of
| that phrase admits that you are being put into a situation
| where you could be screwed in the future.
|
| There is zero transparency or oversight into the code that
| does the scanning, the in-person review process, or the
| database of images being scanned for.
| spicybright wrote:
| > Yes, it could theoretically be abused if Apple went to
| the dark side, but...
|
| > ...we should give Apple the benefit of the doubt...
|
| You have to take off your apple branded rose tinted glasses
| my friend.
|
| Any company as big as apple needs to be scrutinized as
| harshly and critically as possible.
|
| Their influence on the world is so big that a botched roll
| out of this sort of tech could be absolutely devastating
| for so many people, for so many reasons.
|
 | I don't care if it's hashed tokens or carrier pigeons. We
| should only allow companies to act in ways that improve our
| lives. Full stop.
| jchw wrote:
| Describing the implementation details does nothing to
| change the reality that the device is acting as an
| informant against its owner. The number of hoops literally
| changes nothing. Adding an AI model versus using SHA sums
| changes nothing. Adding some convoluted cryptography system
| to implement some additional policy changes nothing. In
| trivial cases like anti-piracy measures or anti-cheat in
| games, we tolerate that the device will sometimes act
| against our best interest, but at least in this case, the
| stakes are low and the intentions are transparent.
|
| We have every fact we need to know to know this shouldn't
| be done, and I'm glad that privacy orgs like EFF have
| already spoken much to this effect.
| feanaro wrote:
| > Yes, it could theoretically be abused if Apple went to
| the dark side, but we'll have to see what this
| 'auditability' that he was talking about is all about.
|
| Or we can just short circuit the entire issue by deciding
| firmly we don't want this and punish Apple's behaviour
| accordingly. Which is what appears to be happening.
|
| > it feels as though we should give Apple the benefit of
| the doubt
|
| It really doesn't feel like this to me at all. Users have
| clearly stated: we don't want this. It's time for Apple to
| simply pull it all back and apologize.
| dawnerd wrote:
 | What I want to know, and maybe it's listed somewhere, is: are
 | users alerted when their photos are manually reviewed? If I get falsely
| flagged and someone looks at my photos I want to know. What are
| the security processes around the people reviewing? Are they
| employed in some low income country like most other moderation
| teams are?
| swiley wrote:
| This sounds like how your ex might regret beating you when the
| police show up.
| notJim wrote:
| I think this link that has the actual interview might be better.
| The BBC is picking quotes out of context.
|
| https://www.wsj.com/video/series/joanna-stern-personal-techn...
| foobiekr wrote:
| this interview is full of deliberately misleading statements on
| the part of Craig Federighi.
| notJim wrote:
| For example? I found it clarifying.
| intricatedetail wrote:
| Oh, we are just stupid and confused. With such a patronising
| attitude towards customers, I hope your company goes bankrupt.
| There is no place in society for such predatory business. And
| start paying your taxes!
| Klonoar wrote:
| Hot take, but: they _never_ should have released the news about
| the neural hash side of things with the iMessage child account
| scanning.
|
| Regardless of how you feel about it, both issues were being
| completely mixed up by every single person I saw discussing this
| - even otherwise very technically competent people on this very
| site.
|
| I've no doubt that it muddied the waters significantly when it
| comes to discussing this.
| acdha wrote:
| The other part was not comparing it to either existing server-
| side scanning or E2E. Maybe it's just optimism, but it seems
| like the reaction might have been different if it had been
| something like "we are currently scanning our servers. To make
| our service E2E with this narrow exception, we are moving that
| scan to the client."
| zepto wrote:
| They _aren't_ currently scanning their servers though.
| Klonoar wrote:
| I'd really like clarification on this from Apple,
| considering we know from warrants - if nothing else - that
| they have been doing this:
|
| https://www.forbes.com/sites/thomasbrewster/2020/02/11/how-
| a...
|
| Do they mean they haven't been doing it for _iCloud Photos_,
| but were arbitrarily doing it for other parts of iCloud?
| notJim wrote:
| > Do they mean they haven't been doing it for iCloud
| Photos, but were arbitrarily doing it for other parts of
| iCloud?
|
| I read elsewhere that they scanned Mail but not Photos.
| zepto wrote:
| That's a fair question. The idea that they aren't
| scanning already comes from the fact that they make very
| few reports compared to Google or Facebook. Literally a
| few hundred vs 10s of millions.
|
| If they were already scanning, you'd expect more reports
| since although there is no legal requirement to scan,
| there _is_ a legal requirement to report detections.
| shuckles wrote:
| They have explicitly said they don't scan iCloud photos
| in their interview with Tech Crunch.
| zepto wrote:
| Thanks. I didn't know they had said so explicitly. That
| is helpful extra context.
| acdha wrote:
| How do we know that, though? They're secretive enough that
| it's hard to tell. They certainly aren't reporting high
| numbers to NCMEC's tipline, although some of that might be
| the difference between human-reviewed, aggregated reports
| versus other companies' fully-automated processes making one
| report per image. But that doesn't necessarily mean they
| aren't using other channels.
|
| Which, again, really hits the need for disclosure -- so
| much of the response to this announcement has been heavily
| shaped by both that secrecy and just springing it on the
| world without much prior public recognition of this issue.
| shuckles wrote:
| They said so in an interview with Tech Crunch. The other
| articles confused iCloud Mail with iCloud Photo Library.
| notJim wrote:
| > How do we know that, though?
|
| At some level, if you're uploading files to their
| servers, you have to trust them. And to a lesser extent
| if you're using their proprietary software (although you
| can monitor network traffic and so on.)
|
| > Which, again, really hits the need for disclosure
|
| Isn't that what they did?
| acdha wrote:
| I think there was a misunderstanding: I wasn't saying
| that you don't have to trust them to use their cloud
| services but rather that it would be a surprise to me if
| they were _not_ already scanning iCloud Photos (i.e. is
| this a change from "scanned after upload" to "scanned
| before upload" or from "not scanned" to "scanned"?). I've
| always assumed that they do scan your hosted files, just
| like Dropbox, Google, etc. do.
| notJim wrote:
| In the interview, Craig discusses this. They did not
| previously scan iCloud photos because they consider it
| too invasive. They consider this less invasive, because
| Apple does not look at the content of your photo, except
| on the device. So the change was from "not scanned" to
| "scanned before upload on device".
| throwawaymanbot wrote:
| The chinafication of how big tech interacts with civilians in the
| west. We are all Chinese citizens now.
| cassianoleal wrote:
| How can this:
|
| > [Federighi] said it would do the image-matching on a user's
| iPhone or iPad (...)
|
| be reconciled with this:
|
| > Mr Federighi said the "soundbyte" that spread after the
| announcement was that Apple was scanning iPhones for images.
|
| > "That is not what is happening," he told the Wall Street
| Journal.
|
| without at least one of them being a blatant lie?
|
| Is it the tense of "was" in "Apple was scanning (...)" as opposed
| to "will start to scan"?
| notquitehuman wrote:
| He was lying. That's the face Craig makes when he's lying.
| karaterobot wrote:
| I assumed the hair they're splitting is that it's your own device
| that's doing the scanning, and not "Apple". Meaning, their
| iCloud servers won't scan your photos. They want people to read
| that as "oh, I thought my pictures were going to get scanned
| without my say so, I guess that was just wrong and this is a
| false alarm", but what is actually going to happen is that
| _you_ are going to scan your pictures, then send Apple a hash.
|
| To be clear, this is a distinction without a meaningful
| difference. Or, if there is a difference, it's that it's
| actually worse than the alternative (cf. the Stratechery
| article that's been making the rounds).
|
| If that's right, then this isn't a lie, but it's incredibly
| mealy-mouthed, misleading, and disrespectful of their
| customers' intelligence.
| breck wrote:
| "Critics have said the database of images could be corrupted,
| such as political material being inserted...Federighi said the
| database of images is constructed through the intersection of
| images from multiple...organizations...He added that at least two
| "are in distinct jurisdictions.""
|
| Oh that's a relief. Good luck trying to get 2 intelligence
| agencies to cooperate.
|
| /s
|
| https://en.wikipedia.org/wiki/Five_Eyes
| systemvoltage wrote:
| The best part about this is that it's a non-partisan issue and
| pretty cool to see people rise up against this overreach.
| JoeyJoJoJr wrote:
| One point that I haven't seen mentioned is that pedophiles
| probably aren't going to be using iOS devices for very long. They
| will catch wind very quickly and adapt.
| cwkoss wrote:
| Do you think pedophiles were previously uploading their photos
| to icloud? Seems implausible except for the stupidest ones.
| [deleted]
| croes wrote:
| So they regret we aren't buying their excuses and explanations
| and not the wrongdoing itself. Seems more like Apple is the one
| confused.
| boublepop wrote:
| Apple got what they wanted and what they needed. Each year
| Facebook scans and flags tens of thousands of pictures with child
| pornography, the majority of which got to Facebook through an
| iPhone. Apple flags fewer than a hundred each year. There is
| definitely political pressure for Apple to do more. So what do
| they need? A thousand mainstream news articles explaining a
| massive backlash to them doing any sort of scanning on their
| devices from across the globe.
|
| They can stand back and say "we just can't do anything, the users
| won't have it" while Facebook keeps drowning in political
| pressure despite doing a thousand times better than Apple.
| cwkoss wrote:
| Companies transferring encrypted data are not responsible for
| its contents. Full stop.
|
| Apple shouldn't do anything, because their duty is to report
| the CSAM that is visible to them and no private data should be
| visible to them.
|
| Facebook has those images in the clear. They aren't doing "a
| thousand times better", they have an infinite amount more
| unencrypted images.
|
| This is a reductive apples to oranges comparison that misleads
| anyone who reads it.
|
| Would you similarly argue that the postal service needs to open
| every letter and inspect it to ensure there aren't photos of
| child porn contained within? Should uber drivers be required to
| search every passenger and their bags for child porn?
|
| NO! Because that's private, and we respect privacy in this
| country.
| lvxferre wrote:
| I see - Apple is now gaslighting users.
| zugi wrote:
| This should really be "Apple Regrets Clarity over 'iPhone
| Scanning'".
|
| "Confusion" is what they're trying to sow now.
| arecurrence wrote:
| Fundamentally, I think this along with encrypting iCloud backups
| is strictly a win for customer privacy. This set of data is
| subject to scanning seconds later already. However, charging
| someone simply because these hashes detected illicit material is
| a scary reality.
|
| Everyone's heard about SWATting... get ready for CSAMming. I
| don't even know where to begin with services floating around,
| like Pegasus, that rootkit a phone. Got a major business deal a
| rival is about to close... CSAM their negotiators and win the
| contract.
|
| I'm sure there are variations that wipe themselves without a
| trace after delivering their payloads.
| cwkoss wrote:
| I bet state sponsored hackers already have CSAMing capability.
| This change will make them much more effective and streamline
| prosecution of their victims.
| daxuak wrote:
| Even for the child protection purpose alone... the 30-photo
| threshold thing is meaningless unless the false-positive rate of
| the hash-matching process becomes transparent. To achieve a good
| recall rate, the hashing has to be in feature space instead of a
| plain md5 on the jpeg file (otherwise any compression or metadata
| change to the file would render the reference dataset
| meaningless), and I don't think anyone can promise that this has
| no false alarms, i.e. you take a pic of your child playing in the
| pool but accidentally get a hash collision.
|
| Of course this is not the point. But skimming through the article
| I'm not impressed by these mostly irrelevant bits either.
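|
| For concreteness, here is a tiny sketch of that md5-vs-feature-
| space difference. It uses a generic perceptual hash (pHash from
| the imagehash library, which is not Apple's NeuralHash) and
| assumes a hypothetical local file photo.jpg:
|
|     import hashlib, io
|     from PIL import Image
|     import imagehash  # generic perceptual hash, not NeuralHash
|
|     img = Image.open("photo.jpg").convert("RGB")  # hypothetical file
|     buf = io.BytesIO()
|     img.save(buf, format="JPEG", quality=70)      # re-encode the bytes
|     recompressed = Image.open(io.BytesIO(buf.getvalue()))
|
|     # Exact file hash: any re-encoding or metadata change breaks it.
|     print(hashlib.md5(open("photo.jpg", "rb").read()).hexdigest())
|     print(hashlib.md5(buf.getvalue()).hexdigest())
|
|     # Perceptual hash: computed in feature space, so it stays the
|     # same (or nearly so) after the re-encode -- which is also why
|     # it can collide on unrelated images.
|     print(imagehash.phash(img))
|     print(imagehash.phash(recompressed))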
| backtoyoujim wrote:
| I have scanned the phone for sorrow and the phone has scanned
| back at me I AM SORROWFUL
| mlazos wrote:
| It's classic politics to me: if you have to explain yourself,
| you've already lost. Did Apple really think the average person
| would hear "scanning all devices" and think "oh, they're using
| device-local keys to keep the data on the device, that's fine"?
| I'm honestly shocked at Apple's expectation that this would go
| over well.
| Miner49er wrote:
| One thing I haven't seen mentioned, isn't this basically required
| by law in the US now? Doesn't FOSTA-SESTA make Apple legally
| liable if they permit these images to touch their servers?
| isx726552 wrote:
| This is a pathetic response. The CEO should be out there front
| and center with the press and the public explaining this. Having
| a VP do a spin interview with the (paywalled) WSJ and
| (mis)characterizing all the concerns as mere "confusion" is
| nonsense.
|
| Privacy has been presented as a top-line feature by Apple for
| many years now. By announcing this feature they have betrayed any
| trust they may have built. The CEO remaining silent is the icing
| on the cake.
|
| What value can Apple offer now? The Privacy story is done. Do
| they have anything else?
| newsbinator wrote:
| Having Tim Cook come out for damage control on any topic would
| tank the stock price. It would almost always be a VP doing a
| spin interview.
| orange_puff wrote:
| Hypothetical: suppose that this scanning program only ever did
| what Apple said it was going to do, look for known CSAM. Would
| this still be upsetting? I am trying to parse whether the blowback
| to this announcement is rooted in the tech community's ideal of
| near-perfect privacy, or if instead it's a reaction to what this
| tech could potentially be used for.
|
| I don't find the following argument compelling: because this tech
| will be used to scan for known CSAM, it will necessarily one day
| be used to scan for non-CSAM. If Apple can implant this tech on
| your iPhone now, it always could have, and therefore the threat of
| the government coercing Apple to scan all images for whatever
| pernicious reasons they can think of has always existed.
|
| CSAM is a massive problem. The solution to how we deal with it
| will be nuanced and plagued with tradeoffs, but I refuse to be an
| extremist for either side. I do want something done about CSAM,
| which is why I am happy that Facebook reports over 10 million
| instances of it per year from messenger. I also want devices to
| be mostly private (to assume that a device manufactured by a
| large corporation would ever be perfectly private in the internet
| age is delusional). But anyone who acknowledges that CSAM is a
| problem must also acknowledge that some sacrifice of privacy
| would be necessary to mitigate it. Or, perhaps one day we can
| rely on homomorphic encryption to deal with this.
| xur17 wrote:
| > Hypothetical: suppose that this scanning program only ever
| did what Apple said it was going to do, look for known CSAM.
| Would this still be upsetting? I am trying to parse whether the
| blowback to this announcement is rooted in the tech community's
| ideal of near-perfect privacy, or if instead it's a reaction to
| what this tech could potentially be used for.
|
| No, because there will always be false positives, which means
| someone is going to be manually reviewing your photos.
| akomtu wrote:
| Translating this corpspeak to plain language: "Apple regrets its
| own lack of integrity, but shareholders want more profits and gov
| wants more control, so Apple will return to this idea half a year
| later, rebranded as a tool to combat terrorism."
| [deleted]
| tines wrote:
| > The system could only match "exact fingerprints" of specific
| known child sexual abuse images, he said.
|
| It has to match the fingerprint exactly, but the fingerprints
| themselves are not exact, otherwise they would be useless.
|
| And this is completely beside the point. People's concerns aren't
| mostly over false positives, they're over the possibility that
| this feature will be perverted by authoritarian governments. Way
| to miss the point.
|
| > Mr Federighi said the "soundbyte" that spread after the
| announcement was that Apple was scanning iPhones for images.
|
| > "That is not what is happening," he told the Wall Street
| Journal.
|
| That's... exactly what's happening.
| [deleted]
| zepto wrote:
| What authoritarian governments are people concerned about? I
| don't think this makes any difference in a place like China.
| swiley wrote:
| It does make a difference in a place like the UK or New York.
| [deleted]
| zepto wrote:
| What difference? Are you suggesting the UK or New York have
| authoritarian governments?
| Aaargh20318 wrote:
| Are you suggesting they don't ?
| zepto wrote:
| I'm not suggesting anything. I want to understand what
| swiley meant by their comment.
| [deleted]
| notJim wrote:
| The distinction he's making (which I realize you will likely
| not find satisfactory) is that they aren't proactively scanning
| all of the photos on your device or in your photo library. The
| scan happens as part of the pipeline that uploads images to
| iCloud.
| Youden wrote:
| I feel like that's a pretty weak distinction given Apple's
| push to have you upload everything to iCloud.
| notJim wrote:
| I agree, but I don't think that's an excuse to
| mischaracterize what the feature does.
| [deleted]
| [deleted]
| UseStrict wrote:
| > Mr Federighi said the "soundbyte" that spread after the
| announcement was that Apple was scanning iPhones for images.
|
| But that's exactly what's happening? Most people using an iPhone
| sync photos with iCloud (especially after they introduced the
| more cost-effective 2TB Apple One plan), images are scanned
| before they are uploaded to iCloud, ergo Apple will be scanning
| the iPhone for images.
| balozi wrote:
| Apple software chief Craig Federighi's only regret appears to be
| that their users are too dumb to grasp how Apple is enhancing
| user privacy by exposing their data.
| erdos4d wrote:
| There was no confusion, this is a turnkey surveillance system
| whose scope will expand to whatever those with power over Apple
| decide is taboo. I think we all got the message loud and clear.
| emko7 wrote:
| They say no, but they already give up privacy to many
| authoritarian governments like China... they're already scanning
| iCloud data.
| clarkrinker wrote:
| Where do the hashes come from? I assume the system is designed to
| minimize the number of people who have to look at the CP, but how
| do they audit that someone isn't inserting hashes to look for
| dissidents?
| notJim wrote:
| It seems to come from images reported to this group
| https://www.missingkids.org/ through their CyberTipline.
|
| Edit: toward the end of the interview, Craig says the database
| can be audited. Obviously not the actual images, but people can
| verify that the list is the same across all countries, for
| example.
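|
| One hedged guess at what such an audit could look like in
| practice (purely illustrative; the file names and the mechanism
| are assumptions, not anything Apple has published): compare a
| digest of the hash-database blob shipped to devices in different
| regions.
|
|     import hashlib
|
|     def db_digest(path: str) -> str:
|         # Digest of an on-device hash-database blob (hypothetical file).
|         with open(path, "rb") as f:
|             return hashlib.sha256(f.read()).hexdigest()
|
|     # Hypothetical blobs pulled from devices sold in different regions.
|     regions = ("us", "de", "cn")
|     digests = {r: db_digest("csam_db_" + r + ".bin") for r in regions}
|     print(digests)
|     print("identical everywhere:", len(set(digests.values())) == 1)
|
| Of course, this only shows that the blobs are identical; it says
| nothing about what the hashes inside them actually point to.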
| heavyset_go wrote:
| > _how do they audit that someone isn 't inserting hashes to
| look for dissidents?_
|
| They don't. They expect you to just trust them.
| emko7 wrote:
| Next up, the government will say "well, the system exists and we
| need to catch X", until we end up going after political opposition.
| wilg wrote:
| So many people in this thread are convinced this whole thing is
| intentionally malicious, that Apple is doing this because they
| want to enable government spying, and they are intentionally
| using child sex abuse as a way of trying to make it palatable in
| a PR battle.
|
| I don't think that is the most likely situation at all.
|
| Apple has been, as part of their privacy initiatives, trying to
| do as much as possible on the device. That's how they have been
| defining privacy to themselves internally. Then someone said "can
| we do something about CSAM" and they came up with a pretty good
| technical solution that operates on device and therefore, to
| them, seemed like it would not be particularly controversial.
| They've been talking about doing ML and photo scanning object
| recognition on device for years, they're moving much of Siri to
| on-device in iOS 15, all as part of their privacy initiatives.
|
| It seems to have backfired in that people actually seem to prefer
| scanning in the cloud to on-device scanning for things like this,
| because it feels less like a violation of your ownership of the
| device.
|
| I think the security arguments about how this system can be
| misused are compelling and it's a fine position to be strongly
| against this, but I don't know that there's good justification
| that Apple has some ulterior motive and is faking caring about
| privacy. I think they were operating with a particular set of
| assumptions about how people view privacy that turned out to be
| wrong and they are genuinely surprised by the blowback.
| firebaze wrote:
| That's one of the few good aspects of a legendary fuck-up like
| this: you learn about people defending it.
|
| People defending CSAM should go to hell, fast. But are we
| already done destroying all the low-hanging fruit? Did we stop
| Jimmy Savile? Did we put all clerical actors behind bars? Did
| we extinguish the child porn network behind Marc Dutroux
| (https://en.wikipedia.org/wiki/Marc_Dutroux)?
|
| And even if we did, would that be enough of an excuse to
| implicitly accuse anyone? My spouse's family (well-off, so
| using iDevices) took photos of their young kids playing,
| partially naked at the sea. They are now _frightened_ that their
| photos could be stolen by someone and marketed as child porn.
|
| So unbelievable.
| notJim wrote:
| > They are now frightened that their photos could be stolen by
| someone and marketed as child porn.
|
| That sounds bad, someone high up at Apple should do an
| interview clarifying that that's not what's happening!
| cwkoss wrote:
| If you don't think this is possible, you are not
| understanding the technical implementation they announced.
|
| Perceptual hashes on chunks of images will yield false
| positives.
| system2 wrote:
| Doesn't matter how technically well done this is. I do not want
| my device to poke my files and send them to AI software to
| make a decision. It makes me uncomfortable.
|
| This is malicious. I do not want them to touch my photos or
| anything personal. I paid for this device, now it is doing
| things against my will.
| jachee wrote:
| It doesn't happen against your will. You still have full
| control over whether scanning happens.
|
| Simply disable iCloud Photo Library, and nothing gets
| scanned.
| asddubs wrote:
| for now
| jachee wrote:
| Exactly! There has been so much FUD, conspiracy theory, and
| fear-mongering.
|
| None of the usual anti-regulation apologists have pointed out
| that Apple shouldn't be forced to download and host
| potentially-illegal material in the interest of determining
| whether or not it's actually illegal.
|
| This whole program is their intelligent solution to protecting
| as much user privacy as possible while still being compliant
| with the law. On-device hashing is actually _pro_ privacy
| compared to in-the-cloud scanning (which all other cloud
| hosting providers are also required to do).
| m-ee wrote:
| This is not about compliance, the relevant law specifically
| says that companies are not required to proactively scan for
| CSAM.
| sagarm wrote:
| > because it feels less like a violation of your ownership of
| the device.
|
| Agreed that this was not intended to be malicious. Apple has
| always been pretty clear that they should decide what happens
| on their devices. This sort of on-device scanning that doesn't
| serve the user is just the latest example of it, and one that
| people who would never be affected by the code signing
| restrictions can relate to.
| insulanus wrote:
| Actually, I'm with you in that I think Apple's motives are
| different than people think, but I think your guess is incorrect
| as well.
|
| > Apple has been, as part of their privacy initiatives, trying
| to do as much as possible on the device. That's how they have
| been defining privacy to themselves internally. Then someone
| said "can we do something about CSAM" [...]
|
| As a separate issue, many people in the company certainly do
| care about privacy, and that may go all the way up to Tim Cook.
| Who knows.
|
| What is much more important to Apple the company, though, is
| making money. Governments have been hounding them for years
| about letting them spy on users. And they have painted
| themselves into a bit of a corner, by having the most secure
| phones.
|
| Now the government comes to them with an offer they can't
| refuse, cloaked in child porn motivations. I believe many
| (most?) of the people involved are sincere. It's clear they
| have tried to make the least invasive system that still does
| what the government wants.
|
| But that's not good enough in the crazy connected cyber-world
| we find ourselves in today.
|
| Apple doesn't have a motivation to do this themselves. But they
| will do what they calculate they need to do.
| cwkoss wrote:
| Apple has a huge potential profit motive. Once they roll this
| out for US users, they can sell the exact same capabilities
| to authoritarian regimes for detecting subversive images,
| images of warcrimes, etc.
|
| China would happily pay billions of dollars per year for this
| capability.
| ipv6ipv4 wrote:
| I agree it is likely not malicious at all. It's the result of
| koolaid in an echo chamber. And inertia at this point.
|
| However, I also think this is the poster child of the proverb
| that the road to hell is paved with good intentions.
|
| Now Apple needs to cancel this misguided initiative and never
| speak of it again, if they want to salvage some of their
| reputation.
| willio58 wrote:
| I agree with you. I'm all for privacy but have no issues
| whatsoever with companies scanning for CSAM. I do not feel this
| is an invasion of my privacy, because I know how hashing works
| and I know I do not have CSAM on my device.
| blintz wrote:
| Do you know how NeuralHash works? NeuralHash is _not_ a
| cryptographic hash. Unless you're an Apple employee, you
| can't know - the model is private and not inspectable.
| newsclues wrote:
| Outrage is not equal to confusion.
| mensetmanusman wrote:
| Apple: "we aren't going to scan your phone"
|
| >>
|
| Apple: "we are going to make a tool that can scan your phone"
|
| >>
|
| Apple: "Sorry, the government is forcing is to use this tool to
| scan your phone"
| aborsy wrote:
| I will absolutely not tolerate on-device scanning.
|
| I will drop Apple if they proceed, and spread the word as much as
| I can.
| stakkur wrote:
| "Confusion"
|
| "Misunderstanding"
|
| "The screeching voices of the minority"
| mnd999 wrote:
| The arrogance here is next level. Nobody is confused, you're just
| wrong.
| throw7 wrote:
| "The system could only match "exact fingerprints" of specific
| known child sexual abuse images, he said."
|
| This disinfo really angers me. That is the exact opposite of what
| I've read up till now. People talking about "NeuralHash" and
| being able to detect if the image is cropped/edited/"similar". So
| what is the truth?
| JohnCurran wrote:
| That "exact fingerprint" is, in my opinion, intentionally
| confusing.
|
| This DaringFireball[0] article states the goal of the system is
| to "generate the same fingerprint identifier if the same image
| is cropped, resized, or even changed from color to grayscale."
|
| So while the fingerprint may be "exact", it's still capable of
| detecting images which have been altered in some way.
|
| [0]
| https://daringfireball.net/2021/08/apple_child_safety_initia...
| elliekelly wrote:
| Does it make a difference? My iPhone shouldn't do anything to
| or with my photos unless and until I direct it to. Scanning,
| hashing, whatevering -- Apple doesn't get to decide to do any
| of it. I do. And only I do.
| patrickthebold wrote:
| If I had to guess, cropping and other transformations result in
| the same (exact) fingerprint. So different images but same
| fingerprints.
|
| Of course, that's just a nasty way to imply that the images
| match exactly.
| gizdan wrote:
| The truth is they're rephrasing what was already known. They're
| going to match fingerprints of pictures against a
| database. Every picture. This was widely reported. What confusion
| they're referring to I don't know, because they're saying
| exactly what has been reported.
| notJim wrote:
| > Every picture
|
| This is the confusion: it's only photos being uploaded to
| iCloud.
| salamandersauce wrote:
| So almost every picture. Isn't it the default to upload all
| photos to iCloud on iOS devices? Doesn't it even helpfully
| remove photos that aren't used as much to make room for new
| ones?
| kemayo wrote:
| I can't find an answer about whether it's the default
| nowadays. You certainly used to have to turn it on --
| e.g. the Apple support page on the feature tells you how
| to do so: https://support.apple.com/en-us/HT204264
|
| That said, the argument that many people in these threads
| are making is that they say it's reasonable to scan
| photos that are uploaded once they're on Apple's servers,
| they just don't want them scanned while they're still on
| their phones. In either case, the same photos will be
| scanned -- ones which are in the process of being
| uploaded to iCloud -- the disagreement is just about
| exactly when in said process it's okay to do so. Which
| seems like a pretty fine distinction to me?
| totetsu wrote:
| Just wait till you download a meme image to upload to your
| reaction-meme folder on iCloud, where some troll has kept the
| background of some CSAM image and edited meme text over the
| bits that might have made you aware of its origins. Will that
| match?
| laurent92 wrote:
| Who cares, the NCMEC database is certainly full of unreviewed
| material, given even their employees can't automatically have
| access to it. For any dystopian state, the goal is to have as
| many false positives as possible in the NCMEC database, to be
| able to legitimately have your photos uploaded to their
| headquarters.
| btown wrote:
| It's all on pages 4 and 5 of https://www.apple.com/child-
| safety/pdf/CSAM_Detection_Techni...
|
| > The main purpose of the hash is to ensure that identical and
| visually similar images result in the same hash, and images
| that are different from one another result in different hashes.
| For example, an image that has been slightly cropped or resized
| should be considered identical to its original and have the
| same hash. The system generates NeuralHash in two steps. First,
| an image is passed into a convolutional neural network to
| generate an N-dimensional, floating-point descriptor. Second,
| the descriptor is passed through a hashing scheme to convert
| the N floating-point numbers to M bits. Here, M is much smaller
| than the number of bits needed to represent the N floating-
| point numbers. NeuralHash achieves this level of compression
| and preserves sufficient information about the image so that
| matches and lookups on image sets are still successful, and the
| compression meets the storage and transmission requirements
|
| Just like a human fingerprint is a lower-dimensional
| representation of all the atoms in your body that's invariant
| to how old you are or the exact stance you're in when you're
| fingerprinted... _technically_ Federighi is being accurate
| about the "exact fingerprint" part. The thing that has me and
| others concerned isn't necessarily the hash algorithm per se,
| but rather: how can Apple promise to the world that the data
| source for "specific known child sexual abuse images" will
| actually be just that over time?
|
| There are two attacks of note:
|
| (1) a sophisticated actor compromising the hash list handoff
| from NCMEC to Apple to insert hashes of non-CSAM material,
| which is something Apple cannot independently verify as it does
| not have access to the raw images, which at minimum could be a
| denial-of-service attack causing e.g. journalists' or
| dissidents' accounts to be frozen temporarily by Apple's
| systems pending appeal
|
| (2) Apple no longer being able to have a "we don't think we can
| do this technically due to our encryption" leg to stand on when
| asked by foreign governments "hey we have a list of hashes,
| just create a CSAM-like system for us"
|
| That Apple must have considered these possibilities and built
| this system anyways is a tremendously significant breach of
| trust.
| LeifCarrotson wrote:
| He carefully avoided saying that the image itself is the same.
| The exact fingerprint is the same, yes, but the fingerprint is
| just a hash of the actual image. Disinformation indeed!
|
| The whole point of the system is that you get a matching hash
| after mirroring/rotating/distorting/cropping/compressing/
| transforming/watermarking the source image. The system would be
| pretty useless if it couldn't match an image after someone,
| say, added a watermark. And if the algorithm was public, it
| would be easy to bypass.
|
| The concern, of course, is that all of this many-to-one hashing
| might also cause another unrelated image to generate the same
| fingerprint, and thereby throw an innocent person to an
| unyielding blankface bureaucracy who believes their black-box
| system without question.
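|
| A back-of-the-envelope birthday-bound calculation of that
| many-to-one risk (the hash widths and photo counts below are
| made-up round numbers, and it assumes uniformly distributed hash
| values, which a perceptual hash deliberately is not):
|
|     # Expected number of colliding photo pairs for an M-bit hash
|     # over n photos, assuming uniformly distributed hash values.
|     def expected_collisions(n_photos: float, hash_bits: int) -> float:
|         return n_photos ** 2 / 2 ** (hash_bits + 1)
|
|     n = 1e9 * 1e3   # made-up: a billion users, 1,000 photos each
|     for bits in (64, 96, 128):
|         print(bits, "bits ->", expected_collisions(n, bits), "pairs")
|
| The worry voiced here is exactly the non-uniformity: visually
| similar but unrelated images collide far more often than this
| idealized bound suggests.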
| mLuby wrote:
| > Apple decided to implement a similar process, but said it
| would do the image-matching on a user's iPhone or iPad,
| before it was uploaded to iCloud.
|
| Is this list of hashes already public? If not, seems like
| adding it to every iPhone and iPad will make it public. I get
| the "privacy" angle of doing the checks client-side, but it's
| a little like verifying your password client-side. I guess they
| aren't concerned about the bogeymen knowing with certainty
| which images will escape detection.
| occamrazor wrote:
| The hashes on device are encrypted. iPhone users do not
| have access to the unencrypted version.
| 734129837261 wrote:
| It simply means that they can have whatever the hell kind of
| method they use to identify specific images, and the scary
| part is: there IS an error-margin built-in because otherwise,
| as you said, this tech would be pretty useless.
|
| "Find all images and tag them if they look like this
| fingerprint" doesn't mean that. It means: "Find all images
| and tag them if they look 80% like this fingerprint".
|
| Which also means that it will allow governments to upload
| photographs of people's faces and say: "Tag anyone who looks
| like this".
|
| Worse, this will allow China to track down more Uyghurs and find
| people based on guides, in the form of images, that are spread
| around to stay safe from the Chinese government. And
| countries like Saudi Arabia can start looking for phones with
| a significant number of atheist-related images, tracking down
| atheists, and killing them. Because that's what that country
| does.
| izend wrote:
| The CCP has had access to iCloud in China for multiple
| years...
|
| https://www.reuters.com/article/us-china-apple-icloud-
| insigh...
| intricatedetail wrote:
| These perceptual hashes do have a high number of false
| positives. That's why they employ AI to discard images that
| don't have certain features from the pool to minimise the
| risk. But that method in general without actual human
| checking manually is a recipe for disaster.
| jachee wrote:
| This is why there's a _threshold_ of matches "on the order
| of 30+" before _anything_ is sent to the system for further
| review.
| blintz wrote:
| The simple summary is: NeuralHash is _not_ a cryptographic hash
| function. It's a private neural network trained on some
| images. We have no guarantees of its difficulty to reverse,
| find collisions for, etc. The naming of it as a 'hash' has
| confused people (John Gruber's post comes to mind) into
| thinking this is a cryptographic hash. It simply is not.
| ddlutz wrote:
| And we all know software never has bugs. Somebody is going to
| get arrested over this feature for some benign photo one day, I
| guarantee it.
| nomel wrote:
| How so? It would require passing the threshold to get human
| review, so actual material + false flags > threshold. This
| should probably result in the person getting in trouble. The
| case of false flags > threshold should not result in any
| trouble since it would then go through human review.
| foobiekr wrote:
| An exact match of a perceptual hash is basically deliberately
| misleading. The entire point of a perceptual hash is that there
| are an almost unlimited number of images which it will match
| "exactly."
|
| But hey, I'm just one of the screeching voices of the minority.
| intricatedetail wrote:
| It will also match completely different images; that's why
| there is a "neural" bit to discard images that e.g. don't have
| nudity from the pool of matches.
| pseudalopex wrote:
| The neural bit doesn't do that.[1] Maybe you got it mixed
| up with the iMessage nudity detection.
|
| [1] https://www.apple.com/child-
| safety/pdf/CSAM_Detection_Techni...
| intricatedetail wrote:
| Did you read it? They use a neural network to discard false
| positives, because a perceptual hash alone is not reliable.
| It's pretty much the same concept I described.
| pseudalopex wrote:
| Please quote what you think supports your claim.
|
| "Indeed, Neural-Hash knows nothing at all about CSAM
| images. It is an algorithm designed to answer whether one
| image is really the same image as another, even if some
| image-altering transformations have been applied (like
| transcoding, resizing, and cropping)."[1]
|
| [1] https://www.apple.com/child-
| safety/pdf/Security_Threat_Model...
| eloisant wrote:
| Company does a bad thing.
|
| Customers get angry.
|
| Company: "I'm sorry you misunderstood me!"
| beervirus wrote:
| There was no confusion here. Everybody I've read was talking
| solely about that first feature.
| InternetPerson wrote:
| When trying to figure out what the truth is, it's important to
| keep in mind two things: (1) Corporations never lie! and (2) Once
| a corporation enacts a policy, that policy will never change!
|
| So you see, Apple is only going to scan certain things at certain
| times under certain conditions. So we can all relax now, OK?
| vmception wrote:
| Hey @dang, can you search for an exact hash of users that sign in
| from Apple campuses and corporate VPNs and show it next to their
| username?
| tharne wrote:
| I think the problem Apple ran into was that there was no
| confusion at all. Apple announced they were going to scan users'
| devices after years of marketing themselves as a "privacy-
| focused" company. Shockingly, customers were pretty mad about the
| whole thing.
| xibalba wrote:
| A true story...
|
| Me (Last month): "Apple is taking privacy very seriously. I'm
| going to vote with my dollars and switch from Android."
|
| Me (This month): "..."
| godelski wrote:
| Honestly, I was going to make the switch with the next gen of
| phones (been Android since the get-go). Glad I waited. At least
| a Google phone I can flash.
| hypothesis wrote:
| There was no confusion at all.
|
| There is no way Apple released their initial PR piece without
| thinking it through and deliberately fusing all those new
| features together as one big unassailable initiative. It was
| typical my way or the highway.
|
| Which also makes it funny now that they attempt to distinguish
| between them and run into the same hole that they dug for other
| people.
|
| [1] https://www.apple.com/child-safety/
| echelon wrote:
| It's good because now Apple employees have a ton of reasons
| to question their employer and quit.
|
| Apple:
|
| - Isn't going to be remote work friendly.
|
| - Shut down internal polls on compensation.
|
| - Bows to the FBI, CIA, FSB, CCP.
|
| - Treats its customers as criminals.
|
| - Treats its employees as criminals.
|
| - (Spies on both!)
|
| - Doesn't let customers repair their devices or use them as
| they'd like.
|
| - Closes up (not opens up) the world of computing. Great
| synergy with the spy dragnet.
|
| Take your time and talent elsewhere. This bloated whale is
| bad for the world. There are a lot of good jobs out there
| that pay well and help society.
| kblev wrote:
| This is all so true, all please quit Apple. (so there will
| be some openings for me)
| system2 wrote:
| Why would you want to work there? For money?
| recursive wrote:
| Everyone who's working a non-volunteer position is doing
| it for the money. So, obviously, yes.
|
| There's a facade that we really work for other reasons,
| and money is just an inconvenient byproduct. During a job
| interview, you may be asked "Why do you want to work for
| us?". And for some reason "So I can afford to buy food"
| is not a good answer.
| megablast wrote:
| > For money?
|
| As opposed to what??? Free apple stickers??
| system2 wrote:
| There is something called "company culture". I changed
| jobs just because of that before. Instead of 200k, make
| 120 and be much happier somewhere else. Mental
| health is more important than money after a certain amount
| of it.
| hughrr wrote:
| As someone who doesn't work for Apple I wouldn't work for
| Apple on principle even if they tripled my salary today.
| There are some lines that none of us should cross. There
| needs to be an ethical code for software engineering.
| orasis wrote:
| OK? So why do you want the privacy focused employees to
| quit? It sounds like that would only make the problem
| worse.
| aesh2Xa1 wrote:
| Quitting IS a form of protesting the administrative
| decisions. Joining a company that does respect privacy IS
| a form of exercising one's own values in one's
| employment.
| whoaisme wrote:
| LOL all that silly bravado and you still didn't answer a
| straightforward question
| ksec wrote:
| >There is no way Apple released their initial PR piece
| without thinking it through and deliberately fusing all those
| new features together as one big unassailable initiative.
|
| Something I bet wouldn't have happened when Katie Cotton was
| in charge. But yeah. Tim Cook thought he needed a new PR
| direction. And that is what we got. The new Apple PR machine
| since 2014.
| Bud wrote:
| This is inaccurate by definition, of course. Obviously. "My
| way or the highway" implies there is no alternative.
|
| But in this case, of course, if you're an adult, the Messages
| part of this doesn't apply to you at all, and the photos part
| can be completely avoided by not using iCloud Photos.
| gary17the wrote:
| No offense, but have you even owned an iPhone/iPad for a
| considerable length of time? The darn things include a maze
| of settings that are inter-dependent and unexpectedly
| lose/alter their values; perhaps not on a regular basis, but
| always once in a while (e.g., with a new iOS version). If
| file scanning and reporting capability is present, code-
| wise, on your device, you can consider it active - sooner
| or later.
| shapefrog wrote:
| No offense, but are you mentally retarded? Can you not
| figure out how to click a slider? My 95 year old
| grandmother, who was schooled to the age of 14, has
| figured it out.
| flyinglizard wrote:
| Sorry for commenting on a comment, but it was so
| hauntingly offensive that it wrapped around to the
| poetic. It reminds me of when BMW designed their
| motorcycles to be so ugly they'd find beauty of their
| own, K1200R for example [0]
|
| [0] https://ibb.co/zZMtqQk
| shapefrog wrote:
| Why thank you. There is a beauty of its own in that
| K1200R - except for the headlamp, that is not beautiful,
| no offense intended of course.
|
| My knowledge in the space is limited to the GS range,
| having been privy to a few stories of romance between
| rider and bike while crossing continents. A beauty of its
| own.
| christkv wrote:
| For now
| slg wrote:
| >There was no confusion at all.
|
| I don't know what you and tharne are talking about here.
| There was definitely confusion. HN is a tech forum and I
| still saw plenty of people here worried about how they would
| get in trouble for having innocent photos of their own
| children on their phone. You are allowed to be against
| Apple's plan while still recognizing that many people didn't
| understand what exactly was part of that plan.
| hypothesis wrote:
| I'm sorry, at what point there was any confusion that Apple
| is going to use _our_ phones to do scanning?
| slg wrote:
| That is the _where_ of the story. There was, inarguably,
| confusion over the _when_, _what_, and _how_ of the
| story.
|
| It was not universally understood that this would only
| apply to photos sent to iCloud.
|
| It was not universally understood that this was only
| looking for previously known CSAM.
|
| It was not universally understood that they were using
| some sort of hash matching so photos you took yourself
| would not trigger the system.
|
| I understand if you consider the _where_ more important
| than the others, but it is simply a fact that there was
| confusion on what exactly was happening here.
| hypothesis wrote:
| I appreciate your more detailed clarification and agree
| with 'where' conclusion.
|
| To the extent that other parts of this story were
| explained to us by Apple, I did try to clarify some
| exaggeration in another thread.
| asddubs wrote:
| There's always going to be plenty of people commenting
| who didn't even bother to read the article at all. But by
| and large, from what I saw, people did understand the
| nuances of this and outlined how little stands in the way
| of expanding these policies' scope once the technology is
| in place.
| tomp wrote:
| Your problem is trusting evil people at face value.
| cwkoss wrote:
| I think "universally understood" is doing a lot of work
| to portray a much higher degree of certainty about each
| of those statements than is justified.
|
| A lot of the contention wasn't about the specifics of
| their plan, but rather how subtle changes could vastly
| expand the scope of their plan.
|
| "this would only apply to photos sent to iCloud." for
| now, until scope creeps.
|
| "this was only looking for previously known CSAM." for
| now, until scope creeps.
|
| "using some sort of hash matching so photos you took
| yourself would not trigger the system." Well, this one is
| immediately concerning even within the claimed scope, because
| there _ARE_ going to be false positives that Apple
| records in some database. Millions of iPhone users are going
| to have a non-zero "possible childporn" score.
|
| They are building an engine for iPhone users to self-
| incriminate. If they rigidly hold the scope to only what
| they announced and never expand, it could be argued that
| this is a reasonable concession to fight CSAM. However,
| in making the announcement, they boldly stepped past
| their existing hard line in privacy (local device content
| is private and not surveilled by apple), so it seems
| naive to expect that this announcement reflects the
| eventual scope of this self-incrimination engine for the
| next decade of Apple updates.
| slg wrote:
| >A lot of the contention wasn't about the specifics of
| their plan, but rather how subtle changes could vastly
| expand the scope of their plan.
|
| The how helps show us how changing this system is not a
| subtle change. It isn't like they can flip a switch and
| suddenly they are identifying new suspected CSAM on
| people's phones. That would require a new system since
| the current one is only hash matching.
|
| >However, in making the announcement, they boldly stepped
| past their existing hard line in privacy (local device
| content is private and not surveilled by apple), so it
| seems naive to expect that this announcement reflects the
| eventual scope of this self-incrimination engine for the
| next decade of apple updates.
|
| This is an arbitrary line that is being drawn. These are
| photos that are marked for sending to iCloud. Whether the
| scanning happens on the phone before they are sent or in
| the cloud after they sent is largely immaterial when it
| comes to the impact of the code. People are acting as if
| the line Apple drew was motivated by technology. That was
| never the deciding factor. Technology is the easy part
| here. That line was only a policy line and that policy
| has not changed. Only photos that are sent to iCloud are
| scanned. If you fear Apple changing that policy going
| forward, you should have always feared Apple changing
| that policy.
| insulanus wrote:
| You forgot the _who_
|
| - Whose fault is it that those points were not clearly
| communicated?
|
| - Who wrote the perceptual hash matching code?
|
| - Who is allowed to audit the code, the review system,
| and the hash database?
|
| - Who updates this code?
|
| - Who decides if your phone OS is updated?
|
| - Who decides the iCloud upload defaults?
|
| - Who decides if you are reported?
|
| - Who asked for this feature?
| romwell wrote:
| - Who is going to have their dog killed by a misguided
| SWAT team that doesn't bother analyzing the automated
| report before acting on it?
| romwell wrote:
| It was not universally "understood" because it was not
| universally _agreed upon_. To wit:
|
| * It was not universally understood that this would only
| apply to photos sent to iCloud.
|
| Since the scanning doesn't happen on iCloud, this
| distinction is irrelevant.
|
| "We are going to intrusively scan the subset of your
| photos that you care enough to back up to the cloud that
| we've been pushing to you for years" is pretty clear.
|
| * It was not universally understood that this was only
| looking for previously known CSAM.
|
| It was only looking for whatever is in an opaque database
| which, according to a third party we don't have any
| contract with, contains CSAM.
|
| * It was not universally understood that they were using
| some sort of hash matching so photos you took yourself
| would not trigger the system.
|
| Yeah right, I feel totally safe knowing that I won't be
| falsely reported to the FBI by "some sort of" hash
| matching.
|
| Here's a hash function: f(x) = 0 for all x
|
| It's "some sort of" hash, too.
| lstodd wrote:
| > some sort of hash matching so photos you took yourself
| would not trigger the system.
|
| This is ignorance in extreme.
| notriddle wrote:
| Nobody's confused about that.
|
| > HN is a tech forum and I still saw plenty of people
| here worried about how they would get in trouble for
| having innocent photos of their own children on their
| phone.
|
| They're confused about this. NeuralHash doesn't look for
| pictures of naked kids. It looks for pictures that are
| identical to the ones they've put in their signatures
| list.
|
| The problem is that Apple claims that the signatures in
| their list are all pictures of sexually-abused kids, but
| we have no way of verifying that. Heck, _they don't even
| have any way of verifying that_. Everyone just has to
| take NCMEC's word for it.
| cwkoss wrote:
| Perceptual hashing can have collisions, and they will occur
| at a higher rate than exact "identical" matching would.
|
| The public does not know what the false positive rate is
| for 'average iPhone user pictures'. As engineers we can
| be certain the false positive rate is not zero. This
| means that some number of iPhone users are going to have
| non-zero "possible child pornographer" scores in the
| Apple database.
|
| The false positive rate is crucial to understanding how
| concerning this should be. If the average iPhone user has
| 1000 photos, and the false positive rate is the claimed 1
| in 1 trillion, there is a 1 in a billion chance that
| you'll be flagged as a potential child pornographer (~1
| person in the world would be falsely accused). This seems
| reasonable enough with the Apple-internal screening step.
|
| If the chunking and perceptual hashing functionally ends
| up having a much higher false positive rate for images
| which have similarities to the dataset (parents' pictures
| of kids playing shirtless, legal adult porn, etc), the
| false positive rate could actually be more like 1 in 1
| million. In which case there are potentially hundreds of
| thousands of people who will be falsely accused by this
| system.
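|
| Those per-user numbers follow from a simple binomial model. A
| quick sketch using the figures assumed above (not published
| rates), which also shows how much a 30-match threshold would
| change the picture if false positives really were independent:
|
|     import math
|
|     def p_flagged(n_photos: int, fp: float, threshold: int) -> float:
|         # P(at least `threshold` false matches) for one user, with
|         # independent per-photo false-positive probability fp.
|         return 1.0 - sum(
|             math.comb(n_photos, k) * fp ** k * (1 - fp) ** (n_photos - k)
|             for k in range(threshold)
|         )
|
|     # Note: probabilities below float precision will print as 0.0.
|     for fp in (1e-12, 1e-6):   # the two rates assumed above
|         print(fp, ">=1:", p_flagged(1000, fp, 1),
|               ">=30:", p_flagged(1000, fp, 30))
|
| The weak point is the independence assumption: if false positives
| cluster on certain kinds of photos, the threshold helps far less.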
|
| How many matches will US judges require before they sign
| warrants for arrests, search and seizure of digital
| devices? If they are technically competent it shouldn't
| only be 1, but I don't trust all judges to understand
| probability well enough to require multiple matches.
| hypothesis wrote:
| Even if we put aside collisions, the new system features
| 'Synthetic Match Vouchers', which seemingly add
| noise into the actual CP counter.
|
| I have yet to understand what happens to people who only have
| those synthetic positives. Regardless of what the counter
| threshold is, can't those people be hoovered up by a
| subpoena for counter > 0?
| cwkoss wrote:
| Oh yikes, I didn't realize they were handing out fake CP
| points to preserve privacy.
|
| That's really really really user-hostile design.
| FireBeyond wrote:
| > How many matches will US judges require before they
| sign warrants for arrests, search and seizure of digital
| devices? If they are technically competent it shouldn't
| only be 1, but I don't trust all judges to understand
| probability well enough to require multiple matches.
|
| Even that doesn't come without issue. How long before '1'
| becomes the value? Because if, say, the number is
| ten, there's also a horrendous PR spin of "Apple has a
| high degree of suspicion that you have CSAM on your
| device, but since there are only 8 images, they won't do
| anything about it" - "Apple allows up to X non-reported
| CSAM images on Apple devices" is hard to represent in any
| positive fashion.
| lazide wrote:
| You're also wrong, no? The perceptual hashing doesn't
| match EXACTLY the same photos; it intentionally matches
| photos similar to those photos, so a minor rotation by a
| few degrees or crop or whatever also matches.
| benhurmarcel wrote:
| Why does it matter that much which CPU runs the check?
| mixmastamyk wrote:
| True, but a small point of contention compared to the
| introduction of on-device scanning for illegal activities,
| the policies of which could be changed at a moment's
| notice. Other details are relatively unimportant.
| sharken wrote:
| This is in essence the problem with the new on-device
| initiative from Apple (calling it a tool is rather
| misleading).
|
| If allowed to go forward, it is only a matter of time
| before the capability is expanded.
|
| So it's a big no to the scanning capability, you would
| think that Apple had gotten the message by now.
|
| And the other initiative is also open for abuse, by
| allowing the device administrator to spy on the user.
| Admittedly not as bad as the on-device scanning.
| smnrchrds wrote:
| I don't think that was confusion either, because there were
| discussions and articles on how a hash collision is possible
| in the scenario you mention due to the way perceptual
| hashes work.
| slg wrote:
| They have said that the system won't be triggered on a
| single image. You would need to have multiple photos on
| your phone experience this hash collision which drops the
| odds of false positives considerably.
|
| EDIT: It has now come out that you need to trigger the
| system 30 times before Apple acts on it. I can't imagine
| the odds for someone to have 30 hash collisions.
| smnrchrds wrote:
| So they understand that their system is very susceptible
| to false positives, but they are saying their clients
| shouldn't worry because the black-box hash gets compared
| with a black-box unauditable threshold, both of which are
| solely determined by Apple. I don't think the reaction
| was due to any confusion. People understood what Apple
| was trying to do and realized how much it sucked from a
| technical perspective.
| slg wrote:
| Where did "very susceptible to false positives" come
| from? If the odds of a collision are one in a million
| that is troublesome if they only need one match. If they
| ignore anyone that has less than 3 matches, we don't
| really have to worry about false positives. People who
| have CSAM generally don't have only 1 or 2 images.
| indymike wrote:
| There was zero confusion. Apple's new feature was
| universally rejected and if it comes to market will cause a
| severe loss of market share.
| OrvalWintermute wrote:
| I do think it will result in a loss of market share if it
| comes to pass, for the simple reason that Apple will
| likely lose the Privacy Moat.
|
| Upcoming contenders like Purism [1] and the Pine Phone
| [2] will start gaining a great deal more traction from
| this. Other SV firms will sense business opportunity.....
| If merely 5% of the TAM around mobile is willing to
| prioritize non-spying features that would be enough to
| stand up very healthy businesses.
|
| It isn't like an iPhone is very customizable, repairable,
| or that usable with all the app restrictions and walled-
| garden stuff.
|
| [1] https://puri.sm/products/librem-5/
|
| [2] https://www.pine64.org/pinephone/
| jsjohnst wrote:
| > Upcoming contenders like Purism [1] and the Pine Phone
| [2] will start gaining a great deal more traction from
| this.
|
| I'll bet you $500 to the charity of your choice that this
| won't come to be. Set the terms on how you want to
| measure the outcome.
| mdoms wrote:
| Year of the Linux ~desktop~ telephone.
| [deleted]
| quickthrowman wrote:
| > and if it cone to market will cause a severe loss of
| market share.
|
| AAPL closed 90 cents short of an all time high share
| price today. Why isn't the market pricing in the loss of
| market share?
| megablast wrote:
| I doubt that it was universal. Nothing ever is.
|
| But I haven't seen any positive discussions about it,
| which is odd.
| geoduck14 wrote:
| Case in point, I'm pretty apathetic about the whole thing
| defaultname wrote:
| Severe loss of marketshare? It wouldn't even register.
|
| I don't like the feature. Putting this on the client
| device is dubious and should never have made it past the
| brainstorming stage.
|
| Having said that, technology companies, big and small,
| are bound in the US to do this. By law. If anything Apple
| was by far the laggard of the bunch (with reporting
| counts magnitudes lower than peers, despite a larger
| customer base). As I said in another comment, no company
| can protect you from your government.
|
| Much has been made about it being on device, which while
| a serious optics issue...the hot takes being given on
| here are manifestly absurd. Like, literally the company
| that holds all of your data, all of your passwords, all
| of your info and you need to invent slippery slopes to
| imagine up what they "might" do?
|
| If they want to have their way with your data, they could
| have been doing it for decades.
|
| They should never have announced two very different
| systems at the same time. Contrary to some of the
| insincere claims given in this very thread, there is
| _massive_ disinformation and confusion about them. In the
| end I feel like 98% of the "the end is nigh!" comments
| are by long time Apple detractors who just see this
| glorious opening.
|
| And while I still hope that Apple says "Mea culpa, we're
| just going to scan on the ingress to iCloud Photos",
| whatever they do in a month this is going to be
| completely forgotten.
| Terretta wrote:
| > _Putting this on the client device is dubious_
|
| Putting it server side is categorically worse. Putting it
| in the client SDK for iCloud (architecturally speaking)
| rather than on cloud storage or in the OS is clearly the
| better technical choice, tying surveillance's
| hands in a way server side or OS would not.
|
| Most every client SDK routinely checks content before
| upload, it's a best practice. Careful examination
| suggests this was engineered better than that practice.
|
| (Note: even tech trade posts such as LWN, Stratechery, or
| Daring Fireball trying to write well about this need to
| sit down a minute and have how it _actually_ works walked
| through for them, as do many in this community.)
|
| FWIW, I agree with much of the rest of your post except
| the rationale for low reporting counts.
| slg wrote:
| Tech folks drastically overestimate how much the average
| person cares about privacy.
|
| Plenty of people believe that the Facebook and Instagram
| apps are recording audio 24/7 and target ads at you based on
| the speech the apps hear. That doesn't stop people from
| using the apps.
|
| A few years ago some of the most famous people in the
| world had their iCloud accounts hacked and had their
| naked photos leaked. That is a lot of people's worst
| fear. People literally commit suicide over this sort of
| thing. It didn't hurt the iPhone's market share.
|
| People largely don't care.
| pseudalopex wrote:
| > Having said that, technology companies, big and small,
| are bound in the US to do this. By law.
|
| Other companies scan their servers instead. And what law
| banned E2E encryption?
| cm2012 wrote:
| Yeah, I dislike Apple and want them to have less
| marketshare, but I doubt they will even lose 1% of
| revenue over this.
| mdoms wrote:
| This is misinformation.
| notJim wrote:
| There is a good amount of confusion throughout this very
| thread about what this is.
| OrvalWintermute wrote:
| The confusion was about the pushback. They expected a 2 foot
| wave, and they are getting a tsunami.
|
| We drank the Apple Privacy Kool-aid, and now we are holding
| them to it.
|
| This is totally a battle worth fighting!
| tiahura wrote:
| "They expected a 2 foot wave, and they are getting a
| tsunami."
|
| Are you sure? My local Apple store is just as crowded as it
| was two weeks ago.
| pgt wrote:
| I for one am looking at alternatives.
| dunnevens wrote:
| I am too. I haven't pulled the trigger yet, but I'm
| thinking about one of Google's Pixels. The reason for
| that specific product line is that they are well
| supported by a wide range of de-Googled Android variants.
| I'm leaning towards CalyxOS, which seems to have the best
| mix of privacy, security, and ability to use apps from
| the Play Store. But GrapheneOS looks tempting too.
|
| I already own a Pinephone but it's not at a point where
| I'd want to use it as a daily driver. But they're only
| $150-$200, so worth taking a chance if you don't want an
| Android alternative. You may end up liking it. I do know
| people who are using it daily. It's just not for me. Not
| yet.
|
| If you want to look into the Android alternatives
| further, this HN discussion about CalyxOS went into some
| great detail about that OS, and about other alternatives
| too.
|
| https://news.ycombinator.com/item?id=28090024
| fsflover wrote:
| Here you go: https://puri.sm/products/librem-5 and
| https://pine64.org/pinephone
| dane-pgp wrote:
| You write a comment like this in every single discussion
| about privacy on smartphones. And I upvote you every
| time. Keep up the great work!
| Bud wrote:
| Good luck with that. Every alternative is an order of
| magnitude worse and also less honest about it.
| formerly_proven wrote:
| Maybe a smartphone is not worth it if the price is so
| high.
| Syonyk wrote:
| Indeed.
|
| For the past week (entirely related to this being a
| kicker of a motivation on top of a bunch of other
| simmering long term concerns over Apple and the tech
| industry in general), I've been carrying around a Nokia
| 8110 4G - also known, for very understandable and valid
| reasons, as "The Bananaphone." It's quite literally
| curved and bright yellow.
|
| The world hasn't ended yet...
|
| It's a bit less of a step for me than other people
| because I'm already pretty cell-phone hostile. My iPhone
| (I regret buying a 2020 SE to replace my 6S under the
| assumption that the 6S wouldn't get iOS 15, which it's
| getting... maybe...) was pretty well nerfed to start with
| - very few apps, literally the only apps on my homescreen
| were person to person or group chat apps (Signal,
| iMessage, Google Chat, and the Element Matrix client,
| plus phone, browser, and camera in the bottom).
| Everything else had to live in the app library thing,
| which increased friction to use it, and I really didn't
| have much on there.
|
| But that has been shut down except for a few 10-15 minute
| windows the past week, and I've been trying, very hard,
| to work out the transition back to a "dumbphone" (or, as
| we used to call them, a cellphone).
|
| The main pain point so far is that all my messaging apps
| used to come to a central point on my phone - so if
| someone wanted to contact me, it didn't matter what they
| used, it would ping me if I had my phone on me. Now,
| that's split (my wife is the main party impacted, I'm
| pretty high lag on other platforms anyway). If I'm out
| and about, I can get SMS, but not Matrix/Signal/Chat. If
| I'm in my office, I can get all of them, but would rather
| not have a long conversation over T9 - except some of
| them don't do a great job of notifying me, depending on
| what machines are running and muted at any given time.
| Etc. I'm still working this out, and some of it is simple
| enough - add audio notifications to my "Chat Pi" by
| wiring in a speaker instead of relying on my phone to
| chirp if I get a message in Chat or element. That my M1
| Mac Mini is going out the door at some point gives me
| added motivation to solve this.
|
| When out and about, I do at least have the option of
| tethering to the banana - so I could carry some other
| device that handles more than the phone does (which
| seriously isn't much). I'm debating between going back to
| a small tablet (2nd gen Nexus 7 would be a perfect form
| factor), or something like a YARH (http://yarh.io) of
| some variety - a little Pi based mobile computer thing
| that is exceedingly "We didn't invent smartphones" punk.
|
| I'm at a point in my life (professionally, socially,
| culturally, etc) where I can happily do "You're weird...
| whatever..." sort of things with regards to technology,
| and I'm going to pull the thread until I either figure
| out alternatives, or determine that they simply don't
| exist and I can't live without them.
| dane-pgp wrote:
| > also known, for very understandable and valid reasons,
| as "The Bananaphone." It's quite literally curved and
| bright yellow.
|
| I thought you were going to say it's cellular, modular,
| interactivodular.
| 3000000001 wrote:
| The price being what exactly? That you'll get caught for
| storing CSAM in the device maker's cloud?
|
| I think the pros list stays longer than the cons list.
| aesh2Xa1 wrote:
| Your rebuttal is, at best, a specific, straw man instance
| of "If you were doing nothing wrong then you have nothing
| to hide."
|
| I needn't be holding child pornography to be concerned
| about a third party viewing my photos, writing, or other
| media on a device that is just mine and not published,
| public content.
| mrzimmerman wrote:
| The CSAM check is against a hash database. The images are
| converted to a hash and then compared to the hashes of known
| child sexual abuse images, not directly viewed.
|
| The weirdly less discussed aspect of this is that anyone
| who is storing their images of any kind on someone else's
| computer and network thinks that nothing could have been
| viewed before. If Apple or Google or Amazon want to scan
| the data you store with them they could be doing it, so
| if that was a concern for a person from the get go then
| they wouldn't have been storing their data with third
| parties to begin with.
| Syonyk wrote:
| It's not _just_ this. This is a major push, certainly,
| but... as we come up on about a decade of smartphones
| being more than "that weird nerd phone one person I know
| has" it's worth stepping back and looking at the benefits
| and costs.
|
| Where you put these will depend on your view on a lot of
| the issues, certainly.
|
| But, in the past decade:
|
| - Every interaction with your primary device is now, by
| default, an opportunity for aggressive data collection,
| often in ways even the people who write the software
| don't know (because they rely on tons of other libraries
| and toolkits that are doing this quietly under the hood).
|
| - The default is now that you use a smartphone for
| everything, with the desktop experience limited or turned
| into a crappy version of the smartphone version (Image!
| Video! Scroll, scroll, scroll, never stopping, always
| seeing more ads! Text, who cares about that ancient
| stuff?)
|
| - The default has gone from "If you're alone in a social
| space, you talk to other people" to "You stare at your
| phone." Certainly was a trend before, with the
| Walkman/iPod/etc, but it accelerated dramatically.
|
| - Everything has been turned into either a subscription
| service, or a "Free-to-play" world in which the goal is
| addiction and microtransactions.
|
| There are plenty of benefits of smartphones, but
| culturally we're exceedingly bad at looking at the
| opportunity costs of new technology, and they're
| increasingly becoming harder to ignore.
|
| If you can honestly evaluate the device and decide it's a
| net positive, great. But I know an increasing number of
| people, myself included, who are evaluating them and
| saying, "You know, never mind. They're not worth the
| downsides."
| Wowfunhappy wrote:
| Unfortunately, we're so far down the path that I no
| longer have a choice.
|
| I'm starting graduate school in the fall. A few weeks
| ago, I went in to pick up my new college ID card. The
| security guard would not let me into the building until I
| downloaded an app called "Everbridge" on my phone and
| used it to answer a series of health screening questions
| (ie, have you tested positive for COVID in the past 14
| days).
|
| The app was for iOS and Android. There was no web
| version. There was no option to fill out a paper form. I
| was not warned in advance. But I guess it wasn't a
| problem for anyone (including me), because who the heck
| doesn't have a smartphone? It's like having a wallet now
| --an expected requirement for modern life, even in
| situations when an analog solution could have worked just
| as well.
| Syonyk wrote:
| So what would they do if you emptied your pockets out and
| demonstrated that you _did not have a smartphone_? You
| pulled out the candy bar or the flip phone?
|
| Again, I'm at a point where I can be a thorny pain in the
| ass about stuff like this, but you carrying a smartphone,
| even though you (presumably?) know it's evil means that
| people can do things like this - expect you to download
| some large blob of unknown code that you're going to run.
|
| As long as they don't encounter people who literally
| can't comply, it's fine. It works for them.
|
| I mean, I would have refused to download an unknown app
| I'd never heard of, but... if I pull out a clearly-not-a-
| smartphone, what are they going to make me do? Go down
| the street to Best Buy, buy a phone, and come back?
| Wowfunhappy wrote:
| They wouldn't have let me into the building. Yes, I
| assume they wouldn't have retracted my acceptance and we
| would have made some arrangement, but I have better
| things to deal with in my life. I'm on a (Jailbroken)
| iPhone, so the app should at least be sandboxed--I'm not
| entirely sure what I would have done on Android.
| godelski wrote:
| Honestly what do you need in a smartphone? Good camera?
| IDK about you but all I use it for is texting, calling,
| taking pictures, and maybe checking Hacker News while I'm
| standing in line. 100% of phones above the $500 mark are
| going to fit 100% of people's needs for people like me.
| Let's be honest, those needs are camera and battery life.
| What do you need that is the latest and greatest? I am
| willing to bet that this is fine for 90% of people,
| including us here.
|
| And we're on Hacker News. People know about ROMs and how
| to use them. Get a Pixel and throw Lineage onto it. It'll
| be at minimum $100 cheaper and the specs are pretty damn
| close (minor trades in either direction).
| laserlight wrote:
| Integration between Apple devices makes the experience
| greater than the sum of its parts. If someone switches
| from iPhone they'll lose the ability to use iMessage on
| their phone, to receive SMS messages on their Mac, to
| sync Notes, Calendar, Photos, etc. That's why the
| alternatives are an order of magnitude worse for me.
| godelski wrote:
| What? I can do all this without Apple. I mean I might
| have to browse photos.google.com instead of opening up my
| photos folder on my desktop but that's not meaningfully
| different. I have all these things between an Android
| phone and a Linux computer. It may be in different
| locations than where Apple has them, but everything does
| sync if I want it to. I can even sync all these things
| without Google if I want to and have them go into the
| corresponding folders on my desktop. How is this an order
| of magnitude? The same services exist.
| mapgrep wrote:
| /e/os seems reasonably good for phones. It's far from an
| iPhone or even stock new Android but not an order of
| magnitude worse and none of this file scanning.
| hypothesis wrote:
| This was an earthquake that caused a tsunami...
|
| On a side note: I went to Apple's site trying to find the
| page for all those new features and I could not find one
| (at least by going to obvious places). The only way I was
| able to link it in my posts was by googling it... this
| whole thing is not yet obvious to laypeople.
| jjcon wrote:
| Agreed - even if Apple doesn't back down, giving them hell
| would make other companies less likely to follow suit. This
| is a very important line in the sand that they have crossed
| zepto wrote:
| They aren't scanning users' devices. If you think this, there is
| definitely confusion in the information getting out.
| RussianCow wrote:
| You're splitting hairs unnecessarily. Apple is scanning
| users' photos on their devices. To say that they are not
| "scanning devices" because they are (currently) only
| targeting photos and not every single other part of the phone
| is unhelpful at best, and detracts from the point that this
| is a massive violation of their users' privacy. The exact
| wording here really doesn't matter as much as you think it
| does.
| david_shaw wrote:
| _> They aren't scanning users' devices._
|
| They are scanning images on iPhones and iPads prior to
| uploading those images to iCloud. If you're not uploading
| images to iCloud, your photos won't be scanned -- but if you
| are using iCloud, Apple will absolutely check images on your
| device.
|
| From Apple's Child Safety page:
|
| _> Apple's method of detecting known CSAM is designed with
| user privacy in mind. Instead of scanning images in the
| cloud, the system performs on-device matching using a
| database of known CSAM image hashes provided by NCMEC and
| other child safety organizations. Apple further transforms
| this database into an unreadable set of hashes that is
| securely stored on users' devices.
|
| > Before an image is stored in iCloud Photos, an on-device
| matching process is performed for that image against the
| known CSAM hashes. This matching process is powered by a
| cryptographic technology called private set intersection,
| which determines if there is a match without revealing the
| result. The device creates a cryptographic safety voucher
| that encodes the match result along with additional encrypted
| data about the image. This voucher is uploaded to iCloud
| Photos along with the image._
|
| Source: https://www.apple.com/child-safety/
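To make the flow in that excerpt concrete, here is a minimal sketch in Python. The hash function, database contents, and voucher format below are all invented stand-ins (Apple's NeuralHash and private set intersection are far more involved); it only shows the shape of "match on device, reveal nothing outside the voucher":

    import hashlib
    from dataclasses import dataclass

    # Stand-in for a perceptual hash; the real system uses NeuralHash, not SHA-256.
    def image_hash(image_bytes: bytes) -> str:
        return hashlib.sha256(image_bytes).hexdigest()

    # Hypothetical on-device copy of the known-CSAM hash database.
    KNOWN_HASHES = {"aaaa...", "bbbb..."}

    @dataclass
    class SafetyVoucher:
        image_id: str
        match_info: bytes  # in the published design this is cryptographically blinded

    def prepare_for_upload(image_id: str, image_bytes: bytes) -> SafetyVoucher:
        h = image_hash(image_bytes)
        matched = h in KNOWN_HASHES
        # In the real design neither the device nor the server learns `matched`
        # directly; private set intersection hides it inside the voucher until a
        # threshold of matching vouchers exists. Shown in the clear here purely
        # for illustration.
        return SafetyVoucher(image_id, f"{h}:{matched}".encode())

    # The voucher is then uploaded to iCloud Photos alongside the image itself.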
| zepto wrote:
| > Apple will absolutely check images on your device.
|
| Yes, they will check the images you have chosen to upload.
| No 'scanning of the device' is involved.
|
| Claiming this is 'scanning users' devices' is just dishonest
| - it's obvious that it creates a false impression of what
| they are actually doing.
|
| Don't do that.
| zekrioca wrote:
| Check == Scanning, because to create the output, the hash
| function needs to "scan" the whole blob.
| zepto wrote:
| Even if we accept that. It is a lie to say _the device_
| is being scanned. It is definitely not. Only the photos
| the user chooses to upload are checked. That is not _the
| device_.
| zekrioca wrote:
| _It is_ part of the device, and this specific part is
| being scanned. Can I physically _remove_ this "checking"
| part and end up with a working iDevice D that resembles
| D = Device \ {iCloud Photo Library}?
|
| Frame it the way you want. It is the device.
| zepto wrote:
| The checking is being done on the device. Nobody disputes
| that. Indeed it is being _marketed_ by Apple as a
| feature. Yes, this feature is part of the device.
|
| If you say Apple is scanning the device, you are lying.
| They are not scanning the device. They are scanning
| photos chosen for upload.
| stale2002 wrote:
| So then they are scanning photos on your device.
|
| I'd call that photo scanning... and they are scanning the
| photos on the device.
| zepto wrote:
| Yes, Apple would agree with you, but I assume you would
| not call it 'scanning the device'.
| RussianCow wrote:
| At what point would you consider it "scanning the
| device"? What if they start scanning messages? Browsing
| history? Downloads? Where do you draw the line?
| zepto wrote:
| Do you think they are scanning the device, or just the
| photos being uploaded?
|
| This isn't some ambiguous case that needs to be addressed
| philosophically. They aren't scanning anything other than
| the photos being uploaded.
| RussianCow wrote:
| I don't think the distinction matters.
| zepto wrote:
| Why not? Don't you think people should understand the
| difference?
| psyc wrote:
| > _They are scanning photos chosen for upload._
|
| This is about the 16th time I have seen language just
| like this used to explain away this concern. I don't know
| if you realize, but this wording makes it sound like you
| can select some photos and leave others local. I can find
| no indication anywhere, including on my phone, that
| iCloud Photos is anything other than an All Or Nothing
| singular toggle in iCloud settings. If you have
| instructions to the contrary, I will be happy to stand
| corrected.
|
| Seriously, _everybody_ is wording it like this. "Photos
| you choose..." and similar.
| zepto wrote:
| You choose to use iCloud photos. There are several
| competitors.
| [deleted]
| zekrioca wrote:
| Another commenter put it in better terms, so you may
| understand it:
|
| _Suppose we know there are people who smuggle drugs on
| airplanes on their person for the purpose of something
| terrible, like addicting children or poisoning people. If
| I run an airport I could say: to stop this, I'm going to
| subject everyone who flies out of my airport to a body-
| cavity search. Tim and Craig, are you OK with this? If I
| can say, "Don't worry! We have created these great robots
| that ensure the body-cavity searches are gentle and the
| minimum needed to check for illegal drugs," does it
| really change anything to make it more acceptable to
| you?_
| jodrellblank wrote:
| They're not misunderstanding it; you're deliberately
| using an inaccurate description to mislead people while
| trying to hide behind "technically not lying", and
| they're calling you out on it.
| zekrioca wrote:
| Am I, though? Is Apple? Is the parent commenter? It seems
| "their" (whoever you meant) interpretation of what does and
| doesn't constitute something is looser than my
| interpretation.
| zepto wrote:
| You said Apple was scanning the device. They aren't. This
| is what you are being called out on.
| zekrioca wrote:
| A-ha, so you meant that the device is scanning itself,
| and not Apple? Clever, very clever technicality.
| jodrellblank wrote:
| What you're doing is changing "Two Americans run their
| homes on solar panels" into "American homes run on solar
| panels" with the intent of fudging the quantity so that
| readers assume it means most or all of them, while being
| able to argue "they are American homes, plural, so it's
| correct".
|
| "Device scans photos" and "Apple scans device" imply two
| very different things about how much is scanned, and
| you're using the latter because you know that if you
| describe it accurately readers won't be as panicked as
| you want them to be.
| zepto wrote:
| I haven't said anything about what is acceptable.
|
| I was just pointing out a falsehood you wrote about what
| is actually being done.
| zekrioca wrote:
| No falsehoods, it is the device, even though it is only a
| specific part of it. I know you got the point I tried (or
| rather, the other commenter) to make about the part of
| someone's body meaning "the whole" of a person. The same
| philosophical view can be applied to the device.
|
| Anyway, someone in here can accept what the other can't,
| so let's leave it at that and let history tell.
| [deleted]
| zepto wrote:
| It's a lie to say they are scanning the device, when you
| know they are only scanning files that are uploaded. We
| know now that you understand both this distinction _and_
| what Apple is actually doing, so it's clear that you were
| lying.
| nix23 wrote:
| And expand that "feature" in the future.
|
| >They are scanning photos chosen for upload
|
| That's pretty much scanning on the device.
| zepto wrote:
| Yes, scanning is happening _on_ the device. Apple markets
| that as a feature.
|
| That is different from _scanning the device_. Saying they
| are 'scanning the device' is a lie.
|
| Yes, Apple _could_ scan the device in future. It's still
| a lie to say they are doing it now.
| nix23 wrote:
| They want to expand that "feature" to 3rd party apps too.
|
| >Yes, Apple could scan the device in future. It's still a
| lie to say they are doing it now.
|
| Phew... I am relieved now... wait, I don't even have an
| Apple product.
|
| EDIT: For Question below
|
| https://technokilo.com/apple-child-safety-feature-third-
| part...
|
| >Apple didn't announce any timeframe about when will they
| implement child safety features in third-party apps.
| Apple said that they still have to complete testing of
| child features and ensure that the use of this feature in
| third-party apps will not bring any privacy harm
| [deleted]
| zepto wrote:
| > They want to expand that "feature" to 3rd party apps
| too.
|
| Do they? Where have they said that?
| zekrioca wrote:
| > This program is ambitious, and protecting children is
| an important responsibility. These efforts will evolve
| and expand over time. [1]
|
| [1] https://www.apple.com/child-safety/
| zepto wrote:
| That link contains nothing at all about expanding to
| include 3rd party apps.
|
| Were you aware of that when you posted it?
| teclordphrack2 wrote:
| What about all the images already in the iCloud?
| durovo wrote:
| You spend almost too much time defending Apple[0]. If you
| have any association with them, you should probably
| disclose that first.
|
| 0. Check the comment history, every comment is defending
| Apple and this has been going on for many months
| (years?). In fact, I don't see any comment that is not
| defending Apple. I know I am making very serious
| allegations, but please go through the comment history to
| form your own opinion.
|
| I don't believe that the user is a bot though, most
| comments are 'well-reasoned'.
| zepto wrote:
| First of all, I have no affiliation with Apple, nor do I
| own any Apple stock.
|
| Do you have any affiliations we should be aware of?
|
| Secondly, I haven't 'defended Apple' in any comments.
| Indeed there are comments in which I make a judgement
| about this topic where I say that what Apple is doing is
| _distasteful_ and _offensive_.
|
| Elsewhere I have pointed out that if Apple wants to scan
| people's devices they have many other mechanisms at their
| fingertips than this narrowly tailored tool.
|
| What exactly do you think I'm 'defending Apple' from?
| Quite a few of my comments are critical of false or
| inaccurate characterizations of what Apple is actually
| _doing_.
|
| If you consider _that_ to be a defense of Apple, then I
| disagree.
|
| For the most part there just seems to be a lot of
| confusion about what Apple is doing, and general
| frustration about the state of computing.
|
| Do you really think of these as 'attacks' on Apple?
| JeremyHerrman wrote:
| Wow you're right - over the past 9 days zepto's generated
| ~5 pages of comments defending Apple's scanning. Why
| would one dedicate so much time and effort to rise to the
| defense of a trillion dollar company?
| zepto wrote:
| I have a bunch of time on my hands right now. This is a
| good way to pass the time when for reasons beyond my
| control I can't be working on projects.
| swiley wrote:
| What about when your iCloud account is full (the default
| storage size is useless if you enable any kind of backup)
| so the photos never get uploaded?
| zepto wrote:
| Right.
| hughrr wrote:
| If Craig tells me I'm misunderstanding this I distrust them
| further because I completely understand the full arena of
| possibilities and not just the narrow intent.
| karmakaze wrote:
| The confusion was that many were previously taking Apple at
| their word when past actions should make that a
| questionable premise.
| notJim wrote:
| I was definitely confused, for the record. I had the impression
| that Apple would scan all photos on device, but that is not
| true. I was also confused because several changes were
| announced at once, and the conversations sometimes blended
| them.
| acchow wrote:
| It's my understanding that your iPhone would be checking
| ALL your photos on your device. Where did you come to
| understand otherwise?
| jb1991 wrote:
| I thought it was only an iCloud scan, which is perhaps just
| as bad, but not a scan of anything that only exists on the
| iPhone itself.
| burlesona wrote:
| That's what Federighi says in this interview. It's only
| scanning photos that are about to be uploaded to iCloud.
| Otherwise photos aren't scanned. Disable iCloud and there's
| no scanning.
|
| Still, I think Apple misjudged the whole cloud vs. device
| thing in this case. They've historically preached a lot
| about how everything should happen on the user's device, not
| the cloud. I think that view became myopic, and led them
| to this decision.
|
| But in this case I think users would be much happier if
| Apple had just said "under pressure from law enforcement we
| are now scanning photos when they arrive at the iCloud data
| centers. If you don't want scanning don't use iCloud."
| Because it's not so much the scanning of uploaded photos
| that has people upset, it's the fact that the scanning and
| phoning home is baked into the device itself.
| mixmastamyk wrote:
| If you run Little Snitch you'll find there is no such
| thing as "disabling" icloud. Mac OS still talks to it
| frequently, unless you run LS.
| vetinari wrote:
| > That's what Federighi says in this interview. It's only
| scanning photos that are about to be uploaded to iCloud.
| Otherwise photos aren't scanned. Disable iCloud and
| there's no scanning.
|
| That's for now. The first update can change that and you
| will have no recourse.
| benhurmarcel wrote:
| Updates could always have changed that, on any closed-
| source system.
| Johnny555 wrote:
| _Disable iCloud and there's no scanning._
|
| Many people (like myself) are worried about the slippery
| slope where this is turned on for _all_ photos, since why
| not? Not all abusers will upload their CSAM content to
| the cloud, why wouldn't Apple flip a flag in the future
| to scan everything, including photos and downloaded
| content? If they are serious about fighting CSAM and have
| this great privacy preserving platform, I don't see why
| they wouldn't do this?
| notJim wrote:
| It's not a flag, the feature is part of the iCloud upload
| process. No upload, no scan. Of course a code change
| could do anything.
|
| The point of the feature is to prevent people from using
| iCloud to distribute CSAM. If you're recording it with
| your phone, it's no different than using an SLR camera.
| The cloud part is what they're worried about.
| yati wrote:
| Why flag the account and report it if the only goal is to
| politely prevent people from uploading? Like you say, a
| "code change can do anything" and we simply don't know
| how the current feature is done or how it will evolve.
|
| edit: like many comments here already say, reporting
| doesn't sound terrible for CSAM, but nothing about the
| feature guarantees it won't be extended to other kinds of
| content.
| notJim wrote:
| Completely agree with this. But Apple's perspective is
| that on-device scanning prevents them from looking at
| your photos in the cloud at all. This is actually more
| secure than other cloud providers that do all kinds of
| shit with your photos.
| laurent92 wrote:
| But if they find a few which have a high correlation,
| they will eagerly upload them to Apple's police service
| and have them viewed by the entire floor.
|
| Also, the matches are supposedly only to actual child abuse
| pictures. We have no way to verify that, as even
| employees of NCMEC are not all allowed to view them.
| Such DBs are often full of unreviewed fluff, and why not
| unrelated photos entirely, cookware, computer cases, who
| knows, as long as "some degree of matching" with your
| photos allows Apple to send a .zip to the police.
| strzibny wrote:
| I think right now it only scans pictures destined to be
| sent to iCloud. The problem is you won't hear from them
| once they start scanning everything. Besides, it's just not
| really provable, right? You still have to take their word
| for it.
| notJim wrote:
| > The problem is you won't hear from them once they start
| scanning everything.
|
| What makes you say this? They announced this change after
| all. Why wouldn't they announce future changes?
| fsflover wrote:
| Because of the peoples' reaction?
| rOOb85 wrote:
| Or a government order
| notJim wrote:
| He said it in the interview and it was in the initial
| announcements. This is exactly the confusion he's talking
| about.
| gentleman11 wrote:
| They are gaslighting people who are upset about what is really
| happening, and what will happen in 5-10 years, and portraying
| them as confused and ignorant instead. It's the standard
| "you're holding it wrong" Apple play
| ksec wrote:
| Well there is a huge _difference_ to AntennaGate. The two
| aren't really comparable. Not to mention Apple did in a way
| admit to the mistake and gave out a bumper within weeks of
| the complaint.
|
| Compared to their keyboard, which took nearly 3 years before
| they had a programme for free repair.
| politelemon wrote:
| Five years ago it was: "you're holding it wrong"
|
| Five years from now it'll be: "you're wrong"
|
| HN will as usual agree and take pride in being wrong.
| Bud wrote:
| Actually, HN is, as usual, doing a great job of evaluating
| and discussing this issue, in real time.
| krapp wrote:
| Judging from the threads so far, HN believes Apple is
| scanning everything using a naive perceptual hash
| implementation it probably got off of Stack Overflow,
| with no oversight or sanity checks, and that even a loose
| match (which will be trivial) means SWAT teams
| automatically being sent to bust down your front door
| like the FBI meme, and that it's all just a pretext for
| CCP style authoritarian surveillance anyway, and we'll
| all be in dissident reeducation camps by the end of the
| year.
|
| I suppose it's great if you're looking for entertainment
| value. For rational, informed discussion of the
| technology and its political and social ramifications,
| not so much. It's just the same refrain of "we never
| bother to actually RTFA but we imagine a boot stomping on
| a human face forever."
| echelon wrote:
| I've tagged HN users using a small browser plugin I
| wrote. It's amusing to see all the Apple users jump to
| the defense of Apple despite their continued shitty
| behavior.
|
| It's also great to see the handful that have changed
| their minds.
| fossuser wrote:
| The actual iPhone quote you're referring to was "Just avoid
| holding it in that way", not "you're holding it wrong" as
| it's often misquoted.
|
| Most of the commentary on this more recent issue is
| similarly misrepresented and inaccurate.
|
| I think Apple's mistake here was a PR one, they shouldn't
| have announced this until they had e2ee ready. Then they
| could have announced that which would have gotten most of
| the (positive) press attention. Then they could have gone
| into details about how they were able to do it while still
| fighting CSAM.
| FireBeyond wrote:
| "Just avoid holding it in [one of the most widely
| accepted and ergonomic grips people use for their
| phones]".
| philipov wrote:
| It's a typical "We're sorry you got mad" non-apology that
| deftly avoids admitting fault for the thing people are actually
| mad about.
| innagadadavida wrote:
| This is limited to users of iCloud photos. If you want to store
| your photos on Apple servers, shouldn't they have the right to
| exclude CSAM content? Apple owns those servers and is legally
| liable. Why is this such a big issue?
| chrismcb wrote:
| While Apple owns the servers they shouldn't be legally
| liable. No more than a self storage facility is liable for
| the items individuals store in their units.
| Miner49er wrote:
| I'm not so sure that is true anymore. FOSTA-SESTA makes
| them liable, I think?
| psyc wrote:
| > _If you want to store your photos on Apple servers,
| shouldn't they have the right to exclude CSAM content?_
|
| This seems worded to get a Yes answer. So, yes.
|
| It's a big deal because it's unprecedented (to my knowledge)
| outside of the domain of malware*. Other cloud providers run
| checks of their own property, on their own property. This
| runs a check of your property, on your property. That's why
| people care now. The fact that this occurs because of an
| intention to upload to their server doesn't really change the
| problem, not unless you're only looking at this like an
| architectural diagram. Which I fear many people are.
|
| A techie might look at this and see a simple architectural
| choice. Client-side code instead of server-side. Ok, neat. A
| more sophisticated techie might see a master plan to pave the
| way for E2EE. A net-win for privacy. Cool. But the problem
| doesn't go away. My phone, in my pocket, is now checking
| itself for evidence of a heinous crime.
|
| *I hope the comparison isn't too extra. I was thinking, the
| idea of code running on my device, that I don't want to run,
| that can gather criminal evidence against me, and report it
| over the internet... yeah I can't get around it, that really
| reminds me of malware. Not from society's perspective. From
| society's perspective maybe it's verygoodware. But from the
| traditional user's perspective, code that runs on your
| device, that hurts you, is at least vigilante malware, even
| if you are terrible.
| innagadadavida wrote:
| > My phone, in my pocket, is now checking itself for
| evidence of a heinous crime.
|
| I see your point here - this is a slippery slope for Apple.
| However I don't see how anyone could achieve both purposes
| - no fingerprint reporting and prevention of CSAM storage
| on Apple servers.
|
| Also, a practical thing to do is to just not store your
| photos on iCloud but use something else for sync and backup
| - there might be a startup opportunity here if enough
| people care.
| gambiting wrote:
| Because if the content is entirely encrypted (like Apple says
| it is) they aren't legally liable and it's entirely voluntary
| that they do this.
|
| Also, no one (well, most people) has any issue with photos
| being scanned in iCloud. Photos in Google Photos have
| been scanned for years and no one cares. The problem is that
| apple said that photos are encrypted on your device and in
| the cloud, but now your phone will scan the pictures and if
| they fail some magical test that you can't inspect, your
| pictures will be sent unencrypted for verification _without
| telling you_. So you think you 're sending pictures to secure
| storage, but nope, actually their algorithm decided that the
| picture is dodgy in some way so in fact it's sent for viewing
| by some unknown person. But hey, don't worry, you can trust
| Apple, they will definitely only verify it and do nothing
| else. Because a big American corporation is totally
| trustworthy.
| strzibny wrote:
| I mean, even the verification is problematic. At no point do
| I want a "certified" Apple employee or another parent
| looking at naked pictures of my kids, for example.
| Bud wrote:
| But there's nothing in this proposed implementation that
| could ever possibly result in that, because random pics
| of your kids would not be in a database of known CSAM
| content. So your pics wouldn't match the hash values.
| gambiting wrote:
| I'm constantly surprised how even people on HN are
| confused about this - read the white paper Apple
| published. It very explicitly says that they are using a
| perceptual (similarity-based) hash, and we (well, not me
| specifically, researchers) have demonstrated that it's
| trivial to produce a picture that isn't even remotely
| similar in theme but still produces the same perceptual
| hash.
|
| Apple's solution to this problem is that their employee
| will actually verify the picture before sending it to
| authorities. Which again, is one of the problems people
| have with this system.
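For readers unfamiliar with perceptual hashing, here is a minimal sketch of the classic "average hash". It is far cruder than Apple's NeuralHash and is not what Apple uses, but it shows why such hashes tolerate small visual changes and, unlike cryptographic hashes, can be nudged toward collisions:

    from PIL import Image  # requires Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """Shrink, grayscale, then threshold each pixel against the mean brightness."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits; small distances mean 'perceptually similar'."""
        return bin(a ^ b).count("1")

    # Visually similar images land within a few bits of each other, but images can
    # also be crafted to land close without looking alike at all.
    # print(hamming(average_hash("a.jpg"), average_hash("b.jpg")))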
| short_sells_poo wrote:
| So you are saying the hash values can never result in a
| collision? That in fact there is literally zero chance
| that two different images could result in the same hash?
|
| Because that doesn't sound correct to me...
| frosted-flakes wrote:
| If they were scanning images that were uploaded to iCloud _on
| Apple's servers_, no one would care. iCloud is not encrypted
| and Apple provides governments access to iCloud data, everyone
| knows that, and other cloud providers already scan content
| for CSAM material. The difference is that Apple is doing this
| scanning _on your phone/computer_. Right now, they say that
| only images that are uploaded to iCloud will be scanned, but
| what's to stop them from scanning other files too? There's
| been a lot of pushback because this is essentially a back
| door into the device that governments can abuse.
| gowld wrote:
| How can Apple scan encrypted photos?
| frosted-flakes wrote:
| I should have said, "iCloud is not _end-to-end_ encrypted
| ". Apple has full access to everything you upload to
| iCloud, because they control the encryption keys, not
| you.
| anon9001 wrote:
| Apple has the keys to decrypt them.
| Joeri wrote:
| Apple can do anything they want on every iphone, always
| have, and always will. Whether this feature exists or not
| changes that in no way, their technical ability to snoop
| through everyone's stuff is the same. So far they've shown
| restraint with that ability.
|
| I think what people are getting riled up about is not the
| technical ability, it's the lack of restraint, the
| willingness to search through everyone's personal stuff on
| their phones. This is like the cops sending a drug-sniffing
| dog into everyone's home once a day, with the excuse that
| it is privacy-preserving because no human enters the
| premises, and that only truly bad people will get caught.
| There is a difference between scanning in the cloud and
| scanning on device. One is looking through your stuff after
| you've stored it in a storage unit, and the other is
| looking through your stuff while it is still in your home.
| Apple's excuse is that you were going to move it anyway,
| but somehow that doesn't actually excuse things.
| short_sells_poo wrote:
| I'd expect a secure and privacy focused cloud data storage
| provider to not know what I'm storing _at all_.
|
| Let's not beat about the bush, if someone wants to store
| information in a form that can't be decrypted by Apple, they
| can. This is a stupid dragnet policy that won't catch anyone
| sophisticated.
|
| Apple focused the last years pitching themselves as the tech
| giant who actually cares about privacy. They seemed to be
| consciously building this image.
|
| To now implement scanning of private information and then try
| and sell this obvious 180-degree slippery-slope turnaround in
| the most weasel-worded "but think of the children" trope is
| an insult to the customers' intelligence.
|
| I was a keen Apple consumer because I felt that even if their
| motivation was profit, this was a company who focused on
| privacy. It was a distinct selling point.
|
| I certainly won't be buying more Apple products.
|
| For me, Apple lost the main reason to buy their stuff. If
| they are going to do the same thing everyone else is doing, I
| refuse to pay the premium they charge.
| hypothesis wrote:
| Note how they use _your_ device to do the dirty work for
| them, instead of doing what everyone else is doing and
| scanning stuff on their servers.
| ithkuil wrote:
| Because this way they can encrypt things on your phone and
| claim that they can't see your photos once on the server
| (because whatever they had to do with those photos, was
| already done on your own phone).
|
| (There are many ways this can be a slippery slope, but we
| don't have to pretend they could just do whatever everybody
| else is doing just as easily and that they just want to do it on
| your phone because they are lazy or whatever. This is a
| solution to a legitimate problem and also it turns out that
| people are rightfully worried about what's next; those two
| facts can coexist)
| gowld wrote:
| Because they don't want to see your private files.
| btkramer9 wrote:
| The issue is that the scanning happens on your device just
| before upload. So now your own device is scanning for illegal
| activity _on_ your phone not the servers.
|
| The second issue is that it will alert authorities.
|
| In regards to CSAM content those issues may not sound
| terrible. But the second it is expanded to texts, things you
| say, websites you visit or apps you use it's a lot scarier.
| And what if instead of CSAM content it is extended to alert
| authorities for _any_ activity deemed undesirable by your
| government?
| gambiting wrote:
| Logically the next step is to scan for any copyrighted
| content and notify authorities that you're watching a movie
| without paying for it. After all, it's all about catching
| criminals, how could you possibly object.
| rootusrootus wrote:
| That's a fairly large step, though. Apple cares first and
| foremost about their reputation. If this feature catches
| a real predator, it is 100% good PR. Every single false
| positive that makes it into the news is a huge loss,
| which strongly incentivizes them to avoid that. The last
| thing I expect them to do is expand the risk surface for
| something as trivial as copyright enforcement.
| anon9001 wrote:
| > Every single false positive that makes it into the news
| is a huge loss, which strongly incentivizes them to avoid
| that.
|
| Just to be clear, "false positive" in this case means an
| innocent person is accused of trafficking in child sexual
| abuse material. It's likely they will be raided.
|
| Sure, that's bad if you're Apple, but it's a lot worse if
| you're the alleged predator.
| gowld wrote:
| You're intentionally running a device with a Digital
| Rights Management module, so...
| hackinthebochs wrote:
| Personally I don't see on device scanning as significantly
| different than cloud scanning. I think the widespread
| acceptance of scanning personal data stored on the cloud is a
| serious mistake. Cloud storage services are acting as agents
| of the user and so should not be doing any scanning or
| interpreting of data not explicitly for providing the service
| to the end user. Scanning/interpreting should only happen
| when data is shared or disseminated, as that is a non-
| personal action.
|
| If I own my data, someone processing this data on my behalf
| has no right or obligation to scan it for illegal content.
| The fact that this data sometimes sits on hard drives owned
| by another party just isn't a relevant factor. Presumably I
| still own my car when it sits in the garage at the shop. They
| have no right or obligation to rummage around looking for
| evidence of a crime. I don't see abstract data as any
| different.
| rootusrootus wrote:
| What does the law say, though? Possession of CSAM by any
| organization or person other than NCMEC is flatly illegal.
| Even other branches of government, including law
| enforcement, may not have any in their possession. My
| question is -- does CSAM residing on Apple's servers, even
| when it is 'owned' by a customer, count as them possessing
| it? What about if it is encrypted?
| fakedang wrote:
| Ben's Stratechery article explains the distinction:
|
| > (f)Protection of Privacy.--Nothing in this section
| shall be construed to require a provider to--
|
| (1) monitor any user, subscriber, or customer of that
| provider; (2) monitor the content of any communication of
| any person described in paragraph (1); or (3)
| affirmatively search, screen, or scan for facts or
| circumstances described in sections (a) and (b).
|
| https://stratechery.com/2021/apples-mistake/
| short_sells_poo wrote:
| If it is encrypted, the data storage provider has no
| chance to know what it is.
|
| Which is exactly why these policies are so dim witted.
|
| Dragnet violation of everyone's privacy while anyone even
| remotely sophisticated can easily evade it by just
| encrypting the data upfront.
| frenchyatwork wrote:
| (Edit: it seems like the algorithm does not work like I
| thought it did, you can basically disregard this comment.)
|
| This has been mentioned on here before, but it's known
| CSAM possession that's illegal. Apple keeps your files
| encrypted until its algorithm thinks your encrypted file
| is too similar to CSAM, and then it decrypts it and sends
| it to Apple for review. There's a few things here.
|
| - The algorithm is a black box, so nobody knows how many
| false positives it hits.
|
| - Apple's willingness to decrypt files without the
| consent of the owner makes the encryption seem like a bit
| of a sham.
|
| - I imagine many are skeptical of Apple's ability to
| judge CSAM accurately. If I take a photo of my kids in a
| bathtub, is that CSAM? What about teenagers in a
| relationship sharing nudes. The law is a blunt and cruel
| instrument, and we've gotten away without hurting too
| many innocent people so far because the process is run by
| humans, but computers are not known for being gracious.
| rootusrootus wrote:
| > The algorithm is a black box
|
| So we know for sure they're not just using PhotoDNA?
|
| > If I take a photo of my kids in a bathtub, .....
|
| Kinda the same question. If they're using PhotoDNA, then
| that's not really a risk, right? Isn't this technology
| well understood at this point?
| frenchyatwork wrote:
| Looks like you're right. I edited my comment. It looks
| like there are a couple of fairly different changes that are
| happening:
|
| - There's a system to catch CSAM that is either PhotoDNA
| or something that works similarly.
|
| - There's a system to detect novel nudes, and notify
| parents if their children view them.
|
| I think I got these two mixed together.
| rootusrootus wrote:
| > I think I got these two mixed together.
|
| That's fair. Apple did a shit job of explaining
| themselves, and it has been compounded by a lot of
| misinformation (deliberate or not) in response. I'm
| trying really, really hard to moderate my reaction to
| this whole mess until I feel like I actually understand
| what Apple intends to do. I don't make platform jumps
| lightly.
| cwkoss wrote:
| Should every minecraft server be checking if any
| arbitrary sequence of blocks on that server can encode to
| a binary representation of CSAM which when hashed matches
| something in the NCMEC database?
|
| You could argue that a minecraft server is technically in
| possession of CSAM if that's the case, but you could
| spend an infinite amount of money looking at various
| possible sequences and are bound to find many more false
| positives than true positives.
|
| Services should have a duty to report CSAM when they
| notice it, but the lengths they should go to search for
| CSAM should be limited by cost/benefit and privacy
| concerns.
| nonbirithm wrote:
| My impression is that letting people upload CSAM to a
| cloud service has no positive benefit because of the
| supposed link between CSAM consumption and CSA, and it
| carries a very high risk of criminal liability, so
| there's no incentive for companies to completely ignore
| the files that users upload. Otherwise, people will
| eventually notice and the service will be denounced as
| "pedophile-friendly," and then the law will take notice
| and force them to give up the data.
|
| This type of scenario is what happened with the messaging
| service Kik, which was reportedly used to distribute CSAM
| in private chats. Law enforcement agencies said the
| company wasn't providing timely responses and that
| children were being actively abused as a result. This is
| about as damaging of an accusation you can leverage
| against a company.
|
| Laws against CSAM worldwide are not going away for good
| reasons, so there is always going to be a justifiable
| argument that storing certain classes of data is illegal.
| Hence, anyone wanting to run a cloud service that stores
| user data will have to obey by those laws, regardless of
| how proactive they are in scanning for the material.
| Absolute privacy in the cloud is impossible to achieve
| with those rules in place.
| mortenjorck wrote:
| The "confusion" is splitting hairs. Federighi is trying to draw
| an artificial distinction between client-side scanning and in-
| transit scanning where the code performing that in-transit
| scanning merely happens to be running... on the client.
| willcipriano wrote:
| User story for this feature: "As a user if the phone I spent
| $1200 on is going to spy on me, I want it to also use my
| electricity."
| [deleted]
| pcurve wrote:
| They knew they were being hypocritical, so they were reluctant
| to even divulge the fact that other cloud providers have
| already been doing it; they wanted to position themselves as
| the pioneer.
|
| I can't imagine how they thought this would go well.
|
| It's another example of Apple being stuck in an echo chamber
| and not being able to objectively assess how their actions will
| be perceived.
|
| How many times have they made product and PR blunders like
| this?
| tungah wrote:
| It was pure hubris on their part.
| thrill wrote:
| Apple regrets confusion over "you're holding it wrong".
| [deleted]
| schappim wrote:
| "The company says its announcement had been widely
| "misunderstood"."
|
| You're holding understanding it wrong!
| emko7 wrote:
| Could they add to the network to, say, find people? Sounds good
| for governments; Russia would love this. Also, what government do
| they report to? How can anyone trust this? Trust that it's not
| scanning for other things? That the network has not been modified
| for governments?
|
| Just scan the images on iCloud... I mean the CCP can scan the
| iCloud files, why can't Apple?
| pdimitar wrote:
| "Confusion", yeah right. As an Apple user I was always realistic
| about this and I am pretty sure things like this have been done
| for years, but now Apple has just decided to go public about it.
|
| Nothing confusing about it however. When you use a closed
| platform, things like these are literally the endgame for those
| corporations -- namely being able to not only have access to all
| that goes through the devices but to profile you and, in one
| bright and an ever-so-close future, censor and police you.
|
| I've made the conscious choice of using Apple because I value
| my time and energy more than the 0.01% chance of me being
| wrongfully flagged. Their products are robust and convenient. But
| with these news I have partially revisited my stance and I'll
| start pulling some of my erotic photography collections to a
| private NAS / home cloud server. I wish them luck breaking
| through my router and several layers of Linux virtualization and
| containerization.
|
| I really have nothing illegal to hide but the slippery slope of
| "for the children!" can be used for anything and everything. I
| won't be a part of their game.
|
| In a few weeks/months it will be "your move, corpos".
|
| ----------------------
|
| SIDE NOTE / OFF-TOPIC:
|
| I wonder at what point we'll get to the trope of "non-approved
| Internet traffic is a crime"? Hopefully not in my lifetime but I
| believe we're bound to get there eventually.
| cwkoss wrote:
| > I wonder at what point we'll get to the trope of "non-
| approved Internet traffic is a crime"? Hopefully not in my
| lifetime but I believe we're bound to get there eventually.
|
| I bet there are facebook lobbyists pushing for this today.
| farmerstan wrote:
| Unless some exec loses their job over this, this entire sequence
| of events was already playbooked by Apple. They knew to wrap the
| feature with CSAM to hopefully quell the protests, and also to
| add two features at the same time, so they could backpedal in
| case pushback was strong, and then they could blame
| "misunderstanding". Even though they are being purposefully
| obtuse about the "misunderstanding" because there is none.
|
| It's a perfectly planned PR response but no one except the
| biggest sheep is buying it.
| rcfaj7obqrkayhn wrote:
| of course it is planned, even dropping this news on friday
| evening no less
| kemayo wrote:
| They announced the whole thing back on Monday, though. If
| they were trying to hide it, the initial announcement would
| have been buried. Burying the "huh, we didn't expect this
| backlash" comment makes no sense.
| tinalumfoil wrote:
| The issue with things like this is, it's often a tradeoff of
| making the public happy vs making the government happy. If the
| initiative is partisan it _might_ make sense to make the public
| happy. If it's bipartisan you make the government happy,
| and if you're lucky the government/political complex will
| eventually alter public opinion until you're not really fighting
| the public anymore.
|
| The PR show is kind of besides the point.
| vouchmeplox wrote:
| >The issue with things like this is, it's often a tradeoff of
| making the public happy vs making the government happy.
|
| The company should only be concerned with following the law,
| not earning brownie points for extralegal behavior. Making the
| government happy shouldn't be a thing in a country ruled by
| law.
| tgsovlerkhgsel wrote:
| I highly doubt it.
|
| The other feature they're packaging with this (nudity warnings
| for children/teenagers) should be relatively uncontroversial.
| It seems well designed and respects the user's privacy: It
| shows a bypassable warning on the device, only sends a warning
| to parents for children up to 12 years old, and only if the
| child chooses to view it, and only after disclosing that the
| parents will be notified. I don't think there is much criticism
| they'd catch for that, no protests to quell.
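|
| Roughly, as I read the description, the decision flow is something
| like this (a toy sketch with made-up names; behavior follows the
| description above, not Apple's actual code):
|
|     # Hypothetical sketch of the nudity-warning flow as described;
|     # structure and names are made up, only the behavior mirrors
|     # the public description.
|     def handle_flagged_image(child_age: int,
|                              chooses_to_view: bool) -> dict:
|         state = {"warning_shown": True,
|                  "image_shown": False,
|                  "parent_notified": False}
|
|         # The on-device warning is bypassable: the child decides,
|         # after being told that viewing may notify the parents.
|         if not chooses_to_view:
|             return state
|
|         state["image_shown"] = True
|
|         # Parents are notified only for children 12 and under, and
|         # only because the child chose to view after the disclosure.
|         if child_age <= 12:
|             state["parent_notified"] = True
|
|         return state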
|
| On the other hand, the proposal that they're (rightfully) under
| fire for now is something they can't easily back out of (they
| will immediately be accused of supporting pedophiles), and it's
| basically a "do or don't" proposal with no option of a partial
| retreat. The press is also incredibly damaging to the "iPhones
| respect your privacy" mantra that's at the core of their current
| PR campaign.
|
| I don't think they expected this level of pushback.
| cma wrote:
| It is very telling of our age that the first widespread
| commercial use of homomorphic encryption*, predicted to be
| letting you run private computations on public cloud and
| distributed infrastructure to preserve privacy, turned out to be
| letting your device's true owner run private computations on your
| (nominally your) device to destroy privacy.
|
| * they use some kind of homomorphic set intersection algorithm as
| part of it
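|
| For the curious: the set-intersection idea can be illustrated with
| a toy commutative-blinding scheme. Apple's published design uses a
| private set intersection protocol that is considerably more
| involved; the sketch below only shows the flavor, is not their
| code, and is not secure as written.
|
|     import hashlib
|     import secrets
|
|     P = 2**127 - 1  # a Mersenne prime; far too small for real use
|
|     def hash_to_int(item: bytes) -> int:
|         return int.from_bytes(hashlib.sha256(item).digest(), "big") % P
|
|     def blind(values, secret_exponent):
|         # Exponentiation mod P commutes: (h^a)^b == (h^b)^a (mod P),
|         # so doubly-blinded values match exactly when the items match.
|         return {pow(v, secret_exponent, P) for v in values}
|
|     def toy_intersection_size(client_items, server_items):
|         a = secrets.randbelow(P - 2) + 1   # client's secret
|         b = secrets.randbelow(P - 2) + 1   # server's secret
|         client_once = blind({hash_to_int(x) for x in client_items}, a)
|         server_once = blind({hash_to_int(x) for x in server_items}, b)
|         # Each side re-blinds the other's values with its own secret.
|         client_twice = blind(client_once, b)
|         server_twice = blind(server_once, a)
|         return len(client_twice & server_twice)
|
|     print(toy_intersection_size(
|         [b"photo-1", b"photo-2", b"photo-3"],
|         [b"photo-2", b"known-bad-4"]))   # -> 1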
| erhk wrote:
| There is no confusion. However, I would say I'm disappointed by
| the BBC's lack of integrity.
| system2 wrote:
| Next: Apple pays a crapload of money to all the tech blogs and
| youtubers out there to make normies believe what they want them
| to believe.
|
| Facebook was something, but what Apple is doing right now is
| disgraceful. They destroyed the trust they built over the years.
| At least tech people will remember this and make the right
| choice. I don't trust Android, and now no more Apple. We will all
| be forced to use Nokia 3310s again or these niche crowdsourced
| Linux phones, which suck the majority of the time.
| tgsovlerkhgsel wrote:
| If you're claiming that the criticism you're facing is just due
| to "confusion", it is helpful to state what the confusion is and
| what the actual facts are, and those facts had better differ from
| the common understanding of the issue.
|
| Otherwise, like in this case, it just becomes an article
| basically stating "company trying to gaslight people after they
| got caught doing something bad".
| andrewmcwatters wrote:
| Consumers regret Apple's confusion over 'willingness to purchase
| future Apple devices.'
| studentrob wrote:
| Weak. Hit the pause button, Apple. I'm not won over by being told
| "I misunderstood". No, _you_ misunderstood your customers.
| conradev wrote:
| Apple published a threat model review recently, with an explicit
| response to the threat from governments:
|
| https://www.apple.com/child-safety/pdf/Security_Threat_Model...
|
| Specifically, it looks like they will be requiring that hashes
| exist in two separate databases in two separate sovereign
| jurisdictions.
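|
| In other words, only hashes that independent organizations in
| different jurisdictions both vouch for make it into the shipped
| database. A minimal sketch of that requirement (hypothetical
| names, not Apple's code):
|
|     # Only hashes present in BOTH independently-sourced databases
|     # are shipped, so no single government can unilaterally insert
|     # a target image into the on-device list.
|     def build_shipped_hashes(db_jurisdiction_a: set,
|                              db_jurisdiction_b: set) -> set:
|         return db_jurisdiction_a & db_jurisdiction_b
|
|     shipped = build_shipped_hashes({"h1", "h2", "h3"},
|                                    {"h2", "h3", "h4"})
|     print(sorted(shipped))   # only h2 and h3 survive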
| cutler wrote:
| Whatever new spin Apple try to add to the original gaffe, the
| fact remains that they are opening the door on everyone's iPhone
| to some kind of scanning. That in itself is the problem, not
| whatever form that scanning may take. From this point onwards
| iPhone users have to trust Apple that it won't, for example, do
| secret deals with China to extend the reach of this scanning.
| Data scanned at source, on-device, is a much bigger issue than
| data scanned after it's saved to the cloud. The difference is
| night and day.
| darwingr wrote:
| I am not confused. Apple Inc's about-face on backdoors for "good
| guys" and engineering tools that create the potential for
| "unmasking" says to me that they are the ones confused.
| cwkoss wrote:
| Nothing like the CIA heart attack gun to win hearts and minds!
| 45ure wrote:
| Next month, when I am watching Tim Cook in a split-diopter
| parasocial interaction from somewhere inside the glass-fronted
| circle of Apple Park, in between drawling on about 'even x%/times
| faster', I want to hear an explicit apology and/or an explanation
| of what happened in the last week or so. I accept it would be a
| tough ask; Apple, like the Royal family, doesn't capitulate.
| They believe that the motto of never complain, never explain,
| will get them through anything. Not this time: the trust is
| irreparably broken.
| paxys wrote:
| Apple PR is clearly working overtime trying to spin this as a
| "misunderstanding" and "confusion".
| OrvalWintermute wrote:
| Let us hope a rollback on this bad idea will remedy the
| confusion
| system2 wrote:
| I bet you my life it will not happen.
| Animats wrote:
| "Sow confusion, and reap inaction" - common military tactic. If
| what you're doing is too big to hide, you also cause visible
| activity in multiple places, so the enemy can't decide where to
| send reinforcements. The Normandy invasion had quite a bit of
| that. The enemy was confused for days about where the main
| attack was hitting, and didn't commit reserves until it was too
| late.
| notapenny wrote:
| It's sadly typical of their apologies. The "I'm sorry you feel
| that way" apology. As a long-time customer it's beginning to
| annoy me. You fucked up. Say "we fucked up" and move on.
| tgsovlerkhgsel wrote:
| > Say "we fucked up"
|
| That would require them actually changing their plans though,
| and it doesn't seem like they're willing to do that (yet).
| boardwaalk wrote:
| Are they wrong? It seems like most people don't understand that
| Apple was already scanning iCloud Photos on their servers, like
| Google scans Google Photos; they're just going to be doing it
| client-side now.
|
| I'm not defending Apple (I wish they wouldn't do this), but I
| see Section 230 levels of misunderstanding out there.
| UseStrict wrote:
| Why move it on-device then? They've made no announcement or
| attempt to encrypt iCloud backups, so they are free to keep
| scanning on their servers. Moving it on-device has zero
| value-add for iPhone users; it only serves as a Trojan Horse
| for any future "scanning" projects that they may adopt,
| willingly or otherwise.
| boardwaalk wrote:
| I wasn't commenting on the why. I'm just saying that all
| the words in the phrase "Apple regrets confusion" are
| probably true: There is a lot of confusion, and Apple's PR
| is probably really regretting it right now.
|
| If people understood what was going on, would they be as
| upset? I don't know. Apple doesn't seem to think so.
| insulanus wrote:
| There _was_ a lot of confusion, caused by Apple trying to
| slip this feature under the radar and omitting information.
| nix23 wrote:
| Why then do it on the device when you can do it on the backend,
| with much less publicity, and constantly update it?
|
| >but I see section 230 levels of lack of understanding out
| there.
|
| Mirror mirror on the wall....
| boardwaalk wrote:
| If you have an example of my lack of understanding, please
| show it to me. Also, your snark is below this forum.
| notJim wrote:
| They actually weren't. They wanted to start, and they thought
| this way of doing it was more privacy-friendly. Craig says
| all this, but apparently listening to what he says is
| forbidden.
| boardwaalk wrote:
| I didn't see that. Do you have a link? I was only able to
| find Jane Horvath (Apple Chief Privacy Officer) saying they
| were already scanning photos using PhotoDNA at CES 2020
| [1].
|
| [1] https://www.engadget.com/2020-01-07-apple-facebook-ces-
| priva...
| notJim wrote:
| This article just says "iCloud," not specifically Photos.
| My understanding is that they previously scanned iCloud
| Mail, but not iCloud Photos. I don't have a link handy
| unfortunately, and don't have time to dig it up again.
| c7DJTLrn wrote:
| No, there's no confusion. I'm not happy to have my personal files
| scanned on my personal device that I paid for, simple as that.
| Apple aren't getting another penny from me.
| lstamour wrote:
| But according to this article you can avoid this by not
| uploading photos to Apple's service. Google already does this
| when you upload photos to Google's service, and so does
| Microsoft.
|
| The distinction is whether the matching happens on-device
| before upload or in the cloud after upload, it seems. If Apple
| already does on-device ML, it makes sense they would add more
| photo processing client-side to take advantage of encrypted or
| archival blob storage server-side.
|
| Additionally, there's still the option of using a third-party
| camera app, which wouldn't upload photos by default at all.
| crooked-v wrote:
| > But according to this article you can avoid this by not
| uploading photos to Apple's service.
|
| ...for now.
| spideymans wrote:
| Heck, Apple themselves said they would be happy to explore
| expanding this functionality to third-party apps as well.
| browningstreet wrote:
| Same here. I'm not confused about what they're doing in the
| least.
| systemvoltage wrote:
| Craig Federighi's interview with WSJ:
|
| https://www.wsj.com/video/series/joanna-stern-personal-techn...
| sdze wrote:
| By illuminating, I assume you mean gaslighting?
| systemvoltage wrote:
| Yep, basically he is gaslighting. Sorry, wrong word choice,
| I'll change it.
| underscore_ku wrote:
| apple is a shiny jail.
| swiley wrote:
| *smartjail
| coding123 wrote:
| This whole thing reminds me of when Facebook asked everyone to
| upload nude photos of themselves.
___________________________________________________________________
(page generated 2021-08-13 23:00 UTC)