[HN Gopher] Apple plans to scan US iPhones for child abuse imagery
___________________________________________________________________
Apple plans to scan US iPhones for child abuse imagery
Author : jdmark
Score : 35 points
Date : 2021-08-05 16:45 UTC (6 hours ago)
(HTM) web link (californianewstimes.com)
(TXT) w3m dump (californianewstimes.com)
| randomluck040 wrote:
| The headlines are somewhat misleading. Apple won't look at the
| actual images, only at hashes: they presumably hold a database
| of hashes of known child abuse imagery, which they compare
| against the hashes of the images on your phone. If you take a
| picture of your child's bath time to send to your parents so
| they can see the little bugger during covid, it most probably
| won't be a problem. That's an important distinction in my
| opinion, and it already happens on a lot of cloud services.
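| The matching step amounts to a set-membership check. A minimal
| sketch in Python, using plain SHA-256 purely as a stand-in
| (real systems use perceptual hashes such as PhotoDNA or
| Apple's NeuralHash, which tolerate resizing and re-encoding;
| the database contents below are hypothetical):

```python
import hashlib

# Hypothetical known-bad hash database. Real deployments match against
# perceptual hashes supplied by organizations like NCMEC; SHA-256 of
# raw bytes is used here only to illustrate the membership check.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_database(b"example-flagged-image-bytes"))   # True
print(matches_known_database(b"family-photo-bytes"))            # False
```

| Note that an exact hash only matches byte-identical files; the
| point of a perceptual hash is to keep matching after innocuous
| transformations, which is also what makes collisions likelier.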
|
| The breach of trust in going after the images on my personal
| device, however, is not acceptable to me. As a non-US citizen
| living outside the US, I'd refuse to use any Apple devices
| from now on. It also contradicts their own privacy talk,
| which, looking at some parts of the world, is evidently just
| "talk" to some degree. Child abuse is probably the worst crime
| there is, I agree wholeheartedly, but this will create a
| back-door-like situation for every government on earth.
| SN76477 wrote:
| Right, as I understand it, they are comparing images to known
| abuse, not scanning images looking for "new" abuse.
| taylorius wrote:
| And how will they see these hashes? By uploading a hash of
| every file on your iphone to a database in the cloud. Oh -
| don't worry - they promise to delete them all after checking
| for child porn. Scout's honour.
| olliej wrote:
| It explicitly says the phone downloads the set of hashes and
| does the computation on the client side.
|
| That said I agree that this is going to be hideously abused
| by every government.
| tomas_nort wrote:
| Incorrect. "Messages uses on-device machine learning to
| analyze image attachments and determine if a photo is sexually
| explicit." [0]
|
| They do not state this explicitly, but with this technology a
| mere visit to a website could, in theory, result in your home
| being searched by law enforcement in your country.
|
| [0] https://www.apple.com/child-safety/
| throwaway0a5e wrote:
| >They won't see actual images but hashes
|
| Oh great, even better. /s
|
| How many billions of images does apple deal with? There's bound
| to be a heck of a lot of collisions in there.
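| The scale of that concern can be roughed out with the birthday
| bound, E ≈ C(n, 2) / 2^bits, assuming a hypothetical hash
| length and image count (Apple has not published either figure):

```python
def expected_collision_pairs(n_images: int, hash_bits: int) -> float:
    """Expected number of colliding pairs among n_images random values
    of a hash_bits-bit hash, via the birthday approximation."""
    return n_images * (n_images - 1) / 2 / 2**hash_bits

# A billion images against hypothetical 64-bit and 128-bit hashes:
for bits in (64, 128):
    print(bits, expected_collision_pairs(10**9, bits))
```

| This is only the floor for a random hash: a perceptual hash
| deliberately maps similar images to the same value, so its
| practical collision rate is much higher.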
| new_realist wrote:
| This has already been considered and addressed via a secret
| sharing threshold scheme.
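| For the unfamiliar: in a (t, n) threshold secret-sharing
| scheme, a secret splits into n shares such that any t of them
| reconstruct it and fewer reveal nothing. A minimal
| Shamir-style sketch in Python, illustrative only (Apple's
| published design combines threshold secret sharing with
| private set intersection, which is not reproduced here):

```python
import random

P = 2**127 - 1  # Mersenne prime used as the field modulus (illustrative)

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = make_shares(secret=123456789, t=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover the secret
```

| The idea, as publicly described, is that each match releases
| one share, so Apple can decrypt match metadata only after an
| account crosses the threshold number of matches.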
| taylorius wrote:
| Who could object to a measure to eliminate child pornography?
| Unless you're a fan of it, right? And so the precedent is set -
| and before you know it, they will be checking your phone for all
| sorts of things.
| cbanek wrote:
| As much as I hate child abuse (and I really do) I don't like the
| privacy aspects of this. I feel like this could also be troubling
| in that people who send you unsolicited images / emails / texts
| with pictures that might be downloaded to your device could trip
| this up.
|
| I'm already wondering how a hash collision attack could defeat
| this, although then I can already hear the judge saying "well,
| you had it there, but you deleted it, so now we'll also charge
| you with destruction of evidence."
|
| This is going to get very interesting very quickly as soon as
| it is first tested in case law.
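| On feasibility: colliding a full cryptographic hash is
| impractical, but short or perceptual hashes are another
| matter. A toy birthday search, using a hypothetical 24-bit
| truncation of SHA-256 as a stand-in (not Apple's actual
| NeuralHash), finds a collision in a few thousand attempts:

```python
import hashlib
from itertools import count

def short_hash(data: bytes, bits: int = 24) -> int:
    """Truncate SHA-256 to `bits` bits, mimicking a short hash."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

def find_collision(bits: int = 24):
    """Birthday search: remember seen hashes until two distinct
    inputs produce the same short hash."""
    seen = {}
    for i in count():
        data = str(i).encode()
        h = short_hash(data, bits)
        if h in seen:
            return seen[h], data
        seen[h] = data

a, b = find_collision()
assert a != b and short_hash(a) == short_hash(b)
print(a, b)
```

| A 24-bit space has about 16.7M values, so roughly sqrt of that
| (a few thousand) tries suffice on average; larger hashes grow
| the cost exponentially, but perceptual hashes can be attacked
| far more cheaply by perturbing image content directly.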
| jsnell wrote:
| https://news.ycombinator.com/item?id=28068741 (640 comments)
|
| https://news.ycombinator.com/item?id=28075021 (296 comments)
| [deleted]
___________________________________________________________________
(page generated 2021-08-05 23:02 UTC)