[HN Gopher] One Bad Apple
___________________________________________________________________
One Bad Apple
Author : cwmartin
Score : 127 points
Date : 2021-08-08 21:12 UTC (1 hour ago)
(HTM) web link (www.hackerfactor.com)
(TXT) w3m dump (www.hackerfactor.com)
| netr0ute wrote:
| To help fight back against false positives, why not just
| repeatedly trigger the code that sends the data to NCMEC (per the
| article's claimed legal requirements) and create a DoS attack?
| [deleted]
| jolux wrote:
| > To reiterate: scanning your device is not a privacy risk, but
| copying files from your device without any notice is definitely a
| privacy issue.
|
| Not a lawyer, but I believe this part about legality is
| inaccurate, because they aren't copying your photos without
| notice. The feature is not harvesting suspect photos from a
| device, it is attaching data to _all_ photos before they are
| uploaded to Apple's servers. If you're not using iCloud Photos,
| the feature will not be activated. Furthermore, they're not
| _knowingly_ transferring CSAM, because the system is designed
| only to notify them when a certain "threshold" of suspect images
| has been crossed.
|
| In this way it's identical in practice to what Google and
| Facebook are already doing with photos that end up on their
| servers; they just run the check before the upload instead of
| after. I certainly have reservations about their technique here,
| but this argument doesn't add up to me.
| eivarv wrote:
| Legality aside - how is this not a privacy risk? Privileged
| users of the infrastructure can gain information about users
| (whether they possess CSAM that's in the hash-database... for
| now).
| websites2023 wrote:
| This basic implementation fact has been misrepresented over and
| over and over again. Does anyone read anymore? I'm starting to
| get really concerned. The hacker community is where I've turned
| to be more informed, away from the clickbait. But I'm being let
| down.
| dmitriid wrote:
| What does "manual review" mean then and how are those images
| reported?
| shuckles wrote:
| Before: you would upload images to iCloud Photos. Apple can
| access your images in iCloud Photos, but it does not.
|
| Now: You upload images to iCloud Photos. When doing so,
| your device also uploads a separate safety voucher for the
| image. If there are enough vouchers for CSAM matched images
| in your library, Apple gains the ability to access the data
| in the vouchers for images matching CSAM. One of the data
| elements is an "image derivative" (probably a thumbnail)
| which is manually reviewed. If the image derivative also
| looks like CSAM, Apple files a report with NCMEC's CyberTip
| line. Apple can (for now) access the image you stored in
| iCloud, but it does not. All the data it needs is in the
| safety voucher.
|
| A lot of words have been spilled on this topic, yet I'd be
| surprised if a majority of people are even aware of these basic
| facts about how the system operates.
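|
| For intuition, a minimal sketch of that decision flow in Python
| (everything here is hypothetical: the field names, the threshold
| value, and manual_review() are placeholders, and the real system
| hides the match count behind threshold cryptography rather than
| a plain counter):
|
|     # Hypothetical sketch of the server-side flow described above.
|     VOUCHER_THRESHOLD = 10   # placeholder; Apple hasn't published it
|
|     def manual_review(derivatives):
|         """Stub for the human reviewer checking image derivatives."""
|         return False
|
|     def process_account(vouchers):
|         matched = [v for v in vouchers if v["matches_known_hash"]]
|         if len(matched) < VOUCHER_THRESHOLD:
|             # Below the threshold, vouchers stay opaque to Apple.
|             return "no action"
|         derivatives = [v["image_derivative"] for v in matched]
|         if manual_review(derivatives):
|             return "report to NCMEC CyberTipline"
|         return "dismiss as false positive"
|
|     vouchers = [{"matches_known_hash": False,
|                  "image_derivative": b""}] * 5
|     print(process_account(vouchers))   # "no action"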
| tgsovlerkhgsel wrote:
| As I understand it:
|
| When you choose to upload your images to iCloud (which
| currently happens without end-to-end encryption), your
| phone generates some form of encrypted ticket. In the
| future, the images will be encrypted, with a backdoor key
| encoded in the tickets.
|
| If Apple receives enough images that were considered a
| match, the tickets become decryptable (I think I saw
| Shamir's Secret Sharing mentioned for this step). Right
| now, Apple doesn't need that because they have unencrypted
| images; in a future scheme, decrypting these tickets would
| allow them to decrypt your images.
|
| (I've simplified a bit, I believe there's a second layer
| that they claim will only give them access to the offending
| images. I have not studied their approach deeply.)
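|
| For anyone who wants intuition for that threshold step: textbook
| Shamir secret sharing does exactly "unrecoverable until t shares
| exist". This is not Apple's actual construction, just a toy
| Python version of the underlying idea:
|
|     # Toy (t, n) Shamir secret sharing over a prime field.
|     import random   # fine for a toy; use secrets for real crypto
|
|     P = 2**127 - 1  # prime modulus for the field
|
|     def make_shares(secret, t, n):
|         # Random degree-(t-1) polynomial with f(0) = secret.
|         coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
|         def f(x):
|             return sum(c * pow(x, i, P)
|                        for i, c in enumerate(coeffs)) % P
|         return [(x, f(x)) for x in range(1, n + 1)]
|
|     def recover(shares):
|         # Lagrange interpolation at x = 0.
|         secret = 0
|         for i, (xi, yi) in enumerate(shares):
|             num, den = 1, 1
|             for j, (xj, _) in enumerate(shares):
|                 if i != j:
|                     num = num * (-xj) % P
|                     den = den * (xi - xj) % P
|             secret = (secret + yi * num * pow(den, P - 2, P)) % P
|         return secret
|
|     shares = make_shares(secret=123456789, t=3, n=10)
|     assert recover(shares[:3]) == 123456789  # any 3 shares suffice
|     assert recover(shares[:2]) != 123456789  # 2 shares reveal nothing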
| shuckles wrote:
| These are not "claims." The process by which they get
| access to only the safety vouchers for images matching
| CSAM is private set intersection and comes with a
| cryptographic proof.
|
| In no step of the proposal does Apple access the images
| you store in iCloud. All access is through the associated
| data in the safety voucher. This design allows Apple to
| switch iCloud storage to end to end encrypted with no
| protocol changes.
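|
| If the PSI part sounds like magic, the core trick in the classic
| constructions is Diffie-Hellman-style blinding: both sides blind
| their hashed items with secret exponents, and only doubly-blinded
| values are ever compared. A toy version for intuition
| (emphatically not Apple's protocol, which is more elaborate):
|
|     # Toy DH-style private set intersection, for intuition only.
|     import hashlib, secrets
|
|     P = 2**255 - 19   # a large prime modulus
|
|     def h(item):      # hash an item into the group
|         d = hashlib.sha256(item.encode()).digest()
|         return int.from_bytes(d, "big") % P
|
|     a = secrets.randbelow(P - 2) + 1   # client's secret exponent
|     b = secrets.randbelow(P - 2) + 1   # server's secret exponent
|
|     client_set = {"img1", "img2", "img3"}
|     server_set = {"img2", "img4"}
|
|     # Client sends blinded items; server blinds them a second time.
|     once = {x: pow(h(x), a, P) for x in client_set}
|     twice = {x: pow(v, b, P) for x, v in once.items()}
|
|     # Server's items, blinded by both parties the same way.
|     both = {pow(pow(h(y), b, P), a, P) for y in server_set}
|
|     # Equal items yield equal double-blinded values: {'img2'}
|     print({x for x, v in twice.items() if v in both})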
| whoknowswhat11 wrote:
| Agreed - so disappointing.
|
| The idea that standard moderation steps are a felony is such
| a stretch. Almost all the major players have folks doing
| content screening and management - and yes, this may involve
| the provider transmitting / copying etc. images that are then
| flagged and moderated away.
|
| The idea that this is a felony is ridiculous.
|
| The other piece is that folks are making a lot of assumptions
| about how this works, then claiming things are felonies.
|
| Does it not strain credulity slightly that Apple, with its
| team of lawyers, has decided to commit CSAM felonies instead
| of blocking CSAM? And the govt is going to bust them for
| this? Really? They are doing what the govt wants and using
| automation to drive down the number of images someone will
| look at and what even might get transferred to Apple's
| servers in the first place.
| msandford wrote:
| Does the law have a moderation carve-out? There are plenty
| of laws that have what's called 'strict liability', where
| your intent doesn't matter.
|
| I'm not suggesting that this is absolutely positively a
| situation where strict liability exists and that moderation
| isn't allowed. But the idea that "hey, we're trying to do
| the right thing here" will be honored in court is... not
| obvious.
| antman wrote:
| There are a lot of articles about Apple's hash algorithm, and
| for me they are mostly irrelevant to the main problem.
|
| The main problem is that Apple has backdoored my device.
|
| More types of bad images or other files will be scanned, since
| now Apple does not have plausible deniability to defend against
| any of the government's requests.
|
| In the future, a false(?) positive that happened(?) to be of a
| political file that crept into the list can pinpoint people for
| the future dictator wannabe.
|
| It's always about the children or terrorism.
| twoodfin wrote:
| The author of this article purports to have done a ton of
| research into this system, but appears to have missed basic
| information that I've acquired from a few podcasts.
|
| Namely, the "1-in-a-trillion" false-positive rate per account
| per year is based on the likelihood of _multiple_ photos
| matching the database (Apple doesn't say how many matches are
| required to trip their manual screening threshold).
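|
| A back-of-the-envelope calculation shows why the threshold
| matters so much (all numbers below are made up; Apple has
| published neither the per-image false-match rate nor the
| threshold, and real matches aren't independent):
|
|     # Binomial tail: chance an account with n photos crosses a
|     # match threshold, given per-image false-match probability p.
|     from math import comb
|
|     def tail(p, n, threshold, terms=60):
|         # Terms die off fast for tiny p, so a short partial sum
|         # of P(exactly k matches) for k >= threshold is enough.
|         return sum(comb(n, k) * p**k * (1 - p)**(n - k)
|                    for k in range(threshold, threshold + terms))
|
|     print(tail(p=1e-6, n=20_000, threshold=1))   # ~2e-2
|     print(tail(p=1e-6, n=20_000, threshold=10))  # ~3e-24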
| cratermoon wrote:
| Question: who is or will be making money on this deal? Answer
| that ("follow the money") and then I think we'll have a handle on
| what's really going on.
| dmitryminkovsky wrote:
| > However, nothing in the iCloud terms of service grants Apple
| access to your pictures for use in research projects, such as
| developing a CSAM scanner. (Apple can deploy new beta features,
| but Apple cannot arbitrarily use your data.) In effect, they
| don't have access to your content for testing their CSAM system.
| >
| > If Apple wants to crack down on CSAM, then they have to do it
| on your Apple device.
|
| I don't understand - Apple can't change their TOS, but they can
| install this scanning service on your device?
| NKosmatos wrote:
| Really nice explanation from someone who knows a thing or two
| about images/photos (Dr. Neal Krawetz is the creator of
| https://fotoforensics.com and specializes in computer forensics).
| valparaiso wrote:
| He wrongly interpreted the CSAM scanning. He said that Apple
| will scan your photos and, if it finds something, send the
| photo to Apple. That is absolutely not how it works. Photos are
| only scanned before being uploaded to iCloud Photos. Apple
| already confirmed this to iMore, and it's clearly stated in
| Apple's papers from the press release.
| tgsovlerkhgsel wrote:
| NCMEC has essentially shown that they have zero regard for
| privacy by calling all privacy activists "screeching voices of
| the minority". At the same time, they're at the center of a
| highly opaque, entrenched (often legally mandated) censorship
| infrastructure that can and will get accounts shut down
| irrecoverably and possibly people's homes raided, on
| questionable data:
|
| In one of the previous discussions, I've seen claims about the
| NCMEC database containing a lot of harmless pictures
| misclassified as CSAM. This post confirms it again (ctrl-f
| "macaque").
|
| It also seems like the PhotoDNA hash algorithm is problematic (to
| the point where it may be possible to trigger false matches).
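|
| For anyone unfamiliar with perceptual hashing, a toy "average
| hash" shows the general idea (PhotoDNA itself is proprietary and
| more sophisticated, but it's the same family: the hash keeps
| only coarse structure, so distinct images can land close
| together). Assumes Pillow is installed:
|
|     # Toy perceptual hash (aHash): shrink, grayscale, threshold.
|     from PIL import Image
|
|     def ahash(path, size=8):
|         img = Image.open(path).convert("L").resize((size, size))
|         pixels = list(img.getdata())
|         avg = sum(pixels) / len(pixels)
|         bits = "".join("1" if p > avg else "0" for p in pixels)
|         return int(bits, 2)
|
|     def hamming(a, b):
|         return bin(a ^ b).count("1")
|
|     # Similar images typically differ in only a few of the 64
|     # bits, so a "match" is declared under some small distance -
|     # which is also why crafted false matches are plausible.
|     # hamming(ahash("a.jpg"), ahash("a_recompressed.jpg"))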
|
| Now NCMEC seems to be pushing for the development of a
| technology that would implant an informant in every single one
| of our devices (mandating the inclusion of this technology is
| the logical next step, and seems inevitable if Apple launches
| this).
|
| I'm surprised, and honestly disappointed, that the author seems
| to still play nice instead of releasing the whitepaper. NCMEC
| seems to have decided to position itself directly alongside
| other Enemies of the Internet, and while I can imagine that
| they're also doing a lot of important and good work, at this
| point I don't think they're salvageable and would like to see
| them disbanded.
|
| Really curious how this will play out. I expect attacks either
| sabotaging these scanning systems by flooding them with false
| positives, or exploiting them to get the accounts of your enemies
| shut down permanently by sending them a picture of a macaque.
| defaultname wrote:
| Good article, however-
|
| "Due to how Apple handles cryptography (for your privacy), it is
| very hard (if not impossible) for them to access content in your
| iCloud account. Your content is encrypted in their cloud, and
| they don't have access. If Apple wants to crack down on CSAM,
| then they have to do it on your Apple device"
|
| I do not believe this is true. Maybe one day it will be true and
| Apple is planning for it, but right now iCloud service data is
| encrypted only in the sense that it is stored encrypted at rest
| and in transit; Apple holds the keys. We know this given that
| iCloud backups have been surrendered to authorities, and of
| course you can log into the web variants to view your photos,
| calendar, etc. Not to mention that Apple has purportedly been
| doing the same hash checking on their side for a couple of
| years.
|
| Thus far there has been no compelling answer as to why Apple
| needs to do this on device.
| amelius wrote:
| Probably because Apple wants to stay away from any suspicions
| that they sometimes actually use their keys to access private
| information.
| tgsovlerkhgsel wrote:
| > why Apple needs to do this on device
|
| Presumably to implement E2E encryption, while at the same time
| helping the NCMEC to push for legislation to make it illegal to
| offer E2E encryption without this backdoor.
|
| Apple users would be slightly better off than the status quo,
| but worse off than if Apple simply implemented real E2E without
| backdoors, and everyone else's privacy will be impacted by the
| backdoors that the NCMEC will likely push.
| ncw96 wrote:
| You are correct -- most of the iCloud data is not end-to-end
| encrypted. Apple discusses which data is end-to-end encrypted
| at https://support.apple.com/en-us/HT202303
| [deleted]
| initplus wrote:
| This reads like a failure of the NCMEC, and the legal system
| surrounding it.
|
| It is insane that using perceptual hashes is likely illegal: the
| hashes are actually somewhat reversible, so possessing the hash
| itself may amount to a criminal offence. It just shows how
| twisted up in itself the law is in this area.
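|
| The "somewhat reversible" point is easy to see with a toy
| average hash: its 64 bits are literally a crude 8x8 thumbnail of
| the image. (PhotoDNA's hash is a larger vector, but the
| article's argument is the same in spirit.) A quick sketch:
|
|     # Render a 64-bit average hash back into an 8x8 black-and-
|     # white grid - a coarse silhouette of the original image.
|     def render(hash64, size=8):
|         bits = format(hash64, "064b")
|         for row in range(size):
|             print("".join("#" if bits[row * size + col] == "1"
|                           else "." for col in range(size)))
|
|     render(0xF0F0A5A50F0F5A5A)   # arbitrary example value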
|
| One independent image analysis service should not be beating
| reporting rates of major service providers. And NCMEC should not
| be acting like detection is a trade secret. Wider detection and
| reporting is the goal.
|
| And the law as set up prevents developing detection methods. You
| cannot legally check the results of your detection (which is
| what Apple is doing), as that involves transmitting the content
| to someone other than NCMEC!
| Dah00n wrote:
| > _18 U.S.C. SS 2258A is specific: the data can only be sent to
| NCMEC. (With 2258A, it is illegal for a service provider to turn
| over CP photos to the police or the FBI; you can only send it to
| NCMEC. Then NCMEC will contact the police or FBI.) What Apple has
| detailed is the intentional distribution (to Apple), collection
| (at Apple), and access (viewing at Apple) of material that they
| strongly have reason to believe is CSAM. As it was explained to
| me by my attorney, that is a felony._
|
| I'm not sure, after reading the article, which of Apple or NCMEC
| has the more insane system.
___________________________________________________________________
(page generated 2021-08-08 23:00 UTC)