[HN Gopher] We built a system like Apple's to flag CSAM and concluded the tech was dangerous
___________________________________________________________________
We built a system like Apple's to flag CSAM and concluded the tech
was dangerous
Author : Copernicron
Score : 251 points
Date : 2021-08-19 19:23 UTC (3 days ago)
(HTM) web link (www.washingtonpost.com)
(TXT) w3m dump (www.washingtonpost.com)
| livinginfear wrote:
| I've already written this in a previous comment, but I think
| it bears repeating: I think Apple has introduced this client-
| side content scanning technology under the guise of protecting
| against CSAM, while its true intention is to allow the Chinese
| government to scan citizens' phones for subversive content.
| I'm convinced that Apple's upper management figures that the
| minimal blowback they're experiencing for this privacy-
| invading technology in the West is worth the expansion of
| their technology into a much more totalitarian Chinese market.
| I think this development has been precipitated by a very
| visible decline in America's economic and social position as a
| world leader. Why not risk this move? America's trajectory is
| that of almost definite decline.
| legutierr wrote:
| > America's trajectory is that of almost definite decline.
|
| Well, sure, with that kind of attitude!
| fortran77 wrote:
| > Apple's motivation, like ours, was to protect children.
|
| Does anybody really believe Apple's motivation is to "protect
| children?"
| skinkestek wrote:
| I personally believe Apple's motive is to protect their
| customers and by extension themselves.
| ummonk wrote:
| Copying over my comment from the last article about this:
|
| Nothing about those concerns seems specific to the end-to-end
| encryption compatible CSAM system they or Apple built...
|
| Honestly, if I were Apple I'd consider just scrapping the
| whole thing and doing server-side CSAM scanning on iCloud
| photos without rolling out E2E encryption for iCloud photos.
| It's just not worth the PR blowback.
| baxtr wrote:
| I think in reality 99.9% of all people don't care at all about
| Apple doing this.
| bobbuf wrote:
| And 99.9% of people don't care if your algorithm is O(n) or
| O(n^2).
|
| It literally does not matter what clueless people think or
| say about this subject. The informed 0.01% (that number isn't
| correct but whatever) have a huge influence on things and
| Apple has just as many incentives to please them as they do
| for the "I bought a 2K device to jack off and watch Netflix"
| crowd.
| threeseed wrote:
| If not higher.
|
| Right now every company is already scanning your photos server
| side, but somehow it's an issue if it's client side?
|
| I think once people think this through a bit more it will
| blow over.
| ElFitz wrote:
| Having things scanned server side is part of an optional
| feature that has some value.
|
| But not being able to trust the device itself makes the
| device pretty much useless.
| wang_li wrote:
| Yes it's an issue client side. I've never uploaded a photo
| to iCloud. I have no plans to do so. Getting scanned server
| side is effectively opt-in. Putting the scanning client side
| has measurable, though minimal, effects even if I continue
| not to use iCloud.
|
| Not to mention that within a decade they'll be scanning
| every image that passes through people's phones.
| maverwa wrote:
| I'd guess you could add a few more 9s to that. Almost all
| people either don't care or actively like this. And if you
| don't fear (or don't understand) the implications and risks, I
| can see why you'd like what Apple does. It's one of the few
| topics where all of mankind (with very few exceptions) agrees:
| CSAM is bad! That's why "we do it for the kids" always works.
| YLYvYkHeB2NRNT wrote:
| Within my circle, people do not care. They will continue to
| use Apple products because, "I have nothing to hide."
| That's what they told me when I brought it up.
| dijit wrote:
| My group is a bit more nuanced: most will likely stick with
| Apple, since the effect is somewhat invisible to them, but
| this topic raised the question of "should I stay on iPhone" -
| which is not a question you want coming up very often if
| you're trying to sell these devices.
| ipaddr wrote:
| Do they allow you to look through their phones if they
| give that opinion?
| PolCPP wrote:
| Why change now, when the damage is already done? People who
| know how badly this can turn out have probably already left,
| or are figuring out how to leave, Apple's ecosystem.
| hughrr wrote:
| I've already left. They doubled down on it straight away.
| Then wheeled out Craig to tell us lies. They're not on my
| side. I don't have to be their customer.
|
| The dangerous part is they have advertised this capability.
| Now regimes may make that a prerequisite for allowing them to
| do business.
|
| The cat's out of the bag and ain't going back in. We now have
| to re-evaluate any data that we do not control ourselves or
| control access to.
|
| Ironically this isn't the world I want for my kids.
| TechBro8615 wrote:
| Every instance of "government" in this article comes with some
| qualifier, like "foreign" or "other" - watch out for those
| foreign governments who might spy on their foreign citizens.
|
| Is the implication that this technology could only do evil in
| other countries? If Apple deploys this in the US, they're saving
| the children, but if they deploy it in China, they're
| facilitating an oppressive autocracy?
|
| Is the US somehow immune from this same threat?
| bsder wrote:
| > Is the US somehow immune from this same threat?
|
| No, but it's easier to paint China as evil in the US and the US
| as evil in China if you want people to get the point.
| cyanydeez wrote:
| Yes, the US will spy on you to help big corporations make a
| profit.
|
| Everyone's missing the DMCA 2.0 trojan horse in Apple's
| actions.
| whymauri wrote:
| Good ol' Manufacturing Consent.
| Clubber wrote:
| >they're facilitating an oppressive autocracy
|
| >Is the US somehow immune from this same threat?
|
| No. If you investigate how many police in the US now act with
| qualified immunity and get away with it scot-free, you would
| be horrified. I would guess that Chauvin is one of thousands.
|
| Here's just a taste.
|
| https://www.theguardian.com/us-news/2020/oct/29/us-police-br...
| knaik94 wrote:
| I think the understanding is that the US so far hasn't pushed into
| law any policy instructing companies like Apple to publicly
| censor people. Secret surveillance and privacy has been
| debated, but not freedom of speech. The US has not used the
| kind of public censorship used by other countries to facilitate
| an oppressive autocracy.
|
| The US government tends to use one of the four horsemen
| (CSAM, drugs, terrorism, or organized crime) as motivation to
| deploy censorship and undermine privacy, but freedom of
| speech is generally protected.
|
| Foreign governments censor things like undesirable political
| opposition, LGBTQ+ activism, women's rights activism, and
| historical events like the massacre of protestors.
|
| I think the implication is that the technology is likely to do
| a lot more harm in other countries compared to the harm done in
| the US, so "it's okay" if it's only deployed in the US in the
| name of saving children. A lot of people from the US are
| strongly against the Apple policy regardless.
| drivingmenuts wrote:
| We've been through all this "for the children" brouhaha
| before: the War on Rock Music Lyrics, the War on Drugs, etc.
| It happens any time a parent with influence decides that
| what's best for their children is good for all children.
|
| It doesn't take a village - it just takes a mom with a loud
| voice.
| noasaservice wrote:
| When the mainstream news media is under the same financial
| umbrella as all the defense contractors, is it any surprise
| that we see the "undesirable political opposition, LGBTQ+
| activism, women's rights activism, and historical events like
| the massacre of protestors" covered up or not even reported on
| to begin with?
| herbstein wrote:
| Herman & Chomsky's "Manufacturing Consent" proposes that the
| media's influence doesn't work by way of "old men in a dark,
| smoky backroom", as the phrase conjures images of. Rather,
| it's a series of filters that funnel like-minded people into
| the decision-making roles of the companies. If we simplify it
| a lot, it's the claim that a CEO is very likely to hire a
| senior editor who shares his views on the world; that editor
| in turn hires writers with a similar outlook on the world,
| etc. Likewise, promotions within these companies are
| predicated on this same mechanism.
|
| To the question "Do you think all journalists are self-
| censoring? Do you think _I'm_ self-censoring?", posed by
| the BBC's Andrew Marr, Chomsky answered "I think you
| believe everything you're saying because if you didn't you
| wouldn't be sitting here". Both quotes paraphrased but
| meaning-preserving. [0]
|
| This is one of five filters laid out in "Manufacturing Consent".
| It's, in my eyes, a very compelling analytical framework.
|
| [0]: https://youtu.be/lLcpcytUnWU?t=176 - Video title is
| terribly divisive but the quote is at the timestamp. The
| whole 3 minute video just gives a bit of context to why the
| question is asked. It also cuts off as Andrew Marr is about
| to say something, making the video's title bunk. Viewer
| beware.
| gfrff wrote:
| I don't think any of those are offensive to defense
| contractors, etc. Polarizing the public on these issues is
| exactly the way to have the public turn a blind eye to
| graft.
| howaboutnope wrote:
| > deploy censorship [..] but freedom of speech is generally
| protected
|
| Can you elaborate a little? I understand and agree with your
| overarching point but this bit threw me.
| 1vuio0pswjnm7 wrote:
| "But Apple has a record of obstructing security research."
|
| Any examples besides Corellium?
| IncRnd wrote:
| One of the words that you quoted from the article was linked
| directly to an example.
| dang wrote:
| I've re-upped this thread in lieu of
| https://news.ycombinator.com/item?id=28264032, which references
| this article but is baitier and led to more of a garden-variety
| thread.
|
| The current submission got a surprising number of upvotes for a
| post that remained underwater (below the front page):
| http://hnrankings.info/28238163/. It's on my list to detect
| threads like that and rescue them. This case will be a hell of an
| example for testing.
| jsnell wrote:
| Though it's also a dupe of the thread from 2 days ago (310
| votes, 77 comments):
| https://news.ycombinator.com/item?id=28246054
| dang wrote:
| Ah good point. Thanks!
| MR4D wrote:
| It doesn't matter. The horse has left the barn already.
|
| Now that every country knows that Apple _can_ do this, they have
| a pretext for forcing them to do it in the manner of said
| country's choosing.
|
| That to me is the real loss here.
| kgeist wrote:
| Many of my oppressive country's laws are introduced under the
| pretext of "saving the children". For example, public
| discussion of homosexuality is essentially banned, because
| otherwise underdeveloped minors might get involuntarily
| exposed to it (and supposedly get psychologically
| traumatized). Then another law allows banning websites that
| talk about drugs, LGBT, opposition protests etc. without a
| court order, to save children from being involved in those
| traumatizing things, of course (now it's used to ban
| opposition sites). And it's hard to argue against it, because
| you are pushed back with, "What, you hate kids? You don't want
| them to be safe?" It's a clever, ugly trick: most adults are
| parents, their parental instincts kick in when they hear about
| all that abuse, and they will support any cause that'll make
| their kids safer.
|
| I'm not saying Apple is definitively involved in some shady
| stuff, but from my perspective, it does look like the NSA
| forced them to add some sort of file-scanning backdoor and
| they came up with this "it's about saving the children"
| explanation, already successfully in use in oppressive
| countries.
| read_if_gay_ wrote:
| Child abuse, terrorism, money laundering and tax evasion.
|
| These are any government's four horsemen of the apocalypse.
| According to them, they are roughly the same degree of evil and
| all deserve the strictest prosecution.
|
| But only two aren't bogeymen.
| jancsika wrote:
| > But only two aren't bogeymen.
|
| Regardless of intent, that's an effective troll.
|
| You could ask 10 people which are the bogeymen and you'll
| probably get all 6 different combinations!
| jdavis703 wrote:
| If this is true, why is it so hard for the US to increase
| enforcement funding for the IRS while purported anti-sex
| abuse laws like SESTA/FOSTA are passed with broad bipartisan
| support?
| MR4D wrote:
| Because voting against said laws causes politicians to lose
| elections.
|
| Voting for IRS funding only gets you half the voters.
| shadilay wrote:
| IRS enforcement is targeted against people with the
| resources to stop it.
| kiba wrote:
| That would presume a consistent and systematic belief
| system.
| slg wrote:
| On the flip side, "think of the children" does not necessarily
| mean the argument is without merit. There are countless debates
| in which it is a valid point. For example, climate change is a
| big "think of the children" issue because the impact will be
| felt more strongly by children than by the people who are currently
| in power. I think some people have become too cynical and see a
| "think of the children" argument and reflexively take the
| opposite stance in order to be contrarian. It is basically the
| fallacy fallacy[1] that you see all over the place, especially
| in debates on the internet.
|
| [1] - https://en.wikipedia.org/wiki/Argument_from_fallacy
| croes wrote:
| Those are different kinds of children. In the case of climate
| change it means the next generation of adults. In terms of
| cyber security it means actual children, and the phrase is
| simply used to trigger protective instincts and make counter-
| arguments seem cruel and suspicious. Especially because child
| porn isn't the initial but a follow-up crime; the initial one
| is the actual abuse, which isn't prevented.
| haspoken wrote:
| https://archive.is/y58Py
| rfd4sgmk8u wrote:
| I feel somewhat optimistic about the future when so many
| groups saw through this push for on-device scanning for what
| it was. Damn straight the tech is dangerous.
| knaik94 wrote:
| One additional issue that I haven't really seen discussed is how
| to handle a situation when a false accusation is made.
|
| If a person knows the right people who work at these companies,
| things get sorted out, but I imagine sometimes a person is forced
| to just handle the consequences.
|
| Stepping away from CSAM and going back to something like
| developer accounts and apps getting banned on platforms for
| violating vague "guidelines": it's someone's livelihood that's
| sometimes destroyed. Demonetization, apps getting banned, payment
| processors freezing accounts are mostly black box events and most
| situations aren't even related to crimes dealing with CSAM.
|
| If it was something the government made a mistake with, there
| are legal ways to fight for your rights. There's generally a
| level of transparency that is afforded to you.
|
| It is concerning that people flagged for handling CSAM will not
| know if they have been manually reviewed. The need to keep the
| forwarding to authorities a secret is understandable, but a human
| review before forwarding is only necessary if you expect false
| positives to begin with. Keeping that flag secret seems like
| another black box you can't fight as a user.
|
| I don't deny the value of catching these criminals, but it throws
| the idea of due process out the window when the only assurance so
| far has been "trust us to do the right thing".
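|
| To make that concrete, here is a minimal sketch (in Python,
| with hypothetical names and an illustrative threshold, not
| Apple's actual implementation) of the kind of pipeline being
| described. Every outcome is invisible to the user:
|
|     # Hypothetical threshold-and-review pipeline; names and
|     # threshold are assumptions for illustration only.
|     MATCH_THRESHOLD = 30  # illustrative value
|
|     def human_review(flagged_images):
|         # Stand-in for a manual reviewer confirming matches; a
|         # review step is only needed at all if false positives
|         # are expected upstream.
|         return all(img.get("confirmed") for img in flagged_images)
|
|     def process_account(flagged_images):
|         if len(flagged_images) < MATCH_THRESHOLD:
|             return "no action"       # user never notified
|         if not human_review(flagged_images):
|             return "false positive"  # user never notified
|         return "forwarded"           # user never notified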
|
| It's also weird how Apple has chosen to intentionally insert
| itself into the investigation pipeline rather than just let NCMEC
| handle it like all other cloud providers.
|
| I am glad this hasn't flown under the radar just because it is
| Apple who is making these promises. I have heard non-tech people
| talk about this but there's a lot of misunderstanding.
| skinkestek wrote:
| > It's also weird how Apple has chosen to intentionally insert
| itself into the investigation pipeline rather than just let
| NCMEC handle it like all other cloud providers.
|
| Not so weird actually in my opinion:
|
| Apple wants to make sure their customers don't get into
| trouble for no good reason; that would be bad for business.
|
| Personally, as a European living in Norway, I have great
| respect for the local police (mostly). Great people, mostly
| going out of their way to serve the public (again, mostly).
|
| Norwegian child protection, however, is something I'd rather
| steer clear of. They are well known to simultaneously ignore
| kids who need help and harass innocent parents.
|
| Again, not all of them are like this, and many of them try to
| do good, but they seem to work largely outside the control of
| the courts, so if they mess up you have to go to an EU court
| to fix it. (Two or three cases in just the last 18 months,
| from a small country like Norway.)
|
| So something similar might also be at play, but I don't know
| what reputation NCMEC has, only that it is well known that a
| number of people have gotten into serious trouble because of
| overeager employees at photo labs reporting innocent images.
| ec109685 wrote:
| There's a human review at other cloud providers too.
| hirvi74 wrote:
| > It's also weird how Apple has chosen to intentionally insert
| itself into the investigation pipeline rather than just let
| NCMEC handle it like all other cloud providers.
|
| So, I am not much of a conspiracy theorist, but I do like to
| sometimes fantasize about alternative realities in which
| conspiracies were true.
|
| I am not saying the US government had any involvement in
| Apple's decision, but what if they did? I do agree with your
| point about how this topic more or less came out of left
| field. It's clear that Apple did not just recently acquire the
| technological ability to produce this feature in 2021. This
| feature could have been implemented years ago (as many other
| companies with a consumer-available cloud storage model
| already did to some degree). I am just curious whether Apple
| really had a "choice" in this matter. Perhaps my monkey brain
| just wants this to be the case.
| knaik94 wrote:
| I don't know for sure if the US government has publicly
| endorsed creating a backdoor after the San Bernardino
| shooting incident. That was 2015/2016. The Apple vs Epic case
| made public an interesting email in which Apple's head of
| fraud acknowledged the problem of CSAM on their platform in
| Feb 2020 [1], but I agree with you that this kind of feature
| has to have been in development since before then, because it
| has clearly been red-teamed. The feature wasn't released until
| they knew they would have a response to all of the technical
| questions concerning the actual implementation.
|
| The UK policy, from 2017, around age verification [2] before
| giving access to pornographic material has definitely also
| played a part. Even YouTube has recently made a strong
| verification effort, going as far as asking for a credit card
| or photo ID in response to COPPA [3] and EU policy. But
| Apple hasn't framed the porn-blurring-for-minors feature as a
| response to that policy, which is surprising. And none of
| those government policies explain the need to do on-device
| scanning of CSAM instead of in the cloud like everyone else.
|
| I personally believe Apple felt a requirement from a business
| perspective more than a need to avoid government regulation.
| It doesn't seem to be a secret-warrant situation, because
| public disclosure wouldn't really make sense. If catching
| criminals consuming/sharing CSAM is the motive, warning those
| criminals that this change is happening before implementation
| seems counter-productive.
|
| They even went as far as going after leakers a couple of
| months ago [4].
|
| > The letter purportedly cautioned leakers that they must not
| disclose information about unreleased Apple projects because
| it may give Apple's competitors valuable information and
| "mislead customers, because what is disclosed may not be
| accurate."
|
| It's clear in hindsight that these "features" leaking early
| is what Apple was afraid of, as it's been confirmed that the
| CSAM scanning algorithm was found, inactive, in iOS 14.3. The
| current stance is that Apple will have an additional check,
| but if the presence of the scanning algorithm had leaked back
| then, they wouldn't have been able to do the same PR spinning
| that they're doing now.
|
| I agree with you that it really doesn't fit Apple's narrative
| of privacy first. They are the same company that developed
| the Secure Enclave to make a more secure product, starting
| with the iPhone 5s in 2013.
|
| I hope we do get some clarity on the motivation because it's
| clear that no one buys the "we did it for better privacy"
| narrative they are currently pushing. Their hand being forced
| doesn't seem out of the question to me either, but their own
| public response seems to make it doubtful to me.
|
| I think it speaks volumes that the first item that shows up
| on Google when you search the term "icloud scanning apple" is
| a lifehacker article titled "How to Stop Apple From Scanning
| Your iPhone Photos Before iOS 15 Arrives".
|
| 1. https://www.theverge.com/22611236/epic-v-apple-emails-
| projec...
|
| 2. https://www.theguardian.com/society/2021/may/05/uk-
| governmen...
|
| 3. https://www.androidpolice.com/2021/06/24/why-youtube-
| wants-s...
|
| 4. https://www.macrumors.com/2021/06/24/reliable-leaker-kang-
| hi...
| Geee wrote:
| Just think what kind of power this would give the USA when
| they invade countries like Afghanistan. They could easily
| cancel all the people who don't like their presence, and be
| able to shape the narrative with their propaganda. I'm
| thinking that maybe this is the actual reason they want tools
| like this. Afghanistan failed because of freedom of speech ->
| need more tools to limit freedom of speech.
| warkdarrior wrote:
| If only the US could have effectively cancelled the Taliban using
| fake CSAM on their phones...
| nomoreplease wrote:
| I believe Jonathan Mayer (one of the authors) is a
| user/commenter here on Hacker News.
| ufmace wrote:
| I haven't seen this asked on any of the threads about this yet,
| but what happens if we identify a few of the pics in their
| database of Evil Pictures, and send them (presumably from a non-
| Apple device) to the iPhone of anybody we don't like.
|
| Presumably the actual data on the device is still encrypted and
| can't be accessed remotely, which means we need to trigger a law
| enforcement investigation which involves seizing the device and
| compelling the owner to unlock it in order to determine if they
| actually are a kiddie diddler or something went wrong. Gee, can't
| see how that could possibly go wrong. /s
|
| Meanwhile, the actual kiddie diddlers out there have probably
| read the 10 million articles published about this by now and know
| not to use iMessage to trade their pictures, so probably not many
| of them would actually be caught this way.
| 1MachineElf wrote:
| This method of targeting people you don't like was used heavily
| against political groups on Facebook during the 2016 US
| election.
| shadilay wrote:
| What is the CSAM version of SWATting going to be called?
| hipsterhelpdesk wrote:
| Easy win. Not needed. There's enough hate for tech already. Apple
| scrapped it. I wish they would move on.
| ipv6ipv4 wrote:
| Do you know something that Apple hasn't publicly announced yet?
|
| It hasn't been scrapped.
| KitDuncan wrote:
| They didn't scrap it though?
| IncRnd wrote:
| No. Apple didn't scrap this.
| 1cvmask wrote:
| In a previous comment on this very same subject, Apple's
| attempt to flag CSAM, I wrote: this invasive capability at
| the device level is a massive intrusion on everyone's privacy
| and there will be no limits on governments expanding its
| reach once implemented. The scope will always broaden.
|
| Well, in the article they correctly point out how governments
| around the world have already broadened the scope of scanning,
| violating privacy by content-matching political speech and
| deploying other forms of censorship and tracking.
|
| We already have that now on the big tech platforms like
| Twitter, which censor or shadow-ban content that they, as the
| arbiters of truth (or truthiness, as Colbert used to say on
| the old Colbert Report), egged on by politicians and big
| corporate media, label as misinformation or disinformation.
|
| Do we now need to be prevented from communicating our thoughts
| and punished for spreading truths or non-truths, especially
| given the false positives, malware injections, and remote
| device takeovers and hijackings by the Orwellian Big Tech
| oligopolies?
|
| Power corrupts, and absolute power corrupts absolutely; this
| is too much power in the hands of Big Corporations and
| Governments.
|
| From the article in case you need the lowdown:
|
| Our system could be easily repurposed for surveillance and
| censorship. The design wasn't restricted to a specific category
| of content; a service could simply swap in any content-matching
| database, and the person using that service would be none the
| wiser.
|
| A foreign government could, for example, compel a service to out
| people sharing disfavored political speech. That's no
| hypothetical: WeChat, the popular Chinese messaging app, already
| uses content matching to identify dissident material. India
| enacted rules this year that could require pre-screening content
| critical of government policy. Russia recently fined Google,
| Facebook and Twitter for not removing pro-democracy protest
| materials.
|
| We spotted other shortcomings. The content-matching process could
| have false positives, and malicious users could game the system
| to subject innocent users to scrutiny.
|
| We were so disturbed that we took a step we hadn't seen before in
| computer science literature: We warned against our own system
| design, urging further research on how to mitigate the serious
| downsides. We'd planned to discuss paths forward at an academic
| conference this month.
|
| That dialogue never happened. The week before our presentation,
| Apple announced it would deploy its nearly identical system on
| iCloud Photos, which exists on more than 1.5 billion devices.
| Apple's motivation, like ours, was to protect children. And its
| system was technically more efficient and capable than ours. But
| we were baffled to see that Apple had few answers for the hard
| questions we'd surfaced.
|
| China is Apple's second-largest market, with probably hundreds of
| millions of devices. What stops the Chinese government from
| demanding Apple scan those devices for pro-democracy materials?
| Absolutely nothing, except Apple's solemn promise. This is the
| same Apple that blocked Chinese citizens from apps that allow
| access to censored material, that acceded to China's demand to
| store user data in state-owned data centers and whose chief
| executive infamously declared, "We follow the law wherever we do
| business."
|
| Apple's muted response about possible misuse is especially
| puzzling because it's a high-profile flip-flop. After the 2015
| terrorist attack in San Bernardino, Calif., the Justice
| Department tried to compel Apple to facilitate access to a
| perpetrator's encrypted iPhone. Apple refused, swearing in court
| filings that if it were to build such a capability once, all bets
| were off about how that capability might be used in future.
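|
| The repurposing point above is easy to see in miniature. Below
| is a minimal sketch (Python; the hash function and database
| are placeholders, not Apple's NeuralHash or the authors'
| system): nothing in the matching logic knows what the database
| contains, so swapping in a database of political images
| repurposes the whole system.
|
|     import hashlib
|
|     def perceptual_hash(image_bytes):
|         # Stand-in only: a real system uses a perceptual hash
|         # that survives resizing and re-encoding, not a
|         # cryptographic hash like SHA-256.
|         return hashlib.sha256(image_bytes).hexdigest()
|
|     def scan(image_bytes, hash_database):
|         # The matcher never inspects what the database "means":
|         # CSAM hashes and dissident-material hashes look alike.
|         return perceptual_hash(image_bytes) in hash_database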
| rfd4sgmk8u wrote:
| Maybe it is 4D chess. I am very pleased by the pushback on
| this; in fact, given the tech community outcry, this will not
| happen for another 5 years. Apple bought themselves some time
| before the beast forced a move. (Regardless, I have already
| taken steps to move away from the Apple ecosystem. Take that,
| Tim, see what happens!!!!!)
| zamalek wrote:
| The problem is that Apple has let the genie out of the bottle.
| With all the very public blowback and drama they have created,
| otherwise-ignorant politicians are now aware of what is
| possible and could start demanding it.
|
| Great job, Apple.
___________________________________________________________________
(page generated 2021-08-22 23:00 UTC)