[HN Gopher] Apple's new abuse prevention system: an antritust/co...
___________________________________________________________________
Apple's new abuse prevention system: an antritust/competition point
of view
Author : simonebrunozzi
Score : 119 points
Date : 2021-08-06 17:47 UTC (5 hours ago)
(HTM) web link (blog.quintarelli.it)
(TXT) w3m dump (blog.quintarelli.it)
| echelon wrote:
| Or inversely, the FBI/CIA/CCP went to Apple and said "it'd be a
| shame if it turned out you were a monopoly".
|
| Apple caved to pressure and had to implement this.
|
| Whatever the angle, this isn't about protecting kids whatsoever.
| It's about power.
| peakaboo wrote:
| And the vaccine is not about protecting people either, it's
| also about power.
| annadane wrote:
| lmao
| fsflover wrote:
| How does vaccine give power to anyone?
| arthurcolle wrote:
| "Do your own research, the 5G implants from the vaccine
| have a secret server and reverse proxy to let them implant
| thoughts into you"
| shotta wrote:
| There's no way that's true. I'd be more likable by now.
| FractalParadigm wrote:
| I don't know what's more impressive, the fact someone
| managed to come up with something so ridiculous, or the
| fact some people actually believe this level of
| technology exists.
| bbarnett wrote:
| The average Joe has no idea how computing and tech work. None.
|
| To such a person, a smartphone is a piece of magic.
| Expecting them to know what is real tech, and what isn't,
| is not fair.
| nickthegreek wrote:
| It gave me the power to confidently shop again.
| echelon wrote:
| The vaccine that you've been _forced_ to take?
|
| Please don't turn this into a clown show.
|
| The privacy and openness of our computers and devices is
| paramount to freedom and a strong democracy.
| thepangolino wrote:
| Depending on where they live, they might very well have been
| forced to take it if they were hoping to keep a semblance of
| normal life.
|
| That's why I got vaccinated at least (with the government
| forcing my hand).
|
| What truly scared me was realising that if the requirement
| was ratting on my Jewish neighbours, I probably would have
| done it too.
| revscat wrote:
| Good.
| jjulius wrote:
| That's not quite accurate; you both would end up having a
| "semblance of a normal life" without getting vaccinated; you
| just would've had to wait a bit longer.
| jbverschoor wrote:
| or just get fired if you're a pilot
| dcolkitt wrote:
| Apple's a $2 trillion company. If not even they have enough
| legal firepower to stand up to the three-letter agencies, what
| possible chance does any other private citizen have? This
| should be a wake-up call.
|
| It's time to start dismantling massive chunks of the
| intelligence community. It no longer works for the citizenry
| that is supposedly its boss. (If it ever did.) It's become a
| power black hole unto itself.
|
| Even elected officials, up to POTUS, have found themselves
| unable to control the unelected and unaccountable fiefdoms that
| make up the intelligence community.
| zepto wrote:
| Most people _want_ the TLAs to be going after child abusers
| and pedophiles. Good luck using this as your argument for
| dismantling them.
| whartung wrote:
| Similarly, many people want the TLAs to be able to go after
| $2T companies as well.
| collaborative wrote:
| What a messed up world we live in where money and power are the
| only things that matter to the people that get to make a
| difference. But has it not always been this way? The show must
| go on..
| IfOnlyYouKnew wrote:
| This is some guy's theory, and they can't even spell "anti-
| trust" (in the headline, no less). It's not quite enough to
| lose all trust in society over.
| collaborative wrote:
| From what I can tell he is Italian, so spelling should not
| be a reason to judge imo
| asimpletune wrote:
| I think that's in essence what the author is arguing; at least
| the outcomes are the same. The only difference is that maybe
| none of the 3-letter agencies had to come out and explicitly
| say it, since Apple is perfectly competent at spotting a bone
| to toss.
|
| In other words, the author thinks that with Apple's back to a
| wall, they only needed to announce this feature for the
| government to see that there are advantages to Apple having
| tight control as well. Now they'll be able to make that very
| same argument in court in a public sense, but there's always a
| behind-the-scenes sense with 3-letter agencies as well.
|
| Granted, all of that is speculation and who knows what is
| really driving any of this. The author does have a point that
| if this first step causes bad guys to move on from these
| services, that will be future justification to move the
| scanning further and further upstream, to the point where it's
| baked into the APIs or something. At that level, Apple would
| really need a "monopoly" to accomplish such a feat.
|
| It's certainly an interesting and creative perspective.
| [deleted]
| debt wrote:
| It's no coincidence this system launched around the same time
| the whole NSO scandal broke. The NSO leak shows what
| government-sponsored exploit analysis against a large tech
| company may yield. I mean, the NSO exploit could've worked the
| same but been a worm; it could've been absolutely devastating
| for Apple. Imagine something like every phone infected.
| Something like that was possible with that exploit.
|
| Apple has been a thorn in the side of the IC for a long while.
| The IC probably saw an opportunity to gain a bit of leverage
| themselves via the whole NSO thing, and likely offered their
| cyber support in exchange for some support from Apple.
|
| I mean, c'mon, they've been consistently pressed by the IC for
| tooling like what they just launched; it's the least invasive
| thing (compared to a literal backdoor like the _NSAKEY that MS
| did for Windows) they can offer in exchange for some
| cybersecurity support from the gov.
|
| idk if that's what's happened, but it's odd Apple would do this
| at all, and do it right around the time of the NSO thing.
| least wrote:
| There is no such thing as a trustworthy third party and even
| trusting yourself is questionable at the best of times. We are
| constantly balancing a bunch of different considerations with
| regards to the way that we compute, purchase devices, and utilize
| services. Security and privacy are of course important, and Apple
| to date has had a fairly good (if shallow) track record in this
| regard, at least in the United States.
|
| With that being said, what Apple is doing here is just a blatant
| violation of that 'trust' and certainly a compromise of their
| commitment to privacy. Under no circumstances is it justifiable
| to essentially enlist people's devices to police their owners,
| while using the electricity that you pay for, the internet
| service you pay for, and the device itself that you pay for to
| perform a function that is of absolutely no benefit to the user
| and in fact can only ever be harmful to them.
|
| It doesn't matter that the net data exfiltrated by Apple ends up
| being the same as before (through scanning unencrypted files on
| their servers). The distinction is so obvious to me that I find
| it incredible that people are legitimately arguing that it's the
| same, or that in some way this is actually helping preserve
| user privacy.
|
| As mentioned in the article, this does absolutely nothing towards
| protecting children other than directing all but the biggest
| idiots towards platforms that can't be linked to them, which I'd
| imagine, they already are.
| fossuser wrote:
| > "As mentioned in the article, this does absolutely nothing
| towards protecting children other than directing all but the
| biggest idiots towards platforms that can't be linked to them,
| which I'd imagine, they already are."
|
| I suspect you're more wrong than you think about this. People
| share large volumes of CSAM through lots of different services;
| I knew someone who worked on the problem at _LinkedIn_ (!).
|
| HN likes to downplay the actual reality as if it's always some
| trojan horse, but the issue is real. It's worth talking to
| people who work on combating it if you get the chance. I'm not
| really commenting on Apple's approach here (which I haven't
| thought enough about), but I know enough to say that an
| immediate dismissal based on it 'not helping' doesn't really
| appreciate the real tradeoffs you're making.
|
| You can be against this kind of thing from Apple, but as a
| result more CSAM will be undetected. Maybe that's the proper
| tradeoff, but we shouldn't pretend it's not a tradeoff at all.
|
| "Robin Hanson proposed stores where banned products could be
| sold. There are a number of excellent arguments for such a
| policy--an inherent right of individual liberty, the career
| incentive of bureaucrats to prohibit everything, legislators
| being just as biased as individuals. But even so (I replied),
| some poor, honest, not overwhelmingly educated mother of five
| children is going to go into these stores and buy a "Dr.
| Snakeoil's Sulfuric Acid Drink" for her arthritis and die,
| leaving her orphans to weep on national television" [0]
|
| [0]: https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-
| deb...
| radicaldreamer wrote:
| This system is likely closely related to fully encrypted E2E
| iCloud backups: https://www.reuters.com/article/us-apple-fbi-
| icloud-exclusiv...
| twobitshifter wrote:
| Were iCloud photos already scanned for CSAM? In the on-device
| system, if you're not using iCloud, are the photos scanned?
|
| If that's true, as an iCloud user you are exactly as likely to
| be charged with a crime based on your photos as you were
| before, but you now get E2E encryption.
|
| Obviously I'd prefer E2E without any scanning. If I wanted to
| upload a pirated MP3 to iCloud, I wouldn't want the RIAA
| knocking on my door. However, given that scanning was already
| in place, is this a step forward?
| voidnullnil wrote:
| When I was 13 and my parent made me use a content filter on the
| web, I bypassed it and watched porn and they never found out.
|
| On the other hand: why would I ever want a piece of tech that
| reports me to the police (even if for legitimate reasons)?
|
| EDIT:
|
| >Anonymous helplines and guidance exist for adults with at-risk
| thoughts and behaviour [https://www.apple.com/v/child-
| safety/a/images/guidance-img__...]
|
| LOL NVM I TRIED BEING POLITE BUT NUKE SAN FRANCISCO AT THIS POINT
| TO BE HONEST
|
| this horseshit is why i quit the software industry 10 years ago.
|
| see also: https://news.ycombinator.com/item?id=28077491
|
| this is the cancer you are creating
| whoknowswhat11 wrote:
| Then Apple will lose market share and correct their ways.
|
| Conversely, what I've seen does put this at the top of my list
| as a parent. It will notify me if my child is sending nudes. It
| will notify me if someone is sending porn to my child. It will
| notify police if known child porn is on the device.
|
| When folks talk about competition, part of this MUST include
| the USERS' preferences (not, as is currently done, a focus on
| what I see as largely predatory billers and businesses who I
| don't care about as a user).
|
| I don't want child porn on my systems. I'd be very happy if
| Apple helps keep it off them.
|
| Are these hash databases available more broadly for scanning
| (ie, could I scan all work machines / storage using a tool of
| some sort)?
| abawany wrote:
| I don't think they will ever be able to walk this back. The
| governments that twisted Apple's arm to get this look-see
| into everyone's devices will roast them in the court of
| public opinion (or threaten to, which will be enough). IMO,
| this will just open up more and more - it will never go back
| to being what iDevice owners have now.
| zepto wrote:
| > The governments that twisted Apple's arm to get this
| look-see into everyone's devices will roast them in the
| court of public opinion
|
| Nothing about this technology gives governments a 'look-see'
| into everyone's devices.
| artificial wrote:
| The government controls the hash list.
| [deleted]
| paulryanrogers wrote:
| The hash data is secret because, if it were widely known,
| offenders would know which images are known to law enforcement
| and could transform or delete only those.
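|
| A toy sketch of what matching against such a list can look
| like: a generic average hash compared by Hamming distance (not
| Apple's NeuralHash or Microsoft's PhotoDNA), with a made-up
| blocklist value. It only illustrates why a leaked list, or a
| leaked hash function, would let someone test and perturb images
| until they stop matching.
|
|     # Illustrative average-hash ("aHash") matcher; requires Pillow.
|     from PIL import Image
|
|     def ahash(path, size=8):
|         """64-bit perceptual hash: shrink, grayscale, threshold at mean."""
|         img = Image.open(path).convert("L").resize((size, size))
|         pixels = list(img.getdata())
|         mean = sum(pixels) / len(pixels)
|         bits = 0
|         for p in pixels:
|             bits = bits * 2 + (1 if p >= mean else 0)
|         return bits
|
|     def hamming(a, b):
|         return bin(a ^ b).count("1")
|
|     # Hypothetical blocklist of known-image hashes (stand-in value).
|     BLOCKLIST = {0x0F0F0F0F0F0F0F0F}
|
|     def matches(path, threshold=10):
|         """Flag files within `threshold` bits of any known hash."""
|         h = ahash(path)
|         return any(hamming(h, known) <= threshold for known in BLOCKLIST)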
| ribosometronome wrote:
| Isn't that most of the internet? I would be surprised if, for
| example, you didn't get reported by Hacker News if you started
| making criminal threats or sharing CSAM on here. From the legal
| tab:
|
| >3. SHARING AND DISCLOSURE OF PERSONAL INFORMATION
|
| >In certain circumstances we may share your Personal
| Information with third parties without further notice to you,
| unless required by the law, as set forth below:
|
| ...
|
| >Legal Requirements: If required to do so by law or in the good
| faith belief that such action is necessary to (i) comply with a
| legal obligation, including to meet national security or law
| enforcement requirements, (ii) protect and defend our rights or
| property, (iii) prevent fraud, (iv) act in urgent circumstances
| to protect the personal safety of users of the Services, or the
| public, or (v) protect against legal liability.
| voidnullnil wrote:
| >Isn't that most of the internet?
|
| No, bad analogy.
| ribosometronome wrote:
| Insightful reply, thanks.
| voidnullnil wrote:
| Well, it was succinct and other people got it.
|
| Some website reporting you to the police for doing
| something illegal is not the same as your
| hardware/software being stuffed with snake-oil spyware
| that slows down the UI, all for some made-up cause.
| pl0x wrote:
| Apple and Google need to be put under serious investigation and
| broken up. Their hardware needs to be open to installing an OS
| of your choice. This isn't possible on iPhones. It may take
| decades for this to happen given the lobbying dollars both
| spend, but by then it will be too late.
|
| We are headed for a China style surveillance state and there is
| no stopping this train.
| lamontcg wrote:
| > But when a backdoor is installed, the backdoor exists and
| history teaches that it's only a matter of time before it's also
| used by the bad guys and authoritarian regimes.
|
| Problem is that this scanning is necessarily fuzzy and there is
| going to be a false positive rate to it. And the way that you'll
| find out that you've tripped a false positive is that the SWAT
| team will knock your door down and kill your dog (at a minimum).
| Then you'll be stuck in a Kafkaesque nightmare trying to prove
| your innocence where you've been accused by a quasi-governmental
| agency that hides its methods so the "bad guys" can't work around
| them.
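|
| A rough sketch of how a per-image false-positive rate would
| aggregate over a photo library, assuming independent matches
| and made-up numbers (these are not Apple's published figures,
| and they say nothing about targeted or adversarial images):
|
|     # Poisson approximation of P(at least `threshold` false matches).
|     # All rates and counts below are assumptions for illustration.
|     from math import exp, factorial
|
|     def p_flagged(n_photos, p_false, threshold, extra_terms=60):
|         lam = n_photos * p_false   # expected number of false matches
|         # Upper tail, truncated; the remaining terms are negligible here.
|         return sum(exp(-lam) * lam ** k / factorial(k)
|                    for k in range(threshold, threshold + extra_terms))
|
|     print(p_flagged(10_000, 1e-6, threshold=1))    # ~0.01
|     print(p_flagged(10_000, 1e-6, threshold=30))   # ~4e-93
|
| With a reporting threshold of one match, roughly 1% of
| 10,000-photo libraries would trip it by chance at that assumed
| rate; a higher threshold is what drives the number down, and
| the whole calculation depends on per-image rates that outsiders
| can't verify.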
|
| It isn't just "authoritarian regimes" abusing it; it is the
| stochastic domestic terrorism that our own government currently
| carries out against its own citizens every time there's a
| bureaucratic fuckup in how it manages its monopoly on violence.
|
| This is the "Apple/Google cancelled my account and I don't know
| why" problem combined with SWATing.
| throwayws wrote:
| Planting false evidence is getting a new twist here. The
| attacker doesn't even have to make a report! The victim's
| computer does it for him. Disk-encryption malware may have a
| new successor: effective and scalable extortion as a service.
| Kaytaro wrote:
| This is a great point. You don't even need to unlock an iPhone
| to take a picture. So in theory anyone with access to your
| phone for a few seconds could incriminate you with little
| effort.
| selsta wrote:
| This is already possible today with things like iCloud
| Photos, Google Photos, OneDrive etc.
| cblconfederate wrote:
| Apple will release this in their desktop PCs. The major
| consequence of this is that, a year down the line, Microsoft
| will also be compelled/forced to join this "coalition of the
| good". After all, they already scan all your files for viruses;
| it would be a shame if they didn't scan for anything that is
| deemed incriminating and also call the cops on you. Of course,
| the children are being used as a trojan horse again. We don't
| even mention the giant logical leap from finding someone
| possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer.
|
| I'm just imagining the situation where these companies took the
| initiative to scan all their users' data in a situation like
| the attack on the US Capitol this year. Creating new
| affordances for spying always leads to their abuse in the first
| extreme circumstance. So there is no excuse for creating those
| affordances just "because they can".
| slg wrote:
| >We don't even mention the giant logical leap from finding
| someone possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer.
|
| This is a wild and, in my opinion, wrongheaded analogy.
| Possessing CP is a crime by itself. It doesn't matter whether
| the person possessing it is actually a molester or not. It is
| just like the possession of drugs being illegal: it does not
| matter whether the person has actually taken or plans to take
| those drugs.
| cblconfederate wrote:
| Yeah, it wasn't meant to be an analogy. For example, consider a
| real murder video: is its possession illegal, and does it mean
| its owner should be a murder suspect?
| beebmam wrote:
| Windows has been doing this for years. Not with this new tech,
| but with older methods.
| TrevorJ wrote:
| I'm skeptical of the claim that MS has been A: scanning files
| on local Windows machines and B: forwarding anything
| concerning to law enforcement. That's a fairly extraordinary
| claim, and I would like to see the evidence.
| [deleted]
| ribosometronome wrote:
| Microsoft hasn't been scanning files on local machines, but
| Microsoft has been doing essentially the same thing Apple is
| doing now for years -- they pioneered it with PhotoDNA back
| in 2009. It's been in use with OneDrive since back when it
| was called SkyDrive
| (https://www.makeuseof.com/tag/unfortunate-truths-about-
| child...). Apple's implementation is scanning images 1)
| stored on iCloud or 2) in Messages if parents turn it on.
|
| The first seems pretty arbitrary -- why is it worse to scan
| files you're sharing with the cloud locally than in the
| cloud (except potential performance/battery impact, but
| that seems moot).
|
| If Apple brings this feature to the desktop, it seems
| likely they'd be using it the same way: files stored in
| their cloud.
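|
| To make the placement question concrete: the same blocklist
| check can run on the client just before upload or on the server
| just after the file arrives. A schematic sketch, where every
| name and the exact-hash check are made-up stand-ins rather than
| Apple's or Microsoft's actual pipelines:
|
|     # Schematic only: the same check, two placements.
|     import hashlib
|
|     BLOCKLIST = {"0" * 64}   # placeholder for known-image hashes
|
|     def file_hash(data: bytes) -> str:
|         # Real systems use perceptual hashes; SHA-256 keeps this simple.
|         return hashlib.sha256(data).hexdigest()
|
|     def client_side_upload(data: bytes):
|         """Apple-style placement: checked on the device, and only
|         for files that are about to be uploaded to the cloud."""
|         flagged = file_hash(data) in BLOCKLIST
|         return data, flagged   # the flag travels with the upload
|
|     def server_side_ingest(data: bytes) -> bool:
|         """PhotoDNA-style placement: the provider checks the file
|         after it arrives on its servers."""
|         return file_hash(data) in BLOCKLIST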
| Grazester wrote:
| What are you talking about?
| Asmod4n wrote:
| Microsoft already does this.
|
| "Microsoft removes content that contains apparent CSEAI. As a
| US-based company, Microsoft reports all apparent CSEAI to the
| National Center for Missing and Exploited Children (NCMEC) via
| the CyberTipline, as required by US law. During the period of
| July - December 2020, Microsoft submitted 63,813 reports to
| NCMEC. We suspend the account(s) associated with the content we
| have reported to NCMEC for CSEAI or child sexual grooming
| violations."
|
| https://www.microsoft.com/en-us/corporate-responsibility/dig...
| cblconfederate wrote:
| I think that's for files uploaded to OneDrive etc., not for
| all the files on any Windows PC.
| threeseed wrote:
| > finding someone possessing CP to automatically considering
| them a child molester, yet someone possessing an action movie is
| not considered a murderer
|
| Sure sounds like you're trying to justify child pornography
| here.
| [deleted]
| u10 wrote:
| > possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer
|
| That's specious reasoning. Someone who possesses an action
| movie likes action movies, while someone who possesses child
| porn likes child porn. One is OK, the other is pretty vile and
| illegal for a reason.
|
| I don't agree with Apple on this, but let's be clear on what is
| and what isn't.
| mike3498234 wrote:
| > We don't even mention the giant logical leap from finding
| someone possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer.
|
| Are you suggesting it's OK to possess CP as long as you're not
| a molester? What a fucking idiot.
| hughrr wrote:
| That's a stupid assumption to make based on that. If someone
| emails or messages you a picture, then you're in possession of
| it whether you chose to be or not.
| Mike8435 wrote:
| No, it was not a stupid assumption. You need to read more
| carefully, as I exposed a flaw in his reasoning. The
| suggestion was that since CP possession does not prove
| physical molestation, it is therefore OK. Actually,
| possession of CP is a crime in itself. This is an objective
| fact. I've known of otherwise intelligent people who do
| believe it's actually OK to possess child porn if they're
| not physically molesting anyone.
| cblconfederate wrote:
| Of course it is not OK. Does making it illegal and
| violating everyone's privacy rights reduce child abuse?
| I'd like to see the evidence. Another example is virtual
| child pornography, which, evil as it is, does not harm
| anyone during its production, yet it is also illegal
| (in the US). The question is how far governments will go
| with the instrumentalization of child-abuse prevention in
| order to spy on their own citizens.
| rossmohax wrote:
| > yet someone possessing an action movie is not considered a
| murderer.
|
| I think it is more similar to drugs: possessing one doesn't
| mean you are consuming it, yet it is an illegal substance, and
| production, transportation, and distribution are understandably
| not allowed.
| 734129837261 wrote:
| It'll start with protecting children. We all want to protect
| children, don't we? Why do you want children abused? Are you a
| child abuser? What do you have to hide?
|
| Next it's elderly people. We don't want our forgetful elders to
| get lost, do we? What if grandma wanders off but is in someone's
| picture? Surely you want the police to know right that second
| where she is?
|
| Next up, terrorists! Four adult brown men in an unmarked van are
| certainly suspicious, especially near a government building. Your
| Instagram selfies will help the police in the USA shoot even more
| innocent people for no good reason.
|
| Animal abuse is next. You don't like puppies being abused, do
| you? Why do you hate puppies? Do you take part in illegal
| underground dog fights?
|
| Gosh, that video looks like it might have been pirated.
|
| Nice house, but based on your estimated income it's really
| strange that you have such a big television. Is that safe full of
| cash? How much cash is on your table?
|
| Is that a bit of dust, flour, or maybe crack cocaine?
|
| Is that person asleep or recently murdered?
| [deleted]
___________________________________________________________________
(page generated 2021-08-06 23:00 UTC)