[HN Gopher] Apple's new abuse prevention system: an antritust/co...
___________________________________________________________________
Apple's new abuse prevention system: an antritust/competition point
of view
Author : simonebrunozzi
Score : 424 points
Date : 2021-08-06 17:47 UTC (1 day ago)
(HTM) web link (blog.quintarelli.it)
(TXT) w3m dump (blog.quintarelli.it)
| echelon wrote:
| Or inversely, the FBI/CIA/CCP went to Apple and said "it'd be a
| shame if it turned out you were a monopoly".
|
| Apple caved to pressure and had to implement this.
|
| Whatever the angle, this isn't about protecting kids whatsoever.
| It's about power.
| peakaboo wrote:
| And the vaccine is not about protecting people either, it's
| also about power.
| annadane wrote:
| lmao
| fsflover wrote:
| How does vaccine give power to anyone?
| arthurcolle wrote:
| "Do your own research, the 5G implants from the vaccine
| have a secret server and reverse proxy to let them implant
| thoughts into you"
| shotta wrote:
| There's no way that's true. I'd be more likable by now.
| FractalParadigm wrote:
| I don't know what's more impressive, the fact someone
| managed to come up with something so ridiculous, or the
| fact some people actually believe this level of
| technology exists.
| bbarnett wrote:
| The average Joe has no idea how computing and tech work.
| None.
|
| To such a person, a smartphone is a piece of magic.
| Expecting them to know what is real tech, and what isn't,
| is not fair.
| nickthegreek wrote:
| It gave me the power to confidently shop again.
| echelon wrote:
| The vaccine that you've been _forced_ to take?
|
| Please don't turn this into a clown show.
|
| The privacy and openness of our computers and devices is
| paramount to freedom and a strong democracy.
| thepangolino wrote:
| Depending on where they live, they might very well have been
| forced to take it if they were hoping to keep a semblance of
| normal life.
|
| That's why I got vaccinated at least (with the government
| forcing my hand).
|
| What truly scared me was realising that if the requirement
| was ratting on my Jewish neighbours, I probably would have
| done it too.
| revscat wrote:
| Good.
| jjulius wrote:
| That's not quite accurate; you both would end up having a
| "semblance of a normal life" without getting vaccinated,
| you just would've had to wait a bit longer.
| jbverschoor wrote:
| or just get fired if you're a pilot
| literallyaduck wrote:
| Or work at CNN.
| dcolkitt wrote:
| Apple's a $2 trillion company. If not even they have enough
| legal firepower to stand up to the three letter agencies, what
| possible chance does any other private citizen have? This
| should be a wake-up call.
|
| It's time to start dismantling massive chunks of the
| intelligence community. It no longer works for the citizenry
| that's supposedly their bosses. (If it ever did.) It's become a
| power black hole unto itself.
|
| Even elected officials, up to POTUS, have found themselves
| unable to control the unelected and unaccountable fiefdoms that
| make up the intelligence community.
| zepto wrote:
| Most people _want_ the TLAs to be going after child abusers
| and pedophiles. Good luck using this as your argument for
| dismantling them.
| whartung wrote:
| Similarly, many people want the TLAs to be able to go after
| $2T companies as well.
| zepto wrote:
| Do they? What behavior do they want them to go after?
| 8note wrote:
| I imagine the IRS for taxes
| zepto wrote:
| Very amusing, although these companies pay their taxes.
|
| The problem isn't with them, it's with the loopholes that
| let them pay less than people would like.
|
| The IRS can't do anything.
| adrr wrote:
| The biggest question is who is providing the hashes? Since
| possession of images is illegal, it has to be the government.
| So we have to trust the government enough that they won't abuse
| it and stick in other hashes. The same government that
| recommended an encryption method it had figured out how to
| break.
| collaborative wrote:
| What a messed up world we live in where money and power are the
| only things that matter to the people that get to make a
| difference. But has it not always been this way? The show must
| go on..
| IfOnlyYouKnew wrote:
| This is some guy's theory, and they can't even spell "anti-
| trust" (in the headline, no less). It's not quite enough to
| lose all trust in society over.
| collaborative wrote:
| From what I can tell he is Italian, so spelling should not
| be a reason to judge imo
| asimpletune wrote:
| I think that's in essence what the author is arguing, at least
| the outcomes are the same. The only difference is maybe none of
| the 3-letter agencies had to come out and explicitly say it,
| when Apple is perfectly competent at spotting a bone to toss.
|
| In other words, the author thinks that with Apple's back to the
| wall, they only needed to announce this feature for the
| government to see there are advantages to Apple having tight
| control as well. Now they'll be able to make that very
| same argument in court in a public sense, but there's always a
| behind the scenes sense with 3-letter agencies as well.
|
| Granted all of that is speculation and who knows what is really
| driving any of this. The author does have a point that if this
| first step causes bad guys to move on from these services then
| that will be future justification to move the scanning further
| and further upstream to the point where it's baked into the
| APIs or something. At that level, Apple would really need a
| "monopoly" to accomplish such a feat.
|
| It's certainly an interesting and creative perspective.
| mr_toad wrote:
| This could be aimed at pre-emptively de-fanging one argument
| against end-to-end encryption.
| SV_BubbleTime wrote:
| That's the best case.
|
| But even in the best case, the worst abuse cases still exist.
| That's the problem. This WILL be abused.
| mctub wrote:
| iOS 15 is adversarial and Apple lacks credible neutrality.
| zimpenfish wrote:
| > Do we really think criminals trade CSAM in plain-sight web
| sites ?
|
| "In 2018, Facebook (especially Facebook Messenger) was
| responsible for 16.8 million of the 18.4 million reports
| worldwide of CSAM"
|
| Apparently they do, yeah.
| joshstrange wrote:
| Also
|
| > Do we really think criminals don't know mathematics or
| programming ?
|
| Yes, by and large yes I do. Anom is a great example of this.
| Surely in the world this author is imagining no criminal would
| get caught up in a scheme like Anom since there are criminals
| that know "mathematics or programming", except... no one
| noticed/blew the whistle and instead the FBI rounded up the
| users of that device.
| e-clinton wrote:
| This is optional. You don't HAVE to use iCloud for your photos.
| This is no different than YouTube searching videos you upload for
| copyrighted music. If you don't want your photos scanned, don't
| upload your images to their servers.
| aburneraccnt wrote:
| You're 100% correct. I had my images on iCloud because I
| thought Apple put my privacy first. I was being naive. I've
| pulled all my photos off and am looking to move the rest of my
| data too. My iPhone and iPad just got a lot less useful.
| jacknews wrote:
| Child abuse, or any abuse, should be stamped out vigorously.
|
| But this is the job of the police, using standard techniques to
| track down first, the people making and distributing this
| content, and then consumers of it.
|
| It is not Apple's job to put a policeman in everyone's phone.
| ctvo wrote:
| I don't think this issue is widespread enough and harmful enough
| to society to risk the privacy of all Apple users. Full stop.
|
| I haven't been particularly impressed with Apple's security
| record[1] lately, and I don't trust them to not mess this up.
|
| 1 - https://bhavukjain.com/blog/2020/05/30/zeroday-signin-
| with-a...
| DSingularity wrote:
| Why do you think this?
| ctvo wrote:
| I can't find a widely accepted statistic but these numbers
| should illustrate the point:
|
| - A 2016 study by the Center for Court Innovation found that
| between 8,900 and 10,500 children, ages 13 to 17, are
| commercially exploited each year in the United States.
| (Center for Court Innovation, 2016) https://www.courtinnovati
| on.org/sites/default/files/document...
|
| - The annual number of persons prosecuted for commercial
| sexual exploitation of children (CSEC) cases filed in U.S.
| district court nearly doubled between 2004 and 2013,
| increasing from 1,405 to 2,776 cases. https://www.ojp.gov/sit
| es/g/files/xyckuh241/files/archives/p...
|
| This is a niche crime from everything I've seen.
|
| Apple, if it were truly interested in the net good of
| children, could have picked something that impacts more of
| them (nutrition? early childhood education?), didn't
| introduce new vulnerability / abuse surface area, and was
| less politicized.
| DSingularity wrote:
| Here are other statistics that suggest this is anything but
| a "niche" crime.
|
| https://storage.googleapis.com/pub-tools-public-
| publication-...
| DSingularity wrote:
| Thanks for the statistics. To me, these numbers are
| significant.
|
| Maybe Whole Foods or maybe some popular restaurants are
| better candidates for working on improving nutrition in
| public schools? Why don't we let apple contribute where it
| thinks it can. Maybe with apple that number goes down from
| 10,000 to 2,000. Wouldn't that be a celebrated outcome?
| emptysongglass wrote:
| We cannot stop all crime. There will never be a day where
| we stop all crime. It is not an achievable goal nor is it
| desirable because what constitutes a crime is written by
| the governments of the world and we have tens of
| thousands of years of reigning authorities to tell us
| that they will abuse the power they are invested with to
| protect themselves.
|
| Authority and its keeping is the number two law of the
| jungle. Any power handed over in the name of security,
| "to stop all crime", is an affirmation, a concretization
| of its future abuse. You speak of the calculated cost of
| preventing child abuse as acceptable. What of the abuse
| of an entire people?
|
| These are not handwavy theoreticals. We already know what
| happens, in the US, when you push an agenda in the name
| of protecting the children: it looks like FOSTA/SESTA,
| which has driven sex workers of America underground and
| exposed them to more violence, more danger in a
| profession already one of the most murderous professions
| _in the world_. Those murders, in the name of protecting
| the young, are at the feet of the people who would
| protect the children with more authority.
| ctvo wrote:
| > To me, these numbers are significant.
|
| What would be insignificant? 1 child? 100? There are
| 73,000,000 children (under 18) in the US alone. 10,000 is
| roughly 0.014% of that population.
|
| > Why don't we let apple contribute where it thinks it
| can.
|
| Apple is the most profitable company in the world. It's a
| company that prides itself on its imagination and
| innovation, I wouldn't discount their ability to come up
| with something.
|
| > Maybe with apple that number goes down from 10,000 to
| 2,000. Wouldn't that be a celebrated outcome?
|
| No, it's not. We make trade-offs all the time. The
| possible harm to Apple's user base is not worth the
| possibility that this reduces child abuse. There's a
| possibility these people move on to another platform and
| this does nothing.
|
| To get that number down Apple creates an entry point for
| violating the privacy of half a billion users worldwide.
| Many of them are in China, where pressure from the
| government has already moved Apple in directions that are
| harmful to its customers[1].
|
| 1 - https://www.nytimes.com/2021/05/17/technology/apple-
| china-ce...
| [deleted]
| choeger wrote:
| It's inevitable that software and hardware will be controlled by
| law enforcement. It's just too good an opportunity to not use it.
| Nearly no one will fight it and the apathy of the masses will
| lead us directly into a system at least as oppressive as China is
| now.
|
| You can prepare for it, though. Organize some hardware while you
| still can, teach your children to not trust any device or
| service, and, more importantly, keep your mouth shut. It might
| be hard for a typical US millennial to grasp, but people who
| grew up in Eastern Europe will understand.
| lijogdfljk wrote:
| So this new Apple stuff has made me decide not to buy an M1. I'm
| leaning Framework laptop.
|
| For my phone though... no idea. My iPhone is honestly such a
| solid piece of tech. I don't _want_ to go Google either... so
| what else do i have?
|
| I know lots of people run de-googled Androids, which i guess
| works, but i'd prefer to avoid them entirely. Is there anything
| that works?
|
| _edit_ : I know of the Purism phone
| (https://puri.sm/products/librem-5-usa/) but that's the only one
| i know of. Anyone know of others?
| windthrown wrote:
| The Pinephone running something like Ubuntu Touch (UBports) or
| PostmarketOS is something to keep an eye on but it is still in
| heavy development and in my opinion not quite a reliable "daily
| driver" yet.
| mshroyer wrote:
| My MacBook Pro is overdue for an upgrade, and I literally
| ordered a Framework laptop (on which I plan to run Debian)
| partially in response to this announcement.
| intricatedetail wrote:
| I am looking at Framework, but it still feels like legacy
| tech. I have an XPS 15 from 2019 and it feels slow, and the fans
| drive me crazy. I don't think Framework will be much
| different. I was looking at M1 for fanless experience and
| performance. I wish they at least considered ditching Intel
| and adding a good AMD option - then I would buy it.
| jjcon wrote:
| Wow I'm actually looking at framework right now as well after
| this announcement- my 2015 pro has been great but the trade
| offs are just too high
|
| I wonder if framework will have a mysterious bump in sales
| because of this
| bishoprook2 wrote:
| GrapheneOS on Pixel? Dunno for sure. My strong temptation is to
| go dumb phone.
|
| >So this new Apple stuff has made me decide not to buy an M1.
|
| I hear you on that. I was kind of wanting one just for a break
| from the vintage Thinkpad Master Race but I'll wait for a non
| Apple OS port.
| lancemurdock wrote:
| commenting here as I share the same exact sentiment. I guess I
| am going to look into LineageOS which falls in the de-googled
| category but im not sure I am in love with that idea.
|
| edit: Whoa, 2K for that purism phone. fuckin' a.
| lijogdfljk wrote:
| The standard (https://shop.puri.sm/shop/librem-5/) is cheaper
| - i guess the USA one comes with a large uptick for local
| sourcing?
| buildbot wrote:
| Yep, having spec'd a few boards for assembly in the USA and
| China, the manufacturing costs seem to always be about
| double for USA fabrication.
| zeroimpl wrote:
| It seems Apple only applies this if you sync photos to iCloud.
| Isn't the solution just to not use iCloud? I've personally never
| used iCloud sync for photos, am happy enough managing the files
| myself. Demanding that Apple allows you to use iCloud to host
| illegal pictures seems a bit entitled to me.
|
| I am concerned about the slippery slope if they start scanning
| files that were never going to go through their servers in the
| first place though.
| Heliosmaster wrote:
| One of the major concerns that I've seen arising around here is
| the question "what's stopping Apple from doing this even
| without iCloud?". Once you open Pandora's box, there is no
| going back.
| adib wrote:
| Say that you're an official with the Chinese Communist Party
| (CCP). You have huge stacks of brochures of anti-CCP materials.
| You've got them scanned and hashed. Next you call Apple and say,
| "Please alert us if similar imagery appears in your customers'
| devices. My assistants will send you weekly updates of the
| required hashes." Apple would say, "Sure, we're just following
| your law..."
|
| Hence when a Chinese citizen photographs such a brochure "in
| the wild" using an iPhone, someone from "the government" will
| knock the next day and "strongly enquire" about yesterday's
| photo. Likewise when a Chinese minor receives an iMessage
| containing such a brochure.
|
| This is just _one_ example case of "extension" of the CSAM
| database as seen fit by some regulatory body.
| pentae wrote:
| Yep, and now Apple can no longer say "Sorry, we can't put a
| backdoor into our devices" - one already publicly exists
| adib wrote:
| Technically it's "front door" since it _publicly exists_.
| [deleted]
| totetsu wrote:
| Does anyone know about a good write up of this discussion in
| Japanese?
| iabacu wrote:
| Why stop at scanning photos in your phone?
|
| With lower power sensors, head-mounted devices with always-on-
| sensors, and whatnot, why not sample real-time hashes that can
| tip LEO about potential crimes happening?
|
| Then why sacrifice recall in order to achieve high accuracy?
|
| Err on the side of uploading more hashes. Then feed it all into
| an ML model so it can use other data to filter out potential false
| positives.
|
| Then if, in a distant future, any LEO wants to investigate you
| for whatever reason, the set of potential hashes associated with
| your account will provide sufficient evidence for any court to
| authorize further non-hashed data access (it doesn't matter if
| they were all false positives).
| m3kw9 wrote:
| Sophisticated abuser criminals will not be caught this way. This
| is for the likely significant percentage of abusers who don't
| know, forget, or make mistakes. Also, this system can slow the
| spread of CSAM images, and that spread is what facilitates abuse.
| [deleted]
| debt wrote:
| It's no coincidence this system launched around the same time the
| whole NSO scandal broke. The NSO leak shows what government-
| sponsored exploit analysis against a large tech company may
| yield. I mean the NSO exploit could've worked the same but been a
| worm; it could've been absolutely devastating for Apple, imagine
| something like every phone infected. Something like that was
| possible with that exploit.
|
| Apple has been a thorn in the side of the IC for a long while. IC
| probably saw an opportunity to gain a bit of leverage themselves
| via the whole NSO thing, and likely offered their cyber support
| in exchange for some support from Apple.
|
| I mean c'mon they've been consistently pressed by IC for tooling
| like what they just launched; it's the least invasive
| thing (compared to something like a literal backdoor like the
| alleged _NSAKEY in Windows) they can offer in exchange for
| some cybersecurity support from the gov.
|
| idk if that's what's happened, but it's odd Apple would do this
| at all, and do it right around the time of the NSO thing.
| least wrote:
| There is no such thing as a trustworthy third party and even
| trusting yourself is questionable at the best of times. We are
| constantly balancing a bunch of different considerations with
| regards to the way that we compute, purchase devices, and utilize
| services. Security and privacy are of course important, and Apple
| to date has had a fairly good (if shallow) track record in this
| regard, at least in the United States.
|
| With that being said, what Apple is doing here is just a blatant
| violation of that 'trust' and certainly a compromise to their
| commitment to privacy. Under no circumstances is it justifiable
| to essentially enlist people's devices to police their owners,
| while using the electricity that you pay for, the internet
| service you pay for, and the device itself that you pay for to
| perform a function that is to absolutely no benefit to the user
| and in fact can only ever be harmful to them.
|
| It doesn't matter that the net data exfiltrated by Apple ends up
| being the same as before (through scanning unencrypted files on
| their servers). The distinction is so obvious to me that I find
| it incredible that people are legitimately arguing that it's the
| same, or that in some way this is actually helping preserve
| user privacy.
|
| As mentioned in the article, this does absolutely nothing towards
| protecting children other than directing all but the biggest
| idiots towards platforms that can't be linked to them, which I'd
| imagine, they already are.
| fossuser wrote:
| > "As mentioned in the article, this does absolutely nothing
| towards protecting children other than directing all but the
| biggest idiots towards platforms that can't be linked to them,
| which I'd imagine, they already are."
|
| I suspect you're more wrong than you think about this. People
| share large volumes of CSAM through lots of different services
| - I knew someone who worked on the problem at _LinkedIn_ (!).
|
| HN likes to downplay the actual reality as if it's always some
| trojan horse, but the issue is real. It's worth talking to
| people that work on combatting it if you get the chance. I'm
| not really commenting on Apple's approach here (which I haven't
| thought enough about), but I know enough that an immediate
| dismissal based on it 'not helping' is not really appreciating
| the real tradeoffs you're making.
|
| You can be against this kind of thing from Apple, but as a
| result more CSAM will be undetected. Maybe that's the proper
| tradeoff, but we shouldn't pretend it's not a tradeoff at all.
|
| "Robin Hanson proposed stores where banned products could be
| sold. There are a number of excellent arguments for such a
| policy--an inherent right of individual liberty, the career
| incentive of bureaucrats to prohibit everything, legislators
| being just as biased as individuals. But even so (I replied),
| some poor, honest, not overwhelmingly educated mother of five
| children is going to go into these stores and buy a "Dr.
| Snakeoil's Sulfuric Acid Drink" for her arthritis and die,
| leaving her orphans to weep on national television" [0]
|
| [0]: https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-
| deb...
|
| Edit: After digging in, HN commentary is missing the most
| relevant details about this particular implementation. iCloud
| image checking compares to known CSAM image hashes - this means
| effectively zero false positive rate.
|
| For iMessage child account ML scanning it's on device and just
| generic sexual material restriction - more similar to standard
| parental controls than anything else (alerts only go to the
| parent, and only happen on child accounts)
|
| My initial impression is this is a good approach. The risk
| mostly comes from future applications (like the CCP adding
| hashes of other stuff that isn't CSAM like tankman or
| something).
|
| The frankly ignorant knee-jerk responses from technical HN
| readers do a disservice and weaken the ability of technical
| people to push back when necessary.
| int_19h wrote:
| It does not imply a zero false positive rate at all, because
| the hashes used are by necessity fuzzy, and even a completely
| benign picture can trigger a match.
|
| Now they're saying that if the match is a false positive,
| it'll be screened by the human reviewer. But that means
| there's some stranger there potentially looking at my
| _private_ photos (and, by the way, I wonder which country
| they 'll be located in?). That's already completely and
| utterly unacceptable - you don't have to wait for any "future
| complications".
| ComputerGuru wrote:
| You need to stop assuming everyone is commenting out of
| ignorance and read up on how perceptual hashing works. It's
| about as far as you can get from zero false positives; if
| configured otherwise, it becomes just a poor version of SHA or
| whatever and can only detect exact matches.
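|
| For anyone who hasn't seen one, here is a minimal sketch of a
| perceptual hash (a simple dHash in Python, not Apple's actual
| algorithm) showing why matching has to be fuzzy, and therefore
| why false positives exist at all:
|
|     # Toy difference hash (dHash); illustrative only.
|     from PIL import Image
|
|     def dhash(path, size=8):
|         img = Image.open(path).convert("L").resize((size + 1, size))
|         px = list(img.getdata())
|         bits = 0
|         for row in range(size):
|             for col in range(size):
|                 left = px[row * (size + 1) + col]
|                 right = px[row * (size + 1) + col + 1]
|                 bits = (bits << 1) | (left > right)
|         return bits  # 64-bit hash of the image's gradient structure
|
|     def hamming(a, b):
|         return bin(a ^ b).count("1")
|
|     # A re-encoded, resized, or slightly cropped copy lands only a
|     # few bits away, so the matcher must accept any hash within
|     # some distance t -- and any unrelated image that happens to
|     # fall within t is a false positive.
|     # match = hamming(dhash("query.jpg"), known_hash) <= t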
| fossuser wrote:
| At the time of my comments people weren't discussing the
| fuzziness of perceptual hashing - they weren't discussing
| implementation details at all.
|
| I don't know enough about the specific implementation or
| perceptual hashing details and probably pushed back too
| hard as a result of the other comments at the time (which
| _were_ comments out of ignorance).
|
| The level of downvotes I received is disproportionate to
| what I wrote anyway - this is clearly a political issue on
| HN. The irony is I'd probably align more with the risks
| being too high position, but it needs to be considered
| after actually understanding what the true risks are first.
| JoshTko wrote:
| Is CSAM a specific list of images that does not change, or
| does a government body actively control this list? Is there
| anything that would technically prevent the addition of a
| specific image of an enemy of the state? Hypothetical
| question, how will Apple respond when the FBI asks Apple to
| scan for images they know are also on the phone of the latest
| domestic terrorist? And how likely do you think the FBI will
| try to expand the scope of images that are scanned?
| fossuser wrote:
| I don't know - I think these are the good questions and the
| things I would want to know in more detail. I think this
| kind of thing is where the risk lies.
|
| I think this is why getting the details right matter
| because if people are arguing about unrelated nonsense it's
| harder to focus on the actual risks represented by the
| questions you're asking.
| voidnullnil wrote:
| > You can be against this kind of thing from Apple, but as a
| result more CSAM will be undetected.
|
| You cannot claim to be making "the real reasonable analysis"
| and write this. So much for "you're all geeks stuck on
| technical details". Quite the contrary: I'm sick of bogus
| software pretending to solve problems for me, while the
| quality of tech has exponentially degraded over the last 20
| years (often due to trying to solve some unsolvable problem
| in a bad way that backfires).
|
| Now imagine you have a 16 year old girlfriend. She sends you
| a nude photo. Your phone calls the cops on you (it doesn't
| matter if the phone doesn't quite do this now, it will in the
| future. They will use their ML crap to detect the age of
| subjects in photos and explicitness of the photo). You
| normally wouldn't go to jail for this since 16 is legal in
| 99% of the civilized world, but thanks to America with their
| super duper "non-technical" innovations that only big boy
| white collars can understand, you can go to jail for having a
| photo of your legal girlfriend.
| skinkestek wrote:
| Mostly good...
|
| > big boy white
|
| but why the totally uncalled-for racism and sexism here?
|
| It does nothing to strengthen your argument and it is just
| dumb.
| voidnullnil wrote:
| I am truly sorry for your loss in that your brain is
| implemented using regex.
|
| I meant "white collar big boys", but I did not bother to
| edit as I'm writing.
|
| The guy above is claiming everyone who is against apple's
| yet-another-bogus-TPM-style-snakeoil is a little geek who
| does not understand anything outside their little tunnel.
|
| Also now that I re-read his comment:
|
| > Edit: After digging in, HN commentary is missing the
| most relevant details about this particular
| implementation. iCloud image checking compares to known
| CSAM image hashes - this means effectively zero false
| positive rate.
|
| False: it's a perceptual hash. Ignoring the fact that if
| for some reason you choose to let people host stuff in
| your icloud account (perhaps as a neat hack), which may
| be out of terms of service, but certainly not worth 20
| years of jail: perceptual hashes have false positives,
| and can confuse images that appear harmless but were
| crafted to look like $badimg. But you don't have to be
| technical to understand that having your devices police
| you is bad, you just have to not be blinded by politics
| and boogeyman your state has sold you.
| White_Wolf wrote:
| I would say most (if not all) know how easily you can change
| the scope of a database like this (like you said yourself) to
| check for other inconvenient terms, links, memes, etc. that do
| not "toe the party line" (whether eastern or western - I
| don't care) and shut down inconvenient discourse.
|
| The point is: This is too easy to abuse.
| fossuser wrote:
| Being specific with the arguments and addressing the nuance
| matters imo.
|
| Most of the HN responses are dumb, get the details wrong,
| and don't take the trade offs seriously. That kind of thing
| causes me to dismiss them entirely.
|
| There _are_ legitimate arguments and risks which I'd
| concede, but they're not what most people in the comments
| are talking about.
| ctvo wrote:
| The trade-off here is to get pedophiles off of Apple
| services and on to something else, we give Apple an entry
| point into examining our private data. I understand it's
| hashes now, that does nothing to prevent this from
| expanding.
|
| "People just don't understand!!" has never been a
| convincing argument.
| fossuser wrote:
| This is just a slippery slope argument.
|
| The implementation specifically for detecting CSAM can be
| okay while using that for other purposes can _not_ be
| okay.
|
| The goal is to stop child sexual abuse, FB reports
| millions of cases a year with a similar hash matching
| model for messenger.
|
| > ""People just don't understand!!" has never been a
| convincing argument."
|
| That's not my argument - I just think most of the HN
| comments on this issue both miss the relevant details,
| and are wrong.
| ErikVandeWater wrote:
| The slippery slope fallacy is often invoked incorrectly when it
| it comes to people. Human nature is subject to the "foot
| in the door" sales tactic.
|
| https://en.wikipedia.org/wiki/Foot-in-the-door_technique
| DSingularity wrote:
| I think HN takes the incorrectly perceived moral high
| ground and ignores the fact that there is a real duty to
| act here.
|
| There are people who profit from CSAM and in turn this
| creates a market for abuse. Unless we create structures
| which disincentivize these behaviors -- right now there
| are none - they will continue to grow. What Apple is
| building will basically make any criminal who sells to
| someone sloppy enough to store in iCloud risk being
| traced as the origin of the CSAM. Back to the darkest web
| they go.
|
| Anti-CSAM scanning is inevitable. You will find similar systems
| for all major providers eventually.
| aj3 wrote:
| I'm pretty sure child abuse is a criminal offense
| everywhere. That's quite a disincentive.
| 2pEXgD0fZ5cF wrote:
| One of the prerequisites to "discuss the tradeoffs" of this
| ordeal is the trust that the people behind it, and those in
| the position to push things further are actually interested
| in keeping the scope limited.
|
| There are diminishing returns to using the same old excuse
| again and again. I'd say most people are just tired of the
| whole "Just think of the children!" into "Ah we got this
| system in place why not use it against <a little less evil
| but still illegal thing> too" followed by "It's the law in
| China/Turkey/Russia/wherever, <Company> can't just ignore it
| (and thus not help putting reporters, critics and other
| people into prison)" combo.
|
| What you are saying is basically another rendition of "look
| the problem of child abuse exists and this could help so it
| is worth discussing", which is also a variant of the same.
|
| > iCloud image checking compares to known CSAM image hashes -
| this means effectively zero false positive rate.
|
| Actually we recently had news where Apple tried to help
| cover up their errors [1]. The system was supposed to be
| safe(tm); that doesn't mean the people having control over it
| can't make mistakes, or worse.
|
| What details are being missed here, exactly? Ultimately it is
| trivial to expand the hashes to compare to, isn't it? What
| does it matter that they use CSAM for now? It doesn't remove
| the involvement of humans in controlling the system, so the false
| positive rate will never be "effectively zero", and it can
| easily be expanded.
|
| We are left with the same two arguments as always:
|
| - Think about the children
|
| - Just trust the company, they use _technology_ (tm), no you
| aren't allowed to check. Yeah they can easily abuse it, but
| we don't know for sure!
|
| I wouldn't say HN is ignoring details, many people are just
| tired of the same old loop, there is no reason to put trust
| into these endeavors and it is reasonable to doubt even the
| motives.
|
| [1]: https://news.ycombinator.com/item?id=27878333
| dcow wrote:
| It's not just HN folks, go read the "open letter".
| [deleted]
| zepto wrote:
| > Under no circumstances is it justifiable to essentially
| enlist people's devices to police their owners, while using the
| electricity that you pay for, the internet service you pay for,
| and the device itself that you pay for to perform a function
| that is to absolutely no benefit to the user and in fact can
| only ever be harmful to them.
|
| This is an obvious misrepresentation. An opt-in system to
| detect when adults are trying to groom kids by sending them
| porn seems like the opposite of harm. I imagine a lot of
| parents want that.
|
| As for scanning what gets sent to iCloud. That's also an opt-in
| service, and frankly it seems entirely reasonable for Apple not
| to want their servers to be used as a repository or hub for
| child porn.
| least wrote:
| It's not a misrepresentation whatsoever.
|
| If Apple wishes to scan what's on their servers, that is
| their prerogative. They can use their compute resources and
| energy to do so. You needn't install spyware on a person's
| device that is of no benefit to the user. I'll reiterate,
| this can only ever be harmful to the user. Its utility right
| now is at its absolute best and most altruistic and it is
| still a violation of people's privacy and stealing computing
| resources from the device owner.
|
| > That's also an opt-in service, and frankly it seems
| entirely reasonable for Apple not to want their servers to be
| used as a repository or hub for child porn.
|
| This will not stop their servers being used as a repository
| for illicit materials, if that is what you're suggesting.
| rahkiin wrote:
| Maybe they want to turn iCloud fully E2E, so they cannot
| actually scan on their servers anymore.
|
| They have to scan for CSAM by US law.
| tenpies wrote:
| Can I just say that I'm fascinated that this is happening under
| Tim Apple - an openly gay man in his 60s?
|
| I mean this is someone who would know what being on the wrong
| side of law means - not only federally, but probably quite
| intimately since Alabama was not exactly known for its acceptance
| of homosexual people.
|
| Now just imagine if the US still had anti-homosexuality laws
| (like the majority of the world still does) and your phone is
| constantly scanning your photos to check for signs of homosexual
| behaviour. Forgetfully take a selfie with your boyfriend, it gets
| flagged, sent to the Ministry of Morality, and next thing you
| know you're being dragged into a van. Best case scenario you're
| being jailed. Worst, you're being thrown off a building, stoned, or
| brutally beaten to death.
|
| That's the future Apple is signing us up for. There is zero
| chance this stops at CSAM, especially with the Democrats
| convinced that half the country are the absolute worst people on
| the planet and not being shy about completely ignoring the rule
| of law and Constitution to extend the reach of the state to
| levels that would make a totalitarian blush. This will end
| terribly.
| [deleted]
| SheinhardtWigCo wrote:
| Your last paragraph ruined an otherwise great comment.
| fouc wrote:
| His name is Tim Cook. Good points though
| ta56hf47yt wrote:
| Alternatively, and just as likely, the Republicans could do
| the same evil things that you say the Democrats would do.
| Misusing tech isn't just for the "other" side, my friend.
| pcbro141 wrote:
| Yeah that was a particularly weird comment for GP to make just
| 6 months after Republicans stormed the Capitol on the command
| of a Republican President to hang government officials and
| overturn an election.
| [deleted]
| alentist wrote:
| > on the command of a Republican President to hang
| government officials and overturn an election
|
| Literally never happened. The fact that you're parroting
| this lie shows how deeply media bubbles have disconnected
| people from reality.
|
| https://www.npr.org/2021/02/10/966396848/read-trumps-
| jan-6-s...
|
| https://www.npr.org/sections/insurrection-at-the-
| capitol/202...
| h0l0cube wrote:
| What's more galling is that an anti-authoritarian stance
| should be non-partisan in a country that sells itself on
| democracy and liberty
| jachee wrote:
| > ...your phone is constantly scanning your photos to check for
| signs of homosexual behaviour. Forgetfully take a selfie with
| your boyfriend, it gets flagged...
|
| Except that's not what this is doing.
|
| For all intents and purposes, "scanning" is just hashing
| content before it's uploaded and comparing that hash to a known
| database. So, unless your picture had been hashed and flagged
| before (And if you just took it how could it have been?) then
| there's nothing for it to match.
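|
| A toy version of that compare-against-a-known-database flow
| (purely illustrative -- the real matching uses a perceptual
| hash, so it is fuzzier than a literal set lookup):
|
|     import hashlib
|
|     # Hypothetical database of digests of previously flagged images.
|     KNOWN_FLAGGED = {
|         "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
|     }
|
|     def is_flagged(image_bytes: bytes) -> bool:
|         # Exact match only: a photo you just took has never been
|         # hashed before, so under this model it cannot match.
|         return hashlib.sha256(image_bytes).hexdigest() in KNOWN_FLAGGED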
|
| Don't just conjure FUD from a contrived worst-case scenario.
| noduerme wrote:
| The hash matching part of this is a red herring. The real
| issue is that if a hash matches, there's now a mechanism to
| upload the image in question from your phone, unencrypted,
| without your consent. What assurance is there that the hidden
| upload mechanism can't be used to upload other files on
| demand? If a mechanism is built into the OS that can
| exfiltrate files on demand, what's to say there even has to
| be a hash match? Or that iCloud has to be active? What's to
| stop any government from going to Apple and saying "we know
| you have this ability, give us all this user's files."
| Nothing.
|
| The exfiltration mechanism is the problem. If this were
| saying a hash match could be used to obtain a _search
| warrant_ , that would be bad, but it would be far less
| egregious a breach of security and privacy than Apple adding
| a backdoor to just grab anything it likes.
| jachee wrote:
| It's not a "hidden upload mechanism". It's a hash-and-flag
| system on top of the extant "upload to iCloud" function.
|
| Apple will _already_ have the data. You're already
| consenting to that when you enable iCloud Photo Library
| (the only place this is being implemented).
| noduerme wrote:
| But all uploads to iCloud are [edit: will be] encrypted.
| This system includes another, separate mechanism to
| upload a file unencrypted, _not to iCloud_ , but to some
| other system where humans can review it. So it's not
| using the same functionality as a normal upload. The file
| doesn't have to have been uploaded to iCloud yet, or
| ever, to be copied off the phone using some other method.
| The whole point is that they don't want unencrypted
| copies being uploaded to iCloud, so they _can't_ review
| what was uploaded via the iCloud route. Therefore they
| want some other method of checking your files locally -
| which means they need a backdoor to copy them without
| your permission.
| panda88888 wrote:
| For me the biggest concern is not that Apple is scanning the
| images for $BAD_STUFF but rather that the scanning now occurs on
| my device instead of on Apple's cloud servers. The trust has been
| eroded. I can understand Apple scanning images on their servers
| (although I don't think Apple should). However, running the scan
| on device is taking it too far, even if the assertion is that
| only images that would eventually be uploaded are scanned. Apple
| promised security and privacy on their devices, and this breaks
| the trust. Now I question their future software roadmap, such as:
|
| 1. Will the scanning come to macOS?
|
| 2. Will the scanning start to include additional $BAD_STUFF, such
| as political censorship and/or even other files (video,
| documents, etc.)?
|
| I really like Apple hardware. The iPads and the M1 Macs are
| awesome, but this news makes me hesitant to stay in the Apple
| ecosystem and will be looking at alternatives. I already run
| Linux desktops, and I'll probably move to Linux on my laptop.
| pyuser583 wrote:
| Recently my iPad's AI made a montage of my pet cats.
|
| Only, my daughter was included in the montage. Why?
|
| She was wearing a shirt with a cat on it.
| radicaldreamer wrote:
| This system is likely closely related to full encrypted E2E
| iCloud backups: https://www.reuters.com/article/us-apple-fbi-
| icloud-exclusiv...
| twobitshifter wrote:
| Were iCloud photos already scanned for CSAM? In the on-device
| system, if you're not using iCloud are the photos scanned?
|
| If that's true, as an iCloud user you are exactly as likely to
| be charged with a crime based on your photos as you were
| before, but you now get E2E encryption.
|
| Obviously I'd prefer E2E without any scanning. If I wanted to
| upload a pirated mp3 to icloud, I wouldn't want the RIAA
| knocking on my door. However, given that scanning was already
| in place, is this a step forward?
| least wrote:
| The entire reason to encrypt data before transferring is to
| preserve privacy of the individual, which is completely
| irrelevant when you have a system in place to scan everything
| before it ever gets uploaded.
|
| It's like living in the glass apartments of Yevgeny
| Zamyatin's _We_ but still thinking we're preserving privacy
| because we put our items into an opaque box.
| testvox wrote:
| > If that's true, as an iCloud user you are exactly as likely
| to be charged with a crime based on your photos as you were
| before
|
| Is this strictly true? I feel like the evidence that a photo
| was present on a specific device is different from evidence
| that a photo was uploaded by a specific account (and a
| specific ip address probably).
|
| It seems like it would be far easier for the government to
| justify a search warrant if they have evidence the photo they
| are looking for was on a specific device. Just having
| evidence that a specific account uploaded a photo seems like
| far shakier grounds to search a specific device, after all
| accounts are often stolen to be used for criminal purposes
| and ip addresses don't map cleanly to people or devices.
| twobitshifter wrote:
| Maybe there's some plausible deniability if a warrant
| uncovered nothing - but I think they would always be able to
| get a warrant and try to find the device and more evidence
| based on the upload.
|
| From what I've read the on-phone scanning only alerts after
| multiple photos and is designed to have a 1 in a trillion
| false positive rate. If the iCloud scan is similar they
| would have a strong case for getting a warrant based on
| uploads.
| thesimon wrote:
| But why would Apple not mention this, if that was their
| intention?
| voidnullnil wrote:
| When I was 13 and my parent made me use a content filter on the
| web I bypassed it and watched porn and they never found out.
|
| On the other hand: Why would I ever want a piece of tech that
| reports me to the police (even if for legitimate reasons).
|
| EDIT:
|
| >Anonymous helplines and guidance exist for adults with at-risk
| thoughts and behaviour [https://www.apple.com/v/child-
| safety/a/images/guidance-img__...]
|
| LOL NVM I TRIED BEING POLITE BUT NUKE SAN FRANCISCO AT THIS POINT
| TO BE HONEST
|
| this horseshit is why i quit the software industry 10 years ago.
|
| see also: https://news.ycombinator.com/item?id=28077491
|
| this is the cancer you are creating
| whoknowswhat11 wrote:
| Then apple will lose market share and correct their ways.
|
| Conversely, what I've seen does put this at the top of my list
| as a parent. Will notify me if my child is sending nudes. Will
| notify me if someone is sending porn to my child. Will notify
| police if known child porn is on the device.
|
| When folks talk about competition - part of this MUST include
| the USERS' preferences (not, as currently done, the focus on what
| I see as largely predatory billers and businesses who I don't
| care about as a user).
|
| I don't want child porn on my systems. Be very happy if apple
| helps keep it off them.
|
| Are these hash databases available more broadly for scanning
| (ie, could I scan all work machines / storage using a tool of
| some sort)?
| abawany wrote:
| I don't think they will ever be able to walk this back. The
| governments that twisted Apple's arm to get this look-see
| into everyone's devices will roast them in the court of
| public opinion (or threaten to, which will be enough). IMO,
| this will just open up more and more - it will never go back
| to being what iDevice owners have now.
| zepto wrote:
| > The governments that twisted Apple's arm to get this
| look-see into everyone's devices will roast them in the
| court of public opinion
|
| Nothing about this technology gives governments a 'look
| see' into everyone's devices.
| artificial wrote:
| The government controls the hash list.
| mthoms wrote:
| No. A registered charity called _The National Center for
| Missing and Exploited Children_ controls the hash list.
|
| Yes, they are partly government funded, but I highly
| doubt they'd let their mission be compromised by allowing
| the government to inject non-CP hashes. Doing so would
| compromise all the work they've performed over the last
| four decades.
|
| These people are (rightfully) _very_ passionate about
| their work and can't be easily paid off.
|
| There are many, many valid concerns about this Apple
| initiative. But, "The government" injecting non CP images
| into the CP databases is not one of them.
| nomel wrote:
| > by allowing the government to inject non-CP hashes
|
| I don't think "allowing" is the concern here, because I
| highly doubt they get to generate the hashes themselves.
| SV_BubbleTime wrote:
| Why not? How can we know this isn't the case? What
| technical or practical prevention would there be?
| mthoms wrote:
| I think you misunderstand the Center's role in this.
| _They_ review and categorize the images. _They_ maintain
| the database and provide access to different
| organizations for the purpose of catching CP. Why
| wouldn't they also generate the hashes?
| SV_BubbleTime wrote:
| > No. A registered charity called The National Center for
| Missing and Exploited Children controls the hash list.
|
| Are you aware of the ties between them and the Clinton
| Foundation? I mean, the joke right there is that Bill was
| flying to Epstein's private island 26 times or so,
| Maxwell was a guest at Chelsea's wedding, and their
| foundation is setting up the NCMEC. Classic Clintons.
| [deleted]
| voidnullnil wrote:
| > I don't want child porn on my systems. Be very happy if
| apple helps keep it off them.
|
| Apple is scanning your own _personal_ storage for illegal
| content. It wouldn't be there in the first place, unless you
| put it there.
| bississippi wrote:
| Would this service notify parents if such a thing happened?
| I haven't seen it in their official announcement.
| paulryanrogers wrote:
| The hash data is secret because if widely known then
| offenders would know which images were known to law
| enforcement, and therefore transform or delete only those.
| ribosometronome wrote:
| Isn't that most of the internet? I would be surprised if, for
| example, you didn't get reported by Hackernews if you started
| making criminal threats or sharing CSAM on here. From the legal
| tab:
|
| >3. SHARING AND DISCLOSURE OF PERSONAL INFORMATION
|
| >In certain circumstances we may share your Personal
| Information with third parties without further notice to you,
| unless required by the law, as set forth below:
|
| ...
|
| >Legal Requirements: If required to do so by law or in the good
| faith belief that such action is necessary to (i) comply with a
| legal obligation, including to meet national security or law
| enforcement requirements, (ii) protect and defend our rights or
| property, (iii) prevent fraud, (iv) act in urgent circumstances
| to protect the personal safety of users of the Services, or the
| public, or (v) protect against legal liability.
| voidnullnil wrote:
| >Isn't that most of the internet?
|
| No, bad analogy.
| ribosometronome wrote:
| Insightful reply, thanks.
| voidnullnil wrote:
| Well, it was succinct and other people got it.
|
| Some website reporting you to the police for doing
| something illegal is not the same as your
| hardware/software being stuffed with snakeoil spyware
| that slows down the UI all for some made up cause.
| pl0x wrote:
| Apple and Google need to be under a serious investigation and
| broken up. Their hardware needs to be accessible to install an OS
| of your choice. This isn't possible on iPhones. It may take
| decades for this to happen given the lobbying dollars both spend
| but by then it will be too late.
|
| We are headed for a China style surveillance state and there is
| no stopping this train.
| lamontcg wrote:
| > But when a backdoor is installed, the backdoor exists and
| history teaches that it's only a matter of time before it's also
| used by the bad guys and authoritarian regimes.
|
| Problem is that this scanning is necessarily fuzzy and there is
| going to be a false positive rate to it. And the way that you'll
| find out that you've tripped a false positive is that the SWAT
| team will knock your door down and kill your dog (at a minimum).
| Then you'll be stuck in a Kafkaesque nightmare trying to prove
| your innocence where you've been accused by a quasi Governmental
| agency that hides its methods so the "bad guys" can't work around
| them.
|
| It isn't just "authoritarian regimes" abusing it, it is the
| stochastic domestic terrorism that our own government currently
| carries out against its own citizens every time there's a
| bureaucratic fuckup in how it manages its monopoly on violence.
|
| This is the "Apple/Google cancelled my account and I don't know
| why" problem combined with SWATing.
| fossuser wrote:
| This is wrong - the iCloud check is against known CSAM hashes,
| the false positive rate is essentially zero.
| bnj wrote:
| Yes but my understanding is that the hashes are perceptual,
| as opposed to cryptographic.
|
| If the system was matching against known cryptographic hashes
| the collision / false positive rate would be small, but the
| fuzzy matching involved with perceptual hashing necessarily
| has a greater false positive rate.
|
| And that doesn't even begin to address the detection of sent
| and received "explicit images" which are detected on device
| and don't have a set of known hashes.
| fossuser wrote:
| I suspect that's why they have some threshold that moves
| the false positive rate to one in one trillion.
|
| The iMessage bit is different - it's only on device, only
| on child accounts, and only alerts parents. It's more akin
| to a parental control feature than anything else.
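|
| To make the threshold point concrete, here is a rough sketch of
| the math (Python; the per-image rate and threshold below are
| made-up numbers, not Apple's, and it assumes independent
| per-image errors):
|
|     from math import exp, lgamma, log
|
|     def log_binom_pmf(n, k, p):
|         # log of C(n,k) * p^k * (1-p)^(n-k), kept in log space so
|         # large photo libraries don't overflow.
|         return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
|                 + k * log(p) + (n - k) * log(1 - p))
|
|     def account_fp_probability(n_photos, p_per_image, threshold):
|         # Chance an innocent library of n_photos racks up at least
|         # `threshold` independent false matches.
|         return sum(exp(log_binom_pmf(n_photos, k, p_per_image))
|                    for k in range(threshold, n_photos + 1))
|
|     # e.g. account_fp_probability(20_000, 1e-6, 30) is astronomically
|     # small even though 20,000 * 1e-6 = 0.02 matches are expected.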
| bnj wrote:
| This is a great point, thanks for adding that detail.
| skinkestek wrote:
| > I suspect that's why they have some threshold that
| moves the false positive rate to one in one trillion.
|
| So now, instead of sending just one nice, innocent, very
| high resolution image of "Tokyo City" or something with
| something horrific hidden somewhere, you have to send a
| few such images.
|
| That is reassuring. As if I am the only one who will ever
| think of that.
|
| (If the system is too dumb to detect this it is
| worthless, and if it is smart enough this opens the
| floodgates for anyone wanting to make trouble for just
| about anyone.)
| samatman wrote:
| It was pretty dumb of Apple to announce both of these
| things at the same time.
|
| They have nothing to do with each other, and I've seen a
| dozen people on HN confuse them. If the message is
| getting muddled here, it will be hopelessly conflated in
| less technical circles.
|
| I'm concerned and upset about the CSAM filter for all the
| reasons that keep hitting the front page, but don't care
| about the opt-in parental controls at all, and if I had
| kids, I might want them.
|
| But if I thought the CSAM filter worked like the nudie-
| detector filter, I'd be wigging out.
| [deleted]
| lamontcg wrote:
| It doesn't matter if the cryptographic algorithm is one
| in a trillion. If there's a concurrency bug in the
| application which applies the algorithm the wrong account
| could be flagged against a legitimate image in the
| database. If the human involved overly trusts the system
| they may not validate against the actual file in the
| user's account, or may assume the user deleted it somehow
| and figure it's "safer" to let law enforcement figure it
| out.
|
| Bugs in the application of the code, combined with human
| complacency and mistakes can certainly lead to errors,
| even if the cryptographic algorithm itself was perfect.
|
| We really need to bring back comp.risks
| hhh wrote:
| One in one trillion chance per year, per the paper on the
| Apple site.
| patmcc wrote:
| I see the 'one in one trillion' but it's a bit vague; I
| read it as a '1 in a trillion chance' per image.
|
| So when we have a billion iPhones in the wild taking 10
| images a day...1 in a trillion chances happen every few
| months. Now, if that triggers some further review, maybe
| that's an acceptable false positive. If it triggers a SWAT
| team, I don't think it is.
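|
| Spelling that arithmetic out (same assumed inputs as above, none
| of them Apple's official figures):
|
|     phones = 1_000_000_000       # iPhones in the wild (guess)
|     images_per_day = 10          # new photos per phone per day (guess)
|     p_false = 1e-12              # hypothetical per-image false match rate
|
|     expected_per_day = phones * images_per_day * p_false
|     print(expected_per_day)      # 0.01 false matches per day
|     print(1 / expected_per_day)  # ~100 days between false matches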
| jeromegv wrote:
| They have also indicated that a single match won't be
| enough to trigger it, so an accidental match (being that
| 1 in a trillion person) is not enough.
| fossuser wrote:
| Yeah, that's essentially zero.
| [deleted]
| jhayward wrote:
| I would like to bet Apple's market value against that
| number being correct in an adversarial context.
| paul_f wrote:
| This seems naive, there are always false positives
| DSingularity wrote:
| No. You can design a system where the FPR is essentially
| zero. Even if shitty md5 is used.
| Conlectus wrote:
| Perceptual hashes are not exact hashes, otherwise they
| would be useless for this task; you would just mirror the
| image or change 1 pixel.
|
| They are instead fuzzy classifiers, and thus have non-
| zero error rates.
| DSingularity wrote:
| How does this relate to the allegation I was responding
| to? There is a difference between false positives and
| false negatives you know. False negative is where the
| criminal evades the system through measures you describe.
| False positive -- what I was responding to -- is where
| Apple incorrectly accuses an innocent person and thus
| violates their privacy during the subsequent review.
| nomel wrote:
| But this trigger requires many of these false flags to
| exceed the threshold (most likely why it exists). I
| imagine the "one in a trillion" numbers they claim for
| the false flagging rate are probably grounded in reality, and
| make it trivial for human review.
| skinkestek wrote:
| I've explained the problem here:
| https://news.ycombinator.com/item?id=28099927
| Zababa wrote:
| > No. You can design a system where the FPR is
| essentially zero. Even if shitty md5 is used.
|
| How can you do that, considering md5 can have collisions?
| schoen wrote:
| A confusing thing that I think people haven't addressed
| clearly for the most part:
|
| For a hash (whether cryptographic or perceptual), there
| is a chance of _random_ collisions and also a difficulty
| factor for _adversarially-created_ intentional
| collisions. The random collision probability has to be
| estimated based on some model of the input and output
| space (with cryptographic hash functions, you would
| usually model them as pseudorandom functions and assume
| that the collision probability is the same one created by
| the birthday paradox calculation).
|
| Intentional collisions depend on insight about the
| structure of the hash function, and there are also
| different kinds of difficulty levels depending on the
| nature of the attack (preimage resistance, second-
| preimage resistance, and collision resistance). Gaining
| more insight about the structure of the hash function can
| act to reduce the work factor required for mounting these
| attacks. That should be true for perceptual hashes just
| as much as cryptographic hashes, but presumably all of
| the intentional attacks should start off easier because
| the perceptual hashes' threat models are weaker and
| there's much less mathematical research on how to achieve
| them.
|
| And in AI systems involving classifiers, it was generally
| easy for people to create adversarial examples given
| access to the model. Perceptual hashes for estimating
| similarity to _specific known images_ aren't the exact
| same thing because it's less like "how much like a cat is
| this image?" and more like "how much like NCMEC corpus
| image 77 is this image?", but maybe some of the same
| techniques would still work. In the cryptographic hash
| analogy, I guess that would be like trying to break
| preimage resistance.
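|
| For the random-collision side, a quick back-of-the-envelope
| using the birthday-paradox approximation (treating the hash
| as an ideal pseudorandom function; the corpus size is just
| an example, not Apple's actual volume):
|
|     import math
|
|     def p_any_collision(n_items, hash_bits):
|         # P ~= 1 - exp(-n^2 / 2^(bits+1)); expm1 keeps
|         # precision when the probability is tiny.
|         x = (n_items ** 2) / 2 ** (hash_bits + 1)
|         return -math.expm1(-x)
|
|     print(p_any_collision(10**9, 128))  # ~1.5e-21: negligible
|     print(p_any_collision(10**9, 64))   # ~0.027: not negligible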
| Zababa wrote:
| Thanks for the detailed explanation. I understand why
| that works for perceptual hashes if you make them really
| precise, however I doubt it would work with md5, which is
| why I asked.
| DSingularity wrote:
| The discussion I thought we were having was about false
| positives and not adversarially induced false positives.
| For the former the random collisions have a probability
| of 1/(2^64).
|
| To mitigate adversarial false positives one idea is to
| use the combination of a cryptographically strong hash
| along with a randomly selected perturbation of the file.
| Prior to hashing, perturb the file and submit both the
| hash and the selected perturbation to Apple. Apple
| selects the DB based on the perturbation and proceeds
| with matching and thresholding.
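|
| A toy, exact-match version of that idea (the names and the
| salting scheme here are mine, not Apple's; a real system
| would need the perturbation applied to a perceptual hash,
| and it only works if whoever holds the reference set can
| re-derive hashes under the client's salt):
|
|     import hashlib
|     import secrets
|
|     def client_fingerprint(image_bytes):
|         # Random "perturbation" chosen per upload, so an
|         # attacker cannot pre-compute a colliding image.
|         salt = secrets.token_bytes(16)
|         digest = hashlib.sha256(salt + image_bytes).hexdigest()
|         return salt, digest
|
|     def server_matches(salt, digest, reference_images):
|         # The server re-hashes each reference image under the
|         # client's salt before comparing; a real deployment
|         # would still require a threshold of hits.
|         return any(
|             hashlib.sha256(salt + ref).hexdigest() == digest
|             for ref in reference_images)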
| skinkestek wrote:
| > The discussion I thought we were having was about false
| positives and not adversarially induced false positives.
|
| I think the rest of us have been discussing how this can
| and will be abused, by definition by adversaries.
|
| Many of us have also observed for years how systems are
| abused so we sadly have a gut feeling for this.
| Zababa wrote:
| I thought that the random collision probability for md5 was
| way higher than that. If it's that low, you're right, this
| would work. I'm not sure I understand the part about the
| perturbation.
| DSingularity wrote:
| It's actually 1/(2^128).
|
| If the attacker does not know how the image will be
| perturbed prior to hashing then he cannot generate an
| image which matches with known CSAM.
| fossuser wrote:
| It's not naive, it's math.
| mpol wrote:
| It's not a math problem, it's a human problem.
|
| Suppose you have a partner who is a 'petite' woman of 34.
| She enjoys posting nudies on a website, but without her
| face in that picture. Someone who collects child porn
| downloads it, because he enjoys that picture. A year
| later he gets caught by the police and all his pictures
| get marked as 'verified child porn'. Suddenly you get
| marked as owning child porn.
| fossuser wrote:
| That isn't CSAM.
|
| https://www.nytimes.com/interactive/2019/09/28/us/child-
| sex-...
|
| Apple's thing has some sort of threshold anyway so one
| image would not trigger it. I don't buy your example -
| the CSAM images are not what you're describing.
| farmerstan wrote:
| Who determines whether something is CSAM? How do you or I
| know that every single one of those images actually is
| CSAM? How do we know if the FBI or CIA or CCP adds hashes
| of innocent pics in order to pin a crime against someone?
| 2pEXgD0fZ5cF wrote:
| > the false positive rate is essentially zero
|
| I highly doubt that
| DSingularity wrote:
| Based on what? Unless you point out a flaw in the math,
| it's zero.
| 2pEXgD0fZ5cF wrote:
| > Based on what?
|
| Based on the fact that ultimately we can't check the
| system. And based on the fact that at some point in the
| chain humans are involved. [1]
|
| What we are left with is "trust in Apple" not "trust in
| math".
|
| [1]: https://news.ycombinator.com/item?id=27878333
| pyuser583 wrote:
| This isn't a normal file hash. It's not SHA or bcrypt. A wide
| variety of states (1s and 0s) can have the same hash.
|
| The idea is to take an image and have all of its possible
| derivatives create the same hash.
|
| For example, if a hash was made of the Mona Lisa, any copy, no
| matter how large or small, or whether it is black and white,
| would have the same hash.
|
| Think of all the ways the Mona Lisa could be transformed and
| still be the Mona Lisa.
|
| The combinations of ones and zeros would be in the billions.
| If not more.
|
| And all those possible combinations of ones and zeros go back
| to the same "hash."
|
| That's extremely resource intensive.
|
| My guess is that they are going to transform the images into
| a very low resolution, black and white thumbnail. Then
| compare it against known abuse images that have been
| similarly transformed.
|
| Or they're using AI. They might be using AI.
|
| Either way, it's guesswork. How many images might be
| transformable to the same black-and-white thumbnail, I don't
| know.
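|
| That "tiny black-and-white thumbnail" guess is roughly how
| the simplest perceptual hash (average hash) works. A minimal
| sketch using Pillow follows; it is nothing like Apple's
| NeuralHash, but it shows why resized or re-encoded copies of
| a photo land on (nearly) the same value:
|
|     from PIL import Image
|
|     def average_hash(path, size=8):
|         # Shrink to a tiny grayscale thumbnail, then turn each
|         # pixel into one bit: brighter than average -> 1.
|         img = Image.open(path).convert("L")
|         img = img.resize((size, size), Image.LANCZOS)
|         pixels = list(img.getdata())
|         avg = sum(pixels) / len(pixels)
|         bits = "".join("1" if p > avg else "0" for p in pixels)
|         return int(bits, 2)
|
|     def hamming(a, b):
|         return bin(a ^ b).count("1")
|
|     # Scaled, recompressed, or lightly edited copies of the
|     # same photo usually differ by only a few of the 64 bits;
|     # unrelated photos usually differ by far more.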
| opinion-is-bad wrote:
| Stochastic domestic terrorist is not a term I ever expected to
| read, but it seems very fitting for the "data-driven" (but
| generally poorly evaluated) future we find ourselves drifting
| into.
| dane-pgp wrote:
| "Stochastic terrorism" has been a popular term for a
| while[0], but it usually refers to the actions carried out by
| individuals, not government agents. Ironically, though, the
| word "terrorism" itself originally referred to the actions of
| a tyrannical government against its own citizens[1], so
| perhaps the definition has come full circle.
|
| [0] https://www.dictionary.com/e/what-is-stochastic-
| terrorism/
|
| [1] https://www.merriam-webster.com/words-at-play/history-of-
| the...
| [deleted]
| chongli wrote:
| This really has been decades in the making, hasn't it? Apple is
| only adding the latest piece of the puzzle. The seeds of
| everything you've described were planted years ago.
|
| How did we get here? How did everything become so politicized
| and polarized? How did law enforcement become so militarized?
| How did we as a society become so terrified and distrustful of
| our neighbours?
| [deleted]
| mthoms wrote:
| > _Problem is that this scanning is necessarily fuzzy and there
| is going to be a false positive rate to it. And the way that
| you'll find out that you've tripped a false positive is that
| the SWAT team will knock your door down and kill your dog (at a
| minimum)._
|
| Not true. Hash matches are to be human reviewed. So no, people
| won't get "swatted" _accidentally_ as you allege.
|
| The other concerns people have been voicing are certainly valid
| though (IMHO).
| lamontcg wrote:
| Google and Apple have burned all their trust for this kind of
| thing. They already do a terrible job with all the people who
| get
| banned off their platforms with no notice of what they did
| wrong and no recourse. These companies cannot be trusted with
| the power to arrest you. They don't care if they ruin
| innocent lives due to edge conditions.
| mikewarot wrote:
| >Hash matches are to be human reviewed.
|
| Until it proves too expensive, then a different AI system
| will do it instead. I have zero faith that a fully
| competent, well-trained, well-rested, well-paid person will
| actually be doing these reviews in the long run.
| mthoms wrote:
| If you actually think there is going to be a fully
| automated system dispatching police SWAT teams throughout
| the US without a manual (or judicial) review then.... I
| really don't know what to tell you.
| sithlord wrote:
| Guess it's hard to believe that there is a world outside
| of the United States. Some governments already do whatever
| it takes to pin some BS on someone who opposes them and
| have them executed. This would be no different.
| mthoms wrote:
| I don't understand. If the government can just claim you
| had CP without proof, then they can just... do that? They
| don't need Apple's system.
|
| Especially since anything flagged by this system is
| manually reviewed by Apple. So, there would exist
| counter-evidence for the govt claim.
|
| Don't misunderstand, I see how this system is ripe for
| abuse. I was just commenting on the specific claim that
| there will be _automated_ SWAT call outs (presumably in
| the US).
| himinlomax wrote:
| > If the government can just claim you had CP without
| proof, then they can just... do that?
|
| The problem is that they can do whatever if you _have_
| CP. Emphasis on "have": you might not even know you
| _have_ it, because all it takes for you to be guilty is
| for a forbidden bit of data to be on your disk or cloud
| account. How did it end there? Doesn't matter much. It's
| unlikely you'll be convicted if someone else maliciously
| puts CP on your drive, but it's extremely likely your
| whole neighborhood will know about it before it's
| settled. Think about that.
|
| > They don't need Apples system.
|
| Apple's system makes sure that whenever someone puts
| CP on someone else's account, it rings the police and
| starts the nightmare. Without it, there's a few extra
| steps that make the nightmare much less likely to happen
| in the first place.
| mthoms wrote:
| This is plausible and exactly why I oppose this system.
|
| The only argument I made was against the claim that
| _fully automated_ police dispatches will occur because of
| this system (in the west).
|
| Making outrageous, tinfoil hat level claims does not help
| our collective argument against this system.
|
| It just makes us look like paranoid conspiracy theorists.
|
| Let's make coherent arguments (like the one you made
| above) instead of fantasy ones.
| smorgusofborg wrote:
| It should be extremely easy for any group that has access
| to tools like NSO Group's to use the information on a
| device to convincingly add data to devices. What is
| generally missing is the question of why certain people's
| devices were selectively searched. Attacks by the west on
| journalists often rely on the excuse of border searches
| for example.
| jandrese wrote:
| Yeah, it normally requires an anonymous phone call from a
| VoIP number.
| LMYahooTFY wrote:
| Is this an allusion to an actual occurrence?
| macintux wrote:
| https://en.wikipedia.org/wiki/Swatting
| mikewarot wrote:
| It'll all be shadow-ban type stuff, your account will be
| turned off, without appeal, things of that nature.
| mthoms wrote:
| That's a different thread of comments. This one is about
| automated dispatching of SWAT teams.
| himinlomax wrote:
| A dodgy phone call from an unknown number abroad by a
| stupid teenager is enough to get a SWAT team to murder
| innocent civilians; what makes you think they'll be
| smarter when the "call" comes from a robot?
|
| I mean, seriously, you're assuming as beyond obvious
| something that's demonstrably wrong RIGHT NOW.
| mthoms wrote:
| Those prank SWAT calls are (a) from a human and (b) based
| on the assumption of _IMMEDIATE_ risk of death/injury.
| Furthermore, (c) the CP scanning implementation itself
| _requires_ human review.
|
| So no, under the proposed implementation (and current
| judicial requirements in the west) robots dispatching
| armed police to kill pets or people without ANY human
| oversight because of a situation that is _not_ time
| sensitive is mere fantasy.
|
| Don't misunderstand, I am against this whole thing for
| the reasons many others have well articulated in this
| thread.
|
| But the claim that robots are dispatching armed police
| without any human interference RIGHT NOW is provably
| wrong.
|
| Show me one incident where this has happened already and
| I will donate $100 to the charity of your choice.
|
| And yes, the judicial rules and Apple's implementation
| could change in the future. That's certainly a risk but
| is not the case RIGHT NOW. I mean, seriously?
| adib wrote:
| Or Mechanical Turked.
| hahajk wrote:
| I have to imagine that you'll be investigated and can let
| Apple rehash your photos and find the ones that tripped the
| system. If Apple rehashes all your photos and suddenly none
| of them are tripping, and they accuse you of deleting the
| bad ones (assuming you haven't deleted anything) the
| problem isn't the hash algo, it's something else in the
| system.
|
| That said, I don't like my phone being a snitch.
| thayne wrote:
| Eh, humans make mistakes. That may reduce the false positives,
| but if this program runs long enough, I'm sure it will happen
| to someone when a reviewer accidentally presses the wrong
| button.
| adrianmsmith wrote:
| > Hash matches are to be human reviewed.
|
| This comment suggests this phrase might be cleverly worded,
| to make it seem like the images are human reviewed, when
| that is not actually the case:
| https://news.ycombinator.com/item?id=28096059
| aj3 wrote:
| Reviewing potential matches sounds like an awful job.
| Retention might be even lower than in those FB abuse centers.
| alisonkisk wrote:
| That's part of why possession is illegal. The material is
| toxic.
| wizards420 wrote:
| For another thought on how easy this could suffer from scope
| creep, iOS does (did?) store whole-app screen captures as PNGs
| for use on the app switcher, etc.
|
| If this is already scanning photo libraries locally, just add the
| temp dir for app screen captures and they're effectively
| monitoring your screen too.
| [deleted]
| throwaway13239 wrote:
| I don't see why we would believe they are not already doing this?
| Today's news is just that they can now legally take action on it.
|
| Also the whole SWAT scenario is a bit far-fetched, as they will
| most likely read thru your entire life (don't forget they already
| have access to it) to make sure they don't look stupid on the
| news.
| antpol wrote:
| Your iPhone is already capable of identifying nude minors in
| near real time. Is it really that far-fetched that a SWAT team
| gets triggered into action when the iPhone detects minor abuse?
| throwaway13239 wrote:
| Pretty far-fetched; to swat someone you need a crisis
| situation: hostage/bomb/shooting.
| throwayws wrote:
| Planting false evidence is getting a new twist here. The attacker
| doesn't even have to make a report! The victim's computer does it
| for him. Disk encryption malware may have a new successor.
| Effective and scalable extortion as a service.
| Kaytaro wrote:
| This is a great point. You don't even need to unlock an iPhone
| to take a picture. So in theory anyone with access to your
| phone for a few seconds could incriminate you with little
| effort.
| selsta wrote:
| This is already possible today with things like iCloud
| Photos, Google Photos, OneDrive etc.
| mobiledev2014 wrote:
| Today: "WTF is this?" _delete_
|
| iOS 15: "WTF is this?" _SWAT team crashes through window_
| fossuser wrote:
| This is not true. The check is only against known CSAM
| hashes.
| noisem4ker wrote:
| Just photograph a known bad picture.
| fossuser wrote:
| Then that's a different photo and will have a different
| hash.
| antpol wrote:
| Are you sure the hash function literally called
| "NeuralMatch", running on devices with 2+ generations of
| AI-capable chips, won't have "collisions"?
| Simucal wrote:
| These aren't cryptographic hashes. They are perceptual
| hashes and a picture of a picture could absolutely end up
| with the same phash value.
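|
| For anyone who wants to see that effect, the open-source
| `imagehash` package (just a common stand-in here, not
| Apple's algorithm) makes the fuzziness easy to play with;
| the file names below are placeholders:
|
|     # pip install pillow imagehash
|     from PIL import Image
|     import imagehash
|
|     original = imagehash.phash(Image.open("original.jpg"))
|     rephoto = imagehash.phash(Image.open("photo_of_it.jpg"))
|
|     # Subtraction gives the Hamming distance between hashes;
|     # a small distance (a handful of the 64 bits) is usually
|     # treated as a match.
|     print(original - rephoto)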
| skavi wrote:
| is there really no fuzziness to it? If not, can't this be
| defeated by simply reencoding the image?
| fossuser wrote:
| I think it has gotten more sophisticated to detect
| cropped images and small changes now:
| https://inhope.org/EN/articles/what-is-image-hashing
|
| The example is somewhat contrived.
|
| If a 'friend' takes your phone and has access to it, then
| uses it to take images of CSAM similar enough to the
| original image that it triggers the hash match, _and_ does
| this enough times to go over Apple's threshold to flag
| the account after these images are uploaded to iCloud
| without the original phone owner noticing, _then_ yes, it
| might cause a match.
|
| At that point the match is probably a good thing (and not
| really a false positive anyway) - since it may lead back
| to the friend (that has the illegal material).
| zbobet2012 wrote:
| Or you know, anyone who wants to plant material on a
| device and has physical access. Say a disgruntled
| employee before leaving, or an ex, or a criminal, or...
|
| Or anyone who can just text you, since iMessage backs up
| to iCloud automatically...
| jeromegv wrote:
| While iMessage backs up to iCloud, this measure is only
| for photos stored in the iCloud Photo Library. So sending
| a text with the photo is not enough.
| [deleted]
| archagon wrote:
| Picture of a picture?
| cblconfederate wrote:
| Apple will release this in their desktop PCs. The major
| consequence of this is that, a year down the line, Microsoft will
| also be compelled/forced to join this "coalition of the good".
| After all, they already scan all your files for viruses; it would
| be a shame if they didn't scan for anything that is deemed
| incriminating and also call the cops on you. Of course, the
| children are being used as a trojan horse again. We don't even
| mention the giant logical leap from finding someone possessing CP
| to automatically considering them a child molester, yet someone
| possessing ~~an action~~ a murder video is not considered a
| murderer.
|
| I'm just imagining the situation where these companies took the
| initiative to scan all their users' data in a situation like the
| attack on the US Capitol this year. Creating new affordances for
| spying always leads to their abuse at the first chance when an
| extreme circumstance occurs. So there is no excuse for creating
| those affordances just "because they can".
| userbinator wrote:
| Of course, the next step after that is to make "non-authorised"
| operating systems -- which may include Linux (except for
| specially signed versions that will also include similar
| spyware) -- "deprecated" or "discouraged", then "suspicious",
| and perhaps even eventually illegal.
|
| Stallman predicted a similar outcome, and although he (and many
| others) thought the end of computing freedom would be due to
| copyright/DRM, I wouldn't be surprised if "the children" is
| what eventually pushes things over the edge.
|
| https://www.gnu.org/philosophy/right-to-read.en.html
|
| _in a situation like the attack on the US Capitol this year_
|
| ...and as much as I'd like to see at least that amount of
| "watering the liberty tree" directed at Big Tech, it
| unfortunately would likely lead to even more authoritarian
| outcomes. Any future fights for freedom will need to happen
| online and non-violently, but on platforms that are also under
| their control.
| mulmen wrote:
| > ...and as much as I'd like to see at least that amount of
| "watering the liberty tree" directed at Big Tech [...]
|
| I'm having a hard time finding a reading of this that isn't
| advocating violence against tech workers. Is that what you
| intended?
| wizzwizz4 wrote:
| Given corporate personhood, destroying the _companies_
| could be considered corporate murder. (I admit it's a very
| stretched reading.)
| mulmen wrote:
| I agree that is a favorable reading.
|
| But it is especially a stretch in the context of January
| 6th which involved violence directed at people. And the
| rest of the paragraph laments that future action must be
| nonviolent.
| wizzwizz4 wrote:
| So, in fact, it's _not_ advocating the violence.
| mulmen wrote:
| It's hard to say. Words like "I'd like" and
| "unfortunately" make me think it's a desire for violence
| but an acknowledgement that it's not worthwhile. Like I
| said, I'm trying to find the favorable reading. It seems
| hyperbolic at best.
| slg wrote:
| >We don't even mention the giant logical leap from finding
| someone possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer.
|
| This is a wild and, in my opinion, wrongheaded analogy.
| Possessing CP is a crime by itself. It doesn't matter if the
| person possessing it is actually a molester or not. It is just
| like the possession of drugs being illegal and it does not
| matter whether the person has actually taken or plans to take
| those drugs.
| cblconfederate wrote:
| Yeah, it wasn't meant to be an analogy. For example, consider a
| real murder video: is its possession illegal, and does it mean
| its owner should be a murder suspect?
| slg wrote:
| You are still falling into the same line of thinking here.
| The person isn't arrested because we assume they are a
| molester. They are arrested for possessing illegal
| material. Your argument here really sounds like you are
| suggesting that CP should be legal and it is the action of
| molesting that should be illegal. I don't think you would
| find many people who agree with that. Society has decided
| both are separate crimes and while there is likely large
| overlap, overlap is not required to be charged for either
| crime individually.
| cblconfederate wrote:
| > Society has decided both are separate crimes
|
| Actually it's legislators and the courts that come up
| with these laws and they are complicated. The question is
| if these laws reduce child abuse or simply increase
| spying, and in the end what is the acceptable balance
| between these two.
| slg wrote:
| Legislators are a proxy for society. Do you believe that
| if you polled the US that any sizable portion of the
| country would be in support of legalizing CP?
| cblconfederate wrote:
| In liberal democracies legislators also protect
| fundamental rights from the whims of the majority.
|
| > if you polled the US that any sizable portion
|
| No, but they would also agree that they should have an
| option to keep their data private. Maybe the public
| should be polled about this tradeoff?
|
| For example the case of virtual CP has an interesting
| legal history
| https://www.freedomforuminstitute.org/first-amendment-
| center...
| slg wrote:
| >No, but they also wouldn't consent to having no option
| left to record their thoughts privately. Maybe the public
| should be polled about this tradeoff?
|
| It is fine if this is your argument, but you don't have
| to wrap this argument in with the very legality of CP.
| You can acknowledge something is and should be a crime
| while also being against these types of automated dragnets
| to find people guilty of said crime.
| cblconfederate wrote:
| > something is and should be a crime while also being
| against these type of automated
|
| But there is a clear tradeoff between the two so saying
| that would be a useless platitude. If we really see the
| internet as an extension of our vocal chords, then we
| should have individual rights to it, especially
| considering the fact that the internet infrastructure is
| not provided by the governments themselves, only the
| spying is.
| slg wrote:
| I don't know what to say to you if you consider someone
| voicing opposition to this move by Apple as a "useless
| platitude" if they don't also believe that we have a
| constitutional right to post CP on the internet.
| matheusmoreira wrote:
| If you polled the population, chances are they'd support
| torture, rape and execution as punishment for abusers.
|
| The real crime is child abuse. Material related to that
| is also illegal because it presumably creates demand for
| the abuse. Whether that's actually true I don't know.
| int_19h wrote:
| The worst thing about it is that CP possession is a _strict
| liability_ crime in most jurisdictions. Meaning that
| prosecution doesn 't have to prove intent - if they can find
| it on your machine, you're guilty regardless of how it got
| there.
| anonuser123456 wrote:
| In criminal law you must be aware of an item for it to be
| under your possession. If someone plants illegal content on
| your computer, and you are unaware, you aren't legally in
| possession.
| int_19h wrote:
| "Typically in criminal law, the defendant's awareness of
| what he is doing would not negate a strict liability mens
| rea (for example, being in possession of drugs will
| typically result in criminal liability, regardless of
| whether the defendant knows that he is in possession of
| the drugs)."
|
| https://www.law.cornell.edu/wex/strict_liability
| anonuser123456 wrote:
| Note the word 'typically'.
|
| The purpose of strict liability in possession is to
| prevent the defense that someone does not know the legal
| status of an item in their possession. It does NOT
| prevent the defense that someone does not _know_
| something to be in their possession.
|
| For example, it is not a defense to have drugs and claim
| "but I didn't know they were illegal". It is a defense to
| claim "I did not know they were there."
|
| In drug cases with actual possession, it is difficult to
| support a defense of "I didn't know they were there",
| which is why charges typically result in criminal
| liability. The drugs were physically on you, and unless
| you have evidence that someone planted them, it is
| unlikely you could establish reasonable doubt.
|
| But in cases of electronic material for networked
| devices, there is most certainly an affirmative defense
| to counter actual possession and constructive possession.
| Computer devices are hacked all the time, and network &
| device logs exist. For example, if a prosecutor agrees to
| the fact that a defendant had no knowledge of the
| material, a judge would toss the case and a jury would
| not convict you. The law is not meant to pedantically
| convict you of non-crimes.
| ostenning wrote:
| But for any serious charge you need indisputable evidence,
| and usually lots of it; usually the threshold is quite
| high. But in the United States the system is insane, so I
| could be wrong.
| himinlomax wrote:
| All the evidence required is that it is there.
|
| Does not matter if someone else put it there without your
| knowledge. Or maybe it matters, but then it's on you (for
| all intents and purposes) to prove that it was without
| your knowledge.
| alisonkisk wrote:
| The punishment for strict liability is much lower than
| for the same act with intent.
| intricatedetail wrote:
| It's still essentially a death sentence if you are
| innocent.
| beebmam wrote:
| Windows has been doing this for years. Not with this new tech,
| but with older methods.
| TrevorJ wrote:
| I'm skeptical of the claim that MS has been A: scanning files
| on local windows machines and B: forwarding anything
| concerning to law enforcement. That's a fairly extraordinary
| claim, and I would like to see the evidence.
| [deleted]
| ribosometronome wrote:
| Microsoft hasn't been scanning files on local machines but
| Microsoft has been doing essentially the same thing Apple
| has been for years -- they pioneered it with PhotoDNA back
| in 2009. It's been in use with OneDrive since back when it
| was called SkyDrive
| (https://www.makeuseof.com/tag/unfortunate-truths-about-
| child...). Apple's implementation is scanning images 1)
| stored on iCloud or 2) in Messages if parents turn it on.
|
| The first seems pretty arbitrary -- why is it worse to scan
| files you're sharing with the cloud locally than in the
| cloud (except potential performance/battery impact, but
| that seems moot)?
|
| If Apple brings this feature to the desktop, it seems
| likely they'd be using it the same way: files stored in
| their cloud.
| Grazester wrote:
| OK, but this is not Windows, which is what the OP stated.
| Grazester wrote:
| What are you talking about?
| Asmod4n wrote:
| Microsoft already does this.
|
| "Microsoft removes content that contains apparent CSEAI. As a
| US-based company, Microsoft reports all apparent CSEAI to the
| National Center for Missing and Exploited Children (NCMEC) via
| the CyberTipline, as required by US law. During the period of
| July - December 2020, Microsoft submitted 63,813 reports to
| NCMEC. We suspend the account(s) associated with the content we
| have reported to NCMEC for CSEAI or child sexual grooming
| violations."
|
| https://www.microsoft.com/en-us/corporate-responsibility/dig...
| cblconfederate wrote:
| I think that's for files uploaded to OneDrive etc., not for
| all the files on any Windows PC.
| Asmod4n wrote:
| The encouraged Windows 10 setting is to sync your whole
| profile to OneDrive.
| cblconfederate wrote:
| Does this imply that Apple was not checking the photos
| stored on their servers for CP? Then iCloud must have
| been the best way to share it.
| Asmod4n wrote:
| It at least looks like Apple is the last one to do this.
| Google, Flickr, Facebook, Twitter etc. have been doing
| this forever.
| jachee wrote:
| Almost like Apple was the last holdout to not give up
| customers' privacy, and eventually had to cave. So they
| came up with a sophisticated system to handle the
| scanning on-device, and only for data that's already
| destined for their servers.
| [deleted]
| threeseed wrote:
| > finding someone possessing CP to automatically considering
| them a child molester, yet someone possessing an action movie is
| not considered a murderer
|
| Sure sounds like you're trying to justify child pornography
| here.
| PhasmaFelis wrote:
| > We don't even mention the giant logical leap from finding
| someone possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer.
|
| Filming an action movie does not require actually murdering
| people.
| jeltz wrote:
| And neither does producing child porn. Drawn child porn is
| banned in most countries.
| PhasmaFelis wrote:
| And that's silly. But I'm talking about _actual_ child
| porn.
|
| I'm not defending Apple. I'm saying it's absurd to act like
| "merely" distributing explicit photos taken without the
| subject's consent is a victimless crime.
| [deleted]
| u10 wrote:
| > possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer
|
| That's specious reasoning. Someone who possesses an action movie
| likes action movies, while someone who possesses child porn
| likes child porn. One is OK; the other is pretty vile and
| illegal for a reason.
|
| I don't agree with Apple on this but let's be clear on what is
| and what isn't
| maximus-decimus wrote:
| You can have videos of somebody actually getting killed and
| it's not a crime. Reddit killed the /r/watchpeopledie
| subreddit, but it wasn't illegal to go on.
|
| Until people are actively encouraging people to die or getting
| people killed by paying for gladiator matches, I agree with
| you it's not the same thing, but I don't think the person
| you're replying to is talking about action movies.
| birdyrooster wrote:
| You realize if this is as problematic as we are talking about,
| and people's lives are ruined from false positives, it will
| create a political movement that will make this illegal. Stop
| trying to lobby companies and lobby for politicians that will
| rein this shit in.
| flutas wrote:
| > Someone who possesses an action movie likes action movies,
| while someone who possesses child porn likes child porn.
|
| Unless it's planted. [0] Or sent to you. [1] Or (farther out
| there) happens to be embedded on a site you visited and ends
| up in your browser cache.
|
| [0]:
| https://www.nytimes.com/2016/12/09/world/europe/vladimir-
| put...
|
| [1]: https://www.nytimes.com/2019/06/17/nyregion/alex-jones-
| sandy...
| gjsman-1000 wrote:
| It only scans images in your iCloud Photo Library. Not
| paying for iCloud? No scanning. Not in your photo library?
| No scanning. And even then, only for known content, not new
| content.
| flutas wrote:
| Conveniently, iOS 15 also syncs images from your messages
| and many other places into your photos. Whether this will
| put that file in your iCloud Photo library, I do not
| know.
|
| https://www.macrumors.com/how-to/see-photos-shared-with-
| you-...
|
| On top of that, at this stage you are right. How long
| before they move it to every file in your device's
| storage "because of the children!"
| runlevel1 wrote:
| So the attacker only needs to get access to your iCloud.
| Your iPhone will happily sync down photos uploaded
| elsewhere.
|
| You don't have to be paying for iCloud, either. There's a
| free tier, so I'd imagine almost all iPhones are using
| some tier of it.
|
| iCloud account break-ins aren't exactly rare. An
| accusation, even if false, could ruin an innocent
| person's life.
| nerdponx wrote:
| Not only is there a free tier, but the iPhone defaults
| are cleverly configured so that you quickly fill it up
| with random junk on your phone, and feel pressured into
| paying for more iCloud storage, because the default sync
| behavior is so non-obvious, and the settings to disable
| it are buried.
|
| I know _multiple_ people (most of them in their 50s or
| older) who started paying for iCloud because they thought
| it was their only option.
| simondotau wrote:
| If this was going to happen, it would have already
| happened on Android, Gmail, OneDrive, or any of dozens of
| services which already do what Apple is now doing.
|
| Or are you saying that malicious activity is only
| interesting if it was on an Apple device?
| mercora wrote:
| > There's a free tier, so I'd imagine almost all iPhones
| are using some tier of it.
|
| I was quite surprised to see this was the default, or at
| least was set up without my knowledge.
| mike3498234 wrote:
| > We don't even mention the giant logical leap from finding
| someone possessing CP to automatically considering them a child
| molester, yet someone possessing an action movie is not
| considered a murderer.
|
| Are you suggesting it's OK to possess CP as long as you're not
| a molester? What a fucking idiot.
| hughrr wrote:
| That's a stupid assumption to make based on that. If someone
| emails or messages you a picture then you're in possession of
| it whether you chose to be or not.
| Mike8435 wrote:
| No, it was not a stupid assumption. You need to read more
| carefully, as I exposed a flaw in his reasoning. The
| suggestion was that since CP possession does not prove
| physical molestation, it is therefore OK. Actually,
| possession of CP is a crime in itself. This is an objective
| fact. I've known of otherwise intelligent people who do
| believe it's actually OK to possess child porn if they're
| not physically molesting anyone.
| cblconfederate wrote:
| Of course it is not OK. Does making it illegal and
| violating everyone's privacy rights reduce child abuse?
| I'd like to see the evidence. Another example is virtual
| child pornography, which, evil as it is, does not harm
| anyone during its production, yet is also illegal
| (in the US). The question is how far governments will go
| with the instrumentalization of child abuse prevention in
| order to spy on their own citizens.
| anonuser123456 wrote:
| >We don't even mention the giant logical leap from finding
| someone possessing CP to automatically considering them a child
| molester, yet someone possessing ~~an action~~ a murder video is
| not considered a murderer.
|
| That is because the consumption of child porn induces people to
| create it. The consumption of action movies does not induce
| people to murder anyone.
| aj3 wrote:
| [citation needed]
| rossmohax wrote:
| > yet someone possessing an action movie is not considered a
| murderer.
|
| I think it is more similar to drugs; possessing one doesn't mean
| you are consuming it, yet it is an illegal substance and
| production, transportation and distribution are understandably
| not allowed.
| heavyset_go wrote:
| Good point. There are over 80,000 drug overdose deaths a year
| in the US, if these systems can detect nudity and CSAM, then
| why not save tens of thousands of lives by detecting heroin
| and fentanyl dealers, too?
| 734129837261 wrote:
| It'll start with protecting children. We all want to protect
| children, don't we? Why do you want children abused? Are you a
| child abuser, what do you have to hide?
|
| Next it's elderly people. We don't want our forgetful elders to
| get lost, do we? What if grandma wanders off but is in someone's
| picture, surely you want the police to know right that second
| where she is?
|
| Next up, terrorists! Four adult brown men in an unmarked van are
| certainly suspicious, especially near a government building. Your
| Instagram selfies will help the police in the USA shoot even more
| innocent people for no good reason.
|
| Animal abuse is next. You don't like puppies being abused, do
| you? Why do you hate puppies? Do you take part in illegal
| underground dog fights?
|
| Gosh, that video looks like it might have been pirated.
|
| Nice house, but based on your estimated income it's really
| strange that you have such a big television. Is that safe full of
| cash? How much cash is on your table?
|
| Is that a bit of dust, flour, or maybe crack cocaine?
|
| Is that person asleep or recently murdered?
| [deleted]
| gjsman-1000 wrote:
| Remember that it only scans against a known database of already
| found content, and does not try to find new content.
| ohazi wrote:
| For now.
| dane-pgp wrote:
| Just as "for now" it doesn't try to stop copyright
| infringement.
|
| Seemingly every aspect of digital technology, from search
| engines to DNS providers, has been co-opted into the fight
| against piracy, so I wouldn't be surprised if the media
| industries started threatening Apple with "contributory
| infringement" suits if they don't re-purpose this
| technology for them.
| antpol wrote:
| "it would be a shame if AppleTV+ won't get license
| extension for that popular <insert label/studio> show,
| isn't it"
| p2t2p wrote:
| There's this Apple TV+, a subscription for Apple's
| originals. I think in a relatively short amount of time
| it'll turn into "It would be a shame if we banned <insert
| label/studio> from our Apple TV platform"
| nomel wrote:
| Is this true? The message triggers try to identify nudity
| within the accounts of minors. Is the notification only going
| to the parent? Is it stored for possible later use? Are those
| photos ever reviewed?
| noduerme wrote:
| Excellent point. The messaging triggers for nudity in child
| accounts would imply that there is a separate, on-device
| scanning capability that has nothing to do with hash
| matching. Taken together with the backdoor for data
| exfiltration, and considering these come in the same
| announcement, there's no reason not to consider them part
| of the same spyware framework.
| dwaite wrote:
| > It'll start with protecting children. We all want to protect
| children, don't we? Why do you want children abused? Are you a
| child abuser, what do you have to hide?
|
| To be clear, this is continued enforcement for years-old
| regulation. The feature is only enabled in the US where it is
| required.
|
| The implementation is changing from cloud-based matching (which
| requires photos to be readable by their cloud infrastructure)
| to local-based matching with threshold tokens (which would
| allow them potentially to be in compliance while making the
| system E2E encrypted w two additional key release mechanisms
| (key escrow via separate audited HSM systems, the given
| threshold disclosure of the image encryption key)).
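|
| The "threshold disclosure" part behaves conceptually like
| Shamir secret sharing: one share of a decryption key is
| released per match, and the key only becomes recoverable
| once enough shares exist. A generic toy version (not Apple's
| actual protocol, which layers this over private set
| intersection):
|
|     import secrets
|
|     PRIME = 2 ** 127 - 1  # Mersenne prime, fine for a toy demo
|
|     def make_shares(secret, threshold, n_shares):
|         # Random polynomial of degree threshold-1 whose
|         # constant term is the secret.
|         coeffs = [secret] + [secrets.randbelow(PRIME)
|                              for _ in range(threshold - 1)]
|         def f(x):
|             return sum(c * pow(x, i, PRIME)
|                        for i, c in enumerate(coeffs)) % PRIME
|         return [(x, f(x)) for x in range(1, n_shares + 1)]
|
|     def recover(shares):
|         # Lagrange interpolation at x = 0 gives the constant term.
|         secret = 0
|         for i, (xi, yi) in enumerate(shares):
|             num, den = 1, 1
|             for j, (xj, _) in enumerate(shares):
|                 if i != j:
|                     num = num * -xj % PRIME
|                     den = den * (xi - xj) % PRIME
|             secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
|         return secret
|
|     key = secrets.randbelow(PRIME)
|     shares = make_shares(key, threshold=3, n_shares=10)
|     assert recover(shares[:3]) == key  # 3 matches: key recoverable
|     # any 2 or fewer shares reveal nothing about the key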
| tomjen3 wrote:
| >E2E encrypted w two additional key release mechanisms
|
| Aka not E2E and therefore something that Cook should face a
| fraud charge for saying it is.
|
| Apple needs to make it true e2e yesterday, and tell the FBI
| that they can either approve of it, or never use an iPhone
| again.
| dwaite wrote:
| I'm the one saying it is, not Apple.
| roody15 wrote:
| " which would allow them potentially to be in compliance
| while making the system E2E encrypted ..."
|
| Apple's system is not E2E encrypted if a local program scans
| files prior to upload.
| antpol wrote:
| > while making the system E2E encrypted w two additional key
| release mechanisms (key escrow via separate audited HSM
| systems, the given threshold disclosure of the image
| encryption key)
|
| such a long sentence to say "backdoor"
| dannyobrien wrote:
| Could you point to the regulation that requires this?
| macintux wrote:
| See the "Federal CSAM Law" section here:
|
| https://cyberlaw.stanford.edu/blog/2020/01/earn-it-act-
| how-b...
|
| > Section 2258A of the law imposes duties on online service
| providers, such as Facebook and Tumblr and Dropbox and
| Gmail. The law mandates that providers must report CSAM
| when they discover it on their services, and then preserve
| what they've reported (because it's evidence of a crime).
| dannyw wrote:
| The law doesn't require them to actively go around
| scanning for it.
| jachee wrote:
| They're scanning for it as it _enters_ their system. It's
| only scanned if it's set to go to iCloud.
| noduerme wrote:
| Yes, but then there's a way to upload it in plaintext.
| That's the backdoor. That can, and will, eventually be
| used to exfiltrate any file, anytime, whether it would be
| going to the cloud or not.
| macintux wrote:
| It does set a terrible precedent, but it's possible this
| is a step towards E2E encryption on iCloud data; a way to
| comply with the law while preventing law enforcement from
| being able to subpoena other data.
|
| Apple is being its usual cryptic self about this, which
| is once again breeding uncertainty, but I still have hope
| in the end this will work out.
| noduerme wrote:
| Again though, what is the guarantee this is only going to
| be used to scan data uploaded to the cloud? If Apple has
| the ability to exfiltrate data from a phone at whim, they
| lose any deniability or leverage they still have with
| authoritarian regimes that want all the data on a phone.
| It's not just a terrible precedent, it's a dangerous
| piece of malware.
| dwaite wrote:
| They of course have the ability to exfiltrate data, they
| created the hardware and OS.
|
| However since this is part of the opt-in iCloud photos
| sharing mechanism, it doesn't appear they have started to
| exfiltrate data without consent.
| noduerme wrote:
| >> They of course have the ability to exfiltrate data,
| they created the hardware and OS.
|
| I don't know why you're assuming they have a backdoor
| built in already. The whole point is that they don't, but
| they're going to add one.
___________________________________________________________________
(page generated 2021-08-07 23:02 UTC)