[HN Gopher] An Open Letter Against Apple's Privacy-Invasive Cont...
       ___________________________________________________________________
        
       An Open Letter Against Apple's Privacy-Invasive Content Scanning
       Technology
        
       Author : nomoretime
       Score  : 1065 points
       Date   : 2021-08-06 11:22 UTC (11 hours ago)
        
 (HTM) web link (appleprivacyletter.com)
 (TXT) w3m dump (appleprivacyletter.com)
        
       | VBprogrammer wrote:
       | Seems to me that a 1x1 pixel image tag pointing at an image
       | matching one of the signatures would be enough to trigger some
       | kind of action, even if the end user doesn't see that single
       | pixel. What then, does your phone get seized automatically?
       | 
       | Of course the civil disobedience way of dealing with this would
       | be to find an entirely safe image which matches the hash so that
       | every website can safely embed such an image and show the utter
       | futility of the idea.
        
         | hnthrowaway9721 wrote:
         | I'm shocked that on a website filled with engineers, developers
         | and myriad other hackers, many of whom seem to be outraged by
         | this level of invasiveness, no one is suggesting direct action.
         | How far will people let this continue while standing back and
         | doing nothing? Rights are the things you're willing to fight
         | for. If all you're willing to do is make a vaguely angry post
         | on a forum, then that's all you'll end up with.
        
       | mark_l_watson wrote:
       | Yes, a major disappointment. I suggest reading a few William
       | Gibson cyberpunk books to get a rough idea of the future, both
       | society and tech. In fiction, and as a non-technical author, I
       | think he gets it right. Everyone who has the technical knowledge
       | to do so needs to decide how much effort they are willing to go
       | through for privacy.
       | 
       | Personally, for my iPhone photos, I have my phone setup to upload
       | every picture I take to Apple iCloud, Google Photos, and
       | Microsoft OneDrive, so I gave up on photo privacy many years ago.
        
       | Croftengea wrote:
       | Now I wonder how long will it take for this tech to reach macOS.
        
         | xyst wrote:
         | Later this year in the Monterey update.
         | 
         | " These features are coming later this year in updates to iOS
         | 15, iPadOS 15, watchOS 8, and macOS Monterey.*"
         | 
         | -- https://www.apple.com/child-safety/
        
           | dexwell wrote:
           | Of the three new features mentioned, the CSAM scanning one is
           | not yet coming to Monterey.
           | 
           | > new technology in iOS and iPadOS* will allow Apple to
           | detect known CSAM images stored in iCloud Photos.
           | 
           | > * Features available in the U.S.
           | 
           | The iMessage and Siri feature paragraphs do mention Monterey.
        
             | EasyTiger_ wrote:
             | The gall to refer to this as "features"
        
       | fortran77 wrote:
       | I'm glad people are sticking their necks out and opposing this.
       | Typically people keep quiet because they don't want to be branded
       | as "pro child-pornography" and that opens the door for this sort
       | of surveillance.
        
       | whitepaint wrote:
       | We need open-source smartphone, for the love of god.
        
         | glenvdb wrote:
         | I think there are some around?
         | 
         | https://en.wikipedia.org/wiki/List_of_open-source_mobile_pho...
        
       | klad_majoor wrote:
       | Simple rule: if it is possible it will happen. If you want total
       | security, go analog.
       | 
       | Strategically, apple showed its cards. But then again, that was
       | bound to happen too ;-)
        
       | rubyn00bie wrote:
       | So the M1 has turned me off of Apple products because, quite
       | frankly, I don't want to spend (more) of my time fixing shit a
       | trillion dollar company broke and doesn't care to fix.
       | 
       | This though, this will be the nail in the coffin with my 25 year
       | relationship to Apple. I probably wouldn't even have batted an
       | eye at it to be honest, iff, Apple hadn't been selling me on the
       | idea that their platform is "private and secure." But... they
       | have... And this has made it quite clear, they will absolutely
       | destroy that security/privacy the moment they want/need to. So I
       | have been paying, a hefty premium, to be lied to and that makes
       | me fucking cross. I have previously supported Apple because it
       | seems like they typically do "the right thing" but this is so
       | fucking insane to me I have to permanently question the judgement
       | of those in charge.
       | 
       | Do not sell privacy and security if you're going to completely
       | violate that security and privacy.
       | 
       | In 25 years or less this bullshit will be made illegal, because
       | there is ZERO chance nefarious actors won't learn how to create
       | benign images that match the hash of a heinous photo to destroy
       | people. I can almost guarantee, right now, nation-state sponsored
       | hackers and affiliated groups are attempting to get those hashes
       | and do exactly that. It's just too fucking easy to manipulate
       | once you're in and has absolutely zero chance of being detected
       | once you're generating the hashes until too many lives are
       | ruined.
       | 
       | May hell have no mercy for the souls who made this...
        
         | MikeUt wrote:
         | > So the M1 has turned me off of Apple products because, quite
         | frankly, I don't want to spend (more) of my time fixing shit a
         | trillion dollar company broke and doesn't care to fix.
         | 
         | What is broken about the M1?
        
           | trobertson wrote:
           | All the non-Apple software that people needed to spend months
           | fixing. There's a variety of posts on HN about being "M1
           | ready".
        
           | pornel wrote:
           | M1 Macs are basically iPads, with the same iOS-style locked-
           | down boot process. They even have the same DFU mode as iOS
           | devices.
           | 
           | In my case "my" M1 bricked itself, because _Apple servers_
           | have refused to permit changes to NVRAM in the machine, and
           | it can 't boot a reinstalled OS without Apple's approval.
        
             | grishka wrote:
             | No, M1 isn't locked down at all. You can disable secure
             | boot and there are efforts to port Linux to it. You can
             | block *.apple.com and everything will still work.
        
               | rcoveson wrote:
               | Puppies aren't messy at all. You can shave all their hair
               | off and there are efforts to deal with the slobbering.
               | You can put a diaper on the back end of it and everything
               | will still work.
        
               | noduerme wrote:
               | I've used macs since the 90s and the first thing I always
               | do is delete all the Apple apps, install little snitch,
               | and set up the hosts file to block apple, adobe and
               | anything else that's trying to call home. That's not
               | misunderstanding or misusing the device, it's just
               | standard configuration as far as I'm concerned. I don't
               | have an M1 yet, and I'm a little worried about apps
               | sidestepping little snitch, but I'll cross that bridge
               | when I get to it.
        
         | systemvoltage wrote:
         | I used to downvote people a couple of years ago who were
         | shedding doubt on Apple's commitment to privacy.
         | 
         | Boy. I was so wrong. I fell for the Marketing and it made sense
         | at the time "Their business is selling hardware and services,
         | not ads. Ofcourse they are privacy advocates".
         | 
         | Pass laws and legislation. I admit I was wrong and it's
         | refreshing to see this whole thing unfold before my eyes. It
         | just solidified my opinion about open source hardware.
        
           | jodrellblank wrote:
           | Is it not weird to have people argue that Apple should be
           | above the laws of places like Saudi Arabia and China, but
           | this should be dealt with by "pass laws and legislation" and
           | then expect Apple must obey those?
           | 
           | ...?
        
             | LanceH wrote:
             | I think people imply Apple should refuse to do business if
             | it means following unjust laws contrary to their core
             | values. Instead, it is a revelation that their core values
             | aren't to protect the individual, but themselves.
        
           | dehrmann wrote:
           | I'm just surprised they did it for CP and not terrorism.
        
             | hellbannedguy wrote:
             | I think it's terrorism, and CP.
             | 
             | I have a feeling the CIA, or FBI, had a little chat with
             | Tim Cook, and Tim caved into the pressure.
             | 
             | Who knows they might have some very embarrassing info about
             | the man, and used it to get what they wanted.
             | 
             | I guarantee government, including the IRS, Immigration,
             | etc. will be accessing Apples severs.
             | 
             | At least we won't have to listen to Apple privacy
             | commercials anymore.
        
               | echelon wrote:
               | "It would be a shame if your company had to be split up
               | for antitrust reasons."
               | 
               | The sad thing is that it _does_ need to be split up, but
               | this was probably the stick used by the FBI /CIA to get
               | their way. And as long as Apple does what they want,
               | they'll let it go about its merry way.
        
           | at_a_remove wrote:
           | That's part of downvote culture -- rather than verbally
           | disagreeing, just press a button. It's easy. You don't have
           | to present a counter-argument, you just begin to bury the
           | opposing side. Everyone adds a shoveful of dirt and
           | eventually that thing is just ... gone. Wished into the
           | cornfield.
           | 
           | And it feels good, too. You've done your part, helping to
           | make that sort of thing vanish.
           | 
           | Now you've seen how it works. That's why real engagement is
           | so very important, it allows us to communicate and specify.
           | What _would_ a real commitment to privacy look like? Now we
           | know that it isn 't just a press release and a bunch of shiny
           | happy faces (carefully chosen) holding Apple products. Now we
           | can start talking about what a real commitment to privacy
           | looks like in the world.
           | 
           | I know you think laws and legislation are a good idea, but my
           | guess is that this is a big no, or we will get them but with
           | all kinds of loopholes for three-letter agencies, or no
           | penalties specified for infraction.
           | 
           | We know we won't get any kind of legal punishment, that
           | recent ebay case is a fantastic example of the golden
           | parachute you get.
           | 
           | Perhaps commitment looks like a bond held in escrow. If my
           | Apple TV is found to be exfiltrating the filenames of
           | everything in a network share, the five million in that
           | escrow account gets kicked over to the EFF. Stocks could be
           | held out in reserve.
           | 
           | Essentially, commitment looks like the Sword of Damocles,
           | held over the heads of these corporate actors, and scissors
           | are to be held by people who do not like them much.
        
         | SV_BubbleTime wrote:
         | I have absolutely no comment on Apple that you or anyone else
         | hasn't made. Except your comments about hash matching, the
         | whole point is it goes to manual confirmation, they don't just
         | detect an illegal hash and come arrest you.
         | 
         | But do you seriously think this isn't going to be the standard
         | for Android, Windows, ChromeOS, OSX, etc coming at degrees of
         | time or implementation?
         | 
         | I know all the Android people are just thinking "I can root" or
         | "I'll run Lineage" which is well and good but relatively no one
         | else will.
         | 
         | Stomping your feet and saying No More Apple For Me is not a
         | winner here. It needs to be worse than that for them. What that
         | looks like? I'm just as clueless as anyone else.
        
           | [deleted]
        
           | gigel82 wrote:
           | I've been trying to get more clarity on this point (`it goes
           | to manual confirmation`) but was unable to. Remember, they're
           | saying this is something that happens on your own device,
           | with your personal photos, not in the cloud.
           | 
           | Is this saying they can randomly choose to upload any
           | personal files (photos) on your device to their servers for a
           | person to look at, because they match a "hash"? Is this not
           | absolutely batshit crazy !?
        
             | jodrellblank wrote:
             | You know these are _photos you requested to upload to
             | iCloud_ , which would already under the existing system be
             | scanned once they get there, right? But if they get scanned
             | first and then uploaded to Apple that's "absolutely batshit
             | crazy"??
             | 
             | Edit reply in here because I'm rate-limited by drive-by
             | downvoters:
             | 
             | Which part isn't clear to you?
             | 
             | - That they already scanned iCloud photos for such
             | material? Here is an article from 18 months ago about it: h
             | ttp://web.archive.org/web/20210728085553/https://www.telegr
             | ...
             | 
             | - That this is only for iCloud photos and not everyone's
             | every photo? " _CSAM detection will help Apple provide
             | valuable information to law enforcement on collections of
             | CSAM in iCloud Photos_ " - https://www.apple.com/child-
             | safety/ paragraph 3.
             | 
             | - Why? From the rest of that link, " _providing significant
             | privacy benefits over existing techniques since Apple only
             | learns about users' photos if they have a collection of
             | known CSAM in their iCloud Photos account. Even in these
             | cases, Apple only learns about images that match known
             | CSAM._ "
        
               | gigel82 wrote:
               | No, that part is not clear to me at all. If they would be
               | "scanned anyway", why would this system exist at all?
        
               | SV_BubbleTime wrote:
               | Because not every photo is uploaded to iCloud.
               | 
               | If you use iCloud, this is already happening today,
               | probably.
               | 
               | If you don't use iCloud, yes, the idea is local scan,
               | match hash, then upload for manual review. The
               | involuntary upload will have been covered by the EULA.
               | 
               | I'm not endorsing the idea. But this seems to be the only
               | way it could work.
        
         | jodrellblank wrote:
         | > " _has absolutely zero chance of being detected once you 're
         | generating the hashes until too many lives are ruined._"
         | 
         | ... the alerts go to Apple for human review. You think their
         | human review won't notice a garbled nonsense picture triggering
         | a false positive?
        
           | AlexandrB wrote:
           | > the alerts go to Apple for human review
           | 
           | This should be read as "the alerts go to a barely-trained
           | employee at a third party contractor like Cognizant[1] who
           | has a quota of material to review every hour or they get
           | fired and don't necessarily get points for accuracy for human
           | review". I don't think Tim Cook is going to be double
           | checking these images.
           | 
           | [1] https://www.theverge.com/2019/2/25/18229714/cognizant-
           | facebo...
        
           | cdolan wrote:
           | Have you not been reading about the App Store review
           | discrepancies for the last decade??
        
       | kypro wrote:
       | This letter assumes Apple is too stupid or has overlooked the
       | risks of the tool they've built. I guarantee this isn't the case.
       | 
       | We have to assume given the pressure Apple has been under to
       | build a surveillance tool like this that any "abuse cases" such
       | as identifying outlawed LGBTQ content is in fact exactly what
       | this tool was built for.
       | 
       | Apple have already proven their willingness to bend the knee to
       | the CCP with actions such as App Store removals. I almost can't
       | blame Apple for this because in all likelihood if Apple refuses
       | to cooperate they will eventually lose market access. Couple this
       | with the fact today Apple is seeing increasing pressure from
       | western governments to surveil and censor "hate speech" and
       | "misinformation" they've probably reluctantly accepted that
       | sooner or later they will have no choice but to spy on their
       | users.
       | 
       | What I'm trying to say is that Apple isn't the problem here. My
       | guess is Apple's engineers are smart enough to know this
       | technology can and likely will be abused. They're also probably
       | not that interested in spying on their customers as a private
       | company in the business of convincing people to buy their
       | communication devices. Apple did this because governments have
       | been demanding this. And in recent years these demands have not
       | only been coming from the CCP and other authoritarian regimes,
       | but also from governments in their primary markets in the West.
       | 
       | The only way we can fight this is by demanding our politicians
       | respect our right to privacy and fight for technologies like e2e
       | encryption on all of our devices.
       | 
       | I don't want to be overly negative, but realistically this isn't
       | going to happen. Most people who use Apple's devices don't
       | understand encryption or the risks content scanning technology
       | present. And even if they did, no one is going to vote for a
       | representative because of their stance on encryption technology.
       | It seems almost inevitable that the luxury of private
       | communication for all will eventually be a thing of the past.
       | 
       | I'm just glad I know how to secure my own devices.
        
       | baggachipz wrote:
       | Beyond the obvious here, I'm just shocked Apple thought they
       | could do this without causing a shitstorm of biblical
       | proportions. To market yourself as the "privacy-centric"
       | ecosystem and then do a complete about-face requires either
       | disarray and tone-deafness at the management level... or,
       | alternatively, extremely nefarious intentions. I'm honestly not
       | sure which one it is at this point, and their reaction in the
       | coming days will probably reveal that.
        
       | arvinsim wrote:
       | More horrifying is the number of people defending or apologizing
       | for this move. Truly mind boggling!
        
         | wyager wrote:
         | Anything for the children!!
         | 
         | I think that earnest acceptance of "well, it's for a good
         | cause" style arguments indicates a severely stunted ability to
         | generalize: either to generalize applications of the technology
         | (it works on any proscribed content, not just stuff people
         | generally agree is bad) or to generalize outcomes of this kind
         | of corporate behavior (Apple will continue to actively spy on
         | their customers as long as they can come up with a tenuous
         | justification for it).
        
           | rantwasp wrote:
           | yes, think about the children!
           | 
           | nobody thinks about the children that are literally starving
           | every day or that are in social settings where they simply
           | cannot succeed.
           | 
           | Let's just treat everyone like disgusting criminals until
           | THEY prove they are innocent by giving us access to
           | everything they do.
           | 
           | The more I see, the more I want to literally dump all
           | technology and go live off the grid.
        
         | EasyTiger_ wrote:
         | This is such a massive story now there will be damage control.
        
       | ChrisMarshallNY wrote:
       | I'm waiting for orthodox authorities to do stuff like reporting
       | pictures of people (not just girls) without head coverings. or
       | dressed in bikinis, thongs, or underwear (like a significant
       | number of "influencers" do, every day).
       | 
       | There are already places in this world, where a couple can be
       | arrested for kissing in public. I suspect that the folks
       | enforcing those laws, would have some real interest in this
       | capability.
       | 
       | Not to mention nations (many in Africa, but there are also
       | Eurasian nations), where homosexuality is banned (or even
       | considered for death penalties). If your phone has a picture of
       | two men (or women) embracing, it could cause nightmares.
        
         | jll29 wrote:
         | Or manipulating photos a little so the classifier deems them
         | dangerous: https://www.youtube.com/watch?v=YXy6oX1iNoA
        
           | isaacimagine wrote:
           | Or sending manipulated but otherwise innocuous photos to
           | someone _else_ so the classifier is triggered on _their_
           | device. Yikes.
        
           | judge2020 wrote:
           | This isn't a neutral network classifier, it's an algorithm
           | that searches against existing CSAM photos.
           | 
           | https://www.apple.com/child-
           | safety/pdf/CSAM_Detection_Techni...
        
             | ChrisMarshallNY wrote:
             | That's potentially, even worse. It means that the 'relevant
             | authorities" can substitute any images they want, whenever
             | they want.
        
             | tomas_nort wrote:
             | " Messages uses on-device machine learning to analyze image
             | attachments and determine if a photo is sexually explicit."
             | 
             | https://www.apple.com/child-safety/
        
         | robertoandred wrote:
         | That's not at all how this works. It doesn't scan for images of
         | arbitrary subjects. It scans for exact matches of known CP.
         | Your vacation pics are in no danger of being flagged.
        
           | randcraw wrote:
           | In the real world, exact matching of image hash values won't
           | work.
           | 
           | It's routine for images to change in small ways over their
           | lifetimes as they're shared. According to the model you
           | suggest, modifying a single pixel in the image by even the
           | smallest amount would cause a hash mismatch against the
           | original. If Apple's system is truly that inflexible, it will
           | be trivial to circumvent in no time. Just increment /
           | decrement a random RGB pixel in each of your images, and
           | voila, your porn is scot free.
           | 
           | Of course this countermeasure will be employed almost
           | instantly by miscreants, so how will the FBI respond? Will
           | they give up? Certainly not. They already have a blank check
           | to spy on our phones. So they will devise a clumsier match
           | algorithm that scans more sources of data on your phone and
           | your cloud accounts and your backups, and produces more false
           | positives. Why wouldn't they do this?
           | 
           | Once any telecom service provider opens a door which
           | compromises security or privacy, they will have a much harder
           | time closing it.
        
           | ChrisMarshallNY wrote:
           | I don't think it's an exact hash scan.
           | 
           | If so, it would be trivial to defeat. People would simply
           | need to do a 1-pixel crop, or recompress.
        
             | robertoandred wrote:
             | You're correct that it's not a pixel-by-pixel hash, but
             | it's still a hash of that specific image. It's not
             | analyzing the image subject and trying to identify it as
             | CSAM.
        
           | jjcon wrote:
           | Scans for exact matches of hashes of photos provided by the
           | government. Apple is not controlling the hash database.
        
           | heavyset_go wrote:
           | No, it uses perceptual hashing which is inexact, and a fuzzy
           | metric like hamming distance between hashes to determine
           | whether or not two images come from the same source image.
           | 
           | Not only is it entirely possible for two images to have the
           | same perceptual hash, it's even more likely that two
           | unrelated images have similar hashes, which would indicate to
           | the system that one image is likely an edited version of a
           | source image in their database.
        
             | robertoandred wrote:
             | It's possible, but very unlikely. Then of course you need
             | many matches to flag the account. And then of course
             | there's the manual review. The likelihood that an innocent
             | person would get caught up in this at all is zero.
        
               | heavyset_go wrote:
               | > _It's possible, but very unlikely._
               | 
               | I have built products in this space. It is entirely
               | likely, and in fact, it is incredibly common. You can
               | look at literally any reverse image search engine's
               | results and see this, because they use the same
               | perceptual hashing techniques to do reverse image
               | lookups.
        
               | robertoandred wrote:
               | And you don't think the threshold for a match will be a
               | lot tighter for this use case compared to an image search
               | engine? And you're ignoring all the other guards I
               | mentioned? Come on. You may not like Apple, but they're
               | not stupid.
        
       | hnthrowaway9721 wrote:
       | It won't matter. Most of the public will applaud the move, many
       | will call for it to be rolled out across every platform,
       | including desktops. As this gathers momentum, it will eventually
       | become illegal _not_ to have pervasive scanning apps running on
       | your device. Naturally, the technology will almost immediately be
       | expanded to other kinds of content, and will be used as a wide
       | open backdoor into everything in everyone's life.
       | 
       | While this is not inevitable, I see a strong, perhaps
       | overwhelming public response as being the only thing that will
       | prevent it, and I do not see that response happening.
        
       | nikkinana wrote:
       | Made for China, imported everywhere. Slippery slope you sheep.
        
       | thedream wrote:
       | So that "walled garden" turned out to be a cage after all.
        
       | fny wrote:
       | Why does everyone keep saying Apple _introducing_ a backdoor?
       | 
       | They've always been able to launch surveillance software on a
       | whim apparently. This is proof.
        
       | jbschirtz wrote:
       | Where is the line for scanning for content versus planting the
       | content and then finding it with a "scan"?
        
         | rantwasp wrote:
         | it's all ones and zeros. if you have nothing to hide you should
         | not worry about it /s
        
       | d--b wrote:
       | You guys can get outraged all you want. It's pretty clear that
       | Apple debated this internally and decided that the backlash was
       | worth whatever they got from this deal...
       | 
       | By today, it's pretty clear than privacy is not the top priority
       | of most people. See how much data people are giving fb/google
       | every second...
        
         | dijit wrote:
         | This defeatist attitude is not helpful.
         | 
         | The louder people are about this, the more it hurts and the
         | more likely the policy is reversed.
         | 
         | if the policy is not reversed, then at least the community has
         | been loud enough that people took notice and understood that
         | this is happening, giving them a chance to vote with their
         | wallet; for most people what happens on a computer or a phone
         | is a complete mystery.
        
           | bambax wrote:
           | > _This defeatist attitude is not helpful._
           | 
           | Maybe, but it's the truth. It's impossible Apple didn't think
           | this through. They did, and they made a decision, and Google
           | will do the exact same thing in 6 months if not earlier.
        
           | d--b wrote:
           | What I am saying is that people have been told pretty clearly
           | that FB would sell all your data to anyone, use it to target
           | people with misinformation that clearly influence election
           | outcomes, and lie about it everywhere. The dent that these
           | revelations had on FB's userbase is peanuts.
           | 
           | Same thing with whatsapp.
           | 
           | Apple's saying that they'll use a robot to check that their
           | users arent paedophiles is clearly not going to change
           | anything. Use your energy elsewhere...
        
           | [deleted]
        
         | norov wrote:
         | A protest outside of Apple's main offices or Tim Cook's house
         | would draw a good amount of attention to this issue.
        
       | game_the0ry wrote:
       | Disappointing.
       | 
       | While moderating CP distribution and storage is obviously the
       | right thing to do, I do not think this approach will be a
       | worthwhile endeavor for apple and governments.
       | 
       | Let me explain: imagine you are bad guy with bad stuff on your
       | iphone, and you hear apple will be scanning your phone. What is
       | the next logical thing you would do? Migrate your stuff to
       | something else, obviously. Encrypting a usb stick does not
       | require a high degree of technical skill [1]; neither does
       | running tor browser.
       | 
       | So I am thinking 3 things:
       | 
       | 1. This is not about CP or doing the right thing, but rather
       | apple bowing to government/s pressure to create a half-ass
       | backdoor to monitor and squash dissidents.
       | 
       | 2. Apple and government/s are incompetent and do not realize that
       | criminals are always one step ahead.
       | 
       | 3. Most likely, some combination of the above - government/s have
       | demonstrated they are willing to go hard on dissidents on the
       | personal electronics front [2], not realizing that they will only
       | mitigate, not exterminate, the dissent they fear so much.
       | 
       | For the average joe, I would say this - your privacy is
       | effectively gone when you use an electronic device that comes
       | with closed-source code pre-installed.
       | 
       | For the benevolent dissidents - there are many tools at your
       | disposal [3, 4].
       | 
       | Likely next iteration - governments / courts / cops compel
       | suspects to hand over passwords and encryption keys over to
       | prosecutors. It looks like it already started [5, 6], so act
       | accordingly.
       | 
       | [1] https://www.veracrypt.fr/code/VeraCrypt/
       | 
       | [2]
       | https://www.washingtonpost.com/investigations/interactive/20...
       | 
       | [3]
       | https://files.gendo.ch/Books/InfoSec_for_Journalists_V1.1.pd...
       | 
       | [4] https://www.privacytools.io/
       | 
       | [5] https://www.bbc.com/news/uk-england-11479831
       | 
       | [6] https://en.wikipedia.org/wiki/Key_disclosure_law
        
         | dannyw wrote:
         | It is not the role of infrastructure providers to reach into
         | private, on device content.
         | 
         | That is like the USPS opening every letter and scanning what is
         | inside. Even if it's done with machine learning, that is
         | absolutely not okay, no matter what it is for.
        
           | game_the0ry wrote:
           | Agreed, but not like we have a choice with that matter.
        
             | voakbasda wrote:
             | You have a choice about whether to keep or discard your
             | iPhone.
        
           | rcruzeiro wrote:
           | Though we were okay with Gmail doing the same because "hey,
           | it's free!".
           | 
           | But I understand your concern here. All our lives are now
           | digitalised and maybe stored forever somewhere and anything
           | you do, even if completely benign, could be considered a
           | crime in some country or even in your own country in a few
           | years from now.
        
             | gerash wrote:
             | Are you claiming GMail looks through the emails to match
             | them with crime keywords and reports you to the
             | authorities? citation needed
        
               | rcruzeiro wrote:
               | I'm claiming they use the content of your messages to
               | profile you and better target their ads until they
               | stopped this practice in 2017 in favour of reading your
               | email "for better product personalisation".
        
               | gerash wrote:
               | gmail (and any other web based email) needs to "read" the
               | email content to provide a search function. If the same
               | program also uses the content to pick some ads from the
               | inventory then I'm personally ok with that.
               | 
               | That is different from searching them for illegal
               | material based on what illegal implies at a given time
               | and location.
        
             | marcosdumay wrote:
             | Your gmail messages live on Google's servers. It's not ok
             | for them to scan everything, but it's completely different
             | from Apple scanning your phone's content.
        
               | pgoggijr wrote:
               | The hashes are computed from images that live in iCloud -
               | how is this different?
        
               | rcruzeiro wrote:
               | Suppose Apple flags some image in iCloud using a hash
               | that was probably sent from the device. Do they currently
               | have the capability to decrypt the image to check if the
               | content is actually illegal before reporting it to the
               | authorities without needing access to the physical
               | device?
               | 
               | (Not trying to make a counter-argument to your comment. I
               | genuinely don't get this part).
        
               | pgoggijr wrote:
               | Hmm not sure what you mean. The hashes are shipped with
               | iOS, and then (if parental controls are enabled) compared
               | with the hashes computed against the unencrypted data on-
               | device, and displays a warning to the user and their
               | parents.
               | 
               | I'm not sure how the image is displayed on the parents'
               | device - it could be sent from the child's device, or the
               | image used to compute the hash that the child's image
               | matched could be displayed.
        
             | rantwasp wrote:
             | nope. i'm not okay with Gmail doing that. in fact I've
             | completed removed all google products from my life. i don't
             | need someone to constantly spy on me.
        
           | Spivak wrote:
           | But the USPS _does_ scan every letter (and package for that
           | matter) for things like weapons and anthrax.
           | 
           | Don't think the US is taking a hard-line stance here. If it
           | were possible to non-destructively scan mail for CP they
           | absolutely would.
        
             | techdragon wrote:
             | The scanning for those things doesn't allow them to scan
             | the written or printed content of letters sent through the
             | post.
             | 
             | If i recall correctly isn't there a legal distinction
             | involved in mail privacy, I recall an explanation involving
             | the difference between postcards and letters with respect
             | to lawful interception...
        
               | judge2020 wrote:
               | Shine a bright light through the envelope, you Should be
               | able to make out some letters. Maybe make a neural
               | network for reading these letters and putting them
               | together into sentences.
        
               | buildbot wrote:
               | UPS and Fedex can do this as private companies, my
               | understanding is the USPS actually can't due to the 4th
               | Amendment:
               | 
               | "The Fourth Amendment protects the privacy of U.S. mail
               | as if it had been kept by the sender in his home. Ex
               | parte Jackson, 96 U.S. 727, 732-733, 735 (1878); U.S. v.
               | Van Leeuwen, 397 U.S. 249, 252-52 (1970). It cannot be
               | searched without a search warrant based on probable
               | cause."
        
       | sadness3 wrote:
       | Should they persist in this, if I don't replace my current iPhone
       | with an alternative, I will definitely not buy an iPhone for my
       | next phone. It's a gross overreach and I'd be too worried about
       | false positives.
        
         | rantwasp wrote:
         | the next phone? I will stop buying Apple products period.
        
       | devwastaken wrote:
       | There has to be some incentive somewhere for Apple to do this.
       | They know it's wrong, they know it will be abused. Tim Cook
       | himself, if he wasn't rich and powerful, would be executed in a
       | number of countries that Apple operates in for his
       | sexual/romantic identity.
       | 
       | Apple also removes LGBT based applications in countries where
       | they're illigal, to continue doing business. This demonstrates
       | that Apple complies with the demands of foreign governments, that
       | they value money over anything else.
       | 
       | So Apple, a company that complies with governments committing
       | human rights violations (Including the U.S), forces everyone to
       | have an image scanner that looks through their private images and
       | documents to find content Apple has deemed objectionable, with
       | the sources of that content supposedly being from these
       | governments.
       | 
       | FBI: Hey apple, here's some new hashes for images that are bad,
       | let us know who has them. You just have to trust us, no way that
       | we would ever put political imagery critical of the government in
       | that database. But you can't prove it even if we did.
       | 
       | Remember, this is the FBI that flaunts federal court orders,
       | breaks the law, and no one is ever held accountable.
       | https://youtu.be/oy3623YRsMk
       | 
       | My suspicion is that the FBI finally had enough of Apple not
       | complying with their encryption standards and have done some work
       | behind the scenes to make various individuals at Apple's lives
       | difficult. So they're implementing this to appease the feds.
       | 
       | There was a thread on HN here not too long ago with the FBI
       | stalking and threatening pentesters that wouldn't join them. No
       | doubt they're doing the same to big companies that are making
       | their "jobs" harder.
        
         | bbatsell wrote:
         | > There has to be some incentive somewhere for Apple to do
         | this. They know it's wrong, they know it will be abused.
         | 
         | The most positive spin I can put on it is that it has become
         | clear behind the scenes that NCMEC and partners have put enough
         | pressure on Congress that Apple believes that on-device
         | scanning for CSAM content will be soon be required by federal
         | law, and this is their attempt to define the parameters in as
         | privacy-preserving a way they can before the actual legislative
         | language is drafted and can't be changed.
         | 
         | Even if all of that is true, I don't think this was the best
         | way to do it and it is a huge own-goal to concede ground before
         | there's even been a public debate about it.
        
         | SpicyLemonZest wrote:
         | I'm not sure such a big conspiracy is needed. After all, the
         | reason "nothing to hide" memes are so common is that a lot of
         | people believe them enthusiastically. It seems entirely
         | plausible to me that a core team of passionate crusaders could
         | have driven this project to completion by just making it too
         | awkward for anyone to object. For a sample of outside
         | perspectives (https://www.npr.org/2021/08/06/1025402725/apple-
         | iphone-for-c...):
         | 
         | > Meanwhile, the computer scientist who more than a decade ago
         | invented PhotoDNA, the technology used by law enforcement to
         | identify child pornography online, acknowledged the potential
         | for abuse of Apple's system but said it was far outweighed by
         | the imperative of battling child sexual abuse.
         | 
         | > "Apple's expanded protection for children is a game changer,"
         | John Clark, the president and CEO of the National Center for
         | Missing and Exploited Children, said in a statement. "With so
         | many people using Apple products, these new safety measures
         | have lifesaving potential for children."
        
           | devwastaken wrote:
           | This is possible, but the decision is coming from higher up.
           | I can certainly see apple engineers thinking this would
           | really work, by focusing entirely upon the singular problem
           | and getting a chance to use fancy tech. But at the end of the
           | day Timmy is the head cook, and decisions like this go
           | against apples "mission", so, there's incentive higher than
           | just the engineers.
        
         | zepto wrote:
         | > that they value money over anything else.
         | 
         | Maybe or maybe they think that good computing devices are
         | valuable to everyone now and that repressive laws can be
         | changed eventually as they have in many countries, and denying
         | good computers to people who also live with repression makes
         | their lives worse, not better.
        
         | fragileone wrote:
         | > There has to be some incentive somewhere for Apple to do
         | this. They know it's wrong, they know it will be abused.
         | 
         | For several years now politicians have been asking for
         | "exceptional access for law enforcement" to backdoor
         | encryption. In Europe there's been a number of laws passed
         | recently which violate privacy also.
         | 
         | This is just Apple getting ahead of future legislation so that
         | they can be the ones who get the power and money from every
         | government agency turning to them first whenever they want to
         | monitor and punish their citizens.
        
       | rahkiin wrote:
       | I was thinking, what if they want to move their icloud photos to
       | e2e without losing the csam scanning they already have. So they
       | move it on-device, just for photos that would be uploaded to
       | iCloud (which is what Apple claims)
       | 
       | I really just hope they revert this plan.
        
       | __afk__ wrote:
       | I applaud Apple for this. It reaffirms the reasons why it's one
       | of the few companies I feel strongly about. If you can't
       | understand the logic of a decision like this, maybe you don't
       | realize the mountain of suffering that is caused from people
       | trading this material. I recommend listening to Sam Harris's
       | podcast with Gabriel Dance (#213 - THE WORST EPIDEMIC) to get a
       | better picture of the problem.
       | 
       | The main beef people seem to have here is the slippery slope
       | argument. Binary choices are nice, I agree - but almost always
       | they obscure a complex surface that deserves nuance.
        
         | bambax wrote:
         | This system will only search for _known images_. If you make
         | new images of child pornography, you 're fine (or at least, you
         | won't be flagged by this system). So this initiative does
         | nothing to prevent child abuse.
        
           | __afk__ wrote:
           | I understand that. I recommend listening to the podcast if
           | you want to understand the issue in more detail. It's a heavy
           | subject but people here generally want to see both sides of
           | an issue, and I am pretty confident that most people have not
           | really taken in the untold damage this brings to the abuse
           | victims and their families as they are continuously notified
           | about past images of their abuse being found and circulated.
           | Which is how this actually works.
           | 
           | I would be worried about a model predicting child sexual
           | abuse content from unknown images but I am not in the least
           | concerned with one that fingerprints known images.
        
       | vladharbuz wrote:
       | > To help address this, new technology in iOS and iPadOS* will
       | allow Apple to detect known CSAM images stored in iCloud Photos.
       | 
       | After reading OP, my understanding had been that this update will
       | cause _all_ photos on Apple devices to be scanned. However, the
       | above quote from Apple's statement seems to indicate that only
       | photos uploaded to iCloud Photos are scanned (even though they
       | are technically scanned offline).
       | 
       | This doesn't invalidate any of the points people are making, but
       | it does seem the update does not directly affect those of us who
       | never stored our photos "in the cloud".
        
         | dylan604 wrote:
         | How long until those of use not storing data in the cloud are
         | the guilty ones? At least, the very suspect ones?
        
         | conception wrote:
         | And this has always been Apple's stance because of the
         | government. We lock down the phone and don't let the government
         | in but if you use cloud service that doesn't have the same
         | rights because it's not stored on your property.
         | 
         | https://morningstaronline.co.uk/article/w/apple-drops-plans-...
         | 
         | This sounds like them trying to find a way to encrypt your data
         | and remove the fbi from having a "what about the children
         | excuse". They scan the image on upload and then store it on
         | iCloud and encrypt it.
         | 
         | It's of course a slippery slope but the FBI has been trying to
         | have back doors everywhere since forever.
        
           | gordon_gee123 wrote:
           | In a somewhat judo move, it could end up protecting user
           | security going forward if the FBI has no leverage with
           | pedophile content in iCloud, there's no argument for a
           | backdoor. I won't play total corporate shill here but it
           | seems people are jumping to this being the end of times vs a)
           | a way to catch severe abusers of child pornography and b)
           | removing a trump card for future security org strawman
           | arguments
        
       | aoetalks wrote:
       | I'm confused. If iCloud backups are not encrypted [1], and this
       | only scans iCloud photos, why can't they just do this server
       | side? I'm not saying server-side is OK, but its at least not
       | device side (which is obviously a slippery slope).
       | 
       | [1] https://www.reuters.com/article/us-apple-fbi-icloud-
       | exclusiv...
        
         | rantwasp wrote:
         | there has been some speculation that they will turn on
         | encryption after pushing this out. if this turns out to be true
         | apple sucked at controlling the PR for this whole thing.
        
       | john579 wrote:
       | Apple is just being innovative. Scan phones for porn, then sell
       | it for profit and use it for blackmail. Epstein on steroids.
        
       | robertoandred wrote:
       | This letter fundamentally misunderstands how this technology
       | works. Or intentionally misunderstands. It's sad that it's being
       | spread by people claiming to be tech-minded.
        
         | juniperplant wrote:
         | How so?
        
           | robertoandred wrote:
           | The letter implies a false positive can ruin your life, which
           | ignores the facts that a false positive is incredibly
           | unlikely, that you'd need many such false positives to flag
           | an account, and the following manual review would immediately
           | catch the error.
           | 
           | Then it goes on to scare you about WHAT IF OTHER PEOPLE USE
           | THE DATABASE? What if they EXPAND the database?? This
           | database isn't Apple's it belongs to the National Center for
           | Missing and Exploited Children and companies like Google and
           | Facebook have been using it for a decade.
           | 
           | And then it talks about how this will get queer kids thrown
           | out of their homes, and I don't even know where to start with
           | this. How they got from warning an 8-year-old to not open a
           | dick pic to this, I have no idea.
        
       | mfer wrote:
       | This reminds me that a two OS market isn't a healthy one for
       | consumers. We could use more diversity.
        
         | swiley wrote:
         | postmarketos.org
        
           | mfer wrote:
           | A quick look at that website and you see it's not for normal
           | or typical consumers. It's for a technical crowd who is in to
           | technical things and tinkering.
           | 
           | It's like telling an average person to use GNU/Linux for
           | their desktop OS and then watching them struggle to get
           | printing to work well (I have been through this).
        
         | Spivak wrote:
         | Okay, but in a n OS market the companies behind them would
         | still be pressured to implement CSAM detection. There's still a
         | monopoly on government.
        
       | sul_tasto wrote:
       | how is this not a violation of the 4th amendment if Apple is
       | performing this search on behalf of a government entity?
        
         | [deleted]
        
         | voakbasda wrote:
         | Because you are submitting to it via their EULA. Probably time
         | you read that contract again, eh? You have no rights once you
         | give them up.
        
         | john_yaya wrote:
         | There is quite a bit of case law precedent over the past fifty
         | years, notably US vs. Miller, that upholds the "Third Party
         | Doctrine": if you give your records to a third party, then the
         | government doesn't need a warrant to get those records.
         | Personally I think that ruling is awful and needs to be
         | overturned, but that's not likely to happen anytime soon.
        
       | jasoneckert wrote:
       | I still remember Scott McNealy's 1999 quote from when he was at
       | Sun Microsystems: "You have zero privacy anyway. Get over it."
        
         | [deleted]
        
         | echelon wrote:
         | Don't defend Apple. We have as much privacy as we're willing to
         | stand up and defend.
         | 
         | Even if Apple continues down this path, perhaps the backlash
         | will make techies abandon them. That would have a notable
         | effect on their ecosystem.
        
       | sizt wrote:
       | Just like other malware. Identify it. Isolate it. Remove it.
        
       | _davebennett wrote:
       | Wait, why is everyone so distraught about this? Am I missing
       | something? This only affects people who upload their photos to
       | iCloud?
       | 
       | Just don't store any data on iCloud. Seems simple enough. Yeah
       | sure, we can make conspiracy theories and all, but based off of
       | what was officially announced I'm not understanding all the
       | hysteria.
        
         | hnthrowaway9721 wrote:
         | Why is everyone so distraught about the man standing in
         | mainstreet shooting people with a rifle? Just don't go down
         | mainstreet! Seems simple enough.
        
       | jimbob45 wrote:
       | I'm sorry, I just don't see how this is any different from Google
       | auto-searching Google Drive files and flagging/deleting them.
       | Apple is _only_ doing this to  "Apple Cloud" files and
       | flagging/deleting them if they see them.
       | 
       | The best counterargument I've seen to that is that Apple is lying
       | and it's _not_ limited to Apple Cloud, unlike Google. However, no
       | one yet has been able to substantiate that claim.
        
       | ho_schi wrote:
       | I've someone has gone so far like Apple already did, I doubt that
       | a open letter will change their thinking. Maybe they step a
       | little back and bring the next similar thing which looks less
       | invasive some time later?
       | 
       | Thinks like this must to be stopped! Customers shall not buy
       | anything from companies which are hostile. And laws against these
       | usage need passed. As far as it looks, the laws in Europe are in
       | place and prevent this currently. But just currently, we've seen
       | how companies like Apple push the boundaries. We as humans behave
       | totally irrational, we complain about inhumane working conditions
       | at Amazon and then we order the next item. We complain about a
       | golden prison from Apple and buy the next iPhone. We should
       | change?
        
       | JGM_io wrote:
       | So will apple also pick up the bill for the electricity and
       | depreciation my device will consume?
        
       | danso wrote:
       | Both of the announced features sound fine on their own -- Apple
       | has always been able to scan iCloud and send your data to police
       | [0], and PhotoDNA is a decade old; whereas on-device machine
       | learning on kid phones does not (according to Apple) involve
       | sending anything to Apple reviewers and the govt.
       | 
       | But announcing both features in the same post makes it inevitable
       | that they get conflated, e.g. "Apple's machine learning will send
       | suspected child porn-like images from your iMessages to iCloud,
       | where it will be reviewed and possibly sent to the police". Apple
       | obviously knows this -- so them being ok with that raises some
       | serious eyebrows, as if they're quietly setting the groundwork to
       | make that a reality.
       | 
       | [0] https://www.businessinsider.com/apple-fbi-icloud-
       | investigati...
        
         | koolhaas wrote:
         | > Apple has always been able to scan iCloud and send your data
         | to police
         | 
         | I don't see how the article you linked to explains how Apple
         | can "scan iCloud" for the police. What do you mean? It seems
         | like they just hand data over for a specific warrant related to
         | individual users.
        
           | danso wrote:
           | Sorry, are you insinuating that Apple has the ability to
           | retrieve, read, and provide the unencrypted data from users'
           | iCloud accounts, but not the ability to search across that
           | data?
        
       | sathackr wrote:
       | US Government: We suspect the person in this photo of committing
       | a crime. Here is your subpoena, Apple. You are directed to scan
       | all iPhone and iCloud storage for any pictures matching this
       | NeuralHash and report to us where you find them.
       | 
       | Chinese Government: Here is the NeuralHash for Tienanmen square.
       | Delete all photos you find matching this or we will bar you from
       | China.
       | 
       | Apple has at this point already admitted this is within is
       | capability. So regardless of what they do now, the battle is
       | already lost. Glad I don't use iThings.
        
         | indymike wrote:
         | Its worse:
         | 
         | Apple: hey govt, here's some probable cause.
         | 
         | Govt: warrant, search, arrest
         | 
         | User: prosecuted
        
         | ggggtez wrote:
         | Technically true, though I don't know why you think that
         | capability is limited to Apple products...
         | 
         | https://transparencyreport.google.com/youtube-policy/feature...
        
         | khazhoux wrote:
         | > You are directed to scan all iPhone and iCloud storage for
         | any pictures matching this NeuralHash and report to us where
         | you find them.
         | 
         | I think the US Govt (and foreign) would actually send Apple
         | tens of thousands of NeuralHashes a week. Why would they limit
         | themselves? False positives are "free" to them.
        
           | dane-pgp wrote:
           | > False positives are "free" to them.
           | 
           | Correct, just look at the incentives when it comes to
           | handling sniffer dogs.
        
         | snowwrestler wrote:
         | I get this sentiment but the known-bad-image-scanning
         | technology is not new. That's not what Apple announced. Many
         | tech services already do that, which already enables the
         | slippery slope you're illustrating here.
         | 
         | I'm not trying to minimize the danger of that slope. But as
         | someone who is interested in the particulars of what Apple
         | announced specifically, it is getting tiresome to wade through
         | all the comments from people just discovering that PhotoDNA
         | exists in general.
        
           | vineyardmike wrote:
           | But now its on your device.
           | 
           | "Its just when you upload to icloud or recieve an iMessage"
           | they say. BUT the software's on your device forever. What
           | about next year? What about after an authoritarian goverment
           | approaches them? By 2023 it might be searching non-uploaded
           | photos.
           | 
           | You can always not use cloud tech like PhotoDNA but you can't
           | not use software BUILT IN to your device. Especially when its
           | not transparent.
        
             | gordon_gee123 wrote:
             | Nothing stopping countries from demanding every tech
             | organization does this anyway, it didnt just become a
             | possibility now Apples running code on-device. Also this
             | code can and probably will be able to be
             | activated/deactivated/removed remotely (for better or
             | worse!)
        
             | mirkules wrote:
             | It's also not inconceivable to access the camera live and
             | perform recognition live on feeds. It's not even that
             | expensive to do that anymore.
        
               | osmarks wrote:
               | It would probably kill the battery to do that constantly,
               | as the processor would not be able to sleep ever.
               | Although relying on energy efficiency to not improve is
               | not a good long term situation.
        
               | vineyardmike wrote:
               | Facial recognition is already part of all camera software
               | today.
        
         | brasscodemonkey wrote:
         | Hasn't Google been parsing Google Photos images, email content,
         | and pretty much everything else since forever? Do you just stay
         | off of smartphones and cloud platforms entirely?
        
           | fh973 wrote:
           | Microsoft and Google have been doing that in their online
           | services for ages, but not on your personal devices.
        
             | brasscodemonkey wrote:
             | So if I don't backup with Google Photos or Google Drive,
             | would they be safe for now?
        
               | vineyardmike wrote:
               | yes theoretically.
        
           | [deleted]
        
           | sathackr wrote:
           | They scan things I send them.
           | 
           | They don't (publicly announce that they) scan things on my
           | device.
        
             | dwaite wrote:
             | The Apple feature discussed here is for photos being
             | synched to iCloud Photos. It does not scan arbitrary local
             | content.
        
               | giantrobot wrote:
               | > It does not scan arbitrary local content.
               | 
               | Yet.
               | 
               | Before it was "only content uploaded to iCloud is
               | scanned" and now it's "photos are scanned on-device".
               | That's frog boiling that tomorrow easily becomes
               | "arbitrary files are scanned anywhere on the device".
        
               | hippari wrote:
               | They already can decrypt iCloud photos, why else perform
               | an on-device scan ? If not with the intention to scan all
               | local contents ?
        
               | sathackr wrote:
               | And the matching photo is uploaded upon match. So
               | regardless the photo is uploaded. What's the point again
               | of taking this further step?
        
               | vineyardmike wrote:
               | It also scans every photo that an iMessage user
               | sends/receives.
        
         | fny wrote:
         | The Chinese government already has direct access to everyone's
         | data and messages. There is no encryption. This is mandated.
         | 
         | [0]: https://www.nytimes.com/2021/05/17/technology/apple-china-
         | ce...
        
           | FabHK wrote:
           | > direct access to everyone's data and messages.
           | 
           | From what I understand, only to _Chinese customers_ 's data
           | and messages (bad enough, sure, but not as bad as you say).
        
             | brasscodemonkey wrote:
             | Depends on the company. If it's a Chinese company like
             | TikTok, data is stored in China and therefore property of
             | the state. Things are about to get worse and they crack
             | down on private companies in China.
             | 
             | And it's not just Americans who worry about this. When word
             | got out that Line was storing user data in China servers,
             | it blew up in the news in Japan and Taiwan and went into
             | immediate investigation. Line ended up moving Japanese user
             | data to South Korea. Chinese aggression and Xi's obsession
             | with power is not just something Americans spout off in
             | Reddit and YouTube comments. They're a legit threat to
             | Taiwan, Japan, Australia and India.
             | 
             | https://asia.nikkei.com/Business/Technology/Line-cuts-off-
             | ac...
        
               | FabHK wrote:
               | > Depends on the company
               | 
               | I was referring specifically to Apple & iCloud (and I
               | thought GP was as well).
        
               | jazzyjackson wrote:
               | TikTok is run by a US-based subsidiary of ByteDance
               | 
               | They have insisted "that TikTok U.S. user data is stored
               | in Virginia, with a back-up in Singapore and strict
               | controls on employee access."
               | 
               | https://techcrunch.com/2020/08/17/tiktok-launches-a-new-
               | info...
        
               | giantrobot wrote:
               | China's state security laws trump all "strict controls"
               | for employees.
        
               | jazzyjackson wrote:
               | Which employees? The ones who aren't Chinese citizens and
               | aren't located in China? What does China state security
               | have to do with them?
        
               | giantrobot wrote:
               | ByteDance's Douyin product has Chinese employees that are
               | based in China. TikTok employees are also ByteDance
               | employees which means a ByteDance employee that passes
               | through their "strict controls" can access whatever
               | TikTok data they want. Even if that's a dozen Chinese
               | nationals that can get access that's a dozen people
               | required by Chinese law to help the state security
               | aparatus.
               | 
               | I don't see any reason to give them the benefit of the
               | doubt considering they already moderate content the
               | Chinese government doesn't like [0] as a matter of
               | company policy.
               | 
               | [0] https://www.theguardian.com/technology/2019/sep/25/re
               | vealed-...
        
           | belter wrote:
           | "Apple CEO Tim Cook Slams Indiana's 'Religious Freedom' Law"
           | 
           | https://www.rollingstone.com/politics/politics-news/apple-
           | ce...
           | 
           | The hypocrisy levels are vomit inducing...
        
             | zepto wrote:
             | What hipocrisy are you seeing?
        
               | belter wrote:
               | You must be trolling...Provoking...Joking...Or you are
               | Tim Cook :-) Are the personal ethics and values
               | disconnected from daily business, to be changed based on
               | the place of business or Apple must do it because they
               | are just complying with local laws in China ?
               | 
               | Because IBM was also in 1939...Just the local laws.
               | 
               | https://en.wikipedia.org/wiki/IBM_and_the_Holocaust
        
               | zepto wrote:
               | So this is just about China?
               | 
               | When Apple started working in China, the prevailing
               | western belief was that economic liberalization would
               | cause political liberalization in China, so even though
               | China was still relatively repressive, _doing business
               | there would help move things in a better direction._
               | 
               | At the time, this was a very reasonable and widespread
               | belief, which turned out to be wrong.
               | 
               | Betting on other people and turning out to be wrong
               | doesn't make you a hypocrite. It makes you naive.
        
               | belter wrote:
               | Naive is not something I would associate with Apple, but
               | I guess they have seen it now. I guess they must be ready
               | to pull off any time...Or is their CEO too busy,
               | lecturing others on the righteous paths in other
               | jurisdictions?
               | 
               | "Apple's Good Intentions Often Stop at China's Borders"
               | 
               | https://www.wired.com/story/apple-china-censorship-apps-
               | flag...
        
               | zepto wrote:
               | > Naive is not something I would associate with Apple,
               | but I guess they have seen it now.
               | 
               | It wasn't just Apple who was naive, it was the entirety
               | of US and European foreign policy too. Do you want to
               | claim that the west didn't expect political
               | liberalization in China.
               | 
               | That article doesn't change anything. Apple didn't go
               | into China thinking they were helping to strengthen an
               | authoritarian regime. They went in thinking they were
               | helping to liberalize it.
               | 
               | We all, Apple included, got played.
        
               | [deleted]
        
         | zepto wrote:
         | As far as I can see:
         | 
         | 1. This _is_ a serious attempt to build a privacy preserving
         | solution to child pornography.
         | 
         | 2. The complaints are all slippery slope arguments that
         | governments will force Apple to abuse the mechanism. _These are
         | clearly real concerns and even Tim Cook admits that if you
         | build a back door, bad people will use it._
         | 
         | However:
         | 
         | Child pornography and the related abuse _is_ widely thought of
         | as a massive problem, that _is_ facilitated by encrypted
         | communication and digital photography. People _do_ care about
         | this issue.
         | 
         | 'Think of the children' is a great pretext for increasing
         | surveillance, because _it isn't an irrational fear._
         | 
         | So: where are the proposals for a better solution?
         | 
         | I see here people who themselves are afraid of the real
         | consequences of government/corporate surveillance, and whose
         | fear prevents them from empathizing with the people who are
         | afraid of the equally real consequences of organized child
         | sexual exploitation.
         | 
         | 'My fear is more important than your fear', is the root of
         | ordinary political polarization.
         | 
         | What would be a hacker alternative would be to come up with a
         | technical solution that solves for both fears at the same time.
         | 
         | This is what Apple has attempted, but the wisdom here is that
         | they have failed.
         | 
         | Can anyone propose anything better, or are we stuck with just
         | politics as usual?
         | 
         | Edit: added 'widely thought of as' to make it clear that I am
         | referring to a widely held position, not that I am arguing for
         | it.
        
           | webmobdev wrote:
           | The intention to prevent child sexual abuse material is
           | indeed laudable. But the "solution" is not. Especially when
           | we all know that this "solution" gives a legal backing to
           | corporates to do even more data mining on its users, and can
           | easily be extended (to scan even more "illegal" content) and
           | abused into a pervasive surveillance network desired by a lot
           | of government around the world.
           | 
           | For those wondering what's wrong with this, two hard-earned
           | rights in a democracy go for a toss here:
           | 1. The law presumes we are all innocent until proven guilty.
           | 2. We have the right against self-incrimination.
           | 
           | Pervasive surveillance like this starts with the presumption
           | that we are all guilty of something ( _" if you are innocent,
           | why are you scared of such surveillance?_" or _" what do you
           | have to hide if you are innocent?"_). The right against self-
           | incrimination is linked to the first doctrine because
           | compelling an accused to testify transfers the burden of
           | proving innocence to the accused, instead of requiring the
           | government to prove his guilt.
        
             | zepto wrote:
             | > But the "solution" is not.
             | 
             | Agreed, that's my point, and I would say that your comment
             | outlines some good principles a better solution might
             | respect, although it's worth noting that those principles
             | have only ever constrained the legal system.
             | 
             | Your employer, teacher, customers, vendors, parents etc,
             | have never been constrained in these ways.
        
           | MikeUt wrote:
           | > So: where are the proposals for a better solution?
           | 
           | Better policing and child protective services to catch child
           | abuse at the root, instead of panicking about the digital
           | files it produces? If you'd been paying attention, you'd have
           | noticed that real-world surveillance has _massively_
           | increased, which _should_ enable the police to catch
           | predators more easily. Why count only privacy-betraying
           | technology as a  "solution", while ignoring the rise of
           | police and surveillance capabilities?
           | 
           | Edit as reply because two downvotes means I am "posting too
           | fast, please slow down" (thank you for respecting me enough
           | to tell me when I can resume posting /s):
           | 
           | > How do you police people reaching out to children via
           | messaging with sexual content?
           | 
           | First, this is one small element of child abuse - you want to
           | prevent child rape, merely being exposed to sexual content is
           | nowhere near severe enough to merit such serious privacy
           | invasion. To prevent the actual abuse, one could use the
           | near-omnipresent facial recognition cameras, license plate
           | readers, messaging metadata, to find when a stranger is
           | messaging or stalking a child, DNA evidence after the fact
           | that is a deterrent to other offenders, phone location data,
           | etc. etc. At first I thought I didn't have to spell this out.
           | 
           | Second, to answer your question: _very easily_. With
           | _parental controls_ , a decades old technology that is
           | compatible with privacy and open-source. The parent can be
           | the admin of the child's devices, and have access to their
           | otherwise encrypted messages. There is no need to delegate
           | surveillance (of everyone, not just children) to governments
           | and corporations, when we have such a simple, obvious,
           | _already existing_ solution. It frankly boggles the mind how
           | one could overlook it, especially compared to how technically
           | complex Apple 's approach is. Does the mention of child abuse
           | simply cause one's thinking to shut down, and accept as
           | gospel anything Apple or the government says, without
           | applying the smallest bit of scrutiny?
        
             | zepto wrote:
             | How do you police people reaching out to children via
             | messaging with sexual content?
             | 
             | > Why count only privacy-betraying technology as a
             | "solution", while ignoring the rise of police and
             | surveillance capabilities?
             | 
             | I don't. But if your solution is 'just police more', you
             | need to explain how the police should detect people who are
             | grooming children by sending images to them.
        
           | SV_BubbleTime wrote:
           | Yes. _"We must do something_ , _"It's for the children"_ ,
           | _"Even if it saves just one person"_. No one has ever used
           | this triple fallacy before.
           | 
           | How big do you really think the market is on kiddie porn that
           | there are people storing it plainly on their iPhones and are
           | "safe" because they aren't uploading it to iCloud?
           | 
           | This is bullshit through and through.
           | 
           | The best case is this is the step Apple needs to get E2E
           | iCloud storage, to prove they're doing enough while
           | maintaining privacy. The worst case is that if their is
           | potential for a list and reporting to be abused, it will.
           | 
           | There seems to be no scenario for the best case to exist
           | without the worst case.
        
             | zepto wrote:
             | Your position is to simply deny that child porn or
             | predation is a serious problem.
             | 
             | I outlined that this was one of the sides in the debate -
             | I.e. you take position 2, and have no empathy for position
             | 1.
             | 
             | The problem is that this is a political trap. You can't win
             | position 2 by painting technologists as uncaring or
             | dismissive.
             | 
             | 'Think of the children' _works_ for a reason.
        
               | SV_BubbleTime wrote:
               | I don't believe people sharing child porn on iPhones is a
               | serious problem, no. Laptops, desktops, sure.
               | 
               | My position is that this system has far more likely harm
               | than potential good. It's not worth it.
               | 
               | It has nothing to do with not recognizing another side.
        
               | zepto wrote:
               | Do you believe that sexual predators sharing images
               | _with_ children on iPhones is a problem?
               | 
               | > It has nothing to do with not recognizing another side.
               | 
               | It seems like you are simply stating that the other
               | side's priorities are wrong.
               | 
               | That seems like a reasonable belief that won't help
               | anyone with the problem of creeping surveillance.
        
               | SV_BubbleTime wrote:
               | Do you believe you can stop people from sharing imagines
               | pre-determined to be exploitative and illegal by a
               | central authority who has already logged and recorded the
               | hash of that photo _with_ children by deploying a
               | universal scan system to all phones - when phones aren 't
               | a primary tool of trafficing illegal content?
               | 
               | >It seems like you are simply stating that the other
               | side's priorities are wrong.
               | 
               | This is a juvenile attempt at a nobility argument. That
               | you can do anything as long as the goal is noble. Just as
               | I wrote first, anything if it's "for the children". This
               | is how all sorts of abuses are carried out under the four
               | horsemen.
        
               | zepto wrote:
               | > when phones aren't a primary tool of trafficing illegal
               | content?
               | 
               | Are they not? This is just an assumption you have made.
               | It doesn't matter what I think about it
               | 
               | I asked:
               | 
               | > Do you believe that sexual predators sharing images
               | with children on iPhones is a problem?
               | 
               | You haven't answered.
               | 
               | >>It seems like you are simply stating that the other
               | side's priorities are wrong.
               | 
               | > This is a juvenile attempt at a nobility argument.
               | 
               | No it isn't. It is an honest attempt to characterize your
               | approach. It seems clear that you think the other sides
               | priorities are wrong.
               | 
               | That's fine. I have to assume everyone thinks that. The
               | point is that it doesn't matter that you think they are
               | wrong. You have already lost. The things you don't want
               | are already happening.
               | 
               | My argument is that since that debate is lost, any
               | attempt to restore privacy must accept that other
               | people's priorities are different. Simply trying to get
               | the other side to change priorities when you are already
               | losing doesn't seem like a good approach.
        
               | danShumway wrote:
               | > 'Think of the children' works for a reason.
               | 
               | "Think of the children" will _always_ work, no matter
               | what the context is, no matter what the stats are, and no
               | matter what we do. That does not mean that we should not
               | care about the children, and it does not mean that we
               | shouldn 't care about blocking CSAM. We should care about
               | these issues purely because we care about protecting
               | children. If there are ways for us to reduce the problem
               | without breaking infrastructure or taking away freedoms,
               | we should take those steps. Similarly, we should also
               | think about the children by protecting them from having
               | their sexual/gender identities outed against their
               | wishes, and by guaranteeing they grow up in a society
               | that values privacy and freedom where they don't need to
               | constantly feel like they're being watched.
               | 
               | But while those moral concerns remain, the evergreen
               | effectiveness of "think of the children" also means that
               | compromising on this issue is not a political strategy.
               | It's nothing, it will not ease up on any pressure on
               | technologists, it will change nothing about the political
               | debates that are currently happening. Because it hasn't:
               | we've been having the same debates about encryption since
               | encryption was invented, and I would challenge you to
               | point at any advancement or compromise from encryption
               | advocates as having lessened those debates or having
               | appeased encryption critics.
               | 
               | Your mistake here is assuming that _anything_ that
               | technologists can build will ever change those people 's
               | minds or make them ease up on calls to ban encryption. It
               | won't.
               | 
               | Reducing the real-world occurrences for irrational fears
               | doesn't make those fears go away. If we reduce shark
               | attacks on a beach by 90%, that won't make people with a
               | phobia less frightened at the beach, because their fear
               | is not based on real risk analysis or statistics or
               | practical tradeoffs. Their fear is real, but it's also
               | irrational. They're scared because they see the deep
               | ocean and because Jaws traumatized them, and you can't
               | fix that irrational fear by validating it.
               | 
               | So in the real world we know that the majority of child
               | abuse comes from people that children already know. We
               | know the risks of outing minors to parents if they're on
               | an LGBTQ+ spectrum. We know the broader privacy risks. We
               | know that abusers (particularly close abusers) often try
               | to hijack systems to monitor and spy on their victims. We
               | would also in general like to see more stats about how
               | serious the problem of CSAM actually is, and we'd like to
               | know whether or not our existing tools are being used
               | effectively so we can balance the potential benefits and
               | risks of each proposal against each other.
               | 
               | If somebody's not willing to engage with those points,
               | then what makes you think that compromising on any other
               | front will change what's going on in their head? You're
               | saying it yourself, these people aren't motivated by
               | statistics about abuse, they're frightened of the _idea_
               | of abuse. They have an image in their head of predators
               | using encryption, and that image is never going to go
               | away no matter what the real-world stats do and no matter
               | what solutions we propose.
               | 
               | The central fear that encryption critics have is a fear
               | of _private communication_. How can technologists
               | compromise to address that fear? It doesn 't matter what
               | solutions we come up with or what the rate of CSAM drops
               | to, those people are still going to be scared of the idea
               | of privacy itself.
               | 
               | Nobody in any political sphere has _ever_ responded to
               | "think of the children" with "we already thought of them
               | enough." So the idea that compromising now will change
               | anything about how that line is used in the future -- it
               | just seems naive to me. Really, the problem here can't be
               | solved by either technology or policy. It's cultural. As
               | long as people are frightened of the idea of privacy and
               | encryption, the problem will remain.
        
               | zepto wrote:
               | > So in the real world we know that the majority of child
               | abuse comes from people that children already know.
               | 
               | Historically this has been regarded as true, but
               | according to the FBI, online predation is a growing
               | problem.
               | 
               | https://www.fbi.gov/news/stories/child-predators
               | 
               | > Your mistake here is assuming that anything that
               | technologists can build will ever change those people's
               | minds or make them ease up on calls to ban encryption. It
               | won't.
               | 
               | What makes you think I think that? You have
               | misrepresented me here (effectively straw-manning), but I
               | will assume an honest mistake.
               | 
               | You are right that there are people who will always seek
               | to ban or undermine encryption no matter what, and who
               | use 'think of the children' as an excuse regardless of
               | the actual threat. 'Those people' as you put it, by
               | definition will never have their minds changed by
               | technologists. Indeed there is no point in technologists
               | trying to do that.
               | 
               | However I don't think that group includes Apple, nor does
               | it include most of Apples customers. Apple's customers do
               | include many people who are worried about sexual
               | predators reaching their children via their phones
               | though. These people are not ideologues or anti-
               | encryption fanatics.
               | 
               | Arguing that concerns about children are overblown or
               | being exploited for nefarious means may be 'true', but it
               | does nothing to provide an alternative that Apple could
               | use, not does it do anything to assuage the legitimate
               | fears of Apple's customers.
               | 
               | Perhaps you believe that there _is_ no way to build a
               | more privacy preserving solution than the one Apple has.
               | 
               | I would simply point out in that case, that the strategy
               | of arguing against 'think of the children', has already
               | lost, and commiserate with you.
               | 
               | I'm not convinced that there is no better solution.
               | Betting against technologists to solve problems usually
               | seems like a bad bet, but even if you don't think it's
               | likely, it seems irrational not to hedge, because the
               | outcome of solving the problem would have such a high
               | upside.
               | 
               | It's worth pointing out that Public Key cryptography is a
               | solution to a problem that at one time seemed insoluble
               | to many.
        
               | danShumway wrote:
               | > Arguing that concerns about children are overblown or
               | being exploited for nefarious means may be 'true', but it
               | does nothing to provide an alternative that Apple could
               | use
               | 
               | - If the stats don't justify their fears
               | 
               | - And I come up with a technological solution that will
               | make the stats even lower
               | 
               | - Their fears will not be reduced
               | 
               | - Because their fears are not based on the stats
               | 
               | ----
               | 
               | > Apple's customers do include many people who are
               | worried about sexual predators reaching their children
               | via their phones though
               | 
               | Are they worried about this because of a rational fear
               | based on real-world data? If so, then I want to talk to
               | them about that data and I want to see what basis their
               | fears have. I'm totally willing to try and come up with
               | solutions that reduce the _real world_ problem as long as
               | we 're all considering the benefits and tradeoffs of each
               | approach. We definitely should try to reduce the problem
               | of CSAM even further.
               | 
               | But if they're not basing their fear on data, then I
               | can't help them using technology and I can't have that
               | conversation with them, because their fear isn't based on
               | the real world: it's based on either their cultural
               | upbringing, or their preconceptions about technology, or
               | what media they consume, or their past traumas, or
               | whatever phobias that might be causing that fear.
               | 
               | Their fear is real, but it can not be solved by any
               | technological invention or policy change, _including
               | Apple 's current system._ Because you're telling me that
               | they're scared regardless of what the reality of the
               | situation is, you're telling me they're scared regardless
               | of what the stats are.
               | 
               | That problem can't be solved with technology, it can only
               | be solved with education, or emotional support, or
               | cultural norms. If they're scared right now without
               | knowing anything about how bad the problem actually is,
               | then attacking the problem itself will do nothing to help
               | them -- because that's not the source of their fear.
        
               | zepto wrote:
               | > Their fear is real, but it can not be solved by any
               | technological invention or policy change, including
               | Apple's current system. Because you're telling me that
               | they're scared regardless of what the reality of the
               | situation is, you're telling me they're scared regardless
               | of what the stats are.
               | 
               | Not really.
               | 
               | I'm agreeing that parents will be afraid for their
               | children regardless of the stats, and are unlikely to
               | believe anyone who claimed they shouldn't be. The 'stats'
               | as you put it won't change this.
               | 
               | Not because the stats are wrong, but because they _are
               | insufficient_ , and in fact predation will likely
               | continue in a different form even if we can show a
               | particular form to not be very prevalent. The claim to
               | have access to 'the reality of the situation' is not
               | going to be accepted.
               | 
               | You won't be able to solve the problem through education
               | or emotional support because you can't actually prove
               | that the problem isn't real.
               | 
               |  _You actually don't know the size of the problem
               | yourself_ , which is why you are not able to address it
               | conclusively here.
               | 
               | What I am saying is that we need to accept that this is
               | the environment, and if we want less invasive technical
               | solutions to problems people think are real, _and which
               | you cannot prove are not_ , then we need to create them.
        
               | danShumway wrote:
               | > What I am saying is that we need to accept that this is
               | the environment, and if we want less invasive technical
               | solutions to problems people think are real, and which
               | you cannot prove are not, then we need to create them.
               | 
               | And what I'm saying is that this is a giant waste of time
               | because if someone has a phobia about their kid getting
               | abducted, that phobia will not go away just because Apple
               | started scanning photos.
               | 
               | You want people to come up with a technical solution, but
               | you don't even know to define what a "solution" is. How
               | will we measure that solution absent statistics? How will
               | we know if it's working or not? Okay, Apple starts
               | scanning photos. Are we done? Has that solved the
               | problem?
               | 
               | We don't know if that's enough, because people's fears
               | here aren't based on the real world, they're based on
               | Hollywood abduction movies, and those movies are still
               | going to get made after Apple starts scanning photos.
               | 
               | You are completely correct that the stats are
               | insufficient to convince these people. But you're also
               | completely wrong in assuming that there is some kind of
               | escape hatch or technological miracle that anyone can
               | pull off to make those fears go away, because in your own
               | words: "parents will be afraid for their children
               | regardless of the stats."
               | 
               | If Apple's policy reduces abuse by 90%, they'll still be
               | afraid. If it reduces it by 10%, they'll still be afraid.
               | There is no technological solution that will ease their
               | fear, because it's not about the stats.
               | 
               | ----
               | 
               | I'm open to being proven wrong that predation is a
               | serious problem that needs drastic intervention. I'm open
               | to evidence that suggests that encryption is a big enough
               | problem that we need to come up with a technological
               | solution. I just want to see some actual evidence. People
               | being scared of things is not evidence, that's not
               | something we can have a productive conversation about.
               | 
               | If we're going to create a "solution", then we need to
               | know what the problem is, what the weak points are, and
               | what metrics we're using to figure out whether or not
               | we're making progress.
               | 
               | If that's not on the table, then also in your words, we
               | need to "accept that this is the environment" and stop
               | trying to pretend that coming up with technical solutions
               | will do anything to reduce calls to weaken encryption or
               | insert back doors.
        
               | zepto wrote:
               | > But you're also completely wrong in assuming that there
               | is some kind of escape hatch or technological miracle
               | that anyone can pull off to make those fears go away,
               | 
               | I can't be wrong about that since I'm not claiming that
               | anywhere or assuming it.
               | 
               | > because in your own words: "parents will be afraid for
               | their children regardless of the stats."
               | 
               | Given that I wrote this, why would you claim that I think
               | otherwise?
               | 
               | > There is no technological solution that will ease their
               | fear, because it's not about the stats.
               | 
               | Agreed, except that I go further and claim that the stats
               | are not sufficient, so making about the stats _can't_
               | solve the problem.
               | 
               | > People being scared of things is not evidence,
               | 
               | It's evidence of fear. Fear is real, but it's not a
               | measure of severity or probability.
               | 
               | > that's not something we can have a productive
               | conversation about.
               | 
               | I don't see why we can't take into account people's
               | fears.
               | 
               | > If we're going to create a "solution", then we need to
               | know what the problem is, what the weak points are, and
               | what metrics we're using to figure out whether or not
               | we're making progress.
               | 
               | Yes. One of those metrics could be 'in what ways does
               | this compromise privacy', and another could be 'in what
               | ways does this impede child abuse use cases'. I suspect
               | Apple is trying to solve for those metrics.
               | 
               | Perhaps someone else can do better.
               | 
               | > If that's not on the table, then also in your words, we
               | need to "accept that this is the environment"
               | 
               | This part is unclear.
               | 
               | > stop trying to pretend that coming up with technical
               | solutions will do anything to reduce calls to weaken
               | encryption or insert back doors.
               | 
               | It's unclear why you would say anyone is pretending this,
               | least of all me. I have wholeheartedly agreed with you
               | that these calls are 'evergreen'.
               | 
               | I want solutions to problems like the child abuse use
               | cases, such that _when calls to weaken encryption or
               | insert back doors are made_ as they always will be, we
               | don't have to.
        
               | danShumway wrote:
               | > except that I go further and claim that the stats are
               | not sufficient, so making about the stats can't solve the
               | problem.
               | 
               | Statistics are a reflection of reality. When you say that
               | the stats don't matter, you are saying that the _reality_
               | doesn 't matter. Just that people are scared.
               | 
               | You need to go another step further than you are
               | currently going, and realize that any technological
               | "solution" will only be affecting the reality, and by
               | extension will only be affecting the stats. And we both
               | agree that the stats can't solve the problem.
               | 
               | It's not that making this about the stats will solve the
               | problem. It won't. But neither will _any_ technological
               | change. You can not solve an irrational fear by making
               | reality safer.
               | 
               | ----
               | 
               | Let's say we abandon this fight and roll over and accept
               | Apple moving forward with scanning. Do you honestly
               | believe that even one parent is going to look at that and
               | say, "okay, that's enough, I'm not scared of child
               | predators anymore."? Can you truthfully tell me that you
               | think the political landscape and the hostility towards
               | encryption would change at all?
               | 
               | And if not, how can you float compromise as a political
               | solution? What does a "solution" to an irrational fear
               | even look like? How will we tell that the solution is
               | working?
               | 
               | You say the stats don't matter; then we might as well
               | give concerned parents fake "magic" bracelets and tell
               | them that they make kids impossible to kidnap. Placebo
               | bracelets won't reduce actual child abuse of course, but
               | as you keep reiterating, actual child abuse numbers are
               | not why these people are afraid. Heck, placebo bracelets
               | might help reduce parent's fear _more_ than Apple 's
               | system, since placebo bracelets would be a constantly
               | visible reminder to the parents that they don't need to
               | be afraid, and all of Apple's scanning happens invisibly
               | behind the scenes where it's easy to forget.
               | 
               | ----
               | 
               | > I want solutions to problems like the child abuse use
               | cases, such that when calls to weaken encryption or
               | insert back doors are made as they always will be, we
               | don't have to.
               | 
               | Out of curiosity, how will you prove to these people that
               | your solutions are sufficient and that they work as
               | substitutes for weakening encryption? How will you prove
               | to these people that your solutions are enough?
               | 
               | Will you use stats? Appeal to logic?
               | 
               | You _almost_ completely understand the entire situation
               | right now, you just haven 't connected the dots that all
               | of your technological "solutions" are subject to the same
               | problems as the current debate.
        
               | zepto wrote:
               | > Statistics are a reflection of reality.
               | 
               | No, they are the output of a process. Whether a process
               | reflects 'reality' is dependent on the process and how
               | people understand it. This is essential to science.
               | 
               | Even when statistics are the result of the best
               | scientific processes available, they are typically narrow
               | and reflect only a small portion of reality.
               | 
               | This is why they are insufficient.
               | 
               | > When you say that the stats don't matter,
               | 
               | I never said they don't matter. I just said they were
               | insufficient to convince people who are afraid.
               | 
               | > you are saying that the reality doesn't matter.
               | 
               | Since I'm not saying they don't matter, this is
               | irrelevant.
               | 
               | > It's not that making this about the stats will solve
               | the problem. It won't. But neither will any technological
               | change. You can not solve an irrational fear by making
               | reality safer.
               | 
               | Can you find a place where this contradicts something
               | I've said? I haven't argued to the contrary anywhere. I
               | don't expect to get the fears to go away.
               | 
               | As to whether they are rational are not, some are, and
               | some aren't. We don't know which are which because you
               | don't have the stats, so we have to accept that there is
               | a mix.
               | 
               | > Will you use stats? Appeal to logic?
               | 
               | Probably a mix of both, maybe some demos, who knows. I
               | won't expect them to be sufficient to silence the people
               | who are arguing in favor of weakening encryption, not
               | make parents feel secure about their children being
               | protected against predation forever.
               | 
               | > You almost completely understand the entire situation
               | right now, you just haven't connected the dots that all
               | of your technological "solutions" are subject to the same
               | problems as the current debate.
               | 
               | Again you misrepresent me. Can you find a place where I
               | argue that technological solutions are not subject to the
               | same problems as the current debate?
               | 
               | I don't think you can find such a place.
               | 
               | I have fully agreed that you can't escape the
               | vicissitudes of the current debate. Nonetheless, you can
               | still produce better technological solutions. This isn't
               | about prevailing over unquantifiable fears and dark
               | forces. It's about making better technologies in their
               | presence.
        
               | danShumway wrote:
               | > This is essential to science.
               | 
               | Okay, fine. Are you claiming that people who are calling
               | to ban encryption are doing so on a scientific basis?
               | 
               | Come on, be serious here. People call to ban encryption
               | because it scares them, not because they have a model of
               | the world based on real data or real science that they're
               | using to reinforce that belief.
               | 
               | If they did, we could argue with them. But we can't,
               | because they don't.
               | 
               | > Can you find a place where this contradicts something
               | I've said?
               | 
               | Yes, see below:
               | 
               | > such that when calls to weaken encryption or insert
               | back doors are made as they always will be, we don't have
               | to
               | 
               | I'm open to some kind of clarification that makes this
               | comment make sense. _How are your "solutions" going to
               | make people less afraid? On what basis are you going to
               | argue with these people that your solution is better than
               | banning encryption?_
               | 
               | Pretend that I'm a concerned parent right now. I want to
               | ban encryption. What can you tell me now to convince me
               | that any other solution will be better?
        
               | zepto wrote:
               | > This is essential to science.
               | 
               | >> Okay, fine. Are you claiming that people who are
               | calling to ban encryption are doing so on a scientific
               | basis?
               | 
               | No. Did I say something to that effect?
               | 
               | > Come on, be serious here. People call to ban encryption
               | because it scares them, not because they have a model of
               | the world based on real data or real science that they're
               | using to reinforce that belief.
               | 
               | You say this as if you are arguing against something I
               | have said. Why?
               | 
               | > If they did, we could argue with them. But we can't,
               | because they don't.
               | 
               | We can still argue with them, just not with science.
               | 
               | > Can you find a place where this contradicts something
               | I've said?
               | 
               | > Yes, see below:
               | 
               | You'll need to explain what the contradiction is. You
               | have said you don't understand it, but you not
               | understanding doesn't make it a contradiction.
               | 
               | >> such that when calls to weaken encryption or insert
               | back doors are made as they always will be, we don't have
               | to
               | 
               | > I'm open to some kind of clarification that makes this
               | comment make sense.
               | 
               | It makes sense to have solutions that don't weaken
               | privacy. Wouldn't you agree?
               | 
               | > How are your "solutions" going to make people less
               | afraid?
               | 
               | They won't.
               | 
               | > On what basis are you going to argue with these people
               | that your solution is better than banning encryption?
               | 
               | Which people? The parents, the nefarious actors, apple's
               | customers?
               | 
               | > Pretend that I'm a concerned parent right now. I want
               | to ban encryption. What can you tell me now to convince
               | me that any other solution will be better?
               | 
               | Of course not because you are going to play the role of
               | an irrational parent who cannot be convinced.
               | 
               | Neither of us disagree that such people exist. Indeed we
               | both believe that they do.
               | 
               | Why does changing such a person's mind matter?
        
               | danShumway wrote:
               | > Neither of us disagree that such people exist. Indeed
               | we both believe that they do.
               | 
               | > Why does changing such a person's mind matter?
               | 
               | Okay, finally! I think I understand why we're
               | disagreeing. Please tell me if I'm misunderstanding your
               | views below.
               | 
               | > You'll need to explain what the contradiction is.
               | 
               | I kept getting confused because you would agree with me
               | right up to your conclusion, and then suddenly we'd both
               | go in opposite directions. But here's why I think that's
               | happening:
               | 
               | You agree with me that there are irrational actors that
               | will not be convinced by any kind of reason or debate
               | that their fears are irrational. You agree with me that
               | those people will never stop calling to ban encryption,
               | and that they will not be satisfied by any alternative
               | you or I propose. But you _also_ believe there 's another
               | category of people who are "semi-rational" about child
               | abuse. They're scared of it, maybe not for any rational
               | reason. But they would be willing to compromise, they
               | would be willing to accept a "solution" that targeted
               | some of their fears, and they might be convinced than an
               | alternative to banning encryption is better.
               | 
               | Where we disagree is that I don't believe those people
               | exist -- or at least if they do exist, I don't believe
               | they are a large enough or engaged enough demographic to
               | have any political clout, and I don't think it's worth
               | trying to court them.
               | 
               | My belief is that by definition, a fear that is not based
               | on any kind of rational basis is an irrational fear. I
               | don't believe there is a separate category of people who
               | are irrationally scared of child predators, but fully
               | willing to listen to alternative solutions instead of
               | banning encryption.
               | 
               | So when you and I both say that we can't convince the
               | irrational people with alternative solutions, my
               | immediate thought is, "okay, so the alternative solutions
               | are useless." But of course you think the alternative
               | solutions are a good idea, because you think those people
               | will listen to your alternatives, and you think they'll
               | sway the encryption debate if they're given an
               | alternative. I don't believe those people exist, so the
               | idea of trying to sway the encryption debate by appealing
               | to them is nonsense to me.
               | 
               | In my mind, anyone who is rational enough to listen to
               | your arguments about why an alternative to breaking
               | encryption is a good idea, is also rational enough to
               | just be taught why banning encryption is bad. So for
               | people who are on the fence or uninformed, but who are
               | not fundamentally irrationally afraid of encryption, I
               | would much rather try gently reaching out to them using
               | education and traditional advocacy techniques.
               | 
               | ----
               | 
               | Maybe you're right and I'm wrong, and maybe there is a
               | political group of "semi-rational" people who are
               | 
               | A) scared about child abuse
               | 
               | B) unwilling to be educated about child abuse or to back
               | up their beliefs
               | 
               | C) but willing to consider alternatives to breaking
               | encryption and compromising devices.
               | 
               | If that group does exist, then yeah, I get where you're
               | coming from. BUT personally, I believe the history of
               | encryption/privacy/freedom debates on the Internet backs
               | up my view.
               | 
               | Let's start with SESTA/FOSTA:
               | 
               | First, Backpage _did_ work with the FBI, to the point
               | that the FBI even commented that Backpage was going
               | beyond any legal requirement to try and help identify
               | child traffickers and victims. Second, both sex worker
               | advocates and _sex workers themselves_ openly argued that
               | not only would SESTA /FOSTA be problematic for freedom on
               | the Internet, but that the bills would also make
               | trafficking worse and make their jobs even more
               | dangerous.
               | 
               | Did Backpage's 'compromise' sway anyone? Was there a
               | group of semi-reasonable people who opposed sites like
               | Backpage but were willing to listen to arguments that the
               | bills would actively make sex trafficking worse? No,
               | those people never showed up. The bills passed with broad
               | bipartisan support. Later, several Senators called to
               | reexamine the bills not because alternatives were
               | proposed to them, but because they put in the work to
               | educate themselves about the stats, and realized the
               | bills were harmful.
               | 
               | Okay, now let's look at the San Bernardino case with
               | Apple. Apple gave the FBI access to the suspect's iCloud
               | account, literally everything they asked for except
               | access to decrypt the phone itself. Advocates argued that
               | the phone was unlikely to aid in the investigation, and
               | also suggested using an exploit to get into the phone,
               | rather than requiring Apple to break encryption. Note
               | that in this case the alternative solution _worked_ , the
               | FBI was able to get into the phone using an exploit
               | rather than by compelling Apple to break encryption. The
               | best case scenario.
               | 
               | Did any of that help? Was there a group of semi-
               | reasonable people who were willing to listen to the
               | alternative solution? Did the debate cool because of it?
               | No, it changed nothing about the FBI's demands or about
               | the political debate. What _did_ help was Apple very
               | publicly and forcefully telling the FBI that any demand
               | at all to force them to install any code for any reason
               | would be a violation of the 1st Amendment. So minus
               | another point from compromise as an effective political
               | strategy in encryption debates, and plus one point to
               | obstinance.
               | 
               | Okay, now let's jump back to early debates about
               | encryption: the clipper chip. Was that solved by
               | presenting the government and concerned citizens with an
               | alternative that would better solve the problem? No, it
               | wasn't -- even though there were plenty of people who
               | argued at the time for encryption experts to work with
               | the government instead of against it. Instead the clipper
               | chip problem was solved both when encryption experts
               | broke the clipper chip so publicly and thoroughly that it
               | destroyed any credibility the government had in claiming
               | it was secure, and it was solved by the wide
               | dissemination of strong encryption techniques that made
               | the government's demands impossible, over the objections
               | of people who called for compromise or understanding of
               | the government's position.
               | 
               | ----
               | 
               | I do not see any strong evidence for a group of people
               | who can't be educated about encryption/abuse, but who can
               | be convinced to support alternative strategies to reduce
               | child abuse. If that group does exist, it does a very
               | good job of hiding, and a very bad job of intervening
               | during policy debates.
               | 
               | I do think that people exist who are skeptical about
               | encryption but who are not so irrational that they would
               | fall into our category of "impossible to convince."
               | However, I believe they can be educated, and that it is
               | better to try and educate them than it is to reinforce
               | their fears.
               | 
               | Because of that, I see no political value in trying to
               | come up with alternative solutions to assuage people's
               | fears. I think those people should either be educated, or
               | ignored.
               | 
               | It is possible I'm wrong, and maybe you could come up
               | with an alternative solution that reduced CSAM without
               | violating human rights to privacy and communication. If
               | so, I would happily support it, I have no reason to
               | oppose a solution that reduces CSAM if it doesn't have
               | negative effects for the Internet and free culture
               | overall, a solution like that would be great. However, I
               | very much doubt that you can come up with a solution like
               | that, and if you can, I very much doubt that outside of
               | technical communities anyone will be very interested in
               | what you propose. I personally think you would be very
               | disappointed by how few people arguing for weakening
               | encryption right now are actually interested in any of
               | the alternative solutions you can come up with.
               | 
               | And it's my opinion, based on the history of
               | privacy/encryption, that traditional advocacy and
               | education techniques will be more politically effective
               | than what you propose.
        
               | jodrellblank wrote:
               | > " _Reducing the real-world occurrences for irrational
               | fears doesn 't make those fears go away._" " _You 're
               | saying it yourself, these people aren't motivated by
               | statistics about abuse, they're frightened of the idea of
               | abuse_"
               | 
               | We could say the same thing the other way - people up in
               | arms are not frightened by statistics of abuse of a
               | surveillance system, but frightened of the _idea_ of a
               | company or government abusing it. This thread is full of
               | people misrepresenting how it works, claims of slippery
               | slopes straight to tyranny, there 's a comparison to IBM
               | and the Holocaust, and it's based on no real data and not
               | even the understanding from simply reading the press
               | release. This thread is not full of statistics and data
               | about existing content filtering and surveillance systems
               | and how often they are actually being abused. For example
               | Skype has intercepted your comms since Microsoft bought
               | it and routed all traffic through them, Chrome and
               | FireFox and Edge do smartscreen blocking of malware
               | websites - what are the stats on those systems being
               | abused to block politically inconvenient memes or
               | similar? Nothing Apple could do would in any way reassure
               | these people because the fears are not based on
               | information. For example your comment:
               | 
               | > " _We know the risks of outing minors to parents if
               | they 're on an LGBTQ+ spectrum._"
               | 
               | Minors will see the prompt "if you do this, your parents
               | will find out" and can choose not to and the parents
               | don't find out. There's an example of the message in the
               | Apple announcement[1]. This comment from you is reacting
               | to a fear of something disconnected from the facts of
               | what's been announced where that fear is guarded against
               | as part of the design.
               | 
               | You could say that the hash database is from a 3rd-party
               | so that it's not Apple acting unilateraly, but that's not
               | taken as reassurance because the government could abuse
               | it. OK guard against that with Apple reviewing the alerts
               | before doing anything with them, that's not reassuring
               | because Apple reviews are incompetent (where do you hear
               | of groups that are both incompetent and capable of
               | implementing worldscale surveillance systems? conspiracy
               | theories, mostly). People say it scans all photos and
               | when they learn that it scans only photos about to be
               | uploaded to iCloud their opinion doesn't seem to change,
               | because it's not reasoned based on facts, perhaps? People
               | say it will be used by abusive partners who will set
               | their partner to be a minor to watch their chats. People
               | explain that you can't change an adult AppleID to a minor
               | one just like that, demonstrating the argument was fear
               | based not fact based. People say it is a new ability for
               | Apple to install spyware in future, but it's obviously
               | not - Apple have been able to "install spyware in future"
               | since they introduced auto-installing iOS updates many
               | years ago. People say it's a slippery slope - companies
               | have changed direction, regulations can change, no change
               | in opinion; nobody has any data or facts about how often
               | systems do slide down slippery slopes, or get dragged
               | back up them. People saying it could be used by bad-
               | actors at Apple to track their Ex's. From the design, it
               | couldn't. But why facts when there's fearmongering to be
               | done? The open letter itself has multiple inaccurate
               | descriptions of how the thing works by _the second
               | paragraph_ to present it as maximally-scary.
               | 
               | > " _We would also in general like to see more stats
               | about how serious the problem of CSAM actually is_ "
               | 
               | We know[2] that over 12 million reports of child abuse
               | material to NMEC were related to FaceBook messenger and
               | NMEC alone gets over 18 million tips in a year. Does that
               | change your opinion either way? Maybe we could find out
               | more after this system goes live - how many alerts Apple
               | receives and how many they send on. A less panicky "Open
               | Letter to Apple" might encourage them to make that data
               | public, how many times it triggered in a quarter, and ask
               | Apple to commit to removing it if it's not proving
               | effective. And ask Apple to state what they intend to do
               | if asked to make the system detect more things in future.
               | 
               | > " _their fear is not based on real risk analysis or
               | statistics or practical tradeoffs_ "
               | 
               | Look what would have to happen for this system to ruin
               | your life in the way people here are scaremongering
               | about:
               | 
               | - You would have to sync to iCloud, such that this system
               | scans your photos. That's optional. - Someone would have
               | to get a malicious hash into the whole system and a photo
               | matching it onto your device. That's nontrivial to say
               | the least. - Enough of those pictures to trigger the
               | alarm. - The Apple reviewers would have to not notice the
               | false alarm photo of a distorted normal thing. - The NMEC
               | and authorities would have to not dismiss the photo.
               | 
               | It's not _impossible_ , but it's in the realms of the
               | XKCD "rubber hose cryptography" comic. Sir Cliff Richard,
               | his house was raided by the police, the media alerted,
               | his name dragged through the mud, then the crown
               | prosecution service decided there was nothing to
               | prosecute. The police apologised. He sued the police and
               | they settled out of court. The BBC apologised. He sued
               | them and won. The crown prosecution service reviewed
               | their decision and reaffirmed that there was nothing to
               | prosecute. His name is tarnished, forever people will be
               | suspicious that he paid someone off or otherwise pulled
               | strings to get away with something; a name-damaging flase
               | alarm which is something what many people fear happening
               | in this thread. Did anyone need to use a generative-
               | adversarial network to create a clashing perceptual hash
               | uploaded into a global analysis platform to trigger a
               | false alarm convincing enough to pass two or three human
               | reviews? No, two men decided they'd try to extort money
               | and made a false rape allegation.
               | 
               | People aren't interested in how it works, why it works
               | the way it does, whether it will be an effective crime
               | fighting tool (and how that's decided) or whether it will
               | realistically become a tyrannical system, people aren't
               | interested in whether Apple's size and influence could be
               | an independent oversight on the photoDNA and NCMEC
               | databases to push back against any attempts of them being
               | misused to track other-political topics, people are
               | jumping straight to "horrible governments will be able to
               | disappear critics" and ignoring that horrible governments
               | already do that and have many much easier ways of doing
               | that.
               | 
               | > " _So in the real world we know that the majority of
               | child abuse comes from people that children already
               | know._ "
               | 
               | Those 12 million reports of child abuse material related
               | to FaceBook messenger; does it make any difference if
               | they involved people the child knew? If so, what
               | difference do you think that makes? And Apple's system is
               | to block the spread of abuse material, not (directly) to
               | reduce abuse itself - which seems an important
               | distinction that you're glossing over in your position
               | "it won't reduce abuse so it shouldn't be built" when the
               | builders are not claiming it will reduce abuse.
               | 
               | > " _Nobody in any political sphere has ever responded to
               | "think of the children" with "we already thought of them
               | enough."_"
               | 
               | Are the EFF not in the political sphere? Are the groups
               | quoted in the letter not? Here[3] is a UK government vote
               | from 2014 on communication interception, where it was
               | introduced with "interception, which provides the legal
               | power to acquire the content of a communication, are
               | crucial to fighting crime, protecting children". 31 MPs
               | voted against it. Here[4] is a UK government vote from
               | 2016 on mass retention of UK citizen internet traffic,
               | many MPs voted against it. It's not the case that "think
               | of the children" leads to political universal agreement
               | of any system, as you're stating. Which could be an
               | example of you taking your position by fear instead of
               | fact.
               | 
               | > " _It doesn 't matter what solutions we come up with or
               | what the rate of CSAM drops to, those people are still
               | going to be scared of the idea of privacy itself._"
               | 
               | The UK government statement linked earlier[2] disagrees
               | when it says " _On 8 October 2019, the Council of the EU
               | adopted its conclusions on combating child sexual abuse,
               | stating: "The Council urges the industry to ensure lawful
               | access for law enforcement and other competent
               | authorities to digital evidence, including when encrypted
               | or hosted on IT servers located abroad, without
               | prohibiting or weakening encryption and in full respect
               | of privacy_ ". The people whose views you claim to
               | describe explicitly say the opposite of how you're
               | presenting them. Which, I predict, you're going to
               | dismiss with something that amounts to "I won't change my
               | opinion when presented with this fact", yes?
               | 
               | There are real things to criticise about this system, the
               | chilling effect of surveillance, the chance of slippery
               | slope progression, the nature of proprietary systems, the
               | chance of mistakes and bugs in code or human
               | interception, the blurred line between "things you own"
               | and "things you own which are closely tied to the
               | manufacturer's storage and messaging systems" - but most
               | of the criticisms made in this thread are silly.
               | 
               | [1] on the right, here: https://www.apple.com/v/child-
               | safety/a/images/child-safety__...
               | 
               | [2]
               | https://www.gov.uk/government/publications/international-
               | sta...
               | 
               | [3] https://www.theyworkforyou.com/debates/?id=2014-07-15
               | a.704.0...
               | 
               | [4] https://www.theyworkforyou.com/debates/?id=2016-06-07
               | a.1142....
        
           | bisRepetita wrote:
           | >Child pornography and the related abuse is a massive problem
           | 
           | >proposals for better solution
           | 
           | How do you define "massive"? What makes you think that
           | current approach is not working well enough?
           | 
           | Of course, it's really bad as soon as the very first case.
           | 
           | But we need to go beyond the sensationalist articles which
           | talk about millions of pictures outthere. Even though this
           | is, yes, a problem.
           | 
           | In the US, it looks like the number of child pornography
           | cases is going down each year, from 1,593 in 2016 to 1,023
           | cases in 2020:
           | 
           | https://www.ussc.gov/sites/default/files/pdf/research-and-
           | pu...
           | 
           | So yes, there is a problem, but I would love some to see some
           | justification that the problem is actually as massive as
           | people imply, or getting worse. Or if they have a political
           | goal in mind.
           | 
           | And it feels to me like we talk about it a lot more, there is
           | less taboo or social protections for perpetrators about it,
           | which must help to fight this more efficiently than in the
           | past.
        
             | dragonwriter wrote:
             | > In the US, it looks like the number of child pornography
             | cases is going down each year
             | 
             | Prosecuting fewer cases doesn't mean less abuse is
             | occurring.
             | 
             | It doesn't even mean that the _number of instances of abuse
             | for which cases are prosecuted_ are going down, just that
             | the number of _offenders prosecuted_ is going down.
        
               | bisRepetita wrote:
               | Agree. It may well be a bad proxy. But it does not mean
               | the problem is getting worse either, one would need to
               | back it up with some arguments/estimates. The number of
               | pictures on the internet does not seems a good one to me
               | (cats population has not grown that big).
               | 
               | I am thinking that people don't seem to know if the
               | problem is really getting bigger. They just say it. If we
               | don't know that, we cannot use growing child pornography
               | as an argument to reducing civil liberties.
        
               | zepto wrote:
               | > They just say it. If we don't know that, we cannot use
               | growing child pornography as an argument to reducing
               | civil liberties.
               | 
               | https://www.fbi.gov/news/stories/child-predators
               | 
               | The FBI says it. People can and do use that argument
               | whether we like it or not.
        
               | bisRepetita wrote:
               | Yes, they said it in 2011. With just one data point and
               | no trend. Nothing shows here it exploded since. So the
               | FBI and the whole society may have found somewhat
               | effective ways without scanning phones.
        
               | zepto wrote:
               | I don't expect to convince you the FBI is right. I am
               | pointing out that other people will be convinced.
        
             | zepto wrote:
             | > child pornography cases is going down each year, from
             | 1,593 in 2016 to 1,023 cases in 2020
             | 
             | Sam Harris (and the FBI) would likely say this is because
             | detection is getting harder thanks to encryption.
             | 
             | The only authority on this is actually the FBI, but I
             | presume you wouldn't trust them.
             | 
             | That distrust is reasonable, but irrelevant. What matters
             | is that many people will trust them and see the fear as
             | legitimate.
             | 
             | Arguing over numbers with me is irrelevant. I'm saying both
             | positions are reasonable fears.
             | 
             | You cannot win by denying that.
             | 
             | One winning move would be to take the problem seriously and
             | work out a better technical solution. There may be others.
        
               | bisRepetita wrote:
               | >One winning move would be to take the problem seriously
               | 
               | What makes you think that the problem is not taken
               | seriously? I am no specialist, but it looks like cases
               | are investigated and people are going to jail. You say
               | that it is not the case, so I am curious what leads you
               | to this statement.
               | 
               | Yes, policies can be based on fears and opinion, but some
               | numbers and rationality usually help make better
               | decisions. I'd love to hear something more precise than
               | fears and personal opinion, including the FBI of course.
        
               | zepto wrote:
               | > Yes, policies can be based on fears and opinion, but
               | some numbers and rationality usually help make better
               | decisions.
               | 
               | Sure, but you aren't the decision maker. This is a
               | political decision, and so will be based on the balance
               | of fears and forces at play, not a rational analysis.
               | 
               | It doesn't matter how right or rational you are. It only
               | matters how many people think you understand what they
               | care about.
               | 
               | If the Hacker position is 'people shouldn't care about
               | child pornography because the solutions are all to
               | invasive' so be it. I just don't think that's an
               | effective position.
        
               | bisRepetita wrote:
               | >you aren't the decision maker.
               | 
               | That is only partially true. If we both write, other
               | people may read, then you and me possibly influence
               | people. Wa are social beings, we are influenced by people
               | around us. So we're both a tiny part of the decision
               | process. And you talk about political decision:
               | politicians keep listening to people through polls, phone
               | calls received, and other ways. Even dictators can be,
               | when numbers are high enough.
               | 
               | So it does matter how rational we are. Over time and on
               | average, people are more often convinced by rational
               | discourse than irrationality. Probably because
               | rationality has a higher likelihood of being right. But
               | yes, we'll go through periods where irrationality wins...
               | Still, it's very hard to be on the opposite side of logic
               | and facts for a long time.
               | 
               | >If the Hacker position is "people shouldn't care about
               | child pornography"
               | 
               | I have not read that here. If you believe that child
               | pornography is a massive issue, I respect that. I just
               | would have hoped you could better describe the actual
               | size of the problem and its evolution. You could have
               | influenced me more effectively.
        
               | zepto wrote:
               | > If you believe that child pornography is a massive
               | issue, I respect that. I just would have hoped you could
               | better describe the actual size of the problem and its
               | evolution.
               | 
               | I don't mind whether you are convinced. I'm not trying to
               | convince you about the size of the problem.
               | 
               | My position isn't about how large the threat is. I don't
               | have any information that you don't have access to. My
               | position is that if we care about privacy we have to
               | accept that people other than ourselves think the problem
               | is big enough to warrant these measures.
               | 
               | You have already lost this battle because enough people
               | are convinced about the problem that tech companies are
               | already doing what you don't want them to do.
               | 
               | https://www.fbi.gov/news/stories/child-predators
        
               | bisRepetita wrote:
               | >if we care about privacy we have to accept that people
               | other than ourselves think the problem is big enough to
               | warrant these measures.
               | 
               | Not sure what you mean here by "accept". Accept the fact
               | those people and opinions exist? Sure! Accept their
               | opinion without challenging, without asking questions?
               | No. Accept this opinion is big and rational enough for
               | the majority to follow and make laws? No.
               | 
               | >You have already lost this battle
               | 
               | What makes you think that? That's just your opinion. You
               | know, even when battles are lost, wars can be won later
               | won. "Right to Repair", "Net neutrality", "Global
               | Warming", and here: "open source hardware". All those
               | battles have been fluid. Telling people it is over, it is
               | too late, is a very common trick to try to
               | influence/convince people to accept the current state.
               | That certainly does not make it true.
               | 
               | I understand you may try to convince readers that it is
               | over, because it may be your opinion. If that's the case,
               | just be frank about it, and speak proudly for yourself of
               | what you wish. Don't hide behind "politicians", "tech
               | companies" and "other people".
        
               | zepto wrote:
               | > What makes you think that? That's just your opinion.
               | 
               | It's not just my opinion that Apple has implemented a
               | hash based mechanism to scan for child pornography that
               | runs on people's phones. People complaining about it have
               | definitely lost the battle already. It is already here.
               | 
               | > I understand you may try to convince readers that it is
               | over, because it may be your opinion.
               | 
               | That is not an accurate understanding of my argument.
               | 
               | My position is to agree with those who see this as a
               | slippery slope of increasingly invasive surveillance
               | technology, and to point out that simply arguing against
               | it has been consistently failing over time.
               | 
               | I am also pointing out that one reason it's failing is
               | that even if the measures are invasive and we think that
               | is bad, the problems they are intended to solve are real
               | and widely perceived as justifying the measures.
               | 
               | What I advocate is that we accept that this is the
               | environment, and if we don't like Apple's solution, we
               | develop, or at least propose alternative ways of
               | addressing the problem.
               | 
               | That way we would have a better alternative to argue _in
               | favor of_ rather than just complaining about the solution
               | which Apple has produced _and which is the only proposal
               | on the table._
        
               | bisRepetita wrote:
               | Are you falling into their trap knowingly or not?
               | 
               | There is a child molestation problem everywhere in the
               | world, including online. I have seen nothing explaining
               | it is getting bigger / worse. I have read that most of
               | the cases are family members, in the real world.
               | 
               | So when I hear Apple and Government explain "because of
               | the children" they want to monitor our phones more, in
               | the context of growing assumed dictatorships, Pegasus,
               | Snowden reveleation, do you really think that solving the
               | child pornography issue will help refrain them, or slow
               | them down? Open source hardware, political pressure,
               | consumer pressure, and regulation, possibly monopoly
               | break-ups. In the US, it starts with the people.
               | 
               | But doing better with child pornography won't change
               | anything there, it juts moves the discussion to some
               | other topic. Distraction. That is my point all along.
               | There is no data that shows that all of a sudden child
               | pronography has progressed leaps and bounds. So people
               | suddenly concerned by that are most likely not truthful,a
               | dn they have a very strong agenda. That's what we need to
               | focus on, not their "look at the children" distraction.
        
               | zepto wrote:
               | > Are you falling into their trap knowingly or not?
               | 
               | This is a false dichotomy and a false assumption.
               | 
               | > There is a child molestation problem everywhere in the
               | world, including online.
               | 
               | Agreed.
               | 
               | > I have seen nothing explaining it is getting bigger /
               | worse. I have read that most of the cases are family
               | members, in the real world.
               | 
               | Have you listened to Sam Harris, or heard the FBI? They
               | have a very different view.
               | 
               | It could be that both are true: there is a child porn
               | problem _and_ governments are using it as an excuse.
               | 
               | The only thing you seem to be going on is a story you
               | once heard, that may have been true at the time, but may
               | not be now.
               | 
               | > So when I hear Apple and Government explain "because of
               | the children" they want to monitor our phones more, in
               | the context of growing assumed dictatorships, Pegasus,
               | Snowden reveleation, do you really think that solving the
               | child pornography issue will help refrain them, or slow
               | them down?
               | 
               | That would misleading sense given that you are assuming
               | child porn is not a growing problem.
               | 
               | Porn in general is growing hugely why wouldn't child porn
               | also be growing?
               | 
               | Generally Apple has resisted overreach, but I agree that
               | they are slowly moving in the wrong direction.
               | 
               | Apple is not the government.
               | 
               | > Open source hardware,
               | 
               | > political pressure, consumer pressure, and regulation,
               | possibly monopoly break-ups. In the US, it starts with
               | the people.
               | 
               | You contradict yourself here. You seem to think the
               | government can't be slowed and yet political pressure
               | will work. Which is it?
               | 
               | > But doing better with child pornography won't change
               | anything there,
               | 
               | I agree - it won't eliminate the forces that want to
               | weaken encryption etc.
               | 
               | But a more privacy respecting solution _would_ still
               | help.
               | 
               | > it juts moves the discussion to some other topic.
               | Distraction. That is my point all along.
               | 
               | > There is no data that shows that all of a sudden child
               | pronography has progressed leaps and bounds. So people
               | suddenly concerned by that are
               | 
               | Isn't there? The FBI claims it is growing.
               | 
               | > most likely not truthful,
               | 
               | Ok, we know you don't trust the FBI.
               | 
               | But enough people do that we can't ignore them. Even if
               | the problem isn't growing as Sam Harris claims it is,
               | trying to persuade people that the problem doesn't need
               | to be solved seems like a good way to undermine the
               | causes you support.
               | 
               | > a dn they have a very strong agenda. That's what we
               | need to focus on, not their "look at the children"
               | 
               | As I say, I agree there are people trying to exploit
               | 'look at the children' in support of their own agenda.
               | 
               | I just don't think that means there isn't a real problem
               | with child porn. Denying that there is a problem seems
               | equally agenda driven.
        
           | 4f832fd4-8f08 wrote:
           | >2. The complaints are all slippery slope arguments that
           | governments will force Apple to abuse the mechanism. These
           | are clearly real concerns and even Tim Cook admits that if
           | you build a back door, bad people will use it.
           | 
           | I don't know how old you are but for anyone over 30 the
           | slippery slope isn't a logical fallacy, it's a law of nature.
           | I've seen it happen again and again. The only way to win is
           | to be so unreasonable that you defend literal child porn so
           | you don't need to defend your right to own curtains.
        
           | danShumway wrote:
           | > Can anyone propose anything better
           | 
           | Widespread education about how child abuse usually works in
           | the real world outside of movies, public examination of what
           | tools we have available and whether they're being used
           | effectively, very public stats about how prevalent the
           | problem is and what direction it's moving in.
           | 
           | Better media that more accurately reflects how these problems
           | play out for real people and that doesn't use them as a cheap
           | gimmick to pray on people's fears to raise narrative stakes.
           | 
           | Better education about the benefits of encryption and the
           | risks of abuse, better education about how back doors play
           | out in the real world and how they are abused both by
           | governments and by individual abusers. Better education about
           | the fail-rate of these systems.
           | 
           | > Child pornography and the related abuse is widely thought
           | of as a massive problem, that is facilitated by encrypted
           | communication and digital photography.
           | 
           | Most people don't have a basis for this instinct, they
           | believe it because they were taught it, because it's what the
           | movies tell them, because it fits their cultural perception
           | of technology, because it's what the CIA and FBI tell them.
           | Unless they're basing their fear on something other than
           | emotion and instinct, there is no technical or political
           | solution that will reduce their fear. It's all useless. Only
           | education and cultural shifts will make them less afraid.
           | 
           | If you survey the US, over 50% of adults will tell you that
           | crime today is at the same level or worse than it was in
           | 1950s, an absurd position that has no basis in reality. So
           | what, should we form a police state? At some point, caring
           | about the real problems we have in the real world means
           | learning to tell people that being scared of technology is
           | not a good enough justification on its own to ban it.
           | 
           | Nothing else will work. People's irrational fears by and
           | large will not be alleviated by any technical solutions. They
           | won't even be alleviated by what Apple is doing, parents who
           | have an irrational fear of privacy are not going to have that
           | fear _lessened_ because Apple introduced this scanning
           | feature. They will still be afraid.
        
           | sathackr wrote:
           | I appreciate this well thought out and logical approach.
           | 
           | The issue I have is with the way this is presented.
           | 
           | I actually really like the local-only features related to
           | making underage users second-guess sending/viewing
           | potentially harmful content. IMO this is a huge problem
           | especially since most do not have the context to understand
           | how things live forever on the internet, and is likely a
           | source of a large percentage of this "Known CSAM material" in
           | circulation. I think it's a great step in the right
           | direction.
           | 
           | But the scan-and-report content on your device approach is
           | where it becomes problematic. It's sold as protecting privacy
           | because of a magic box "the cryptographic technology" somehow
           | prevents abuse or use outside the scope it is intended for.
           | But just like this entire function can be added with an
           | update, it can also be modified with an update. Apple
           | pretends the true reality of this feature is somehow solved
           | by technology, glosses over the technical details with
           | catchphrases that 99.9% of people that read it won't
           | understand. "It's 100% safe and effective and CANNOT IN ANY
           | WAY BE ABUSED because, cryptographic technology." They're
           | forcing their users to swallow a sugar-coated pill with very
           | deep ramifications that only a very small percentage will
           | fully comprehend.
           | 
           | I'm 100% for doing anything reasonably possible in the
           | advancement of this cause. But you're correct in that this is
           | "one fear is more important than another fear." And I don't
           | think anyone can say how slippery this slope is or where it
           | can go. I also don't really feel like you can even quantify
           | in a way that they can be compared the harm caused by CSAM or
           | mass surveillance. So a judgement call on "which is worse"
           | really isn't possible because the possibilities for new and
           | continued atrocities in both cases are infinite.
           | 
           | But at least in the US, a line is crossed when something
           | inside your private life is subject to review and search by
           | anyone else even when you have not committed a crime. If you
           | want to search my house, you must, at least on paper,
           | establish probable cause before a judge will grant you a
           | search warrant.
           | 
           | "It's okay, these specially trained agents are only here to
           | look for just drugs. They've promised they won't look at
           | anything else and aren't allowed to arrest you unless they
           | find a large quantity of drugs" -- Would you be okay with
           | these agents searching every house in the nation in the name
           | of preventing drug abuse(which also is extremely harmful to
           | many people including children even when they are not direct
           | users)?
           | 
           | The argument "well just don't use apple" doesn't stand
           | either. A landlord can't just deputize someone to rummage
           | through my house looking for known illegal things to report.
           | Even though it's technically not "my house" and you could
           | argue that if I don't like it, well I should just move
           | somewhere else. But I don't think that can be argued as
           | reasonable. Our phones are quickly becoming like our houses
           | in that many people have large portions of their private
           | lives inside them.
           | 
           | I can't quite put my finger on the exact argument so I
           | apologize for not being able to articulate this more clearly,
           | but there is something with removing the rights of large
           | groups of people to protect a much smaller subset of those
           | people, from an act perpetrated by bad actors. You are
           | somehow allowing bad actors to inflict further damage, in
           | excess of the direct results of their actions, on huge
           | populations of people here. I know there is a more eloquent
           | way to descibe this concept, but doing so just doesn't seem
           | like the right course of action.
           | 
           | And I'm sorry but I am not smarter than all of the people
           | that worked on this, so I don't have a better solution that
           | would accomplish the same goal, but I know that this solution
           | has a huge potential to enable massive amounts of abuse by
           | nations who do not respect what much of the world considers
           | basic human rights.
        
             | zepto wrote:
             | > But just like this entire function can be added with an
             | update, it can also be modified with an update.
             | 
             | That argument is a valid argument that by using an Apple
             | device you are trusting them not to issue an abusive update
             | in the future. It applies regardless of whether they
             | release this feature or not - at any time they could issue
             | spyware to any or all devices.
             | 
             | I actually fully agree that this is a problem.
             | 
             | My position is that Apple isn't going to solve this
             | problem. If we want it solved, we need to solve it.
             | 
             | The value of using Apple devices today, and even the sense
             | that they are going to protect children who use their
             | devices, far outweighs relatively vague and unproven
             | assertions about future abuses that haven't yet
             | materialized in most people's minds _even if they turn out
             | to be right in the end_.
        
           | babesh wrote:
           | There is no end to the loss of privacy in the name of safety
           | if your bogeyman is increasingly sophisticated actors.
           | 
           | The more sophisticated child molesters out there will find
           | out about what Apple is doing and quickly avoid it. Isn't
           | that what the FBI is complaining about, that they have grown
           | increasingly sophisticated?
           | 
           | The more sophisticated privacy adherents will also avoid
           | Apple and resort to end to encryption and offline tools.
           | 
           | What is the actual outcome? You won't get more arrests of
           | child molesters. Instead, you get a security apparatus that
           | can be weaponized against the masses. Furthermore, you will
           | have the FBI complaining that child molesters are
           | increasingly out of their reach and demanding greater powers.
           | They will then try to mandate child porn detectors built into
           | every phone.
           | 
           | This creep has been occurring for years. Go read the Snowden
           | disclosures.
           | 
           | First your cell phone companies worked with the government
           | for mass harvesting of data. No need for any suspicion
           | because they promise not to look unless there is one. That
           | wasn't enough because the data was encrypted.
           | 
           | Second they had the companies holding the data snoop in on
           | data that was shared. That wasn't enough.
           | 
           | Third they had the companies holding the data snoop in on
           | data EVEN when it wasn't shared, just when it was uploaded to
           | them. Not enough for them!
           | 
           | Now they will have it done on device prior to uploading. Does
           | this mean that if it fails to upload, it gets scanned anyway.
           | Why yes!
           | 
           | Next they will have it done on device even if it never is
           | held by the company and never shared and never even intended
           | to be uploaded.
           | 
           | The obvious goal is that the government has access to ALL
           | data in the name of safety. No need for warrants. Don't worry
           | about it. They won't look unless there was any suspicion.
           | Opps never mind that, we will just have tools look.
           | 
           | There is no end to the loss of privacy in the name of safety
           | if your bogeyman is increasingly sophisticated actors.
           | 
           | Why isn't this obvious to everyone?
           | 
           | Anyone old enough to remember the Clipper chips?
        
             | zepto wrote:
             | > The more sophisticated child molesters out there will
             | find out about what Apple is doing and quickly avoid it.
             | 
             | I think this is exactly what Apple wants to he the result
             | of their their iMessage scanning.
             | 
             | They are not in this to make arrests. They just want
             | parents to feel safe letting children use their products.
             | Driving predators to use different tools is fine with them.
             | 
             | As far as the FBI goes, this is presumably not their
             | preference, but it's still good for them if it makes
             | predation a little harder.
        
               | babesh wrote:
               | My point is that the FBI will just use that as a pretext
               | for greater intrusion into privacy. Why stop at users who
               | have iCloud photos turned on? Why not scan all photos?
               | 
               | Why limit it to child predators? Why not use this too for
               | terrorists and anyone the FBI or any other government
               | deems as subversive?
               | 
               | In fact, if you just look at what the FBI has been saying
               | over the years, that is exactly what they intend to do.
               | 
               | People who say that this is a slippery slope argument
               | don't even notice that they have been sliding down that
               | slippery slope for decades.
        
               | zepto wrote:
               | Where is anyone arguing against you? The point is that
               | the demands for solutions may be neverending, but that
               | doesn't mean the problems are non-existent.
               | 
               | If we want to limit the damage caused by the dynamic, we
               | need to offer better solutions, not just complain about
               | the FBI.
        
               | babesh wrote:
               | You were saying that this is a slippery slope argument
               | which implies that it is a fallacious argument. I am
               | saying that isn't the case. We have been sliding on this
               | slope for decades which deems that the argument is valid
               | and not fallacious.
               | 
               | From Wikipedia's entry for slippery slope: "The
               | fallacious sense of "slippery slope" is often used
               | synonymously with continuum fallacy, in that it ignores
               | the possibility of middle ground"
               | 
               | This isn't a middle ground. Every year the supposed
               | middle ground shifts toward less and less privacy. Notice
               | the slippery slope? The very presence of this means that
               | the supposed middle ground just slipped further?
               | 
               | Isn't that obvious?
        
               | zepto wrote:
               | You're ignoring the sentence after I mention the slippery
               | slope, where I say:
               | 
               | > These are clearly real concerns and even Tim Cook
               | admits that if you build a back door, bad people will use
               | it.
               | 
               | I'm not the one ignoring the middle ground.
        
               | babesh wrote:
               | I have read through your comments on the article. You are
               | the bogeyman. Good bye. You are the one promoting the
               | latest false middle ground along an actual sliding slope.
               | That is disingenuous.
        
               | zepto wrote:
               | Can you describe the middle ground that I have ignored?
               | 
               | What disingenuous argument have I made?
               | 
               | The only thing I have argued for is for hackers to
               | attempt technical solutions that are more to their liking
               | than Apple's, because arguments are not preventing the
               | slide.
        
               | babesh wrote:
               | I am saying that you are promoting a slippery ground
               | falsely as middle ground.
               | 
               | Basically the argument I hear from you is "If you build a
               | back door, then people will use it. So let's build it
               | anyway because it is a middle ground." The problem I have
               | with it is the "let's build it anyway".
               | 
               | That seems as clear as day. Why do I have to keep
               | repeating myself? Don't be an apologist.
        
               | babesh wrote:
               | The problem of child molesters is a fixed quantity. The
               | loss of privacy is ever increasing. When you see this
               | dynamic, you know that a con is afoot.
               | 
               | The solution for child porn isn't found in snooping on
               | everyone. It is in infiltrating the networks. Go have the
               | FBI infiltrate the rings. Here is an example of why the
               | FBI is disingenuous. This child porn ring wasn't using
               | phones. Guess what they were using?
               | 
               | https://abc7.com/archive/7844451/
               | 
               | Computers in public libraries
               | 
               | Like I said, sophisticated actors aren't the targets.
               | 
               | Another example. Osama bin Laden. He was air gapped. No
               | cell phones or computers for him. No one even calling on
               | a phone nearby. Osama bin Laden was found via an
               | informant.
               | 
               | The next actor will be even more sophisticated. Probably
               | single use dumb paired cell phones with call locations
               | randomized. Probably plastic surgery. Locations that spy
               | satellites cannot see.
               | 
               | Did snooping on cell phone data help find Osama? When
               | that wasn't enough, did grabbing all online data help?
               | How about grabbing data straight from smart phones? Nope.
               | Nope. Nope. Yet governments want more, more, more. Why do
               | you think snooping helps against people who don't even
               | use phones for their most private communications?
        
         | davidcbc wrote:
         | Have they? The whitepaper made it sound like the encrypted
         | security voucher is created from the database of known cp on
         | the device at the time the image is uploaded, and the only way
         | for Apple to decrypt something is if it matched something in
         | that database.
         | 
         | It did not sound like they could retroactively add additional
         | hashes and decrypt something that already was uploaded. They
         | could theoretically add something to the list of hashes and
         | catch future uploads of it but my understanding was they cannot
         | do this for stuff that has already been stored.
        
           | vineyardmike wrote:
           | > they cannot do this for stuff that has already been stored.
           | 
           | That's a very simple software update.
           | 
           | Every year iOS gets more images the automatic tagging
           | supports (dogs, shoes, bridges, etc). And if you add a
           | friend's face to known faces, it'll go and search every other
           | photo for that face.
           | 
           | It sounds absolutely trivial to re-scan when the hash DB gets
           | updated.
        
           | sathackr wrote:
           | Directly from the link:
           | 
           | > ...the system performs on-device matching using a database
           | of known CSAM image hashes provided by NCMEC and other child
           | safety organizations. Apple further transforms this database
           | into an unreadable set of hashes that is securely stored on
           | users' devices.
           | 
           | >... The device creates a cryptographic safety voucher that
           | encodes the match result along with additional encrypted data
           | about the image. This voucher is uploaded to iCloud Photos
           | along with the image.
        
         | xyst wrote:
         | Selfies are suddenly going to be used as surveillance tools by
         | the govt, or maybe a bad actor at Apple wants to find/track
         | down their ex.
         | 
         | This system has so much potential for abuse with very little
         | upside.
        
           | calmworm wrote:
           | Suddenly?
        
             | EnlightenedBro wrote:
             | Nailed it. It was the intention all along. Funny how
             | everyone just pretends that it's not what it is.
        
         | emodendroket wrote:
         | Was it ever actually in doubt that Apple could read stuff you
         | put in their cloud of they so desired? It seems obvious.
        
         | geekles wrote:
         | As if this will be limited to "iThings". If you don't think
         | this tech is coming to all devices, you're not paying attention
         | to the state of the world.
        
         | davidham wrote:
         | Even if you don't use iOS, seems likely Google could do/is
         | doing this also.
        
         | andruby wrote:
         | What will Apple do when Iran or Qatar (or any of the other 71
         | countries where homosexuality is illegal) upload photos to the
         | CSAM database that they consider illegal acts?
         | 
         | In some of those countries same-sex sex is punishable by death.
        
           | gordon_gee123 wrote:
           | Nothing is stopping these countries from doing this already.
           | China, Saudi Arabia, Iran already consider forcing tech
           | companies to track user activity. At the end of the day these
           | companies are subject to laws of the country they do business
           | in and this has already screwed over HK, Uigher, Iranian,
           | Egyptian citizens. Laws forcing data to be stored in given
           | regions alongside encryption keys has already made it
           | dangerous to be homosexual in these countries you've
           | mentioned (except Iran which most businesses cannot do
           | business in)
        
         | hughrr wrote:
         | Yeah as I mentioned down the thread, once you act against the
         | users, you're dead. I'm done. They are going from my life.
         | 
         | Good job it's ebay 80% off fees this weekend here in the UK.
        
           | klysm wrote:
           | I mean eBay isn't exactly free of bad behavior...
        
             | p410n3 wrote:
             | I think they're just selling their iDevices on there
        
               | klysm wrote:
               | Oh I misunderstood
        
               | hughrr wrote:
               | Yes that. I'd rather take them back to the Apple store
               | and have a large fire out the front while holding up a
               | doom saying sign about their laughable privacy stance,
               | but that'd detract from the point.
        
             | johnnyApplePRNG wrote:
             | Guess I'll just keep my spyPhone then. /s
        
           | ju_sh wrote:
           | Pretty sure I read this would only apply to US based users
        
             | hughrr wrote:
             | Yes so far. Once the mechanism is in place it will be
             | rolled out. I'm in the UK and we have a fairly nasty set of
             | state legislation on censorship already and an increasingly
             | authoritarian government so this is a big worry.
        
               | sharken wrote:
               | Absolutely agree, this will no doubt reach Apple devices
               | worldwide, just a matter of time.
               | 
               | But there is still time for Apple to halt these changes.
        
               | Taek wrote:
               | Apple will try again with something that's wrapped up to
               | be more palatable to consumers. A change of heart in the
               | next 90 days does not mean that Apple is a good actor.
        
               | hughrr wrote:
               | Well it's now common knowledge that they can and will do
               | this. So oppressive governments will almost certainly
               | mandate that its required to do business in those
               | countries. Thus it's a no win game for the end user.
               | Unless you choose not to play.
        
       | AJRF wrote:
       | Anyone in London area want to buy a 2020 M1 MacBook Pro with 16GB
       | RAM + 256GB SSD for PS1000?
        
         | juniperplant wrote:
         | So... What's next? A Thinkpad with Linux? I really wish there
         | was more competition. There is simply no other digital
         | ecosystem that works as well as Apple's.
        
           | AJRF wrote:
           | Probably one of those nice Clevo laptops, more bang for my
           | buck with that (https://shkspr.mobi/blog/2020/05/review-
           | clevo-n151cu-lafite-...)
           | 
           | I've weaned myself off big tech services now (except I still
           | use Chrome, but I used that un-synced - could just change to
           | Chromium).
           | 
           | I just want something capable, with a nice keyboard running
           | an OS that doesn't constantly change things underneath me, is
           | stable, and secure.
           | 
           | I agree about the ecosystem working well together, but for
           | me, the ends don't justify the means anymore.
        
       | todd3834 wrote:
       | I appreciate the privacy concerns, I really do. But please, can
       | we suggest an alternative that still benefits children the way
       | that this will? Imagine being Tim Cook and learning that
       | technology that you control is being used for things that
       | everyone agrees is horrific to children. Would you be able to
       | turn a blind eye in matter of principle? I'm struggling with my
       | opinion on this one because I care so much about the kids this
       | can help. For the first time I'm feeling like the trade off is
       | more than worth it.
        
         | ZekeSulastin wrote:
         | That's why this sort of thing is first used "for the children"
         | (or "to prevent terrorism" back in 2001 or 2015[0]) - because
         | when you argue in favor of privacy you then also implied to
         | support whatever is being used as the impetus.
         | 
         | The alternative is to not start scanning your device in the
         | first place and getting the foot in the door, because once it's
         | open it's really hard to close.
         | 
         | [0] https://en.wikipedia.org/wiki/FBI-Apple_encryption_dispute
        
       | sf_rob wrote:
       | Outside of the privacy debate on this, I cannot overstate how
       | dumb it was of Apple to announce this and sexual image ML in the
       | same press release. It's resulting in an errant and vastly
       | exagerated understanding of it by members of the public.
        
       | XorNot wrote:
       | Fixed:
       | 
       | Chinese Government: Here is the NeuralHash for some child abuse
       | imagery, please scan this user's phone and tell us if it's found.
       | 
       | Apple: Sure we found it.
       | 
       | Chinese Government: this user has unpermitted winnie the pooh
       | memes. Send them to an internment camp.
        
         | webmobdev wrote:
         | Fixed:
         | 
         | Modi's Indian Government: This opposition leader is criticising
         | us too much. Send him a child abuse imagery and ask Apple /
         | Google to scan for it on his phone.
         | 
         | Apple / Google: We found it!
         | 
         | Modi's Indian Government: Let the defamation begin!
         | 
         |  _Context_ :
         | 
         | - Evidence found on a second Indian activist's computer was
         | planted, report says -
         | https://www.washingtonpost.com/world/2021/07/06/bhima-korega...
         | ]
         | 
         | - Pegasus Snoopgate -
         | https://www.youtube.com/watch?v=Ppt3FIV2itQ
        
         | [deleted]
        
         | rvz wrote:
         | Google: 'Sharing is caring'. Thanks for sharing all of this
         | with us Apple Inc! [0]
         | 
         | Amazon: Don't forget us too! [0]
         | 
         | To Downvoters: Are you telling me that this is false? Even
         | worse for the Gmail users. [1] I guess the outrage was already
         | overdue with these tech giants.
         | 
         | [0] https://appleinsider.com/articles/21/06/29/apple-is-now-
         | goog...
         | 
         | [1] https://www.theverge.com/2014/8/5/5970141/how-google-
         | scans-y...
        
           | gjioshjoasif wrote:
           | welcome to reddit
        
         | dang wrote:
         | Please don't take HN threads on generic flamewar tangents and
         | certainly not off-topic ones. We're trying to avoid this sort
         | of crap thread here.
         | 
         | We detached this subthread from
         | https://news.ycombinator.com/item?id=28086140.
        
         | trashtester wrote:
         | Fixed further: Chinese Government: Here is the NeuralHash for
         | some child abuse imagery, please scan this user's phone and
         | tell us if it's found.
         | 
         | Apple: Sure we found it.
         | 
         | Chinese Government to some western company/instititution: We
         | have detected unacceptable behavior from your
         | employee/student/client/vendor. Please cancel them, or we will
         | stop funding you.
        
           | trainsplanes wrote:
           | Replace Chinese with absolutely any government and that's
           | what'll happen.
           | 
           | We're pretty much at a point where any hacker/leaker anywhere
           | on earth is "found" to have committed sex crimes or have 300
           | gigabytes of child abuse on their computers.
           | 
           | Eventually Disney will be getting their hands in it to detect
           | copyrighted content and then it's all over.
        
             | sathackr wrote:
             | Pretty sad when you trust the "bad" guys more than the
             | "good" guys.
             | 
             | I'm much less concerned with <random malware> on my device
             | than I am with the governments of the world having access
             | to it.
             | 
             | At least the motivations of the <random malware> are
             | generally monetary and likely don't represent a significant
             | threat to human rights and freedom of speech.
        
             | docmars wrote:
             | You're right, though the ability to detect copyrighted
             | content has been around for a good while.
             | 
             | Although with this, I think it's more likely that Disney's
             | executives and/or employees will be charged and arrested
             | before that happens. I mean, with this track record...
             | 
             | 1. (Aug 2021) https://djhjmedia.com/kari/disney-workers-
             | caught-in-child-se...
             | 
             | 2. (Dec 2017) https://variety.com/2017/biz/news/jon-heely-
             | disney-music-gro...
             | 
             | 3. (June 2019) https://insidethemagic.net/2019/06/former-
             | walt-disney-vice-s...
             | 
             | 4. (July 2014 / Dec 2017)
             | https://www.huffpost.com/entry/disney-employees-child-sex-
             | ch...
        
               | [deleted]
        
           | sdoering wrote:
           | Not too far from reality [1].
           | 
           | PhD student at Swiss University posts something critical on
           | Twitter. Chinese intervene at his professor.
           | 
           | Promotion gone with mildly interesting reasoning.
           | 
           | Link to Swiss newspaper in German language.
           | 
           | [1]: https://www.nzz.ch/schweiz/hsg-und-china-kritik-auf-
           | twitter-...
        
             | KarlKode wrote:
             | Here is the answer from HSG (also in German): https://www.u
             | nisg.ch/de/wissen/newsroom/aktuell/rssnews/camp...
        
       | hughrr wrote:
       | I'm not even interested in the outcome of this now. Apple has
       | chosen to act against the users without consultation. They should
       | be punished for this by their customers walking away.
        
       | mchusma wrote:
       | People always use child porn as the backdoor excuse.
       | 
       | Once they have the mechanism in place to censor and report
       | content. They will for sure be requested to use this. For
       | terrorism, hate speech, etc until it's so blurry until it's
       | absolutely limitless :(
        
         | robertoandred wrote:
         | Can you give an example of a widely shared picture of hate
         | speech that someone would keep in their online photo library?
        
           | rantwasp wrote:
           | this is a narrow way to look at it.
           | 
           | once they start doing this, it's not going to stop at
           | pictures and it's not going to stop at hashes.
           | 
           | you will literally end up with the thought police combing
           | everything you do (via Machine Learning!!!) and keep tabs on
           | everything you do, see, write, read. the future is bright!
        
       | t0mmyb0y wrote:
       | Sounds like windows 10...
        
       | jstsch wrote:
       | Disagree. I think this is a boon to E2E encryption. However you
       | might feel, content filter legislation is unstoppable. It will
       | just take a few high profile media cases for E2E encryption to be
       | killed, otherwise. This is the right (privacy-conscious) way to
       | implement it, on device. Google has been doing this for years,
       | yet server-side. I recommend reading the actual docs:
       | https://www.apple.com/child-safety/
        
       | jegaman wrote:
       | This is just a kind of confirmation I need to see from time to
       | time to justify being a bit "paranoid". This literally gives me
       | energy to configure my linux.
        
       | belter wrote:
       | Where is Richard Stallman when you need him?
       | 
       | This is the War on General Computation you have been warned
       | about, and its good to reiterate: "You ( and I mean you as an
       | Individual and you as a Company ) you are either with us, or
       | against us"
        
         | acuozzo wrote:
         | > Where is Richard Stallman when you need him?
         | 
         | He was canceled, but he still regularly updates his personal
         | website: https://stallman.org/
        
           | rantwasp wrote:
           | i mean. he did warn everyone that this is going to happen
           | decades ago.
           | 
           | chiming is on this would just give the cancel mob more
           | ammunition
        
       | CraftingLinks wrote:
       | Will be interesting to monitor sales when this goes live!
        
       | sudo_raspberry wrote:
       | this will lead to black mailing of future presidents in the U.S.
       | 
       | No normal citizen will be allowed to run for president anymore. I
       | think it's time for a true movement.
        
       | post_break wrote:
       | If Apple doesn't change on this my next phone will not be an
       | iPhone. I will dump my iPad, and Apple watch as well. This is
       | complete bullshit. I'm angry. I never thought I'd see "privacy"
       | Apple come out and say we're going to scan your photos on your
       | device, scan you imessages, etc. This is insane.
        
         | voakbasda wrote:
         | I gave up on Macs almost 20 years ago, but I have an iPhone 7
         | that I need to upgrade. This whole shitshow has put that
         | purchase on hold indefinitely.
        
           | relax88 wrote:
           | I just got an iPhone 12 a couple months ago because "Hey it's
           | definitely more private than google", better control of app
           | permissions, and nice hardware.
           | 
           | Seems unclear what the alternative is now. The Linux phones
           | I've looked at have a lot of catching up to do.
        
             | voakbasda wrote:
             | Linux on the desktop has a lot of catching up to do when it
             | comes to Mac/Win. The question is whether it is good
             | enough.
             | 
             | Personally, Linux desktop has been good enough for decades,
             | for my work as a software engineer. I gave up many
             | conveniences when I switched fully from Macs (e.g. BBEdit,
             | Finale for music, etc.).
             | 
             | I think phones are reaching the same point, where I am
             | ready to sacrifice a decade or more of software "progress"
             | to regain control of my hardware.
             | 
             | In the very long term, I will always bet on open source
             | when it comes to commodity devices. I believe phone
             | development will progress much like the Linux desktop
             | experience, which is now relatively rock solid. It may take
             | a decade, but each of us that joins their movement will
             | speed these alternatives on to their inevitable market
             | domination.
        
       | dancemethis wrote:
       | Why do people still think they have _any_ power over user-hostile
       | corporations and its proprietary code?
       | 
       | Apple can (and likely will) say they won't do it and then do it
       | anyway. It's a proprietary platform. They already have it, now.
       | All they are claiming is an excuse not to be caught in a certain
       | way they wouldn't like at a later point.
        
         | jackvezkovic wrote:
         | There can always be leaks, anonymous sources that confirm
         | something is still ongoing, despite what was publicly
         | announced. Brand image is important.
        
           | zibzab wrote:
           | Are you sure?
           | 
           | Here is a true story for you:
           | 
           | Company X claimed users voice commands never left their
           | devices. Then someone leaked recordings of people having sex,
           | commiting crimes, etc recorded from X devices. This was
           | brought up by media. For every concerned user there were ten
           | apologist trying to justify this behaviour and a week later
           | everyone forgot this ever happened.
        
             | Clubber wrote:
             | Post a link so everyone can remember. If it's something you
             | care about, you need to keep reminding people. I'll post
             | stuff about police brutality on threads that are relevant.
             | It helps remind everyone that they are susceptible to the
             | whims of any rogue policeman at any given time, regardless
             | of social status. Most likely with no recourse.
        
             | snowwrestler wrote:
             | If you're talking about Amazon, those stories did affect
             | Echo sales, and the trajectory of the category overall.
             | When was the last time you read a breathless article about
             | how smart speakers will change everything?
        
               | zibzab wrote:
               | I was thinking of another company, but I am very happy
               | that Amazon bottom line was affected (?)
        
         | failwhaleshark wrote:
         | And it's only a fraction of ubiquitous inverted totalitarianism
         | that pervades modern life. If powerful people decide they don't
         | like you, they have the power to exploit your content, shutoff
         | your access to everything containing software, and, in some
         | cases, have you arrested because they also buy the influence of
         | politicians. Kleptocracy. Plutocracy.
        
         | toxik wrote:
         | This seems fatalist and goes against what we have learned from
         | internet society: making a ruckus is exactly how you sway large
         | companies.
        
           | inanutshellus wrote:
           | I'm with you bud. Fatalism is how bad things survive.
           | 
           | "Oh, nothing will happen to this [corrupt politician|naughty
           | rich guy|corporation doing a thing I don't like], because
           | they're all [bad thing]. It's all over already." :facepalm:
           | 
           | (Also, I couldn't read the original article because my work
           | thinks it's spreading malware or something so I'm really only
           | referring to the fatalism thing here, not Apple.)
        
           | [deleted]
        
       | dmitryminkovsky wrote:
       | Maybe I'm an overly-sensitive person but I really can't get
       | comfortable with a neural network looking over my shoulder,
       | spying on me 24/7.
       | 
       | My parents gave me privacy and treated me with respect when I was
       | a child. Now I'm an adult, and in a way it's like I have less
       | privacy than when I was a kid. And the entities violating my
       | privacy have way more power than my parents.
       | 
       | I want to continue working with technology, but how can I make
       | mass consumer goods (i.e. apps) without being a user myself?
       | These moves are going to slowly force me out of technology, which
       | is sad, because creating with programming is my favorite
       | activity. But life without some semblance privacy is hardly life.
       | 
       | Here's to a slow farewell, Apple! It was a good run.
        
         | Uupis wrote:
         | I could have written this comment myself, albeit less
         | eloquently.
         | 
         | That is a very accurate representation of how I feel about
         | this, too. I enjoy building apps, but I don't know that I can
         | keep using these devices.
         | 
         | I was looking forward to upgrading to the new hardware in the
         | fall, but now I'm not sure I can stomach the implications of
         | buying a new device that may at any point start policing what I
         | do far beyond what I'd accepted at the time of purchase.
        
       | toxik wrote:
       | Please talk to non-tech people around you about this. This
       | overreach simply cannot stand, and from a company that sees
       | success from touting itself as a privacy-centric alternative?
       | These really are some dark times. I imagine that if the world had
       | Stasi fresher in its mind, this would never have happened.
        
         | ColinHayhurst wrote:
         | Privacy is not something Apple, or any company, gives us. But
         | it's something they extract from us; by tracking us and
         | harvesting data. If Apple were serious, as opposed to
         | positioning themselves as less extractive than GFAM and others,
         | they would not be taking this road. Yes, they say it's on
         | iCloud for now, but the signs are that it will be done client
         | side in future. Whether you believe that will happen or not the
         | risks are too high; so it's prudent to assume the worst.
         | 
         | This is this a first step to "Detective in your Pocket",
         | cloaked intentionally, or not, by a well-meaning objective. An
         | objective we can all support. If you wanted to put in a thin
         | wedge on distributed surveillance, where better to start? As
         | pointed out CSAM filtering/scanning is already done so that's
         | not the issue here. There's a big debate to be had, and being
         | had, on the upsides/downsides and benefits/dangers of AI and
         | false positives. That's a huge issue in itself; but that's not
         | the biggest concern with this move. If Apple pushes on with
         | this it's a clear signal that they wish to march us all to a
         | new world order with Distributed Surveillance, as a Service. A
         | march in step with the drumbeat of your increasingly
         | authoritarian (or if you are lucky or more generous, safety
         | conscious) government.
         | 
         | I have signed the letter to express my very strong personal
         | concerns but also as a CEO of a company that takes CSAM
         | seriously and seeks to provide solutions, in search, without
         | surveillance.
        
         | ren_engineer wrote:
         | it still boggles my mind how naive people in the tech industry
         | are, they fall for the same trick again and again over the
         | course of decades. Same people who fell for Google's "don't be
         | evil" and Google pretending to be better than the Evil Empire
         | Microsoft are now shocked that Apple was really only in
         | business for money from the start and the privacy stuff was
         | just a marketing gimmick
        
           | mekal wrote:
           | That's a little cynical though don't you think? Couldn't it
           | be that they were sincere in the beginning but over time,
           | company culture changes, the original founders move
           | on/retire/die/etc and pressure from government, lobbyists,
           | etc add up. I guess its the same result in the end.
        
         | rvz wrote:
         | > Please talk to non-tech people around you about this.
         | 
         | They were already sold into the Apple ecosystem with the iPhone
         | 12's, M1 Macs and iPad Pros. They are going to have a hard time
         | moving to another platform.
         | 
         | Best part? Apple is a huge customer of Google Cloud for storage
         | so you can now tell Google about all the files on your iCloud
         | account. [0]
         | 
         |  _' With privacy in mind.'_ /s
         | 
         | [0] https://appleinsider.com/articles/21/06/29/apple-is-now-
         | goog...
        
           | mosselman wrote:
           | In all fairness, in theory, they could have all that data
           | encrypted with keys that google doesn't have.
        
         | tobyhinloopen wrote:
         | Talking about this with non-tech people might make the distrust
         | in anything they don't understand worse. Think vaccines.
        
           | forcry wrote:
           | Theory of Vaccines is simple. They want you to outsource all
           | your immunity to big pharma. Just like when they took away
           | your guns, and made you outsource your self defence to
           | government. Just like when they outsourced your critical
           | thinking to fact checkers.
           | 
           | You are no good on your own. Submit to us and pay us, and we
           | will make you good.
           | 
           | You are nothing, your immune system is useless, your guns
           | don't fire true, your logic is not sound, without us.
        
             | jazzyjackson wrote:
             | When did they take away my guns? I must have my missed that
             | town hall.
        
           | faeyanpiraat wrote:
           | Yes, most people are not good at persuasion, and a generic
           | mention of the fact may be harmful as you say.
        
           | cle wrote:
           | Intentionally deceiving people is probably not a good
           | strategy for building their trust.
        
           | crocodiletears wrote:
           | Refusing to acknowledge egregious top-down decisions which
           | threaten or violate either one's well-being or some perceived
           | fundamental right has a much worse effect on public trust.
           | 
           | "Why didn't you tell me this was going on?" "You're simply
           | too stupid to handle the idea that your betters might
           | occasionally be untrustworthy"
        
             | zepto wrote:
             | How are you going to argue that it's bad for parents to be
             | informed when strangers send sexual imagery to their
             | children?
        
               | crocodiletears wrote:
               | You argue the applications of the technology further down
               | the line. Remind them how facebook's safety features
               | turned into tools for censorship.
        
               | zepto wrote:
               | Can you think of a time when that kind of argument has
               | ever worked?
               | 
               | I remind you that people made that argument about
               | Facebook and it didn't.
        
               | crocodiletears wrote:
               | Raising awareness won't stop any of this. It's
               | inevitable. We have the technological capacity and
               | institutional interest required to implement it, it will
               | be done, and it will be endemic.
               | 
               | Raising awareness is about letting people know so that
               | they might take the necessary precautions if they
               | consider themselves to be at risk of its abuse, and
               | degrades their faith in the institutions that support it.
        
               | zepto wrote:
               | It doesn't matter whether they are aware. They can't take
               | precautions. There are no technical solutions that people
               | can use, and technologists seem to be uninterested in
               | working on them. Apple's solution is the best on offer.
               | Raising awareness about Facebook's problems hasn't harmed
               | Facebook.
        
               | crocodiletears wrote:
               | They can take precautions by not using iOS devices. The
               | goal isn't to harm Apple, it's to make the knowledge
               | available to those who need it, and have the will to
               | avoid it.
               | 
               | That could mean anything from watching what they put on
               | their iPhone, to putting them down the track of using
               | degoogled android variants on select phones, to ditching
               | cell phones all together depending on their needs.
               | 
               | Knowing that your phone is watching, reporting, and
               | potentially moderating the content in it is information
               | which is valuable in and of itself. Even if it only has
               | utility to a fraction of the population.
               | 
               | We find ourselves in the tightening noose of a latent
               | authoritarian technocratic surveillance society. Few
               | people have anything to fear from it, or the resources to
               | escape it. But some do, and should be given every piece
               | of information that might help them moderate or escape
               | it.
        
               | zepto wrote:
               | Android has been doing this all along, so not using iOS
               | devices will not help.
               | 
               | As I said, there are no good solutions.
        
               | crocodiletears wrote:
               | * drop your phone
               | 
               | * don't store images on your phone
               | 
               | * use a rom like Graphene, don't install a cloud storage
               | or Google Play Services
               | 
               | Those aren't great solutions, but they're options.
        
               | zepto wrote:
               | They are options that almost nobody can or will use, so
               | they won't have any impact.
               | 
               | If your goal is to inform a small minority of expert
               | users that they should protect themselves against
               | corporate/government encroachment, by the looks of the
               | comments here, I'd say you've already succeeded.
        
               | crocodiletears wrote:
               | I think you're missing my point. You do your best to
               | inform the majority not because the majority will enact
               | change based on it, but so that the minute and disparate
               | slices of the population for whom that information is
               | relevant but might not otherwise have been exposed to it
               | can access it and perform or investigate whatever actions
               | they deem necessary and economical to mediate the
               | potential threat.
               | 
               | This forum is too niche to fulfill that purpose on its
               | own.
               | 
               | The mass distribution of the information, and arguments
               | against apple's behavior should be intended to
               | incidentally target relevant niches beyond technically
               | expert circles of which the communicator is unaware.
               | 
               | That the argument and resolution of these issues is
               | irrelevant to most of those who will be exposed to it is
               | immaterial.
        
           | bombcar wrote:
           | "See what Apple is doing is basically vaccinating your iPhone
           | against child porn. That's bad because there's the
           | possibility of side effects, though rare, and you should have
           | the decision yourself as to whether you vaccinate your
           | phone."
           | 
           | Wait, no, that's not the argument ...
        
           | u4ria wrote:
           | Theory of Vaccines is simple.
           | 
           | They want you to outsource all your immunity to big pharma.
           | Just like when they took away your guns, and made you
           | outsource your self defence to government. Just like when
           | they outsourced your critical thinking to fact checkers.
           | 
           | You are no good on your own. Submit to us and pay us, and we
           | will make you good.
        
         | sundvor wrote:
         | The privacy implications are fucking horrible.
         | 
         | As an Android user I'd love to gloat, after all Apple have
         | really had the upper edge on privacy so far, and their users
         | have not been shy about telling us. Again and again.
         | 
         | However any pleasure would be as short lived as peeing your
         | pants to keep warm in winter, because if Apple proceeds, Google
         | is bound to follow.
         | 
         | A user monitoring system like this is just _ripe_ for abuse of
         | the most horrific kinds. It must die.
        
           | thunkshift1 wrote:
           | What if you choose not to upload photos to iCloud?
        
             | snowwrestler wrote:
             | According to Apple, photos are only checked in the process
             | of being uploaded. So if you don't upload, no check either.
        
           | judge2020 wrote:
           | Google Photos runs PhotoDNA on all photos there, so it's no
           | different from Apple scanning photos destined for iCloud
           | photos with similar tech. I guess the only pushback is that
           | apple has decided to do it on-device (still only to photos
           | going to iCloud Photos) instead of server-side, where they
           | have to disable E2EE to do so?
        
             | wayneftw wrote:
             | > I guess the only pushback is that apple has decided to do
             | it on-device.
             | 
             | This is exactly what my problem is. They are using my CPU,
             | battery time and network bandwidth to do this when they
             | should be using their own resources. I know I can turn off
             | iCloud but I am paying for that and we already have an
             | agreement in place.
             | 
             | Honestly, it's the only concrete thing to complain about.
             | Every other complaint is based on a what-if, slippery slope
             | concern.
             | 
             | Of course, as per usual, nobody agrees with me here on
             | HN... but that's fine with me because they simply don't
             | reply which lets me know that they don't have any good
             | arguments against my PoV.
        
               | jazzyjackson wrote:
               | Consider it's not very interesting to argue with someone
               | who is set in their ways, and the lack of counter
               | argument is a lack of interest, not a superiority of
               | viewpoint.
               | 
               | Anyway I agree with you, and have been curious what the
               | neural net hardware on the new iPhone will be used for.
               | Turns out it's for checking my photo library for child
               | abuse, how futuristic!
        
               | wayneftw wrote:
               | > Consider it's not very interesting to argue with
               | someone who is set in their ways, and the lack of counter
               | argument is a lack of interest, not a superiority of
               | viewpoint.
               | 
               | I have yet to hear a good counter argument though. If I
               | had heard one, I would have changed my mind. So, I don't
               | think I'm set in my ways. (I actually change my mind
               | often when presented with new evidence!)
               | 
               | I will always consider downvotes without argumentation to
               | be a lazy way of disagreeing and the people who do that I
               | think _are_ set in their ways. Until I hear otherwise, I
               | will continue to consider my argument as superior.
               | 
               | Does anyone really change their mind until they hear a
               | counter to what they already think? I don't think so... I
               | already argued with myself and came up with my opinion on
               | this, so really - I need external arguments to change my
               | mind just like everyone else.
        
               | stagger87 wrote:
               | You're being downvoted for eschewing website guidelines.
               | 
               | https://news.ycombinator.com/newsguidelines.html
        
               | wayneftw wrote:
               | Oh yeah? Which one?
               | 
               | I kind of doubt it because I've been censored for simply
               | asking very straightforward questions with zero
               | adjectives.
        
               | mrtranscendence wrote:
               | "Please don't comment about the voting on comments. It
               | never does any good, and it makes boring reading."
        
               | wayneftw wrote:
               | Nah. Maybe that was one from today, but it certainly
               | doesn't apply to the comments I've been referring to.
               | 
               | No, what I gather is that many people cannot change their
               | minds when presented with good evidence. Here's one: They
               | complain about "monocultures" as if they're always bad
               | but when I point out that the Linux kernel created a
               | monoculture and the world hasn't imploded, they have no
               | come-back. So they do the only thing that they can do.
               | 
               | It's fine with me, I wear it as a badge of pride because
               | I know I'm right whenever that happens.
        
             | formerly_proven wrote:
             | iCloud Photos are not end-to-end encrypted in the first
             | place.
        
               | judge2020 wrote:
               | This opens the door to doing so.
               | https://news.ycombinator.com/item?id=28081863
        
             | Zak wrote:
             | There's a huge difference between Apple scanning files that
             | are on Apple's servers and Apple putting spyware on your
             | encrypted device.
        
             | notsureaboutpg wrote:
             | >I guess the only pushback is that apple has decided to do
             | it on-device (still only to photos going to iCloud Photos)
             | instead of server-side, where they have to disable E2EE to
             | do so?
             | 
             | Apple will scan only iCloud photos and when a certain
             | threshold of "illegal" images is found, they will start
             | scanning non-iCloud photos and disabling encryption to do
             | so.
        
           | stephen_g wrote:
           | Well, I mean if Gmail has already done this kind of scanning
           | for years (as is reported) then you'd assume Google Photos
           | probably already does too if you sync it to their cloud (as
           | does iCloud on the server side). But yeah, client side is a
           | whole different thing...
        
             | hef19898 wrote:
             | Out of curiosity, how does Samsung Knox handle privacy?
        
         | wintermutestwin wrote:
         | >touting itself as a privacy-centric alternative
         | 
         | There is a difference between selling your data to the highest
         | bidder through a stalker capitalism ecosystem and giving
         | governments carte blanche access.
         | 
         | I am not at all advocating for the latter, but if you are
         | fighting that battle, CP is not the hill to die on.
        
           | [deleted]
        
           | osmarks wrote:
           | This is _why_ it 's a problem - this is the most defensible
           | usecase for it, and any upscaling later will just get seen as
           | a natural progression.
        
             | wintermutestwin wrote:
             | My point was that battle was already lost a long time ago
             | when terrorism was the use case.
        
         | defaultname wrote:
         | Anyone who engaged in such a discussion with a non-technical
         | person is going to defacto seem like an advocate for child
         | pornography, similar to how advocating for encryption can
         | easily be twisted to being pro-crime.
         | 
         | Having said that, there is an _enormous_ amount of
         | misinformation and fear-mongering about a pretty tame change.
         | This seems like so much ado about very close to nothing.
         | 
         | a) They optionally scan messaged photos for nudity using a NN
         | if the participants are children and in a family (the account
         | grouping), and the group adult(s) have opted in, giving
         | children warnings and information if they send or receive such
         | material. A+++ thumbs up.
         | 
         | b) They scan photos that you've _uploaded to iCloud_ (available
         | at photos.icloud.com, unencrypted -- in the E2E sense,
         | effectively  "plaintext" from Apple's perspective -- etc) for
         | known CP hashes. Bizarrely Apple decided to scan these on
         | device as well, causing 99% of the outrage and confusion, yet
         | every major cloud photo service in the world does such checks
         | for the same reason, whether you have the photo set to private
         | or not, and presumably Apple decided to do it on device simply
         | as free distributed computing, taking advantage of hundreds of
         | millions of high performance chips, but most importantly as a
         | PR move demonstrating that "Apple Silicon helps with Child
         | Safety", etc.
         | 
         | That's it. Various "this is a harbinger of doom and tomorrow
         | they're going to..." arguments are unconvincing. This does
         | absolutely nothing to break or subvert E2E encryption or on
         | device privacy.
         | 
         | EDIT: The moderation of this comment has been fascinating,
         | going to double digits, down to negatives, back up again, etc.
        
           | new299 wrote:
           | They were already doing it on the cloud:
           | 
           | https://nakedsecurity.sophos.com/2020/01/09/apples-
           | scanning-...
           | 
           | So now they're doing it on device too. This feels like it's
           | putting in place the foundation to scan all offline content.
        
             | defaultname wrote:
             | Scanning on device (albeit only of photos shared _off_
             | device) seems like an ill-considered PR move for a whole
             | child safety push (perhaps with a  "look at how powerful
             | our iPhone chips are" angle). As you mentioned, they've
             | already been doing these checks for some time on their
             | servers, and people concerned about false positives should
             | realize that Microsoft, Google, Facebook, Amazon et al are
             | doing identical checks with a very similar process.
             | 
             | I imagine there are some frantic meetings at Apple today.
             | However the grossly misleading claims people have been
             | making to fear-monger aren't helpful.
        
             | cwizou wrote:
             | Thanks for the link, I had assumed that Apple was already
             | doing it on servers (like all other online services
             | providers), which makes the announcement even more
             | terrible.
             | 
             | Moving it on device will show 0 improvement to the original
             | goal, while opening a door that quite frankly I never
             | expected Apple to be the one to open (I would have bet on
             | Microsoft).
        
               | daemoon wrote:
               | > Moving it on device will show 0 improvement to the
               | original goal, while opening a door that quite frankly I
               | never expected Apple to be the one to open (I would have
               | bet on Microsoft).
               | 
               | The CSAM scan is only for photos that are to be uploaded
               | to iCloud Photos. Turning off iCloud Photos will disable
               | this.
        
               | cwizou wrote:
               | Sorry if my point wasn't clear, I do understand this yes.
               | 
               | My point is that to my knowledge, this is the first time
               | that an on device "content check" is being done (even if
               | it's just for photos that will end up in iCloud). This is
               | the precedent (the on device check) that makes me and
               | some others uneasy, as pointed out in the linked letter.
               | The fact that it applies only to photos going to the
               | cloud is an implementation detail of the demonstrated
               | technology.
               | 
               | Legislators around the world now have a precedent and may
               | (legitimately) want it extended to comply with their
               | existing or upcoming laws. This is not a particularly far
               | fetched scenario if you consider that Apple has already
               | accommodated how they run their services locally (as they
               | should, they have to comply with local laws around the
               | world in order to be able to operate).
               | 
               | That's the crux of the issue most of the people quoted in
               | the letter have, one can argue it's just a slippery slope
               | argument, I personally think that one can be legitimately
               | concerned of the precedent being set.
               | 
               | Keeping doing it on server, in my opinion, was a much
               | better option for users (with the same compliance to
               | local laws and effectiveness to the stated goal as far as
               | we know, there's no improvement on that front, or none
               | that couldn't have been brought to the existing server
               | check), and ultimately also a safer option in the long
               | run for Apple.
               | 
               | They've opened themselves, for little reason, to a large
               | amount of trouble on an international scale and at this
               | point rolling it back (to server checks) might not make a
               | difference anyway.
        
               | [deleted]
        
           | [deleted]
        
           | roenxi wrote:
           | "They're going to scan your phone and probably have someone
           | review any photos with a lot of human flesh in them" would be
           | enough to get a lot of non-technical users to take notice.
           | 
           | That would get a lot of people nervous. Let alone anyone
           | smart who thinks through the implications here of how far the
           | line is being pushed on how public your phone is.
        
             | daemoon wrote:
             | Simply turning off iCloud Photos will ensure that photos on
             | your iPhone are never scanned. Why are you trying to make
             | this thing about photos stored on-device? Photos in iCloud
             | have always been available to Apple through iCloud backups.
             | If you are concerned about privacy, turn it off.
             | 
             | "And it does so while providing significant privacy
             | benefits over existing techniques since Apple only learns
             | about users' photos if they have a collection of known CSAM
             | in their iCloud Photos account."
        
               | arvinsim wrote:
               | So if an offender turns off iCloud then this move will
               | absolutely be useless??
               | 
               | How would that help catch them if they can simply flip
               | the switch?
        
               | daemoon wrote:
               | > So if an offender turns off iCloud then this move will
               | absolutely be useless??
               | 
               | No on-device photo scanning unless iCloud Photos is
               | enabled. Isn't funny when you get the most important
               | aspect wrong?
               | 
               | > How would that help catch them if they can simply flip
               | the switch?
               | 
               | They never claimed that this would help "catch them" if
               | they are not using iCloud Photos.
        
             | davidcbc wrote:
             | It would, but luckily that's not what's happening
        
               | roenxi wrote:
               | Go read the announcement, the "CSAM detection" heading
               | [0]. It is exactly what they are doing.
               | 
               | Although they're assuring us that they don't make
               | mistakes. The technical term for that is either going to
               | be "blatant deception" or "delusion". Apple are
               | impressive but they haven't developed a tech that can't
               | make mistakes.
               | 
               | [0] https://www.apple.com/child-safety/
        
               | defaultname wrote:
               | That isn't what they're doing at all. You have
               | significantly misunderstood or conflated different
               | sections.
               | 
               | Though I don't blame you at all: Read through the various
               | hysterical posts about this and there are a lot of
               | extraordinary misrepresentations.
        
               | roenxi wrote:
               | Ah, I see what you're getting at. They're currently
               | hashing for specific photos.
               | 
               | I don't care. There is no way on this good earth that law
               | enforcement is going to let them get away with that.
               | They're claiming that they will be scanning things that
               | are obviously child porn and ignoring it. That isn't a
               | long term stable thing to be doing - if they think
               | scanning for anything is ok there is no logical reason to
               | stop here. So they probably aren't going to stop, and
               | they certainly aren't going to announce every step they
               | take to increase the net.
               | 
               | And their 1:1,000,000,000,000 number is still delusional.
               | The system is going to produce false positives. There are
               | more sources of error here than the cryptographic hash
               | algorithm.
        
           | schnable wrote:
           | Apple does a lot of the ML and personalization stuff on user
           | devices for privacy reasons as well, keeping your data out of
           | the cloud, and that is a good thing.
        
           | syshum wrote:
           | I think you are massively under estimating what this change
           | means if you think it is "pretty tame change"
           | 
           | Clearly you do not understand the full ramification of what
           | is happening here.
        
             | defaultname wrote:
             | What are the ramifications that I "do not understand"?
             | 
             | I will repeat: It is a _very_ tame change. Were you frantic
             | and delirious when a neural network first identified a dog
             | in your photos? Isn 't that the slippery slope to it
             | reporting you to the authorities for something bong shaped?
             | 
             | Speaking of which, every bit of fear mongering relies upon
             | a slippery slope fallacy. What is _clearly_ a PR move is
             | somehow actually the machinations of a massive surveillance
             | network. Why? Why would Apple do that?
        
               | syshum wrote:
               | That's not a slippery slope; that's a fully built system
               | just waiting for external pressure to make the slightest
               | change.
               | 
               | Apple's changes would enable such screening, takedown,
               | and reporting in its end-to-end messaging. The abuse
               | cases are easy to imagine: governments that outlaw
               | homosexuality might require the classifier to be trained
               | to restrict apparent LGBTQ+ content, or an authoritarian
               | regime might demand the classifier be able to spot
               | popular satirical images or protest flyers.
               | 
               | [1] https://www.eff.org/deeplinks/2021/08/apples-plan-
               | think-diff...
        
               | defaultname wrote:
               | We can carry every single element back to its inception
               | and make _identical_ arguments. That 's the problem with
               | slippery slopes.
               | 
               | Apple - creates messaging platform.
               | 
               | Slippery slope - governments can force Apple to send them
               | all messages.
               | 
               | Apple - creates encrypted messaging platform.
               | 
               | Slippery slope - governments can force Apple to send them
               | the keys and all messages
               | 
               | Apple - Adds camera to device (GPS, microphone,
               | accelerometer, step detection, etc)
               | 
               | Slippery slope - governments can force them to record
               | whenever the government demands and send a live stream to
               | the government. Or send locations, or walking patterns,
               | or conversations.
               | 
               | Apple - makes operating system
               | 
               | Slippery slope - Basically anything. There are so many
               | ways I could go with this. Government can force them to
               | make a messaging and photo system, add cameras to their
               | devices, entice users to take and accumulate pictures,
               | and do image hashing and report suspect photos.
               | 
               | That's the problem with slippery slope arguments. They
               | become meaningless rhetoric.
               | 
               | Image perceptual hashing is literally a single person's
               | work for two hours. If you _really_ think the barrier
               | between an oppressive government stomping on groups or
               | not is whether Apple did a (poorly communicated) Child
               | Safety PR move and implemented this _trivial_
               | mechanism...the world is a lot more perilous and scary
               | than you think.
        
               | rdedev wrote:
               | Do keep in mind that atleast one govt has successfully
               | preassured apple to give up on its privacy
               | 
               | Also the difference here compared to the scenarios you've
               | mentioned above is that Apple has walked pretty far. All
               | it would take is to make the verification happen on all
               | local files irrespective of its being uploaded to iCloud
               | or not. Then any government can provide their own hashes
               | to apple to keep track of. Apple won't have any idea what
               | those hashes actually mean
        
               | defaultname wrote:
               | "Do keep in mind that at least one govt has successfully
               | pressured apple to give up on its privacy"
               | 
               | No company can defend you from your government.
               | 
               | "All it would take"...
               | 
               | That is the slippery slope. If a government is going to
               | say "that's a nice looking hashing system you have there,
               | now we need you to..." they could as easily -- _more_
               | easily -- have said  "that's a nice filesystem you have
               | there, we need you to...".
               | 
               | Hashing files and comparing them against a list is
               | literally a college grad afternoon project. There is
               | absolutely nothing in Apple's announcement that empowers
               | any government anywhere in any meaningful way at all. It
               | is only fear-mongering (or simply raw factual errors as
               | seen throughout this discussion) that makes it seem like
               | it does.
        
               | rdedev wrote:
               | Sure no company can completely defend me from my govt but
               | atleast they can not build tools that make it easier.
               | Also while it could be easy for a college graduate to
               | build such a system, only Apple has the capability of
               | rolling out this system to all of their phones. Otherwise
               | we would have already seen such a system implemented in
               | other countries
        
               | defaultname wrote:
               | "only Apple has the capability of rolling out this
               | system"
               | 
               | Right. Exactly. Any country in the world can _mandate_
               | that Apple do anything they want (any of the slippery
               | slope mandates), and Apple can comply or withdraw from
               | the market. If any country wanted to demand that Apple
               | compare all files against a ban list, they could have
               | done that in 2007, or any year since. There is zero
               | technical barrier and it would be a trivial task.
               | 
               | The point is that this development moves the bar
               | infinitesimally. I would argue not at all. Fearmongering
               | that depends upon "Well what if..." didn't actually think
               | it through very well.
        
               | rdedev wrote:
               | > "only Apple has the capability of rolling out this
               | system"
               | 
               | I guess this is where I differ with you. No government
               | was able to preassure apple to implement a complete
               | client side verification till now but who knows how
               | things will be now that they have a system in place that
               | can be easily modified.
               | 
               | Anyways I hope your prediction turns out right. I live in
               | a place where privacy laws don't really exist. The last
               | thing I want is any system that can be easily exploited
               | by authorities to crush dissent.
        
               | syshum wrote:
               | That is because you are incorrectly using the the charge
               | of the Slippery slope fallacy to hand wave away
               | legitimate concerns by reducing them to an absurdity
               | which is itself a logical fallacy
               | 
               | The legitimate concerns here are not a slippery slop,
               | they are real and self evidence born from countless
               | examples through out history of these types of survelence
               | systems being abused. EFF points to a coulple, other
               | artiles point to other examples
               | 
               | It takes a person an extremely dense person (or
               | intellectually dishonest) to simply hand wave all of
               | these concerns away as "well that is just a slippery
               | slope so I can ignore you"
        
               | defaultname wrote:
               | I guess I'm just extremely dense.
               | 
               | I see Apple implement a _trivial_ (laughably trivial)
               | system -- a tiny little pimple on a massive platform of
               | hardware and software -- and I don 't think "this is it!
               | This is what puts the surveillance state over the top!",
               | I think "Hey look, trivial, constrained system".
               | 
               | But if you believe that this was what was between a
               | surveillance state or not...well you must be much more
               | intellectually capable than I and I am honored to be in
               | your presence.
               | 
               | It is the very definition of a slippery slope. A journey
               | of a thousand miles begins with a single step, but
               | usually you're just walking to the bathroom.
        
               | evrydayhustling wrote:
               | > Why would Apple do that?
               | 
               | Because it gets required to by the laws of countries
               | responsible for most of their market? And because
               | authoritarian regimes intentionally blur the lines
               | between criminal and political surveillance over time,
               | making it harder for companies to draw hard policy lines?
               | Concern about this doesn't require any bond villains, it
               | just requires well-intentioned pragmatists on one side
               | and idealogical politicos on the other.
               | 
               | FWIW, I think you have a point about the doom-saying.
               | Countries with good judicial protections around privacy
               | already use it as a backstop against dirty tricks where
               | folks use one type of surveillance to require another.
               | But it makes sense to wonder how those barriers will
               | erode over time, and to worry about places where they
               | don't exist.
        
               | dwaite wrote:
               | In which case how is this a slippery slope? They didn't
               | do something and there was no legal mandate. Now they are
               | required to do something to be able to operate in said
               | company. Is the slippery slope that they can be in
               | compliance faster?
        
               | robertlagrant wrote:
               | > What is clearly a PR move
               | 
               | I don't know what I think about the move, but I know
               | this: intentions don't matter. Capabilities matter. If
               | the current intention is benign, that means little.
        
           | silverpepsi wrote:
           | Why does everyone mention they already did it on cloud like
           | that has any relevance whatsoever?
           | 
           | I have never once in my life thought about activating an
           | automatic back up to cloud feature on any phone I have ever
           | owned, for a single second. So yes, it is hella different.
           | This is for all the same reasons I backup personal data only
           | to my NAS and use cloud accounts for generic shit like
           | purchased music backups and nothing more.
           | 
           | I prefer losing all my photos if my phone is pickpocketed in
           | between backups to having a public record of everything I
           | ever photographed. Am I the 0.00000001% or something? I
           | didn't even realize I was the odd man out, honestly.
        
             | defaultname wrote:
             | "Why does everyone mention they already did it on cloud
             | like that has any relevance whatsoever?"
             | 
             | Given that this only applies to photos that are stored to
             | the cloud, it seems like it has total relevance given that
             | for users literally _nothing_ has changed. To argue that
             | there is some fearsome new development requires one to
             | extrapolate into  "Well what if..." arguments.
        
           | noasaservice wrote:
           | This on-device scanning is even worse. They cant even tell
           | what was violating, or what hash, or what image. Just that
           | the computer said you are violating.
           | 
           | We have no idea about the hash collisions. When talking about
           | whole world, 2^256 isn't a big enough space.... even if
           | they're using 256 bits.
           | 
           | And how dare anybody criticize this - criticism is tantamount
           | to being for child porn. (Then again, that's why it was
           | chosen. We'll soon see other things 'forbidden'.)
        
             | defaultname wrote:
             | There is a shared (among all of the major tech companies
             | and presumably law enforcement) hash database of child
             | abuse material. Going from a photo to the hash is
             | deterministic: How it gets to a hash on your phone is
             | surely the same way it gets to a hash running the exact
             | same algorithm on the cloud, whether iCloud, Amazon Photos,
             | etc.
             | 
             | Such a collision would generate a human validation.
             | 
             | This applies to cloud-shared files. It actually has applied
             | to cloud shared photos for years. It applies to literally
             | every major cloud photo service.
        
               | noasaservice wrote:
               | 1. how many false positives have there been?
               | 
               | 2. is it _really_ reviewed by a human? I see how YT
               | works, and automated failure at scale is the name of the
               | game
               | 
               | 3. apple has said that the on-device scanning only
               | provides a binary yes/no on CP detection. How do you
               | defend against a "yes" accusation? (when stored on
               | someone else's server, the evidence is there)
        
               | dwaite wrote:
               | This is part of iCloud photo sync, and on hitting some
               | threshold of matching pictures it would trigger human
               | review.
               | 
               | There would also presumably be human review involved in
               | the legal process, e.g. law enforcement getting a
               | subpoena based on Apple notifying them, and then using
               | gathered evidence for a warrant.
               | 
               | The system is based on known image hashes, not arbitrary
               | ML detection.
               | 
               | As this system is used only for iCloud photo uploads, the
               | evidence gathering should be similar to that done by LE
               | with other cloud hosting providers for years
        
             | mrtranscendence wrote:
             | 2*256 is a _huge_ space. If everyone in the whole world had
             | 50,000 images stored in iCloud, the probability of a
             | collision would still be infinitesimally, unimaginably
             | small.
        
               | mousepilot wrote:
               | 2*256 is 512. you should probably type 2^256.
               | 
               | lol picky I know
        
           | bambax wrote:
           | > _a) They scan photos for nudity using a NN if the
           | participants are children and in a family (the account
           | grouping), giving children warnings and information if they
           | send or receive such material. A+++ thumbs up._
           | 
           | Leave my kids alone.
           | 
           | If they want to share photos of themselves naked, it's none
           | of anyone's business except them and maybe me (maybe),
           | certainly not a huge American corporation.
           | 
           | Neither me or my kids have iPhones, but as others have
           | observed, I have no illusions that Google will follow suit.
           | Our options are becoming pretty limited at this point.
        
             | barkerja wrote:
             | Hopefully this is another configurable option that falls
             | under the already very extensive family screen time
             | feature. I understand where you're coming from and respect
             | your position, but I fall on the opposite side. This is
             | something I do want for my kid.
        
               | belter wrote:
               | You want Apple employees going trough naked photos of
               | your kid and deciding if its child porn or not because
               | the algo flagged it? Because that is what this means.
        
               | defaultname wrote:
               | No, it doesn't. You have conflated two entirely different
               | systems.
        
               | belter wrote:
               | What would the two different systems be then?
               | 
               | I am certainly willing to agree I conflated them if you
               | clarify.
        
               | defaultname wrote:
               | One system optionally checks iMessages (to be sent or
               | received) against one or more "nudity" neural networks if
               | the user is identified as a child in the iCloud family
               | sharing group. If it triggered, the child is given
               | information/opt out, optionally choosing to go ahead with
               | sending or viewing, with the caveat that their parent(s)
               | will be notified (optionally, I presume). Nothing about
               | this event is sent off device. Apple employees aren't
               | receiving or evaluating the photos. No one other than the
               | identified parent(s) is party to this happening.
               | 
               | Neural networks are from perfect, and invariably parents
               | and child are going to have a laugh when it triggers on a
               | picture of random things.
               | 
               | The _other_ , completely separate system checks files
               | stored in iCloud photos against a hash list of known,
               | identified child abuse photos. This is already in place
               | on all major cloud photo sites.
               | 
               | Apple clearly wanted to roll out the first one but likely
               | felt they needed more oomph in the child safety push so
               | they made a big deal about expanding the second system to
               | include on-device image hash functionality. I would not
               | be surprised at all if they back down from the latter
               | (though 100% of that functionality will still happen on
               | the servers).
        
               | belter wrote:
               | There are two systems and I will comment first on the
               | second one. The second system you were referring to, as
               | in the 'other', is the CSAM. And by the funcionality
               | description, already sounds terrifying enough.
               | 
               | You will be flagged, reported to the authorities under
               | technical argumentation the Algorithm has a "one in a
               | trillion" chance of failure. Account blocked, and you
               | start your "guilty until proven innocent process" from
               | there.
               | 
               | Due to scale at what Apple works its also clear to see,
               | if you think about it, that the additional human process
               | will be on a random basis. The volume would be too high
               | for a human chain to validate each flagged account.
               | 
               | In any case at multiple occasions, it is clear that the
               | current model of privacy with Apple is that there is no
               | privacy. It is a tripartite between you, the other person
               | you interact with, and Apple algorithms and human
               | reviewers. Is there any difference between this, and
               | having a permanent video stream from what is happening
               | inside each division in your house, analyzed by a "one in
               | a trillion neural net algorithm", and additionally
               | reviewed by a human on a need to do basis ?
               | 
               | CSAM
               | 
               | https://www.apple.com/child-
               | safety/pdf/CSAM_Detection_Techni...
               | 
               | "Using another technology called threshold secret
               | sharing, the system ensures that the contents of the
               | safety vouchers cannot be interpreted by Apple unless the
               | iCloud Photos account crosses a threshold of known CSAM
               | content. Only when the threshold is exceeded does the
               | cryptographic technology allow Apple to interpret the
               | contents of the safety vouchers associated with the
               | matching CSAM images."
               | 
               | "The threshold is selected to provide an extremely low (1
               | in 1 trillion) probability of incorrectly flagging a
               | given account. This is further mitigated by a manual
               | review process wherein Apple reviews each report to
               | confirm there is a match, disables the user's account,
               | and sends a report to NCMEC. If a user feels their
               | account has been mistakenly flagged they can file an
               | appeal to have their account reinstated."
               | 
               | So in other words, there is no doubt for this one there
               | will be intervention, of Apple employees when required.
               | For training purposes, for system testing, for law
               | enforcement purposes etc...
               | 
               | I guess these and other functionality was the reason the
               | decided not to encrypt the icloud backups
               | 
               | "Apple dropped plan for encrypting backups after FBI
               | complained" https://www.reuters.com/article/us-apple-fbi-
               | icloud-exclusiv...
               | 
               | Now concerning the second feature/system called:
               | "Expanded Protections for Children"
               | 
               | "https://www.apple.com/child-
               | safety/pdf/Expanded_Protections_..."
               | 
               | After reviewing what I believe to be every single
               | document published so far by Apple on this, I found only
               | one phrase and nothing more detailing how it works.
               | Looking at the amount of detail published for CSAM, the
               | lack info on this one already looks like a redflag to me.
               | So I am not really sure how you can be so certain of the
               | implementation details.
               | 
               | The phrase is only this: "Messages uses on-device machine
               | learning to analyze image attachments and determine if a
               | photo is sexually explicit. The feature is designed so
               | that Apple does not get access to the messages."
               | 
               | Nothing more I could find. If you have additional details
               | please let me know.
               | 
               | This is a feature that will be added in a future update
               | there some conclusions from my part. If we just stay with
               | the phrase "The feature is designed so that Apple does
               | not get access to the messages." I have to note its
               | missing the same level of detail that was published for
               | CSAM.
               | 
               | I can only conclude that:
               | 
               | - They do not get access to the messages currently due to
               | the current platform configuration. Note they did not say
               | the images will stay on the phone currently or in the
               | future. Just that uses local neural nets technology and
               | feature is designed, so that Apple does not get access to
               | the messages.
               | 
               | - They did not say they will not update the feature in
               | the future for leveraging the icloud compute capabilities
               | 
               | - They did not say if there is any opt in or opt out for
               | data for their training purposes
               | 
               | - They do not say if they can update locally the
               | functionality at the request of law enforcement.
               | 
               | I agree with you that it looks like it stays locally by
               | the description, but the phrase as written looks like
               | weasel words.
               | 
               | They also mention one of the scenarios: Where the child
               | will agree to send the photo to the parent before
               | viewing, would that be device to device communication or
               | via their icloud/CSAM system ? Unclear, at least for what
               | I could gather so far.
        
               | defaultname wrote:
               | On the Child Safety page one of the dialogs is the opt in
               | (or out) configuration, so it seems, as one would expect,
               | that the adult(s) in the family sharing group get to
               | configure this.
               | 
               | And it's a useful, valuable option that many (I would
               | wager the overwhelming majority) parents will enable.
               | 
               | Apple made a huge PR mistake announcing both of these
               | systems together (the CP hashing system and the NN
               | message warning system), because as seen throughout the
               | comments it has led to lots of people conflating and
               | mixing and matching elements of both into a fearsome
               | frankensystems.
        
               | wizzwizz4 wrote:
               | The NN is the system I thought they were making, and I
               | applaud it. The hashing one feels really dangerous,
               | though; I don't think that's people just exaggerating.
               | Apple hasn't done enough to limit their own power, so
               | they might (read: will) be made to use it to hurt people.
        
           | stephen_g wrote:
           | I guess the upside might be that this could be a compromise
           | for them to start doing end to end encryption on iCloud
           | backups and iCloud photo libraries. They might be able to
           | argue that if they're scanning for illegal content on the
           | client side, then they don't need to be able to decrypt on
           | the cloud side... We'll have to see, it's still creepy but
           | potentially a small net gain overall if that becomes an
           | option...
        
             | tomjen3 wrote:
             | If they claim e2e, but we scan on the client side then we
             | should sue them for deceptive marketing.
             | 
             | e2e means something specific and should be an absolute
             | requirement in a post Snowdon age, not something that is
             | optional or a compromise.
        
           | tomjen3 wrote:
           | The problem with this is that they can now scan images on
           | phones regardless of upload, and it will render any future
           | promisses of e2e iCloud deceiving.
           | 
           | I don't see a reason to do this, other than to either scan
           | all images, or claim that iCloud is e2e in the future.
           | 
           | And of course they went with the protection of children
           | argument, which is bullshit. Apple gets paid by its users and
           | should have no other interests than to get paid as much money
           | as possible, regardless of who pays them.
        
             | zepto wrote:
             | > Apple gets paid by its users and should have no other
             | interests than to get paid as much money as possible,
             | regardless of who pays them.
             | 
             | That's exactly why they are doing this. Providing a safe
             | haven for child sex predators is bad for their brand.
        
       | donkeyd wrote:
       | I recently listened to a Darknet Diaries episode on messaging app
       | Kik. This app is apparently being used by many people to trade
       | child pornography. In this episode, there was some criticism
       | expressed on how Kik doesn't scan all the images on their
       | platform for child pornography.
       | 
       | I would really like to hear from people who sign this open
       | letter, how they think about this. Should the internet be a free
       | for all place without moderation? Where are the boundaries for
       | moderation (if it has to exist), one-on-one versus group chat,
       | public versus private chat?
       | 
       | To quote this open letter: "Apple is opening the door to broader
       | abuses". Wouldn't not doing anything open a different door to
       | broader abuse?
       | 
       | Edit: I would really love an actual answer, since responses until
       | now have been "but muh privacy". You can't plead for unlimited
       | privacy and ignore the impact of said privacy. If you want
       | unlimited privacy, at least have the balls to admit that this
       | will allow free trade of CSAM, but also revenge porn, snuff-films
       | and other things like it.
        
         | throwrqX wrote:
         | The boundary should be that user generated personal data that
         | is on a user device should stay personal on a user device
         | unless explicit consent is otherwise given. That's it. The
         | argument is always well "don't you want to save the children"
         | to which I give this example.
         | 
         | In the USA guns are pretty freely available, relative to other
         | countries. There is a huge amount of gun violence (including
         | shooting at schools targeting children) yet every time major
         | gun restriction legislation is introduced it fails, with one
         | major reason being the 2nd amendment of the US constitution.
         | This could again be amended, but sufficient support is not
         | there for this to occur. What does this say about the US?
         | 
         | They have determined, as a society, that the value of their
         | rights is worth more than all the deaths, injuries and pains
         | caused by gun violence. A similar argument can be made
         | regarding surveillance and child porn, or really any other type
         | of criminal activity.
        
           | thesuperbigfrog wrote:
           | >> The boundary should be what is on a user device should
           | stay personal on a user device.
           | 
           | But how many apps only store the locally on the device versus
           | sending data to the cloud, getting data from the cloud, or
           | calling cloud APIs to get functionality that cannot be
           | provided on device?
           | 
           | Having the power of a personal computing device is huge, but
           | having a networked personal computing device is far greater.
           | 
           | Keeping everything on-device is a good ideal for privacy, but
           | not very practical for networked computing.
        
             | throwrqX wrote:
             | Ah, I was talking specifically about user generated
             | personal information. I have edited my post to make it
             | clearer.
        
         | snowwrestler wrote:
         | The conversation today is not really about PhotoDNA (checking
         | image hashes against known bad list). That ship has sailed and
         | most large tech companies already do it. Apple will do it one
         | way or another. It's a good way to fight the sharing of child
         | porn, which is why it is so widely adopted.
         | 
         | The question is whether it is worse to do it on-device, than on
         | the server. That's what Apple actually announced.
         | 
         | I suspect Apple thought doing it on-device would be better for
         | privacy. But it feels like a loss of control. If it's on the
         | server, then I can choose to not upload and avoid the
         | technology (and its potential for political abuse). If it's on
         | my locked and managed mobile device, I have basically no
         | control over when or which images are getting checked, for
         | what.
        
         | sebzim4500 wrote:
         | Fundamentally there is a difference between scanning files on
         | your servers and scanning files stored on someone else's
         | device, without their consent.
        
           | avnigo wrote:
           | Not that I wholly agree with Apple's move due to other
           | implications, but it has to be said that only photos that you
           | have chosen to upload to iCloud will be scanned.
           | 
           | So, in a sense, you do consent to having those photos scanned
           | by using iCloud. Maybe it's time for Apple to drop the
           | marketing charade on privacy with iCloud, since iCloud
           | backups aren't even encrypted anyway.
        
         | ianhawes wrote:
         | Many people share photos of child pornography via the mail.
         | There has been criticism that the USPS does not open all mail
         | and scan them for child pornography.
         | 
         | I would really like to hear from people who do not sign this
         | letter how they think about that.
        
           | donkeyd wrote:
           | I think it would be horrible if they opened all mail...
           | However, if they had a system that could detect CSAM in
           | unopened mail with an extremely low false negative rate, then
           | I'd be fine with it.
           | 
           | With many packages this already happens in the search for
           | drugs and weapons and I have no problem with that either.
        
             | stevenicr wrote:
             | I believe millions more nudes are sent via
             | snap/watsap/etcetc every day then are sent in the mail in a
             | year.. with "extremely low false negative rate" - if that
             | means there are a bunch of geeks and agents staring at your
             | wife/gf.. your daughter/ etc.. is that okay? Some people
             | will say yes - but I don't think those people should get to
             | decide that all the other people in the world have to hand
             | over all the nudes and dick pics everyone else has taken.
             | 
             | when they open mail with drugs and weapons it's most likely
             | not a sculpture of your wife's genitals / daughter's tits,
             | pics of your hemorrhoids / erectile implant - whatever
             | other private pic stuff that no one else should be peering
             | at.
             | 
             | If they were opening and reading every written word sent
             | and received - I can guarantee you people would be up in
             | arms about it.
        
               | donkeyd wrote:
               | > I believe millions more nudes are sent via
               | snap/watsap/etcetc every day then are sent in the mail in
               | a year.
               | 
               | I know for sure this is true, but this BS scenario was
               | still used as a rebuttal against my comment, so I figured
               | I'd give them an actual answer.
               | 
               | > when they open mail with drugs and weapons it's most
               | likely not ... other private pic stuff that no one else
               | should be peering at.
               | 
               | People order a lot of stuff that is very private online.
               | From self help books, to dilos (which show up clearly on
               | x-ray), to buttless chaps. If anything, packages are
               | probably more private than mail these days, since people
               | only mail postcards and bills.
               | 
               | > If they were opening and reading every written word
               | sent and received - I can guarantee you people would be
               | up in arms about it.
               | 
               | They would. But this is completely incomparable to what's
               | happening on iOS devices now.
        
               | stevenicr wrote:
               | Maybe some crossed wires in the discussion - I thought
               | your initial reply was in regard to what gp wrote about
               | letters - and then you were saying about xrays and
               | weapons and drugs.. which generally are not an issue with
               | letters..
               | 
               | think I brought too many examples in basically trying to
               | say flat paper.. yeah I think most people assume
               | boxes/packages are xrayed/scanned by usps - and people do
               | still send stuff they would not want public that way.
               | 
               | we are in agreement in that "They would. But this is
               | completely incomparable to what's happening on iOS
               | devices now."
        
             | ls65536 wrote:
             | I suppose one important distinction here, if this makes any
             | difference, is that drugs and weapons (to use your
             | examples) are physical items, and these could arguably
             | cause harm to people downstream, including to the postal
             | system itself and to those working in it. In contrast,
             | photos, text, written letters, and the transmissions of
             | such are merely information and arguably not dangerous in
             | this context (unless one is pro-censorship).
        
               | judge2020 wrote:
               | The proliferation of CSAM is extremely harmful to the
               | victims in the photos and more people seeing them might
               | encourage more CSAM production in general.
        
               | ls65536 wrote:
               | I suppose I should clarify my point that I was referring
               | to those dangerous items in the mail in the sense of
               | their capacity to directly cause physical harm to those
               | handling or receiving them, rather than in the more
               | general sense of the societal effects of their
               | proliferation, which is something else altogether (to
               | your point).
               | 
               | To be clear, I'm not disagreeing with you in regards to
               | the harm caused in the specific case of CSAM, but I can't
               | help but see this as a slippery slope into having the
               | ability in the future to label and act on other kinds of
               | information detected as objectionable (and by whom one
               | must also ask), which is itself incredibly harmful to
               | privacy and to a functioning free society.
        
           | gordon_gee123 wrote:
           | They already scan for bombs and hazardous materials, so yes
           | the line is drawn somewhere between 'let anything get sent'
           | and 'track everything'
        
           | emodendroket wrote:
           | Not sure I'm going to go all in on unqualified support, but
           | it seems like comparing an image to a series of known hashes
           | is qualitatively somewhat different than the postal inspector
           | open all your mail. Though those convenient little scans
           | they'll send you if your mail if you sign up suggests they do
           | have some interest in who sends you mail already.
        
         | simion314 wrote:
         | My personal opinion is if we could spy on all citizens all the
         | time we could stop all or most of the crimes, do we want this?
         | If you say Yes then stop reading here. Else if you say 100%
         | surveillance is too much then what do you have against people
         | discussing where the line should be drawn?
         | 
         | Some person that would sign that letter might be fine with
         | video cameras in say a bank or some company building entrance
         | but he is probably not fine with his own phone/laptop recording
         | him and sending data to soem company or government without his
         | consent.
         | 
         | So let's discuss where should the line be drawn, also if
         | competent people in this domain are present let's discuss
         | better ideas on preventing or catching criminals, or even for
         | this method let's discuss if it can be done better to fix all
         | the concerns.
         | 
         | What I am sure is that clever criminals will not be affected by
         | this, so I would like to know if any future victim would be
         | saved (I admit I might be wrong, so I would like to see more
         | data from different countries that would imply the impact of
         | this surveilance)
        
           | croes wrote:
           | This only finds known pictures of child abuse not new ones
           | and especially it doesn't find the perpetrators or prevent
           | the abuse.
           | 
           | But it creates a infrastructure for all other kind of
           | "criminal" data. I bet sooner or later governments want to
           | include other hashes to find the owners of specific files.
           | Could be bomb creation manuals, could be flyers against a
           | corrupt regime. The sky is the limit and the road to hell is
           | paved with good intentions.
        
             | heavyset_go wrote:
             | > _This only finds known pictures of child abuse not new
             | ones_
             | 
             | No, it finds images that have _similar_ perceptual hashes,
             | typically using the hamming distance or another metric to
             | calculate differences between hashes.
             | 
             | I've built products that do similar things. False positives
             | are abound with perceptual hashes, especially when you
             | start fuzzy matching them.
        
               | croes wrote:
               | But these are just variations of known pictures to
               | recognize them if they cropped or scaled. The hash of a
               | really new picture isn't similar to the hash of a known
               | picture.
        
               | heavyset_go wrote:
               | Perceptual hashes have collisions, and it is entirely
               | possible for two pictures that have nothing to do with
               | each other to have similar perceptual hashes. It's
               | actually quite common.
        
         | stevenicr wrote:
         | "Apple is opening the door to broader abuses". Wouldn't not
         | doing anything open a different door to broader abuse?
         | 
         | Actually I believe that in regards to "not doing anything open
         | a different door to broader abuse" - no. Starting scans will
         | lead to broader, and no tin foil hat needed.
         | 
         | If we compare 1-apple starts scanning for known hashes, not
         | "looking at all your naked pics and seeing if you've got
         | something questionable"... this is just looking for known /
         | already created by others / historical things... by doing a
         | scan for one thing - they open the pandoras box to start
         | scanning for other things - and then they will be compelled by
         | agents to scan for other things - and I believe that is
         | broader, much.
         | 
         | Next month it will be scan for any images with a hash that
         | matches a meme with fauci - the current admin has already
         | stated that in their desire to stop 'disinformation' they want
         | to censor sms text messages and facebook posts (assuming also
         | fbk DMs and more).
         | 
         | There is a new consortium of tech co's sharing a list of bad
         | people who share certain pdfs and 'manifestos' or something
         | like that now right? Might as well scan for those docs too, add
         | all them to the list.
         | 
         | What power this could lead to - soon the different agencies
         | want scans for pics of drugs, hookers.. how about a scan for
         | those images people on whatsapp are sharing with the black guy
         | killing a cop with a hood on?
         | 
         | What happens when a new admin takes the house and white house
         | and demands scans for all the trumped a traitor pics and make a
         | list?
         | 
         | See this is where the encryption backdoors go.. and where is
         | that line drawn? Is it federal level agencies that get to scan?
         | can a local arizona county sheriff demand a scan of all phones
         | that have traveled through their land/air space?
         | 
         | Frankly, public chats, public forums.. if you post kids or
         | drugs or whatever is not legal there - then it's gonna get the
         | men with guns to take note. What you do in private chats / DMs,
         | etc - I think should stay there and not go anywhere else.
         | 
         | I don't like the idea that Msoft employees look at naked pics
         | of my gf that are added to a pics folder because someone setup
         | a win system without disabling one drive. So I don't use one
         | drive and tell others to not use it - and not put pics of me
         | there or on fbk or tictoc.
         | 
         | For all those people that have nothing to hide - I feel sorry
         | for you - but wonder if your kids/grandkids should have
         | thousands of agents looking into their private pics just to
         | make sure there is nothing not legal there.
         | 
         | so would these scans get nudes sent through whatsapp? That
         | would kill the encryption thing there kind of.
         | 
         | Would this scan get a cach if someone was using a chat room and
         | some asshat posted something they shouldn't - and every person
         | in the chat got a pic delivered to their screen. so many
         | questions.
         | 
         | I also question what the apple scans would scan as far as
         | folders and what not.. would it scan for things in a browser
         | cache? like not purposefully downloaded.. if someone hit a page
         | that had a setup image on it - would that person now be flagged
         | for inspection / seizing?
         | 
         | If they are working with nice guys in Cali that just want to
         | tap people on the shoulder and have a talk with people - will
         | they send flagged notices to agents in other places who may go
         | in with guns drawn and end up killing people?
         | 
         | I'm sure many people are fine with either outcome - I think
         | there is a difference between someone surfing the web and
         | someone who coerces real world harm, and not all those who surf
         | the web deserve bullets to the chest, and there is no way to
         | control that.. well maybe in the uk where cops aren't allowed
         | to have guns maybe.
        
         | digitalsushi wrote:
         | The big reason I don't want someone else scanning my phone (and
         | a year later, my laptop) is that last I checked, software is
         | not perfect, and I don't need the FBI getting called on me
         | cause my browser cache smelled funny to some Apple PhD's
         | machine-learning decision.
         | 
         | It's the same reason I don't want police having any
         | opportunistic information about me - both of these have an
         | agenda, and so innocent people get pulled into scenarios they
         | shouldn't be when that information becomes the unlucky best-fit
         | of the day.
         | 
         | That Apple has even suggested this is disturbing because I have
         | to fully expect this is their vision for every device I am
         | using.
         | 
         | And then I expect from there, it's a race to the bottom until
         | we treat our laptops like they are all live microphones.
        
           | robertoandred wrote:
           | The chance of you getting enough false positives to reach the
           | flagging threshold is one in a trillion. And then the manual
           | review would catch the error easily. The FBI won't be called
           | on you.
        
             | JohnFen wrote:
             | > and then the manual review would catch the error easily.
             | 
             | How do we know this? It's not obvious to me that this
             | system would work well for the edge cases that it seems
             | intended for.
        
             | silverpepsi wrote:
             | Manual review. So in other words they're going to go
             | through your private photos unbeknownst to you to decide
             | you're innocent. And you're never even going to know it
             | happened. Wonderful.
             | 
             | What if it really was a very private photo?
        
             | chopin wrote:
             | That's a claim by Apple. Due to the opaque process this is
             | completely unverifiable.
             | 
             | As well this statement relies on the fact that no malicious
             | actor out there is trying to thwart the system. The hashes
             | used are deliberately chosen to easily produce collisions
             | otherwise the system won't work. This almost certainly will
             | be abused.
        
               | snowwrestler wrote:
               | The concept of checking images against a hash list is not
               | unique to Apple or even new.
               | 
               | https://en.m.wikipedia.org/wiki/PhotoDNA
               | 
               | What Apple announced is a way to do it on the client
               | device instead of the server. That has some security
               | implications, but they're more specific than just "hashes
               | might collide."
        
               | heavyset_go wrote:
               | They're not just checking against a hash list, they're
               | using perceptual hashing, which is inexact and unlike
               | cryptographic hashing or checksumming. Then, they use a
               | fuzzy metric like hamming distance to determine if one
               | image is a derivative of an illegal image.
               | 
               | The problem is that the space for false positives is huge
               | when you use perceptual hashing, and it gets even larger
               | when you start looking for derivative images, which they
               | have to do otherwise criminals would just crop or shift
               | color channels in order to bypass filters.
        
               | robertoandred wrote:
               | "This almost certainly will be abused." is a claim by you
               | and completely unverifiable. Apple has published white
               | papers explaining how their hashing, matching, and
               | cryptography work. How do you see someone thwarting that
               | process specifically?
        
               | JohnFen wrote:
               | Easy -- by having the authorities expand what is being
               | searched for.
        
               | drampelt wrote:
               | Apple's claims on how it will work are also completely
               | unverifiable. What's stopping a government from providing
               | Apple with hashes of any other sort of content they
               | dislike?
        
               | robertoandred wrote:
               | And then Apple reviews the account and sees that what was
               | flagged was not CSAM. And again, the hashes aren't of
               | arbitrary subject matter, they're of specific images.
               | Using that to police subject matter would be ludicrous.
        
               | drampelt wrote:
               | How would Apple know what the content was that was
               | flagged if all they are provided with is a list of
               | hashes? I completely agree it's ludicrous, but there are
               | plenty of countries that want that exact functionality.
        
               | gordon_gee123 wrote:
               | If they have the hash/derivative they dont need to look
               | on device or even decrypt, theyll know that data with
               | this hash is on device, and presumably 100s of other
               | matching hashes from the same device
        
               | robertoandred wrote:
               | The matched image's decrypted security voucher includes a
               | "visual derivative." I'm sure in other words they can do
               | some level of human comparison to verify that it is or is
               | not a valid match.
        
         | shoulderchipper wrote:
         | > Should the internet be a free for all place without
         | moderation?
         | 
         | The better question would be: do you want an arbitrary person
         | (like me) to decide whether you have a right to send an
         | arbitrary pack of bytes?
         | 
         | Neither "society" nor "voters" nor "corporations" make these
         | decisions. It is always an arbitrary person who does. Should
         | one person surrender his agency into the hands of another?
        
           | krapp wrote:
           | >Neither "society" nor "voters" nor "corporations" make these
           | decisions. It is always an arbitrary person who does.
           | 
           | Except in this case, a corporation (Apple) is making the
           | decision relative to the sexual mores of modern Western
           | society and the child pornography laws of the United States.
           | It's unlikely this decision was made and implemented randomly
           | by a single "arbitrary" individual. Contrary to your claim,
           | it's _never_ an arbitrary person.
           | 
           | And yes, I believe Apple has the right to decide how you use
           | their product, including what bytes can and cannot be sent on
           | it.
           | 
           | >Should one person surrender his agency into the hands of
           | another?
           | 
           | We do that all the time, that's a fundamental aspect of
           | living in a society.
           | 
           | But in this specific case, no one is forcing you to use an
           | Apple phone, so you're not surrendering your agency, you're
           | _trading_ it in exchange for whatever convenience or features
           | lead you to prefer an Apple product over competitors. That 's
           | still your choice to make.
        
             | WWLink wrote:
             | "Apple has the right to decide how you use their product"
             | 
             | I hate this argument. If Apple wants to claim ownership of
             | _their_ products then they shouldn't sell them. They should
             | lease them.
        
         | browningstreet wrote:
         | There are reasonable measures to take against child abuse, and
         | there are unreasonable ones. If we can't agree on that, there's
         | no discussion to be had.
        
         | BjoernKW wrote:
         | If a company explicitly states in a service's terms and
         | conditions that content stored or shared through that service
         | will be scanned, I think that's acceptable because the user
         | then can decide to use such a service on a case-by-case basis.
         | 
         | However, making this a legal requirement or deliberately
         | manipulating a device to scan the entire content stored on that
         | device without the user's consent or knowledge even, is
         | extremely problematic, not just from a privacy point of view.
         | 
         | Such power can and will be abused and misused, sometimes
         | purposefully, sometimes accidentally or erroneously. The
         | devastating outcome to innocent people who have been wrongfully
         | accused remains the same in either case (see
         | https://areoform.wordpress.com/2021/08/06/on-apples-expanded...
         | for a realistic scenario, for example).
         | 
         | The very least I'd expect if such a blanket surveillance system
         | were implemented is that there were hefty, potentially
         | crippling fines and penalties attached to abusing that system
         | in order to avoid frivolous prosecution.
         | 
         | Otherwise, innocent people's lives could be ruined with little
         | to no repercussions for those responsible.
         | 
         | Do strict privacy requirements allow crimes to be committed?
         | Yes, they do. So do other civil liberties. However, we don't
         | just casually do away with those.
         | 
         | If the police suspect a crime to have been committed they have
         | to procure a warrant. That's the way it should work in these
         | cases, too.
        
         | gerash wrote:
         | By that logic, can we send in someone into your house every day
         | to look through every corner for child porn in digital or print
         | form?
         | 
         | In fact there are a lot of heinous crimes out there some much
         | worse than child porn IMHO. Singling out child porn as the
         | reason seems like it's meant only to elicit an emotional
         | response.
        
         | aidenn0 wrote:
         | I won't use a service that performs dragnet surveillance on my
         | communication. The US Postal service does not open every single
         | letter to check if it contains illegal material. If I rent a
         | storage unit, the storage company does not open up my boxes to
         | see if I have albums of child pornography. This move by Apple
         | is equivalent.
         | 
         | I'm going to turn this around and say that those in favor of
         | Apple's move: "Should the internet be a place where your every
         | move is surveilled?" given that we have some expectation of
         | privacy in real life.
        
           | donkeyd wrote:
           | > The US Postal service does not open every single letter to
           | check if it contains illegal material.
           | 
           | You'd be surprised about the amount of packages that gets
           | x-rayed in order to find drugs. But yes, you're 100% right
           | that it's not all of them.
        
             | silverpepsi wrote:
             | An x-ray is nothing like reading the contents of letters or
             | randomly checking ahipped harddrives and USB sticks for
             | content. I don't know how to clarify legally or
             | conceptually, but I feel confident there is a very clear
             | difference
        
               | emodendroket wrote:
               | Perhaps I'm being too cynical but I think the difference
               | is they haven't figured out a convenient, automated way
               | to do the latter
        
             | JohnFen wrote:
             | It is illegal for the USPS to look at the contents of first
             | class mail except with a court order.
             | 
             | Other classes of mail may have less stringent protections.
             | For instance, third class (and maybe second class, I
             | forget) mail can be opened to make sure that the things it
             | contains are allowed to be mailed in that class.
        
             | aidenn0 wrote:
             | I forgot about that. I think it's nearly all stamped mail
             | over a certain weight? I don't remember if labeled mail is
             | similarly scanned and my google skills are failing me
             | today.
        
         | 0xy wrote:
         | Evidently the episode you listened to was loaded with false
         | information, because Kik has used PhotoDNA to scan for CSAM for
         | 7 years. [1]
         | 
         | [1] https://www.kik.com/blog/using-microsofts-photodna-to-
         | protec...
        
           | donkeyd wrote:
           | > Doc: From what I can tell, it only starts looking in the
           | rooms and looking at individual people if they are reported
           | for something.
           | 
           | https://darknetdiaries.com/transcript/93/
           | 
           | If I were Kik, I would also write a blog post about using
           | something like this. Many, many things point at Kik only
           | doing the bare minimum though. (If you're the type who
           | supports moderation, apparently they're already doing too
           | much according to much of HN.)
        
         | 3rly wrote:
         | Should the internet be a free for all place without moderation?
         | 
         | Yes!!!
        
           | ionwake wrote:
           | I think different opinions are great. Just wanted to point
           | out saying the internet should be moderated on the hacker
           | news forums is surprising and funny.
        
             | [deleted]
        
         | toxik wrote:
         | I think there is a very deep and troublesome philosophical
         | issue here. Ceci n'est pas une pipe. A picture of child abuse
         | /is not/ child abuse.
         | 
         | Let me ask you a counter-question. If I am able to draw child
         | pornography so realistically that you couldn't easily tell, am
         | I committing a crime by drawing?
        
           | yunohn wrote:
           | What is this absurd counter?
           | 
           | Who's talking about surrealistic drawings? We're talking
           | about actual material in real life, being shared by users and
           | abusers.
           | 
           | To be clear, I'm not supporting surveillance, just stating
           | facts.
        
             | toxik wrote:
             | You should maybe read about the work [0], or just read the
             | rest of what I said. Surrealism has nothing to do with the
             | argument I made, so why do you bring it up?
             | 
             | CSAM databases can by necessity not contain novel abuses,
             | because they are novel. In fact, filtering by CSAM
             | databases even indirectly /encourage/ novel abuses because
             | these would not be caught by said filter.
             | 
             | Catching CP hoarders does little to help the children being
             | exploited in the first place, and does a lot to harm our
             | integrity and privacy.
             | 
             | [0] https://en.wikipedia.org/wiki/The_Treachery_of_Images
        
               | ElFitz wrote:
               | > Catching CP hoarders does little to help the children
               | being exploited in the first place
               | 
               | If there is no market for it, there might be less
               | incentive to produce any more of it.
               | 
               | Not that I believe we should all be continuously spied by
               | people who merely pinky sweared to do it right and not
               | abuse it.
        
               | toxik wrote:
               | I don't think market forces are what drive the production
               | of CSAM. Rather, it's some weird fetish of child abusers
               | to share their conquests. I'm sure you're aware, but when
               | CP collectors are busted, they often have terabytes upon
               | terabytes of CP.
               | 
               | But that's, I think, tangential - I don't understand
               | pedophilia well enough to say something meaningful here.
        
               | ElFitz wrote:
               | Quite funny: looking for some figures, it seems I wasn't
               | the first one to do so, and that finding anything even
               | remotely reliable isn't easy at all. See
               | 
               | - https://www.wsj.com/articles/SB114485422875624000
               | 
               | - https://thehill.com/blogs/congress-blog/economy-a-
               | budget/260...
               | 
               | - http://www.radosh.net/archive/001481.html
               | 
               | > I'm sure you're aware, but when CP collectors are
               | busted, they often have terabytes upon terabytes of CP.
               | 
               | Always appalled me that there is so much of it out there.
               | FFS.
               | 
               | For some reason, it reminds me of when the French
               | government decided to spy on BitTorrent networks for
               | copyright infringement (HADOPI).
               | 
               | Defense & Interior asked them not to, saying that it
               | would only make their work harder.
               | 
               | That some geeks would create and democratise new tools
               | not to be caught downloading some random blockbuster, or
               | even just out of principle, and both child pornographers
               | & terrorists would gladly adopt them in a heartbeat,
               | because while they weren't necessarily the type capable
               | of _creating_ such tools, they had historically been
               | shown to be proficient enough to _use_ them.
               | 
               | Quite hilarious, when you think of it. Some kind of
               | reverse "Think of the Children".
               | 
               | We still got HADOPI however, and now any moron has access
               | to and knows how to get a VPN.
        
               | donkeyd wrote:
               | > Catching CP hoarders does little to help the children
               | being exploited in the first place
               | 
               | This is completely untrue, since hoarders tend to often
               | also be part of groups where images of still unknown
               | abuse circulate. Those images help identify both victims
               | and suspects and in the end help stop abuse.
        
               | lstodd wrote:
               | I sometimes wonder where logic is, if it isn't on HN.
               | 
               | Surely if one takes measures against only existing
               | material, and not production of new, this only encourages
               | appearance of new material.
               | 
               | For evidence you can look up what happened with synthetic
               | cannabinoids, and how much harm they brought.
        
             | ElFitz wrote:
             | > We're talking about actual material in real life, being
             | shared by users and abusers.
             | 
             | And, more importantly, material that has been and will be
             | produced through the actual abuse of real children. The
             | spreading of which encourages the production of more of it.
             | 
             | GP's counter is absurd.
        
             | [deleted]
        
           | emodendroket wrote:
           | In many jurisdictions you are.
        
           | donkeyd wrote:
           | > am I committing a crime by drawing
           | 
           | That really depends on the law in the country where you're
           | doing that. According to the law in my country, yes, you are.
           | 
           | None of this actually answers my question though, it's just a
           | separate discussion. I would appreciate an actual answer.
        
             | toxik wrote:
             | Right, my question is of course a rhetorical one. If I
             | similarly draw other crimes being committed, the same does
             | not apply. Why is that?
             | 
             | And to address your explicit question, it is far too
             | complex and specific to a given community to answer in any
             | meaningful way here. I can tell you what isn't the answer
             | though: deploying spyware on phones worldwide.
        
               | donkeyd wrote:
               | > If I similarly draw other crimes being committed, the
               | same does not apply. Why is that?
               | 
               | The answer is much more simple than you seem to think it
               | is. Because the law doesn't say it is. I can only go by
               | the law in my country, which is that (loosely translated)
               | 'ownership of any image of someone looking like a minor
               | in a sexually suggestive position' is a crime. Since a
               | drawing is an image, it's a crime. Having an image of
               | someone breaking into a home is not a crime according to
               | law in my country. That's why you can have a drawing of
               | that.
               | 
               | Your question is like asking "why don't I get a fine for
               | driving 60 on the highway, but I do get a fine for
               | driving 60 through a school zone?" Well, because one is
               | allowed and the other isn't.
        
               | gerash wrote:
               | It is a pretty dumb law IMHO for the mere fact that two
               | teenagers sexting on snapchat when they are 17 years and
               | 355 days old are committing a fairly serious crime that
               | can completely mess up their lives if caught but doing
               | that the next day is totally ok all of a sudden and they
               | can talk and laugh about it.
        
               | a1369209993 wrote:
               | > but doing that the next day is totally ok all of a
               | sudden and they can talk and laugh about it.
               | 
               | Unless a leap year is positioned unfortunately
               | (probability ~1/4), in which case it's back to being a
               | serious crime.
        
               | zug_zug wrote:
               | Yeah but he's using reducto-ad-absurdem to illustrate the
               | law as written is absurd in some cases. So yes you can
               | pedanticly cite the law and have an answer to the
               | question, but you're missing the larger discussion.
               | 
               | At somepoint somebody is gonna actually have to tackle
               | that looking at images, especially fake ones, might not
               | only be harmless, it might be _safer_ to let people get
               | release from their fantasies (I 'd be curious to see what
               | the research says).
               | 
               | Some day, people will say "Obviously nobody cares if you
               | possess it or look at it, obviously, it's just about
               | MAKING it."
               | 
               | I think this is follow the process of weed (unspeakable
               | horror, "drugs will kill you" in dare, 1-strike expulsion
               | from high-school or college) when I was growing up, yet
               | now legal to even grow.
        
               | emodendroket wrote:
               | It seems just as plausible that viewing such depictions
               | could feed into someone's urges. Without some evidence
               | I'd be hesitant to go with someone's conjectures here.
               | There is also, frankly, the fact that a lot of people
               | find viewing such stuff so shockingly depraved that they
               | don't care if someone is "actually" harmed in the
               | process, a claim that is hard to evaluate in the first
               | place.
        
               | donkeyd wrote:
               | > So yes you can pedanticly cite the law and have an
               | answer to the question, but you're missing the larger
               | discussion.
               | 
               | While I do agree that drawings being illegal might not be
               | in anyone's best interest, that doesn't have anything to
               | do with the questions I asked.
               | 
               | > At somepoint somebody is gonna actually have to tackle
               | that looking at images, especially fake ones, might not
               | only be harmless, it might be safer
               | 
               | I thought I agreed with you for a second, buy "especially
               | fake ones" implies you think that child pornography is
               | actually a good thing. I hope I'm wrong about that.
               | 
               | > Some day, people will say "Obviously nobody cares if
               | you possess it or look at it, obviously, it's just about
               | MAKING it."
               | 
               | Guess I'm not and you really think possession of child
               | pornography is harmless. My god that's depressing.
               | 
               | If you are reasoning from the standpoint that the
               | ownership and trade of child pornography is harmless,
               | yes, then I understand that giving up privacy in order to
               | reduce this is a bad thing. Because in your eyes, you're
               | giving up privacy, but you gain nothing.
        
               | zug_zug wrote:
               | >> implies you think that child pornography is actually a
               | good thing.
               | 
               | Holy cow, that's the worst phrase anybody has ever put
               | into my mouth. That's too offensively disingenuous to
               | warrant any further discussion. Shame on you.
        
           | sandworm101 wrote:
           | Yes. People have been convicted for possession of hand-drawn
           | and computer-generated images. Editing a child's image to
           | make it pornographic is also illegal. So some "deepfake"
           | videos using faces from young celebs are in all likelihood
           | very illegal in many jurisdictions.
           | 
           | Images can be illegal if all people are of age, but are
           | portrayed as underage. Many historical dramas take legal
           | advice about this when they have adult actors portraying
           | people who historically married while underage by modern
           | standards. (Ie the rape scene in BBC's The White Princess
           | series.) This is why American porn companies shy away from
           | cheerleader outfits, or any other suggestion of highschools.
        
             | bambax wrote:
             | > _People have been convicted for possession of hand-drawn
             | and computer-generated images._
             | 
             | If that's indeed the law in some countries, it is a stupid
             | law that nobody should help enforce.
             | 
             | In Shakespeare's take on the story of Romeo and Juliet,
             | Juliet is _thirteen_. So this play should probably be
             | illegal in the countries that have such laws.
        
             | chokeartist wrote:
             | > This is why American porn companies shy away from
             | cheerleader outfits, or any other suggestion of
             | highschools.
             | 
             | Lol I'm not sure if this is the case anymore? I've seen
             | tons of "high-school-ish" professional produced porn
             | completely with cheerleading outfits.
        
             | dannyw wrote:
             | Not sure where you get your advice from. Supreme court has
             | ruled loli as legal. You can find it on 4chan in about 30
             | seconds.
        
               | stevenicr wrote:
               | "Supreme court has ruled loli as legal." - proof of this?
               | 
               | Had not heard that change.
               | 
               | the US arrested some diplomat like guy some years ago for
               | having sexualized bathing suit models underage on a
               | device. all non nude I believe.
               | 
               | I think most of what gp saying is mostly true - some
               | places cartoons can be / have been prosecuted.
               | 
               | Just cuz its on 4chan every day - doesn't mean it's legal
               | - I thought 4chan deletes it all with in 24 hours (?) -
               | so that if they got a warrant to do something about it,
               | it would already be gone before the warrant printed much
               | less signed and served.
               | 
               | However charges are going to vary by jurisdiction - I
               | don't think many prosecutors are trying to get cartoon
               | porn convictions as a top priority these days, but that
               | doesn't mean it couldn't happen.
               | 
               | I don't think gp is accurate in saying that american porn
               | avoids cheer and high for these reasons. the market
               | forces have changed many times over the past so many
               | years and depending on which porn portals one might
               | peruse they will see more or less of cheer and such. With
               | some portals the whole incest titling is used for
               | clickbait - it just depends on the portal, and much of
               | what is not seen on portals is not there because of dmca
               | / non-rights.. not because things aren't being produced
               | and made available elsewhere.
        
             | diebeforei485 wrote:
             | And companies are terrified of any sort of liability here,
             | which can lead to over-broad policies.
             | 
             | On reddit, you can't link to written content like stories
             | or fanfiction (words only, without any image files) if any
             | character in your fictional story is under 18 and does
             | anything sexual in the story.
        
             | toxik wrote:
             | This topic is really fascinating to me, how we deal with
             | images of bad things. Clearly, murder and assault is fine
             | to reproduce at 100% realism - but even plain consensual
             | sex is forbidden in the US for a wider audience.
             | 
             | This reminds me of a classic Swedish movie I watched, I
             | forget its name, it's made in the 70s and contains upskirt
             | panty shots of an actress who is playing a 14-year-old,
             | along with her exploring sexuality with a same-age male
             | peer. I think the actual actress was around 14 years old
             | too. It made me feel real uneasy, maybe because my parents
             | in law were around, but also because I thought "wait, this
             | must be illegal to watch". In the end, the movie was really
             | just a coming-of-age story from a time when we were more
             | relaxed about these things.
        
               | teddyh wrote:
               | The name can probably be found here: https://en.wikipedia
               | .org/wiki/List_of_Swedish_films_of_the_1...
        
             | birdyrooster wrote:
             | Maybe in Saudi Arabia? That's not the case in the US.
        
           | avnigo wrote:
           | I see the argument, but the counterargument is that by you
           | doing that, you could possibly be nurturing and perpetuating
           | abuse.
           | 
           | In effect, possessing such material may incentivize further
           | production (and abuse), "macroeconomically" speaking. And I
           | hate that evidently, yes, there is an economy of such
           | content.
        
           | randcraw wrote:
           | Or if you make a film about child abuse in which a child
           | actor pretends to be abused, can you arrest the actor who
           | abuses? If any depiction of the act is a crime, then you can.
           | 
           | This issue came before the US Supreme Court about a decade
           | ago and they ruled that the depiction of a crime is not a
           | crime so long as the depiction can in any way be called
           | "art". In effect, any synthetic depiction of a crime is
           | permitted.
           | 
           | However that ruling predated the rise of deep fakes. Would
           | the SC reverse that decision now that fakes are essentially
           | indistinguishable from the real thing? Frankly I think the
           | current SC _would_ flip since it 's 67% conservative and has
           | shown a willingness to reconsider limits on the first
           | Amendment (esp. speech and religion).
           | 
           | But how would we re-draw the line between art and crime? Will
           | all depictions of nudes have to be reassessed for whether the
           | subject even _might_ be perceived as underage? What about
           | films like  "Pan's Labyrinth" in which a child is tortured
           | and murdered off-screen?
           | 
           | Do we really want to go there? This enters the realm of
           | thought-crime, since the infraction was solely virtual and no
           | one in the real world was harmed. If we choose this path, the
           | freedom to share ideas will be changed forever.
        
           | ianhawes wrote:
           | Yikes, do not engage in a defense of child pornography in
           | these types of arguments. The opposition is gunning for that.
        
             | toxik wrote:
             | I'm not defending any specific point of view. I'm merely
             | pointing out a problem of map-territory conflation.
        
               | donkeyd wrote:
               | > I'm merely pointing out a problem of map-territory
               | conflation.
               | 
               | Using complicated language does not make you smart... It
               | just makes you hard to understand. Maybe if you expressed
               | yourself in common language, you'd understand that the
               | points you're trying to make are bogus.
        
           | quenix wrote:
           | A picture of child abuse _is_ child abuse in that the abuse
           | of a child was necessary to take the picture in the first
           | place.
           | 
           | If the picture had no grounds to spread, it would likely not
           | have been made--no incentive. As such, the fact that the
           | picture is able to spread indirectly incentivises further
           | physical abuse to children.
        
             | toxik wrote:
             | That does simply not follow from logic. Child abuse existed
             | before cameras.
             | 
             | Edit: People are unhappy with this refutation, so a second
             | one then. The claim, specifically, is that CP is CA because
             | CP requires CA to happen. So a photo of theft is theft. A
             | recording of genocide is genocide. Clearly absurd. Never
             | mind the context that the pornography in my question is
             | drawn, thus no actual child was hurt.
             | 
             | Edit 2: The point was made somewhere that CP is hurtful to
             | the child that it depicts, and this is obviously true - but
             | only if spread. Therefore, distributing CP should be
             | illegal, but that does not mean that it's justified to scan
             | every phone in the world for such images.
        
               | quenix wrote:
               | Sure, some child abuse isn't perpetrated for the purpose
               | of being filmed and sold/distributed. But a large
               | percentage is.
               | 
               | That large percentage is disincentivized when technology
               | that makes it more difficult to spread and/or makes it
               | easier for LE to apprehend owners is implemented.
               | 
               | I never said that there would no longer be any child
               | abuse with these steps, just less of it.
        
               | toxik wrote:
               | I think you'll be hard pressed to show this with any kind
               | of evidence. What happens when you effectively ban some
               | expression? People hide that expression. Likewise, if you
               | are successful in this CSAM pursuit, you'll mostly drive
               | pedophiles to reduce evidence sharing. I bet you dollars
               | to donuts, the people who fuck kids, will still fuck
               | kids.
        
               | donkeyd wrote:
               | > Child abuse existed before cameras.
               | 
               | Therefore we shouldn't do anything about it.
               | 
               | While your comment is true, it does nothing to refute
               | anything the previous commenter said. They are completely
               | right. Also the spread of child porn increases the impact
               | on the victim and should just for that reason be
               | minimized.
        
             | bishoprook2 wrote:
             | > A picture of child abuse is child abuse in that the abuse
             | of a child was necessary to take the picture in the first
             | place.
             | 
             | People have made all of these points before, but I still
             | wonder what the legal and/or practical basis of the law is
             | when the pictures or videos are entirely synthetic.
             | 
             | On another note, I wonder how much of this kind of
             | Apple(tm) auto-surveillance magic has made it into the
             | MacOS.
        
             | ksaj wrote:
             | PornHub recently changed their policies after an article
             | appeared that pointed out that many underaged boys were
             | uploading videos of themselves masturbating to the site. I
             | forget the numbers, but it was quite substantial. Of course
             | the boys in question claimed they were of age, although
             | they apparently were not.
             | 
             | Were the uploads self-abuse? Was PornHub subsequently
             | spreading child abuse? Did everyone who saw any of those
             | videos abuse the child who uploaded the videos?
             | 
             | The legal answer is, for the most part, yes on all counts.
             | But should an OS report users for clicking on PornHub
             | recommended videos?
             | 
             | It's reasonable to assume this happens on most porn sites
             | that allow user content uploads.
             | 
             | It seems such scanners should be employed by sites prone to
             | this kind of illegal use, and not be part of a global all-
             | devices-on-hand dragnet where there is so much room for
             | error.
        
           | toolz wrote:
           | I think it's very easy to get people to hate things that
           | gross them out. It's always interesting to me how casually
           | people joke about Epsteins death while simultaneously never
           | talking about cp without a gross look on their face.
           | 
           | I'm not sure I fully understand how society can be more
           | relaxed with actual pedophiles than they are with cp
           | material.
           | 
           | Personally it bothers me to see the focus around things that
           | gross people out rather than the actual child abuse.
        
           | bambax wrote:
           | > _A picture of child abuse /is not/ child abuse._
           | 
           | Yes, exactly.
           | 
           | It may even help _prevent_ child abuse, because it may help
           | pedophiles overcome their urges and not act upon it.
           | 
           | I'm not aware of any data regarding this issue exactly, but
           | there are studies that show that general pornography use and
           | sexual abuse are inversely correlated.
        
         | swiley wrote:
         | There's a difference a mile wide between moderating a service
         | and moderating people's personal computers.
         | 
         | Maybe the same thing that makes this hard to understand about
         | iOS will be what finally kills these absolutely wretched mobile
         | OSes.
        
           | donkeyd wrote:
           | > moderating people's personal computers
           | 
           | This is done client side because your data is encrypted in
           | their cloud. It won't be done if you disable cloud sync. If
           | you just keep your cp out of the cloud, you're fine.
        
             | swiley wrote:
             | TBH it doesn't matter at all why they're doing it, they've
             | crossed a line here.
        
             | new299 wrote:
             | Apple can decrypt data stored on iCloud, and they scan it
             | already it seems:
             | 
             | https://nakedsecurity.sophos.com/2020/01/09/apples-
             | scanning-...
             | 
             | Which it what makes adding the client side functionality
             | even more problematic. They can easily extend this to
             | scanning all offline content.
        
           | tda wrote:
           | except that the boundary between device and service is
           | steadily eroding. A screen is a device and netflix is a
           | service, but a smart tv that won't function without an
           | internrt connection? A nokia was a device you owned, but any
           | Apple device is more like a service than a device really, as
           | you don't control what OS can run on it, which unsigned
           | software you run etc. So if you are ok with moderation on
           | services and your device turns into a service... Perhaps we
           | need laws to fully separate hardware and software but
           | actually the integration between hardware and software is
           | becoming stronger and stronger every year
        
         | garmaine wrote:
         | I don't want to be arrested by the FBI and my life ruined
         | because my daughter, who is 6 and doesn't know any better,
         | takes a selfie with my phone while she happens to be
         | undressed/changing clothes and the photo automatically syncs to
         | the cloud.
        
           | robertoandred wrote:
           | You won't. Understand what this is instead of falling for the
           | hysteria.
        
             | garmaine wrote:
             | It starts with comparing file hashes. I'm worried about
             | where this goes next.
        
           | Clubber wrote:
           | Chances are your daughter would be arrested or at least
           | investigated for having child pornography.
           | 
           | https://www.theguardian.com/society/2019/dec/30/thousands-
           | of...
        
         | moonchrome wrote:
         | I think the best way to stop child porn is to install
         | government provided monitoring software + require an always
         | online chip to be present on any video capture system and any
         | media player must submit screen hashes + geolocation and owner
         | ID to a government service. If the device fails to submit
         | hashes for more than 24 it should block further operation.
         | Failing this we will never be safe from child porn in the
         | digital world.
         | 
         | Not doing anything opens the door to further abuse.
        
         | JohnFen wrote:
         | > If you want unlimited privacy, at least have the balls to
         | admit that this will allow free trade of CSAM, but also revenge
         | porn, snuff-films and other things like it.
         | 
         | Sure, in exactly the same way that the postal system, parcel
         | delivery services, etc. allow it. But that's not to say that
         | such things are unrestricted -- there are many ways that
         | investigation and enforcement can be done no matter what. It's
         | just a matter of how convenient that is.
         | 
         | It would also restrict CSAM a lot if authorities could engage
         | in unrestricted secret proactive searches of everybody's home,
         | too. I don't see this as being any different.
        
         | h0mie wrote:
         | Should all personal computers be scanned all the time by
         | windows/apple/... in case it contains cp?
        
           | donkeyd wrote:
           | Are you unable to answer any of the questions I asked? I'm
           | seriously interested in hearing, from people who think Apple
           | is in the wrong, where the border of acceptability is.
           | 
           | Would you prefer iOS submitted the photo's to their cloud
           | unencrypted and Apple scanned them there? Because that's what
           | the others are doing.
        
             | veeti wrote:
             | iCloud photos are not end-to-end encrypted, Apple has full
             | access to them. https://support.apple.com/en-us/HT202303
        
       | getaclue wrote:
       | Sad :(
        
       | bishoprook2 wrote:
       | My take is that by simply announcing this sort of thing the
       | Rubicon has been crossed. If you see one ant, there's 100 ants
       | somewhere.
       | 
       | The point in bitching to Apple isn't to make them change, but to
       | bring attention to privacy issues to the hoi polloi. Cleaning up
       | your own privacy act is the main lesson.
       | 
       | Maybe the only logical place to end up is a dedicated Chromebook
       | in guest mode for financial transactions, air-gapped workstation
       | to do artistic or other useful things, rarely-used dumbphone, get
       | out more among physical people.
        
       | throwaway6352 wrote:
       | In this day and age I am basically guilty of heresy for speaking
       | out against the insanity of our times.
       | 
       | Nobody should be prosecuted for possessing or viewing any text,
       | image or audio recording, unless they created it themselves by
       | abusing someone. That is a fundamental principle of a free
       | society which has been acknowledged for if not decades, hundreds
       | of years. By all means prosecute the distribution of such
       | material - but to prosecute possession is returning us to the
       | Middle Ages hunting witches again.
       | 
       | It is only because of radical, extremist feminists and other
       | moralists creating moral panics in the 1970s and 1980s are we
       | faced with the present situation in the 21st Century, which is
       | extremely dangerous in a highly interconnected society such as
       | ours, where it is very difficult if not impossible to stop people
       | from unintentionally coming across such material.
       | 
       | We might as well have a nuclear reactor in our own homes, yes
       | it's very useful and provides lots of free electricity but one
       | day it can melt down destroying the entire family? Do you see the
       | analogy I'm making? Because the Internet is just as dangerous.
       | The penalties for possession of such material and being put on
       | the sex offenders register are so horrific, it's totally unreal.
       | It's like a real life nightmare. I cannot believe I'm typing this
       | here in 2021. What has happened to our country?
       | 
       | We criticize dangerous products that are unsafe and burn your
       | house down or electrocute you, but why can't we criticize the
       | dangers of the Internet and all the ridiculous draconian laws
       | involving it? It is as if the law itself and the crazy ways it's
       | made is beyond discussion?
       | 
       | Nobody should have to fear their own computers (unless they are
       | doing major hacking/fraud/sending death threats, etc...).. This
       | simply cannot be happening in a free society. It looks like we
       | are no better than China - it's just over different stuff here in
       | the West.
        
       | oort-cloud9 wrote:
       | It's not Apple. It's the government. Thank the government.
        
       | tomc1985 wrote:
       | And this kind of thing is exactly why you should never trust the
       | cloud.
        
       | CubsFan1060 wrote:
       | Isn't this just wrong?
       | 
       | " Apple's proposed technology works by continuously monitoring
       | all photos stored or shared on a user's iPhone, iPad or Mac, and
       | notifying the authorities if a certain number of objectionable
       | photos is detected."
       | 
       | From: https://www.apple.com/child-
       | safety/pdf/Expanded_Protections_...
       | 
       | " Before an image is stored in iCloud Photos, an on-device
       | matching process is performed for that image against the
       | unreadable set of known CSAM hashes. "
       | 
       | I think it only does this before iCloud upload.
        
         | buzzy_hacker wrote:
         | I don't see where in the first quotation it says that the
         | scanning takes place after iCloud upload. It just says
         | monitoring.
        
           | Cenk wrote:
           | The first quote says it happens to "all photos stored [...]
           | on a user's iPhone". The second quote clarifies that it only
           | happens to photos that are going to be uploaded to iCloud
           | Photos, and that the scan is performed before the upload
           | happens.
        
             | CubsFan1060 wrote:
             | Man. Sometimes English is an imprecise language. I guess
             | the second quote I posted doesn't preclude scanning all
             | photos. It would be a weird use of wording, but if all
             | photos were scanned all the time, then technically that
             | sentence would still be correct.
             | 
             | I'm not 100% sure if this is about the on-device portion of
             | this. Gmail and OneDrive already use this (and have for the
             | better part of a decade) and I haven't seen any large
             | outcry. Also, Apple has confirmed that turning off icloud
             | photos means this doesn't work (this is separate from
             | iCloud backups).
        
               | voakbasda wrote:
               | The difference is choice. You choose to upload your
               | photos where you know they might be scanned. Not
               | uploading them is a choice to keep them private. This
               | destroys the very notion of sovereignty over your own
               | data.
        
               | CubsFan1060 wrote:
               | Isn't that exactly what this is? The choice to use iCloud
               | photos?
        
           | rantwasp wrote:
           | if you read the paper Apple put out, this check takes place
           | right before the icloud upload. also, they are allegedly not
           | doing this for all existing pics. only new ones that get
           | uploaded to icloud
        
       | djanogo wrote:
       | Can we also have a section where users can propose alternative
       | solutions?
        
       | SirensOfTitan wrote:
       | I know the privacy approach Apple has been pushing was just
       | marketing, but I didn't care too much because I enjoyed my iPhone
       | and M1 macbook.
       | 
       | Because of this decision (and the fact that my iPhone more-or-
       | less facilitates poor habits throughout my life), I'm considering
       | moving completely out of the Apple ecosystem.
       | 
       | Does anyone have any recommendations for replacements?
       | 
       | * On laptops: I have an X1 carbon extreme running linux, but it's
       | entirely impratical to take outside of the house: it has terrible
       | battery life and runs quite hot. I also cannot stand its
       | trackpad. Is there any linux-capable device with an okay
       | trackpad?
       | 
       | * On phones: are there any capable phones out there that aren't
       | privacy nightmares? All I want is a phone with a long lasting
       | battery that I can use with maps, a web browser, and texting.
       | 
       | * On cloud: I've considered executing on a NAS forever, but I'm
       | unsure where to start.
        
         | indymike wrote:
         | > On laptops
         | 
         | My XPS 15 does about 6 hours (it's a maxed out i7) on Linux and
         | 3 on Windows.
        
         | skybrian wrote:
         | Alternatively, if you don't use iMessage or iCloud then I don't
         | think these changes affect you.
        
           | rantwasp wrote:
           | lol. nope. trust is binary. you either trust apple or not.
           | after peddling their privacy marketing for so long and doing
           | this, they literally lost my trust.
        
             | skybrian wrote:
             | For most people, trust isn't binary though.
             | 
             | For example, just because you hire someone for one purpose
             | doesn't mean they should have the keys to your house.
        
               | rantwasp wrote:
               | If I hire someone to do X I trust them to do X. If X
               | involves them getting in my house to do something while
               | I'm not there I give them the keys. If they screw up I no
               | longer trust them for X and I no longer trust them in
               | general.
               | 
               | I'm not saying you trust someone for all the possible
               | things under the sun.
               | 
               | I'm saying if I trust you for X and you mess it up I no
               | longer trust you for X.
        
         | theta_d wrote:
         | I have a Lemur Pro from System76 and it has insane battery
         | life. I feel like I never charge it b/c it's always got juice.
         | I run Pop_OS! on it b/c at my age I just want to get work done
         | and not tinker with a distro.
        
         | calgoo wrote:
         | From what I understand, you can use Linux on the M1 now, so at
         | least as a stopgap.
         | 
         | For nas, synology is always a good brand.
        
           | marcan_42 wrote:
           | Linux on M1 isn't ready yet for daily use, but we're working
           | on it. Expect things to move pretty quickly in the next few
           | months.
        
             | rorykoehler wrote:
             | What is the plan for integrating with custom aspects of Mac
             | machines like the t2 chip?
        
             | javajosh wrote:
             | Link?
        
               | rvz wrote:
               | This person is the creator of Asahi Linux and the one who
               | did the early work of Linux on M1 Macs. [0]
               | 
               | When it is ready, it will be ready and they will say so.
               | 
               | Until then it is not ready for general use and is still
               | under active heavy development.
               | 
               | [0] https://asahilinux.org/
        
         | ursugardaddy wrote:
         | TBH I've been considering just dropping a smartphone all
         | together, I don't really get much value out of it since i'm on
         | my laptop most of the time when I want to internet anyway
        
         | grae_QED wrote:
         | For laptops I'd recommend the framework laptop
         | (https://frame.work/). It's thin, powerful, and modular.
        
         | caskstrength wrote:
         | > On laptops: I have an X1 carbon extreme running linux, but
         | it's entirely impratical to take outside of the house: it has
         | terrible battery life and runs quite hot.
         | 
         | I have X1C (not extreme) and it has excellent battery life with
         | Linux. Consider using TLP (https://linrunner.de/tlp/), if you
         | want to significantly improve your laptop power consumption.
        
           | p_j_w wrote:
           | It's possible he's got an old X1C and the battery is just on
           | it's last legs as well. If that's the case, he'd probably be
           | better served using his M1 laptop sparingly until Linux
           | support for it is better.
        
         | minton wrote:
         | I understand the impulse to turn away from Apple. However, it's
         | not a practical solution. How long until this tech or worse is
         | mandated in all products? I think the real answer is for the
         | people to stop bickering about primary colors and work together
         | toward passing legislation that limits the pervasive invasion
         | of privacy by governments and corporations.
        
           | fragileone wrote:
           | You can do both. Boycott Apple because they're the ones who
           | committed this violation, whilst also protesting government
           | efforts to spy.
        
         | guitarbill wrote:
         | As someone else mentioned, for a NAS using TrueNAS (used to be
         | FreeNAS) is easy enough and quite satisfying. You can find
         | plenty of guides.
         | 
         | In general, you need to balance budget, capacity requirements,
         | and form factor. Old servers are often great. Big disks seem
         | like a good idea, but for rebuild times going over 4TB is
         | horrible.
         | 
         | However, unfortunately HDD prices right now are horrible...
        
           | katbyte wrote:
           | define horrible? i have a 8x18tb raid-6 array and it rebuilds
           | in 2-3 days.
        
             | guitarbill wrote:
             | I guess it depends. Personally I don't like having rebuilds
             | longer than a day.
        
         | ipodopt wrote:
         | Laptop | Desktop: https://www.dell.com/en-
         | us/work/shop/overview/cp/linuxsystem...
         | 
         | Router: https://www.turris.com/en/omnia/overview/
         | 
         | Media Center: https://osmc.tv/vero/
         | 
         | Cloud (Use TrueNAS Scale): https://www.truenas.com/systems-
         | overview/
         | 
         | Phone: https://www.pine64.org/pinephone/
         | 
         | Watch: https://pine64.com/product/pinetime-smartwatch-sealed/
         | 
         | Smart Thermostat: https://hestiapi.com/product/hestiapi-touch-
         | one-free-shippin...
         | 
         | If you go this route all devices will be running Linux. The one
         | OS route is kindof nice, hence the pinephone over open android
         | alternatives (like Graphene OS).
         | 
         | I sorted from least to most technical. I also tried to pick the
         | least technical challenging in each category. The Dell stuff
         | should just work. The Phone will require some tinkering for the
         | moment.
        
           | jtsuken wrote:
           | I think it's about time to say: "Thank you, Apple!" Finally
           | these awesome projects will get the funding and the support
           | they deserve.
           | 
           | Also, has anyone tried the FXtec phones?
           | https://www.fxtec.com I am thinking about getting the FXtec
           | pro1 version, which promises to get a decent UbuntuTouch
           | support as well as lineageOS.
           | 
           | I feel that with the comeback of Vim, there might be a
           | sufficient user base for devices that use the keyboard for
           | most tasks. I miss the days, when I could send a text message
           | without taking the phone out of my pocket.
           | 
           | Edit: Just found a relevant HN discussion:
           | https://news.ycombinator.com/item?id=26659150
        
           | thefourthchime wrote:
           | I'm sympathetic to remove Apple and Google from my life, but
           | this list looks so sad. You know the experience, integration
           | and headache of all this is going to be horrible.
        
             | arsome wrote:
             | Depends what you're looking to do - if you really value
             | your privacy, yes, you're going to have to give up some
             | convenience, but I can assure you, it's really not that bad
             | if you put in a little work on an occasional basis, it's
             | not the constant maintenance nightmare that some people
             | seem to expect.
             | 
             | I fired up Syncthing on my phone and NAS and restic to
             | perform encrypted backups nightly to B2, sure, you're not
             | going to get some magical cloud image recognition stuff,
             | but do you really need or even want it?
             | 
             | Everything can integrate quite nicely, but plan to do the
             | connecting work yourself as a one-off with some occasional
             | maintenance a few times a year.
        
             | ipodopt wrote:
             | If you had the option of buying this hardware stack through
             | an integrated third-party brand/storefront that offered
             | support for the products and their integration would that
             | make you feel differently?
        
               | pomian wrote:
               | That's an interesting idea. The amount of work even for a
               | techie to maintain all this, is considerable. Could it be
               | set up for a "user". Tech support, would be; Interesting.
        
               | ipodopt wrote:
               | I made an example storefront a few months ago and jotted
               | down a plan in a bout of procrastination. I have a few
               | thoughts on a pragmatic lazy (in the cs sense) approach.
               | 
               | If you or anyone one else is interested email me:
               | ipodopt@pm.me
        
           | Rd6n6 wrote:
           | This has the potential to be a great laptop, if execution is
           | good:
           | 
           | https://frame.work/
        
           | fsflover wrote:
           | Another phone: https://puri.sm/products/librem-5
           | 
           | Laptop: https://puri.sm/products/librem-14
        
             | ipodopt wrote:
             | I have their laptop and do not recommend them. Honestly, I
             | sort of regret the purchase.
             | 
             | - They lie about specifications like battery life (and
             | other things). I get about 1 hour on my ~5 hour battery
             | life when web browsing.
             | 
             | - My laptop had a race condition at boot that would prevent
             | boot 50% of the time. There was a workaround.
             | 
             | - Wifi had a range of maybe ten feet (not joking).
             | 
             | I am sure there new laptop is better however I do not
             | really trust them after my interactions. Especially for
             | something more novel like a linux phone.
             | 
             | On the other hand, Pine64 is very focused on their hardware
             | stack. All their products are running a very similar
             | hardware unlike Purism. They are moving way more product
             | then Purism and are better liked. Hence they have a
             | stronger community. They are also much cheaper phone-wise
             | for a similar feature set. And you can actually buy and
             | receive the phone.
             | 
             | In terms of alternatives I think System76 is pretty good
             | desktop wise right now. Laptop are alright. Waiting for
             | their upcoming in-house laptop.
        
               | fsflover wrote:
               | This is quite interesting. I'm writing this on their
               | Librem 15 and can't recommend it enough. No problems with
               | booting or anything. Battery life got short with time
               | (but I never cared about it).
               | 
               | > Wifi had a range of maybe ten feet (not joking).
               | 
               | Purism is using the only existing WiFi card that works
               | with free firmware. It is less performant than typical
               | proprietary ones. If you don't care about your freedom
               | strongly, you can replace the card (they are very cheap
               | on ebay). Also, check Purism forums for such questions.
               | It works better than "ten feet" for me.
               | 
               | > On the other hand, Pine64 is very focused on their
               | hardware stack.
               | 
               | And the software is provided by Purism (Phosh, for most
               | Pinephone users). Pinephone is great, I'm using it, too.
               | But Librem 5 is much more performant. Many videos show
               | it's working fine, except of the non-finished software
               | (same as for Pinephone).
        
               | ipodopt wrote:
               | > Purism is using the only existing WiFi card that works
               | with free firmware. It is less performant than typical
               | proprietary ones. If you don't care about your freedom
               | strongly, you can replace the card (they are very cheap
               | on ebay). Also, check Purism forums for such questions.
               | It works better than "ten feet" for me.
               | 
               | Not the wifi card, the surround material like the chassis
               | are attenuating the signal (the librem 14 should have
               | fixed this issue). I swapped mine out for a well
               | supported and performant Intel card and only got marginal
               | improvements to the signal.
               | 
               | My "ten feet" was using it at coffee shops. I was
               | traveling with the laptop. It was hard. The card switch
               | did get it over the hump for this use case. Not an issue
               | most of the time but still will have get a spot closer
               | the router for video calls (like standup).
               | 
               | So the constraints for work were getting a seat close the
               | router AND a power plug. I ended up USB tethering with my
               | phone _alot_.
               | 
               | I do appreciate their contributions to the ecosystem but
               | was wronged as a consumer. They need to be truthful.
               | 
               | I take it you have the v3 with the standard boot drive?
        
               | fsflover wrote:
               | > I take it you have the v3 with the standard boot drive?
               | 
               | Yes, v3. What do you mean by "standard boot drive"? Using
               | SSD if that's what you mean.
        
               | ipodopt wrote:
               | The v4 updated the screen which burned more power. I
               | remember telling them my real life results and them
               | proceeding to not update their product marketing page
               | while admitting it was based on the v3. I feel like they
               | were already stretching it to begin with with v3 but you
               | would know.
               | 
               | I also got the fastest SSD they had in checkout. Which I
               | think contributed to the booting race condition. I never
               | got a link to an upstream ticket so I do not know if it
               | is fixed.
               | 
               | When I emailed them they said they do not have a laptop
               | in that configuration to test, haha.
        
               | vngzs wrote:
               | Sad the high DPI displays didn't go well. My eyes can't
               | take the 1080 vertical pixel displays that are still so
               | common on open laptops nowadays. But I really want to
               | like the Librems; there aren't many trustworthy laptops
               | with kill switches out there.
               | 
               | I have an X1 Carbon Gen 9 with a high DPI 16:10 display,
               | 32GB RAM, and anywhere from 4 to 12 hours of battery
               | depending on workload. It's worth a look for people who
               | can tolerate Lenovo's history (BIOS rootkits targeting
               | Windows in the non-ThinkPad lines).
        
               | fsflover wrote:
               | This looks like a similar problem to yours. They fixed
               | that, even though it affected SSDs that they did not
               | themselves sell: https://forums.puri.sm/t/support-with-
               | booting-from-nvme-ssd/....
        
           | sneak wrote:
           | The dell and lenovo laptops are nowhere the quality of the
           | apple machines, sadly.
           | 
           | I have a maxed out xps and it is a downgrade in all respects
           | but privacy. :/
        
             | SirensOfTitan wrote:
             | Yeah, I figured. When waiting for lsp autocompletes in
             | emacs, my entry level M1 MacBook with no gccjit is orders
             | of magnitudes faster than my almost maxed out Lenovo with
             | emacs and GCCjit.
             | 
             | The difference is so stark that I cannot bear to
             | autocomplete on type on the Lenovo machine, it lags too
             | much and frequently locks up.
        
               | zekica wrote:
               | My thinkbook g2 14 are is almost the same speed as m1
               | macbook and runs linux without any issue. It has "only" 9
               | hours of battery in my usecase, but that's completely
               | fine by me.
        
             | Dracophoenix wrote:
             | I saw your blogpost [1] mentioning how the iPhone 12 is
             | likely your last. Have you given any thought since then to
             | what your next smartphone would be? Or if you still use
             | smartphone at all?
             | 
             | [1]https://sneak.berlin/20210202/macos-11.2-network-
             | privacy/
        
             | notsureaboutpg wrote:
             | Not true at all. My Lenovo T440s from 7 years ago is still
             | running strong and handling everything I throw at it. It
             | doesn't do graphics well, but you don't own a Mac if you're
             | into gaming and things like that.
             | 
             | It has insane battery life, it lasts multiple days of
             | coding, compilation, system upgrades, etc. on one charge.
             | 
             | Plus you can always replace / upgrade the battery / RAM /
             | drive / etc. so you can always continue improving the
             | performance if you want.
             | 
             | Meanwhile the Apple laptop I got from work with an
             | astounding 32 GB of RAM has the fans whirring all the time
             | from just running a Google Meet in Chrome, and you can't
             | even use it in "clamshell" mode because it has so many
             | weird bugs, so I have to keep the screen slightly propped
             | open while it's connected to my monitors. T440s handles
             | don't-suspend-on-lid-off perfectly. Macbook won't even
             | allow you to do this if you're running on battery, you have
             | to be plugged in.
        
             | sdoering wrote:
             | I can't complain. But can't compare it to the M1. I had a
             | 2020 MBPro and am currently using a XPS 13 with maxed out
             | specs.
             | 
             | The camera on the XPS is some leagues below. The microphone
             | had driver issues from the start and it cost me two days
             | for a software workaround.
             | 
             | Other than that I am in every way more happy. Keyboard,
             | track pad and resolution. Performance even with crappy
             | corporate spy-and crapware is definitely way better.
             | 
             | I thought I would miss the Mac more. Not looking back once
             | the mic was fixed.
        
               | sneak wrote:
               | Speakers. Apple finally fixed laptop speakers. Nobody
               | else has figured out how to copy it yet. :/
        
               | sdoering wrote:
               | OK. Not my use case. I have never really listened that
               | intensely.
               | 
               | But totally understand when the milage varies.
        
             | ipodopt wrote:
             | Nothing quite like apple. The step down is short enough not
             | to stumble though :)
        
           | sdoering wrote:
           | Laptop: Dell XPS 13 and very happy. Maxed out specs and
           | clearly higher price range.
           | 
           | Or: Lenovo Yoga Convertible. My second device. I just don't
           | do games. Or bigger data stuff on this machine. Some design
           | work. Some photo and smaller video stuff. I love the
           | flexibility of the convertible when working with PDF and
           | doing annotations by hand.
        
             | Rd6n6 wrote:
             | The battery on my xps seems to be swelling and messing up
             | the trackpad. Apparently it's a pretty common issue
             | 
             | Edit: seems to be the precision line too
             | 
             | > The same problem is happening with the Precision 1510
             | line with the same batteries. I purchased 10 of these
             | laptops for my department around the same time you did.
             | We've had four of these failures so far in three laptops.
             | 
             | reddit.com/r/Dell/comments/6bzhtw/dell_xps_15_9550_battery_
             | swelling_causing/
        
               | sdoering wrote:
               | Good to know. Will watch for it. Currently not an issue.
        
           | mixmastamyk wrote:
           | I just ordered the pinephone! Looking forward to trying it,
           | hope it can keep up with most things I use my 6S for.
           | 
           | Guess, I'll dig out the old digital camera again, since that
           | is a weak point for the pinephone. :-D
        
           | GordonS wrote:
           | I hadn't come across that Omnia router before - it looks
           | great! Bit of a shame it doesn't support 802.11ax, and it is
           | more expensive than I'd like, but still...
        
             | ipodopt wrote:
             | Might be a good idea to wait. It's coming soonish I think:
             | 
             | - https://github.com/openwrt/luci/pull/4994
             | 
             | - https://forum.turris.cz/t/wifi-6-ax-adapter/10390/77
             | 
             | I think the router might be my favorite open hardware
             | piece:
             | 
             | - It was easier to setup then my old Asus router
             | 
             | - Schematics and source are easy to access.
             | 
             | - It has never not worked.
             | 
             | - It is made by a company that is a domain registrar with a
             | good track record on open source projects (click the more
             | button it the top right and you might recognize a few
             | projects): https://www.nic.cz/
             | 
             | - And if you need to do something advance with it, you can.
             | Mine is has Bird runing BGP.
        
               | BostonEnginerd wrote:
               | The Omnia was quite expensive, but I've gotten frequent
               | updates for the last four years. It's a nice mix of
               | hackable and "just works".
               | 
               | I've turned off the WiFi at this point, and just use it
               | as a router now. The UniFi access point that I installed
               | provides better coverage in my house since it's easier to
               | place in a central location.
               | 
               | Overall, I'd rate it as a good value.
        
         | robertoandred wrote:
         | You know Google scans the CP database too, right?
        
         | ByteWelder wrote:
         | On phones: I'm considering a Pixel device with
         | https://calyxos.org/ The main limitations will be the
         | featureset of MicroG (no ~maps~/wearOS/auto). Possible issues
         | with important apps like those for banking.
         | 
         | On cloud: Synology Diskstation is amazing. Only use it through
         | a local VPN though.
         | 
         | edit: maps work (see reply)
        
           | commoner wrote:
           | > no maps
           | 
           | I don't experience any issues using mapping apps on Android
           | with microG instead of Google Play Services. Closed source
           | navigation apps including Google Maps, HERE WeGo, and Magic
           | Earth work just fine. Open source navigation apps like
           | Organic Maps and OsmAnd also work with no problems.
        
             | yosito wrote:
             | I have found that my Pixel with CalyxOS and microG doesn't
             | have any text to speech engine. So the maps apps work, but
             | with no turn by turn audio.
        
               | commoner wrote:
               | RHVoice is a free and open source text-to-speech output
               | engine for Android. Try it out:
               | 
               | - F-Droid: https://f-droid.org/en/packages/com.github.olg
               | a_yakovleva.rh...
               | 
               | - GitHub: https://github.com/RHVoice/RHVoice
               | 
               | Alternatively, if you prefer Google's closed source text-
               | to-speech engine (that is preinstalled on most commercial
               | Android devices), you can download Speech Services by
               | Google through Aurora Store:
               | 
               | - https://play.google.com/store/apps/details?id=com.googl
               | e.and...
               | 
               | Instructions for configuring the default text-to-speech
               | engine on Android are here:
               | 
               | - https://support.google.com/accessibility/android/answer
               | /6006...
               | 
               | Most navigation apps come with their own voice banks for
               | turn-by-turn navigation audio, and installing a text-to-
               | speech engine isn't necessary for these apps. However,
               | Organic Maps (which is a recommended app in the CalyxOS
               | setup wizard) doesn't, and it relies on the default text-
               | to-speech engine.
        
               | yosito wrote:
               | Thanks! I didn't know about RHVoice. I tried installing a
               | couple others I found through searching F-Droid, but
               | nothing worked. I just installed RHVoice.
        
             | ByteWelder wrote:
             | Thank you for that insight! My comment was based on some
             | (older) Reddit comments. I'm glad to hear there are working
             | apps.
        
               | commoner wrote:
               | If you're not sure whether an app is compatible with
               | microG or a flavor of Android without Google Play
               | Services, Plexus is a helpful database of crowdsourced
               | compatibility ratings:
               | 
               | - Plexus: https://plexus.techlore.tech
               | 
               | - GitHub: https://github.com/techlore/plexus
        
           | elliekelly wrote:
           | Are the only phone options iOS or Android? I'm also
           | considering leaving my iPhone behind because of this privacy
           | violation but I'm definitely not moving to Google. That seems
           | like two steps back.
        
             | m4rtink wrote:
             | Sailfish OS (built on Maemo/MeGo legacy, for those who
             | remember those) is what I have been using for years (and
             | this is typed from).
             | 
             | Also there are a lot of Linux distros targeting the
             | PinePhone maturing by the day.
        
             | fragileone wrote:
             | Calyx is a degoogled Android ROM. It's probably the best
             | choice until mobile Linux improves.
        
               | Engineering-MD wrote:
               | What are the benefits over graphene OS?
        
               | fragileone wrote:
               | microG support mainly, it's needed for some Play Services
               | APIs like push notifications and maps, though the choice
               | all depends on what apps you use. GrapheneOS is great
               | also and even better for those with very high security
               | and privacy requirements.
        
             | jazzyjackson wrote:
             | I'm considering the Nokia 8110 (banana phone)
             | 
             | KaiOS, somewhere between a smart phone and a dumb phone.
             | Still has Google maps, not sure if you need a Google
             | account tho.
        
         | heavyset_go wrote:
         | If I were to buy a laptop, it would be the Framework laptop
         | that just started shipping. It doesn't have a Ryzen chip,
         | though, so that's a deal breaker for me. Otherwise, that laptop
         | ticks all of my boxes.
         | 
         | Phone-wise, there are many options to choose from. I like the
         | idea behind the Teracube 2E[1], as they take some of the
         | principles behind Framework and apply it to phones.
         | 
         | > _On cloud: I 've considered executing on a NAS forever, but
         | I'm unsure where to start._
         | 
         | Depends on how much you want to tinker. You can't go wrong with
         | a Raspberry Pi and some external harddrives, but there is also
         | dedicated NAS equipment that requires less setup and
         | maintenance, some of them are Linux based, as well.
         | 
         | [1] https://myteracube.com/pages/teracube-2e
        
         | fragileone wrote:
         | Laptop: Framework Laptop
         | 
         | Phone: Pixel with GrapheneOS or CalyxOS. In the future a Linux
         | phone when the software improves.
        
           | ssklash wrote:
           | CalyxOS is fantastic. You can get a like-new Pixel on
           | swappa.com for cheap, and have a virtually Google-free,
           | Apple-free phone that supports most/all the apps you would
           | want via MicroG. Can't recommend it enough. GrapheneOS is
           | similar except without support for MicroG if you don't need
           | it.
        
           | chopin wrote:
           | > Phone: Pixel with GrapheneOS or CalyxOS
           | 
           | I've seen this recommendation very often lately. As I am
           | shopping for a new phone: Why is it that hardware directly
           | from Google is recommended for putting another OS onto it
           | (I've seen recommendations for LineageOS as well). What makes
           | it better than any stock phone supported by LineageOS?
        
             | fragileone wrote:
             | See the GrapheneOS or CalyxOS websites for more details,
             | they are significantly hardened for security compared to
             | LineageOS.
             | 
             | Currently those two projects only support Pixels, mainly
             | because they're all bootloader unlockable. If these
             | projects had as many volunteers as LOS then more devices
             | could be officially supported.
             | 
             | LOS on a supported Android phone is still a better option
             | than a stock Android or iPhone at least.
        
               | SubzeroCarnage wrote:
               | My https://divestos.org project, while not as secure as
               | GrapheneOS, provides lots of security to many older
               | devices.
        
               | pzo wrote:
               | Interesting project. Thanks for sharing! Any reason why
               | at least some of those patches couldn't be upstreamed to
               | LOS?
        
               | SubzeroCarnage wrote:
               | Most things simply aren't in their scope.
               | 
               | I do send occasional patches to Lineage if they are in-
               | scope and am in contact with some of them reasonably
               | frequently.
               | 
               | The big blocker is that their Gerrit instance requires a
               | Google account to login.
               | 
               | Example of a recent fix I emailed them: https://review.li
               | neageos.org/c/LineageOS/android_device_htc_...
        
             | elagost wrote:
             | Consistency. All the Google hardware forever has allowed
             | easy factory unlocking without a fuss, easy ways to restore
             | to standard OS images without jumping through hoops, and
             | are widely available. Plus they allow re-locking the
             | booloader and the phone equivalent of enrolling your own
             | custom secure boot keys. They also provide firmware updates
             | for a long time so you can get platform/hardware patches
             | too. CalyxOS does provide these in their images.
             | 
             | The 3a/4a are cheap and have headphone jacks and good
             | cameras. What's not to love? Until they change their policy
             | on unlocking bootloaders and installing custom OSs they're
             | great devices. I still have a Nexus 5 that runs
             | PostmarketOS and Ubuntu Touch, and if it completely breaks
             | I can always use ADB/Fastboot to flash the Android 6 images
             | that are still on Google's website. Don't even have to log
             | in to get them.
        
               | m4rtink wrote:
               | Devices supported by the Sony Open Device Program shoukd
               | be also a good target:
               | 
               | https://developer.sony.com/develop/open-devices/
               | 
               | There are projects such as Sailfish OS that make use of
               | this to run on originally Android hardware.
        
         | avh02 wrote:
         | I use an hp spectre x360, it's not the most powerful thing in
         | the world and has occasional issues knowing when to charge but
         | otherwise I love it with ubuntu/i3wm
        
         | CA0DA wrote:
         | I'm suprised nobody has mentioned GrapheneOS:
         | https://grapheneos.org/
        
       | opnac wrote:
       | https://www.apple.com/customer-letter/
       | 
       | A stark change since Apple/FBI.
        
         | dmitryminkovsky wrote:
         | And what happened since then, I wonder?
        
           | echelon wrote:
           | They were probably strong-armed into this. Perhaps the CCP
           | and the FBI talked, and Apple was told they'd be cut off from
           | their suppliers if they didn't introduce this.
           | 
           | It doesn't matter. This is the wrong choice, and everyone
           | should rebuke and abandon Apple for this.
        
       | webmobdev wrote:
       | The intention to prevent child sexual abuse material is indeed
       | laudable. But the "solution" is not. For those wondering what's
       | wrong with this, two hard-earned rights in a democracy go for a
       | toss here:                   1. The law presumes we are all
       | innocent until proven guilty.         2. We have the right
       | against self-incrimination.
       | 
       | Pervasive surveillance like this starts with the presumption that
       | we are all guilty of something ( _" if you are innocent, why are
       | you scared of such surveillance?"_). The right against self-
       | incrimination is linked to the first doctrine because compelling
       | an accused to testify transfers the burden of proving innocence
       | to the accused, instead of requiring the government to prove his
       | guilt.
        
       | artifact_44 wrote:
       | Why are they telling everyone and not just doing it? Seems like
       | aiding and abetting, if you've already dismissed the privacy
       | aspect...
        
         | echelon wrote:
         | Because it was leaked, and the press release was damage
         | control.
        
       | RegnisGnaw wrote:
       | Note that:
       | 
       | 1) its only scanned on upload to iCloud, so if you don't upload
       | then its not scanned
       | 
       | 2) (per another article, https://techcrunch.com/2021/08/05/apple-
       | icloud-photos-scanni...): Most cloud services -- Dropbox, Google,
       | and Microsoft to name a few -- already scan user files for
       | content that might violate their terms of service or be
       | potentially illegal, like CSAM.
       | 
       | So you really can't opt out unless you avoid all cloud photos
        
         | rantwasp wrote:
         | yes it's only icloud photos for now. what's going to be next?
         | 
         | are you going to have a huge scandal every time they turn on a
         | new "feature"?
        
           | RegnisGnaw wrote:
           | Its interesting that its only brought up when Apple does it.
           | Google, MS, Dropbox, etc gets a pass..
        
             | drampelt wrote:
             | People have been talking about other companies doing that
             | for ages, and many specifically choose Apple products
             | because of their perceived commitment to privacy.
        
             | rantwasp wrote:
             | Google, MS, Dropbox don't wave their Privacy flag and
             | pretend they are better than everyone. Guess I'll add Apple
             | to my "do not buy stuff from them" list
        
       | m3kw9 wrote:
       | Yes how do they not allow govt to ask them to scan for their set
       | of restricted hashes?
        
         | voakbasda wrote:
         | They won't. This is the big problem.
        
       | thoughtstheseus wrote:
       | I support content filtering but it should be controlled and
       | implemented by users.
        
       | freediver wrote:
       | While I applaud the goal, this is likely not to achieve it on its
       | own. Those that should be affected are probably not using
       | Messanger to begin with and if they are they can easilly switch
       | to other communication apps.
       | 
       | It is also not quite clear if Apple is taking a moral or legal
       | stand here. If it is legal then this could in the future open
       | doors to:
       | 
       | - Scanning for other types of illegal content
       | 
       | - Scanning for copyrighted content (music, images, books, ...)
       | 
       | - Scanning your iCloud files and documents
       | 
       | - Scanning emails to make sure you are not doing anyting illegal
       | 
       | If it is morally driven and Apple wants to really take a stand
       | against any CSAM material on its devices, they would really have
       | to do it at a system level, monitoring all data being transferred
       | (including all communications, all browsing etc) so this could
       | just be the first step.
       | 
       | A moral-based agenda would be much easier for broader public to
       | accept, while a legal-based agenda could lead to other kinds of
       | privacy-intruding consequences. And even a moral-based agenda
       | would still be a precedent as ultimately we do not know what are
       | Apple's "moral values" and what would it be ready to intrude
       | user's privacy over in the future?
       | 
       | Seems like a slippery slope for a company to take, any way you
       | turn it, specially if privacy is one of your main selling points.
       | 
       | Another thought: if we as a society agree that CSAM is
       | unacceptable, why not globally prevent it at an internet router
       | level? edit: jetlagged... we can't because data is encrypted. It
       | has to be at client level, pre-encyption.
        
         | voakbasda wrote:
         | Apple is a legal entity not a moral one. It may have moral
         | employees, but at best Apple itself is amoral. It will do what
         | the law allows (or compels) it to do.
         | 
         | This feature absolutely will be used for human rights abuse by
         | countries like China, just like they have asked Apple to abuse
         | their platform in the past. Why? Because those abuses are legal
         | there, and capitulation will be the only way those governments
         | will allow them continue to sell in their lucrative
         | marketplace.
        
         | bambax wrote:
         | > _While I applaud the goal_
         | 
         | Really? What is the goal?
         | 
         | It's not to prevent child abuse, since passively looking at
         | images is not, per se, abuse.
         | 
         | It's also not to limit the making of child pornography, since
         | this will only search for _already existing_ and _already
         | known_ images that _already exist_ in government databases.
         | 
         | If you make new images that are not yet in said databases,
         | you're fine.
         | 
         | I'm not sure what the actual goal is (project a virtuous
         | company image, maybe?), but the result could very well be the
         | opposite of what people think.
        
           | juniperplant wrote:
           | > It's not to prevent child abuse, since passively looking at
           | images is not, per se, abuse.
           | 
           | Consumption feeds production.
        
             | bambax wrote:
             | Or the opposite: consumption prevents people from acting
             | out.
        
       | adunn wrote:
       | Alternate take and apparently unpopular opinion: the US
       | government can simply force Apple to give them a back door to
       | access iCloud photos directly.
       | 
       | What Apple has done here is both way more complicated and way
       | more visible than necessary, while being less useful to a
       | government. The slippery slope and capability to exploit is cloud
       | storage.
        
       | websites2023 wrote:
       | It's astounding how many very smart people are getting this
       | wrong.
       | 
       | Perhaps I missed it, but does anywhere on this letter mention
       | that that both of these features are optional?
       | 
       | CSAM depends on using iCloud Photos. Don't rent someone else's
       | computer if you don't want them to decide what you can put on it.
       | 
       | Content filter for iMessages is for kids accounts only, and can
       | be turned off. Or, even better: skip iMessages for Signal.
        
         | willseth wrote:
         | It's baffling. It seems like nearly everyone losing their shit
         | over this doesn't understand how it works. Most of the
         | commentary I see here and elsewhere is based on a
         | misunderstanding of the implementation that blends the CSAM
         | scanner with the child messaging scanner.
        
         | ursugardaddy wrote:
         | this is how it always starts, Apple went from 'no it's not
         | possible to unlock the shooters phone' to 'yeah you can give us
         | the fingerprint of any image (maybe doucuments too) and we'll
         | check which of our users has it'
        
         | epanchin wrote:
         | The new system is overkill for iCloud, which Apple already
         | scans. The obvious conclusion is Apple will start to scan
         | photos kept on device, even where iCloud is not used.
         | 
         | Very smart people are not getting this wrong.
        
           | websites2023 wrote:
           | > The obvious conclusion is Apple will start to scan photos
           | kept on device, even where iCloud is not used.
           | 
           | Why is this the obvious conclusion?
        
             | whartung wrote:
             | Partly it falls under the supposition of "if they can, they
             | will". It's also suggested when they tout that they are
             | going to start blurring "naked" pictures, of any kind, sent
             | to children under 14. Which means they need some kind of
             | tech to detect "naked" pictures, locally, across encrypted
             | channels, in order to block them.
             | 
             | In theory, this is different tech than CSAM, which is
             | supposed to checking hashes against a global database, vs
             | determining the "nakedness" score of an arbitrary picture.
             | 
             | But, scan them once, scan them all, on your phone. The
             | details start to matter less and less.
             | 
             | Also, since they already scanning all of the photos up on
             | iCloud already, why would they need to have it put locally
             | on the phone?
             | 
             | Finally, I know that Apple scans photos on my device,
             | because that's how I get those "memories" of Furry Friends
             | with vignettes of my cats. I don't even use iCloud. (To be
             | clear, I love this feature. Send me movies of my cats once
             | a week.)
        
               | websites2023 wrote:
               | > Partly it falls under the supposition of "if they can,
               | they will".
               | 
               | Hm. But there are many millions of things Apple could do,
               | but haven't, because it would hurt their business model.
               | So how would what you are proposing they will do help
               | their business model?
               | 
               | > Which means they need some kind of tech to detect
               | "naked" pictures, locally, across encrypted channels, in
               | order to block them.
               | 
               | I know you know this is the case, but to make it clear
               | for anyone reading: Apple is not blocking nude pictures
               | in the kids filter. It's blurring and giving a message.
               | Again I ask: why would using this technology on non-nude
               | stuff benefit Apple?
               | 
               | Are we worried about Apple or are we worried about the
               | government forcing Apple to do things that this
               | technology enables?
        
               | websites2023 wrote:
               | >They already scanning all of the photos up on iCloud
               | already
               | 
               | I can't find a source for this. Do you happen to have
               | one?
               | 
               | It seems to me that Apple doesn't want to host CSAM on
               | their servers, so they're scanning your device so that if
               | it does get uploaded, they can remove it and then ban
               | you.
               | 
               | They're not scanning all photos on iCloud, as far as I
               | can tell.
        
           | shuckles wrote:
           | Apple does not already scan iCloud.
        
             | willseth wrote:
             | Not sure why this got downvoted. This is correct and very
             | thoroughly documented.
        
               | shuckles wrote:
               | The meme tides have turned against logic on this topic.
        
           | willseth wrote:
           | Apple doesn't "scan" iCloud. Not sure what you're talking
           | about. Generally everything in iCloud is E2E encrypted, with
           | the exception of iCloud Backups, where Apple holds onto a
           | decryption key and will use it to comply with subpoenas. But
           | nothing is "scanned," and if you don't use iCloud backup,
           | Apple can't see your data.
        
             | shuckles wrote:
             | iCloud Photos aren't E2E encrypted, but it's unlikely
             | they're scanned for CSAM today because Apple generates
             | effectively 0 references to NCMEC annually.
        
               | websites2023 wrote:
               | I also believe Apple doesn't really want to scan your
               | photos on their servers. I believe their competitors do,
               | and they consider this compromise (scan on device with
               | hashes) is their way of complying with CSAM demands while
               | still maintaining their privacy story.
        
               | shuckles wrote:
               | Yes. Scope creep is much easier when implemented as
               | scanning on plaintext data.
        
               | [deleted]
        
           | gerwim wrote:
           | >> The obvious conclusion is Apple will start to scan photos
           | kept on device, even where iCloud is not used.
           | 
           | Wrong [1]. It's even in the first line of the document which
           | you apparently didn't even read:                 CSAM
           | Detection enables Apple to accurately identify and report
           | iCloud users who store       known Child Sexual Abuse
           | Material (CSAM) in their iCloud Photos accounts
           | 
           | This doesn't mean I'm supporting their new "feature".
           | 
           | 1. https://www.apple.com/child-
           | safety/pdf/CSAM_Detection_Techni...
        
             | rantwasp wrote:
             | nah. it means that they don't scan it yet.
             | 
             | also by reading that doc and pointing to it means you trust
             | apple.
             | 
             | i used to trust apple when they were peddling their privacy
             | marketing stuff. not anymore.
        
               | willseth wrote:
               | OK so let's all lose our shit over things that haven't
               | happened.
        
         | echelon wrote:
         | Then why do the "CSAM" perceptual hashes live on the device and
         | the checks themselves run on the device? Those hashes could be
         | anything. Your phone is turning into a snitch against you, and
         | the targeted content might be CCP Winnie the Pooh memes or
         | content the people in charge do not like.
         | 
         | We are not getting this wrong. Apple is taking an egregious
         | step to satisfy the CCP and FBI.
         | 
         | Future US politicians could easily be blackmailed by the non-
         | illegal content on their phones. This is a jeopardy to our
         | democracy.
         | 
         | The only reason this was announced yesterday is because it was
         | leaked on Twitter and to the press. Apple is in damage control
         | mode.
         | 
         | This isn't about protecting children. It's about control.
         | 
         | Stop defending Apple.
        
           | willseth wrote:
           | This boils down to two separate arguments against Apple: 1)
           | what Apple has already implemented, and 2) what Apple _might_
           | implement in the future. It 's fine to be worried about the
           | second one, but it's wrong to conflate the two.
        
             | websites2023 wrote:
             | >It's fine to be worried about the second one, but it's
             | wrong to conflate the two.
             | 
             | Agreed, and just to be clear, I'm worried about that too.
             | It just appears that we (myself and the objectors) have
             | different lines. If Apple were to scan devices in the US
             | and prevent them from sharing memes over iMessage, that
             | would cross a line for me and I'd jump ship. But preventing
             | CSAM stuff from getting on their servers seems fine to me.
        
               | echelon wrote:
               | > "preventing CSAM stuff from getting on their servers
               | seems fine to me"
               | 
               | You're either naive or holding your fingers in your ears
               | if you think this is the objective.
               | 
               | Let me repeat this again: this is a tool for the CCP,
               | FBI, intelligence, and regimes.
        
               | magicloop wrote:
               | I think the situation is clear when we think of this
               | development from a threat modelling perspective.
               | 
               | Consider a back-door (subdivided into code-backdoors and
               | data-backdoors) placed either on-device or on-cloud. (4
               | possibilities)
               | 
               | Scanning for CP is available to Apple on-cloud (in most
               | countries). Scanning for CP is available to the other
               | countries on-cloud (e.g. China users have iCloud run by a
               | Chinese on shore provider). Scanning for CP is not
               | available to Apple on-device (until now)
               | 
               | This is where the threat model comes in. Intelligence
               | agencies would like a back door (ideally both Code and
               | Data).
               | 
               | This development creates an on-device data-backdoor
               | because scanning for CP is done via a neural network
               | algorithm plus the use of a database of hashes supplied
               | by a third party.
               | 
               | If the intelligence service poisons the hashes database
               | then it won't work because the neural network scans for
               | human flesh and things like that, not other kinds of
               | content. So the attack works for other sexual content but
               | not political memes. It is scope-limited back door.
               | 
               | For it to be a general back door, the intelligence agency
               | would need the neural network (part of apple's on-device
               | code) and well as the hashes database to be modified. So
               | that is both requiring a new code back door (Apple has
               | resisted this), and a data back door both on-device.
               | 
               | Currently Apple has resisted:
               | 
               | Code back doors (on device) Data back doors on device
               | (until now)
               | 
               | and Apple has allowed Data back doors in cloud (in
               | certain countries) Code back doors in cloud (in certain
               | countries)
               | 
               | In reality the option to not place your photos in iCloud
               | is a euphemism for "don't allow any data backdoor". That
               | is because iCloud is a data-backdoor due to it being able
               | to be scanned (either by Apple or an on-shore data
               | provider).
               | 
               | My analysis is that the on-device scanning does not
               | improve Apple's ability to identify CP since it does so
               | on iCloud anyway. But if my analysis is incorrect, I'd be
               | genuinely interested if anyone can correct me on this
               | point.
        
           | websites2023 wrote:
           | > Future US politicians could easily be blackmailed by the
           | non-illegal content on their phones. This is a jeopardy to
           | our democracy.
           | 
           | US politicians should not be using normie clouds full stop.
           | This is a risk and always has been.
        
             | echelon wrote:
             | Do you know which of your kids is going to be a politician?
             | Better keep them all off the internet to keep their future
             | career safe.
             | 
             | This is why it's important to stop now.
        
           | avianlyric wrote:
           | iCloud photos aren't currently encrypted, but this system
           | provides a clear path to doing that, while staving
           | accusations that E2E of iCloud will allow people to host CP
           | there with impunity.
           | 
           | When the device uploads an image it's also required to upload
           | a cryptographic blob derived from the CSAM database which can
           | then be used by iCloud to identify photos that might match.
           | 
           | As built at the moment, your phone only "snitches" on you
           | when it uploads a photo to iCloud. No uploads, no snitching.
           | 
           | We know that every other cloud provider scans uploads for
           | CSAM, they just do it server side because their systems
           | aren't E2E.
           | 
           | This doesn't change the fact that having such a scanning
           | capability built into iOS is scary, or can be misused. But in
           | its original conception, it's not unreasonable for Apple to
           | say that your device must provide a cryptographic attestation
           | that data uploaded isn't CP.
           | 
           | I think Apple is in a very hard place here. They're almost
           | certainly under significant pressure to prove their systems
           | can't be abused for storing or distributing CP, and coming
           | out and saying they'll do nothing to prevent CP is suicide.
           | But equally the alternative is a horrific violation of
           | privacy.
           | 
           | Unfortunately all this just points to a larger societal
           | issue. Where CP has been weaponised, and authorities are more
           | interested in preventing the distribution of CP, rather than
           | it's creation. Presumably because one of those is much easier
           | to solve, and creates better headlines, than the other.
        
             | websites2023 wrote:
             | >iCloud photos are encrypted, so scanning has to happen on
             | device.
             | 
             | Is this true? I feel like Apple benefits from the confusion
             | about "Encrypted at rest" + "Encrypted in transit" and "E2E
             | Encrypted". It's my understanding that Apple _could_ scan
             | the photos in iCloud, since they have the decryption keys,
             | but they choose not to, as a compromise.
             | 
             | I'm keying into this because this document:
             | https://support.apple.com/en-us/HT202303 doesn't show
             | Photos as part of the category of data that "Apple doesn't
             | have access to." That's mentioned only in the context of
             | the E2E stuff.
        
               | avianlyric wrote:
               | You're right, currently iCloud photos isn't E2E. I've
               | updated my comment
        
       | markus_zhang wrote:
       | I guess more the reason to use vintage computing for personal
       | then? You can even disconnect them from Internet whenever not
       | needed.
       | 
       | Then you use the modern computer your company provides strictly
       | for business purposes.
        
       | snemvalts wrote:
       | With regulators forcing encryption backdoors with "the children"
       | as the argument, Apple's solution seems like the least of the
       | possible evils.
        
         | sizt wrote:
         | As with malware. Identify it. Isolate it. Remove it.
        
       | rvz wrote:
       | It has been admitted in the past and now in the open. Apple Inc.
       | is not your friend. [0]
       | 
       | The author of this Tweet has a point and made it clear all along
       | for a long time.
       | 
       | [0] https://twitter.com/aral/status/1182186665625960448
        
       | grae_QED wrote:
       | Honest question: why does apple think this will help anything?
       | From what I've read, it looks like the sheer volume of the
       | problem out weights virtually any solution (within reason) [1].
       | 
       | [1] https://www.nytimes.com/interactive/2019/09/28/us/child-
       | sex-...
        
       | [deleted]
        
       | Symmetry wrote:
       | Apparently "It will also scan messages sent using Apple's
       | iMessage service for text and photos that are inappropriate for
       | minors."
       | 
       | https://www.washingtonpost.com/technology/2021/08/05/apple-c...
       | 
       | That seems even worse. In the US we have this terrible situation
       | where it might be perfectly legal for two 17 year olds or a 17
       | and an 18 year old to have sex with each other but if they sext
       | then they're engaging in child pornography which is a huge
       | federal crime. But it hasn't been a problem until now because
       | it's very hard to enforce. But it looks like Apple is now going
       | to take part in enforcing that law. It'll be tattling to the
       | parents rather than law enforcement but I still think that's
       | terrible.
        
       | btdmaster wrote:
       | Why is CP punished more harshly than CA [1]? Is it because it is
       | easier to do so, or is it because it gives an illusion of
       | protecting children?
       | 
       | This, of course, ignores how a lot of child abusers are underage
       | themselves and know the victim,[2] and that the prosecutors are
       | committing the same crime as the prosecuted in the case of CP,
       | and that, in too many cases, the content in question is
       | impossible to call malicious if it is seen in context.[3]
       | 
       | [1] https://columbiachronicle.com/discrepancy-in-sex-offender-
       | se...
       | 
       | [2]
       | https://web.archive.org/web/20130327054759/http://columbiach...
       | 
       | [3] https://news.ycombinator.com/item?id=5825087
        
         | SSLy wrote:
         | your [1] 404's
        
           | btdmaster wrote:
           | My bad, got [1] and [2] backwards. [1] is supposed to be the
           | archive link, and [2] is https://www.d2l.org/wp-
           | content/uploads/2017/01/all_statistic....
        
       ___________________________________________________________________
       (page generated 2021-08-06 23:02 UTC)