[HN Gopher] Apple Delays Rollout of Child Safety Features
       ___________________________________________________________________
        
       Apple Delays Rollout of Child Safety Features
        
       Author : nycdatasci
       Score  : 1347 points
        Date   : 2021-09-03 13:13 UTC (1 day ago)
        
 (HTM) web link (www.macrumors.com)
 (TXT) w3m dump (www.macrumors.com)
        
       | s1artibartfast wrote:
       | I wonder how such a feature could even distinguish legitimate
        | use. What if I want to take pictures of a strange vaginal
       | discharge to send to my child's pediatrician? Am I now on a list?
       | What if I want family photos of my happy naked baby playing?
        
         | CubsFan1060 wrote:
          | What feature are you talking about? The iCloud photo scan looks
          | for known photos; it doesn't detect random images. The iMessage
         | feature only allows parents to see images that their children
         | are sharing.
         | 
         | As you described, nothing would flag that at all.
        
           | s1artibartfast wrote:
           | Really? Can you point me to a better description than in the
           | article?
           | 
           | >The planned features include scanning users' iCloud Photos
           | libraries for Child Sexual Abuse Material (CSAM),
           | Communication Safety to warn children and their parents when
           | receiving or sending sexually explicit photos
           | 
           | This description seems to cover scanning original content.
            | Database comparison seems to be one tool of many, and
            | insufficient on its own to stop original content.
           | 
           | https://www.google.com/amp/s/www.nytimes.com/2021/09/03/busi.
           | ..
           | 
            | Edit: after looking around, the CSAM hash compare is one of
            | two tools. The other uses on-device machine learning to
            | analyze message image attachments and determine whether a
            | photo is sexually explicit.
           | 
           | https://www.apple.com/child-safety/
        
             | CubsFan1060 wrote:
             | Yup, I think you found it. The on-device one only applies
             | to people under 13, and only reports to parents. And, IIRC,
             | lets the child know that it'll be reported to parents
             | before the child views it.
             | 
             | https://www.macrumors.com/2021/08/05/apple-communication-
             | saf...
        
         | Klonoar wrote:
         | This comment is proof that the discourse surrounding this is so
         | badly misunderstood that Apple couldn't possibly hope to fight
         | it. ;P
        
       | tzs wrote:
       | What I find most interesting is that they have apparently delayed
       | _both_ the iCloud photo upload scanning _and_ the Messages
       | parental control scanning which uses a completely different
       | mechanism.
       | 
       | The latter when it flags something in a message to the child
       | warns the child and asks if they want to view it. If the child
       | says "yes" and is 13+ it shows it to them and that is the end of
       | it. If the child is 12 or under, they are warned again and told
       | that if they view it their parents will be notified. If they say
       | "yes" again they are shown the image and their parents are
       | notified.
       | 
       | Note that no one is notified unless the recipient is an under 13
       | child and the child explicitly decides to allow their parents to
       | be notified.
       | 
        | I've seen far fewer objections to this. I don't see why they
       | seem to be tying this and the iCloud scanning together in one
       | release.
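        | 
        | A minimal sketch of that decision flow, as described above (a
        | hypothetical function, not Apple's actual API):
        | 
        |     # Toy model of the Messages flow; all names are invented.
        |     def handle_flagged_image(age, confirms, confirms_again):
        |         # Returns (image_shown, parents_notified).
        |         if not confirms:          # declined at the first warning
        |             return (False, False)
        |         if age >= 13:             # 13+: shown, and that is the end of it
        |             return (True, False)
        |         if not confirms_again:    # under 13: declined the second warning
        |             return (False, False)
        |         return (True, True)       # viewed anyway: parents are notified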
        
         | Klonoar wrote:
         | Likely because they made the mistake of doing them together to
         | begin with - there's no world where the public untangles it.
        
         | codezero wrote:
          | It helps them conflate the feature as "for the children" and
          | not as a privacy violation; general confusion about the
          | coupling of these features is desirable for their marketing.
        
       | alliao wrote:
       | honestly, brand breaking.
        
         | Gareth321 wrote:
         | I've been using Apple hardware for so long now and have been a
         | huge fan. They killed almost all the goodwill they had with me.
         | I no longer trust them, and since their operating systems are
          | closed source, trust is critical. This has been a wake-up call.
         | If it's not open source, I don't know what's happening on my
         | devices.
        
           | musha68k wrote:
            | Same here. I installed Arch on my x86 MBP and I'm honestly
            | shocked at how smoothly it went and how well it runs.
            | Definitely a good time to get back to open source desktop
            | experiences (after ~17 years for me). Now I'm waiting to do
            | the same thing with my M1 MBP, which is still on Big Sur for
            | now (Asahi Linux is looking very promising though!).
        
       | hughrr wrote:
       | Delay is not the word I want to see.
       | 
       | It's too late anyway. Their reputation is shot now. I don't trust
       | them.
        
       | stjohnswarts wrote:
        | I personally think one of the best things you can do is turn off
        | iCloud and automatic updates. No doubt they collect that
        | telemetry. That sends a bigger "vote" than complaining on HN,
        | although both are welcome. I turned it all off and set IDrive as
        | my backup for now. Not optimal, but the best I could do on short
        | notice.
        
       | subdane wrote:
       | Seems like the unspoken story here isn't really about Apple, it's
       | that the govts of the world want access to your phone data and
       | its backups. They're already getting some of it from exploits,
       | telcos and the cloud. They'll be back and will keep coming and
       | it's hard to imagine any company will have the ability to stand
       | up to them without seriously impacting their business.
        
       | skinkestek wrote:
        | OK folks, like I told you, it was possible to get Apple to turn
        | around.
        | 
        | Maybe now is the time for the American part of HN to start
        | applying the same pressure to your politicians.
        | 
        | We have an election coming up, and I've been quite vocal to the
        | people standing for office about how my people (the
        | conservatives) sold us out by rubber-stamping the surveillance
        | bill last year.
        | 
        | I've also sent emails, and I will be sending one or more tweets
        | over the next few days.
        
       | robertwt7 wrote:
        | Delay is a strong word here. The question is: until when? It's
        | like a time bomb. Apple is giving us mixed signals here, starting
        | with the privacy features, giving us the option for apps not to
        | track us, and then this.
        | 
        | Damn, I'm still confused why a person like Tim, who has earned
        | everyone's respect on privacy, is doing this kind of thing. It's
        | hard to move away from Apple at this point, honestly, and it's
        | annoying.
        
       | tehjoker wrote:
        | Why would MacRumors characterize this unsafe security system,
        | highly prone to nation-state abuse, on Apple's terms? Could they
        | be protecting their institutional arrangement with Apple's PR arm
        | and leakers?
        | 
        | Anyone reading that headline alone will think: man, that's crazy.
       | Why would Apple delay something that protects children?
        
       | intricatedetail wrote:
        | They want people to buy the new Macs that are coming soon, and
        | then they'll switch the feature on. I hope I am wrong.
        
       | CubsFan1060 wrote:
       | While looking up something, I came across this article (I believe
       | from before Apple's original announcement):
       | https://www.wired.co.uk/article/whatsapp-encryption-child-ab...
       | 
       | It's interesting in that it talks about both server and client
       | side scanning, and some of the things that WhatsApp does.
       | 
        | An interesting fact from the article: in 2020, WhatsApp made
        | 400,000 reports to the NCMEC. Apple: 265.
        
         | jliptzin wrote:
         | And Facebook: 20 million reports last year. That is an
         | astounding number.
         | 
         | Does anyone really think we're only another 20 million reports
         | a year away from solving the CSAM problem?
        
           | CubsFan1060 wrote:
            | I have no idea what the right answer is; however, "solving"
            | implies that the only two states are "huge problem" and "no
            | problem".
           | 
           | I might say that it's possible there are actions that can
           | reduce, but not eliminate, the problem. What if another 20
           | million reports saved one child from exploitation? 10
           | children? 1000 children?
           | 
           | At some level, this boils down to a society question. We all
           | agree CSAM is evil. How does it stack up against other things
           | we think are evil? I fear that, in the end, there is no
           | middle ground that includes eliminating CSAM and keeping 100%
           | privacy.
        
         | cwizou wrote:
          | Yep, Apple currently only scans iCloud Mail attachments (I
          | believe they confirmed this; I don't have the source at hand
          | though), which explains the discrepancy.
          | 
          | There's a big difference, though, between scanning a library
          | (which is closer to what Dropbox/OneDrive are doing) and
          | scanning a messaging platform whose whole purpose is
          | communication between people. It does make sense that the
          | latter catches way more.
          | 
          | If Apple had rolled those scans out on iMessage attachments
          | (which they didn't plan to, though the feature looked far more
          | designed for that than for a library scan), you would probably
          | see comparable numbers (modulo the platforms' relative market
          | shares).
         | 
          | One big problem, though: are those large numbers actually
          | actionable? I think I remember seeing a quote from Swiss (?)
          | law enforcement complaining that the reports they got from
          | NCMEC were nearly always unactionable.
          | 
          | There are possibly many reasons for this: differences in local
          | laws, algorithms that wrongly report things, or other factors
          | (maybe not enough information to identify the user of the
          | account, etc.).
          | 
          | This is where one feature of Apple's design was a bit better:
          | they didn't plan to report every single occurrence, but would
          | only do so once 30 such images were detected. Part of the
          | reason was the imperfections of the NeuralHash mechanism,
          | sure, but in the end that doesn't matter.
          | 
          | One can argue doing this would at least generate more
          | actionable reports. It shouldn't be a numbers game.
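          | 
          | A minimal sketch of that threshold rule (a plain counter here;
          | Apple's actual design used threshold secret sharing, so the
          | server learns nothing at all below the threshold):
          | 
          |     # Toy model of threshold reporting; parameters assumed.
          |     REPORT_THRESHOLD = 30  # matches needed before any report
          | 
          |     def count_matches(library_hashes, known_csam_hashes):
          |         # Compare each image hash against the database.
          |         return sum(1 for h in library_hashes
          |                    if h in known_csam_hashes)
          | 
          |     def should_report(library_hashes, known_csam_hashes):
          |         # One match (or 29) never triggers human review.
          |         n = count_matches(library_hashes, known_csam_hashes)
          |         return n >= REPORT_THRESHOLD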
        
           | CubsFan1060 wrote:
           | The article touched on the actionable parts. "They say it is
           | "highly unlikely" that a search warrant could be made from a
           | tip that does not include content "and instead only includes
           | IP data and an indication from an algorithm that there is a
           | high probability of suspicious or anomalous activity"."
           | 
            | It sounds like simply being able to say "this user was
            | messaging with 4 known offenders, and sending something the
            | size of images" is unlikely to be helpful to law enforcement.
        
             | ComputerGuru wrote:
             | Boy do I have a bridge to sell you. Search warrants get
             | signed for far, far less all the time.
        
       | leptoniscool wrote:
        | Apple is required by law to do this, correct?
        
         | __warlord__ wrote:
         | According to them, no.
         | 
         | Source: https://youtu.be/OQUO1DSwYN0?t=398
         | 
         | Side note, I like the quote:
         | 
         | Interviewer: Who owns this phone?
         | 
         | Craig Federighi: I think our customers own their phones... aaah
         | for sure
         | 
         | :P
        
       | bigger_inside wrote:
       | calling it "child safety features" in a report ON it is its own
       | kind of propaganda collaboration
        
       | vagrantJin wrote:
        | I can't help but laugh at the good faith Apple Inc enjoys. The
        | less obvious questions about the company are never asked, because
        | anyone who asks is attacked by fans who, in a manic state of
        | self-righteousness, never asked why _child safety features_ were
        | not a concern for Apple Inc for the last 20+ years.
        | 
        | That's the blind good faith of which I speak. It makes Apple Inc
        | a mouthwatering target. So good, there's no need to rush. No
        | need to be splashy. The intrinsic trust its users have in its
        | integrity is mindblowing. People trust it more than they trust
        | their own children and their ability to raise them.
        | 
        | And no good can come from such blind faith.
        
         | stetrain wrote:
         | The reason they care now is because they have been trying to
         | increase privacy and encryption of their services, and the
         | powerful political flag that gets waved in opposition is CSAM.
         | (second place is Terrorism but that engenders more political
         | debate)
         | 
         | Even if the real government motives for wanting decryption
         | backdoors or no encryption at all are less pure in motivation,
         | CSAM investigation and prosecution is something that nobody
         | wants to argue against politically.
         | 
          | That's the only explanation, sinister or benevolent, that makes
          | sense to me for why Apple wanted to do this with all of the
          | Rube Goldberg steps they created to make it inconvenient as a
          | direct surveillance tool. Given the priors that they want to
          | avoid decrypted dragnet scanning of user data on their servers,
          | limit who at Apple has access to user data on their servers
          | (even when compelled by govt warrant), and that they can't
          | continue down their road of device and cloud privacy without
          | addressing CSAM, this is a solution that attempts to fit those
          | constraints.
         | 
         | I don't think that makes it a good thing. There is a clear
         | ethical boundary for most people between "I uploaded this to
         | your server and you scan it" versus "I'm about to upload this
         | to your server so you run a scanner on my device, and that scan
         | can result in a human at Apple seeing some kind of
         | censored/thumbnail version of my personal photos."
         | 
          | Apple tried to find a clever workaround to the above
          | assumptions, but their solution feels like a worse privacy
          | violation than just scanning everything server-side.
        
         | least wrote:
          | > ...never asked why _child safety features_ were not a concern
         | for Apple Inc for the last 20+ years.
         | 
          | This is remarkably disingenuous and can be applied to any
          | company that hasn't done any number of things to help prevent
          | any number of criminal or immoral activities on their
          | platforms. Did we have the need or the technology to implement
          | such features in 2001? I wasn't using cloud services to store
          | photos in 2001. Were servers, let alone devices, powerful
          | enough to generate the hashes? Did NCMEC have the capability to
          | generate these hashes, or even a database from which they
          | could help create these technologies? PhotoDNA's development
          | didn't start until 2009.
         | 
          | Capability has expanded because the devices we carry around are
          | supercomputers with built-in ML chips that enable these sorts
          | of technologies. This would not have been possible with a
          | feature phone in 2001. Advancements in machine learning have
          | also enabled these sorts of technologies.
         | 
         | Concerns have expanded because our use of third party platforms
         | for cloud storage has significantly expanded which creates a
         | liability for all these platforms. E2EE would of course remove
         | their liability since they would have no idea what was being
         | hosted on their platforms but the problem of ensuring that
         | people don't lose access to their photos in the case of losing
         | their device or password remains.
         | 
         | If Apple's users had intrinsic trust in the company then Apple
         | wouldn't be concerned about the push back against this tech.
         | All you're doing here is outing yourself as the opposite of the
         | "apple sheep."
        
           | vagrantJin wrote:
           | > All you're doing here is outing yourself as the opposite of
           | the "apple sheep."
           | 
            | Fair point. I feel it's a deflection from the point, though.
            | Nothing about this whole saga has much to do with tech.
            | Understandably so, since this is an engineering medium - fair
            | play.
        
           | smoldesu wrote:
           | > This would not have been possible with a feature phone in
           | 2001.
           | 
            | Perceptual hashes (PhotoDNA) can be executed on pretty much
            | any device; the requirements are quite low as far as I
            | understand it. Probably not a great idea to run one on a 40
            | MHz Nokia, but it would actually be rather viable on devices
            | past the 1 GHz threshold.
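            | 
            | For a sense of how cheap this is, here is a difference hash
            | ("dHash"), one simple perceptual hash. PhotoDNA itself is
            | proprietary, so this is only an illustration, not the real
            | algorithm (Python, requires Pillow):
            | 
            |     # Shrink, grayscale, compare adjacent pixels: near-
            |     # duplicates land on (nearly) the same 64-bit hash.
            |     from PIL import Image
            | 
            |     def dhash(path, size=8):
            |         img = Image.open(path).convert("L")
            |         img = img.resize((size + 1, size))
            |         px = list(img.getdata())
            |         bits = 0
            |         for row in range(size):
            |             for col in range(size):
            |                 i = row * (size + 1) + col
            |                 bits = (bits << 1) | int(px[i] > px[i + 1])
            |         return bits
            | 
            |     def hamming(a, b):
            |         # Small distance = visually similar images.
            |         return bin(a ^ b).count("1")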
        
         | noirbot wrote:
          | I am very confused about what you're trying to imply here. That
          | Apple was intentionally not implementing child safety features
          | before now, and that's suspect, but no one was questioning
          | them?
         | 
         | They've had various levels of parental control features around
         | app and website blocking for a while. CSAM scanning aside, I
         | think the other features around checking/blurring for
         | unsolicited lewd images is ahead of most other companies in the
         | space?
         | 
         | What blind faith are you referring to? "Apple Inc is a
         | mouthwatering target" to whom? The CSAM features that are being
         | pushed back aren't about if people "trust their own children".
        
           | vagrantJin wrote:
           | > aren't about if people "trust their own children".
           | 
            | The "trust their own children" bit isn't about CSAM itself.
            | It's about a previous comment I made where a parent claimed
            | to be AirTagging their kids so they knew where they were. I
            | pointed out the hypocrisy of buying an iPhone for better
            | privacy while blatantly denying their own kids that same
            | privilege. It should be in my comment history.
        
             | noirbot wrote:
             | How is that possibly Apple's fault? Some people are
             | hypocrites. Some people trust themselves more than their
             | children. You're just stringing together random arguments
             | you've had with people who happen to own Apple products and
             | then trying to blame Apple for it somehow?
        
       | ubermonkey wrote:
       | Came here to post this.
       | 
       | It is VERY uncharacteristic of Apple to pause or walk back policy
       | changes, so I wasn't very hopeful about this one. I'm glad
       | they're at least re-evaluating.
        
         | smoldesu wrote:
         | They aren't re-evaluating, they're just postponing the rollout
         | until after the holiday season.
        
       | thehappypm wrote:
       | I used to work on a system with a heavy amount of UGC (gaming
       | network). We had a big team of people doing moderation of
       | reported content, and the amount of stuff they had to deal with
       | was monstrous. Lots of death threats, racism, evil images, SWAT-
       | ing. They'd invite people to do ride-alongs to see the types of
       | things that were reported, and it was brutal. A very tough job. A
       | lot of those folks were ex-military, we needed really tough
       | people.
       | 
       | They were kind of a weird org, the police of the network, and
       | they kind of kept to themselves but carried a ton of weight. When
       | they were looped into a meeting for a new feature -- for example,
       | "Should we let users send videos to each other?" -- they'd list
       | out the evil ways people would use them, and then list out the
       | various CSAM scans we'd be obligated to do if we hosted the
       | content on our servers. They didn't have the means to block
       | features, but they did put a healthy fear-of-God into us, and
       | made it so we didn't have a lot of big public debacles of kids
       | seeing porn on the home screen of the device, or things like
       | that.
       | 
       | I can imagine that Apple has a similar group. I can imagine they
       | went to some executive, told them that this would be their best
       | way to comply with the law, of course you don't need to access
       | the images themselves, that would be a privacy violation and we
       | don't do that! We just need the hashes, may we remind you this is
       | CSAM we're talking about? Then it went up the chain and people
       | were excited to "protect the kids" and protect the brand, hashes
       | != user data, everyone wins!
       | 
       | I'm guessing this move was mostly misaligned priorities, and
       | possibly some genuine incompetence or perhaps people not speaking
       | up when they saw the potential here.
        
         | mmcnl wrote:
          | I have no idea what UGC is. Can you please not use
          | abbreviations that serve no purpose other than to confuse the
          | reader?
        
           | [deleted]
        
           | jackson1442 wrote:
           | User generated content. It's a pretty standard term in the
           | game dev community.
        
           | toqy wrote:
            | It's a standard term for the audience of this site; can you
            | please not be so self-important?
        
             | kzrdude wrote:
             | Not for me, and my account is 10 years old. Just a single
             | data point, though.
        
             | pensatoio wrote:
             | Never heard of it. I doubt the gaming industry makes up a
             | sizeable portion of HN.
        
           | thehappypm wrote:
           | My apologies, I hear it so much I just assume everyone on HN
           | does too.
        
         | joe_the_user wrote:
         | _I used to work on a system with a heavy amount of UGC (gaming
         | network)_
         | 
          | And here we have it: a situation where the content a user
          | generates on what's ostensibly "their machine" is being treated
          | more and more, by companies and the state, as equivalent to
          | "UGC", user generated content - stuff explicitly being shared.
          | Obviously, the insidious thing is that no one will be able to
          | create anything without even the creation process falling under
          | someone's "terms of use".
        
           | thehappypm wrote:
           | It's a different ball game when the content is being shared
           | on a network, I think.
        
         | [deleted]
        
         | belter wrote:
          | So this group of good, hardened souls... that from the
          | description could exist at Apple... went to speak to their
          | management about all the good things they need to do. Do they
          | also speak up to their management about all the activities
          | Apple is complicit in so as to keep its presence in the
          | Chinese market?
        
           | spicybright wrote:
           | I doubt it. If they did apple would just find "stronger"
           | people.
        
           | thehappypm wrote:
           | No, because that's not in their wheelhouse.
        
       | shantara wrote:
       | I was wondering how Apple would handle their imminent iPhone 13
       | announcement event in light of the public outcry over the photo
        | scanning. Now we know the answer: they will pretend to listen
       | until everyone buys a new shiny, and then quietly slip in the
       | changes without making much noise after everyone forgets about
       | them.
        
         | bitigchi wrote:
         | "Good Morning!" (Grinning like a Cheshire Cat).
        
         | neop1x wrote:
          | It may be easy to hide that as part of an update, after the
          | mainstream focus settles down. It could be part of a "various
          | privacy improvements" changelog line and might not be easily
          | noticeable. And if some security researcher discovers it many
          | years later, they will say "we told everyone in 2021 we cared
          | about children and that we would implement this".
        
       | uptownfunk wrote:
        | I completely cancelled Apple iCloud as a result.
        
       | balozi wrote:
        | What's the point of delaying anything? The damage is already
        | done.
        | 
        | I can only imagine all the yelling and recrimination happening in
        | executive meetings at the Apple Spaceship. Oooff!
        
       | tonyedgecombe wrote:
       | In the news tomorrow: "Apple allowing child porn on their
       | phones".
        
       | jliptzin wrote:
        | Here's my take on this:
       | 
       | I think this was Apple calling the US government's bluff. I don't
       | think they ever wanted to do anything like this because they know
       | it destroys their claims of superior privacy. I think they have
       | internal plans to roll out E2E iCloud encryption so that in
       | addition to not being able to provide law enforcement the code to
       | unlock any phone, they also won't be able to help law enforcement
       | decrypt anything stored in iCloud, including those phone backups.
       | So Apple sees the incoming fight where government cries foul to
       | the public, making the same tried and true "think of the
       | children" arguments, saying now all pedophiles will be free to
       | save their CSAM on Apple servers. It's the government's way to
       | try to get the public against Apple on this, and this is how
       | Apple neuters that argument.
       | 
       | But I don't think anyone in the government really cares about the
       | CSAM. What they really want is backdoor access to the devices for
       | the high-value targets that they actually care about, or just to
       | make their jobs a lot easier in other investigations, but that's
        | a lot harder to sell to the public. Look at Facebook: they
       | made 20 MILLION CSAM reports last year ALONE. That number was
       | astounding to me. I haven't heard anyone discuss the sheer
       | numbers yet. Think about that for a second. That's one company,
        | making 55,000 CSAM reports PER DAY. The equivalent of a
        | medium-sized city being reported for CSAM materials every day! I don't
       | know how many people work in government handling those reports,
       | but I'd imagine the process of taking in reports, evaluating
       | them, forwarding them to appropriate authorities, and then making
       | arrests is not one that can possibly be automated away to deal
       | with that kind of volume. And Apple would generate at least as
       | many reports as Facebook I'd imagine, not to mention all the
       | other sources of CSAM reports that are out there. Do we really
       | think there's anyone in law enforcement who is saying "oh gee, if
       | only we had an ADDITIONAL 55,000 CSAM reports coming in PER DAY
       | then we'd really be able to get a handle on the problem." If
       | anything, that just provides even more noise, making it harder to
       | find real signals.
       | 
       | So now that they've shown they're willing to call law
       | enforcement's bluff, I think law enforcement predictably has now
       | said to them ok, it's not really about the CSAM, we still want
       | backdoor access for other reasons, and now Apple is reevaluating
       | their response.
       | 
       | The issue for Apple though is that even if they're legally
       | allowed to say no to law enforcement requests to open up
       | backdoors, they're probably being extorted by law enforcement
       | threatening to come down on them for antitrust issues if they
       | don't go along with it. Smaller companies that don't have
       | antitrust issues to worry about would be able to put up a much
       | more resilient fight.
        
         | nullc wrote:
         | I'd like to believe that, but the investment into this faulty
         | system was clearly millions of dollars. That is an awful lot of
         | money for a feint.
         | 
          | Re: Facebook, the tiny number of court cases in the face of an
          | astronomical number of reports is an indication that we really
          | don't understand what is going on there or the motivations.
        
           | jliptzin wrote:
           | Millions of dollars to Apple is like pennies to us, though.
        
       | numair wrote:
       | For everyone who is saying that the technical community "won"...
       | Nobody's been fired. Rather, Apple's senior execs _publicly
        | doubled down_ and basically labeled those of us who thought this
        | scheme insane as a bunch of uninformed idiots...
       | 
       | You can change a product decision in a day, but it takes a LONG
       | time to change a culture that thinks these sort of insane product
       | decisions make any sense whatsoever. Making a long-term bet on
       | Apple has become precarious at best.
        
         | chaz6 wrote:
         | Perhaps they need to be reminded of what happened when RSA/NSA
         | backdoored Dual EC. Eventually a foreign power discovered it
          | and used it for their own purposes.
        
         | chrononaut wrote:
         | > but it takes a LONG time to change a culture that thinks
         | these sort of insane product decisions make any sense
         | whatsoever
         | 
         | Assume Good Faith. If there's one thing I learned about
         | sensationalist stories or situations, it's that the position of
         | "the other side" / reality is far more nuanced than the story
         | tellers would like to make you believe. The result still may be
         | poor, but the steps that brought people there are not nearly as
         | insidious as it's normally presented.
        
         | hughrr wrote:
         | Yeah the thing that really got me was the double down. They are
         | not on my side.
        
         | livueta wrote:
         | Yeah. I think some celebration is warranted, but the "disaster
         | averted, my iPhone is back to being a trustworthy user agent"
         | take seems to be a bit myopic when the core problem is that
         | Apple possesses enough control over the devices it sells to be
         | able to implement something like this by fiat. Sure, Apple
         | backed down today, but until users are able to exercise the
         | four essential freedoms they're living under the sword of
         | Damocles.
        
           | amelius wrote:
           | "Trust is gained drop by drop, but lost by the bucketful."
        
         | zekrioca wrote:
         | The technical community was labeled the "screeching voices of
         | the minority".
        
           | jodrellblank wrote:
           | Not by Apple, much as the media would love to have you think
           | otherwise. The 'screeching' description comes from Marita
           | Rodriguez, executive director of strategic partnerships at
           | the National Center for Missing and Exploited Children, in a
           | memo to Apple employees.
           | 
           | [Edit, since I can't reply anymore because I'm "posting too
           | fast"] I didn't say you thought that personally; in a sub-
           | thread about Apple senior execs applying negative labels to
           | people, and this being one of the most insulting labels
           | applied, I think it's important to note that this label came
           | from somewhere else. (and maybe important to note "the
           | technical community" were not labelled "screeching voices";
           | the memo said "the air will be filled with the screeching
           | voices of the minority". It didn't say "everyone who
           | disagrees is a screeching voice").
           | 
           | [1] https://www.howtogeek.com/746588/apple-discusses-
           | screeching-...
        
             | simondotau wrote:
             | > _[Edit, since I can 't reply anymore because I'm "posting
             | too fast"]_
             | 
             | This means you've had your account sanctioned by Dang. You
             | might want to send him an email and offer him your most
             | profuse and sincere apology for whatever it is you did to
             | anger him.
        
             | zekrioca wrote:
             | It is interesting how you thought that I thought the label
             | came from Apple, even though I did not mention it.
             | 
             | The _key_ point is that the label did come from the
             | collaboration that Apple engaged in.
        
             | SheinhardtWigCo wrote:
             | An Apple software VP forwarded the memo to engineers,
             | saying:
             | 
             | > I wanted to share this note that we received today from
             | NCMEC. I found it incredibly motivating, and hope that you
             | will as well.
             | 
             | https://9to5mac.com/2021/08/06/apple-internal-memo-icloud-
             | ph...
        
       | justinzollars wrote:
       | Still on the fence about this brand now. Privacy is the last
        | thing I think about when I think Apple.
        
       | stevenalowe wrote:
       | "Child safety features" because only a monster would be against
       | such a thing.
       | 
       | I seem to recall gun trigger locks being pushed with the exact
       | same phrase (because "self-defense obstacles" would be too
       | accurate)
       | 
       | "For the children" is the last refuge of politicians and conmen.
       | 
       | So what's the bottom line?
       | 
        | I'm guessing that scanning (for many things) is mandated by
       | multiple governments, and on-device scanning is orders of
       | magnitude cheaper than server-side scanning.
       | 
       | I hope this PR nightmare for Apple continues until the full
       | extent of mandated surveillance is exposed
       | 
       | But it probably won't - Apple's mistake was in thinking that
       | people would blindly accept this as a desirable feature, instead
       | of just quietly implementing it.
       | 
       | Or, maybe Apple did it this way on purpose to expose the issue to
       | public scrutiny.
        
       | mulmen wrote:
       | Honestly the trust damage is already done. They announced it, the
       | genie is out of the bottle. I can never trust Apple as much as I
       | did before. They are no longer the benevolent dictator of the
       | walled garden. They have proven their judgement is fallible. They
        | have proven that their customers' interests do not matter to them
       | as much as their own agenda. I'm not sure how they can ever fully
       | recover from this, or even promote themselves as a privacy
       | focused brand. That will always ring hollow.
       | 
        | This reminds me a bit of the DNS-over-HTTPS hand-wringing as
       | well. DoH points out that applications can run their own DNS and
       | you can't easily control that behavior at a network level like
       | with conventional DNS. That is pretty troubling from a
       | technically literate user perspective but it's not actually
       | _new_. Applications could always hard-code DNS. DNS-over-HTTPS
       | just made us think about it.
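        | 
        | For example, a minimal sketch of an application doing its own
        | resolution over HTTPS, using Google's public DoH JSON API (any
        | DoH provider works the same way):
        | 
        |     # The app never touches the network's configured resolver.
        |     import json, urllib.request
        | 
        |     def doh_lookup(name, rtype="A"):
        |         url = ("https://dns.google/resolve"
        |                f"?name={name}&type={rtype}")
        |         with urllib.request.urlopen(url) as resp:
        |             answer = json.load(resp).get("Answer", [])
        |         return [record["data"] for record in answer]
        | 
        |     # The network operator sees only an HTTPS connection to
        |     # dns.google; the query itself is invisible to DNS-level
        |     # controls.
        |     print(doh_lookup("example.com"))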
       | 
       | Similarly Apple has complete control of your device. They always
       | have, it's actually part of the value proposition. There has been
       | a lot of debate about what Apple should or shouldn't do here but
       | the fact is they could have pushed this out silently and it could
       | be scanning your entire device right now, we just don't know. We
       | have to trust Apple at their word and after their announcement
       | that they would push this kind of capability I'm not sure how we
       | can ever trust them again.
        
         | eric__cartman wrote:
         | As many have been saying for decades, the only way to have
         | privacy when computing is by using publicly auditable open
         | source software. No amount of pinky promises made by the
         | marketing department can change that.
         | 
         | It doesn't matter if the end user doesn't audit everything
         | themselves, that's impossible to do in a reasonable manner, but
         | the constant auditing and testing by many independent entities
         | gives, in my opinion, a better assurance of privacy than some
         | marketing material saying they deeply care about me.
         | 
          | Somehow a large subset of the general public doesn't seem to
          | understand this, and acts all surprised when Apple, Google,
          | Microsoft or whoever screws them over.
        
         | athrowaway3z wrote:
         | You don't deserve this personally, and I'm genuinely interested
         | if you have more insight, but I feel the need to vent:
         | 
         | Really? What the fuck did Apple do, that you gave them total
         | control to dictate _your_ walled garden in the first place?
         | 
          | I just don't get it. My best guess is sunk-cost-colored
          | glasses. From the moment they launched iTunes (which was 15
          | years ago for me)
         | I understood their value was a walled garden as a platform.
         | Users as a commodity. Valuation as overlords.
         | 
         | What ever gave you a different idea?
        
         | [deleted]
        
       | TedShiller wrote:
       | Apple doing this to users' phones is like the police entering
       | every house in the city (or in this case, the COUNTRY), just in
       | case someone has something illegal in their basement. Nothing to
       | worry about if you have nothing to hide, right?
       | 
       | The presumption of guilt is the problem. Freedom means that I
       | shouldn't even be suspected, and certainly not searched or
       | monitored, unless there is a clear reason.
       | 
       | If there is a reason to suspect me, all of these tactics are fair
       | game. But not before that.
        
         | mikehearn wrote:
         | This analogy doesn't accurately represent the technology, at
         | least as I understand it.
         | 
         | In Apple's implementation, the device never knows if a
         | particular picture is a CSAM match. That determination is made
         | in iCloud when the server attempts to decrypt the safety
         | voucher. Until that point, it's just an encrypted payload that
         | the device can't interpret one way or the other.
         | 
         | In your analogy, where "your home" is the equivalent of "your
         | device", the police never enter the home to determine whether
         | you have anything illegal. Instead, there's some process that
         | boxes up all your stuff into nondescript, anonymous boxes that
         | can only be opened if someone has the key.
         | 
         | To determine illegality, you'd have to voluntarily send them
         | off to the police (police = iCloud), where they only have a
         | handful of keys - they have a "gun" key, a "knife" key, and a
         | few other keys for boxes containing illegal items. But the
         | boxes are nondescript, so the police don't know whether you
         | have anything illegal until they insert the key and turn it. If
         | the "gun" key successfully opens the box, the box contains a
         | gun, and you are reported. If all the police's keys fail on a
         | particular box, then whatever is inside must not be illegal and
         | the police never learn its contents.
         | 
         | Needless to say, this analogy is tortured because it's hard to
         | apply Apple's tech to a physical process, but the point is that
          | whether something is "illegal" can't be determined
         | until you voluntarily ship it off to an entity that has the
         | keys to unlock it.
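          | 
          | A toy model of those "nondescript boxes" in code. To be clear,
          | this is NOT Apple's actual PSI/threshold protocol; the XOR
          | cipher just stands in for real authenticated encryption:
          | 
          |     import hashlib
          | 
          |     MAGIC = b"OK::"  # lets a successful unlock be recognized
          | 
          |     def _keystream(image_hash, n):
          |         key = hashlib.sha256(b"voucher" + image_hash).digest()
          |         return (key * (n // len(key) + 1))[:n]
          | 
          |     def seal(image_hash, payload):
          |         # Device side: the box is opaque even to the device.
          |         data = MAGIC + payload
          |         ks = _keystream(image_hash, len(data))
          |         return bytes(a ^ b for a, b in zip(data, ks))
          | 
          |     def try_open(known_hashes, voucher):
          |         # Server side: try each key it holds (the "gun" key,
          |         # the "knife" key...); a failed unlock reveals nothing.
          |         for h in known_hashes:
          |             ks = _keystream(h, len(voucher))
          |             plain = bytes(a ^ b for a, b in zip(voucher, ks))
          |             if plain.startswith(MAGIC):
          |                 return plain[len(MAGIC):]
          |         return None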
        
           | noptd wrote:
           | >In Apple's implementation, the device never knows if a
           | particular picture is a CSAM match.
           | 
            | That's a distinction without a difference w.r.t. the end
            | result, but I'll offer a more apt analogy regardless.
           | 
           | A better analogy would be the police installing a device in
           | your house that's capable of seeing or hearing anything that
           | happens and then claiming there's nothing to worry about. The
           | device is only watching a specific door in your house and
           | forwarding a hash of that information to their servers.
           | Nevermind that it would only take a policy change and an OTA
           | update that you have no visibility into, or chance of
           | blocking, before it's watching your entire house in real-
           | time.
           | 
           | But hey, you have several other doors to enter or exit your
           | house from, and it's not like the camera actually knows
           | anything, only the people on the other end do, so what's the
           | big deal right?
           | 
           | Would you trust the police in that scenario?
        
       | pshirshov wrote:
       | Well, I personally feel that the trust was broken.
       | 
       | I've migrated to a self-built Android (that was quick and mostly
       | painless, I would recommend CalyxOS and/or Graphene to anyone)
       | and have a long-term plan to completely pull myself out of Apple
       | ecosystem.
       | 
       | Also it was a good reminder to degoogle myself, so I did.
        
       | midrus wrote:
       | I was going to buy an iPhone before they announced this, because
       | I'm tired of all the Google tracking stuff.
       | 
       | If they finally roll this out, I think my next phone will be a
       | Nokia 1100. I don't have anything to hide, but I'm tired of
       | everyone trying to track and sneak into your personal stuff, for
       | whatever reason.
        
       | mrtksn wrote:
       | In my opinion, Apple will break its brand if they actually go
       | ahead and release "self-reporting to the authorities" device
       | features.
       | 
       | Even if they somehow manage to make it do exactly the thing it is
       | supposed to do and not slide into a feature for mass governmental
        | control, I think a lot of people will lose trust in the devices
        | they own.
       | 
       | Automatically scanning your stuff and reporting the bad things to
       | the police is like the concept of God always watching. Even if
       | you don't do anything wrong and you agree that the stuff they are
       | watching for is bad and should be prevented, there is something
       | eerie about always being watched and controlled.
       | 
       | The iPhones should mind their own business and leave the police
       | work to the police.
       | 
        | That being said, I would be O.K. if the scan were performed on
        | the device upon physical access by the authorities. You caught
        | someone, you have enough evidence to investigate the person, you
        | are allowed to search their home; then maybe it can be O.K. to
        | run a fingerprint scan on the phone to look for evidence.
        
         | Lamad123 wrote:
         | this whole thing just made me wake up to how much control that
         | company has over something that's supposed to belong to those
         | who buy it!!
        
           | lm28469 wrote:
           | > company has over something that's supposed to belong to
           | those who buy it!!
           | 
           | If it's not written in the law nothing is "supposed to"
        
           | slg wrote:
           | I don't know why more people didn't come to this conclusion.
           | The choice between scanning the files before upload to the
           | cloud or after upload to the cloud is largely a policy
           | decision and not a technical one since Apple already controls
           | both ends of that process.
           | 
           | If you don't trust them to scan things on your device why are
           | you trusting them to scan things in the cloud and why are you
           | trusting them with both the hardware and software you run?
        
             | wizee wrote:
             | 1. They own their cloud, you own your device. Having your
             | own device work against you is more problematic in
             | principle than having another party you give your data to
             | working against you. I don't like the concept of devices
             | that I own betraying me.
             | 
             | 2. When scanning is done locally, it's easier for them to
             | secretly or openly modify the software to scan everything
             | on your device, rather than just what's just uploaded to
             | iCloud. If they tried scanning everything (including non-
             | iCloud images) in the cloud, the network traffic of
             | uploading every image on your device to the cloud would be
             | observable.
        
               | slg wrote:
                | You seem to have missed my point. They control the
               | entire ecosystem. If you are worried about them doing
               | something secretly to violate your trust, you should
               | assume they are already compromised because the only
               | thing stopping them is a policy decision and not a
               | technical limitation.
        
               | fsflover wrote:
               | The difference is, they _promised_ not to look at
                | personal files. If you are using their devices, you trust
                | their promises. Now they have broken that promise, and a
                | lot of people are losing trust.
        
               | slg wrote:
               | >they promised not to look at personal files
               | 
               | Where was this promise stated in these explicit terms,
               | especially the definition of "personal files"? Because
               | Apple is also promising that this only applies to file
               | that are set to be uploaded to iCloud and the expectation
               | of privacy is different for files uploaded to the cloud.
               | 
               | Either Apple's promises can be trusted or they can't. And
               | if they can't, you can't trust anything about their
               | devices.
        
               | dschuessler wrote:
               | I think this comes pretty close to what you're asking
               | for: https://www.maclife.de/media/maclife/styles/tec_fron
               | tend_lar...
               | 
               | Of course it is not an explicit promise and doesn't refer
               | to "personal files" but I think it shows where OP is
               | coming from. You can reasonably understand this message
               | in the way they do.
        
               | ohazi wrote:
               | > They own their cloud, you own your device.
               | 
               | They push arbitrary software updates to your device at
               | their whim, therefore _they_ own your device.
        
           | Terretta wrote:
           | Arguably, we are still collectively asleep that the real
           | problem needing solving is client-side SDKs.
        
             | spicybright wrote:
             | Can you expand on that?
        
         | godelski wrote:
         | > Automatically scanning your stuff and reporting the bad
         | things to the police is like the concept of God always
         | watching.
         | 
         | What I find interesting is how our perspectives shifted over
         | time. 1984 was considered a dystopian nightmare and within that
         | world Big Brother wasn't always watching but rather _could_
         | watch at any time. The simple ability to be able to tune in was
          | enough to create the fear and have people report others. We're
          | well past that, to be honest, and interestingly our
          | interpretation has shifted as well: now Big Brother _is_ always
          | watching, and that is the dystopian nightmare. In this sense
          | we're there in some areas of our lives, and the response has
          | shifted to "I have nothing to hide" or "what can I do?"
         | 
         | So it is interesting to see that essentially the window shifts
         | further and further in that direction and as long as the shift
         | is sufficiently slow the general public seems not to care. I
         | wonder if there is a turning point.
        
           | skoskie wrote:
           | I think it's critical to understand that a single law can
           | shift that window drastically. If Apple can allow for a small
           | shift in that direction - however distasteful - it may
           | prevent the larger shift caused by anti-privacy legislation.
           | 
           | https://www.eff.org/deeplinks/2019/12/senate-judiciary-
           | commi...
        
         | robertoandred wrote:
         | It of course does not self-report to any authorities.
        
           | yjftsjthsd-h wrote:
           | So how would you describe the actions of this system when it
            | thinks it has detected illicit pictures?
        
             | oreilles wrote:
              | If your phone were to detect illicit pictures, nothing
              | would happen unless you sent those files to Apple, in
              | which case they would have known that you had them anyway.
              | There is nothing even remotely close to the truth in
              | saying that your phone "self-reports" to the authorities.
              | Not only does Apple have no idea what photos you have on
              | your phone, nor what they are scanned against, unless
              | _you_ send them your photos with iCloud, but it also isn't
              | your phone reporting to the authorities but Apple, once
              | they know for sure that you have illegal content on
              | _their_ servers.
        
             | robertoandred wrote:
             | If the phone and the server agree that enough CSAM matches
             | have been detected, the matches get reviewed by Apple. If
             | Apple agrees with the phone and the server, the matches get
             | reviewed by the NCMEC. If the NCMEC agrees with Apple and
             | the phone and the server, the NCMEC may get law enforcement
             | involved.
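                | 
                | That chain, as a sketch (hypothetical, every name
                | invented; each stage must independently agree before
                | anything escalates):
                | 
                |     def escalate(match_count, threshold,
                |                  apple_agrees, ncmec_agrees):
                |         if match_count < threshold:
                |             return "nothing happens"
                |         if not apple_agrees:
                |             return "dismissed at Apple review"
                |         if not ncmec_agrees:
                |             return "dismissed at NCMEC review"
                |         return "NCMEC may involve law enforcement"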
             | 
             | Bad-faith actors want people to think that taking a picture
             | of your baby in the tub will bring a SWAT team to your door
             | the next day.
        
               | Gareth321 wrote:
               | You just described reporting to authorities.
        
               | turdnagel wrote:
               | I guess we're splitting hairs over "reporting" and "self-
               | reporting." I think the idea is that there is a layer -
               | yes, a fallible, human one - between detection and
               | reporting.
        
               | stjohnswarts wrote:
                | There is nothing "bad faith" about pointing out the
                | mentality of total device surveillance behind this. Most
                | people who are angered by this aren't worried about
                | false positives of baby's first bath. They are worried
                | about Apple scanning stuff on your phone as a proxy for
                | the police. What's next? Copyright infringement? Combing
                | through texts for drug activity? Thought crimes against
                | the CCP? It's not a slippery slope fallacy: all through
                | history, governments have always wanted more power over
                | their citizens, and to not even have your phone be safe
                | from arbitrary scanning is just too damn far.
        
               | oreilles wrote:
               | It's not Apple scanning your device. It's your device
               | scanning itself, and _you_ sharing the results of those
                | scans to Apple with iCloud if you decide to use it.
                | Don't want Apple to see your data? Just don't send it to
                | them, it's that simple.
        
               | stale2002 wrote:
                | What's even simpler than your solution is to pressure
                | Apple politically to change this.
                | 
                | And it looks like that worked, and Apple got rid of this
                | self-reporting feature.
        
               | mikeiz404 wrote:
               | > It's your device scanning itself, and you sharing the
               | results of those scans to Apple with iCloud if you decide
               | to use it.
               | 
               | While technically correct, there is much power in the
               | defaults that are set. iCloud photo sharing is on by
               | default and provides some useful features, especially if
                | you have multiple Apple devices. Also, Apple doesn't
               | provide a good automatic way to do backups outside of
               | iCloud.
               | 
               | And realistically speaking how many "normal" people will
               | even be aware of how to turn this off and be willing to
               | go to the effort? Will they have a choice? Sure,
               | technically. Practically though the majority of people
               | stick to the defaults.
        
               | skoskie wrote:
               | I fully admit to being a bit pedantic here, but you are
               | not logged into iCloud on a device by default. Once
               | logged in, then yes, iCloud photos is enabled, but that
               | seems like a fair assumption if you are using iCloud
               | services.
               | 
               | > And realistically speaking how many "normal" people
               | will even be aware of how to turn this off and be willing
               | to go to the effort? Will they have a choice? Sure,
               | technically. Practically though the majority of people
               | stick to the defaults.
               | 
               | I fully agree with your point here, but also, how many
               | "normal" people deal in CSAM? The vast majority of people
               | just won't be affected by this in any noticeable way.
        
               | malwrar wrote:
               | Look, I'll just say it directly because (slippery-slope
               | fallacy aside) we all know this is how things will play
               | out in a few years after this gets rolled out:
               | 
               | gov: "hey Apple, pedophiles are rampant rn, do you really
               | need to wait before the devices upload photos to icloud
               | before you use that on-device scanning? here's some cool
               | proposed legislation to motivate the conversation. oh
               | also here's some more images that are illegal."
               | 
               | This conversation will happen behind closed doors with
               | much better wording and probably with real stakes for
               | Apple. They've built a dream technology for control
               | freaks in government across the world, something that we
               | in the technology sphere have traditionally fought
                | against for decades. All it needs is an honest day's
                | work to scan all the time instead of only on upload.
                | 
                | This is the fight to fight if you'd rather not deal
                | with a future where, surprise, the scope has crept but
                | it's now too normal to oppose. People will get vanned
                | by less benign governments that treat things far more
                | mundane than pedophilia as undesirable.
        
               | oreilles wrote:
                | If Apple wanted to make the iPhone a sandwich, you
                | could eat it. But until then, you can't.
               | 
                | Apple could read your whole phone's content and send it
                | to governments without your consent if they wanted to,
                | on-device scanning or not. Until they do, there is no
                | breach of privacy.
               | 
               | If you believe Apple can implement one-party consent
               | snitching on iPhone without telling anyone, then you
               | shouldn't use closed source software in the first place,
               | because there is zero reason to believe that they haven't
               | already implemented this.
        
               | malwrar wrote:
               | > Apple could read your whole phone content and send it
               | to governments without your consent if they wanted to, on
               | device-scanning or not. Util they do, there is no breach
               | of privacy.
               | 
               | This misses the point. Apple has repeatedly assured its
               | customers (I'm waiting for my Pinephone to get here so I
               | no longer need to be one) that the devices they buy are
                | theirs. This is how things have always been, and
                | although you're right that they could _in theory_ slip
                | some law enforcement update onto my phone, they've
                | publicly refused to do so before, and I trusted them as
                | a result.
               | 
               | They're not secretly slipping in a government backdoor
               | though, they're backpedaling on their promises and openly
               | installing a backdoor on my phone for no reason. Why let
               | this slide like it's normal, or pretend like this isn't a
               | betrayal? They refused to unlock an iPhone owned by a
               | mass shooter!
               | 
               | Plus, for many people, Apple and Android are basically
               | the only options for modern computing. Their decisions in
               | this market normalize stuff like this. I don't want
               | future devices to be forced to implement easily-abused
               | crap like this because Apple convinced people this is a
               | solvable problem with machine accuracy. We're already
               | seeing this for open questions like moderation and
               | copyright enforcement, with horrific results that are now
               | just normal.
        
               | oreilles wrote:
               | > Apple has repeatedly assured its customers that the
               | devices they buy are theirs
               | 
                | That is definitely not true. Apple has always been
                | anti-ownership: making the OS closed source, preventing
                | installation of other OSes, preventing sideloading of
                | applications, preventing self-repair, preventing parts
                | distribution, etc.
               | 
                | What they used to say is that what's on your device
                | stays on your device, and that you decide what you
                | share. And they stay true to that promise. If you don't
                | want to share your pictures with Apple, you don't use
                | iCloud Photos, end of story.
               | 
               | > openly installing a backdoor on my phone
               | 
                | A backdoor is defined by two characteristics: one, the
                | user does not know of its existence; two, it allows the
                | attacker to bypass device security in order to gain
                | access. You know the scanning feature exists, and it
                | does not allow Apple nor anyone else to gain access to,
                | or information about, what's on your phone without your
                | consent. It is not a backdoor in any sense of the word.
        
               | skoskie wrote:
               | I think you make the perfect point, with one exception.
               | Rather than future tense, it should be past tense.
               | 
               | https://www.eff.org/deeplinks/2019/12/senate-judiciary-
               | commi...
               | 
               | > This conversation will happen behind closed doors
               | 
               | It did.
               | 
               | > with much better wording and probably with real stakes
               | for Apple.
               | 
                | Pretty much. Quote from the infamous Lindsey Graham
               | (2019): "My advice to [big tech] is to get on with it,
               | because this time next year, if we haven't found a way
               | [to get at illegal, encrypted material] that you can live
               | with, we will impose our will on you."
        
               | pope_meat wrote:
                | Someone should throw an iPhone at Lindsey like that guy
               | did with his shoes at Bush. It'll probably do little, but
               | at least it will be satisfying.
        
               | Piskvorrr wrote:
               | And learned-from-history actors think of all the previous
               | "think of the kids" campaigns, and their subsequent scope
               | creep: Oh, we sorta kinda extended the hash database to
               | also include, say, Bitcoin. Because we can. No, there's
               | nothing really wrong with it, and anyway, the innocent
               | have nothing to fear, right? And also Winnie the Pooh and
               | the tankman; the Chinese government asked very nicely.
               | Oh, and the Afghan government also had some requests on
               | what hashes they need reported. Yeah, that's the Taliban,
               | but if you're innocent, you have nothing to fear, right?
               | Brb, there's an email from Lukashenka asking for
               | detections of anything white-red-white...
        
               | robertoandred wrote:
               | Expand the hash database to include Bitcoin? What? I
               | really don't think you understand how this works.
        
               | jodrellblank wrote:
               | > " _And learned-from-history actors think of all the
               | previous "think of the kids" campaigns, and their
               | subsequent scope creep:_"
               | 
               | You say, and then go on to give as examples /things which
               | haven't happened, but you fantasised might happen/. That
               | isn't learning from history.
        
               | MeinBlutIstBlau wrote:
               | So if a creep amasses a database of those even though
               | individually they aren't "illegal", what do you do? You
               | can't call it CP for one but not for the other.
                | Especially if they're made publicly available on
                | Facebook. It's a lame slippery-slope argument, but it's
                | gonna get hashed out in criminal defense cases where
                | Apple is calling someone a pedophile.
        
               | jmull wrote:
               | Excuse me, but that is reporting to the authorities.
               | 
               | You're simply describing some of the details of how it
               | happens.
               | 
               | It's nice that there are some manual review steps, but
               | that hardly changes what's happening. Especially since
               | the human reviewers are almost surely going to pass the
               | case through to the next step if there is any ambiguity.
        
               | robertoandred wrote:
               | It absolutely changes what's happening. The phone is not
               | self-reporting to the authorities. Simply false.
        
               | noptd wrote:
               | Let's not be pedantic. In the parent's example, this is a
               | distinction without a difference.
        
               | genewitch wrote:
               | That's just self-reporting to the authorities with a
               | bunch of extra fallible human steps
        
               | robertoandred wrote:
               | No, it's self-reporting to Apple.
        
               | stjohnswarts wrote:
                | lol, who are at this point just a proxy for police
                | intelligence.
        
               | hannasanarion wrote:
               | In what ways do you think those human steps can fail?
               | 
               | The method only returns positive on a hash-match with a
               | known piece of CSAM from NCMEC.
               | 
               | The human process is looking at a picture somebody has
               | already been convicted of child abuse for having, and
               | comparing it with yours to see if they're the same.
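                | 
                | For intuition, a minimal sketch of that matching step
                | (hypothetical Python: a plain SHA-256 set lookup,
                | whereas the real system uses NeuralHash perceptual
                | hashes and a blinded database):
                | 
                |     import hashlib
                | 
                |     # Hypothetical stand-in for the NCMEC-derived
                |     # database of known-CSAM fingerprints.
                |     known_hashes = {"placeholder_digest_1",
                |                     "placeholder_digest_2"}
                | 
                |     def fingerprint(image_bytes: bytes) -> str:
                |         # Real system: perceptual NeuralHash; here a
                |         # toy exact-match SHA-256 digest.
                |         return hashlib.sha256(image_bytes).hexdigest()
                | 
                |     def is_match(image_bytes: bytes) -> bool:
                |         # Positive only if the fingerprint is already
                |         # in the database; novel images never match.
                |         return fingerprint(image_bytes) in known_hashes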
        
         | YLYvYkHeB2NRNT wrote:
         | > loose trust into the devices they own.
         | 
         | I already did and have moved on. And I was with them for a long
         | time. Thousands of dollars, also. Thousands.
        
         | oreilles wrote:
         | > "self-reporting to the authorities"
         | 
          | That's 100% not what the feature is about. People need to
          | stop spreading this nonsense. Apple only gets the result of
          | the scans when the files are sent to them via iCloud; _then_
          | they, Apple, not your device, would report to the
          | authorities. Also, they were already looking for CP in your
          | iCloud content and reporting it to the authorities anyway,
          | as was shown recently with the doctor who was caught with 2k
          | images. So whether you were or weren't using iCloud, privacy-
          | wise on-device scanning makes literally zero difference.
        
           | vorpalhex wrote:
           | It makes all the difference in the world on whether Apple
           | scans their own servers for content they don't want versus
           | your phone snitching on you.
           | 
           | I can choose to not upload things to Apple.
           | 
           | If my iPhone decides to scan my photos just because, I can do
           | nothing.
           | 
            | I won't accept my device being a snitch under someone
            | else's control. Companies change values, and Apple already
            | brazenly allows the CCP to snoop on Chinese iCloud data.
        
             | oreilles wrote:
              | Except your phone is not a snitch. Snitching is sending
              | information to a third party, which your phone doesn't
              | do unless you explicitly accept it by using iCloud
              | syncing.
             | 
             | You're basically upset that your operating system can read
             | your files. Well good luck finding an operating system that
             | cannot read your files.
        
               | vorpalhex wrote:
               | Apple is a third party.
               | 
               | Right now Apple says they only scan iCloud uploads. We
               | can't verify that, but let's assume they aren't lying.
               | 
               | Why not scan your whole library? They can, they could
               | make the argument it's to fight CSAM. I mean if you
               | aren't uploading to iCloud because you're a pedophile,
               | obviously we should scan your whole phone just in case.
               | Think of the children!
        
               | skoskie wrote:
                | You seem to be contradicting yourself. You have the
                | option to turn off iCloud Photos, in which case your
                | phone will not scan the photos because there is no
                | point.
               | 
               | > Why not scan your whole library?
               | 
               | That may very well be the case in the future, but nobody
               | can say for sure. But even if they do scan other files on
               | your phone, it's unlikely they will do that for files
               | that don't sync to iCloud. If this bothers you, you can
               | always just sign out of iCloud completely. You can still
               | retain access to the app store and music subscriptions
               | separately.
        
               | noptd wrote:
                | In that example, your phone is still a snitch, but it's
               | snitching to Apple in order for them to then snitch to
               | the "appropriate" authorities.
        
               | oreilles wrote:
               | No it is not, as zero information is sent to Apple unless
               | you use their cloud service. Now if you consider that a
               | cloud syncing service is spyware because it sends
               | information to the provider, I don't know what to say.
        
         | dylan604 wrote:
         | >That being said, I would be O.K. if the scan is performed on
         | the device upon physical access by the authorities.
         | 
         | Why? You have physical access to the device. Just look at the
          | data. The CSAM scan will only ID known imagery. What if
          | this person has new content that is unknown to the
          | database? This is a nonsensical justification.
        
           | mrtksn wrote:
            | I want my data to stay encrypted no matter what; however,
            | I accept that there are people who can do bad things, and
            | access to their data could be valuable.
           | 
           | So it is a compromise, the data stays encrypted but the
           | authorities can get an idea about the nature of the data.
           | 
            | Then it is up to the person to reveal their data as part
            | of a cooperation agreement. If there's a CSAM match, the
            | person can prove innocence by revealing the data (false
            | positives), or the judges can increase the severity of the
            | penalty for not cooperating.
           | 
            | Mileage may vary depending on your jurisdiction; however,
            | I think it is a good way to look at someone's stuff for
            | offensive material without actually looking at unrelated
            | stuff. So no match, no reason to try to get into the
            | device.
        
             | hfsh wrote:
             | > the person can prove innocence
             | 
             | Yeah...
             | 
             | That's not a thing many people are really comfortable with
             | in societies that are nominally non-authoritarian.
        
               | mrtksn wrote:
               | Innocence over credible suspicion.
        
             | dylan604 wrote:
             | What are you on about? The comment stated that the
             | authorities have physical access to the device. That means
             | that they can see whatever data you have because it's your
             | device showing it to them.
             | 
                | Also, this idea is even more dystopian than the
                | original concept. The original idea was only going to
                | scan content the user chose to push to Apple servers.
                | This current suggestion involves scanning the entire
                | device just because johnny law has the device. That is
                | soooo much worse of an idea.
        
               | mrtksn wrote:
               | Physical access doesn't mean access to a meaningful data
               | if it is properly encrypted.
               | 
                | I didn't say anything about Apple servers. They can
                | implement a protocol that works over a wired connection
                | only, instructs the device to match all the files on it
                | against a given list of fingerprints, and returns the
                | results. No match, no results. If there's a match,
                | return just enough to tell which fingerprints matched
                | and nothing more.
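                | 
                | A rough sketch of that flow (hypothetical Python; the
                | fingerprint list and file set are illustrative):
                | 
                |     import hashlib
                | 
                |     def wired_scan(fingerprints: set, files: list) -> list:
                |         # Runs on the device only when instructed over
                |         # a wired connection; returns the matched
                |         # fingerprints and nothing more.
                |         matched = []
                |         for data in files:
                |             fp = hashlib.sha256(data).hexdigest()
                |             if fp in fingerprints:
                |                 matched.append(fp)
                |         return matched  # empty -> nothing leaves the device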
        
               | skoskie wrote:
               | This is almost how the new scanning system works.
        
               | mrtksn wrote:
                | The difference is huge. Apple's proposed system scans
                | the photos of everyone who uses iCloud and reports them
                | to the authorities (with Apple being a proxy). It's a
                | snitch/God system.
                | 
                | In my proposal, all photos on the device are scanned
                | and the scan results (hashes) are stored on device, but
                | no attempts to pin you down are made. Once there's a
                | reason to suspect that you might be a criminal, only
                | then are the scan results evaluated by the phone, upon
                | the request of the authorities, to check on you as part
                | of the investigation. This is a black-box system.
        
               | dylan604 wrote:
                | >Physical access doesn't mean access to meaningful
                | data if it is properly encrypted
                | 
                | And what counts as properly encrypted if someone can
                | exploit 0-day vulns to unlock the phone and access the
                | data in an unencrypted state? You're not playing this
                | through to the end of what physical access to the
                | device means.
        
               | mrtksn wrote:
               | These risks are valid for any device, it has nothing to
               | do with Apple or the feature in question. The police can
               | have a physical access to your devices with or without
               | implementation of this feature.
               | 
               | I don't understand what are you arguing for, physical
               | access is not a software thing and it is guarded by
               | exactly the same mechanism that guard the access to you
               | apple(the fruit) in your fridge.
        
         | Exuma wrote:
         | LOSE. The word is LOSE. LOOSE and LOSE do not even look
         | remotely similar phonetically. LOOSE. GOOSE. MOOSE. NOOSE.
         | CABOOSE. ... lose... which of these do not belong?
         | 
         | I see this mistake once a day and it literally makes no sense
         | to me, for the love of god stop plleeeasseeeeee
         | 
         | Can you imagine watching a movie called FootLose?
        
           | count wrote:
           | Don't loose your mind over a grammatical error.
        
             | Exuma wrote:
             | Yes I get it, grammar nazi blah blah blah. I don't normally
             | ever correct people's grammar (because who cares), but this
             | one error is just so mindblowing to me, and the SHEER
             | amount of times I see it makes me go absolutely nuts,
             | because if you've spoken english for any length of time,
             | you know about 1000 words with long-o soft-s that end in
             | "oose", yet I see people saying "lose" as "loose" about....
             | 2 times a day, 1 if I'm lucky, and it makes no sense
             | whatsoever.
        
               | [deleted]
        
               | oehpr wrote:
                | It's just a typo that doesn't get corrected. People
                | just aren't super duper proofreading; they're just
                | watching for the red squiggles that indicate a typo.
               | 
               | You wouldn't be really mad if you spotted an "an" instead
               | of an "and" in the wrong place, right? Because it's the
               | same accident.
               | 
               | If you think about it in these terms, you wouldn't of had
               | as much of a reaction ;)
        
               | Exuma wrote:
               | No it's definitely not a typo lol, it's people who are
               | reading it wrong in their head. The sheer amount of time
               | that specific mistake is made very strongly leads me to
               | believe it's most definitely not a typo.
               | 
               | Watch, now that I pointed it out you will see it
               | everywhere.
               | 
               | https://hn.algolia.com/?dateRange=all&page=0&prefix=true&
               | que...
               | 
               | https://www.google.com/search?q=site%3Areddit.com+%22loos
               | e+m...
               | 
               | This is only one combination of words for example, but
               | you get the idea
               | 
                | While super rough/anecdotal... "loose my" has 38,000
                | results and "lose my" has 900,000 results. Literally 1
                | out of every ~25 times the word is used, it's used
                | wrong (if Google is including roughly the same number
                | of items in its index).
        
               | krona wrote:
               | _The sheer amount of time that specific mistake is made_
               | 
               | Don't you mean the shear _number_ of _times_ that mistake
                | is made? I'm sure you have no insight into the
                | aggregate duration of time all these people took to
                | incorrectly press 'o' on their keyboards.
        
               | Exuma wrote:
                | You're making a point that I'm not ("someone is
                | correcting someone else's spelling, therefore I'll dig
                | through every single word to find a mistake they make
                | and call them a hypocrite").
               | 
               | I just said above I don't correct people's grammar
               | because a) it doesn't matter and b) I make mistakes all
               | the time.
               | 
                | This, on the other hand, is like seeing someone make
                | the mistake (2 * 2 = 6). Even the most basic/out-of-
                | the-math-loop person should not make this mistake. Or
                | at the very least, it shouldn't be a common mistake
                | made 1 out of ~25 times.
                | 
                | These are 2 very common words that sound completely
                | different, and there's nothing tricky about them.
                | That's why it makes no sense to me, and each time I see
                | it it's maddening. It's one thing to mess up something
                | like....... "if I was____" (improper subjunctive mood);
                | I see that mistake all the time, but who cares, because
                | how is anyone going to remember that? But lose and
                | loose is just so fundamental to knowing the difference
                | between when to use a hard/soft "S" preceded by "oo".
                | Furthermore, how many words are there with a soft "s"
                | preceded by only one "o"...? I can't think of any other
                | than "dose".... but there are tons that fit the bill
                | otherwise: hose, lose, nose, pose, rose..............
        
               | Toutouxc wrote:
               | > because if you've spoken english for any length of
               | time, you know about 1000 words with long-o soft-s that
               | end in "oose"
               | 
               | > it's people who are reading it wrong in their head
               | 
                | After speaking English (which is my second language
                | and has nothing in common with the Slavic language I
                | usually speak) for over 15 years, I learned TODAY that
                | those words are supposed to sound different.
               | 
               | Thank you and next time please consider that people like
               | me exist.
        
               | Exuma wrote:
               | Interesting, so when you said the following 2 sentences:
               | 
               | The team will lose the game.
               | 
               | The lid on the jar is loose.
               | 
               | How did you pronounce lose/loose up until now (either out
               | loud or in your head)?
        
               | gusmd wrote:
               | Another ESL speaker here. To me they are pronounced
               | exactly the same. The best way I can describe how I
               | pronounce it to you is "loose" :)
        
               | Exuma wrote:
               | Lol hmmmm... very interesting.
               | 
               | So how do you pronounce "nose" (like on your face)?
               | 
               | Second question: how do you pronounce "bulldozer" (like
               | the heavy machinery)
        
           | spicybright wrote:
            | Please stick to the discussion and keep the hissy fit to
            | yourself.
        
         | encryptluks2 wrote:
         | They already collect as much information about you as possible.
         | Whenever you turn on a Mac, you're essentially agreeing to a
         | ton of data collection. It is naive to think that they will
         | never share it with anyone.
        
           | sneak wrote:
           | They share data on 30,000 users per year with the US govt
           | _without a search warrant_.
           | 
           | There are even more when it comes to actual search warrants.
        
             | dragonwriter wrote:
             | > There are even more when it comes to actual search
             | warrants.
             | 
              | Probably not many, since there is very little in third-
              | party custody for which they would need a search warrant
              | rather than one of the various forms of subpoena
              | (including administrative ones like NSLs), and the
              | latter would already be counted in the "without a search
              | warrant" count.
        
               | sneak wrote:
               | NSLs and FISA orders are a small fraction of the data
               | that US cloud providers turn over. The majority is in
               | response to actual search warrants.
        
         | mulmen wrote:
         | > That being said, I would be O.K. if the scan is performed on
         | the device upon physical access by the authorities. You caught
         | someone, You have enough evidence to investigate the person,
         | you are allowed to search his/her home then maybe it can be
         | O.K. to run a fingerprint scan on the phone to look for
         | evidence.
         | 
          | This was an unreasonable and unnecessary capability; no
          | amount of compromise makes it a good idea.
         | 
         | I don't want my devices to tattle on me. They exist to serve
         | me, not the state.
         | 
         | Don't negotiate with terrorists.
        
         | alatkins wrote:
         | Not to mention the potential severe consequences of the
         | inevitable false positives, given the target subject matter.
        
         | FabHK wrote:
         | > Automatically scanning your stuff and reporting the bad
         | things to the police is like the concept of God always
         | watching.
         | 
          | ... like all cloud providers, which scan your stuff and
          | report the bad things (Apple has not been doing that, but
          | has now proposed doing it in a more privacy-preserving
          | manner).
        
           | CydeWeys wrote:
           | There's a big difference though between cloud providers
           | scanning files stored on their hardware that they own and
           | Apple scanning files on _your_ hardware that you own.
        
             | FabHK wrote:
                | Is there? It's just running blinded hashing and a
                | private set intersection locally; the device doesn't
                | even know whether something has matched, or what.
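                | 
                | Roughly, the blinding idea looks like this (a toy
                | Diffie-Hellman-style PSI sketch in Python; Apple's
                | actual protocol is more involved and, unlike this toy,
                | hides the outcome from the device):
                | 
                |     import hashlib, secrets
                |     from math import gcd
                | 
                |     P = 2**521 - 1  # Mersenne prime; toy group modulus
                | 
                |     def h(x: bytes) -> int:
                |         d = hashlib.sha256(x).digest()
                |         return int.from_bytes(d, "big") % P
                | 
                |     def rand_exp() -> int:
                |         while True:  # exponent invertible mod P - 1
                |             e = secrets.randbelow(P - 2) + 2
                |             if gcd(e, P - 1) == 1:
                |                 return e
                | 
                |     s, c = rand_exp(), rand_exp()  # server / client
                | 
                |     # Server publishes its database blinded by s.
                |     known = [b"known_1", b"known_2"]
                |     db = {pow(h(x), s, P) for x in known}
                | 
                |     # Device sends its photo hash blinded by c; the
                |     # server raises it to s and returns it.
                |     double = pow(pow(h(b"my_photo"), c, P), s, P)
                | 
                |     # Device strips its own blinding; a hit means the
                |     # photo was in the server's set.
                |     print(pow(double, pow(c, -1, P - 1), P) in db)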
        
               | intricatedetail wrote:
                | It's the difference between a police officer on the
                | street and one in your home.
        
               | mrtksn wrote:
                | It is your own device actively checking whether you are
                | misbehaving. Completely different from cloud providers
                | checking on renters misbehaving in their
                | infrastructure.
                | 
                | Your phone will listen to instructions from 3rd parties
                | to check on you and report you to the police (with
                | Apple as a dispatcher).
        
               | FabHK wrote:
               | Well, no... the cloud provider (Apple in the case of
               | iCloud) will check on you and potentially report you
               | (just like all the other cloud providers do), but without
               | scanning all your files in the cloud (unlike all the
               | other cloud providers).
               | 
               | As I said, your phone doesn't even know if anything
               | matched.
        
               | mrtksn wrote:
                | The point is, your phone follows instructions from 3rd
                | parties on scanning your data and reporting back the
                | results. It's processing your files with the intent to
                | make sure you are not doing anything wrong according to
                | those 3rd parties. Your own device is policing you
                | according to their requirements.
        
         | joe_the_user wrote:
         | _That being said, I would be O.K. if the scan is performed on
         | the device upon physical access by the authorities._
         | 
          | Let me call this Yet Another Insidious Concession (YAIC).
          | The sell seems to be that "this has safe search built in, so
          | in the event of seizure, authorities won't have to do a full
          | search". But the actual situation is that authorities would
          | never be satisfied with anything limiting their search on
          | seizure, yet would be happy for this to be added, and they'd
          | wind up using it primarily through some automatic mechanism.
        
           | qualudeheart wrote:
           | Yet Another Insidious Combinator sounds like a cool name for
           | a startup incubator.
        
         | skoskie wrote:
          | I really wish people - especially those on HN - would take a
          | broader look at what Apple is proposing and better
          | understand the forces at play before being so critical of
          | the tech. I understand the initial knee-jerk reaction to a
          | perceived privacy invasion, but since there has been some
          | time for people to learn all the facts, I remain amazed that
          | those facts never come up in these threads.
         | 
          | First, you hear a lot of people, including Snowden (while
          | contradicting himself), say this isn't really about CSAM.
          | That point is absolutely correct. This is ALL about two
          | things, each addressed here:
         | 
         | 1. Legal liability, and the cost of processing as many
         | subpoenas as they do.
         | 
         | Ultimately, Apple has the keys to the data they store on their
         | servers. They could easily encrypt all the data using on-device
         | keys, before uploading to ensure they can't actually see
         | anything. But this would cause a huge backlash from law
         | enforcement that would cause congress to pass legislation
         | mandating backdoors. In fact, Apple (big tech) has been trying
         | to hold off that legislation since at least 2019, when they met
         | with the Senate Judiciary committee [1].
         | 
         | Quote from EFF article:
         | 
         | > Many of the committee members seemed to arrive at the hearing
         | convinced that they could legislate secure backdoors. Among
         | others, Senators Graham and Feinstein told representatives from
         | Apple and Facebook that they had a responsibility to find a
         | solution to enable government access to encrypted data. Senator
         | Graham commented, "My advice to you is to get on with it,
         | because this time next year, if we haven't found a way that you
         | can live with, we will impose our will on you."
         | 
         | Apple is doing exactly what Graham told them to do. They have
         | come up with a system that manages to increase security for
         | most users by ensuring that nobody - not even Apple - has the
         | decryption keys for your data, while also satisfying law
         | enforcement to the degree necessary to prevent really harmful
         | anti-privacy legislation. They managed to do it in a really
         | creative way.
         | 
         | It's not perfect of course. There are plenty of people with
         | valid concerns, such as the potential for hash collisions and
         | how a country like China might try to abuse the system and
         | whether Apple would give into that pressure (as they did in the
         | past). All of that is valid, and I'm glad to see Apple stop and
         | examine all the complaints before pushing the release. But
         | strictly on the topic of privacy, the new system will be a
         | massive improvement.
         | 
         | 2. User privacy. Yes, everyone thinks this is an invasion of
         | privacy, but I just don't see how. The proposed on-device
         | scanning solution provides MORE privacy than either the current
         | iCloud system (in which Apple can be compelled to decrypt
         | nearly all of your data) or the proposed [2] legislation - MORE
         | privacy even for people found to meet the CSAM threshold!
         | 
         | It seems to me there must be a lot of misunderstanding
         | surrounding the encryption mechanisms Apple has proposed. But
         | having read the technical documents, my view (again, strictly
         | from a privacy standpoint) is that it appears to be extremely
         | sound.
         | 
         | Essentially, there are currently two parties that can decrypt
         | your iCloud data with master keys - you and Apple.
         | 
          | In VERY greatly simplified terms, the new system will keep
          | one master decryption key on your device. But Apple will now
          | instead use shared-key encryption, which requires ALL of the
          | ~31 keys to be present to decrypt the photos. Apple will
          | have one of those keys. The other 30 (the "threshold") keys
          | will be generated by a hash (of a hash of a hash) of the
          | match found in the CSAM database. If no match is found, then
          | the shared key needed to decrypt that image is never
          | generated. It doesn't exist.
         | 
          | One way to look at this is that it's the CSAM images that
          | are the keys to unlocking the CSAM images. Without them,
          | Apple _cannot_ comply with a subpoena (for photos ... for
          | now). Even people who meet the CSAM threshold can only have
          | the CSAM images decrypted. All other photos that have no
          | match in the CSAM database cannot be decrypted without
          | access to the suspect's phone.
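          | 
          | For intuition, the threshold mechanics resemble Shamir
          | secret sharing. A toy Python sketch (hypothetical
          | parameters; Apple's real construction, built around "safety
          | vouchers", differs in detail):
          | 
          |     import secrets
          | 
          |     P = 2**127 - 1  # prime modulus for share arithmetic
          |     T = 30          # shares needed to rebuild the key
          | 
          |     def make_shares(secret: int, n: int):
          |         # Random degree-(T-1) polynomial, secret at x = 0.
          |         cs = [secret] + [secrets.randbelow(P)
          |                          for _ in range(T - 1)]
          |         f = lambda x: sum(c * pow(x, i, P)
          |                           for i, c in enumerate(cs)) % P
          |         return [(x, f(x)) for x in range(1, n + 1)]
          | 
          |     def reconstruct(shares):
          |         # Lagrange interpolation at x = 0.
          |         total = 0
          |         for xi, yi in shares:
          |             num = den = 1
          |             for xj, _ in shares:
          |                 if xj != xi:
          |                     num = num * -xj % P
          |                     den = den * (xi - xj) % P
          |             total = (total + yi * num
          |                      * pow(den, -1, P)) % P
          |         return total
          | 
          |     key = secrets.randbelow(P)      # account key
          |     shares = make_shares(key, 100)  # one per match
          |     print(reconstruct(shares[:30]) == key)  # True
          |     print(reconstruct(shares[:29]) == key)  # False (w.h.p.)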
         | 
          | On the flip side, Apple is bending to Congress's demands by
         | voluntarily sharing information with law enforcement. I can
         | absolutely understand how this behavior could make even
         | perfectly innocent people feel uncomfortable. But in the
         | context of the understanding that you have more privacy for
         | yourself, while exposing those who deal in CSAM (and are dumb
         | enough to store it in their iCloud account), I have to push my
         | logical understanding to overcome my natural but unwarranted
          | discomfort. Anything that prevents the government from
          | getting a universal backdoor into everyone's phone is a win,
          | in my opinion.
         | 
         | [1] https://www.eff.org/deeplinks/2019/12/senate-judiciary-
         | commi...
         | 
         | [2] https://www.eff.org/deeplinks/2020/06/senates-new-anti-
         | encry...
        
           | Copernicron wrote:
           | A great many people have attempted to explain why they are
           | against this particular mechanism for detecting CSAM. I agree
           | with you that Apple's implementation is technically
            | impressive and probably the most private way of performing
           | this action on device. However, I disagree that it's more
           | private than the current cloud scanning. If the scanning
           | happens client-side, then I have absolutely no control over
           | what gets scanned and when. If the scanning is server-side,
           | then I can simply not upload anything to the cloud and no
           | scanning happens. I can't avoid client-side scanning like I
           | can avoid server-side scanning.
           | 
           | I realize this is a simplification of the actual method Apple
           | has implemented and as it currently stands it would only scan
           | photos that are destined to be uploaded to the cloud. If it
           | were guaranteed that would never change then I think a lot
           | more people wouldn't have a problem with it. But it will be
           | abused. Every[1] single[2] time[3] this sort of system is
           | implemented "for the children" it gets abused. The slippery
           | slope here is real and well-demonstrated in various countries
           | around the world.
           | 
           | For my part I have come across an imperfect analogy that I
           | feel accurately captures how I feel about Apple's solution.
           | My phone is like my diary. There's nothing illegal in there.
           | But there is stuff that is deeply personal, private, and even
           | some that would be terribly embarrassing if the wrong person
           | saw it. As long as I keep my diary to myself and don't let
           | anyone see it I have nothing to worry about. If I were to
           | send my diary off to someone else known to read diaries then
           | it's my own fault as much as anything else if it gets read
           | and intimate details of my life known.
           | 
           | [1] https://en.wikipedia.org/wiki/Internet_censorship_in_the_
           | Uni...
           | 
           | [2] https://en.wikipedia.org/wiki/Internet_censorship_in_Aust
           | ral...
           | 
           | [3]
           | https://en.wikipedia.org/wiki/Internet_censorship_in_Russia
        
           | Lamad123 wrote:
            | I am seeing a lot of these contrived, tedious comments
           | shilling for this company.
        
         | polote wrote:
         | > In my opinion, Apple will break its brand if they actually go
         | ahead and release "self-reporting to the authorities" device
         | features.
         | 
          | I'm always astonished that some (random) people feel more
          | confident in their brand analysis than the best people on
          | this specific topic, paid thousands of dollars at Apple to
          | do just that. Or maybe I'm the one who is wrong.
        
           | eitland wrote:
            | > I'm always astonished that some (random) people feel
            | more confident in their brand analysis than the best
            | people on this specific topic, paid thousands of dollars
            | at Apple to do just that. Or maybe I'm the one who is
            | wrong.
           | 
           | I'm afraid you are wrong.
           | 
            | I'm also astonished that Google has allowed search to
            | deteriorate to a point where competition actually has a
            | chance.
            | 
            | I'm astonished that Microsoft cheapened their brand by
            | serving me ads on my Windows Professional license.
           | 
           | and probably a few other things.
        
           | mrtksn wrote:
            | What makes you think that I am random, that I did a brand
            | analysis, that I am confident, that my confidence exceeds
            | the confidence of the people Apple paid, or that those
            | highly paid people concluded differently?
            | 
            | That comment looks like partly an uneducated guess, partly
            | an appeal to the "authorities of branding". I'm always
            | amazed to see people who assume that employees of large
            | companies cannot make mistakes, or that they have
            | unlimited power over decision making.
        
             | polote wrote:
             | You are probably not random, I just added the word to make
             | the message clear.
             | 
              | There are no "authorities of branding"; it is just that
              | there are people who are paid for this, and it is very
              | unlikely that shareholders will let Apple do things that
              | decrease its market value over a long period of time.
        
               | jorl17 wrote:
                | How do you explain, among many other things, the Digg
                | fiasco? Could they not have seen it coming?
               | 
               | People make mistakes everywhere. Teams make mistakes
               | everywhere.
               | 
               | Of course it's much more likely that they know what they
               | are doing and that they have weighed the pros and cons.
               | But, then again, even if they still go forward with it,
               | they have changed their course slightly. Do you believe
               | it was part of their plan? (genuine question)
               | 
                | Do you trust that code written by experts has no
                | flaws? How about teams of experts? Do rockets not
                | sometimes have bugs that cause them to fail? If a team
                | of highly skilled programmers working on N-version
                | critical systems can fail, why can't someone or some
                | team fail as well in such unpredictable areas as
                | market analysis and brand perception?
               | 
               | Again: I share your sentiment that surely they have
               | thought about this. It's just that I'm not so sure that
               | they were as competent as you might think -- maybe
               | because fundamentally it's a very reactive and
               | unpredictable market!
        
           | RicoElectrico wrote:
            | The brand ultimately lives in the minds of consumers, and
            | if you followed the discussions, many of them (and the
            | press) had the same impression: that this is inconsistent
            | with Apple's positioning wrt privacy.
            | 
            | While armchair analysts get things wrong all the time,
            | companies are not infallible either, including Apple.
        
             | polote wrote:
              | You can make a mistake once or twice, but if a company
              | like Apple keeps pushing for something (that can
              | hypothetically kill its brand), it is unlikely that it
              | will actually kill the brand.
        
               | philosopher1234 wrote:
                | Unlikely, sure, but it's happened countless times. It's
                | not useful or interesting to say "you're probably wrong
                | because you're not the head of Apple".
        
               | pope_meat wrote:
               | How could a big bank like Lehman Brothers with their
               | highly paid risk analysis people fail?
               | 
               | How could Rome fall?
               | 
               | The sun never set on the British Empire.
               | 
                | My partner and I were very close to joining the Apple
                | ecosystem because we felt completely betrayed by G; now
                | I'm researching alternative OSes to flash onto my
                | phone. We both abandoned Google search and browsers.
                | The only FAANG I'm not actively avoiding is Netflix.
        
           | NoSorryCannot wrote:
           | If you have to be an insider to pontificate, then we can
           | discontinue HN comments since meetings inside Apple already
           | exist.
        
         | torstenvl wrote:
         | > _That being said, I would be O.K. if the scan is performed on
         | the device upon physical access by the authorities._
         | 
         | In the U.S., due to how the rules of evidence work, this would
         | be of limited utility. You want the results of the scan to come
         | from a trustworthy expert who can testify, who uses equipment
         | and software that is tested and calibrated frequently. Because
         | the suspect's own device won't have those calibration records,
         | trying to use the results from such a scan would raise issues
         | of foundation and authenticity.
         | 
         | If I were a prosecutor and I had a case like that come in, I
         | would still send the device in to the lab for an expert
         | forensic extraction.
         | 
         | (Edit to reply to a response below: Yes, theoretically, such an
         | on-device scan could be used to establish probable cause, but
         | that seems circular, since the authorities wouldn't be able to
         | seize the phone in the first place unless they already had
         | probable cause or consent from the owner, either of which would
         | obviate the need to establish probable cause via the on-device
         | scan.)
        
           | stjohnswarts wrote:
            | It would be used to get a warrant to trace the person's
            | online activities and seize his/her phone as evidence.
            | This isn't going to be used to send out a SWAT team upon
            | notice from Apple.
        
             | Piskvorrr wrote:
             | [citation-needed], and Apple's pinky swear doesn't count.
        
               | skoskie wrote:
               | The argument made in several comments that this will
               | result in SWATing isn't cited because it's an opinion on
               | potential future events. A counter opinion, therefore,
               | also would not need citation.
        
           | mrtksn wrote:
           | Thanks, interesting details.
           | 
            | I guess the legislators and the tech companies can work
            | something out on the practical issues instead of turning
            | people's devices into always-watching policemen.
        
       | saltedonion wrote:
       | I bet this had something to do with #AppleToo and the company
       | didn't want to fight 2 PR battles at once. My bet is this is just
       | a delay tactic.
        
       | raylad wrote:
       | EFF petition here. Numbers do count.
       | 
       | https://act.eff.org/action/tell-apple-don-t-scan-our-phones
        
       | kf6nux wrote:
        | I bet people would have had no problem with the CSAM on-
        | device scanning if it involved neither Apple employees
        | snooping on your photos nor government reporting.
        | 
        | E.g., if they made it so their software simply refused to
        | upload a photo and informed the user why it refused, then
        | there's no privacy issue (at least not until they add a
        | backdoor later on).
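        | 
        | Something like this sketch, say (hypothetical Python; all
        | names are made up, and the hash set is a placeholder):
        | 
        |     import hashlib
        | 
        |     known_hashes = {"placeholder_digest"}  # hypothetical DB
        | 
        |     def maybe_upload(photo: bytes, upload, notify) -> bool:
        |         # Purely local gate: no voucher, no report, and no
        |         # telemetry about what was blocked.
        |         if hashlib.sha256(photo).hexdigest() in known_hashes:
        |             notify("Photo not uploaded: matched a known "
        |                    "fingerprint.")
        |             return False
        |         upload(photo)
        |         return True
        | 
        |     # Nothing matches here, so the photo uploads normally.
        |     maybe_upload(b"holiday.jpg", lambda p: None, print)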
        
         | short_sells_poo wrote:
         | I respectfully disagree. In my mind, the fact that Apple
         | employees do first level filtering is a minor problem.
         | 
         | The main problem is the precedent this sets: on device scanning
         | is now possible.
         | 
          | Before this, if a government asked Apple to scan all the
          | phones for something, Apple could say "we're sorry, but we
          | don't have the capability", and they could not be compelled
          | by legal means.
          | 
          | Now, a large part of that argument has been eroded. Apple
          | may have added a few hurdles, but the first crucial step of
          | an on-device surveillance capability has been installed and
          | is on a path to being normalized.
         | 
         | It doesn't matter that they don't do this yet. We are
         | undeniably closer to direct phone surveillance than we have
         | been before.
        
           | kf6nux wrote:
           | The supposition was that, if the scanning had no reporting
           | capability, then Apple could still claim a lack of
           | capability. They could respond to government demands with,
           | "Sorry, our software only blocks uploads. We have no ability
           | to get telemetry on what uploads are blocked or how often."
           | 
            | That proposal probably wouldn't work for a lot of reasons,
            | though. The largest blocker is that (IIUC) the NCMEC won't
            | share a DB of offending signatures without an NDA, so
            | Apple probably can't load it onto consumer devices.
        
         | CubsFan1060 wrote:
         | Wouldn't this make it simply a way for the people who trade in
         | these sorts of things to get more information about how to
         | avoid them?
         | 
            | I.e., my iPhone told me this photo was in the database, so
            | I'll start adjusting it until it 'passes', and then I'm
            | free to share it without detection.
        
           | kf6nux wrote:
            | Since Apple shared the algo, people who trade in these
            | things already know how to get around it (just adjust the
            | image so the fingerprint changes). People who trade in
            | these things will probably turn off iCloud too.
           | 
           | Similarly, governments could take a known image they want
           | banned (say of a national embarrassment), grab some CSAM,
           | tweak the signature on the CSAM to match the photo they want
           | banned, and add the tweaked photo to the CSAM DB.
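            | 
            | To see why tiny tweaks work, here's a toy average-hash in
            | Python (NeuralHash is a neural network, not aHash, but
            | the brittleness idea is similar):
            | 
            |     def ahash(px):
            |         # 8x8 grayscale -> 64 bits: 1 if above the mean.
            |         avg = sum(sum(row) for row in px) / 64
            |         return tuple(int(p > avg)
            |                      for row in px for p in row)
            | 
            |     img = [[(r * 8 + c) * 4 for c in range(8)]
            |            for r in range(8)]
            |     tweaked = [row[:] for row in img]
            |     tweaked[3][7] += 8  # nudge one pixel over the mean
            | 
            |     print(ahash(img) == ahash(tweaked))  # False: changed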
        
       | Zenst wrote:
        | So a feature for child safety, delayed due to public peer
        | pressure. How much of that peer pressure came from children?
        | An odd question, but it does highlight how many things are
        | done in the "think of the children" vein without ever asking
        | a single child for their thoughts. Whilst we all admit there
        | are many things to protect children from, imagine if Apple
        | countered with an advert showing lots of children saying how
        | good it is that Apple is protecting their safety. How would
        | that sit with public opinion then?
        
         | klipklop wrote:
          | We rarely take a child's choices or thoughts into
          | consideration (within reason) because they do things like
          | shove raisins up their noses. Kids should be allowed to be
          | kids and not be dragged into this discussion.
          | 
          | This is a complex issue, and there is a good reason why we
          | don't let children consent to various things or sign
          | legally binding contracts. Kids are not just "little
          | adults."
          | 
          | I will admit that if Apple did that, it would be pretty
          | effective, but really sleazy.
        
       | aborsy wrote:
       | The idea of a vendor scanning your device is terrifying, to say
       | the least.
        
         | BitwiseFool wrote:
         | And, I might add, they've created the infrastructure that
         | allows governments to scan for other 'questionable content'.
         | How long until they begin to scan for hate-speech? What if you
         | download files from Wikileaks? Or a journalist gets their hands
         | on classified materials? The government could know about this
         | before they have a chance to release it.
        
           | gremloni wrote:
            | Why bring up hate speech specifically, as if it were some
            | sort of paragon that needs to be protected? I agree with
            | you, but put hate speech in its place; it's vile.
        
           | danlugo92 wrote:
           | This is obviously the end plan.
           | 
           | I support all measures against child abuse, but child abuse
           | is always the trojan horse.
        
       | SergeAx wrote:
        | See what you did? Children will not be safe now.
       | 
       | (it is not a sarcastic rant, just a heads up on Apple's PR
       | wording)
        
       | opnac wrote:
        | We are likely a week and a bit away from new iPhone releases;
        | a good bit of damage control is at play here.
        
       | concinds wrote:
       | I think it's just sad that Apple will now likely be forced to
       | switch to server-side scanning, which they've explained is
       | strictly inferior to their client-side proposal (and which I
       | wrote about here: https://news.ycombinator.com/item?id=28176418).
        | Irrational fears and (some) misinformation (which I won't
        | accuse HN of, for the purposes of this discussion, just
        | reddit, Twitter, and the "tech NGOs") will result in worse
        | outcomes for consumers.
       | 
       | This backlash (including these NGOs) also protested the iMessage
       | changes by equating "putting a nudity warning" with "breaking E2E
       | encryption". I'm not a parent yet, but why prevent tech companies
       | from adding this obvious (and desirable to many) feature?
       | 
        | It'll be interesting to see if they walk back their fears of
        | "non-CSAM hashes being added" when that problem is still
        | present with cloud scanning. They didn't protest Google,
        | Microsoft, and others using PhotoDNA, which presents identical
        | risks. Will they now still complain about Apple if Apple
        | proposes cloud scanning? They'd be hypocritical not to, since
        | their current arguments would still apply. But then why didn't
        | they complain when everyone else did it?
        
         | stjohnswarts wrote:
          | How is scanning on my phone better than on the server side?
          | Either you trust Apple with your data or you don't;
          | supposedly only they have the keys to open the data on
          | their servers in case it gets into 3rd-party hands. I trust
          | that far more than I trust Apple acting as a government spy
          | on my own phone. Sure, there are trade-offs, but this one is
          | well worth it. Your data isn't sitting unencrypted on Apple
          | servers; they have a key to open it, but hackers who might
          | steal the data don't. I assure you it's as good as any
          | encryption you would have applied to it.
        
           | erikpukinskis wrote:
           | Because, with the proposed system in place, if law
           | enforcement wants to decrypt your data they have to subpoena
           | you, not Apple.
           | 
           | As long as Apple can't do this client-side scanning scheme,
           | they have to keep the keys to your data on their servers,
           | which means law enforcement can subpoena them without you
           | even being aware of it.
        
       | MeinBlutIstBlau wrote:
       | ...So they're still doing it. Yeah no thanks. I've seen what the
       | Stasi and KGB did. We don't need a tech company like Apple to
       | tattle on our behalf for what their AI interpreted as illegal but
       | was not. The last thing the average American wants to do is to
       | talk to police and lawyers.
        
       | lugged wrote:
       | Between this and the docker desktop bs we're talking about moving
       | around a thousand apple laptop purchases a year over to Linux.
        
         | neom wrote:
         | Just as a reference point. Apple sells approx 2,200 macs an
         | hour.
        
           | lugged wrote:
           | I get it, but we're not going to do it to make them sad.
           | We'll be doing it because their product sucks.
        
             | neom wrote:
             | For sure, I wasn't taking a swipe at you, it's fair! I was
             | just giving the reference point.
        
           | pshirshov wrote:
           | Yes, sure. Probably they won't notice anyone leaving.
           | 
           | That's not a reason not to leave though.
        
           | SV_BubbleTime wrote:
           | That seems crazy from the production side. Think of what it
           | would look like to manufacture 2,200+ macs per hour
           | (about 19,000,000 a year).
        
             | propogandist wrote:
             | don't worry, they're offsetting environmental harm by
             | excluding chargers and other accessories from your
             | purchase. You can pay more to get these items, because
             | that's Apple's commitment to the 'environment'.
        
             | dharmab wrote:
             | Pre-pandemic, Ford sold a pickup truck on average once
             | every 30 seconds.
        
               | frockington1 wrote:
               | off topic but post-pandemic my Bronco still doesn't have
               | a build date, this chip shortage is out of hand
        
               | SV_BubbleTime wrote:
               | You have no idea. I just got a 100-week factory lead
               | time for a Cortex-M4 ARM chip... it's not a super rare
               | chip.
        
               | godelski wrote:
               | A year later and Ampere GPUs are still near impossible to
               | get. So much for my home ML machine... (literally only
               | waiting on a GPU and been that way for a year)
        
             | katbyte wrote:
             | apple sold 18 million macs in 2018
        
             | dividedbyzero wrote:
             | Apparently they sold 6 million Macs in Q2 [1]
             | 
             | 6M/91d/24h is about 2700 units per hour, if those numbers
             | from statista are true.
             | 
             | 1: https://www.statista.com/statistics/263444/sales-of-
             | apple-ma...
        
         | mnd999 wrote:
         | Docker desktop for Linux is coming. They've already said so.
         | Your boss is going to end up paying for Docker. Giving your
         | security department the ability to restrict what images you can
         | pull is going to become a checkbox in your infosec audit.
        
           | nickjj wrote:
           | > Docker desktop for Linux is coming. They've already said
           | so.
           | 
           | I can't officially say how Docker will implement DD on Linux
           | but I have a hunch it will be optional.
           | 
           | If it's not optional then it means you will need to install
           | DD on your servers to run Docker which I can't realistically
           | see happening. I'm guessing DD on Linux will just provide a
           | few quality of life enhancements that exist on Windows and
           | Mac, such as the host.docker.internal address and a UI for
           | certain things for the folks who prefer that.
        
           | heavyset_go wrote:
           | You can just run Docker natively on Linux, no need for Docker
           | Desktop.
        
           | dharmab wrote:
           | On Linux you don't need Docker at all. You can use containerd
           | directly (or via alternative tooling such as CRI-O or
           | Podman). Kubernetes has already deprecated the Docker APIs
           | and uses OCI APIs to spawn containers.
           | 
           | https://opencontainers.org/
        
             | mnd999 wrote:
             | Of course you can, Docker's play here is not about what's
             | technically possible. It's about what they can convince
             | your IT / infosec team to block.
        
           | ulzeraj wrote:
           | Just proxy or mirror a curated list of docker images to your
           | internal registry?
        
         | xrobledo84 wrote:
         | What is going on with Docker?
        
           | headmelted wrote:
           | Just going to assume this is due to how the hypervisor on
           | macOS won't allow dynamic memory allocation to the runtime.
           | 
           | On a Mac, you need to allocate a fixed amount of RAM to
           | Docker, which isn't the case on other platforms where it can
           | dynamically allocate and release as much as your containers
           | demand.
           | 
           | It's very painful in workloads that need Docker. Especially
           | on MacBooks where you can't upgrade the RAM to compensate.
        
             | dharmab wrote:
             | Your assumption is incorrect. This is about Docker Inc
             | starting to charge money for Docker Desktop:
             | https://www.docker.com/blog/updating-product-subscriptions/
        
               | headmelted wrote:
               | It certainly was. Yikes.
        
           | codys wrote:
           | https://news.ycombinator.com/item?id=28369570
           | 
           | (while on linux one can use the docker command line tool
           | without any docker desktop thing, on mac docker desktop wraps
           | together running a linux vm to run the docker daemon &
           | containers inside, making it the most straightforward way of
           | running docker on mac)
        
         | istingray wrote:
         | Hell yes!!!! Consider System76, one of the few Linux-focused
         | vendors. Their tech support is great. Not sure the scale is a
         | match for them; 1,000 is a lot of laptops.
         | 
         | But Linux is a side project for Dell and Lenovo. They put up
         | with it but don't really support it. Support a vendor who is
         | all in on Linux.
        
           | ubermonkey wrote:
           | How's fit and finish on those?
           | 
           | Build quality is a big deal. I feel like the midline for PC
           | makers is pretty low -- Dell has mostly been kinda trashy,
           | though the XPS machines have a good feel to them (it's a
           | shame they've been kind of lemony for us).
           | 
           | Absent the keyboard kerfuffle, Apple's build quality has
           | traditionally been very high -- IME, on par with the IBM-era
           | ThinkPads, in the sense that the hardware outlasts its
           | technical viability. How is S76 here?
        
             | mrtranscendence wrote:
             | I've not used a System76 laptop, but I had a coworker who
             | used them for a long while. He never had particularly kind
             | things to say about the polish. I'm overall not too
             | impressed by what I see on their website; I really want a
             | higher-res display at least.
             | 
             | The coworker did eventually move to a Mac, but has recently
             | expressed dissatisfaction and may move back to Linux. Not
             | sure if that'll be with a System76 machine, though.
        
               | ubermonkey wrote:
               | Did s/he say what was annoying about the Mac to them?
               | 
               | (I'm not trying to argue. I'm always curious.)
        
             | istingray wrote:
             | Lemur Pro: Fit and finish is an A- for PCs (a B compared
             | to a Mac). Solid build, no super weird stuff on the Lemur
             | Pro. The silkscreen for one of the USB ports says 3.1 but
             | it's 3.2. The webcam has a little lip. The trackpad is
             | pretty clicky (I've been used to Apple's synthetic
             | click).
        
           | infogulch wrote:
           | Maybe consider the Framework laptop, which has made the
           | rounds here a couple of times recently. It makes good
           | hardware choices for a productivity laptop (3:2 screen,
           | good keyboard), is easily reconfigurable with USB4 modules
           | for a custom array of ports, is designed to be easily
           | upgraded and repaired, and is sold for a fair price.
           | There's an argument to be made that a device designed for
           | upgradability and repairability can reduce TCO. Mine is
           | shipping this month.
           | 
           | https://frame.work/
        
       | AHappyCamper wrote:
       | Facebook also "delayed" their surveillance plan for WhatsApp....
       | and eventually implemented it. Apple is now copying Facebook.
       | Apple must abandon this feckless, evil plan, or face losing their
       | user-base. This contradicts Apple's core values.
        
       | mlindner wrote:
       | Good! Now expose fully how it works and open it up to bug
       | bounties for people to show how easy it is to insert a backdoor
       | image into the system.
        
       | sebiw wrote:
       | What do you think are the chances we'll see a server-side
       | implementation instead of the so far proposed client-side one?
        
         | JohnFen wrote:
         | Personally, I wouldn't object to a server-side implementation
         | at all. My objection is that it's client-side.
        
         | nowherebeen wrote:
         | If by a server-side implementation you mean they will scan
         | images that I manually upload to iCloud, then I don't have a
         | problem with it. As long as it's off my personal device, and
         | it happens through my own action and not theirs. My device
         | is my property, not theirs.
        
           | madeofpalk wrote:
           | I guess the trade off here is that your data is still stored
           | in (essentially) plain text on Apple's servers, and they (or
           | whatever authorities) can still rifle through it at any time
           | without telling you.
        
             | nowherebeen wrote:
             | iCloud is encrypted but Apple has the private keys, so
             | it's only slightly better than plain text. You should
             | not expect your data to be any more private than that
             | whenever you post anything online. The only exception is
             | E2EE, but Apple will never offer it.
        
               | erikpukinskis wrote:
               | The whole point of this hashing thing everyone complained
               | about was to enable E2EE.
        
               | madeofpalk wrote:
               | Just to be clear - that's pure speculation. There's no
               | evidence (apart from this local CSAM detection) or other
               | reports to back that up.
        
               | nowherebeen wrote:
               | Agreed. If they really planned to enable E2EE, they would
               | have announced it with the CSAM detection. It makes no
               | sense to have all this backlash only to enable E2EE at a
               | later date.
        
             | matwood wrote:
             | Which, ironically, is less transparent, less open, and less
             | private than Apple's original plan.
        
               | ByteWelder wrote:
               | The main issue with privacy is not necessarily the
               | initial implementation, but the sliding scale that it
               | permits. Apple's statement that they will ignore
               | "government requests" for broadening the scope is almost
               | funny, because the government wouldn't request: it would
               | demand. Just like how China has control over the crypto
               | keys for the iCloud data in Chinese data centers.
               | 
               | So you're right that the initial plan is indeed more
               | privacy-friendly, but it has some major future privacy
               | implications that are much worse.
        
               | alibert wrote:
               | Is there any other way? The way I see it, the scanning
               | is coming either fully server side, or with a bit more
               | privacy via the proposed client-side system, so is
               | there something better?
               | 
               | Other providers do it and I don't expect Apple to sit
               | and do nothing about it (governments, NGO pressure,
               | etc.). Worse, the Apple brand could be labeled a "CP
               | enabler", which would hurt Apple way more than the
               | current backlash.
        
               | ByteWelder wrote:
               | Yes: E2EE and not having the ability to scan is a legit
               | alternative. Apple has lost its credibility as a privacy-
               | enabler. That's a real consequence, as opposed to a
               | potential consequence of being targeted by propaganda.
        
         | yyyk wrote:
         | There's a good chance we'll have a server-side implementation
         | _and_ a client-side implementation. The currently proposed
         | client-side implementation has obvious holes and there's no
         | way Apple doesn't know that.
         | 
         | They can expand client-side scanning massively or implement
         | server-side scanning too. The latter will cause fewer
         | objections, because 'Apple will not provide E2E' isn't a news
         | item if they keep silent, while the client-side expansion will
         | be.
        
         | CubsFan1060 wrote:
         | My prediction, based on zero evidence whatsoever: they'll
         | implement it server side, and eventually introduce encrypted
         | iCloud photos. But your choice will be to allow this scheme
         | (scan on device before upload) and have E2EE, or to forgo
         | E2E encryption and have it scanned on the server side.
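         | 
         | To sketch the shape of that first option (a toy illustration
         | only -- Apple's actual design uses NeuralHash, private set
         | intersection and threshold secret sharing, not a plain hash
         | checked against a readable list):
         | 
         |     import hashlib
         |     from cryptography.fernet import Fernet
         | 
         |     # Stand-in for the blinded CSAM hash set (a real
         |     # client could not read it like this).
         |     BANNED_HASHES = {"deadbeef"}
         | 
         |     def prepare_upload(photo: bytes, user_key: bytes):
         |         # 1. Scan on device, *before* encryption.
         |         digest = hashlib.sha256(photo).hexdigest()
         |         voucher = {"matched": digest in BANNED_HASHES}
         |         # 2. Encrypt with a key only the user holds
         |         #    (user_key from Fernet.generate_key()).
         |         ciphertext = Fernet(user_key).encrypt(photo)
         |         # 3. The server stores ciphertext it cannot
         |         #    read; the scan result travels only in
         |         #    the voucher.
         |         return ciphertext, voucher
         | 
         | The point being that the server only ever sees ciphertext;
         | the scan happens where the plaintext already is.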
        
           | cwizou wrote:
           | Interesting thought, but Apple rarely gives a choice on
           | much (well, except the Safari UI now!), and I don't see
           | them giving an explicit choice on this. If E2EE really were
           | their goal, I could maybe see it, now that you mention it.
           | 
           | However, if I had to guess, they'll go server-side scanning
           | only, and E2EE backups/Photos will never happen.
           | 
           | To me it's a pipe dream to believe they would offer E2EE:
           | there's too much international pressure from governments
           | around the world to keep iCloud backups open, and going
           | E2EE would trigger unpalatable retaliations.
        
             | rjvir wrote:
             | I think it's plausible that they'd offer E2EE for all of
             | iCloud.
             | 
             | They already do it for iMessage, and it makes it easier to
             | turn down subpoenas if they can credibly claim that they
             | can't even access the data themselves.
             | 
             | Likewise, offering an explicit choice also seems plausible.
             | The full E2EE flow might have UX downsides (for example,
             | the user might need to write down a recovery seed phrase on
             | paper), so they might not force all users into that flow.
        
               | [deleted]
        
               | cwizou wrote:
               | But that's the thing, they have reached a compromise
               | regarding iMessage being "E2EE" by keeping all those
               | avenues (iCloud backups and iMessage on iCloud) that
               | essentially trade E2EE for the convenience.
               | 
               | Most users opt in to the convenience for iMessage,
               | making it, for all practical purposes, no longer E2EE.
               | 
               | I don't see them removing that precisely because of
               | iMessage.
        
           | [deleted]
        
           | sebiw wrote:
           | Huh, I never really thought about the possibility that Apple
           | would implement both schemes. I don't think they will go down
           | that path, though. Why: Cost and complexity for Apple would
           | be somewhat higher and the surrounding discussion about
           | capabilities (abuse of the client-side system for things
           | other than CSAM scanning) would persist just because of the
           | existence of client-side scanning. Let's see how this story
           | continues.
        
           | jpxw wrote:
           | I highly doubt they'd offer a choice like this. 99% of their
           | customers won't know what these terms mean.
        
             | derefr wrote:
             | I'm guessing that rather than a choice, it'd be one or the
             | other depending on the market, with most countries getting
             | the server-side version, while totalitarian countries where
             | people are already used to being spied on get the client-
             | side one.
        
           | dylan604 wrote:
           | Is there an alternative where they upload, scan, and
           | encrypt only non-offending data? Is there a way to share a
           | public key that only allows encrypting but not decrypting,
           | which would make this a viable option?
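           | 
           | (Answering half of my own question: ordinary public-key
           | crypto does exactly that -- anyone holding the public key
           | can encrypt, and only the private-key holder can decrypt.
           | A minimal sketch, with PyNaCl as an arbitrary choice of
           | library:)
           | 
           |     from nacl.public import PrivateKey, SealedBox
           | 
           |     user_sk = PrivateKey.generate()  # stays on device
           |     user_pk = user_sk.public_key     # safe to hand out
           | 
           |     # Anyone with the public key can encrypt...
           |     ct = SealedBox(user_pk).encrypt(b"innocent photo")
           | 
           |     # ...but only the private key decrypts.
           |     pt = SealedBox(user_sk).decrypt(ct)
           | 
           | It doesn't solve the harder part, though: whoever does the
           | scanning still has to see the plaintext at some point
           | before it's encrypted.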
        
           | [deleted]
        
         | blacktriangle wrote:
         | Assuming they become very clear about advertising what they are
         | doing and whether your data is stored locally or in their
         | cloud, I might be okay with this. Having a vendor scanning my
         | own personal device is downright terrifying to the point of
         | walking away from iOS full stop. However the same logic can be
         | applied to Apple's servers, and I think has serious
         | implications for how we view cloud services going forward. If a
         | company signs up to host my content, doesn't it make sense they
         | would like to examine that content to make sure they are not
         | breaking any laws simply by being in possession of said
         | content?
         | 
         | The real answer is the same it's always been, we need to
         | abandon the cloud in favor of going back to self-hosting and
         | bring back the fully distributed internet, the problem is just
         | getting technology to the point where doing so is easy enough
         | for non-technical users.
        
           | CubsFan1060 wrote:
           | This is already done by quite a few places. Take a look at
           | PhotoDNA. I believe at least OneDrive and Gmail have been
           | confirmed to do this same scanning.
           | 
           | For reference, from 2014:
           | https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-
           | le...
        
             | jliptzin wrote:
             | They scan material already on servers they own, I don't
             | think anyone is surprised about this. If you don't want
             | anyone seeing or scanning your data, don't push it to the
             | cloud. Apple's new policy of analyzing your data before it
             | even gets to their cloud would break that paradigm.
        
         | simion314 wrote:
         | >What do you think are the chances we'll see a server-side
         | implementation instead of the so far proposed client-side one?
         | 
         | If Google, Microsoft and Facebook have been doing this for
         | years, then the question is: why isn't Apple? Did they not
         | think about the children until last month?
        
       | superkuh wrote:
       | "Apple Delays Rollout of Security Compromising Backdoor"
        
         | q-rews wrote:
         | "...in Order to Implement It Silently Once the Uproar Quieted"
        
       | Macha wrote:
       | I'm pleasantly surprised Apple backed down on well... anything.
       | Especially this.
       | 
       | However, MacRumors has probably the most user-hostile GDPR modal
       | I've encountered.
       | 
       | 1. I used it on my android device. The modal popped up, I gave it
       | several seconds to load in, everything seemed stable, I go to tap
       | the settings dialog and... up jumps the modal and registers a
       | click on the accept button.
       | 
       | 2. Want to go fix it after the fact? Go to the footer, find the
       | privacy link, find the link in the middle of the policy, now you
       | get the desktop modal on the site.
       | 
       | 3. Everything shows consent default off, so all good right? Nope.
       | Expand out the toggles and you'll see under the "Personalised
       | ads" a set of more granular toggles.
       | 
       | 4. So toggle them off and you're good, right? Nope, each of them
       | you need to tap into and get to a new screen. This new screen has
       | two modals, one for consent, and one defaulted in for legitimate
       | interest. One screen per sub modal, and the only way to go back
       | to get to the others is to click "Back" in the top left, which
       | leaves you at the root of the flow.
       | 
       | I have seen this particular modal on one or two other sites, and
       | finding yet another layer to bury the controls in seems to be a
       | new "innovation" in this regard. Hopefully they're one of the
       | sites targeted by noyb's enforcement complaints.
        
       | encryptluks2 wrote:
       | A new job recently sent me a Mac after I've been using Linux for
       | the last 5+ years. I couldn't believe all the things that I had
       | to agree to. I don't think Apple has ever been about privacy. It
       | is one thing to say you support privacy, but another thing to
       | actually do it. Apple appears to only say they do, but their
       | actions show otherwise.
        
       | tlynchpin wrote:
       | Apple is well regarded for its strategy. I don't find it very
       | credible that Apple is reacting here; it seems apparent that they
       | would have anticipated substantial negative reception.
       | 
       | So this delay is preordained in the strat, right?
       | 
       | But Apple is only human and has had its share of monumental wtf,
       | so this is just a huge gaffe? The spyware team didn't know about
       | the iPhone 13 launch? This seems incredible.
       | 
       | There is some n-dimensional chess at play here. I feel like a
       | popcorn munching audience member who is actually a pawn in
       | another dimension.
        
         | BugWatch wrote:
         | It's a corporation, not a human.
         | 
         | (Corporations should never have been given "bodies" at all,
         | IMO.)
        
       | cletus wrote:
       | [Insert Picard meme of "Mild Shock"]
       | 
       | This is good but not good enough. Why? Because delayed means it
       | might still happen. What we're aiming for is "cancelled".
       | 
       | Scanning your phone and your content is a terrible capability to
       | exist and precedent to set. It's a small step to extend that
       | system to, say, scanning for DMCA violations, pirated content,
       | etc. And those will be justified by claiming those activities
       | support terrorism.
        
       | akomtu wrote:
       | Translating this to plain English using the corp/gov lingo:
       | 
       | Child = authoritarian
       | Safety = control/censorship (depends on context)
       | Features = restrictions
        
       | walshemj wrote:
       | MacRumors has really drunk the Kool-Aid with that headline by
       | not actually calling out what Apple was trying to do.
        
       | cblconfederate wrote:
       | So, Apple choosing to endanger children to avoid bad PR
        
         | BitwiseFool wrote:
         | Do you sincerely believe the situation is _that_ simple? It's
         | not like they are disabling a feature that has already been
         | around, so how is keeping the status quo 'endangering
         | children'?
        
           | concinds wrote:
           | It's quite commonly accepted that stopping the spread of CSAM
           | reduces demand, which has an effect on supply, and directly
           | saves some children from trafficking.
        
             | snovv_crash wrote:
             | Ah, but what about the children oppressed by authoritarian
             | governments? If we give those governments better tools for
             | oppression, does that harm more children than CSAM
             | reduction helps?
        
         | burnished wrote:
         | This is a spicy hot take. I could see an element of valid
         | criticism here, maybe something along the lines of questioning
         | how their equivocation now lines up with their previous
         | 'spirited' defense of this project (this would genuinely make
         | an interesting case study). Placing Apple in the active role of
         | endangering children though! Seems like the blame is in the
         | wrong place.
        
         | [deleted]
        
         | musha68k wrote:
         | Bad actors will just skip any of these technologies anyways:
         | recording with old camera gear, sending hard disks by mail,
         | meeting IRL to exchange USB sticks, etc.
         | 
         | Security theatre.
         | 
         | Back in 2013 Snowden thankfully gave us good insight into why
         | these efforts are made.
        
           | erikpukinskis wrote:
           | Some bad actors are stupid and lazy.
           | 
           | The fact that people can still get away with crimes given a
           | protection scheme is not, in itself, reason to skip the
           | scheme. The relevant question is: will it catch some people?
        
             | musha68k wrote:
             | No, the relevant question is: who gets to decide that we
             | end up having this discussion?
             | 
             | Mass surveillance is not an option.
             | 
             | Even if it were, it would eventually prove to be the
             | wrong approach. Mitigating (systemic) child abuse does
             | not necessitate flashy "crypto surveillance". It asks for
             | well-educated and vigilant societies and correspondingly
             | smart and industrious investigative work.
             | 
             | At best this is aimless activism. Either way trust has been
             | broken. Big time.
             | 
             | With great power comes great responsibility.
        
               | [deleted]
        
       | beezischillin wrote:
       | I fully expect them to eventually go ahead with it; I just
       | wonder whether they'll turn it on via a flip of a switch when
       | the device phones home, or whether it will come in a software
       | update. So basically: will upgrading to iOS 15 be safe for now?
        
       | post_break wrote:
       | Wow, we made Apple blink. I did not see that coming. People like
       | me who sold their iPhone and cancelled all services probably are
       | a drop in the bucket, but it looks like the general public and
       | the news picked up on this enough to get the ball rolling to make
       | them step back. So now the question is: in what 15.X update do
       | they just sneak it in?
        
         | azalemeth wrote:
         | Part of me wonders if the calculus includes either (a) the open
         | letter signed by (effectively) ~10k verified developers [1],
         | (b) the complaint letters from groups like the EFF [2, 3], or
         | (c) expert and qualified input from their (presumably huge)
         | marketing department...
         | 
         | [1] https://appleprivacyletter.com/
         | 
         | [2] https://act.eff.org/action/tell-apple-don-t-scan-our-phones
         | 
         | [3] https://www.eff.org/deeplinks/2021/08/apples-plan-think-
         | diff...
        
           | koheripbal wrote:
           | More likely D) The regulatory threats from the EU.
        
             | gambiting wrote:
             | Wasn't the feature meant to be US only - at least for now
             | anyway?
        
               | Mindwipe wrote:
               | Apple said themselves, there's only one iOS ROM, so it
               | was there on everyone else's devices regardless.
        
               | FabHK wrote:
               | But enabled only for the US region initially. That is bad
               | in itself, though, I think, insofar as it signals that
               | the system can be switched on and off, and thus maybe
               | configured, on a per-region basis, which does seem to
               | open the door for government pressure and abuse.
        
             | threatofrain wrote:
             | The EU's mood is cloudy here, as there's also legislative
             | pressure against child porn coming up this year.
        
           | judge2020 wrote:
           | Or D) employees who weren't part of the discussions heard
           | about it the day before the official announcement, and made a
           | stink about it internally.
        
             | haswell wrote:
             | E) All of the above
        
               | lumost wrote:
               | F) They received requests to scan for other forms of
               | content from various governments.
        
               | valkmit wrote:
               | Underrated comment right here
        
               | sslayer wrote:
               | G) Secret Pedo BoardMembers/Investors/Mutual Fund Manager
               | threaten to dump all Apple Stock.
        
               | marcan_42 wrote:
               | G) Turns out they can't know if they're scanning for
               | other forms of content, since they delegate computing the
               | hash lists of banned content to government organizations
               | like NCMEC anyway.
        
               | darwingr wrote:
               | So a hash for a public key used to encrypt something
               | could be in there? Banned books? A video of evidence
               | unsavoury to some institution?
        
               | dave7 wrote:
               | Yes indeed.
        
             | jtbayly wrote:
             | E) Their promise that they would only include scanning of
             | images that appeared in multiple countries' databases.
             | 
             | As of right now, _is_ there even a database other than the
             | one NCMEC has? I suspect they are waiting for the NCMEC to
             | spin up its European branch.
        
               | Mindwipe wrote:
               | Yes, the IWF database.
               | 
               | Indeed, the NCMEC database does not have an especially
               | good reputation, much like NCMEC themselves.
        
               | WaitWaitWha wrote:
               | There is the International Centre for Missing and
               | Exploited Children (ICMEC), Child Exploitation and
               | Online Protection (CEOP), Parents and Abducted Children
               | Together (PACT), Missing Children Europe (MCE), and
               | more just in Europe.
               | 
               | ICMEC is a fork of NCMEC.
        
               | josefx wrote:
               | They are probably just waiting for several countries to
               | spin up their databases of gay porn and blasphemous /
               | regime critical memes. Shouldn't be too hard to find two
               | of those countries working on a shared "child abuse"
               | database now that the feature has been widely publicized.
        
               | camgunz wrote:
               | Honestly, maybe that happened and they went "oh. shit."
               | and now they're "delaying".
        
               | Animats wrote:
               | There's so much censorship now worldwide, and it's
               | increasing. Russia doesn't allow promotion of the gay
               | agenda. Parts of the Islamic world view criticism of
               | Islam as a killing offense. China won't tolerate the flag
               | of Taiwan, Falun Gong, or criticism of the Party.
               | Afghanistan no longer allows music.
               | 
               | A decade ago, all those things were on the Internet
               | worldwide.
        
               | ilamont wrote:
               | China is also cracking down on LGBTQ groups:
               | 
               | https://www.bbc.com/news/world-asia-china-57759480
               | 
               |  _Many of the closed WeChat accounts display messages
               | saying that they had "violated" Internet regulations,
               | without giving further details.
               | 
               | The account names have also been deleted and just read
               | "unnamed".
               | 
               | "After receiving relevant complaints, all content has
               | been blocked and the account has been suspended," the
               | notice said.
               | 
               | The crackdown is the latest example of what some call
               | growing intolerance toward the LGBT community. Last year,
               | Shanghai Pride week, modelled on Pride events in the
               | West, was cancelled without explanation after 11 years of
               | it going ahead.
               | 
               | In 2019, the Oscar-winning Freddie Mercury biopic
               | Bohemian Rhapsody was released in Chinese cinemas, but
               | references to the Queen singer's sexuality and AIDS
               | diagnosis were censored.
               | 
               | In 2018, Weibo said all posts related to homosexuality
               | would be taken down, although it backtracked after
               | massive outrage._
        
               | totetsu wrote:
               | WeChat seems like a pretty big part of life in China and
               | with the know-your-customer requirements I imagine
               | those accounts are linked to individuals. It must be a
               | nightmare
               | to have one canceled.
        
               | noduerme wrote:
               | And most recently, China wants to censor less masculine
               | men: https://www.cbsnews.com/news/china-bans-sissy-men-
               | tv/
               | 
               | I have a theory about this that isn't politically
               | correct, but here goes. This has nothing to do with
               | "conservative values" or homophobia or whatever you
               | call sexual repression in places like Arkansas and
               | Afghanistan. It's
               | not based on religion or culture. It's just the CCP
               | running an actuarial table.
               | 
               | The CCP is a blunt force instrument. It realized some
               | time ago that the one-child policy had left it holding
               | the bag on taking care of a rapidly aging population
               | without enough young people to power the economy in 10-20
               | years. Not only that, you couldn't easily just repeal the
               | policy and expect a baby boom. They took a look at Japan
               | and realized they were about to hit a demographically
               | driven, deflationary wall. So the party planners moved
               | from repealing the 1CPolicy to actually offering cash
               | bonuses for second children. This volte-face happened
               | within a few short years. But it didn't work as well as
               | expected. Their sudden magnanimous gesture didn't bump
               | their 5-year plan's crop of new Han for a few reasons: A
               | shortage of women (unexpected consequence of the 1CP),
               | more women in the workforce who don't want to have
               | children, _video game culture_ which makes young men stay
               | home instead of going out and impregnating girls -- which
               | is why they're now limiting screen time, _gender
               | fluidity / queerness_ which suppresses baby-generation,
               | and of course _western individualism_, the great bugbear
               | of "harmony," which encourages people to wait for love
               | and financial stability before having children. At the
               | end of 5 years of encouraging people to have babies, they
               | don't have enough babies. So now they have to get harder
               | on the edge cases.
               | 
               | It's probably safe to assume that gays and lesbians
               | represent at least 10% of the Chinese population as they
               | do in most countries. So that's what, 120 million people?
               | Let's say half of them are young enough to have at least
               | one child? So we're talking about an extra 30-60 million
               | children if you can somehow get a replacement rate.
               | 
               |  _That's_ what I believe all these recent moves by the
               | CCP have been about. And banning test-prep programs?
               | Same thing. No point having more babies if you're also
               | getting overproduction of elites. They need construction
               | workers and factory workers. And someone was told, we
               | ain't gonna become Germany by bringing them in from
               | Tajikistan, so take this data and figure out how to
               | squeeze as much Han production as possible out of it in
               | the next 5 years.
        
               | erosetthanatos wrote:
               | I 100% agree with your assessment. I've thought the same
               | thing for a while. The downfall of the West is already a
               | foregone conclusion purely from demographics. CCP's
               | hardline on reversing the inverted pyramid trend will
               | catapult it to indisputable leader within the next 20-30
               | years.
        
               | [deleted]
        
               | noduerme wrote:
               | I don't think their attempts at upping the birthrate are
               | going to succeed. The CCP is fighting against the larger
               | trend of decreasing fertility rates in all post-
               | industrialized nations, while trying to promote
               | stability... they started later and they got to the wrong
               | side of the inverted pyramid _faster_ than the west. Now,
               | if they do have success with a baby boom it may come with
               | unintended consequences. The party leadership is aging,
               | worse, ossifying. It's going to get less and less easy
               | to put down youth protests the more Xi starts to look
               | like Fidel Castro. Maybe their technological cage can
               | keep citizens too afraid to talk to each other, but
               | they're too well connected to the rest of the world to
               | not know what's going on. And you can't cut the cord.
               | That's why the laws against video games, and the
               | censorship of LGBTQ media, have to be pushed as a
               | nationalistic marketing campaign rather than forced as
               | Maoist dictats. (Although the idea of Mao banning video
               | games is kinda funny). China will never be a superpower
               | _unless_ they have a liberal French-style democratic
               | revolution. Then all bets are off. A democratic China
               | might take over the world. Other than that, the worst the
               | west has to fear from that quarter is losing our own
               | souls in the process of doing business with a genocidal
               | government. It hasn't stopped us from buying toasters on
               | Amazon yet.
        
               | tg180 wrote:
               | Highly agree with you.
               | 
               | In the West we have already tried to force LGBTQ people
               | to accept the behavior of their assigned sex. Result?
               | Increased suicides, depression and drug use.
               | 
               | Obviously, after decades/centuries of failure we finally
               | decided to recognize the reality of the facts: you cannot
               | force people into a heteronormative life, nor to have
               | children.
               | 
               | They are in for a hard failure.
        
               | suifbwish wrote:
               | The French Revolution actually marked the end of France
               | ever being considered a superpower again. The entire
               | western world mocks their pathetic legal system.
        
               | ldargin wrote:
               | You can add U.S. states banning "Critical Race Theory" to
               | that list.
        
               | klipklop wrote:
               | You left out the "in public schools" part. You can still
               | look at all the CRT stuff you want on the internet in the
               | US. Books like _White Fragility_ are top sellers in the US.
               | 
               | This is not even remotely the same as the Taliban banning
               | music or China cracking down on LGBTQ stuff.
        
               | jandrese wrote:
               | Which was extra silly because CRT wasn't being taught in
               | public schools unless you are also including public
               | universities.
        
               | miles wrote:
               | _Yes, critical race theory is being taught in public
               | schools_ [1]
               | 
               | > Christopher Rufo reported[2] that 30 public school
               | districts in 15 states are teaching a book, _Not My
               | Idea_, that tells readers that "whiteness" leads white
               | people
               | to make deals with the devil for "stolen land, stolen
               | riches, and special favors." White people get to "mess
               | endlessly with the lives of your friends, neighbors,
               | loved ones, and all fellow humans of color for the
               | purpose of profit," the book adds.
               | 
               | > There are plenty of other examples that prove racial
               | essentialism and collective guilt are being taught to
               | young students. In Cupertino, California, an elementary
               | school required[3] third graders to rank themselves
               | according to the "power and privilege" associated with
               | their ethnicities. Schools in Buffalo, New York, taught
               | students[4] that "all white people" perpetuate "systemic
               | racism" and had kindergarteners watch a video of dead
               | black children, warning them about "racist police and
               | state-sanctioned violence." And in Arizona, the state's
               | education department sent out[5] an "equity toolkit" to
               | schools that claimed infants as young as 3 months old can
               | start to show signs of racism and "remain strongly biased
               | in favor of whiteness" by age 5.
               | 
               | [1] https://www.washingtonexaminer.com/opinion/yes-
               | critical-race...
               | 
               | [2] https://twitter.com/realchrisrufo/status/141329288126
               | 4005126
               | 
               | [3] https://www.city-journal.org/identity-politics-in-
               | cupertino-...
               | 
               | [4] https://www.city-journal.org/buffalo-public-schools-
               | critical...
               | 
               | [5] https://www.washingtonexaminer.com/news/arizona-
               | education-ba...
        
               | heavyset_go wrote:
               | Christopher Rufo defines critical race theory as
               | everything he doesn't like, so of course he'd see phantom
               | CRTs lurking behind every shadow and around every corner.
               | 
               | From here[1]:
               | 
               | > _Christopher Rufo, a prominent opponent of critical
               | race theory, in March acknowledged intentionally using
               | the term to describe a range of race-related topics and
               | conjure a negative association._
               | 
               | > _"We have successfully frozen their brand -- 'critical
               | race theory' -- into the public conversation and are
               | steadily driving up negative perceptions," wrote Rufo, a
               | senior fellow at the Manhattan Institute, a conservative
               | think tank. "We will eventually turn it toxic, as we put
               | all of the various cultural insanities under that brand
               | category. The goal is to have the public read something
               | crazy in the newspaper and immediately think 'critical
               | race theory.'"_
               | 
               | [1] https://www.washingtonpost.com/education/2021/05/29/c
               | ritical...
        
               | miles wrote:
               | > of course he'd see phantom CRTs lurking behind every
               | shadow
               | 
               | When it comes to teaching children _Not My Idea_,
               | there definitely seems to be more to the concern than
               | just
               | shadows:
               | 
               |  _The fallacy of 'whiteness'_
               | https://www.bostonglobe.com/2021/08/08/opinion/fallacy-
               | white...
               | 
               | > According to recent reports, public and private
               | elementary schools across the United States have used, as
               | part of racial equity education, an illustrated
               | children's book called "Not My Idea," in which a devil
               | with a pointy tail offers the young reader a "contract
               | binding you to whiteness." The contract promises "stolen
               | land," "stolen riches," and "special favors"; in
               | exchange, whiteness gets "your soul" and power over "the
               | lives of your friends, neighbors, loved ones, and all
               | fellow humans of COLOR."
        
               | kgarten wrote:
               | CRT is something taught in law school, not in elementary
               | schools. You have an indoctrination problem in the US but
               | it definitely does not come from CRT.
               | 
               | I looked at some of the illustrations of "Not My Idea"...
               | it's a bit weird and cringeworthy sometimes. Still,
               | compare that to the indoctrination that is in some of the
               | schoolbooks:
               | 
               | Slavery was just "black immigration"
               | https://www.theguardian.com/education/2021/aug/12/right-
               | wing...
               | 
               | The KKK was not morally wrong ... ??
               | https://www.nbcnews.com/politics/politics-news/texas-
               | senate-...
               | 
               | Banning of Black and Latino Authors
               | https://www.mcall.com/news/pennsylvania/mc-nws-pa-banned-
               | boo...
               | 
               | Teaching creationism:
               | https://www.arkansasonline.com/news/2021/apr/08/house-
               | advanc...
               | 
               | Are you as outraged about that as you are about a fringe
               | law theory?
        
               | kgarten wrote:
               | Downvotes, but no counter?
               | 
               | Can you give numbers comparing how many books with
               | problematic woke content are used versus the books
               | mentioned in the Guardian article?
        
               | CamperBob2 wrote:
               | Is he actually wrong? Seems like it should be possible to
               | judge the citations he's posted independently of his own
               | (admittedly obvious) biases.
        
               | josephg wrote:
               | Critical race theory is a mythic story - the sort
               | Yuval Harari describes as a religion. It sits alongside
               | replacement theory, humanism, environmentalism, crypto
               | anarchism, liberalism, Rand's objectivism, and so on.
               | 
               | These stories are lenses through which you can describe
               | the world. They're like software for the mind. And like
               | software, they aren't objectively right or objectively
               | wrong. Each of these stories (dubiously) explain and
               | obsesses over some aspects of the world, and ignores
               | other aspects completely.
        
               | dragonwriter wrote:
               | > Critical race theory is a mythic story -
               | 
               | No, its not.
               | 
               | Though there is a mythic anti-"Critical Race Theory"
               | story created by the American Right, and the thing within
               | it called "Critical Race Theory" is a (particularly
               | incoherent, because a number of unrelated and opposing
               | things from the real world that share only that they
               | concern race, and they are disliked by the American
               | Right, and they are _not actually Critical Race Theory_,
               | are jammed into it) mythic story.
               | 
               | > These stories are lenses through which you can describe
               | the world. They're like software for the mind. And like
               | software, they aren't objectively right or objectively
               | wrong.
               | 
               | Actual critical race theory (like critical legal studies,
               | from which it stems) holds that there are objective
               | features of social structures, with tangible, material
               | effects.
               | 
               | Like many hypothesized social phenomena, the complexity
               | of the systems involved may make falsification difficult
               | on a practical level, but the claims it makes are fact
               | claims which are objectively true or false.
        
               | kgarten wrote:
               | It seems he is. I see a much bigger push to prevent any
               | type of critical discussion about issues of slavery etc.
               | 
               | I'm also always surprised how little US citizens know
               | about their own history. Ask somebody about "Birth of a
               | Nation" and see what they can tell you about it.
               | 
               | Ask them if they know about the "Pro American Rally" in
               | 1939 and see what they say.
               | 
               | These things are not taught in school and it's a shame. I
               | know what I'm talking about; I'm German and I hated our
               | history education in school as we were discussing the 3rd
               | Reich nearly every year. Yet, reflecting on it and seeing
               | that the same stupid ideas get a hold today again, it was
               | not nearly enough. We should have taught more. The same
               | holds for the German colonial history (that was not
               | covered; that is changing slowly).
               | 
               | If you don't know your past, you are condemned to repeat
               | it.
               | 
               | https://www.washingtonpost.com/local/education/150-years-
               | lat...
               | 
               | https://www.theguardian.com/education/2021/aug/12/right-
               | wing...
        
               | romwell wrote:
               | So what's the data?
               | 
               | 30 school districts (out of how many?) use a book Rufo
               | doesn't like.
               | 
               | Any reason why anyone, aside from Rufo, should care?
               | 
               | Half of the nation doesn't teach sex ed because skyperson
               | doesn't like when people bang without signing an
               | exclusive banging agreement in public first. Half the
               | nation teaches kids that the Confederacy was formed to
               | protect "states' rights", carefully omitting that the
               | right in question was to own black people as
               | livestock[1]. But hey, 30 districts use a white-people-
               | bad, and that's the real problem.
               | 
               | And what does the last part of your comment (about AZ)
               | have to do with anything? Telling educators that 5-year-
               | olds can absorb shitty beliefs is now a cOnTrOvErSiAl
               | tHeOrY?
               | 
               | I don't even know where to start here, let's set this one
               | aside.
               | 
               | So, let's focus on this question first: assuming the book
               | you mentioned is bad, what percentage of public schools
               | use it, and in which way?
               | 
               | [1] TX is my go-to example. They were teaching that
               | slavery was a "side issue" in the Civil War when I was
               | living there in 2010-2017.
               | 
               | I'm not even going to bother dissecting the bullshit they
               | peddle these days. Feel free to dig in.
               | 
               | Texas oversees 1,247 school districts. Tell me more about
               | the problem of CRT in schools though.
               | 
               | https://www.smithsonianmag.com/smart-news/texas-will-
               | finally...
        
               | nyolfen wrote:
               | > this isn't happening
               | 
               | > this is happening and it's good
        
               | jandrese wrote:
               | 30 public school districts out of 13,800[1]. About 0.2%.
               | 
               | Rufo is trying to create the world's biggest molehill.
               | 
               | [1] https://ballotpedia.org/Public_school_district_(Unite
               | d_State...
        
               | hanselot wrote:
               | Ironically I agree. You should be exposed to every
               | religion after learning how to think critically. That way
               | you can train your propaganda filter instead of falling
               | prey to sophistry.
        
               | kevin_thibedeau wrote:
               | The canary will be when the dissident Winnie the Pooh
               | starts being disappeared.
        
           | jug wrote:
           | They even cite researchers as leading to the decision. Maybe
           | flaws with perceptual hashes? It took a mere day to discover
           | a hash collision after reverse engineering the system.
           | Granted, the colliding image looked like a grey mess, but I
           | wonder how far a malicious party might get after a year.
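           | 
           | For context, perceptual hashes are collision-prone almost
           | by design: they deliberately discard detail so that edited
           | copies of an image still hash alike. A toy average-hash
           | sketch (illustrative only; NeuralHash is a learned model,
           | not this):
           | 
           |     from PIL import Image
           | 
           |     def average_hash(path, size=8):
           |         # Shrink to 8x8 grayscale: nearly all detail
           |         # is thrown away on purpose, so similar images
           |         # map to (nearly) the same 64 bits.
           |         img = Image.open(path).convert("L")
           |         px = list(img.resize((size, size)).getdata())
           |         avg = sum(px) / len(px)
           |         bits = ["1" if p > avg else "0" for p in px]
           |         return int("".join(bits), 2)
           | 
           |     def hamming(a, b):
           |         # Number of differing bits between two hashes.
           |         return bin(a ^ b).count("1")
           | 
           | With only 64 output bits and that much information thrown
           | away, steering an unrelated image onto a target hash is
           | mostly an optimization problem, which is roughly what the
           | NeuralHash collisions demonstrated.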
        
             | chillfox wrote:
             | The colliding images stopped looking like a grey mess
             | within the week and just looked like normal low-quality
             | images.
        
             | mbesto wrote:
             | They also could have made a legitimate case to non-
             | technical people (at the FBI/NSA/blah) that "you sure you
             | want this? this system will be abused...someone could write
             | a hash collision". The Gov't agency scoffed because they
             | didn't believe it.
             | 
             | So Apple called them out on it by letting the market
             | decide. They now have that evidence.
        
               | notriddle wrote:
               | Never attribute to 4D chess what can be adequately
               | explained by knee-jerk reaction.
        
               | mortenjorck wrote:
               | This is an excellent and perceptive Hanlon corollary.
        
               | ksm1717 wrote:
               | Never attribute credence to an argument because it has a
               | name
        
               | gfodor wrote:
               | Razors are inherently self-identifying as heuristics -
               | not rules. So no need to add more disclaimers.
        
               | ZainRiz wrote:
               | Never attribute credence to an argument because it was
               | stated confidently
        
             | mlac wrote:
             | I don't really understand the argument that was thrown out
             | by many people: "Attackers will frame people by planting
             | hash collisions on their phones!"
             | 
             | So...
             | 
              | (a) a bad actor is going to target someone, and have the
              | resources to generate enough collisions that (b) look
              | like CP, but aren't CP, yet are close enough (c) to pass
              | human review and trigger an investigation, so (d) they
              | need the hash collisions to look like CP, but not be real
              | CP? or ????
             | 
             | If a bad actor wants to frame someone, it's easy to do this
             | today - hack their system or home network, open it to the
             | internet, place the photos, call the FBI and report an
             | anonymous tip, with the URL where it is hosted and open to
             | the internet. Don't need hash collisions.
             | 
             | Hacking someone's iPhone (How do you get the photos on
             | there without forensic logs that they were added?) or
             | iCloud so that you can place hash collisions that look like
             | crap and fail to pass the review doesn't make sense and
             | leaves too much of a trail. Oh? And someone won't notice
             | thousands of photos just added to their phone?
             | 
             | A bigger threat would be deepfake CP. When that becomes a
             | reality, it will be a mess, because an attacker could
             | theoretically generate an unlimited amount of it, and it
             | will be extremely difficult to tell if it is authentic.
             | Those hashes wouldn't be in the CSAM database, but if the
             | attacker put them out on a server to be taken down, they
             | would get added eventually and then show up in a scan
             | elsewhere.
             | 
             | But I'd be pretty horrified (and definitely call my
             | attorney, Apple, and local FBI field office) if thousands
             | of CSAM images just showed up in my iPhone photo stream.
             | 
             | Edit: Downvotes are fine, and I completely understand other
             | arguments against this, but this one has just not made
             | sense to me...
        
               | gunapologist99 wrote:
               | This is a very simple attack to pull off; email someone
               | with the colliding image embedded in the MIME document.
               | Set height and width equal to zero or one pixel. Even if
               | the user never actually sees it or marks it as spam, it's
               | still on their device and in their email. For bonus
               | points, spoof the "From" as from Apple. At the very
               | least, they're getting SWATted.
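                | 
                | For the curious, a sketch of what "embedded in the MIME
                | document" means in practice, using Python's stdlib
                | email package. The addresses and image bytes are
                | placeholders, and whether such an image ever reaches
                | the scanning pipeline is disputed in the replies below:
                | 
                |     # Build an HTML email with an inline 1x1 image the
                |     # recipient never consciously sees.
                |     from email.message import EmailMessage
                | 
                |     msg = EmailMessage()
                |     msg["From"] = "spoofed@example.com"  # placeholder
                |     msg["To"] = "victim@example.com"     # placeholder
                |     msg["Subject"] = "hello"
                |     msg.set_content("plain-text part")
                |     msg.add_alternative(
                |         '<img src="cid:img1" width="1" height="1">',
                |         subtype="html")
                |     # Attach the (colliding) image as a related part.
                |     html_part = msg.get_payload()[1]
                |     html_part.add_related(
                |         b"<image bytes>", "image", "png", cid="<img1>")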
        
               | mlac wrote:
                | I would expect the attempted "SWATting" to be
                | transparent to the human review, unless it was actually
                | CP or looked like CP, at which point the person would
                | be taken for a ride... But that is a real threat with
                | actual CP now.
               | 
               | And again, there is the whole fact that you received the
               | email and there is a log that you received it.
        
               | vlovich123 wrote:
               | "Human review" here means basically minimum wage person
               | who's looking through a lot of abuse images on a daily
               | basis & has significant fatigue. I view it more as a CYA
               | thing than any meaningful protection mechanism.
               | 
               | Additionally, the images sent to review are significantly
               | downscaled versions of the original & could easily be
               | made to be ambiguous.
               | 
                | The most difficult challenge in this SWATting story is
                | that Apple has a secret secondary hash that's checked
                | on a collision; that's the part that feels hard on a
                | technical level. However, there are also really smart
                | people out there, so it wouldn't surprise me if a
                | successful attack strategy is developed at some point,
                | given time.
        
               | mlac wrote:
               | Sure. I mean, my main concern would be people planting
               | actual illegal images. Not just colliding hashes. If
               | anything, a handful of images that have hash collisions,
               | once they made it to the authorities, would then be
               | reviewed in detail and shown as a false positive, and
               | then ignored in the future / whitelisted. Or investigated
               | because someone is generating images to directly tamper
               | with the program / frame people.
               | 
               | No one is going to be prosecuted on "significantly
               | downscaled ... ambiguous" versions of original fake
               | images with a hash collision that flagged a review and
               | was handed to the FBI accidentally because a "minimum
               | wage" fatigued person passed it on.
               | 
               | I get the counter-arguments, but the hash collision thing
               | is just, sort of... weird? I even get the argument that
               | an innocent hash collision may have your personal and
               | private images reviewed by some other human - and that's
               | weird. But I can't really see it going further (you'll be
               | arrested and sentenced to life in prison from the HASH
               | COLLISIONS!).
               | 
               | It's just using technical terms to scare people who don't
               | understand hashes and collisions and probability, and not
               | really founded on reason.
        
               | vlovich123 wrote:
               | > once they made it to the authorities, would then be
               | reviewed in detail and shown as a false positive, and
               | then ignored in the future / whitelisted
               | 
               | Which typically will be a court case, or at least
               | questioning by police. This can be quite a destructive
                | event on someone's life. Also, there's no mechanism for
                | whitelisting outlined in the paper, nor can I imagine a
                | mechanism that would work: either you've now got a way
                | to distribute CP by abusing the whitelisting
                | fingerprint mechanism, or you only match exact
                | cryptographic hashes, which is an expensive CPU
                | operation and doesn't scale, as every whitelisted image
                | would have to be in there.
               | 
               | Also, your entire premise is predicated on careful and
               | fair review by authorities. At scale, I've not seen this
                | actually play out. Instead, either the police will be
                | overwhelmed and fail to investigate legitimate cases
                | (too many false positives) or they'll aggressively
                | police all cases to avoid missing any.
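                | 
                | A tiny sketch of why exact cryptographic hashing is a
                | poor fit here (illustrative Python; the file names are
                | placeholders): flipping a single pixel changes the
                | SHA-256 digest completely, so an exact-match whitelist
                | only ever covers byte-identical files, unlike a
                | perceptual hash.
                | 
                |     import hashlib
                | 
                |     def sha256_file(path: str) -> str:
                |         # Stream the file so large images don't need to
                |         # fit in memory.
                |         h = hashlib.sha256()
                |         with open(path, "rb") as f:
                |             for chunk in iter(lambda: f.read(8192), b""):
                |                 h.update(chunk)
                |         return h.hexdigest()
                | 
                |     # original.png and one-pixel-edit.png produce
                |     # totally different digests:
                |     # sha256_file("original.png") != sha256_file("one-pixel-edit.png")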
        
               | tyingq wrote:
               | Going down a rabbit hole, but a colliding image could be
               | an adult that doesn't look like an adult.
        
               | misnome wrote:
               | How does this theoretical attack get them to save the
               | attached image to iCloud?
        
               | newbamboo wrote:
               | iCloud is trivial to hack. Recall the massive leak of
               | celebrity nudes a few years back. As long as iCloud is
               | forced on users (you cannot apply security updates to an
               | apple device without an Apple ID and an attached iCloud
                | account) these attacks will be simple to do en masse, with
               | very little risk to the perpetrators. Terrible security
               | model, with a long history of spectacular and avoidable
               | breaches which go totally unsolved.
        
               | misnome wrote:
               | Let's ignore the fact that you just shifted the goalposts
               | from "emailing a 1 pixel image SWATs them" to an
               | extremely wide scope for a minute -
               | 
                | Even assuming it's true that iCloud is trivially
                | "hackable" - and as I understand it, it was never clear
                | how those leaks happened - how does uploading to iCloud
                | help when the image specifically needs to be uploaded
                | from the user's phone along with the scanning metadata?
                | 
                | In fact, isn't Apple's proposed implementation here the
                | _only_ cloud service that protects against your proposed
                | attack? While other clouds scan stored data and can be
                | triggered by your attack, Apple's requires you to upload
                | specifically from a registered phone on their account;
                | data stored on-cloud is never scanned.
        
               | newbamboo wrote:
                | You can choose to use an iPhone. Most people will never
                | be targeted. Hell, you could use a phone with no
                | security at all and odds are you'll be safe. But if you
                | have enemies or are a high-profile target, Apple has
                | made you easy to destroy. iPhones themselves can be
                | hacked with no-click attacks, and we know this because
                | Macron's phone was one of many listed in the recent
                | fiasco. iCloud can be hacked, because legions of
                | celebrities had their nudes published. If the device is
                | not secure, even for the president of France, and
                | iCloud is not secure for legions of celebrities, then
                | people can and will be hit with this attack and have
                | their lives permanently destroyed. Worse than having
                | your sex photos leaked, worse than having all your
                | calls and communications intercepted, you'll become
                | known as a pedo. You can't undo that. To be sure, that
                | probably won't happen to most people, but political
                | figures, those with enemies working in infosec,
                | etc.... I want a secure phone, and iPhone no longer
                | meets that need for me and many others. You can do you.
        
               | AmericanChopper wrote:
               | The described attack is completely unnecessary. Just send
               | colliding images to people on WhatsApp. They go right
               | into Photos, and then to iCloud if that's enabled.
               | There's no reason most people would assume that this is
               | what was happening if somebody sent them some slightly
               | weird looking images.
               | 
               | The response to this is "yeah but then a human will
               | review it and nothing will happen to the victim of the
               | attack, because it's just some slightly blurry ordinary
               | images". But it ignores to entirely likely harms that
               | could result from that.
               | 
               | 1) Law enforcement use it as the basis of getting a
               | search warrant, but conveniently leave the bit about the
               | alerts being false alarms off the warrant application.
               | 
               | 2) The list of people who have had CSAM alerts is
               | inevitably leaked to the public, and the victim has to
               | spend the rest of their lives explaining to people like
               | employers why Apple flagged them as possessing child
               | sexual abuse material.
               | 
               | At the end of the day, all the gaslighting about "no
               | potential for inadvertent harm" is bs, because it's my
               | device, so get lost. Go run your anti-privacy software
               | somewhere else, imo.
        
               | misnome wrote:
               | As I understand it;
               | 
               | - Law enforcement doesn't get _anything_ unless it
               | triggers a large number of images that match the
               | perceptual hash
               | 
               | - It also needs to match a -private- perceptual hash,
               | that isn't distributed to devices, and so we don't have a
               | reliable way of generating collisions for or even knowing
               | that collisions are generated
               | 
               | I mean this whole thing is bad enough on its own without
               | having to artificially manufacture extremely specific
                | scenarios and extrapolating from there to invent
                | hysterical conclusions.
        
               | AmericanChopper wrote:
               | Well if you think a tech giant that's partnered in a
               | surveillance program with the NSA isn't going to share
               | anything they feel like with the government, then go
                | ahead and use that as the basis of coming up with your own
               | opinions.
               | 
               | But you're right though, the possibility of the list
               | being leaked and ruining countless innocent lives is the
               | much more likely of the two scenarios I described.
        
               | misnome wrote:
               | > if you think a tech giant that's partnered in a
               | surveillance program with the NSA isn't going to share
               | anything they feel like with the government
               | 
               | In that case, nothing is stopping them from scanning
               | everything already uploaded anyway, and nothing is
               | stopping them pushing code to your device to scan it
               | without telling you about it. Nothing is stopping them or
               | the government from making these "lists" anyway.
               | 
               | I'm not saying you (or anyone) should trust Apple, but if
               | you already don't - then this changes literally nothing.
        
               | AmericanChopper wrote:
               | By my own personal assessment it changes the way I view
               | them, because this is the most overtly anti-user, anti-
               | privacy thing they've ever done. Perhaps I should have
               | never had any level of trust for them, but you live and
               | learn I guess.
               | 
               | Feel free to respond to my point about the alert catalog
               | inevitably being leaked and ruining lives. Or you could
               | just have a go at gaslighting me a little more if you
               | prefer.
        
               | misnome wrote:
               | Gaslighting? Your proposed scenario is:
               | 
               | - Send colliding images
               | 
               | - image gets uploaded to icloud automatically
               | 
               | - image _also_ collides with private hash <- completely
               | unclear how this happens
               | 
               | - Only the colliding images are looked at by apple and
               | are determined to be innocent
               | 
               | - user goes on a list (This is an imagined scenario)
               | 
               | - User is reported to law enforcement even though the
               | images are innocent (This is an imagined scenario)
               | 
               | - Law enforcement uses this hypothetical report to file a
               | warrant (This is an imagined scenario)
               | 
               | - Law enforcement uses the hypothetical warrant to
               | extract images that are completely innocent, and somehow
               | build a case around this
               | 
               | - The "List", which is entirely a hypothetical of yours,
               | "leaks" (This is an imagined scenario)
               | 
               | Which also requires:
               | 
               | - Apple does not counter the meaning of "the list"
               | 
               | - Apple is not sued for vast quantities of money
               | 
               | I expect the first argument is that none of this matters
               | as long as "the idea" is out there, the reputational
               | damage is already done. Except if that's true, then none
               | of this is necessary at all, just make the accusation.
               | 
               | So, sure, continue to thread the needle between "They are
               | automatically sending all information to the government,
               | so promises are meaningless" and "This new process, on
               | top of them potentially sending all information to the
               | government, somehow makes it worse".
               | 
               | I mean, this is all a million times more difficult and
               | less likely than just, like, sending them CP in the first
               | place. Or uploading it to their Gmail or any other cloud
               | they use. Or just send a report that they have it to the
               | police without actually doing anything.
        
               | AmericanChopper wrote:
               | That's absolutely not an accurate representation of what
               | I've said, or how this system works.
               | 
               | All it requires is somebody to send somebody else a
               | colliding image.
               | 
               | This will send an event to Apple. There is nothing
               | imaginary about that, it is exactly how the system works.
               | 
               | Now that Apple has this information, the only thing left
               | is for it to be leaked or compromised in some way.
               | 
                | This is much simpler than the scenario you've described,
                | because those alternatives require the attacker to first
                | commit the crime of possessing CP. It's also possible to
                | do without tipping off the victim in any way.
                | 
                | Apple, in case you didn't know, is a company that has
                | already been the source of a couple of the most
                | notorious data breaches ever (and has somehow managed to
                | so far avoid getting "sued for vast quantities of money"
                | for them).
               | 
               | What you're trying to do here is quintessential
               | gaslighting.
        
               | skoskie wrote:
               | I think there's an unspoken assumption that photos are
               | only the first part of this scanning exercise. IMO, Apple
               | will likely end up scanning all content you choose to put
               | on their servers.
               | 
               | Tangentially, I think keeping their servers clear of
               | illegal material is actually Apple's main motivation.
               | This, in turn, supports claims made by nay-sayers that
               | Apple could scan for other types of images/content in
               | more repressive countries (but not necessarily report the
               | people who did it). However, this assumption also
               | contradicts arguments that Apple will start scanning for
               | pictures of (e.g.) drugs, or weapons. Such images are not
               | inherently illegal and therefore of no interest to Apple.
        
               | dwaite wrote:
               | > IMO, Apple will likely end up scanning all content you
               | choose to put on their servers.
               | 
               | Even if that was a goal (and I would argue they have a
               | hard stance against it), this system as built is not
               | usable for that.
               | 
               | While they can scan locally, every step of recording,
               | thresholds, and subsequent automated/manual auditing is
               | built to require content to be uploaded to iCloud Photos.
        
               | skoskie wrote:
               | I agree that the current photo scanning system won't work
               | for other types of files. Further, I don't think they
               | really even need to scan text files except to see if it's
               | actually a filetype that should be scanned. And they can
               | already easily scan music and such, but have shown no
               | interest in it from a legal standpoint. Video seems like
               | a prime candidate for the next generation of this tech.
               | And I very much think they will be able to do that before
               | the decade is over.
        
               | misnome wrote:
               | > Tangentially, I think keeping their servers clear of
               | illegal material is actually Apple's main motivation
               | 
                | My guess about this whole debacle is that - with
                | pressure from the government to scan their cloud
                | storage - this is the alternative scenario that avoids
                | giving up (or being forced to give up) the "encryption"
                | guarantees of their cloud. I'm not sure what technical
                | process they have in place to "only decrypt with valid
                | law enforcement requests" or allow account rescue, but
                | it seems likely that not just any employee can view
                | whatever they want, before or after this system.
                | 
                | Saying I can see where the pressure comes from doesn't
                | mean I'm saying this is a good solution, though.
                | Clearly, technically implementing this opens a can of
                | worms that can't really be closed again and makes a lot
                | of other scenarios "closer".
                | 
                | Also, evidently, people are a lot more comfortable with
                | the idea of Apple actively scanning what they store in
                | the cloud than with it transmitting the information in
                | a side channel so that it never has to handle decrypted
                | data without a hit.
        
               | skoskie wrote:
               | > My guess about this whole debacle is that - with
               | pressure from the government to scan their cloud storage
               | - that this is the alternate scenario to avoid giving up
               | (or being forced to) the "encryption" guarantees of their
               | cloud.
               | 
               | This is exactly the case.
               | https://www.eff.org/deeplinks/2019/12/senate-judiciary-
               | commi...
               | 
               | > I'm not sure what technical process they have in place
               | to "only decrypt with valid law enforcement requests" or
               | allow account rescue, but it seems likely that not just
               | any employee can view whatever they want, before or after
               | this system.
               | 
               | They have master keys that can be used to decrypt
               | _almost_ everything you upload. They can be compelled to
               | decrypt and turn over information on anyone. Another
               | (unsourced) comment in this thread indicated they do so
               | 30,000 times per year. The new encryption scheme will
               | effectively stop this for photos, and no doubt other
               | files in the future.
               | 
                | Apple's side will begin using threshold secret sharing,
                | which requires roughly 30 key shares (the threshold)
                | before the matching images can be decrypted.
                | 
                | The key shares are only generated on-device, and only
                | from a hash that results from a CSAM match. For the
                | other photos, the shares simply aren't generated, so
                | they don't even exist.
                | 
                | As an interesting side note, this means that a person
                | who surpasses the CSAM threshold will still only reveal
                | the images that actually match the CSAM database. Every
                | other image, including any CSAM unknown to authorities,
                | remains encrypted. This is hardly a big win for the big
                | scary government. They now have far less ability to
                | search for evidence of any other crimes. You could
                | upload video of your bank robbery to iCloud, and as
                | long as your personal device remains secure, nobody
                | will know.
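                | 
                | A minimal Python sketch of threshold secret sharing
                | (Shamir's scheme) may make this concrete. It is
                | illustrative only - the field modulus, threshold, and
                | names are toy assumptions, not Apple's published
                | protocol - and it needs Python 3.8+ for modular inverse
                | via pow():
                | 
                |     # Shamir secret sharing: any `threshold` shares
                |     # reconstruct the secret; fewer reveal nothing.
                |     import random
                | 
                |     PRIME = 2**127 - 1  # toy field modulus (assumption)
                | 
                |     def make_shares(secret, threshold, count):
                |         coeffs = [secret] + [random.randrange(PRIME)
                |                              for _ in range(threshold - 1)]
                |         def f(x):
                |             return sum(c * pow(x, i, PRIME)
                |                        for i, c in enumerate(coeffs)) % PRIME
                |         return [(x, f(x)) for x in range(1, count + 1)]
                | 
                |     def reconstruct(shares):
                |         # Lagrange interpolation at x = 0 recovers the
                |         # secret from any `threshold` distinct shares.
                |         secret = 0
                |         for xi, yi in shares:
                |             num, den = 1, 1
                |             for xj, _ in shares:
                |                 if xj != xi:
                |                     num = num * (-xj) % PRIME
                |                     den = den * (xi - xj) % PRIME
                |             secret = (secret + yi * num *
                |                       pow(den, -1, PRIME)) % PRIME
                |         return secret
                | 
                |     # One share per flagged photo: below the threshold,
                |     # the account key cannot be recovered.
                |     shares = make_shares(123456789, threshold=30, count=40)
                |     assert reconstruct(shares[:30]) == 123456789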
        
               | a1369209993 wrote:
               | > email someone with the colliding image embedded in the
               | MIME document. Set height and width equal to zero or one
               | pixel.
               | 
               | (I think) the complaint is: how is that different from
               | just using CSAM images, no collision required?
        
               | FabHK wrote:
               | Apple's proposed system does not scan all the pictures on
               | the device, only images about to be uploaded to iCloud. A
               | picture in an email would not be uploaded (whether
               | visible or not).
               | 
               | Even if uploaded to iCloud (such as pictures sent via
               | WhatsApp by default), and above the threshold, they would
               | still be scanned by a second algorithm and subject to
               | human review. So, your "very simple" attack fails on at
               | least three counts.
        
               | sweetheart wrote:
               | If this were a reasonable vector, it'd already be an
               | issue, at least if you use large email clients like
               | Gmail, which already scan for CSAM.
               | 
               | Edit: Downvotes because?...
        
               | speeder wrote:
               | Because it IS already an issue.
               | 
               | For example when CIA/NSA tools leaked, one of them had
               | precisely this purpose.
        
               | cmsj wrote:
               | Downvotes because nerds love to construct a super super
               | unlikely corner case to argue against something ;)
        
               | sweetheart wrote:
                | I'd call that out as totally unfair... if I hadn't found
                | myself doing that many times before hahaha.
        
               | CamperBob2 wrote:
               | We, the nerds, only have to be wrong once in order to
               | allow immense harm to occur. Apple voluntarily put itself
               | in that position.
               | 
               | Conversely, the governments of the world get to keep
               | trying, over and over.
        
               | brigade wrote:
               | Those use a different perceptual hash that's intended to
               | be kept secret.
               | 
               | It's significantly harder to develop collisions for an
               | algorithm you cannot inspect or obtain the output of.
               | 
               | (well also, emailing actual CSAM is way easier and mostly
               | just gets the sender reported)
        
               | foldr wrote:
               | Surely the sender is also likely to get reported if they
               | email the image with the perceptual hash collision. Lots
               | of companies run these scans. E.g. if you send the image
               | via gmail it will most likely be scanned by Google.
        
               | cherioo wrote:
               | Apple also uses a second, different perceptual hash
               | that's intended to be kept secret, on server side.
        
               | therealjumbo wrote:
               | Then why have a client side scanner in the first place?
               | 
                | You're exposing the hashes to the world, you're not able
                | to E2E encrypt anything if you need to scan server-side,
                | which you probably do since trusting the client no
                | matter what is generally bad in potentially adversarial
                | situations, and you get all this negative press, loss of
                | reputation, and potentially pressure from governments
                | around the world to use this for other ends. Cui bono?
        
               | nicce wrote:
               | I think you have misunderstood the system.
               | 
               | The second scan applies only for those images which are
               | flagged as positive, which are then accessible by Apple.
               | This is applied to detect adversarial hashes. The rest of
                | the images stay encrypted. So, indeed, on-device scanning
               | is the only way to enable at least partial E2EE with CSAM
               | detection.
               | 
                | Yes, this was a PR failure for Apple. They rushed the
                | announcement because of the leaks, and they assumed
                | people would understand the system when they did not.
                | There is too much misunderstanding. The scanning, for
                | example, is built so deeply into the iCloud upload
                | pipeline that one does not simply change a policy to
                | scan the whole phone.
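                | 
                | A toy sketch of the two-stage flow as I understand it
                | (all names, hash values, and thresholds below are
                | illustrative assumptions, not Apple's actual code or
                | parameters):
                | 
                |     # Stage 1 runs on-device against the public hash
                |     # set; stage 2 runs server-side, but only on images
                |     # whose vouchers crossed the reporting threshold.
                |     ON_DEVICE_DB = {0x1F3A}  # public perceptual hashes (toy)
                |     SERVER_DB = {0x9C41}     # private second-hash set (toy)
                |     THRESHOLD = 30           # matches needed to reveal anything
                | 
                |     def device_scan(images):
                |         # Emits opaque "safety vouchers" only for
                |         # matches; everything else stays encrypted.
                |         return [img for img in images
                |                 if img["public_hash"] in ON_DEVICE_DB]
                | 
                |     def server_review(vouchers):
                |         if len(vouchers) < THRESHOLD:
                |             return []  # below threshold: nothing decryptable
                |         # The private second hash weeds out adversarial
                |         # collisions before any human review happens.
                |         return [v for v in vouchers
                |                 if v["private_hash"] in SERVER_DB]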
        
               | therealjumbo wrote:
               | >I think you have misunderstood the system.
               | 
               | On a technical level I think you're correct. As a
               | holistic approach to the problem, I still disagree. This
               | is too cute for its own good. The PR misunderstanding is
               | a symptom of that.
               | 
               | >The second scan applies only for those images which are
               | flagged as positive, which are then accessible by Apple.
               | 
                | In the end, Apple's software is scanning all of the
                | images, so why is it any more privacy-respecting to do
                | it this way? I guess reasonable people can disagree on
                | that; personally, I wasn't fully aware of the cloud-side
                | scanning either, and I don't think the public was either.
               | This is similar to Snowden's revelations, if you were
               | paying attention you probably already knew a lot of that,
               | but the incident made everyone aware of it in a very
               | blunt way.
               | 
               | >The rest of the images stays encrypted
               | 
                | I think this is unclear: Apple can still decrypt those
                | other images; how else could you view them in a browser?
               | 
               | This goes back to what Stratechery said about capability
               | vs policy.
        
               | nicce wrote:
               | > I think this is unclear, Apple can still decrypt those
               | other images, how else could you view them in a browser?
               | 
                | Obviously there is a change coming to iCloud. Otherwise
                | the whole PSI protocol is pointless.
        
               | gunapologist99 wrote:
               | At least we know that secrets never leak out.
        
               | chipotle_coyote wrote:
                | Assuming the system works as Apple's described it, this
                | attack doesn't work. The image needs to be added to the
                | Photos library _and_ the Photos library needs to be
                | syncing with iCloud. Images in email aren't scanned, and
                | an image that's zero or one pixel can't even be clicked
                | on to be added to the library.
               | 
               | And, to at least respond to two obvious counter-arguments
               | I've looked into:
               | 
               | "But it's just one line of code to change it to scan
               | images (that aren't uploaded to iCloud) (that are
               | anywhere on the device)!" No, it isn't; if you read the
               | technical documentation and the more technically-oriented
               | interviews Apple's given on this, there isn't just _one_
               | hash that needs to be matches, there are _two_ hashes,
               | one on the device and one on the server. (I think Apple
               | did a very poor job of communicating this to a general
               | audience; it certainly wouldn 't have alleviated all the
               | concerns, but if it was understand as "client-server
               | scanning" rather than "client-only scanning" it might
               | have at least changed the tenor of the conversation.)
               | That doesn't mean they can't do a combination client-
               | server scan on every single image or even file on the
               | device, but it makes it both more difficult to do and
               | more difficult to _hide._
               | 
               | "But what if the system doesn't work as Apple's described
               | it?" Well, if you don't trust Apple to some degree, all
               | bets are off. They already do ML-based image analysis of
               | all photos in your photo library regardless of iCloud
               | status and they've literally demoed this on stage during
               | iPhone keynotes, so if Apple was going to secretly give
               | government access to on-device scanning, a different
               | technology -- one that works (questionably well) on _all_
               | images, not just already-learned ones -- is literally
                | already there. The only way you "know" what Apple is
               | doing with your data on or off device comes from a
               | combination of what they tell you and what third-party
               | security researchers discover.
        
               | jackTheMan wrote:
                | There is a huge, huge difference between doing it
                | secretly and doing it announced.
                | 
                | If they got caught doing this secretly in China, that
                | would be a big blow. But if they do it openly, and it is
                | known that the government provides the hashes, they can
                | wash their hands of it.
        
               | sweetheart wrote:
                | I also don't understand the vector everyone seems
                | worried about, especially considering that perceptual
                | hashing isn't new, as far as I'm aware, and it hasn't
                | yet led to any sort of wave of innocent folks being
                | screwed over by sham illicit images.
               | 
               | I think there _is_ an argument to be made about a system
               | like this being used to track the spread of political
               | material, and it's easy to see how such a system would be
               | terrible for anyone trying to hide from an authoritarian
                | power of some type, but that'd already require Apple to
                | be absolutely and completely compromised by an
                | authoritarian regime, which isn't high on my list of
                | concerns.
        
               | azalemeth wrote:
               | The main issue is that the system is basically a bit of
               | snitching spyware dolled up with cryptographic magic to
               | make it _only work with really bad stuff, honest_.
               | 
                | As recently posted on HN [1], one should be _very_ wary
                | of backdoors, no matter how much one believes only the
                | _right guys_ could _ever_ use them. Once they're there,
                | they're there: it's arrogant beyond belief to think that
                | opponents of yours _won't_ exploit them.
               | 
               | I won't use an apple product with this feature. I've been
               | using apple's ecosystem since about 1994.
               | 
               | [1] https://twitter.com/matthew_d_green/status/1433470109
               | 7425182... and
               | https://news.ycombinator.com/item?id=28404219
        
               | mlac wrote:
                | I understand your points on backdoors. But the argument
                | that the "cryptographic magic" is the bad part - that it
                | will immediately send people to prison over HASH
                | COLLISIONS - in my view takes away from the actual
                | concerning points.
        
               | Zak wrote:
               | What most people are focused on is that this is spyware
               | built in to the operating system. The fact that
               | perceptual hashes have some weak points could also lead
               | to problems, but that's not the main focus of most
               | opposition to this system.
        
               | FabHK wrote:
               | > there _is_ an argument to be made about a system like
               | this being used to track the spread of political material
               | 
               | How so? Unlike other cloud providers, which do scan all
               | uploaded images server side, this system is specifically
               | designed to prevent this.
               | 
               | (As you say, if Apple is entirely compromised, then all
               | bets are off anyway.)
        
               | Griffinsauce wrote:
               | > Apple is absolutely and completely compromised by an
               | authoritarian regime
               | 
               | I don't understand this, they can just be legally
               | compelled.
        
               | sweetheart wrote:
               | Yeah sorry, I was a little vague. I consider that
               | willingness to comply as already having been compromised.
               | And I think, to be fair, if companies like Apple are ever
               | in the position of using technologies like this to track
               | individual dissidents of an authoritarian regime, we are
               | already royally screwed.
               | 
               | But, to be clear, you make a legitimate point.
        
               | cmsj wrote:
                | And they already are. Apple (and indeed every company)
                | has to follow all sorts of rules in all sorts of
                | countries. Some of those rules are fairly harmless, some
                | of them are about radio power levels, and some of them
                | are "don't show the Taiwanese flag in China".
        
               | mlac wrote:
               | And I also understand the concerns with this (and the
               | potential downsides of the technology).
               | 
                | But a motivated bad actor has a much easier time just
                | putting an image of the Taiwanese flag or propaganda on
                | someone's phone than trying to craft a hash collision
                | that triggers... If an attacker is really after someone,
                | I would expect them to put the _actual_ material on that
                | person's phone...
               | 
               | ... It's like trying to frame someone for drugs in their
               | car and going through the hassle of synthesizing a
               | chemical that triggers drug dogs to react, but it isn't
               | actually the drug. Wouldn't they just... buy drugs and
               | stick them in the car?
        
               | zzo38computer wrote:
               | > Wouldn't they just... buy drugs and stick them in the
               | car?
               | 
               | Maybe they want to create too many false positives
               | deliberately, or maybe they do it just because they want
               | to see if and how synthesizing such a chemical can be
               | done.
               | 
               | When putting pictures on someone's phone or computer,
               | there are ways to add false positives as well as maybe
               | doing false negatives sometimes (e.g. by encrypting the
               | picture or using an uncommon file format so that it must
               | be converted, or using complicated HTML/JS/CSS to render
               | it instead of just a picture file).
               | 
               | Also, if someone has the picture or drug or whatever to
               | find if it is what they are, can you accuse them (maybe
               | they are the police) of liking pictures and drugs, too?
        
           | newbamboo wrote:
           | They ruined their reputation for being relatively secure
           | devices. I'm done with them, for good. Anybody who cares
           | about security has likely jumped ship now too. Normies will
           | eventually ditch them when they realize most tech people have
           | bailed. Unfortunately the best alternative, google, is even
           | worse. Hopefully the loss of demand causes a decrease in
           | device sales across the board. Landlines still work.
           | Smartphones and their consequences have been a disaster for
           | the human race.
        
         | headmelted wrote:
         | Apple didn't blink. They just don't want the bad press hurting
         | iPhone 13 sales.
         | 
         | It'll go live around December when the Christmas sales push for
         | iPhones is winding down and people are distracted with
         | everything else going on.
        
           | haswell wrote:
           | A more charitable take is that both are true. Apple blinked,
           | and one of several large contributing factors (probably a
           | very big one) is the upcoming iPhone launch.
           | 
           | When the tech news cycle is dominated by bad press about a
           | move like this, and every tech nerd community is discussing
           | this topic almost daily, it would be insane for Apple not to
           | take _some_ notice.
           | 
           | I know many are pessimistic about this, but Apple has learned
           | from bad mistakes before. They rarely directly admit they're
           | wrong, but conceding to pressure from the community is not
           | unprecedented: see the return to the "Magic Keyboard" in
           | their laptops.
           | 
           |  _Edit_ : removed the word "never" and replaced it with
           | "rarely directly".
        
             | Bud wrote:
             | Could we please not say silly, easily-disproven things like
             | Apple "never admit[s] they're wrong", when 10 seconds of
             | Googling instantly gives the lie to that assertion?
             | 
             | It might not happen every day, but it happens frequently
             | enough that it's easy to find a lot of examples.
        
               | Bud wrote:
               | I expected the knee-jerk lazy downvotes from the usual
               | suspects, so:
               | 
               | Apple admits mistake, says it's back in EPEAT
               | https://channeldailynews.com/news/apple-admits-mistake-
               | says-...
               | 
               | Apple Admits the Mac Pro was a mess
               | https://www.theverge.com/2017/4/4/15175994/apple-mac-pro-
               | fai...
               | 
               | Apple Admits iPhone 7 Manufacturing Fault
               | https://www.theguardian.com/money/2019/feb/11/apple-
               | iphone7-...
               | 
               | I could go on for pages, but I trust the point is made.
        
               | dhosek wrote:
               | Or the changes to the new version of Safari in response
               | to user feedback.
        
               | FabHK wrote:
               | But that was GP's point: They listen to feedback and
               | improve things, but typically quietly, rather than
               | admitting a mistake. Exceptions do prove the rule.
        
               | dwaite wrote:
               | What was the mistake for this example though that they
               | would admit to? Iterating aggressive UX changes in a
               | beta?
        
               | haswell wrote:
               | Ok, Apple _rarely_ admit they are wrong, and when they
               | do, they tend to spin it heavily. I don't think this
               | meaningfully changes the primary message of my comment
               | though.
        
               | ksec wrote:
                | Every single one of those came only at the point where
                | Apple could no longer withstand the bad press or the
                | damage to sales figures, and then they admitted they
                | were wrong. Or more like they didn't - they just
                | accepted defeat.
                | 
                | The Mac Pro was a mess until one of their largest
                | customers told them, right to Eddy Cue's face, that they
                | would switch their whole studio away from Apple; only
                | then was something done.
        
               | karmakaze wrote:
               | I wondered if Apple would ever say that you want more
               | than one mouse button. They started having an option to
               | use each side of the mouse as a separate button (even if
               | it looked like a single one and defaulted that way), but
               | I've never heard that two buttons are better than one
               | from Apple. With Steve gone, who even cares now?
        
               | ksec wrote:
                | When Apple was designing the Mac as a personal computer
                | for everyone, a single-button mouse was truly a
                | godzillion times better than two buttons. There used to
                | be an Apple guideline on how to move a mouse: not to
                | _lift_ the mouse up into the air, but to map the
                | horizontal plane of the mouse to the vertical plane of
                | the screen. Even that was difficult for some people.
                | 
                | People just don't realise how much trouble normal people
                | have with a mouse. Of course, as we progress, I think a
                | two-button mouse could make sense as the default. The
                | role of the PC has also changed. The PC for everyone is
                | now a smartphone.
        
               | the_other wrote:
               | One button is better than two, for people who don't know
               | how to use two. I imagine the number of such users is
               | higher than most of us here assume.
        
               | unityByFreedom wrote:
               | > One button is better than two, for people who don't
               | know how to use two.
               | 
               | It's like the three seashells. They don't know how to use
               | them.
        
               | karmakaze wrote:
               | In the early days of Macintosh they tried hard to make
               | this true. The excessive and inconsistent use of click vs
               | double-click in apps even then was confusing.
               | 
                | These days it's hard to find an app that can get by
                | without Cmd+click, which is harder than clicking a right
                | button--which would be even easier if the right button
                | were physically distinguishable. Long-press is super
                | annoying, as is force click--I never want the action
                | that comes up when I accidentally force-click. With the
                | prevalence of touch phones, the two-finger tap might be
                | the easiest of them to remember (if not as precise).
        
               | setpatchaddress wrote:
               | The primary input device to Apple hardware is the
               | trackpad, so it's kind of irrelevant. But two button mice
               | have been well-supported since classic Mac OS circa
               | 1997-1998.
        
             | FabHK wrote:
             | > A more charitable take
             | 
             | No, no, GP clearly has direct insight into the values,
             | ulterior motives, and decision process of Tim Cook and
             | other top brass at Apple.
        
           | noptd wrote:
           | Exactly, it's (sadly) a temporary victory at best.
           | 
           | Ten bucks says they take a page out of our politicians'
           | handbooks and sneak it in as a small, vague footnote in a
           | much larger, unrelated announcement once the initial bad
           | press blows over.
        
             | darwingr wrote:
             | Honestly, that's what I assumed they did in MacOS 11.5.2
             | where they refused to give details of the change. It's
             | likely the last update for many intel devices.
        
           | godelski wrote:
           | Your description sounds like a blink to me.
        
           | caeril wrote:
           | This is the most likely outcome. Moreover, they won't even
           | announce it.
           | 
           | Apple learned a valuable lesson here. Roll the panopticon out
           | in secret, and don't announce it. They've done a very good
           | job locking down their modern devices, enough that security
           | researchers would have an exceptionally difficult time
           | proving that they're doing it anyway.
           | 
           | GrapheneOS is still a viable option until Google ends
           | upstream security updates in 2023. That's a solid two years
           | for Purism, PinePhone, and anyone else working on linux
           | phones to bring their performance and feature set up to
           | modern standards.
           | 
           | The correct course of action is to buy a Pixel 5, run
           | GrapheneOS for the next couple years, while donating to the
           | main privacy-focused linux phone projects, and make the
           | switch again in 2023.
        
             | farmerstan wrote:
             | > Moreover, they won't even announce it.
             | 
             | That's a terrible idea. Apple knows that researchers are
             | going through every bit of assembly code on the phone.
             | 
             | If all of a sudden they say "Hey we found some code that
             | scans all your photos and sends information up to the
             | cloud" how bad would that look? It's better to explain
             | upfront rather than get found, because they will get found.
        
               | caeril wrote:
               | Is this still possible? My lay understanding is that the
               | SEP decrypts kernelspace on-demand, loads the opcodes on
               | a very temporary (and randomly allocated) basis, and
               | makes dumping any meaningful amount of the contents out
               | of RAM very difficult.
               | 
               | Not a jailbreaker at all, so happy to be completely wrong
               | on this.
        
               | dwaite wrote:
               | The default for images is signed, not encrypted. A new
               | section of self-modifying or encrypted kernel code would
               | probably trigger some red flags.
        
             | Hackbraten wrote:
             | > The correct course of action is to buy a Pixel 5, run
             | GrapheneOS for the next couple years, while donating to the
             | main privacy-focused linux phone projects, and make the
             | switch again in 2023.
             | 
             | Alternatively, get a Linux phone right now and donate time
             | by contributing bugfixes.
        
           | thehappypm wrote:
           | The bad press led to them blinking.
        
         | gojomo wrote:
         | A "blink", or the iterative strategy of a skilled
         | manipulator/abuser/groomer?
         | 
         | Push to the level of their target's discomfort, then back off -
         | without respecting the "no!", pretending to hear only "not
         | now".
         | 
         | Come back later from a superficially-different angle, when the
         | victim's defenses are down.
         | 
         | Couch the next attempt, or the next, or the next, in some
         | combination of feigned sweetness ("but I waited & changed
         | things for you"), or esteem-attacks ("you're so nasty you
         | deserve this"), or inevitability ("this is going to happen
         | whether you like it or not, so stop fighting").
        
           | float4 wrote:
           | Yeah, this has been done many times before by other tech
           | companies. Facebook being the prime example.
        
           | orangepurple wrote:
           | It's also known as shifting the Overton window.
           | 
           | It's a way to inch toward the unthinkable by obtaining
           | concessions. You propose the unthinkable and after extreme
           | push-back, the concession seems completely reasonable.
           | However, if the concession would have been proposed alone
           | instead of the unthinkable, it would have been rejected. It's
           | an absolute deceptive manipulation technique.
           | 
           | Another example of manipulative Overton window shift at play:
           | You may also recall deep state puppet Kathy Griffin showing
           | Trump's severed head as the deep state testing the waters of
           | a possible coup or assassination. _The more you expose the
           | public to the unthinkable, the more it becomes acceptable._
           | 
           | https://en.wikipedia.org/wiki/Overton_window
           | 
           | See also the Door-in-the-face technique and Foot-in-the-door
           | technique
           | 
           | https://en.wikipedia.org/wiki/Door-in-the-face_technique
           | https://en.wikipedia.org/wiki/Foot-in-the-door_technique
           | 
           | The door-in-the-face (DITF) technique is a compliance method
           | commonly studied in social psychology.[1][2] The persuader
           | attempts to convince the respondent to comply by making a
           | large request that the respondent will most likely turn down,
           | much like a metaphorical slamming of a door in the
           | persuader's face. The respondent is then more likely to agree
           | to a second, more reasonable request, than if that same
           | request is made in isolation.[1][2] The DITF technique can be
           | contrasted with the foot-in-the-door (FITD) technique, in
           | which a persuader begins with a small request and gradually
           | increases the demands of each request.[2][3] Both the FITD
           | and DITF techniques increase the likelihood a respondent will
           | agree to the second request.[2][3]
           | 
           | Foot-in-the-door (FITD) technique is a compliance tactic that
           | aims at getting a person to agree to a large request by
           | having them agree to a modest request first.[1][2][3]
           | 
           | This technique works by creating a connection between the
           | person asking for a request and the person that is being
           | asked. If a smaller request is granted, then the person who
           | is agreeing feels like they are obligated to keep agreeing to
           | larger requests to stay consistent with the original decision
           | of agreeing. This technique is used in many ways and is a
           | well-researched tactic for getting people to comply with
           | requests. The saying is a reference to a door to door
           | salesman who keeps the door from shutting with his foot,
           | giving the customer no choice but to listen to the sales
           | pitch.
        
           | new_realist wrote:
           | Wow, this is unhinged.
        
             | scubbo wrote:
             | Yes, it is terrifying that companies behave this way.
        
               | orangepurple wrote:
                | I suppose it is no surprise we hear about so many
                | sexual-misconduct complaints lodged against blue-chip
                | tech execs. They are experts in grooming.
        
           | delaaxe wrote:
           | Don't forget public safety matters ("but COVID!")
        
             | gojomo wrote:
             | "We're sorry to inform you that Apple iCompliance AI Eyes,
             | as of last night's updated version 16.4.2, has identified
             | the following violations of laws & emergency edicts in your
             | recent local photo roll:
             | 
             | * Unmasked proximity with non-household member
             | 
             | * Unlicensed gathering of more than 6 people
             | 
             | * Association with unidentifiable individuals (no
             | faceprints on record)
             | 
             | These are your 2nd, 3rd, and 4th strikes so your phone has
             | been disabled and you are now under arrest. Do not leave
             | your immediate vicinity. As always, keep your phone on,
             | charged, and in your personal possession at _all_ times.
             | 
             | Due to large violation volumes, we are currently
             | experiencing long physical arrest team hold times. Your
             | arresting agents should arrive at your location in
             | approximately... 5... hours.
             | 
             | Thank you for using Apple products, the leaders in Social
             | Safety. 'Better Safe - Or You'll Be Sorry!'(tm)"
        
           | beckman466 wrote:
           | > A "blink", or the iterative strategy of a skilled
           | manipulator/abuser/groomer?
           | 
           | Exactly. Note this description of the cycle of abuse:
           | 
           |  _" [The abuse] cycle involves four stages:
           | 
           | 1) building tension
           | 
           | 2) an incident of abuse
           | 
           | 3) reconciliation
           | 
           | 4) calm
           | 
           | Rinse and repeat."_
           | 
           | Source:
           | https://www.healthline.com/health/relationships/cycle-of-
           | abu...
        
         | sneak wrote:
         | The fact that they are going to sneak this backdoor in the back
         | door later against the wishes of their customers is sufficient
         | for me to decide that Apple has abandoned any values they may
         | have once had.
         | 
         | I won't be purchasing any more iPhones, and I've had every one.
        
         | 5faulker wrote:
         | Hmm... this is getting more serious with potential
         | ramifications many years down the line.
        
         | Bud wrote:
         | Apple doesn't "sneak in" stuff like this. That's complete
         | bullshit. Could we at least discuss in good faith?
        
           | stjohnswarts wrote:
            | They were trying to put out a small press release about it
            | and be done with it, hoping no one would notice. Instead
            | everyone got mad and called them out on it. That is the
            | sense in which they attempted to sneak it in. Not everything
            | is black and white, and the terminology is sufficient to
            | describe the situation.
        
             | azinman2 wrote:
             | A press release is the exact opposite of sneaking something
             | in.
        
             | djrogers wrote:
             | Wait a sec - this wasn't a 'small press release', it was a
             | huge one, made several months in advance of the feature
             | launching, accompanied by a bunch of interviews with execs.
             | 
             | If you look at the tone of all of it, they also honestly
             | felt this was the most privacy preserving way to do CSAM
             | scanning - likely so they can enable E2EE on everything
             | iCloud.
             | 
             | What they got very very wrong was the public reaction to
             | it.
        
               | Zak wrote:
               | What they got very very wrong is the idea that spyware
               | running on the user's device is the most privacy
               | preserving way to scan for illegal content hosted on
               | their servers.
               | 
               | Apple has not announced E2EE for files stored on iCloud.
               | If that _is_ their intent, they likely would have had a
               | somewhat improved public response by announcing it at the
               | same time as the on-device spyware.
        
               | sneak wrote:
               | > _likely so they can enable E2EE on everything iCloud._
               | 
               | Refers to facts not in evidence.
               | 
               | Additionally, they have previously voluntarily declined
               | to enable e2e on systems to aid government surveillance.
               | 
               | This is baseless speculation, and, given what we know
               | about Apple's history with e2e, not only baseless but
               | actually unlikely.
        
               | TimTheTinker wrote:
               | > What they got very very wrong was the public reaction
               | to it.
               | 
                | What they got very wrong is that this fundamentally
                | changes the nature of iPhones -- they are no longer _user
                | agents_; in fact, they're actively scanning user data on
                | behalf of another party. That change doesn't just make
                | people _feel_ uncomfortable, it opens the door for untold
                | myriads of abuses.
                | 
                | It's one thing if iCloud servers scan user data - those
                | servers never were assumed to be _user agents_. It's
                | entirely different when user-owned devices do this.
        
           | npteljes wrote:
           | I think OP referred to the finding that Apple already rolled
           | out parts of the detection thing
           | 
           | https://www.reddit.com/r/MachineLearning/comments/p6hsoh/p_a.
           | ..
        
             | mlindner wrote:
             | Yeah but no one's found any evidence that any code calls
             | it. (At least not that I've seen.)
        
               | arminiusreturns wrote:
               | Black box closed source systems sorta make verification
               | of what the device is doing rather difficult.
        
               | djrogers wrote:
                | Not really - there are a ton of ways to profile and
                | monitor a Mac to determine what processes are calling
                | into what, and spotting something activating this code
                | would be trivial if it were happening.
        
               | cwkoss wrote:
               | It may be trivial to look under a given rock, but when
               | there are hundreds of thousands of rocks, the likelihood
               | of someone noticing something left under a rock is low.
        
               | arminiusreturns wrote:
               | You say that, but there are also tons of ways to hide
               | such things (there is an entire field dedicated to side-
               | channel leakage). The point is you don't know unless you
               | have the source code (and can compile it yourself, I'm
               | ignoring the trust of compilers at the moment). You can
               | try to reverse engineer and analyse all you want, but it
               | doesn't mean you know what the system is really doing...
        
               | Piskvorrr wrote:
               | "Yeah, we have this really suspicious code here, sure,
               | but we have not started calling it. Pinky swear."
               | 
               | The issue here is not the trustworthiness of the pinky
               | swear...
        
               | mlindner wrote:
               | I'm not saying it's a good situation, but it's not
               | terrible yet.
        
               | Piskvorrr wrote:
               | I am reminded of the joke "an optimist falls from a
               | skyscraper. At second floor, he thinks: 'so far, so
               | good.'"
               | 
               | In other words, a situation that seems to be a feature
               | toggle away from becoming terrible is terrible in itself.
        
             | spywaregorilla wrote:
             | Isn't this the bad part? Like... it being there is the
             | problem, not apple specifically using it for child porn
             | detection. Right?
        
               | macksd wrote:
               | Correct. If we could magically end the rape of children
               | and stop the propagation of images of such, I think we'd
               | all be on board. But when a system would realistically
               | just push most of it elsewhere and violate a bunch of
               | innocent people's privacy, people get nervous.
               | 
               | It's like being asked to have an invasive, detailed scan
               | of your body shown to TSA personnel (and recorded) when
               | you fly. The motive is apparently to prevent violence.
               | But there were still incidents on flights when that was
               | being done. It wouldn't have caught the box cutters on
               | 9/11. The stated motive ceases to be part of the equation
               | for me.
        
               | azinman2 wrote:
               | Pushing it elsewhere is still a win. You don't want this
               | to have easy cover. Keep in mind FB reported close to 17M
               | pieces of CSAM last year. So if having the ability to
               | detect it was such a deterrent, why are their numbers so
               | high?
        
               | sbuk wrote:
                | Because they don't tell anyone what they are scanning?
        
               | azinman2 wrote:
               | They literally publicize these numbers. It's not like I
               | somehow know this FB secret. Every cloud provider scans,
               | except Apple.
        
               | sbuk wrote:
               | Apple currently do scan for CSAM in their cloud.
        
               | azinman2 wrote:
               | Their reported numbers were like 350 last year, compared
               | to almost 17M for Facebook. You could theorize Apple has
               | little CP, except in the Epic lawsuit we saw exec emails
               | released that said the exact opposite. So clearly not
               | much scanning has been going on.
        
               | dwaite wrote:
               | For email only
        
               | mlindner wrote:
                | I think it's actually impossible to use an AI to
                | determine this type of thing. Even human courts sometimes
                | get it wrong with images that are open to debate (for
                | example, missing context) on whether they were abusive or
                | not (parents bathing their kids, as one example). Then
                | you have the whole can of worms that pedophiles will find
                | sexual interest in images taken in non-abusive contexts
                | but will use them nonetheless. How is an AI supposed to
                | determine that? Even our laws can't handle that.
                | 
                | There are also images that can be abusive if made public
                | but non-abusive if kept private. For example, what about
                | a kid taking nude pictures of themselves (many kids don't
                | have parentally moderated accounts) and putting them on
                | iCloud (iPhone does this automatically with default
                | settings)?
                | 
                | It's a complete nuthouse and there's no way to do it.
                | 
                | (Also don't get me started on drawn artwork (or 3D CG)
                | that people can create that can show all sorts of things
                | and is fully or partially legal in many countries.)
        
               | Grustaf wrote:
               | Since the threshold is 30 images, and those images are
               | matched to a curated database of abusive material, what
               | you write here simply doesn't apply.
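                | 
                | Mechanically, the flagging logic is just a thresholded
                | set-membership count. A minimal sketch in Python -- the
                | names and values are stand-ins, and the real system uses
                | a perceptual hash plus cryptographic blinding rather
                | than a plain lookup:
                | 
                |     import hashlib
                | 
                |     KNOWN_BAD = {"hash-of-known-image"}  # placeholder
                |     THRESHOLD = 30  # Apple's announced figure
                | 
                |     def image_hash(data: bytes) -> str:
                |         # stand-in for a perceptual hash function
                |         return hashlib.sha256(data).hexdigest()
                | 
                |     def should_flag(photos: list[bytes]) -> bool:
                |         hits = sum(1 for p in photos
                |                    if image_hash(p) in KNOWN_BAD)
                |         # nothing is reported below the threshold
                |         return hits >= THRESHOLD
                | 
                | Under that scheme a novel photo, however explicit, never
                | matches; only copies of already-catalogued images count
                | toward the threshold.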
        
               | scoopertrooper wrote:
                | They're not using an AI for that purpose. They're just
                | computing a hash of each image and sending it along with
                | iCloud uploads. The hashes are of images that have been
                | identified by humans as containing child pornography.
        
               | godelski wrote:
                | They're not hashes in the way md5sum or sha256 is a hash.
                | They are _neural_ hashes, which don't try to fingerprint
                | a specific image but rather tag any image deemed
                | sufficiently similar to the reference image.
                | Cryptographic hashes are effectively unique; neural
                | hashes aren't.
               | 
               | To add, there are natural neural hash collisions in
               | ImageNet (a famous ML dataset). Images that look nothing
               | alike to us humans.
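                | 
                | To make the contrast concrete, here's a toy sketch in
                | Python: a cryptographic hash changes completely when one
                | byte changes, while a crude perceptual hash (a simple
                | average hash here -- NeuralHash is a learned embedding
                | and far more involved) stays stable across
                | near-identical images. The 8x8 size is an illustrative
                | choice, not Apple's parameter:
                | 
                |     import hashlib
                |     from PIL import Image  # pip install Pillow
                | 
                |     def crypto_hash(path: str) -> str:
                |         # flip one byte of the file and every bit of
                |         # this digest changes
                |         with open(path, "rb") as f:
                |             return hashlib.sha256(f.read()).hexdigest()
                | 
                |     def toy_perceptual_hash(path: str) -> str:
                |         # downscale, grayscale, threshold on the mean:
                |         # visually similar images tend to produce the
                |         # same (or a nearby) 64-bit signature
                |         img = Image.open(path).convert("L")
                |         pixels = list(img.resize((8, 8)).getdata())
                |         avg = sum(pixels) / len(pixels)
                |         bits = "".join("1" if p > avg else "0"
                |                        for p in pixels)
                |         return f"{int(bits, 2):016x}"
                | 
                | Two re-encodings of the same photo get unrelated SHA-256
                | digests but will usually share a perceptual hash -- and,
                | as noted above, unrelated images occasionally collide.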
        
               | eurasiantiger wrote:
               | No, the hashes are visual hashes. It is completely
               | possible that a clean shaven adult taking pictures of
               | themselves could get flagged.
        
               | mrtranscendence wrote:
               | It's also possible that a picture of an avocado will get
               | flagged. A nude photo of someone who's shaved themselves
               | is unlikely to look so much like an actual CSAM photo
                | that it'll get flagged. It's certainly _possible_, but
               | again, so's the avocado.
        
               | vmladenov wrote:
               | Apple already does virus scanning on device and has for a
               | decade. I fail to see how you trust one proprietary
               | binary file scanner to stay in its lane and not spy on
               | you, but fear a different one with a different stated
               | purpose.
        
           | behnamoh wrote:
           | > Could we at least discuss in good faith?
           | 
           | I'm afraid good faith is not gonna protect your privacy.
        
           | post_break wrote:
            | Well, I don't mean doing it without telling everyone. I mean
            | waiting until it dies down, then just noting "oh, btw, 15.X
            | includes CSAM scanning" as a changelog bullet.
        
             | haswell wrote:
             | Given the uproar on this change, and the fact that the
             | uproar was big enough to cause this delay, can you imagine
             | the bad press if they come back later and roll this out
             | quietly?
             | 
              | _Editing to add_: I think this will play out more like
             | the butterfly keyboard debacle. They won't ever really
             | acknowledge how bad the original thing was, but they'll
             | return to a solution the community finds palatable (server
             | side scanning), and wrap it up in a bow like they did with
             | the "magic keyboard, now on our laptops".
        
               | vxNsr wrote:
               | They just need to time it right.
        
           | [deleted]
        
           | api wrote:
           | Google or (worse) Facebook would have just put this in with
           | no press release.
           | 
           | My read is that Apple is trying to placate governments by
           | throwing them a bone without totally abandoning their privacy
           | stance, hence the drive to put the scanning on the actual
           | device so all your photos don't have to be shared as-is with
           | Apple. That way they can encrypt iCloud but still keep the
           | feds from accusing them (in bad faith) of being a child porn
           | trading platform.
           | 
           | The alternative is to keep iCloud unencrypted and scan in the
           | cloud, which is what they and everyone else already does.
           | 
           | One of the pieces of feedback they got though is that people
           | are more okay with that than with scanning on the device.
           | People expect that things that go to the cloud are no longer
           | private unless you run some kind of local encryption that you
           | control.
           | 
           | The reason they don't care about photos not sent to iCloud is
            | the _trading platform_ angle. It's one thing to let people
            | store shit on their own devices. It's another thing to be a
           | large scale and _easy to use_ secure channel that people can
           | use to share CP. The nightmare scenario for Apple would be to
           | become a de-facto standard in the CP underworld for sharing
           | content.
           | 
           | Sure people can use Tor, private VPNs and mesh networks, PGP,
           | or many other things, but that requires a level of technical
           | knowledge that greatly reduces the size of the audience.
           | 
           | Maybe a better alternative for Apple would be to add iCloud
           | encryption but to stipulate that any _sharing_ of content
           | disables encryption for the shared content. That way they
           | could scan shared content only. For unshared content iCloud
           | is basically just an encrypted backup.
        
             | nick__m wrote:
              | If it was the trading platform angle, could they just scan
              | on share, or am I missing something?
        
             | simion314 wrote:
             | >My read is that Apple is trying to placate governments by
             | throwing them a bone
             | 
              | You would only believe this if you read/watch Apple PR and
              | ignore reality. The actual reality is that Apple always
              | collaborates with all governments, and that Apple did not
              | announce any new end-to-end encryption that would use this
              | feature, nor promise such a feature.
        
               | api wrote:
               | Apple isn't altruistic, but I do think they want to have
               | strong privacy. They cooperate with governments to what
               | seems to be the minimum extent required to do business in
               | a given country. They can't change governments.
               | 
               | In China and other more authoritarian nations that means
               | cooperating a lot. If you don't you do no business in
               | China. In the USA they seem to be looking for a new
               | minimum level of cooperation, or trying to enhance
               | privacy a little without moving the needle.
               | 
               | In any case this particular strategy looks like a failure
               | in the market. They might re-engineer it or backpedal or
               | just toss the whole idea of encrypting iCloud.
        
               | simion314 wrote:
               | >Apple isn't altruistic, but I do think they want to have
               | strong privacy.
               | 
                | Maybe, if you define privacy as Google and FB not
                | tracking you while only Apple does. You open an
                | application on your Mac and the event is sent insecurely
                | to Apple; find a definition of privacy that isn't
                | contradicted by what Apple was doing.
        
             | eldaisfish wrote:
              | What Apple is trying to do is minimise damage to their
              | bottom line from bad PR. At the end of the day, more people
              | must realise that corporations are beholden to one thing -
              | money.
              | 
              | Morality barely matters, and legality is something easily
              | skirted around with the right legal argument. Apple, Google
              | etc. will continue working with governments so long as
              | profit can flow from those countries.
        
           | yummypaint wrote:
            | Their practice of surreptitiously pushing updates to older
            | phones that kill battery life would seem to fit that bill.
            | Their updates aren't open either; people just take them at
            | their word. If the law or EULA doesn't outright prohibit them
            | from doing something, it's prudent to assume they already
            | are.
        
         | hutzlibu wrote:
         | "So now the question is in what 15.X update do they just sneak
         | it in."
         | 
          | I think they cannot really sneak it in. Some people _do_ read
          | updated TOS, and I think a "feature change" like this requires
          | one.
          | 
          | And yes, the question will be whether the broad attention to
          | the topic can be brought back to it, if that happens. And they
          | clearly say it will come, only with some "improvements" (but I
          | cannot think of any improvements that can be made without
          | abandoning the basic concept):
         | 
         | "we have decided to take additional time over the coming months
         | to collect input and make improvements before releasing these
         | critically important child safety features"
        
           | Zak wrote:
           | > _I cannot think of any improvements, that can be made,
           | without abondoning the basic concept_
           | 
           | The concept of scanning for CSAM during the upload process,
           | or the concept of making the user's device do it? Apple could
           | do the former on their own servers and essentially nobody
           | would be upset.
        
             | hutzlibu wrote:
              | The basic concept of Apple running private pictures through
              | an AI algorithm to detect something worthy of
              | investigation.
              | 
              | The algorithms never work 100% right. That means there will
              | always be humans in the loop, sorting through false
              | positives etc., which in this context probably means very
              | private pictures.
              | 
              | And to the end user it does not matter whether this happens
              | on the server or on their device. The message is: your
              | pictures are not private with Apple.
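              | 
              | A back-of-the-envelope sketch of why that human review
              | queue is unavoidable -- every number below is made up for
              | illustration and is not an Apple figure:
              | 
              |     # hypothetical scale and per-photo error rate
              |     photos_per_user = 5_000
              |     users = 1_000_000_000
              |     fp_rate = 1e-9  # one in a billion, per photo
              | 
              |     innocent_hits = photos_per_user * users * fp_rate
              |     print(innocent_hits)  # 5000.0
              | 
              | Even a one-in-a-billion per-photo false positive rate
              | puts thousands of innocent pictures in front of human
              | reviewers at that scale, which is exactly why thresholds
              | and manual review get bolted on.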
        
               | Zak wrote:
               | It matters very much with regard to the distinction
               | between _yours_ and _theirs_. I strongly believe the
               | operating system of your device should never actively
                | work against your interests, and checking whether you're
               | using the device to commit a crime, then reporting it to
               | the device manufacturer who will tell the government is
               | potentially strongly against your interests.
               | 
               | If that's too abstract, it matters in that it's a
               | foothold for further scanning of data on your device.
               | Without this system, if a state wanted to pressure Apple
               | to create spyware and add it to their operating system to
               | look for something else, that would have been easy to
               | refuse. With this system, the spyware is already there,
               | and despite Apple's protests to the contrary, it's just a
               | matter of a few tweaks to repurpose it.
        
         | yyyk wrote:
         | >Wow, we made Apple blink.
         | 
          | Did we? I suspect the real issue is that the original client-
          | side proposal had a lot of holes. What if the bad guys upload
          | CSAM which hasn't been tagged by NCMEC yet? What if they then
          | delete it from the device (but keep it on iCloud)? Or what if
          | they zip the CSAM images and back that up?
         | 
         | In order to be even semi-effective, the client-side scanning
         | has to be _more_ invasive, or they have to implement server-
         | side scanning too. Apple may well be looking whether they can
         | implement this scanning without even more backlash.
        
           | stjohnswarts wrote:
            | I don't think you're right. Apple didn't care about CSAM
            | policing; it's just that they weren't opposed to it and were
            | throwing governments a little of what they wanted to make
            | them shut up. They thought this through very thoroughly. I
            | read the whitepapers they put out on it. Clearly this wasn't
            | done overnight by a confederacy of dunces. I think they
            | thought they were picking a middle ground on surveillance and
            | privacy (clearly they weren't, as they broke the fourth wall
            | of "this is my damn iPhone you're scanning on!"). Anyway,
            | they didn't want to be the police, and they aren't going to
            | be doing more to police your device, as you are pointing out.
            | They never cared if they caught anyone; they were just trying
            | to make the US government a bit happier.
        
             | yyyk wrote:
              | Apple may have thought this through technically, but
              | politically I find it very unlikely that the proposed
              | method amounts to sufficiently effective scanning.
             | 
             | Sooner or later government and NCMEC will push them to
             | complete the feature (understandably so from their POV),
             | and when they do, Apple will have to expand scanning. Apple
             | may have already been pressured to do so.
             | 
             | Sometimes choosing the middle ground is like choosing to
             | drive on the white line in the middle of the road as a
             | 'compromise' between the lanes.
        
         | albertop wrote:
          | Are you sure they did? Or is it just a dodge to avoid bad
          | press at the time they're getting ready to introduce new
          | devices?
        
         | 1vuio0pswjnm7 wrote:
         | "So now the question is in what 15.x update do they just sneak
         | it in."
         | 
          | It would be nice if the general public, and therefore the
          | news, now picked up on the issue of opaque "automatic
          | updates". To date, they have been trained to always accept
          | every update without question.
          | 
          | Conventional wisdom: Never, ever question any particular
          | automatic update. Don't think, just enable. All updates are
          | created equal.
          | 
          | Assumptions: Every update is for users' benefit. Software
          | companies never operate out of self-interest. There are no
          | trade-offs. What's good for them is good for users and vice
          | versa. There's no reason for end users to trial updated
          | versions of software ("A/B test") before deciding to use them
          | "in production".
          | 
          | Thought experiment: User installs Software from Company to
          | perform "Function #1". Company makes changes to Software,
          | described as "Fixes and other changes." Software now performs
          | Function #1 plus Function #2. User allows new software to be
          | installed automatically. (Automatic updates enabled per
          | "expert" advice.) When User downloaded and installed Software,
          | she based her decision to use Software on its ability to
          | perform Function #1; however, did she have any interest in
          | Function #2? Did she agree to Function #2? Company says yes.
          | Developers say yes. What does User say? The license agreement
          | places no limits on what might comprise Function #2. It could
          | be anything. It could have no benefit whatsoever for User.
          | Moreover, Company is under no obligation to disclose Function
          | #2 to User. With automatic updates, Company is free to keep
          | changing Software. User originally chose Software for Function
          | #1, but Software may look very different after many "automatic
          | updates". Would she choose it again in its current state?
          | Segue to discussion of "lock-in".
        
         | ksec wrote:
          | I posted on HN [1] that I really wished they had stood their
          | ground; it would make decisions about Apple a lot easier. Now
          | they won't. Half of the people will forgive them; the rest will
          | most likely forget about it in a few years' time.
         | 
         | But to me, it did at least look like this scared the shit out
         | of Apple PR.
         | 
         | [1] https://news.ycombinator.com/item?id=28221869
        
         | [deleted]
        
         | teitoklien wrote:
         | > People like me who sold their IPhone are a drop in the bucket
         | 
          | I doubt it; they will bring this up again very soon, and if
          | they cancel it, they'll get very bad press too.
          | 
          | And second, a company that once shows its intention to breach
          | your personal privacy while disregarding its users' best
          | interests altogether won't hesitate to do it again.
        
           | judge2020 wrote:
           | Apple could lose a million customers over this, but they
           | still sold >200m a year in 2018 and beyond.
           | 
           | https://www.statista.com/statistics/276306/global-apple-
           | ipho...
        
             | darksaints wrote:
              | The developer mindshare really matters to Apple. As long as
              | they are on Apple devices, some significant fraction of
              | them will be creating software for Apple devices. Without
              | 3rd-party software, Apple is toast.
        
               | poetaster wrote:
                | Hmm, having experienced the influx of 1.5 million people
                | into an area smaller than Alberta, and the social
                | consequences in the last 5 years, I'd say you are being
                | very glib. The shit show is just starting.
        
               | adventured wrote:
                | > The developer mindshare really matters to Apple.
               | 
               | The developers are going nowhere so long as the money is
               | there to be made from iOS.
               | 
               | Did they all quit when it was revealed that Apple was
               | part of PRISM? Of course not. They barely blinked.
               | 
               | It doesn't matter if a billion developers sign a
               | petition. It's empty. The vast majority of them will
               | promptly capitulate if Apple goes forward. By vast
               | majority I mean 99%.
               | 
               | Did you see all the people desperately fleeing from
               | Australia due to the rise of the extremist police state
               | there? Nope. Did you see the tens of millions of people
               | flee from the US when the Patriot Act was passed or PRISM
               | was revealed? Nope. Did you see everyone rapidly flood
               | over to mainland Europe as Britain became a police state?
               | Nope. How about the hundreds of millions of people
               | fleeing out of China over the past decade with the rise
               | of Xi and the entire demolition of all human rights in
               | China? Nope. Perhaps you're spotting a predictable human
               | nature trend in all of this.
               | 
               | All the tech employees of Google, Facebook, Amazon,
               | Microsoft, Apple, etc. They've all quit in droves over
               | the past decade over human rights abuses - including vast
               | privacy abuses - and concerns for what these companies
               | are doing? Whoops, no, the exact opposite. There was a
               | price for their integrity as it turns out, and they
               | grabbed the money, which is what most people do.
               | 
               | The iOS developers are going nowhere, either. Besides the
               | fact that their livelihoods depend on it, there's no vast
               | green field to run to from what's happening. Sure, some
               | might change jobs, or switch industry segments, it will
               | be a small group though. It's going on in just about
               | every liberal democracy simultaneously. What huge
               | platform are developers going to flee to that's better
               | and is never going to feature forced on-device scanning
               | spyware? It's coming for Android too, bet on it.
        
               | cmsj wrote:
               | Not sure why you're getting downvotes.
               | 
               | iOS is where all the profitable users are. While that
               | holds true, developers will be there to sell things to
               | the profitable users.
               | 
               | The principled indie developers might leave in protest,
               | and that would be a terrible loss for the platform's
               | soul, but realistically the vast bulk of money flowing
               | through the App Store is not going to the indie
               | developers.
        
               | defaultname wrote:
               | Do you believe that most developers are up in arms about
               | this, ready to abandon the platform? I doubt more than a
               | tiny fraction really care about this at all, and the
               | hysterics on HN (threads brigaded by a small percentage
               | of HN users) are not representative. Predictions of some
               | massive sales drop -- or even a sales drop _at all_ --
               | are deluded nonsense.
               | 
               | Apple is delaying this, I suspect, because it confuses
               | the privacy message: If every time they talk about
               | privacy someone can make them look like hypocrites by
               | pointing out the on-device scanning, well that's just
               | ugly for Apple. I suspect the on-device scanning element
               | is going to be removed and it'll be on ingress to iCloud,
               | which is what I said on my very first post to this. It
               | was ill-considered to do it on device given Apple's
               | privacy push.
        
               | FabHK wrote:
               | > Predictions of some massive sales drop -- or even a
               | sales drop at all -- are deluded nonsense.
               | 
                | I remember, when that recalcitrant dentist was dragged
                | off a United plane, endless predictions of United's
                | imminent demise: people would never fly United again,
                | yada yada yada. Needless to say, nothing much happened.
               | 
               | > I suspect the on-device scanning element is going to be
               | removed and it'll be on ingress to iCloud, which is what
               | I said on my very first post to this. It was ill-
               | considered to do it on device given Apple's privacy push.
               | 
               | Not sure about that. Apple's approach (private set
               | intersection run on half client, half server) preserves
               | more privacy than scanning it in the cloud, and leaves
               | the door open for E2EE.
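                | 
                | For intuition, here's a toy Diffie-Hellman-style PSI
                | sketch in Python. It only shows the core trick --
                | neither side reveals its raw set, yet matches can be
                | detected. Apple's actual protocol is a threshold
                | variant and far more elaborate; the prime and the set
                | contents here are illustrative choices:
                | 
                |     import hashlib
                |     import secrets
                | 
                |     P = 2**255 - 19  # a large prime
                | 
                |     def h(x: str) -> int:
                |         # hash an item into the multiplicative group
                |         d = hashlib.sha256(x.encode()).digest()
                |         return int.from_bytes(d, "big") % P
                | 
                |     a = secrets.randbelow(P - 2) + 1  # client key
                |     b = secrets.randbelow(P - 2) + 1  # server key
                | 
                |     client = {"img1", "img2"}
                |     server = {"img2", "img3"}
                | 
                |     # each side blinds its hashes with its own key...
                |     c1 = {pow(h(x), a, P) for x in client}
                |     s1 = {pow(h(x), b, P) for x in server}
                | 
                |     # ...then the other side blinds them again; since
                |     # (h^a)^b == (h^b)^a, only true matches collide
                |     c2 = {pow(v, b, P) for v in c1}
                |     s2 = {pow(v, a, P) for v in s1}
                |     print(len(c2 & s2))  # 1 -> one shared item
                | 
                | The half-client, half-server split in Apple's design
                | concerns who learns the match count; the blinding idea
                | is the same.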
        
               | darksaints wrote:
               | Most? Probably not. But developers are more sensitive to
               | privacy issues in general, because they work in the most
               | privacy invasive industry that ever existed. They know
               | how much corporations dig into your stuff, and they might
               | have even implemented some of it. At least speaking
               | personally, I know my purchasing behavior changed on
               | Amazon when I worked for them and realized how much they
               | tracked me.
               | 
               | More importantly, there is an oligopoly in tech and up
               | until now, only one of the manufacturers of tech devices
               | cared about privacy. And to that extent, I wouldn't be
                | surprised if the vast majority of developers who care
                | about privacy have biased themselves towards Apple. And
                | now they have a reason to treat Apple like every other
                | privacy-invading company.
        
               | [deleted]
        
               | wallacoloo wrote:
               | > And now they have a reason to treat apple like every
               | other privacy invading company.
               | 
               | That's right. I like the idea that on android, I can use
               | stock tools like rsync to manage the photos/music on my
               | phone. With Apple it's been a constant battle against
               | iTunes to do those basic things. The only reason I chose
               | Apple last time I bought a phone is because I'm willing
               | to sacrifice a bit of convenience for the perception of
               | better security/privacy. If they lose on that front, then
               | I'll hop to whichever phone is easiest for me to use the
               | way I want to.
        
               | digisign wrote:
               | What does Amzn do besides tracking your purchases?
        
             | spacebear wrote:
             | If the million customers they lose also happen to be people
             | who build apps and influence the tech buying decisions of
             | everyone they know, that could be a very big problem over
             | time.
        
           | Lamad123 wrote:
           | They've already lost my trust!!
        
         | jimbob45 wrote:
         | >People like me who sold their iPhone and cancelled all
         | services probably are a drop in the bucket
         | 
         | Yeah, very few have the financial independence to be able to do
         | something like that. However, over the next few years when
         | everyone gets their free upgrades, I imagine we'll see some
         | slower, yet more robust, switching.
        
           | rconti wrote:
           | Financial independence to... spend less money?
        
             | post_break wrote:
             | Most people finance their phones. To sell them means to pay
             | them off which is extremely hard for most people. I had the
             | luxury of paying off my phone, and buying another while
             | waiting to sell my current phone. In that timeline it was
             | about $2,000 in cashflow. So yeah, I'm blessed to be able
             | to do so at the drop of a hat.
        
               | rconti wrote:
               | Gotcha, yeah, I was focusing on the "sell phone / cancel
               | services" side and didn't even think of the prerequisite
               | to be able to sell the phone. (and, i guess, buy a new
               | one before you ship your old one off!)
        
           | Grustaf wrote:
            | Just like people stopped using Google and Facebook because
            | they also scan for CSAM?
        
         | sharken wrote:
          | Generating enough bad press for long enough works wonders for
          | creating change.
          | 
          | The battle is not over, though; as you say, Apple might include
          | the feature unchanged in a later release.
          | 
          | If they do, it should come as no surprise that they will lose a
          | lot of business.
          | 
          | The correct business decision would be to cancel the feature
          | and double down on the privacy angle.
        
         | A4ET8a8uTh0 wrote:
          | Just before everyone starts high-fiving each other: even the
          | release itself brands this as a delay. I accept that it is a
          | positive development, but do not read it as anything but that:
          | a temporary delay to let the furor die.
        
         | brundolf wrote:
         | It was a torpedo to the heart of the privacy marketing campaign
         | they've spent who-knows-how-much-money building up over the
         | past several years. That translates directly to wasted dollars,
         | which they do care about.
        
           | echelon wrote:
           | Well done, screeching voices!
        
           | wintermutestwin wrote:
           | But isn't their privacy marketing: we will protect you from
           | abusive surveillance capitalists, not we will protect you
           | from state level actors?
        
             | xvector wrote:
              | Privacy is privacy; you can't pretend to care about it
              | while at the same time pushing something like this through.
        
               | wintermutestwin wrote:
               | >privacy is privacy
               | 
               | Are you asking Apple to come out and say that they are
               | supporting privacy against state-level actors? After
               | snowyD, why would anyone believe that anyway?
               | 
               | I'm not advocating against or for - I'm only wanting to
               | add more clarity to the discussion because I think it is
               | an important distinction that is often left out.
        
             | lrem wrote:
             | Wasn't there a very public "no, FBI, you cannot get the
             | shooter's phone contents"?
        
         | [deleted]
        
         | istingray wrote:
         | Congrats on selling your iPhone so fast. I managed to purchase
         | a System76 laptop in the meantime, typing from it now.
         | 
         | The cat is out of the bag. You and I have realized Apple can
          | make these types of changes whenever they want.
         | 
         | I'm disappointed to report that Linux needs a lot of
         | improvement to be viable for most people. I'm forging ahead,
         | donating to open source software and hardware projects, filing
         | bug reports, and generally bringing the Mac spirit to Linux.
         | "It just works."
         | 
         | What phone did you switch to?
        
           | mrtranscendence wrote:
           | > I managed to purchase a System76 laptop in the meantime,
           | typing from it now.
           | 
            | If I could just get a decent higher-than-1080p panel from
           | System76, I might consider it for my next laptop. I've been
           | spoiled by Macs. As it is I might go with a Framework laptop
           | instead.
        
             | istingray wrote:
             | Email them and say that! We need to be more vocal as
             | customers so vendors don't waste time.
        
           | TaylorAlexander wrote:
           | > I'm disappointed to report that Linux needs a lot of
           | improvement to be viable for most people. I'm forging ahead,
           | donating to open source software and hardware projects,
           | filing bug reports, and generally bringing the Mac spirit to
           | Linux. "It just works."
           | 
           | IMO it would go a long way if we taught computer literacy and
           | specifically Linux literacy to everyone in school. Microsoft
           | and Google have all the school contracts and they bring
           | people up on systems that "just work" (actually they have
           | paid system administrators). If we taught people from a young
           | age to use Linux and handle problems, then the small problems
           | that often come up with open source need not be barriers to
           | adoption. Even if every regular person can't always solve
           | every linux problem, if they were familiar enough to use it
           | and had at least one expert friend, they'd be fine.
           | 
           | That's not to say we shouldn't strive to make Linux "just
           | work" for most people, but we can attack the problem from
           | other angles as well.
        
             | istingray wrote:
             | I agree more tech literacy is needed. The learning curve
             | for Linux seems so steep. For what it's worth, I took UNIX
             | classes in high school, and know how to get around.
             | 
             | But Linux is just far more than I can handle, even just to
             | install apps and configure settings. For example just
             | trying to get my mouse not to stutter, I dig and find a
             | solution. I copy and paste it into the terminal. It doesn't
             | work and I try 3 different solutions. I'm curious about
             | solutions but it wears my curiosity out pretty fast. I need
             | to get work done.
             | 
             | In the sense that computers are like cars, I'm okay not
             | knowing how exactly it works under the hood.
        
               | TaylorAlexander wrote:
               | Yes this kind of thing is frustrating. And I may be very
               | wrong, but I have been wondering, what if this is what
               | open source is always going to be like. I am 100% pro
               | open source and I think everything (cars, trucks, washing
               | machines, factory machines) should be open source. But I
               | guess if we got to that point, where the people producing
               | the computers and the people writing the code were all on
               | the same side, maybe things wouldn't break so often.
               | 
               | Also FWIW I have learned, even though I run debian, to
               | check the Arch Wiki for tips on problems like that. One
               | more place to check in addition to stack overflow.
        
           | rconti wrote:
           | > I'm disappointed to report that Linux needs a lot of
           | improvement to be viable for most people.
           | 
           | I'm continually surprised that people are continually
           | surprised by this. I'm sure this post will get a lot of
           | anecdotal replies; of course it can work fine for many
           | people, but that's not the point. I used it exclusively on
           | the desktop from 1995 to 2002. I could make it work for me,
            | _today_, if I wanted to.
           | 
           | > You and I have realized Apple can make these type of
           | changes whenever they want.
           | 
           | Was this not obvious? I'm pretty shocked that, for example,
           | Docker can decide to change their license, and now in a few
           | months a bunch of companies owe them money for every single
            | user they have, _even if the users don't do anything
           | different_. If they opened the license agreement and chose
           | NOT to upgrade. If they never use the software again. But
           | bam, terms changed, now you owe us money.
        
             | istingray wrote:
              | Yeah, what a bad deal. What can we do about this?
        
             | mrtranscendence wrote:
             | > I'm sure this post will get a lot of anecdotal replies
             | 
             | I tried Ubuntu last year, briefly. Almost everything worked
             | fine, except ... OK, I'm a trackpad user. I have a couple
             | Apple Magic Trackpads and I prefer to use those over a
             | traditional mouse.
             | 
             | It worked fine for, like, a day? And then out of nowhere it
             | just stopped responding. Reset or reinstalled everything I
             | could think of and it still failed to work. OK, whatever, I
             | bought a traditional bluetooth mouse. Which went to sleep
             | every time it remained still for more than a split second,
             | rendering it basically unusable (not a problem on Windows
             | on the same machine).
             | 
             | Maybe a whiz could have fixed one or both issues, but
             | googling completely failed me. This was pretty basic stuff,
             | surely? And yet I couldn't even get bluetooth mice to work
             | on the most mainstream distro.
             | 
             | (Now, admittedly, the trackpad doesn't work properly out of
             | the box in Windows either, since Apple doesn't provide
             | drivers that work outside of Boot Camp. But at least there
             | are paid drivers that work really well.)
        
               | kaladin-jasnah wrote:
               | Anecdotally, I have installed Linux on older MacBooks and
               | have had the Magic Mice working fine without any issues.
               | 
               | Anyway, I don't know. People have all sorts of annoying
               | issues with Linux that end up being things that I have
               | never encountered before. I usually use wireless mice
               | with USB transceivers, which have never had problems and
               | I feel like not everyone has such a bad experience
               | (although I know many people who do). Perhaps it's
               | because it's Apple hardware or something, but,
               | admittedly, I have had MacBooks work perfectly on Linux
               | before.
               | 
                | I think, however, that if you use Linux on a desktop
                | rather than a laptop, you will find that the hardware
                | experience is quite pleasurable and does not have nearly
                | the same number of issues. Between the two, Linux does
                | manage to shine decently on desktops.
        
           | abnercoimbre wrote:
           | Is the System76 experience worthwhile so far? I'm about to
           | invest in a Pangolin [0] this Labor Day weekend and could use
           | an honest review from a fellow HN ex-Mac user.
           | 
           | [0] https://system76.com/laptops/pangolin
        
             | istingray wrote:
              | See my comments above in this thread. Specific comments are
              | welcome. I love talking about this thing :) As a Mac user,
             | the build is about what you would expect from a PC. It's
             | not one of those nightmare shit PCs. It does its job.
             | 
             | Get that Pangolin before they sell out! Is there a Labor
             | Day sale or something?
        
               | abnercoimbre wrote:
                | I'm _hoping_ there's a Labor Day sale :) Thanks for the
               | comments, I'm motivated to take the plunge now!
        
             | Raineer wrote:
             | I ordered a Pangolin yesterday as I've been waiting for an
             | AMD laptop forever. This is my third System76 laptop, and
             | the first two are still great.
             | 
             | My first was a Galago Pro, and my only complaint was that I
             | went HiDPI. That was just a bad choice on my part. The
             | software supports it fine, I just don't prefer it in a
             | laptop unless it's literally a Mac-level Retina-quality
             | panel. It wasn't, but it's half the price point.
             | 
             | My other is a Gazelle 15", which dual-boots Arch and
             | Windows. I use it primarily as a gaming box when I travel
             | (remember those days?). I spent a lot of raid nights on
             | various MMOs from hotel rooms. It works great.
             | 
             | Really looking forward to the Pangolin. Mine is still in
             | "building" status.
             | 
             | I see a lot of sh*t written about System76 because they are
             | just rebranded Clevos. Well...yeah? They are, and they are
             | fine. I would rather give money to a company like System76
             | a thousand times over than someone dripping with anti-
             | consumer behavior like Dell.
        
               | istingray wrote:
               | System76 is incredible. I have nothing but good things to
               | say. They are the future of Linux.
               | 
               | Re: Clevo - obligatory copy pasta from System76 Chief
               | Engineer right here on HN [1]
               | 
               | "System76 UX architect here! This vastly trivializes the
               | work System76 does for months and sometimes years leading
               | up to a product release. We don't simply take an off-the-
               | shelf product that already exists, throw an OS on it, and
               | sell it.
               | 
               | System76 works with upstream manufacturers (like, yes,
               | Clevo for laptops) to determine what types of products to
               | develop, including their specifications, design, etc. for
               | months up to a release. These products do not exist
               | before we enter into these conversations.
               | 
               | Once that has been determined, designed, and goes into
               | production, we start on firmware. We ensure all
               | components are working together and with the Linux kernel
               | (often requiring changes to the components' low level
               | interactions with the OS, since the upstream components
               | themselves are often manufactured with the assumption
               | they will be used by Windows).
               | 
               | Once that is complete, we test with Ubuntu and Pop!_OS
               | specifically, ensuring the OS is working perfectly with
               | the hardware. If there are any OS-specific changes to be
               | done, we write that behavior into Pop!_OS and/or our
               | "driver" which is preloaded on all machines (and
               | available in any Ubuntu-based distro, Arch, Fedora,
               | etc.), with the intent to upstream that into Ubuntu,
               | GNOME, and/or Linux itself as quickly as possible. When
               | this is more generic like ensuring HiDPI works great out
               | of the box, this actually ends up benefiting competitors
               | like Dell's XPS 13 probably as much as it benefits us,
               | but we put in the effort to file the bugs, track them,
               | write the code, and get it upstreamed.
               | 
               | Once all of that is complete, we finally offer it for
               | purchase and market it with all of our pretty
               | photographs, sales pages, etc.
               | 
               | What ends up happening, then, is Clevo offers a machine
               | with a similar-looking chassis for sale as a barebones
               | laptop. This is the result partially of the decision
               | making System76 has made for what to produce in the first
               | place. These products, however, do not contain any of the
               | firmware or driver work that System76 has invested in.
               | They do benefit from the nice photography and advertising
               | System76 has done, and since they look similar, people
               | assume they're going to get the same machine for cheaper
               | "directly from the manufacturer."
               | 
               | Edit: regardless, this is a bit beside the point of the
               | linked blog post, and is also becoming less and less true
               | as we work on designing and manufacturing our products
               | completely in-house."
               | 
               | [1] https://news.ycombinator.com/item?id=17040293
        
               | Raineer wrote:
               | Thank you for the quote, I hadn't seen that. I certainly
               | didn't mean to trivialize their work (obviously? since
               | I've voted with my wallet 3 times)
               | 
               | Their work on both the firmware front and with Pop!_OS
               | should not be overlooked. And, it should be mentioned, if
               | one is familiar with their whole product line - they now
               | go far beyond just laptops for the open source community.
               | And their powerhouses are absolutely not from some other
               | OEM.
        
             | istingray wrote:
             | Also, what pushes you to get a Pangolin (AMD) instead of
             | Lemur (Intel)? I'm curious because I don't really
             | understand the meaningful difference and bought the Lemur
             | when it was in stock.
        
               | abnercoimbre wrote:
               | For APUs there's no beating AMD Ryzen, especially if
               | you're into gaming. Also my Windows laptop has a Ryzen
               | and I'm just used to it now.
        
             | Jayschwa wrote:
             | I've been using an Oryx Pro (v5) for a few years. It's
             | mostly fine.
             | 
             | My main disappointment is that it doesn't "just work" with
             | the kernels shipped by Ubuntu. System76 provides their own
             | kernel packages, but they sometimes cause weird conflicts
             | with other low level packages, and it can't hold its charge
             | when suspended. I haven't attempted to debug the issue
             | because I usually stay plugged in when I'm working, but I
             | suppose I ought to engage tech support. They have been
             | helpful in the past.
             | 
             | At the end of the day, I'll probably still give them my
             | first look if I'm shopping for another laptop.
        
               | mark_l_watson wrote:
               | I also have had an Oryx Pro for several years. I stuck
               | with System76's branded Linux, just do updates as they
               | are available, and so far all has been well. I like that
               | they keep CUDA working with very few hassles.
               | 
               | Compare this to my migration to a M1 MacBook Pro: I love
               | it, but it is rough doing deep learning on it.
        
           | inpdx wrote:
           | "I'm disappointed to report that Linux needs a lot of
           | improvement to be viable for most people."
           | 
           | Evergreen.
        
             | nbzso wrote:
             | As a designer, the only thing that I miss on Linux is
             | graphical software at least on the level of Affinity
             | Designer. Everything else is more than viable: Blender,
             | Resolve, Bitwig Studio, RawTherapee, Krita. For web
             | development it's perfect; my engineers run Arch/Manjaro
             | over soon-to-be-abandoned Apple hardware. Even my old
             | MacBook Air is supported: Pop!_OS works perfectly, with
             | properly mapped keyboard keys (keyboard backlight
             | controls, audio, playback, etc). The only thing I had to
             | do was plug in a thunderbolt/ethernet adapter, click on
             | the Pop!_Shop update, and restart; then wifi started
             | working flawlessly.
        
           | digisign wrote:
           | I ordered the pine64 but it still has not arrived.
        
           | pshirshov wrote:
           | Buy a Pixel and install Calyx or Graphene with sandboxed Play
           | Services.
        
             | RussianCow wrote:
             | How does sandboxing Play Services compare with using
             | microG?
        
               | pshirshov wrote:
               | Well, push notifications are a lot more reliable,
               | SafetyNet passes the basic check, and apps don't
               | complain about Play Services issues.
               | 
               | But the network location provider doesn't work; that's
               | an issue.
        
             | istingray wrote:
             | Is Pixel open hardware? I'm assuming not but props to them
             | if so.
        
               | pshirshov wrote:
               | It's not. Unfortunately there is little open hardware
               | around.
               | 
               | Though proprietary hardware plus mostly open OS is better
               | than proprietary everything.
        
           | Hackbraten wrote:
           | Not GP but I took a leap of faith and ordered a Librem 5.
           | It's supposed to ship early 2022 but I'm not holding my
           | breath.
        
           | post_break wrote:
           | S21 Ultra. Out of one place into another, I know. But it's
           | also the principle of the thing. I share my location data
           | with only the bare minimum, and sideload stuff. I don't see
           | hardware good enough to go completely off-grid from Google
           | yet, so this was my compromise.
        
             | istingray wrote:
             | Understandable. I suggest donating to open source options
             | as part of your compromise. Purism, PinePhone, Ubuntu
             | Touch....or buying something from them if it's not an
             | option.
        
               | post_break wrote:
               | Unpopular opinion but I mostly donate to charities that
               | are confirmed to spend the money on more than
               | administrative people. Open source donations I make for
               | software I use. If an open source thing comes around that
               | can replace my Samsung phone I'd buy in a heartbeat.
        
               | istingray wrote:
               | Yes! And because of this, you need to donate _ahead_ of
               | time so things can be built to replace your Samsung
               | phone. After it's built, well...what would they spend
               | your money on? Donate early, donate often.
        
           | luxuryballs wrote:
           | I mean I already have accepted the fact that they allow
           | Signal type apps because if the FBI or whoever requests it
           | they can just access your display driver layer and see what
           | you're typing locally, no need to ban encrypted messaging. I
           | just keep remembering how Lavabit shut down because he
           | refused to comply and otherwise had no choice in the matter.
           | Computers and software should not be easily trusted.
        
         | babypuncher wrote:
         | I think the internal backlash helped a lot. The initial move
         | was very contrary to the culture Apple has cultivated over the
         | years and it did not go over well with them.
        
           | bobbytit wrote:
           | Yeah, their original Mac ad promised that 1984 wouldn't be
           | like "1984".
           | 
           | They were just out by 37 years.
        
         | ak391 wrote:
         | also check out the neural hash demo here to test it out on your
         | own images https://huggingface.co/spaces/aliabd/generate-
         | neural-hash-co...
        
         | gjs278 wrote:
         | enjoy being a green bubble in people's texts now
        
         | rajacombinator wrote:
         | Yep they'll just quietly backdoor it in next time.
        
         | spacebear wrote:
         | My bet is they wait a few months and quietly announce that
         | they'll do the scanning on the iCloud servers instead. That
         | would be relatively uncontroversial in the tech press and
         | probably never make it to the mainstream press.
        
         | tyingq wrote:
         | I have to wonder if the sequence was somewhat deliberate on
         | Apple's part. Like:
         | 
         | - The government is pressuring us to do things that don't fit
         | with our "privacy is good" sales pitch
         | 
         | - Let's propose something, but in a ham-handed enough way that
         | it gets a ton of public push back
         | 
         | - Now we can argue our point with the government in a way
         | that already has more public support and existing
         | discourse/debate
         | 
         | Somewhat risky though, since it hurt their public image.
        
           | labcomputer wrote:
           | Then why wouldn't they say, "the government asked us to do
           | this"? Apple is taking all the negative publicity this way.
           | 
           | My favorite theory is this:
           | 
           | Apple wants to create a chilling effect on reverse
           | engineering iOS. They're starting to catch regulatory
           | attention and lost their recent suit against Corellium. By
           | putting neural hashes in the encrypted OS image, they can
           | accuse anyone who reverse engineers iOS of:
           | 
           | 1. Being in cahoots with child abusers
           | 
           | 2. Knowingly possessing CSAM
           | 
           | I would hope that most courts would see through that paper
           | thin logic, but the idea would be simply to introduce enough
           | doubt that most reverse engineers don't want to risk it.
        
             | xvector wrote:
             | > Then why wouldn't they say, "the government asked us to
             | do this"?
             | 
             | NSLs are routinely issued to silence companies.
        
               | labcomputer wrote:
               | Oh, you're so close... what was the rest of the context?
               | 
               | It was that Apple somehow gets negotiating leverage with
               | a three letter agency.
               | 
               | So how does Apple get negotiating leverage with a TLA
               | when they can't say "they made us do it"?
        
             | mackrevinack wrote:
             | they could be under a gag order, which would mean that if
             | they mentioned anything about the government they could
             | get into deep shit
        
               | upbeat_general wrote:
               | I can't imagine how the gag order would allow Apple to
               | announce a CSAM scanning system but not announce that the
               | government forced them to.
               | 
               | What basis could there be for that? And I'd be even more
               | skeptical if this originated from a single case of abuse.
        
               | labcomputer wrote:
               | Sure, but look at the context. The hypothesis was that
               | Apple was doing this to create negotiating leverage with
               | some three letter agency.
               | 
               | How does Apple get leverage if Apple gets all the
               | negative publicity, and can't even say "TLA made us do
               | it"?
        
       | istingray wrote:
       | The big takeaway for me: Linux isn't a viable alternative yet for
       | most people.
       | 
       | We have no alternative to being stuck with Apple. I had always
       | thought I chose Apple. But it wasn't a choice really because
       | there are no other viable alternatives.
       | 
       | I'm typing this from a new Linux machine I purchased in response
       | to Apple's surveillance announcement. The hardware and support is
       | terrific. But Linux falls short.
       | 
       | - For a casual user, Linux doesn't support bluetooth headphones,
       | and the mic is basically impossible to get working. That means I
       | can't have video calls.
       | 
       | - To get my mouse to work I have to clone a GitHub repo from
       | some random person.
       | 
       | - To install Brave I have to use the CLI and copy and paste a
       | blob of commands I don't understand (https://brave.com/linux/).
       | 
       | I'm not a developer. I need to work. I just need a computer to
       | work on and have video calls with a bluetooth mic. Not doable on
       | Linux after spending $1500 on a new computer.
       | 
       | That said, open source, end to end encryption is the only path
       | forward. Linux _must_ be viable.
       | 
       | If you work on Linux, thank you. Please continue to be empathetic
       | to users who have no idea what a CLI is and have no interest in
       | learning and just want to work using open source systems.
       | 
       | If you are a regular person who wants to use Linux, be vocal
       | about it. Donate money, file bug reports, complain and let people
       | know what you need. Buy from companies leading the way like
       | System76 and Purism. (I bought from System76, highly recommended)
       | 
       | Linux has been built and used by brilliant mechanics, but it is
       | time for Linux to be used by all.
        
         | geophile wrote:
         | What OS?
         | 
         | I have a 2+-year old Darter Pro, running Pop OS 20.10, which
         | has been wonderful. Bluetooth just works (as long as there is
         | only one user attempting to use it), the mic just works, the
         | mouse just works. Also, System76 support is the best I've
         | encountered since I bought my first computer in the 80s.
        
           | istingray wrote:
           | It's a Linux/bluetooth issue, not a computer vendor issue.
           | 
           | Yes support is terrific. What mic/headphones do you use?
           | 
           | I eventually got the Sony XM3s to play sound after a bunch of
           | CLI commands and random Linux threads. But bluetooth mic
           | never worked.
           | 
           | Support recommended getting a wired mic. I figured they would
           | know, having talked to all sorts of users.
        
         | [deleted]
        
         | pkulak wrote:
         | Well, to add to the anecdata, I set my Dad up with a System76
         | machine years ago and he's been chugging along with zero
         | complaints.
        
           | istingray wrote:
           | Zero complaints...to you. I can see giving up on asking other
           | Linux users for help if they're so dismissive. Users aren't
           | wrong. Apple gets that. If you have a problem getting
           | something basic to work, it's Apple's fault, not the user's
           | fault.
           | 
           | This will be a hard cultural shift to make for Linux culture
           | given their starting point today.
        
             | pkulak wrote:
             | System76 takes full responsibility for their machines. My
             | Dad has a number to call if something doesn't work... and
             | it's not mine.
        
         | geethree wrote:
         | The Linux Bluetooth experience may not be as slick as apple's,
         | but it certainly works as expected.
         | 
         | Blueman + pulseaudio + pavucontrol has served me well for
         | years.
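         | 
         | On Debian/Ubuntu-family distros that whole stack is one
         | install away (a minimal sketch; package names assumed from
         | those distros' repos):
         | 
         | # Blueman tray applet, PulseAudio daemon, pavucontrol mixer
         | sudo apt install blueman pulseaudio pavucontrol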
        
           | istingray wrote:
           | I'm not a developer, I don't understand the CLI, and I can
           | copy-paste stuff to the best of my ability.
           | 
           | I spent 4 hours getting my Sony XM3 headphones to connect.
           | They finally played sound. I never got the mic to work at
           | all.
           | 
           | Linux is not ready for mainstream adoption. The fact that
           | you need to mention 3 separate packages is proof.
        
           | ubermonkey wrote:
           | I guess it's a matter of what you think of as "as expected."
           | 
           | Pairing events, and switching between devices, are dead easy
           | with Apple gear. Non-Apple peripherals work very very
           | smoothly, too, though when there's an Apple option I tend to
           | pick that because of the additional ease of use.
           | 
           | Windows is materially worse. Linux lags Windows.
        
         | wallacoloo wrote:
         | > Linux doesn't support bluetooth headphones for example.
         | 
         | That's not true. I've been using Bluetooth on Linux since 2015,
         | on at least 3 different machines. I hate to be a Linux trope,
         | but you probably haven't set it up right. Finding the right
         | distro will be key to your Linux experience, because the good
         | distros make these things easy. The right distro won't require
         | anything beyond a GUI to pair with your BT device, if that's
         | what you want.
        
           | istingray wrote:
           | Does the microphone work in your headphones? Let me know what
           | model I'll check it out.
        
         | skarz wrote:
         | wow. what distro are you using? mint cinnamon is pretty modern
         | (especially after you theme it) and bluetooth headphones have
         | worked quite well for a long time. never had a mouse not
         | work...there's even a small program to pair logitech devices. i
         | guess this dell 7470 works better with linux than sys76. only
         | driver i've had to install was for my gpu and there's even a
         | program for that!
        
         | NexRebular wrote:
         | And here I am, in the process of purging Linux from all the
         | servers and workstations, replacing them with BSD and
         | illumos. Linux is not the only option and, in fact, should
         | never be the only option.
        
         | driverdan wrote:
         | Are you sure you didn't get a lemon? I haven't heard of others
         | complaining about System76 BT issues. On the other hand I've
         | heard of many people having BT issues with Apple laptops. BT
         | never worked on one of my work Macbooks.
         | 
         | I'm running PopOS on a Surface Pro 3 and it's nearly flawless,
         | including BT.
        
           | istingray wrote:
           | It's a Linux problem, not a computer hardware problem.
           | 
           | Does the mic in your headphones work? Let me know the model,
           | I'll check it out for myself.
        
         | MangoCoffee wrote:
         | >Linux has been built and used by brilliant mechanics, but it
         | is time for Linux to be used by all.
         | 
         | it will be very hard for Linux to be adopted on home/corporate
         | PCs.
         | 
         | Apple/Microsoft started out in the home/corporate PC market:
         | Apple with the Apple I, and Microsoft with DOS. They are too
         | entrenched in the PC industry. Linux shines in servers,
         | mobile, IoT, and embedded devices.
        
         | fouc wrote:
         | > I had always thought I chose Apple. But it wasn't a choice
         | really because there are no other viable alternatives.
         | 
         | Looking back, that seems accurate. Though it was perhaps not
         | entirely about viability but more about sheer convenience,
         | sheer joy of shit just working, etc.
         | 
         | Unfortunately it seems we ended up sacrificing freedom at the
         | altar of convenience...
        
           | istingray wrote:
           | Doh! It's true.
           | 
           | For me it boils down to trading freedom for bluetooth
           | headphones/mic that just work.
        
         | ubermonkey wrote:
         | Ironically, I suspect Apple is the reason end-user Linux isn't
         | dramatically farther along on the adoption curve.
         | 
         | I feel like the story of my move to the Mac is not at all
         | uncommon: in
         | the very late 90s, when I was doing mostly consulting and not
         | really any coding, my work life revolved around emails and
         | office docs. Windows on laptops of the era was AWFUL -- long
         | boot times, unstable/unusable device sleep, frequent crashes,
         | etc.
         | 
         | I had a coworker using a G3 PowerBook, and his actual user
         | experience was drastically better than mine by every metric
         | (except, I guess, in that his Powerbook was significantly
         | heavier than my super-sleek ThinkPad). Crashes were rare. Boot
         | time was fast, but it mattered less because sleep actually
         | worked.
         | 
         | I switched. Then the dot-com crash happened, and our company
         | failed, and I hung my own shingle out -- and right about that
         | moment, OS X happened.
         | 
         | My Mac went from being difficult to work on (for a LAMP-stack
         | person) to being the IDEAL platform. I had a real bash prompt,
         | on a machine that shipped with all the tools I wanted, and on
         | which I could easily install most anything else (even if, early
         | on, I had to build from source) -- and this same machine ran
         | true Office, and had a host of really well-designed other
         | applications and utilities that made me more productive and
         | happier. It was a total win. And now, 20+ years later, I'm
         | still here, typing on a Mac.
         | 
         | If OS X hadn't happened, then the work I had after the dot-com
         | crash would've pushed me and a lot of other people like me to
         | full-time Linux.
        
           | istingray wrote:
           | I like the idea of Apple as a bridge to the global awakening
           | of Linux :)
        
           | sswastioyono18 wrote:
           | Same with me. I tried a Mac once, 7 years ago; it didn't
           | work out, so I sold my MacBook Pro. 4 years later, my new
           | workplace gave me a Mac and told me the Windows laptop was
           | mostly only for QA. Then I realized it's actually a good
           | thing: I learned a lot by just using the Mac and got
           | familiar with Linux as well.
        
         | nonameiguess wrote:
         | What model? I was shipped an Oryx Pro for work and it's given
         | me exactly zero problems in 11 months. I've got a wireless USB
         | mouse, wireless USB keyboard, USB smart card reader plugged in
         | through a hub, and an external monitor connected via HDMI in
         | extend the screen mode. Even with an NVIDIA discrete GPU, every
         | single thing has worked plug and play on the first try. I use a
         | bluetooth headset for all meetings and it also works perfectly
         | fine, paired in exactly the same way as it does on any other
         | platform. The Linux bluez stack definitely supports headphones.
         | 
         | Ironically, the only gotcha moment I have ever had was when
         | plugging my iPhone in to charge it: I kept noticing my
         | network connection would get wonky, and I eventually realized
         | that the iPhone, by default, presents itself as an
         | Ethernet-over-USB device and automatically tethers whatever
         | it's plugged into. So I was being disconnected from my WiFi
         | access point and then tethered through the iPhone back to the
         | same WiFi access point, with the iPhone as a relay. Yet
         | another user-hostile, on-by-default Apple setting I had to
         | discover by accident and then turn off.
        
           | istingray wrote:
           | It's a Linux + bluetooth problem, not a computer vendor
           | problem. Bluetooth microphones are basically impossible to
           | get working on Linux. That's what I was told by vendor
           | support as well, just get a wired mic as bluetooth
           | headphones+mic are unreliable.
           | 
           | I was attempting to use Sony XM3s.
           | 
           | Glad to hear you have some headphones that work, what model?
        
         | Macha wrote:
         | - Bluetooth headphones
         | 
         | I've never had that much to do to get them to just work.
         | 
         | I have done some Bluetooth tinkering to:
         | 
         | (A) Get them to use high quality codecs, something which also
         | requires tinkering for a Mac: https://medium.com/macoclock/how-
         | to-enable-aptx-on-mac-osx-b... - that's with pulse. With newer,
         | Pipewire using, systems, even that is not needed which for my
         | use case puts Linux ahead of Mac OS for Bluetooth headphone
         | support.
         | 
         | (B) My previous headphones would get super confused if I
         | paired them on both Linux and Windows on my dual-boot device,
         | as they'd see different keys for the same host, and that
         | broke them; so I copied my keys from my Linux config files to
         | my Windows registry so both would use the same keys (key
         | locations sketched at the end of this comment). My newer
         | headphones seem to "just handle" this situation though. This
         | is a dual-boot issue and not specifically a Linux issue. My
         | Linux-only devices are fine.
         | 
         | - Mouse. I have since 2005, never had any issues with any mouse
         | on Linux ever. Not even for all the extra buttons on my current
         | Logitech mouse where they just happily show up as mouse button
         | 5-9 and I can bind them in games.
         | 
         | - Brave: Honestly I'm surprised Pop is not setting up a GUI
         | package manager with proprietary software able to be
         | installed from it, given their target market, but I'll take
         | your word for this one.
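         | 
         | Re: the key sync in (B), a rough sketch of where the link
         | keys live (paths from memory; verify on your own system and
         | back up before touching anything):
         | 
         | # Linux: per-device key, readable as root; look for the
         | # [LinkKey] section
         | sudo cat /var/lib/bluetooth/<adapter-mac>/<device-mac>/info
         | 
         | # Windows: the same key sits in the registry under
         | # HKLM\SYSTEM\CurrentControlSet\Services\BTHPORT\Parameters\Keys
         | # keyed by adapter and device MAC addresses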
        
           | istingray wrote:
           | Does the microphone in your bluetooth headphones work?
           | 
           | Brave: Just to clarify, it's Brave.com itself that only has
           | CLI instructions
           | 
           | From https://brave.com/linux/ ---
           | 
           | sudo apt install apt-transport-https curl
           | 
           | sudo curl -fsSLo /usr/share/keyrings/brave-browser-archive-
           | keyring.gpg https://brave-browser-apt-
           | release.s3.brave.com/brave-browser...
           | 
           | echo "deb [signed-by=/usr/share/keyrings/brave-browser-
           | archive-keyring.gpg arch=amd64] https://brave-browser-apt-
           | release.s3.brave.com/ stable main"|sudo tee
           | /etc/apt/sources.list.d/brave-browser-release.list
           | 
           | sudo apt update
           | 
           | sudo apt install brave-browser
        
             | Macha wrote:
             | > Does the microphone in your bluetooth headphones work?
             | 
             | Yes. I had to go into the audio settings to change between
             | Headphones (high quality audio) and Headset (mic enabled)
             | modes which is a bluetooth thing. But it works, and is a
             | thing I need to do on Windows also.
             | 
             | (Actually on Windows I go out of my way to disable the
             | headset device, because it just makes everything bad by
             | being enabled - communication programs mute my everything,
             | the audio playback becomes bad, and the microphone quality
             | on my headphones is just worse than on my webcam, never
             | mind my desk mic, so I never want to use it)
             | 
             | ---
             | 
             | Yeah, that's a Brave thing. They probably (rightly to be
             | fair) assume most current Linux users understand the
             | command line enough to follow those steps and Brave would
             | prefer users get the browser from their repos so they don't
             | have to deal with users having potentially out of date
             | browsers in support requests.
             | 
             | But you can get Brave in a snap, which to my understanding
             | means you can get it in software center on Ubuntu, Manjaro,
             | etc. And you can probably install the snap store on Pop
             | also.
             | 
             | Chrome on the other hand just serves you a package you can
             | double click to install, if you'd rather get it from them
             | rather than your package manager/app store:
             | https://www.google.com/intl/en_uk/chrome/
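             | 
             | If snapd is set up, the snap route is a one-liner
             | (assuming Brave still publishes under the package name
             | "brave"):
             | 
             | sudo snap install brave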
        
         | bamboozled wrote:
         | Your individual experience isn't universal.
         | 
         | Running Fedora 34 on an X1 Carbon. Bliss.
         | 
         | Linux is community driven; if something's not working, try to
         | fix it?
        
           | Espressosaurus wrote:
           | > I'm not a developer. I need to work. I just need a computer
           | to work on and have video calls with a bluetooth mic. Not
           | doable on Linux after spending $1500 on a new computer.
           | 
           | > Linux has been built and used by brilliant mechanics, but
           | it is time for Linux to be used by all.
        
           | istingray wrote:
           | Replying here to say I agree with the other posters.
           | 
           | I'm not a developer. I need a system to do my work. I need to
           | do video calls using a bluetooth headset with a microphone.
           | 
           | I'm not in a position to spend time "fixing". But I am in a
           | position to pay. But Linux seems to have few options
           | available to pay for polish. I would pay $500 this instant to
           | have the ease of connectivity that Apple has on this Linux
           | machine.
        
             | bamboozled wrote:
             | Some days I spend 5 hours on the phone with my Bluetooth
             | headset, I have no issues with it at all.
        
           | selykg wrote:
           | His point is that it's not ready for the average person. I
           | would not expect the average person to switch from something
           | that works out of the box to having to fiddle with trying to
           | find an answer just to get where they were before.
           | 
           | OP is 100% correct. I also love how you basically say "well
           | it works for me" which is a common answer I seem to hear from
           | so many Linux users. It's like customer support responding
           | "well, I can't recreate the issue."
        
             | bamboozled wrote:
             | I can honestly say that in recent times (last 2 years) I
             | never had to fiddle with Fedora to get it working.
             | 
             | It just works.
        
           | mschuster91 wrote:
           | > Linux is community driven, if somethings not working, try
           | fix it?
           | 
           | That works if you're 20 and studying at university. After
           | wasting over half my awake day at work, the last thing I have
           | energy left for is dealing with random bullshit and tracing
           | down bugs.
        
             | bamboozled wrote:
             | Also, most people don't have great time management skills
             | (myself included), so they never really have any slack
             | time.
        
         | devwastaken wrote:
         | What version of pop os? My PC wouldn't even work with Ubuntu 20
         | because the kernel drivers were not there yet until 21.04.
        
         | heavyset_go wrote:
         | Use a distro that adopted Pipewire for good BT headset support.
         | 
         | In my experience, if a Chromebook can fit a user's use case,
         | then so can desktop Linux.
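         | 
         | A quick way to check what your distro is running (a sketch;
         | the exact output wording varies by version):
         | 
         | pactl info | grep "Server Name"
         | # PipeWire-based systems report something like
         | # "Server Name: PulseAudio (on PipeWire 0.3.x)"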
        
         | sho_hn wrote:
         | > Linux doesn't support bluetooth headphones for example.
         | 
         | This is not generally correct. I've been using bluetooth
         | headphones with my Linux desktop computer at work all day, and
         | probably over a decade in total. Closer to two.
         | 
         | I don't doubt your negative experience or that it's an
         | interesting data point, but let's not stick to factually
         | incorrect things, either.
        
           | plandis wrote:
           | I've had similar experiences across multiple different
           | Bluetooth chipsets in several different motherboards.
           | 
           | I've never been able to get AirPods to work and it took
           | several hours and some custom code to get my Bose headphones
           | to work. It's not what I would consider to be even close
           | to the macOS/Windows experience.
        
             | [deleted]
        
             | istingray wrote:
             | ^ this. Custom code to get headphones to work is exactly
             | the type of thing that is only viable for developers. Linux
             | needs to move beyond this "for mechanics only" mindset for
             | it to succeed.
             | 
             | Same, could not get Airpods to connect AT ALL. They would
             | show up, "connect" for a second then disconnect.
        
               | gunapologist99 wrote:
               | I have Airpods as well as Jabra Elite 65t, Samsung Galaxy
               | Buds Pro, and various Plantronics. The Apple ones are
               | the only ones that are wonky. There seems to be some other
               | protocol for them that needs to be enabled, but I didn't
               | bother and just switched to the Galaxy Buds Pro (the
               | Jabras are excellent as well.)
               | 
               | The very first DDG result for "airpods linux bluetooth"
               | is this one:
               | https://askubuntu.com/questions/922860/pairing-apple-
               | airpods...
        
               | istingray wrote:
               | Does the mic work in any of these? I found it to be
               | impossible
               | 
               | Re: airpods linux bluetooth -- CLI commands to get
               | bluetooth headphones just to play sound is not viable for
               | Linux to take off.
        
               | gunapologist99 wrote:
               | You have to change the profile from "headphones" (high
               | quality audio) to "headset" to have a microphone. It's a
               | matter of a few clicks.
               | 
               | I'm just pointing out that Apple was doing something
               | different. As far as that being not viable for Linux to
               | take off... well, I respect your opinion.
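               | 
               | For reference, the click-equivalent on the command line
               | is roughly this (a sketch; the card name is a
               | placeholder, and profile names can differ between the
               | PulseAudio and PipeWire stacks):
               | 
               | # find the headset's bluez card name
               | pactl list cards short
               | # switch from A2DP (audio-only) to the mic-enabled
               | # headset profile
               | pactl set-card-profile bluez_card.XX_XX_XX_XX_XX_XX \
               |     headset-head-unit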
        
               | istingray wrote:
               | The instructions you linked state that to connect Airpods
               | one needs to:
               | 
               | Set ControllerMode = bredr or ControllerMode = dual by
               | editing /etc/bluetooth/main.conf file using sudo nano
               | /etc/bluetooth/main.conf command (or another text editor
               | of your choice)
               | 
               | sudo /etc/init.d/bluetooth restart
               | 
               | Try to pair again.
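               | 
               | Condensed, that's a root edit of a system config file
               | plus a service restart (a sketch; the systemd restart
               | is my guess at the modern equivalent of the quoted
               | init.d command):
               | 
               | # add under [General] in /etc/bluetooth/main.conf:
               | #   ControllerMode = bredr
               | sudo nano /etc/bluetooth/main.conf
               | sudo systemctl restart bluetooth
               | 
               | Imagine telling a regular person that's how you pair
               | headphones.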
        
               | sho_hn wrote:
               | I've done nothing special to use Bluetooth headphones
               | with my Linux computers (a Dell laptop, a Lenovo
               | workstation and a Lenovo laptop, running either Fedora or
               | Ubuntu) and I use a variety of them regularly:
               | 
               | - Samsung Galaxy Buds Live that came with my last
               | smartphone
               | 
               | - Some fancy Jabra-brand headset we get from work
               | 
               | - My wife's Sennheiser Momentum Wireless when I don't
               | bring my own
               | 
               | - Not headphones per se, but in the office meeting rooms
               | we have Logitech desk mics I connect to all the time
               | 
               | No config or installing additional software needed beyond
               | pairing.
               | 
               | Based on my own samples the above makes me suspect
               | "AirPods don't comply well with standards" or something
               | along those lines.
        
               | istingray wrote:
               | Does your mic work in the headphones?
               | 
               | These were Sony XM3 in particular. After installing a
               | bunch of software through the CLI I got them to play
               | sound.
               | 
               | But the mic never worked. I saw similar stories online of
               | everyone just saying "stick with wired mics".
        
               | sho_hn wrote:
               | Yep, mic works for me. My main mic is actually wired, too
               | (a RODE NT-USB), but just for audio quality reasons. I
               | use the Galaxy Buds Live for meetings when I'm in the
               | living room because my wife needs our office room for
               | something, and I use the Logitech desk mics in the
               | office, and those work. I can't remember about the Jabra
               | off-hand.
        
               | istingray wrote:
               | Thanks I'll check those Galaxy Buds out.
        
               | stackbutterflow wrote:
               | Blame manufacturers.
        
             | resfirestar wrote:
             | Of course it's not as good as MacOS + Apple headphones but
             | I think more recent Linux distros that use PipeWire as the
             | audio backend are ahead of Windows. On Windows, A2DP
             | support is so limited that headsets like the Sony XM4s
             | suffer from worse latency and sound quality, particularly
             | terrible mic quality, while on Kubuntu everything works out
             | of the box with the same quality you get when paired with a
             | phone. There's still the issue of unstable drivers that
             | break out of nowhere and require rebooting for Bluetooth to
             | work again, but Windows has the same problem on my laptop.
        
           | istingray wrote:
           | Does the mic in your headphones work?
           | 
           | Edited to clarify that while 3 pairs of headphones required
           | fiddling to get 1 connected, the microphone was basically
           | impossible to get working. The majority of threads I've found
           | online have said basically not to bother with getting
           | bluetooth headphones+mic, use a wired one or something else.
        
             | adamtulinius wrote:
             | I have two different Jabra bluetooth headsets, both work on
             | Linux with a microphone, but you have to change to the
             | headphone profile (using something like pavu) to enable the
             | microphone.
        
           | thebruce87m wrote:
           | I'll add to this - I can generally hack away at Linux, build
           | my own yocto for custom SBCs etc but I gave up trying to get
           | my QC35s paired with my Linux box
        
             | istingray wrote:
             | Thanks for saying this. I'm a Mac user. I don't know jack
             | about this stuff. It's easy for Linux users to say "it
             | works for me dummy!". But it doesn't work for me. I had to
             | run CLI commands just to install Brave, are you kidding me?
             | (https://brave.com/linux/)
             | 
             | I can't imagine less computer savvy people putting up with
             | this.
        
               | c0nfused wrote:
               | If your distribution offers synaptic it helps immensely
               | as a gateway drug to apt/rpm.
               | 
               | https://www.nongnu.org/synaptic/
               | 
               | But to install it you still need the CLI. After that
               | you get a GUI to search for and install packages.
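               | 
               | On apt-based systems that bootstrap step is a single
               | command:
               | 
               | sudo apt install synaptic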
        
               | Macha wrote:
               | A lot of the "beginner friendly" distros will install a
               | GUI frontend in the default install and label it
               | "software center" or "app store" or similar.
        
           | umanwizard wrote:
           | Only certain ones -- not AirPods, for example.
           | 
           | They work as headphones, but not as a headset with microphone
           | (for reasons that are not really Apple's fault)
        
           | booleandilemma wrote:
           | That's the problem though isn't it? Imagine if some people
           | just weren't able to get their mouse or airpods working with
           | their laptop. Of course, airpods work for everyone's iphone,
           | and I've never had a bluetooth mouse that didn't work with
           | Windows.
           | 
           | Until Linux works out of the box _for everyone_ it's not
           | going to see mainstream adoption.
        
           | BiteCode_dev wrote:
           | While this is not correct, and I do use my BT headphones
           | with Ubuntu 20.04, it is almost true for any casual user.
           | 
           | Indeed, the number of glitches and irritating workarounds I
           | have to use to make anything other than a mouse work with
           | BT is something most people don't want to put up with on
           | their $1000 laptop.
           | 
           | Yes, BT is a shitty tech. But their phone works out of the
           | box. Their Mac OS and Windows as well.
           | 
           | The fact we still get BT not pairing randomly (or wifi stop
           | working, or sleep mode never awaking) is a huge problem in
           | 2021.
           | 
           | We should not ignore that as a community.
           | 
           | That's why I report bugs, give money and help people online.
           | 
           | But it will not be enough if we close our eyes on the work
           | yet to be done.
        
             | istingray wrote:
             | Thank you for saying this. As someone new to Linux, it's
             | easy to be intimidated by Linux users saying "but it works
             | for me!".
             | 
             | Where do you give money? I've looked into a few projects
             | but it's pretty fragmented. Donating money to Ubuntu was a
             | trip...required digging through forums and finally
             | "Downloading" Ubuntu to trigger the "donate money" pop up.
             | 
             | Linux needs more options for people like me to pay for
             | polish.
        
               | BiteCode_dev wrote:
               | I have a list of FOSS projects I rotate and donate to
               | regularly, including Ubuntu and the FSF. So many projects
               | need help, VLC, libreoffice, firefox, python, ublock
               | origin...
               | 
               | It's not just about the money, it's also a way for devs
               | to feel that their work matters.
        
             | sho_hn wrote:
             | > But it will not be enough if we close our eyes on the
             | work yet to be done.
             | 
             | I very much agree with your sentiment, but I simply don't
             | have any issues with BT headphones on Linux desktops
             | specifically I could work on today.
             | 
             | Bluetooth is a very complex set of standards - I work as
             | a system architect for a well-known celestial car brand,
             | and implementing Bluetooth and BLE is the bane of our
             | existence; truly disastrous technology - and I'm sure
             | things to improve are aplenty, though. In terms of desktop
             | biz logic, the roll-out and maturation of the PipeWire
             | stack is one ongoing effort to watch for.
        
               | BiteCode_dev wrote:
               | Yes, and then you have wifi, touchpads, graphics cards,
               | webcams... It's amazing that we even have a working OS
               | honestly.
        
               | istingray wrote:
               | Thanks for saying this, it gets to the root of the
               | problem: Bluetooth itself.
               | 
               | What should we support instead? Is there another wireless
               | standard?
               | 
               | I've noticed there are models that use some kind of non-
               | Bluetooth USB dongle for example.
        
           | throwdecro wrote:
           | For an end user who isn't a Linux enthusiast, generally
           | working headphones aren't sufficient. They have to _almost
           | never not work_.
           | 
           | I record music on Linux, and one of the reasons that I never
           | recommend it is that I can imagine people dropping real money
           | on a sufficiently powerful laptop and finding things don't
           | work without a lot of configuration, or perhaps not at all.
           | That doesn't cut it for the average user.
        
             | istingray wrote:
             | Thank you for saying this. Yes, it doesn't cut it for the
             | average user.
             | 
             | Have you thought about donating to open source projects to
             | help them along?
             | 
             | Linux _is_ the only viable option. It _must_ improve.
        
               | throwdecro wrote:
               | Yes I've donated to some :)
        
           | okl wrote:
           | I use them regularly as well, never had a problem, not even
           | when setting them up the first time.
        
             | istingray wrote:
             | microphone too? What model, I'll get it.
        
         | jwommack wrote:
         | The thread following this is a great example of Linux's fatal
         | issue. 5 different people have 5 different user stories.
         | 
         | Public uptake requires ease of use and consistency.
        
           | CubsFan1060 wrote:
           | And, at least with some people, almost a dismissal of someone
           | else's problem.
           | 
           | "It does too work, you're wrong. All I had to do was write
           | some code, install a few things, and recompile the kernel and
           | my headphones work most of the time. Have you tried writing a
           | new driver for your headphones?"
        
             | istingray wrote:
             | It would be funny if this weren't true right in this
             | thread. The best example I can point to is that Brave's
             | only install option on Linux involves a blob of CLI
             | commands:
             | https://brave.com/linux/
             | 
             | Linux has to grow beyond being built by and for mechanics.
        
               | CubsFan1060 wrote:
               | Looking at it, those commands are not horrible or
               | unreasonable... for a developer or someone comfortable
               | with the command line.
               | 
               | However, the 'average user' is going to be annoyed and
               | lost with those commands.
        
               | istingray wrote:
               | The average user is going to use Firefox instead.
               | Advanced users who know to open "Terminal" will attempt
               | to just copy and paste the blob of code and hope it
               | works. Training new users to do this is a bad habit that
               | will bite the Linux community in the ass later.
        
         | mschuster91 wrote:
         | > - Linux doesn't support bluetooth headphones well
         | 
         | FTFY: "Everyone except macOS doesn't support bluetooth well".
         | 
         | The Windows situation is a horrid mess - it was with w7 and at
         | least three different _commercial_ stacks, and my s/o can't
         | use her brand new Sony headphones on her three year old w10
         | Lenovo laptop as it ends up in stuttering audio. Linux isn't
         | better. Audio stacks are a mess (even independent of
         | Bluetooth), BTLE is an _even worse_ mess, and I have no idea
         | why Android is relentlessly buggy. And the Nintendo Switch
         | doesn't support _anything_ except compatible controllers, you
         | need an external, third party adapter that blocks the docking
         | port for decent wireless capability.
         | 
         | The only ones who have managed something decent with Bluetooth
         | are Apple, where everything you ordinarily need (headphones,
         | mice, keyboards) just works (tm).
         | 
         | The core problem is that Bluetooth _itself_ is an old, ugly,
         | grown standard with lots of different stuff bolted on, not much
         | in terms of interoperability testing and closed binary blobs
         | everywhere.
        
           | istingray wrote:
           | Good context. Given that Bluetooth is the issue, what should
           | we do instead? Some new standard? I've seen some headphones
           | that offer a non-Bluetooth USB wireless dongle (Corsair)
        
             | mschuster91 wrote:
             | > Some new standard?
             | 
             | Obligatory xkcd: https://xkcd.com/927/
             | 
             | I'm not sure. From an implementation POV, re-starting wUSB
             | (https://en.wikipedia.org/wiki/Wireless_USB) again would be
             | an easy way out - for a lot of things that can be done over
             | Bluetooth there are USB device classes and drivers already,
             | so all that would be required on the OS side is support for
             | a discovery/secure pairing mechanism.
        
       | browningstreet wrote:
       | Well, nothing about this announcement yet changes the fact that I
       | still won't be buying the next iPhone or Macbook this fall, like
       | I'd intended.
       | 
       | Without knowing what their refinements actually do, it's too
       | uncertain.
        
       | iyn wrote:
       | > "[...] we have decided to take additional time over the coming
       | months to collect input and make improvements before releasing
       | these critically important child safety features."
       | 
       | Seeing how this situation was handled by Apple so far, my cynical
       | take on this is that they want to wait for the storm to pass/get
       | smaller and then will continue deployment of this tech but with
       | a better PR strategy. I hope to be wrong, but the latest moves
       | by Apple completely evaporated the remaining trust I had in
       | them.
        
         | tomjen3 wrote:
         | The same cynic could also say that they are postponing it
         | indefinitely, but can't say so.
        
           | kristofferR wrote:
           | That's not really a cynic, that's an optimist.
        
             | wl wrote:
             | It's being cynical about the idea that PR departments tell
             | the straightforward truth.
        
         | e-clinton wrote:
         | Why not trust Apple? I get that you dislike their plan but
         | consider three things:
         | 
         | * they do want to help keep your children safe
         | 
         | * they told you in detail how the system would work
         | 
         | * they are holding off on releasing after the world gave them
         | feedback
         | 
         | Who else is doing better? I get that you don't like the
         | system, but how do you solve it? Is it not worth solving?
         | It's too easy to nix ideas without contributing to the
         | conversation.
        
           | iyn wrote:
           | > "Why not trust Apple?"
           | 
           | Due to their actions, it's that simple.
           | 
           | On this particular issue, I don't trust them, because they've
           | built a capability to surveil private data of millions of
           | people, on their private devices. Basically, to default to
           | "you're the criminal and we want to verify that you're not".
           | They presented this capability as a "safety measure". With
           | that, you can consider:
           | 
           | 1. If they weren't aware of the potential consequences of the
           | system they've built, it's just foolish and shows that their
           | decisions/predictions can't be trusted. (I don't believe they
           | weren't aware of the potential of the system to be abused)
           | 
           | 2. They were aware of the potential negative consequences
           | (massive at scale) and yet decided to use PR tactics to hide
           | that. If that's the case, they can't be trusted due to
           | misdirection and lying.
           | 
           | If you want another example, the first that comes to mind is
           | their actions related to the butterfly keyboard fiasco. No
           | (real) acknowledgement of the mistake, no apology but using
           | PR to spin that into a new feature of the next model. Another
           | is their monopolistic, rent-seeking behavior with App Store
           | policies.
           | 
           | The problem with CSAM is definitely worth solving but spying
           | on everybody's data is not the right level to look for
           | solutions. I imagine it would make executives feel better
           | about themselves but - as far as I understand - it doesn't
           | meaningfully address the root of the issue.
        
           | dylan604 wrote:
           | >they are holding off on releasing after the world gave them
           | feedback
           | 
           | They did not state that they are changing things to address
           | the concerns. Deploy it now, or deploy it later, it's the
           | same thing being deployed.
        
             | judge2020 wrote:
             | > They did not state that they are changing things to
             | address the concerns.
             | 
             | The full title (was cut from HN...)
             | 
             | > Apple Delays Rollout of Controversial Child Safety
             | Features to Make Improvements
             | 
             | And the actual page with Apple's words:
             | 
             | > Update as of September 3, 2021: Previously we announced
             | plans for features intended to help protect children from
             | predators who use communication tools to recruit and
             | exploit them and to help limit the spread of Child Sexual
             | Abuse Material. Based on feedback from customers, advocacy
             | groups, researchers, and others, we have decided to take
             | additional time over the coming months to collect input and
             | make improvements before releasing these critically
             | important child safety features.
             | 
             | https://www.apple.com/child-safety/
        
             | haswell wrote:
             | > _we have decided to take additional time over the coming
             | months to collect input and make improvements_
             | 
             | They explicitly stated that they plan to make improvements.
             | Only time will tell if those improvements are meaningful.
        
           | rightbyte wrote:
           | > * they do want to help keep your children safe
           | 
           | CSAM is more or less documentation of the crime. Isn't it
           | too late for keeping the children safe by that point? It
           | should be a police matter, and I don't think the police are
           | much helped by some flawed automatic reporting system
           | spamming them.
        
           | aspaceman wrote:
           | > Why not trust Apple?
           | 
           | Where do people get this idea I should trust a company? I
           | should never trust a company. That's nonsense. It's like
           | trusting a lawnmower to watch your hamster.
        
           | ByteWelder wrote:
           | > * they do want to help keep your children safe
           | 
           | Well-meaning intentions are irrelevant to what's effectively
           | happening on the bottom line.
           | 
           | > * they told you in detail how the system would work
           | 
           | This is great though. In any normal situation, this would
           | build trust. Not so much in this case, because of the impact
           | on privacy and potential for abuse.
           | 
           | > * they are holding off on releasing after the world gave
           | them feedback
           | 
           | The crucial part is that they still intend to continue with
           | their plan. They'll possibly modify it, but we'll have to see
           | what that means. Well-intentioned backdoors are still
           | backdoors.
           | 
           | > Who else is doing better?
           | 
           | That's completely irrelevant to the discussion. Just to
           | humour the question: Android (AOSP), Pine (from PinePhone),
           | Librem, CalyxOS and GrapheneOS are some products and
           | operating systems I can think of that respect the users'
           | privacy better.
           | 
           | > I get that you don't like the system, but how do you solve
           | it? Is it not worth solving? Too easy to nix ideas without
           | contributing to the conversation.
           | 
           | Customers don't have any responsibility to solve this. It's
           | totally within customers' rights to complain and say "we
           | don't want this feature". For good reason, as shown by all
           | the experts and privacy advocacy groups.
        
             | akie wrote:
             | I would like some substantiation for your claim that
             | Android is doing better in terms of privacy. Everything
             | I've seen more or less points to the opposite.
        
               | Gareth321 wrote:
                | Android doesn't scan the files on our phones against a
                | government ban list with the intent of uploading said
                | files to law enforcement and alerting them. We can
                | confirm this because Android is open source. This makes
                | Android objectively more secure and more private.
        
               | ByteWelder wrote:
               | Please note the "AOSP" mention: I was referring to
               | Android Open Source Project, not the variants with
               | Google's apps and Play Services installed.
        
             | [deleted]
        
           | wyager wrote:
           | > they do want to help keep your children safe
           | 
           | Wow, I love trillion dollar corporations now!
        
           | danaris wrote:
           | > Who else is doing better?
           | 
           | I think this is the part that far, far too many people are
           | ignoring.
           | 
           | Do I, personally, think this is a good thing for Apple to add
           | and do? Ehh....not at all sure. I do have concerns about both
           | the precedent set by it doing the scanning on everyone's
           | iPhones, and the potential technical issues I've seen raised
           | about the actual functioning of the system.
           | 
           | But seriously, _who is doing better_??
           | 
           | Firstly: Is there _anyone_ out there who has a better, more
           | user-respectful method of scanning for CSAM? A method that
           | doesn't essentially say "as soon as you give us any access
           | to your photos, we will use them for _any_ purpose we deem
           | worthwhile to us"? Because despite all the brouhaha, that is
           | absolutely _not_ the attitude Apple has taken in this. They
           | have made abundantly clear that they have no intention of
           | ever using systems of this nature for anything other than
           | scanning for CSAM.
           | 
           | And secondly: _Even if_ Apple implemented this today, exactly
           | the way it's been described, is there _anyone_ who has a
           | better track record on user privacy? Is there _any_ device or
           | OS manufacturer that has openly expressed and followed
           | through on a commitment to protecting users' data?
           | 
           | I'm not aware of any.
        
             | yjftsjthsd-h wrote:
             | > They have made abundantly clear that they have no
             | intention of ever using systems of this nature for anything
             | other than scanning for CSAM.
             | 
             | Then they're either lying or delusional; creating the
             | capability guarantees that someone will pass a law forcing
             | them to use it in other ways.
        
               | stjohnswarts wrote:
               | I agree completely. In military parlance it's called
               | "softening the blow". People who say "this is a slippery
               | slope fallacy" simply aren't paying attention. Look at
               | Europe and Australia passing all kinds of laws to have
                | social media spying on everyone. It can happen. In the
                | USA, Texas just passed a law to spy on women who are
                | getting abortions and trying to limit free speech about
                | it, in an attempt to push most of the policing off to
                | private citizens, like in The Handmaid's Tale. It's
                | insane that people can't see the threats to democracy
                | and liberty from all these angles.
        
               | [deleted]
        
             | poetaster wrote:
              | Maybe Librem? Haven't audited it yet, but I believe they
              | are more privacy-centric by far.
        
             | stjohnswarts wrote:
              | Google is doing better in that they are not actively
              | scanning my phone for CSAM data (and eventually other
              | data) as a US policing agent without a warrant. That's
              | what Google is doing better. As is Windows. They don't
              | scan for material on your phone either. If you have
              | anything to prove that they do scan for illegal material
              | to report to police forces without a warrant, then
              | please show us.
        
               | owlbite wrote:
               | No, they're just uploading it to their servers, scanning
               | it to form a comprehensive profile of you and selling the
               | results to the highest bidder.
        
               | kook_throwaway wrote:
               | I've never signed into google, are they really uploading
               | my pictures to their servers?
        
           | stjohnswarts wrote:
           | They never wanted to "keep your children safe" as this "plan"
           | has a hole in it big enough to drive a semi through. As in
           | the criminals know it's there and can avoid it. No, this
           | was to please the government a little bit, and so they can
           | say "look, we're thinking of the children". Governments want
           | this so they can eventually add more scanning On Your Phone
           | in the future for anything and everything. Just look at
           | what's going on in Australia right now, with
           | authoritarian-type surveillance getting passed in a 24-hour
           | session. Western nations aren't safe from the Stasi
           | mentality if we don't
           | fight back. Donate to EFF and ACLU today please.
        
         | pfortuny wrote:
         | Like, you know, whatsapp........
        
       | cannabis_sam wrote:
       | I genuinely hope that Apple will eventually realize that they
       | have burned decades of goodwill on this evil fascist idiocy..
       | 
       | I've been buying Macs exclusively for 15 years, but sadly it's
       | apparently time to change..
       | 
       | Every single executive at Apple should be figuratively "raked
       | over the coals" by shareholders for pushing this kind of insane
       | totalitarianism under a vague veil of "protecting children".
        
       | tremoloqui wrote:
       | Sadly, the cat's out of the bag, and Apple, Google and the rest
       | of us will no longer have the plausible deniability that kept
       | this sort of thing in check.
       | 
       | Soon enough every tinpot dictator in the world will be demanding
       | their particular flavour of device scanning inside and outside of
       | their jurisdiction.
        
       | buildbot wrote:
       | I don't think the timing of this is an accident - or at least I'd
       | like to imagine it's not.
       | 
       | iCloud subs renew at the end of the month; I bet they were
       | surprised at the number of people not paying them money anymore
       | for iCloud...
        
         | kmlx wrote:
         | > I bet they were maybe surprised at the number of people not
         | paying them money anymore for iCloud
         | 
         | I bet you are overestimating how many people care about this
         | kind of stuff.
        
         | ziml77 wrote:
         | That doesn't make much sense. People would have cancelled their
         | subscriptions during the month of August. There's no need to
         | wait until the end of the month to know how many people are
         | cancelling.
        
           | cwkoss wrote:
           | I wouldn't be surprised if the suits only look at the numbers
           | once a month.
        
             | buildbot wrote:
             | Exactly - someone looked at the September vs. August iCloud
             | numbers, and went, "oh shit!"...
        
         | LeSaucy wrote:
         | I cancelled my recurring iCloud subscription because of this,
         | and switched to self-hosting PhotoPrism for storage.
        
           | buildbot wrote:
           | Yep, I did too, and set up a local+remote self-hosted
           | solution instead. Probably not going back now; it was very
           | convenient for the iPhone upgrade program with a new phone
           | each year, but I no longer really want to use Apple
           | services...
        
       | m0zg wrote:
       | Really makes you wonder why a trillion-dollar corporation would
       | want to severely undermine its most important asset, its brand,
       | over something like this. It ain't "children," I can tell you
       | that.
        
       | least wrote:
       | I hope this is a case of Apple delaying it long enough to
       | silently cancel the feature or completely change it to not be on-
       | device scanning rather than Apple delaying it long enough to be
       | well out of the news cycle and silently enable it at a later
       | date. Both features mentioned have potential for abuse and
       | create an adversarial relationship with one's own device, so I'm
       | not sure how they could implement them without these concerns
       | remaining.
       | 
       | A delay in either case is welcome.
        
         | pwned1 wrote:
         | The modus operandi of the surveillance state is to back off
         | something newly intrusive just long enough for the outcry to
         | die down and then slip it in when no one is looking. I'd expect
         | this not to be canceled, but to find its way in to an update on
         | a holiday weekend when no one is checking the news.
        
           | least wrote:
           | Well, the day they announced it was also just a largely
           | unremarkable day towards the start of a weekend, but it
           | still turned into a fiasco, so hopefully not.
        
         | dusing wrote:
         | I'd say very likely canceled quietly. Otherwise they are just
         | taking the punishment twice; there isn't a change they could
         | make that would allow them to go live with this kind of thing
         | at a later date and not get raked over the coals.
        
           | TillE wrote:
           | Yeah, if they implement it in the future, the whole cycle of
           | news coverage repeats.
           | 
           | I think/hope they'll just move to the simple compromise of
           | scanning stuff on iCloud like everybody else, which is far
           | less of an intrusion.
        
       | [deleted]
        
       | c7DJTLrn wrote:
       | Apple's brand is tarnished in my eyes. I already got rid of my
       | iPhone and won't be returning. The fact that they even announced
       | this is proof that they only ever cared about providing a facade
       | of privacy.
        
         | 88840-8855 wrote:
         | I can imagine that the following is going to happen.
         | 
         | 1. Apple announces the scanning feature (done)
         | 
         | 2. People dislike it; backlash (happened)
         | 
         | 3. Apple is worried about the brand, publishes statement that
         | they are going to delay (happened)
         | 
         | 4. People say "great", "wonderful" (ongoing)
         | 
         | 5. Apple waits 5 months, releases this crap feature anyway
         | 
         | 6. People moved on, media does not want to repeat the same
         | messages again, not many people care anymore
         | 
         | 7. DONE.
         | 
         | This is the same thing that Facebook did with its TnC. People
         | complained, Facebook took it back, re-introduced it 3 months
         | later. Nobody cares anymore.
        
           | c7DJTLrn wrote:
           | Most people don't care even at step 2. It's only technical
           | people with concerns about privacy and corporate overreach
           | behind this backlash. Most of that group will continue using
           | iPhones no matter what out of convenience and unwillingness
           | to stand up for their principles.
           | 
           | I stood my ground and actually took some action when
           | something made me unhappy. This is the only way things
           | change.
        
         | SamuelAdams wrote:
         | Just curious, what did you switch to? All the truly open
         | source phones still have a lot of work that needs doing, which
         | means the only real competitors are Android devices with a
         | custom ROM, and that doesn't really fix privacy either.
        
           | busymom0 wrote:
           | Not OP, but I started moving away from Apple a few months
           | ago, well before this Apple CSAM debacle. This is a pretty
           | big move for me because I am a developer who makes apps for
           | both iOS and macOS, so I pretty much need Apple software for
           | work.
           | 
           | No longer buying iPhones or Macs. I was planning on upgrading
           | to the Mac Mini with M1 chip later this fall but now I plan
           | on building a hackintosh instead. I also no longer recommend
           | Apple devices to friends/family.
           | 
           | I got myself a cheap Android phone which I have de-googled.
           | It's this one ($190 USD for a very good phone - 8GB RAM,
           | 12GB storage):
           | 
           | https://www.amazon.com/UMIDIGI-Unlocked-4150mAh-Capacity-
           | Sma...
           | 
           | I use Firefox for YouTube on it with the following add-ons:
           | 
           | 1. uBlock Origin
           | 
           | 2. Video Background Play Fix add-on
           | 
           | This allows me to use YouTube as a background playback music
           | player. And if needed, I use YouTube-dl to get the audio
           | files and put them on the phone.
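           | 
           | If you'd rather script that last step, youtube-dl also has
           | a documented Python API. A minimal sketch (untested here;
           | the URL is a placeholder, and the mp3 conversion needs
           | ffmpeg installed):
           | 
           |   import youtube_dl
           | 
           |   # Grab the best available audio and convert it to mp3
           |   opts = {
           |       "format": "bestaudio/best",
           |       "postprocessors": [{
           |           "key": "FFmpegExtractAudio",
           |           "preferredcodec": "mp3",
           |       }],
           |   }
           |   with youtube_dl.YoutubeDL(opts) as ydl:
           |       # placeholder URL
           |       ydl.download(["https://www.youtube.com/watch?v=..."])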
           | 
           | You can check out several tutorials to de-google an android
           | phone.
        
             | xena wrote:
             | The hackintosh is still going to have the ability to do
             | CSAM scanning fwiw.
        
               | busymom0 wrote:
               | I am aware. What I meant was that I don't want to support
               | Apple with my dollars directly. Since I need MacOS for
               | work (app development), I will use a hackintosh. That way
                | I can run Ubuntu on the side if needed. Might look into
                | how macOS operates in VirtualBox.
        
               | smoldesu wrote:
                | If you're hosting on a Linux machine, then you can use
                | QEMU for a surprisingly sturdy Mac virtualization setup.
               | I've heard legends on IRC of people with meticulously-
               | picked hardware who get perfect GPU-acceleration working,
               | but ymmv.
        
             | smoldesu wrote:
              | KDE Connect[0] has a macOS implementation that will restore
             | some of your 'magic' to your ecosystem. If you find
             | yourself missing AirDrop, Continuity Clipboard, or synced
             | media controls, give it a try and see if it feels right.
             | Even the earlier builds of this were a lifesaver when I was
             | still hackintoshing, and it made it a snap when I was ready
             | to switch to Linux. Best of luck to you in the future!
             | 
             | [0] https://kdeconnect.kde.org/download.html
        
               | busymom0 wrote:
               | Oh wow thanks! I currently use AirDrop quite often so
               | this is very helpful!
        
             | davidinosauro wrote:
              | Wasn't aware of the Video Background Play Fix add-on,
              | thanks for mentioning it!
        
           | stjohnswarts wrote:
           | Android phones don't scan the data on your phone looking
           | for reasons to notify the authorities that you aren't
           | complying.
        
             | SamuelAdams wrote:
             | Yes but there are other tradeoffs that are introduced.
             | Carrier-specific ROMs, difficulty upgrading to new android
             | versions, generally more malware found in the Play store
             | [1] and non-privacy-centric defaults [2] aren't really
             | great to see either.
             | 
             | I'm not looking to start another android vs iOS debate
             | here. There's plenty of that online that I and others can
             | reference. I'm more interested in seeing how people are
             | finding alternatives or what privacy tradeoffs they are
             | willing to accept to avoid Apple's recent photo scanning
             | move.
             | 
             | [1]: https://arxiv.org/pdf/2010.10088.pdf (PDF WARNING)
             | 
             | [2]: https://www.cnbc.com/2020/01/16/how-to-stop-google-
             | maps-from....
        
           | c7DJTLrn wrote:
           | I switched to a Pixel 5 running GrapheneOS. It does a good
           | job of reducing the chatter to Google, but it would be an
           | outright lie to say that it's completely detached from
           | Google. I'm comfortable with the measures GrapheneOS has
           | taken and the way it sandboxes Play Services seems quite
           | clever.
           | 
           | The battery life is really good, not sure if that's just the
           | hardware or whether the debloated OS means less power is
           | consumed.
        
             | colordrops wrote:
             | You can also run without Google Play, but there are
             | tradeoffs you will make to do so.
        
         | meowtimemania wrote:
         | Was Apple pressured by some government to add the photo
         | scanning software? It seems like a weird feature to add given
         | Apple's messaging on privacy.
        
           | pcurve wrote:
           | Since they weren't scanning iCloud, there had to have been
           | a ton of pressure from the government, despite their claims
           | otherwise.
        
       | kerng wrote:
       | Strange that they do this delay so late... the damage is already
       | done, so why bother now?
       | 
       | Apple's leadership must be quite confused.
        
         | JohnFen wrote:
         | I wonder if the issue isn't that they've been living in their
         | own insular world for so long that they really thought they
         | were doing something that would pass muster in the real world.
         | 
         | There does seem to be an increasing tendency of FAANG
         | companies to do that.
        
       | spfzero wrote:
       | "Critically important features" Well,I guess if they can wait a
       | few months, were they really "critical"? I mean, maybe use a word
       | like "important"? Can Apple please stop using max adjective in
       | every situation? It just waters down the meaning of these words
       | to the point that we no longer have a word that means "critical"
       | anymore.
        
       | ekianjo wrote:
       | "Delays", not cancel.
        
       | nickjj wrote:
       | Does anyone have a success story of using a non-Android / non-iOS
       | smartphone as their main phone?
       | 
       | I've been on a flip phone and never owned a smartphone but a
       | recent role I'm taking requires having a smartphone for email /
       | Slack access.
       | 
       | I know it's a matter of picking between 2 lesser evils, but has
       | anyone avoided both and gone with an open Linux-based phone that
       | has a really nice user experience and decent battery life? All
       | of the research I've done leads to Linux / open phones just not
       | being there yet, even phones that cost $900 like the Purism
       | Librem 5. The Librem 5 looks like it came a long way in the last
       | year and I am really rooting for them to succeed, but is anyone
       | using a current one as of today or know of a viable alternative?
        
         | unixfg wrote:
         | I say don't change your setup for them; get a dedicated
         | device... consider a tablet. Having two devices is a pain, but
         | it'll improve your work-life balance when you have to actively
         | decide to carry it. You'll also never accidentally send
         | something personal to a coworker.
         | 
         | I use an older iPhone in airplane mode. I get over a week of
         | battery life. I forward calls to my personal if I'm away from
         | wifi and on-call, and they do not get my personal phone number.
        
           | nickjj wrote:
           | I was considering "upgrading" my existing phone instead of
           | carrying 2 devices. They said I can use a personal phone and
           | I wouldn't have to install anything company specific or have
           | random audits done. In return they would pay half the monthly
           | phone cost. Since I would be on-call a tablet might be too
           | bulky to carry around, the idea with the smartphone from
           | their POV would involve being able to respond by email or
           | Slack in case a downtime event happens. Probably wouldn't
           | technically need a phone number tho.
           | 
           | But yes, the above has flaws in that there's always a 0.01%
           | chance I send something to the wrong person if my contacts
           | co-exist in 1 app. I'm dumb when it comes to smart phone
           | capabilities, maybe there's a way to 100% silo off 2 sets of
           | contacts or profiles? It's something I'll need to research.
        
         | unethical_ban wrote:
         | I've considered trying, since I'm buying a second line as a
         | "distraction-free tool" for the daytime hours.
         | 
         | I think the easiest thing to do (aka the #1 method of de-
         | Googling) is to run a version of Android built from AOSP with
         | the Play Store removed and an alternate store like F-Droid
         | enabled.
         | 
         | If you search "AOSP phone" or "de-google android" you could get
         | places.
         | 
         | The other thing I have thought about is getting a Pine phone.
        
           | livueta wrote:
           | "AOSP Treble GSI" is another good top-of-the-rabbit-hole
           | search term.
           | 
           | tldr if you have an even somewhat recent unlock(ed)/(able)
           | android device, you can probably drop a very pure AOSP build
           | on it pretty easily.
        
         | CubsFan1060 wrote:
         | The best way to root for them to succeed would be to support
         | them.
        
         | ModernMech wrote:
         | I had a Microsoft Lumia and used it into the ground until the
         | screen died. It was a good phone, lacked some critical app
         | support, but otherwise was a very solid smartphone alternative
         | to iOS and Android. And you could even replace the battery! I
         | can't say that if MS were still in the game, things would be
         | markedly better, but at least there would be a decent, if
         | distant, 3rd place smartphone platform.
        
         | viktorcode wrote:
         | We all have, but in the past. It isn't feasible now.
         | 
         | I mean, you can buy a "safe" smartphone, but first, you can't
         | prove beyond reasonable doubt that it is actually safe and
         | private, and second, you attract more attention because the
         | same phones are being bought by criminals.
        
         | smoldesu wrote:
         | I've been heavily considering getting a PinePhone, when a new
         | iteration comes along. I only need my phone for calling and
         | texting, anything on top of that is just icing on the cake. If
         | I can run all my favorite GTK apps on it, then I'll consider it
         | a slam dunk.
        
         | stjohnswarts wrote:
         | Get a pixel 4a and put CalyxOS on it and you're good. Only use
         | it for work and keep your flip phone for personal stuff for
         | maximum security, I guess. Going from a flip phone to ANY
         | smartphone is going to be a massive upgrade in terms of
         | capability, even the Librem phone, so I don't really know
         | what you're going
         | on about there.
        
         | rdiddly wrote:
         | I'm still waiting for my Librem 5, ordered in early 2019.
         | Although to be fair, I chose to be in a later release group, in
         | order to benefit from design iterations, bugs being worked-out
         | etc.
        
       | twodave wrote:
       | Man, they really stepped in it, didn't they? Who would have
       | thought "safe for kids" and "respectful of my privacy" could be
       | enemies? But this is the position Apple has put itself in. There
       | isn't really a graceful way out of this.
        
       | viktorcode wrote:
       | The on-device CSAM scan must be canceled, not delayed. It is a
       | dangerous payload, a future backdoor if you will, bundled with
       | friendlier offline opt-in features and wrapped in "think of the
       | children" paper.
        
       | arrty88 wrote:
       | Must have taken an act of Congress. Imagine that. Powerful
       | people don't want their photos analyzed. Hmmm
        
         | adventured wrote:
         | That's not how it works.
         | 
         | One group of powerful people, better positioned to take
         | advantage, want to analyze the personal contents of other
         | powerful people. The more well positioned group always thinks
         | they can be spared from their own creations, that they're in
         | control of it. The politicians that passed the Patriot Act
         | didn't personally fear it, they viewed themselves as the
         | masters of it; it was to be applied to the other people.
         | 
         | For example, that's how you build an illegal dragnet spying
         | operation against an inbound, democratically elected President.
         | You do it courtesy of having the loyalty of the intelligence
         | community top to bottom. And you avoid that illegal activity
         | becoming a punished criminal deed by controlling at least half
         | or more of the government. One power group targeting another
         | power group, that's how Washington DC has always worked, and
         | will always work (as well as most governments throughout
         | history, it's endemic to politics/politicians).
        
       | leetrout wrote:
       | "child safety features" ... forced invasion of privacy more like
       | it
        
       | diebeforei485 wrote:
       | It's not clear if this means it's coming in iOS 15.2 or
       | something, or if it's being delayed to iOS 16.
       | 
       | If it's the former, it may be best to stick with iOS 14.
        
       | amelius wrote:
       | Apple still monitors executable files when you try to run them.
       | 
       | Who says they aren't using this as a fingerprint, or worse, as a
       | way to correlate your behavior with that of a CP offender or
       | other types of criminals?
        
       | ithkuil wrote:
       | It never ceases to astonish me how people keep reacting to this
       | while apparently not really caring to understand how the thing
       | actually works, how it fits into the existing workflows, and
       | thus what the alternatives are.
       | 
       | I mean, I'm not surprised this whole thing backfired and that
       | there was a strong uncanny valley reaction to the prospect of
       | having parts of the workflow happening on your device ("a
       | policeman in my pocket," I read somewhere).
       | 
       | I am surprised, though, that it seems impossible to resolve (or
       | at least make progress in framing) this issue with an honest and
       | nuanced conversation.
        
         | R0b0t1 wrote:
         | I don't want a conversation. I don't want it, and I don't want
         | to compromise.
         | 
         | Plenty of people think the same.
        
           | ithkuil wrote:
           | How can you be sure you know what it is you don't want if
           | you don't want to have a conversation?
        
             | gigel82 wrote:
             | I don't want any functionality capable of scanning photos
             | on my device for "illegal" content; I simply don't want
             | that code living on my device no matter how restricted its
             | (initially planned) use is.
        
               | ithkuil wrote:
                | Fair enough. Framed like that, it seems like a small
                | step from a dystopia.
               | 
               | But let's face it, we're feet deep in it already: unless
               | you control what code runs on your device you're never
               | safe from code that scans data on your device.
               | 
                | I'm not sure I really buy the slippery slope argument.
                | If in the future Apple wanted to be more intrusive,
                | they would just be more intrusive and scan your photos
                | on your device for real, not with a convoluted and
                | probabilistic fingerprint method.
               | 
                | What is the weak point? To get people accustomed to
                | being scanned? Aren't people already? Your content is
                | already scanned the moment it reaches the cloud.
               | 
               | What does this extra client-side scanning add to the
               | dystopia that we're not already experiencing?
        
               | gigel82 wrote:
               | It's the opening of Pandora's box. Scanning your device
               | is a huge paradigm shift; we all know the cloud is
               | "someone else's computer" and untrustworthy but thus far
               | one's device was sacred and no one dared touch it.
               | 
               | The floodgates are open now; politicians and other
               | "interested parties" have watched this unfold very
               | carefully and gleefully noted the majority didn't care as
               | much as everyone expected, so they'll definitely be
               | pushing for it now.
               | 
               | Imagine Windows Defender (an antimalware / antivirus
               | distributed with all versions of Windows and enabled by
               | default) starts scanning one's hard drive (and attached
               | external drives) for image files (it already scans
               | documents and executables and even uploads samples of
               | malicious binaries to Microsoft for analysis): how would
               | you / the world react?
        
               | ithkuil wrote:
               | > Scanning your device is a huge paradigm shift
               | 
                | But that's not what they want to do. They want to
                | perform a client-side fingerprint check on a subset of
                | images before they get uploaded to the cloud.
               | 
                | You can argue that they _could_ in the future turn
                | this into a scan of everything on the device. But you
                | can also argue that if that's what they want to do in
                | the future, they'll do it in the future. It's all about
                | trust. If you don't trust Apple not to push nefarious
                | code on your devices, stay away from Apple, and that's
                | true even before all this.
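                | 
                | To make "fingerprint" concrete, here's a toy sketch of
                | the idea in Python. This is my own illustration, not
                | Apple's NeuralHash: the names and list values are made
                | up, and a real perceptual hash maps visually similar
                | images to the same value, which this stand-in does not.
                | 
                |   # Hypothetical fingerprints of *known* images
                |   # (shipped to the device in blinded form).
                |   KNOWN_FINGERPRINTS = {0x9F3A2C11, 0x0B77E4D2}
                | 
                |   def toy_hash(image_bytes: bytes) -> int:
                |       # Stand-in for a real perceptual hash.
                |       return hash(image_bytes) & 0xFFFFFFFF
                | 
                |   def check_before_upload(image_bytes: bytes) -> bool:
                |       # Runs only on images queued for cloud upload.
                |       fp = toy_hash(image_bytes)
                |       return fp in KNOWN_FINGERPRINTS
                | 
                | The point is that the device compares against a fixed
                | list of known material; it is not running an open-ended
                | classifier over everything you have.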
        
           | onethought wrote:
           | It's really not a compromise. They found a way to scan your
           | photos for illegal stuff during upload without compromising
           | encryption. Meanwhile, all the major competitors just openly
           | inspect ALL your photos.
           | 
           | This is equivalent to getting upset with Signal because it
           | spell-checks your text as you write it: "you are scanning
           | what I write!!!!"
        
             | heavyset_go wrote:
             | > _without compromising encryption._
             | 
             | Apple holds the keys to all iCloud backups and regularly
             | uses them in order to serve customers' data in response to
             | data requests from the government.
             | 
             | Apple gives up US customer data for about 150,000
             | users/accounts a year[1].
             | 
             | [1] https://www.apple.com/legal/transparency/us.html
        
               | onethought wrote:
                | As do all the competitors... but they can also access
                | your photos for random data mining. No one has shown
                | that Apple does that. We know 100% that Google does...
        
         | gremloni wrote:
         | No conversation. Just fucking no. The kids are protected
         | enough.
        
         | LudwigNagasena wrote:
         | I don't want an honest and nuanced conversation on whether
         | the Holocaust was OK or whether rape is good. This is way past
         | any point for a nuanced conversation.
        
           | onethought wrote:
           | That's not at all close as an analogy.
           | 
           | So the alternative that everyone is happy with is companies
           | openly scanning all your images as much as they want... but
           | we are outraged at the company trying to improve the status
           | quo on privacy... and we don't want to talk about it. Just
           | stop improving!
        
             | LudwigNagasena wrote:
             | Just stop trying to encroach on my privacy. I don't care
              | whether it is done by a government or an international
              | company with a capitalization of over two and a half
              | trillion dollars.
        
               | onethought wrote:
               | Right... so Apple is trying to not encroach and is
               | punished for it. That's kind of my point.
        
               | LudwigNagasena wrote:
               | The whole point of the feature is to analyze images that
               | you possess.
        
               | onethought wrote:
                | No. The whole point is to avoid scanning all your
                | images on their servers and instead only scan when
                | there is a suspicion, i.e. a hash match.
                | 
                | None of your images get scanned at all unless you
                | upload a certain number of images whose hashes match
                | known CSAM. That's a superior position to "everything
                | you upload is scanned for visual features."
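                | 
                | Crudely, the threshold idea looks like this (my own
                | Python simplification; Apple's published figure was
                | around 30 matches, and the real design uses threshold
                | secret sharing so that below-threshold matches stay
                | cryptographically unreadable, which a plain counter
                | like this cannot actually enforce):
                | 
                |   THRESHOLD = 30  # illustrative
                | 
                |   def should_flag(upload_hashes, known_hashes,
                |                   threshold=THRESHOLD):
                |       # Count uploads whose hash matches the known
                |       # list; nothing is surfaced below the threshold.
                |       matches = sum(1 for h in upload_hashes
                |                     if h in known_hashes)
                |       return matches >= threshold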
        
           | [deleted]
        
         | tibyat wrote:
         | ok you're astonished. but you also haven't moved the
         | conversation forward at all. you're just talking down on the
         | field, "I understand everything about this topic and people's
         | reactions to this topic, and I am disappointed in all of you."
         | 
         | turns out it takes a lot more effort to actually make an
         | impact, and comment sections are typically full of low effort
         | gut reflexes. It doesn't help that every comment section starts
         | over, progress is harder this way.
        
         | heavyset_go wrote:
         | Apple already tried the "critics don't understand our system!"
         | deflection a while ago.
         | 
         | It's a deflection because critics' objections don't necessarily
         | hinge on the technical implementation details of the system,
         | they object to it based on its value proposition to the user,
         | and on principle.
         | 
         | Once we move past the core of those objections, then yes, some
         | critics _also_ object to the system's technical
         | implementation, and some of them are correct in their analysis
         | and some are less so.
        
           | ithkuil wrote:
           | > they object to it based on its value proposition to the
           | user, and on principle.
           | 
           | Not sure everybody is on the same page about what the value
           | proposition to the user is. That's intimately tied to "how
           | the system works," which is not merely an implementation
           | detail.
           | 
           | I'd like to talk about that. Most of the threads about this
           | topic I found are full of flames without much content.
           | Hopefully I've found somebody who can help me clarify what
           | it is that bothers people so much.
        
         | ece wrote:
         | I think it's hilarious that for as long as iOS has existed,
         | there have been people pointing out its pitfalls in nuanced
         | ways, and almost nobody listened. Now that everything they've
         | said about user freedoms has pretty much come to pass, instead
         | of accepting personal responsibility and admitting iOS's
         | flaws, people want more nuanced conversations. Utter
         | cowardice.
        
         | totetsu wrote:
         | Snappy phrases move culture. While we nerds were having our
         | long, honest, and nuanced conversation, the world moved on.
        
         | mixmastamyk wrote:
         | I absolutely do not want this on any devices I own.
         | 
         | I'm preparing to destroy any Apple devices in my possession
         | (thankfully old and due for replacement) when my pine64
         | arrives. Don't care about any details. How's that for nuanced?
        
         | nbzso wrote:
         | Honest and nuanced conversation? I have been dealing with
         | Apple's righteousness and arrogance for far too long. 20+
         | years, to be exact. And I ended up removing the macOS workflow
         | from my company because of this "privacy innovation" for the
         | children.
         | 
         | This corporation died a long time ago; people just like the
         | idea of the old Rebel Apple for creative minds and
         | individuals.
         | 
         | People actually understand how this will work. A lot of
         | educated and high-level professionals reacted with vigor.
         | 
         | Spinning this towards "people don't understand" is blasphemy
         | coming from serious self-delusion, no excuses given.
         | 
         | Resolving the issue is simple: break up the Big Tech
         | monopolies and create a climate for change. The AT&T breakup
         | comes to mind. It is time to wake up. User data is the petrol
         | of the world of tomorrow. Access to user data, and the
         | implementation of solutions based on it, must be regulated not
         | towards some three-letter agency's wet fantasy but towards
         | respecting individual privacy and existing laws and democratic
         | principles.
         | 
         | Even if Apple cancels this intrusion on user space, which I
         | heavily doubt, and implements CSAM scanning only on iCloud
         | servers (as everyone else does), the image of Apple as a
         | guardian of user privacy (false or not) is gone forever.
         | 
         | People are talking about alternatives and "moving off-cloud to
         | NAS," which is actually the right thing to do in this
         | situation.
         | 
         | Oh, and Apple will continue to be loved. But this love will be
         | more than bitter-sweet; it will become dangerous. Some of us
         | are old enough to know when to leave a sinking ship.
        
       | shrimpx wrote:
       | The most botched and nonsensical feature announcement ever.
       | 
       | > Here's an extremely sophisticated mechanism that has insane
       | negative potential ramifications for our users and our brand, but
       | that's ok because it's all in the name of catching pedos.
       | 
       | > PS: And pedos can turn off this mechanism with one toggle.
       | 
       | Since this sounds too idiotic to be real, people conclude it must
       | be evil, by Occam's Razor.
       | 
       | I have a deeper fear: That it's actually that idiotic.
        
       | foxfluff wrote:
       | Critically important? Really?
       | 
       | A system that is trivial to circumvent and which only catches
       | people sharing pics that have already been added to a database is
       | not going to move the needle on actual physical abuse of
       | children.
        
         | Joeri wrote:
         | You would think that, but Facebook detects 20 million shares
         | of this material per year. Presumably at least part of the
         | group that shares the known material is also abusing children
         | and creating new material, so this does look like an effective
         | detection method for finding some of the people who abuse
         | children.
         | 
         | https://www.missingkids.org/content/dam/missingkids/gethelp/...
        
           | jrockway wrote:
           | I'm not convinced that's a good argument. Shouldn't every bag
           | of coffee you buy include a camera that reports any illegal
           | activity to the government? A lot of criminals drink coffee,
           | after all!
        
             | c7DJTLrn wrote:
             | We must mandate speed monitoring devices for every car
             | sold. Each time the vehicle speed exceeds the speed limit
             | we will deduct 100 social credit points from the owner's
             | balance.
             | 
             | Hundreds of thousands of people die each year due to filthy
             | criminals driving too fast. Introducing this measure will
             | save lives.
             | 
             | If you are uncomfortable with this measure, you drive too
             | fast.
        
               | jrockway wrote:
               | Very good point. Actually, the next version of iOS should
               | include a feature to make sure that you don't roll
               | through a stop sign at 1MPH at 3 o'clock in the morning.
               | STOP means stop, not kind of look both ways and carefully
                | evaluate the situation! An Authorized Phone Receptacle
               | built into your car will ensure that the phone's camera
               | is facing forward so our Machine Learning Artificial
               | Intelligence can detect stop signs, and the
               | accelerometers will detect whether or not you stopped
               | completely. If you didn't, your device will do some
               | bullshit with bitcoins or something eventually leading to
               | it sending a report to the Apple Safe Driving
               | Headquarters ("ASDHQ") where totally trained technicians
               | that definitely have health insurance and get paid more
               | than $1/hour will check the data to see if it's a false
               | positive. If it's not a false positive, and BTW, from
               | exactly zero hours of real world testing we have
               | determined that the false positive rate is less than 1 in
               | 1 trillion, we'll report you to our government partners
               | who will take care of your reeducation from there.
               | Driving is a privilege, not a right!!!
               | 
               | Over ONE HUNDRED people die per year from people not
               | coming to a complete stop at stop signs, and at Apple, we
               | care!!!!!! We'll never let the screeching voices of the
               | minority stop us from invading your privacy for no good
               | reason. We like the screeching, it makes us feel
               | important and relevant. As the inventors of a small
               | computer with a battery in it, we consider ourselves
               | better than God.
        
               | alexashka wrote:
               | You joke but if everyone had a tag, the speed limit could
               | adjust itself in real time, enabling you to go faster
               | when it is safe to do so and go slower when it isn't.
               | 
               | In other words, we could improve the driving experience
               | _and_ save lives with a little bit of work if we had
               | folks with a working brain in government and elsewhere.
        
               | jrockway wrote:
               | I think this exists already. It's just an LED sign where
               | there is normally a speed limit sign, and it adjusts up
               | and down depending on conditions.
        
           | foxfluff wrote:
           | I have read Facebook's report on this. It's not 20 million
           | shares of kids getting raped by adults.
           | 
           | https://about.fb.com/news/2021/02/preventing-child-
           | exploitat...
           | 
           | "We found that more than 90% of this content was the same as
           | or visually similar to previously reported content. And
           | copies of just six videos were responsible for more than half
           | of the child exploitative content we reported in that time
           | period."
           | 
           | "we evaluated 150 accounts that we reported to NCMEC for
           | uploading child exploitative content in July and August of
           | 2020 and January 2021, and we estimate that more than 75% of
           | these people did not exhibit malicious intent (i.e. did not
           | intend to harm a child). Instead, they appeared to share for
           | other reasons, such as outrage or in poor humor"
           | 
           | So a lot of this is memetic spreading, not pedos and child
           | abusers sharing their stash of porn. And people don't react
           | to child porn by spreading it like a meme on Facebook, so
           | what are _these_ pictures that get shared a lot? What's
           | happening is people find a funny/hilarious/outrageous
           | picture and share that. Funny moments might happen when kids
           | play with pets, for example.
           | 
           | The other part is consenting teens sexting their own photos.
           | And then there's some teens (e.g. 17-year-olds, which by the
           | way is old enough to consent in some countries) getting
           | accidentally shared along with adult porn by people who don't
           | know the real age.
           | 
           | https://research.fb.com/blog/2021/02/understanding-the-
           | inten...
           | 
           | "Unintentional Offenders: This is a broad category of people
           | who may not mean to cause harm to the child depicted in the
           | CSAM share but are sharing out of humor, outrage, or
           | ignorance.
           | 
           | Example: User shares a CSAM meme of a child's genitals being
           | bitten by an animal because they think it's funny.
           | 
           | Minor Non-Exploitative Users: Children who are engaging in
           | developmentally normative behaviour, that while technically
           | illegal or against policy, is not inherently exploitative,
           | but does contain risk.
           | 
           | Example: Two 16 year olds sending sexual imagery to each
           | other. They know each other from school and are currently in
           | a relationship.
           | 
           | Situational "Risky" Offenders: Individuals who habitually
           | consume and share adult sexual content, and who come into
           | contact with and share CSAM as part of this behaviour,
           | potentially without awareness of the age of subjects in the
           | imagery they have received or shared.
           | 
           | Example: A user received CSAM that depicts a 17 year old,
           | they are unaware that the content is CSAM. They reshare it in
           | a group where people are sharing adult sexual content."
           | 
           | So there's reason to think that the vast majority of this
           | stuff isn't actually child porn in the sense that people
           | think about it. It might be inappropriate to post, it might
           | be technically illegal, but it's not what you think. And if
           | it's not child porn, you can't make the case that it's
           | creating a market for child abuse. By reporting it, you're
           | not catching child rapists.
           | 
           | I don't have a reference handy, but I recall reading about
           | the actual child porn that inevitably does get shared on
           | every platform; much of it is posted by bots over VPNs or
           | Tor. So
           | its volume isn't representative of the amount of child
           | abusers on the network, and reporting these accounts is not
           | likely to lead to anything.
           | 
           | Also: In May 2019 the UK's Independent Inquiry into Child
           | Sexual Abuse heard that reports received by the National
           | Crime Agency from the United States hotline NCMEC included
           | large numbers of non-actionable images including cartoons,
           | along with personally identifiable information of those
           | responsible for uploading them. According to Swiss police,
           | _up to 90% of the reports received from NCMEC relate to
           | innocent images_. Source:
           | https://www.article19.org/resources/inhope-members-
           | reporting...
           | 
           | Out of those remaining 10%, how much leads to convictions?
           | Very little. Sorry can't dig up a source right now. Point is:
           | millions of reports lead to mostly nothing. Meanwhile,
           | children continue to be abused for real, and the vast
           | majority of those who want to keep producing, sharing, and
           | storing such imagery surely have heard the news and will find
           | a different way to do it.
           | 
           | Platform operators are incentivized to report everything, if
           | they're playing the reporting game. Tick box "nudity or
           | sexual conduct", tick box "contains minors"? Report it.
           | 
           | It doesn't help the discussion at all that everything
           | involving nudity and minors (even _cartoons_) gets lumped
           | together as "CSAM" with actual child porn produced by adults
           | who physically abuse kids.
        
             | pvarangot wrote:
             | > Very little. Sorry can't dig up a source right now.
             | 
              | I don't think the NCMEC shares numbers about how many of
              | their reports result in actions by law enforcement
              | agencies. They probably also don't really know; it's kind
              | of like a dragnet.
              | 
              | Also, under the current "I'll know it when I see it" CSAM
              | doctrine in US courts, cartoons can actually be illegal,
              | and cartoons depicting the abuse of children are usually
              | banned on most big media-sharing platforms in the US, and
              | most companies in the US won't host you or let you serve
              | ads if you have that material. So yeah, it's not only
              | Muslim totalitarians that are OK with banning cartoons
              | and punishing people for drawing or sharing them; Uncle
              | Sam may also send you to jail and deprive you of your
              | rights if you draw the wrong thing.
        
               | noptd wrote:
               | >I don't think the NCMEC shares numbers about how many of
               | their reports result in actions by law enforcement
               | agencies.
               | 
                | I'm not surprised. Quantifiable accountability can be
                | extremely problematic if you aren't actually making a
                | meaningful impact on the problem you are claiming to
                | help solve.
        
         | tzs wrote:
         | You (and quite a few others) seem to be making some implicit
         | assumptions that are not at all obvious to me:
         | 
         | 1. There is not much overlap between people sharing pictures
         | from the database and people sharing new child porn that is not
         | in it.
         | 
         | 2. People sharing child porn are a lot smarter or more
         | security-conscious than the average person.
        
         | madeofpalk wrote:
         | Are you objecting to the general concept of CSAM scanning,
         | something every online storage provider does?
         | 
         | I think it's important to note that Apple is late to this, and
         | they're just playing catch-up in implementing this scanning.
         | The only difference between Apple and others is that they do it
         | completely (or partially) on-device rather than all on their
         | servers. And they explicitly told people exactly what and how
         | they're doing it all.
         | 
         | It's kind of a funny situation where, in Apple being _more_
         | transparent than any other implementer of this same process,
         | they've gotten themselves into more strife.
        
           | foxfluff wrote:
           | > Are you objecting to the general concept of CSAM scanning,
           | something every online storage provider does?
           | 
           | Not really. I'm just saying it's laughable to think that
           | CSAM scanning in its current form is critically important.
           | Maybe it's critically important for feel-good PR but not for
           | actually preventing child abuse. It's almost like saying
           | scanning for catalogued images of violence stops violence.
           | If only it were that easy, damn, the world would be a good
           | place.
           | 
           | Now as to what online providers should or shouldn't do, I
           | can't say. But a part of me hopes that they continue the
           | march towards egregious censorship and privacy violations so
           | that people would eventually WAKE THE FUCK UP and realize
           | that there's no way to have freedom and privacy when you're
           | handing your life to corporations and proprietary software. I
           | really do wish for a world that is de-facto free software and
           | fully controlled by the user.
           | 
           | As for iCloud, I have no skin in the game. I've never used an
           | Apple product.
        
             | short_sells_poo wrote:
             | Unfortunately this clearly won't happen, because of the
             | staggering number of people _even on HN_ who see nothing
             | wrong with having on-device scanners reporting back to HQ.
             | I'm seriously baffled by the number of apologists and
             | people who see nothing wrong with this or who flat out
             | refuse to admit that this can lead to abuse.
             | 
             | Once the on-device scanning Pandora's box is open, it's
             | trivial for governments to request more stuff to be added
             | to the databases and Apple can't claim the defense of "we
             | have no such ability currently" anymore.
             | 
             | If I was paranoid I'd wonder how much astroturfing is going
             | on here.
        
               | xpe wrote:
               | > Unfortunately this clearly won't happen because of the
               | staggering amount of people even on HN who see nothing
               | wrong with having on-device scanners reporting back to
               | HQ.
               | 
               | This is a mischaracterization of many of the arguments.
               | 
               | short_sells: My goal here is a meta-goal: I am not trying
               | to change your mind on this issue; rather, I want you to
               | acknowledge the strongest rational forms of disagreement.
               | 
               | This is a complex issue. It is non-obvious how to trade
               | off the goals of protecting kids, protecting privacy,
               | minimizing surveillance, catching predators, and dealing
               | with the effects of false positives.
               | 
               | It is too simplistic (and self-defeating) to think that,
               | just because people disagree with a particular conclusion,
               | they are not allies with many of your goals.
        
               | short_sells_poo wrote:
               | Yes, but we have seen in the past that privacy once lost
               | is nigh impossible to regain, and it is also obvious that
               | the scanning Apple is proposing is trivial to bypass.
               | 
               | So what is it actually trying to accomplish?
               | 
               | I really struggle to believe that they are trying to
               | protect kids (out of the goodness of their hearts).
               | 
               | The only explanation I can think of is that this is some
               | attempt at appeasing government agencies.
        
               | xpe wrote:
               | > So what is it actually trying to accomplish?
               | 
               | This form of question, as written, is unnecessarily
               | limiting ...
               | 
               | (a) there doesn't have to be _one_ thing that Apple was
               | trying to accomplish
               | 
               | (b) there doesn't have to be _one_ motivation
               | 
               | ... so I'm going to respond to the spirit of the
               | question with a set of explanations, all of which may be
               | true (to some degree) at the same time.
               | 
               | - parents are fearful of their kids' online activities;
               | 
               | - parents are open to trying new ways to give their kids
               | freedom with some guardrails;
               | 
               | - yes, many people at Apple do want to protect kids out
               | of the goodness of their hearts. This fundamental
               | instinct is widely shared, particularly among parents.
               | 
               | - putting the 'why' aside, many customers perceive value
               | and will pay for it;
               | 
               | - Apple executives are mostly profit-seeking (with
               | certain constraints such as: mental models, brand
               | constraints, regulations);
               | 
               | - shareholders seek profits and generally have less
               | loyalty to any particular company's 'values' -- meaning
               | they will 'shop around' for the best performing
               | companies;
               | 
               | - as a group, shareholders see mostly upside and little
               | downside -- they don't perceive significant direct harm
               | from these changes (at least, not until this became a
               | public relations issue);
               | 
               | - generally, corporations benefit from playing nice with
               | the U.S. government;
               | 
               | - Apple, in particular, has quite publicly pushed back on
               | law enforcement's calls for decryption;
               | 
               | - in particular, with heightened scrutiny of the large
               | tech companies, olive branches are particularly useful;
               | 
               | - some at Apple may prefer to lead with a proactive
               | solution rather than wait for imposed regulations;
               | 
               | - some at Apple see this as a proactive branding
               | opportunity;
               | 
               | - some Apple engineers are at the top of their field
               | regarding encryption, security, etc., and may have deemed
               | their offering the best practical option available;
               | 
               | - some at Apple certainly understand the risks but assess
               | the balance of false positives and false negatives
               | differently than you do;
               | 
               | My hope is to make it a bit easier to recognize the
               | complexity here. Though it may be true that organizations
               | act as one entity, it is not true that they have singular
               | intent. Attempts to claim a singular intent or goal are
               | subjective interpretations.
               | 
               | Note: the list above is presented sequentially, but I am
               | not claiming any causal ordering. They would be better
               | presented as a network/graph connected by topics and
               | relationships.
        
               | xpe wrote:
               | Below, I'm going to push back on what I see as
               | overconfident and overgeneralized claims. Both of these
               | can inhibit understanding and listening.
               | 
               | > but we have seen in the past that privacy once lost is
               | nigh impossible to regain
               | 
               | Yes, this is a key argument in the mix.
               | 
               | Some follow up questions:
               | 
               | 1. As I understand it, here on HN, we are an
               | international audience of various ages. With that in
               | mind, I don't know your contextual experience. Are you
               | scoping this (a) to the internet era (roughly 2000 to
               | present)? (b) to particular countries?
               | 
               | 2. The statement quoted above is stated as if it is a
               | fact, but I hope you realize it is actually a prediction.
               | What is the historical context for this prediction? How
               | far out into the future are you predicting?
               | 
               | 3. Can you pin down your prediction more precisely? What
               | does "nigh" mean? (There is a lot of variation in what
               | "approximately" means to different people. Often the
               | 'exceptions' are quite informative.)
               | 
               | 4. The argument, as written, is quite general, which
               | makes it hard to discuss. Whose privacy and in what
               | context? Chinese citizens searching the internet?
               | Journalists doing investigative reporting? Americans
               | shopping in surveilled supermarkets? (Think of this as an
               | opportunity to explain.)
               | 
               | 5. Do you mean all of the above? If you do, yes, people
               | say that online privacy has eroded in many senses. At the
               | same time, the tools for encryption have become more
               | powerful, understood, and used. My point: if you make a
               | very general statement, it is only fair if you cover the
               | full range here.
               | 
               | In summary, with the above questions, I want to both
               | better understand you -and- push back too. Unfortunately,
               | I don't find the discussion chain above (the ~3
               | ancestors) particularly persuasive. I say this even
               | though I agree with some aspects of it.
               | 
               | So you know where I'm coming from: in almost all
               | situations, I've found it more effective to understand,
               | discuss, explain, and persuade rather than to 'write off'
               | a group of people because you don't really understand
               | them.
               | 
               | P.S. I've addressed your other points in a sibling
               | comment.
        
               | xpe wrote:
               | > If I was paranoid I'd wonder how much astroturfing is
               | going on here.
               | 
               | Ah, good old apophasis:
               | 
               | > a rhetorical device wherein the speaker or writer
               | brings up a subject by either denying it, or denying that
               | it should be brought up. -
               | https://en.wikipedia.org/wiki/Apophasis
               | 
               | In relation to
               | https://news.ycombinator.com/newsguidelines.html :
               | 
               | > Please don't post insinuations about astroturfing,
               | shilling, brigading, foreign agents and the like. It
               | degrades discussion and is usually mistaken.
        
               | noptd wrote:
               | Ironic considering you posted this replying to a comment
               | that you already replied to ten hours earlier.
               | 
               | If one was paranoid, they might wonder if you forgot to
               | switch accounts /s
        
               | xpe wrote:
               | > Ironic considering you posted this replying to a
               | comment that you already replied to ten hours earlier.
               | 
               | How so?
        
           | washadjeffmad wrote:
           | They're pretty obviously not.
           | 
           | CSAM scanning stops common proliferation of already
           | identified materials and can help keep those from casually
           | entering new markets. It does not protect children from being
           | newly exploited by people using these services unless the
           | services are also doing things they claim not to be doing.
           | 
           | In that case, Apple's claim of critical importance, if we
           | interpret it as any more than rhetoric, doesn't mean what
           | they imply it to mean.
           | 
           | Edit: I replied before you edited your comment. Leaving this
           | one as is.
        
             | concinds wrote:
             | These systems rely on the premise that catching p*dos and
             | putting them out of circulation _does_ protect children,
             | since many children are abused by relatives and such. I
             | think it's a reasonable assumption!
        
               | only_as_i_fall wrote:
               | You'd have to actually have a reasonable chance of
               | catching pedophiles though. If the system is as trivial
               | to bypass as turning off photo sync it isn't a very good
               | justification.
        
             | pvarangot wrote:
             | > It does not protect children from being newly exploited
             | by people using these services unless the services are also
             | doing things they claim not to be doing.
             | 
             | While I only know about this through anecdotes people
             | share, I understand these systems generate leads so that
             | agents (usually federal) can infiltrate groups and catch
             | people uploading new material, thus preventing them from
             | further victimizing minors.
        
               | washadjeffmad wrote:
               | That smacks of those harebrained terrorist entrapment
               | plots where the federal and undercover agents are the
               | only members of groups they've "infiltrated".
               | 
               | Scaling up detection also makes it more tempting for bad
               | actors to seed more content to fabricate the appearance
               | of an epidemic. We already have inaccurate gunshot
               | detection systems, field drug tests, and expert bite mark
               | analysis being used to convict people. Would a jury even
               | be able to examine the evidence if it involved CSAM?
        
               | pvarangot wrote:
               | Yeah it's pretty bad where this is headed. It also
               | weaponizes digital images that are very easy to find.
               | 
               | Not sure if you are honestly asking about the jury thing
               | but judges do have the legal superpower to see illegal
               | images and they will usually instruct the jury to vote
               | based on what they saw.
               | 
               | The US federal prison system being one of the biggest
               | bullies on earth, though, these cases rarely go to trial.
        
               | [deleted]
        
               | heavyset_go wrote:
               | > _Would a jury even be able to examine the evidence if
               | it involved CSAM?_
               | 
               | The prosecutor will hire expert witnesses who will
               | explain that their extensive training and education has
               | led them to believe that the evidence is certainly
               | illegal material.
        
           | newsbinator wrote:
           | Other implementers don't scan on-device.
           | 
           | It's not only a matter of Apple's openness about it. It's the
           | fact of what they're announcing.
        
           | vorpalhex wrote:
           | So the only way to opt out of an intrusive, sloppy and flawed
           | algorithm now is based on you hoping Apple sticks to their
           | word? The same company that gives the CCP unfettered access
           | to Chinese iCloud data?
        
           | yyyk wrote:
           | >It's kind of a funny situation where in Apple being more
           | transparent than any other implementer of this same process.
           | 
           | It isn't the same process, it's an on-device scanning process
           | and Apple is the first to implement this. Had Apple said they
           | were scanning iCloud directly, nobody would have batted an
           | eye (I, for one, assumed they already did).
        
             | [deleted]
        
             | alibert wrote:
             | I would not say it is "scanning", as the device doesn't
             | know the final result (match or not). If I had to summarize
             | very broadly, to my understanding: it will be like doing a
             | "loose" SHA on every photo in the Photos app and sending
             | these to Apple (if iCloud is enabled). Server-side, Apple
             | checks whether the hashes match or not. Isn't this like
             | "scanning in iCloud", but without Apple needing to have
             | decrypted photos on their servers? (A toy sketch of that
             | split follows.)
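             | 
             | A minimal toy sketch of that split, using a naive "average
             | hash" as a stand-in for the real perceptual hash. This is
             | not Apple's actual NeuralHash or its private-set-
             | intersection protocol (the real design blinds the hashes
             | so neither side sees raw match results), and every name
             | below is made up:
             | 
             |     # toy device-side "loose" hash + server-side matching
             |     from PIL import Image  # assumes Pillow is installed
             | 
             |     def loose_hash(path):
             |         # 8x8 grayscale "average hash": one bit per pixel,
             |         # set when that pixel is brighter than the mean
             |         img = Image.open(path).convert("L").resize((8, 8))
             |         px = list(img.getdata())
             |         mean = sum(px) / len(px)
             |         bits = 0
             |         for p in px:
             |             bits = (bits << 1) | (p > mean)
             |         return bits
             | 
             |     # device side: hash the photos queued for upload
             |     def device_prepare(paths):
             |         return [(p, loose_hash(p)) for p in paths]
             | 
             |     # server side: match against a catalogued hash set
             |     KNOWN_HASHES = set()  # hypothetical database
             | 
             |     def server_check(uploads):
             |         return [p for p, h in uploads if h in KNOWN_HASHES]
             | 
             | Unlike a real SHA, small edits to an image barely change
             | such a hash, which is what makes near-duplicate matching
             | against a catalogue possible.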
        
               | yyyk wrote:
               | >I would not say it is "scanning" as the device doesn't
               | know the final result (match or not)
               | 
               | I don't think this is a relevant distinction. If the
               | device had known the result and just sent Y/N to Apple,
               | what would change in your argument? Nothing. Your last
               | sentence would be just as arguable.
               | 
               | >Isn't this like "scanning in iCloud" but without Apple
               | needing to have a decrypted photos in their servers?
               | 
               | Note that Apple right now already has the decrypted
               | photos since they have the decryption key. There's no
               | evidence they are even considering E2EE right now, and
               | since there are other legal requirements for scanning
               | (e.g. terrorist material) I'm not sure this allows them
               | to implement E2EE.
               | 
               | And I don't see how the client-side feature can remain as
               | is. There are obvious cases that would be caught by the
               | typical server-side scanning and won't be caught here, so
               | once the door was opened, the government will pressure.
               | For example:
               | 
               | * What happens when the NCMEC database updates? It could
               | be that the phone rescans all your images. Or that apple
               | keeps the hash and rescans that. Note that the second
               | case is identical to some server-side scanning
               | implementations.
               | 
               | * What happens when you upload something to iCloud,
               | delete it from your phone and then the NCMEC database
               | updates?
               | 
               | If Apple keeps the hash, it's just server-side again. If
               | Apple doesn't keep it but uses client-side scanning, the
               | phone has to keep the hashes, so you can't ever truly
               | delete an image you've taken. If there's no rescan, the
               | bad guys get to keep CSAM on iCloud so long as they
               | passed the original scanning with the older NCMEC
               | database - surely the government wouldn't accept that.
               | 
               | (I considered scanning _on download_, but I don't know if
               | Apple/the government would like this compromise, since
               | with that approach CSAM can remain on iCloud undetected
               | so long as it's not downloaded, anyway it's not in the
               | original papers).
               | 
               | Basically, either they scan on client-side more than they
               | let on or we end up in a world with both server-side and
               | client-side scanning. The latter is arguably the worst of
               | both worlds; the former has implications which need to be
               | looked at.
        
               | hfsh wrote:
               | > I would not say it is "scanning" as the device doesn't
               | know the final result (match or not).
               | 
               | I would argue that 'scanning' is the process of
               | collecting data, not necessarily of interpreting it.
        
               | mikehearn wrote:
               | Isn't the "interpreting" step the one that matters?
               | 
               | Apple takes a photo, runs it through some on-device
               | transformations to create an encrypted safety voucher,
               | then it gets "interpreted" once it's uploaded to the
               | cloud and Apple attempts to decrypt it using their secret
               | key.
               | 
               | Google uploads a raw photo, which itself is essentially a
               | meaningless value in the context of identifying CSAM, and
               | Google "interprets" it on the server by hashing it and
               | comparing it against some database.
               | 
               | In both cases, the values that are uploaded by the
               | respective companies' devices don't mean anything, in the
               | context of CSAM identification, until they are
               | interpreted on the server.
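               | 
               | A toy sketch of that "opaque until interpreted" property,
               | under the simplifying assumption that the voucher key is
               | derived from the image hash (the real construction is
               | more involved, and every name below is made up):
               | 
               |     import hashlib
               | 
               |     def stream(key, n):
               |         # expand key into n pseudorandom bytes
               |         # (toy stream cipher, not for real use)
               |         out = b""
               |         i = 0
               |         while len(out) < n:
               |             out += hashlib.sha256(
               |                 key + i.to_bytes(4, "big")).digest()
               |             i += 1
               |         return out[:n]
               | 
               |     def xor(a, b):
               |         return bytes(x ^ y for x, y in zip(a, b))
               | 
               |     # device side: encrypt a voucher under a key
               |     # derived from the photo's hash
               |     def make_voucher(img_hash, payload):
               |         key = hashlib.sha256(b"k" + img_hash).digest()
               |         pt = b"VOUCHER:" + payload
               |         return xor(pt, stream(key, len(pt)))
               | 
               |     # server side: a voucher decrypts to something
               |     # recognizable only if its hash is in the database
               |     def try_open(voucher, database):
               |         for h in database:
               |             key = hashlib.sha256(b"k" + h).digest()
               |             pt = xor(voucher, stream(key, len(voucher)))
               |             if pt.startswith(b"VOUCHER:"):
               |                 return pt
               |         return None  # otherwise it stays opaque noise
               | 
               | Until try_open() succeeds server-side, the uploaded bytes
               | carry no readable information, which is the sense in
               | which the "interpreting" step happens on the server in
               | both designs.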
        
             | strogonoff wrote:
             | It's just the continuation of the irony: Apple is also the
             | only really popular consumer cloud provider with a proper
             | e2e implementation, so they _can't_ scan photos in iCloud.
             | It's why they had to implement this process in the first
             | place.
             | 
             | Some of Apple's management must ponder that _if only_ they
             | didn't go to such lengths in denying themselves access to
             | user data, they'd have it so much easier--given no other
             | takers of such a challenge among mid- to high-end device
             | manufacturers, we all would probably have settled for e2e
             | just never being a feature of commonly available cloud
             | services. I wonder how successful the privacy-focused
             | promotion has been for Apple; they seemed pretty OK at
             | making people want their devices for all the other reasons.
        
           | djbebs wrote:
           | Yes, I am. I object to any and all forms of mass surveillance
           | and spying.
           | 
           | The fact that cloud providers aren't building their systems
           | in such a way that they are incapable of conducting such
           | spying is cause for concern. The fact that they are
           | conducting such spying is outrageous.
        
           | Lamad123 wrote:
           | You just said it!! Google is scanning its own servers... This
           | company is scanning something that is supposed to be yours!
        
             | concinds wrote:
             | I fail to see any difference, when your data is as much
             | "yours" whether cloud backup is turned on or off. Google
             | and others scan data stored in Drive (and other cloud
             | storage) not just for CP, but for copyright infringement,
             | etc; even files _not shared with anyone_. Someone here just
             | yesterday complained that Google in 2014 kept deleting
             | personal files of ripped DVDs from their Drive. Given that
             | many people use GDrive /OneDrive/iCloud as their main
             | storage, where most of their PC files are stored in the
             | cloud, I fail to see any logic in making a cloud vs
             | personal device distinction.
        
               | yunohn wrote:
               | Would you argue there's no difference between the police
               | searching a safety deposit box (which is a bad example,
               | since it's actually quite private) versus constantly
               | going through and indexing the contents of your home,
               | trying to find illegal objects?
        
               | stjohnswarts wrote:
               | When you hand it off to google it's now on their servers
               | and as their user agreements point out your shit will be
               | scanned and made available to law enforcement upon
               | request and proper authority (whether that's a warrant or
               | just Bill from computer forensics asking for it, I don't
               | really know). When it's on your phone it is not being
               | scanned. It's just like having to get a warrant to rifle
               | through your house for evidence. The same should be
               | required for personal devices. It being digital doesn't
               | mean it should be any less safe than if it was on paper.
        
             | Terretta wrote:
             | Have you considered the number of client-side SDKs
             | surreptitiously doing things on "your" mobile device?
             | 
             | One of the most common is "scanning" your location
             | constantly, feeding your daily routines to marketing data
             | aggregators, which is arguably more invasive of the average
             | person's privacy than CSAM flagging.
             | 
             | I've posted in the past a short-list of offending
             | SDKs frequently phoning home from across multiple apps from
             | multiple developers. Since weather apps sending your
             | location surprised people, I thought this problem would get
             | more traction.
             | 
             | This Apple thing, where the iCloud file-upload client
             | SDK does a thing on upload, is an instance of this class
             | of problem.
             | 
             | It's not an Apple thing, it's a client SDK thing, and the
             | problem of trusting actions on "your" end of a protocol or
             | cloud service is not a solved thing.
        
             | vultour wrote:
             | Right before it's uploaded to their servers...
        
           | stjohnswarts wrote:
           | No other company is scanning for CSAM on your phone; if you
           | have proof of that, please bring it forward. I would love to
           | see evidence of that on my Microsoft, Linux, and Android
           | devices. We'll be waiting. Everyone and their uncle knows
           | everything in the cloud is scanned unless it's encrypted
           | before being uploaded.
        
             | jodrellblank wrote:
             | > " _Everyone and their uncle knows everything in the cloud
             | is scanned unless it 's encrypted before being uploaded._"
             | 
             | And that's what Apple has announced. Comparing that to:
             | 
             | > " _No other company is scanning for CSAM on your phone_ "
             | 
             | Suggests that you think the Apple system is scanning more
             | than what you upload to the cloud, which it isn't. Or, less
             | charitably, suggests you /want people to think it is/. Your
             | other comment here
             | https://news.ycombinator.com/item?id=28405476 suggests the
             | same. And your comment here
             | https://news.ycombinator.com/item?id=28405502 the same
             | again.
        
             | kmlx wrote:
             | > No other company is scanning for CSAM on your phone
             | 
             | no, they're just doing it in the cloud, after everything
             | was uploaded automatically :)
        
           | fsflover wrote:
           | It doesn't matter what the purpose of this move is. It's not
           | "catch-up" at all. The scanning will happen on the devices
           | using the users' resources, _before_ uploading. See also:
           | https://news.ycombinator.com/item?id=28309202.
        
             | judge2020 wrote:
             | It's catch-up in that Apple submits less than a thousand
             | CSAM reports to NCMEC annually:
             | 
             | > According to NCMEC, I submitted 608 reports to NCMEC in
             | 2019, and 523 reports in 2020. In those same years, Apple
             | submitted 205 and 265 reports (respectively). It isn't that
             | Apple doesn't receive more picture than my service, or that
             | they don't have more CP than I receive. Rather, it's that
             | they don't seem to notice and therefore, don't report.
             | 
             | > In 2020, FotoForensics received 931,466 pictures and
             | submitted 523 reports to NCMEC; that's 0.056%. During the
             | same year, Facebook submitted 20,307,216 reports to NCMEC.
             | 
             | https://www.hackerfactor.com/blog/index.php?/archives/929-O
             | n...
        
             | simondotau wrote:
             | If I were a US citizen, and I accepted that scanning cloud-
             | stored photos for CSAM was necessary, I would greatly
             | prefer Apple's approach, because any such search coerced
             | by the government is protected by the Fifth Amendment.
             | Whereas if the search occurs in the cloud, the government
             | can force Apple or Google to search for whatever they
             | want, using whichever algorithm they want.
        
             | madeofpalk wrote:
           | I understand people's objection to Apple's on-device
             | implementation of this scanning, but that didn't seem to be
             | parent's objection. I was just trying to clarify.
        
           | jpxw wrote:
           | Also, people are annoyed about the fact that the scanning is
           | done client-side, which is ironic because Apple are doing
           | that in order to increase the security/privacy of the system.
           | I can see why people are uncomfortable about client-side
           | scanning, but there's still some amusing irony in it.
        
             | darksaints wrote:
             | Nothing about this "feature" increases security or privacy.
        
               | helen___keller wrote:
               | If you compare it against the industry standard of in-
               | cloud scanning, the privacy comes from the fact that an
               | on-device model results in 0 Apple servers ever examining
               | your photos (unless you have 30 CSAM flags on your iCloud
               | account), whereas an on-server model results in Apple
               | servers examining every single photo you upload.
               | 
               | You can argue it's better to have neither type of
               | scanning, but if Apple considers it a critical business
               | issue that their cloud is the preferred hosting site of
               | CSAM[0], then they presumably have to pick one or the
               | other.
               | 
               | You can also argue that on-device seems creepier or more
               | invasive, even if it doesn't result in Apple examining
               | your photos, which is a reasonable reaction. It certainly
               | breaks the illusion that it's "your" device.
               | 
               | But it's a fact that the on-device model, as described,
               | results in fewer prying eyes on your iCloud photos than
               | the on-server model.
               | 
               | [0] I'm not claiming this is the case, just saying for
               | example
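               | 
               | The 30-flag threshold above is enforced cryptographically:
               | Apple's technical summary describes a threshold secret-
               | sharing scheme, so below the threshold their servers
               | cannot decrypt any voucher. A generic toy Shamir sketch
               | (not Apple's code; every name and parameter is made up):
               | 
               |     import secrets
               | 
               |     PRIME = 2**127 - 1  # toy field modulus (a prime)
               | 
               |     def make_shares(secret, threshold, count):
               |         # random polynomial of degree threshold-1 whose
               |         # constant term is the secret
               |         coef = [secret] + [secrets.randbelow(PRIME)
               |                            for _ in range(threshold - 1)]
               |         def f(x):
               |             return sum(c * pow(x, i, PRIME)
               |                        for i, c in enumerate(coef)) % PRIME
               |         return [(x, f(x)) for x in range(1, count + 1)]
               | 
               |     def recover(shares):
               |         # Lagrange interpolation at x = 0; with fewer
               |         # than `threshold` distinct shares this reveals
               |         # nothing about the secret
               |         s = 0
               |         for xi, yi in shares:
               |             num = den = 1
               |             for xj, _ in shares:
               |                 if xj != xi:
               |                     num = num * -xj % PRIME
               |                     den = den * (xi - xj) % PRIME
               |             s = (s + yi * num
               |                  * pow(den, PRIME - 2, PRIME)) % PRIME
               |         return s
               | 
               | Each matched photo's voucher would carry one share of the
               | account's decryption key, so recover() only yields the
               | key once 30 distinct matches exist.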
        
               | darksaints wrote:
               | None of this is an increase in security or privacy. What
               | you described is merely a _mitigation_ of a massive loss
               | in security and privacy, when compared to other massive
               | losses in security and privacy.
        
               | helen___keller wrote:
               | > What you described is merely a mitigation of a massive
               | loss in security and privacy
               | 
               | If we're looking at the end result, a mitigation of a
               | loss of privacy is an increase in privacy compared to the
               | alternative, no?
               | 
               | I mean clearly what you're saying here is "scanning
               | always bad!". I understand that, I really do. I'm saying
               | that scanning was never not on the table for a large
               | corporation hosting photos on their server. Apple held
               | out on the in-cloud scanning because they wanted a
               | "better" scanning, and GP's point is that it's ironic
               | that the one cloud provider willing to try to make a
               | "less bad" scanning option is the one most demonized in
               | the media.
               | 
               | None of this is to argue that scanning is anything less
               | than a loss in security and privacy. Yes, yes, E2EE
               | running on free and open source software that I
               | personally can recompile from source would be the best
               | option.
        
               | darksaints wrote:
               | > If we're looking at the end result, a mitigation of a
               | loss of privacy is an increase in privacy compared to the
               | alternative, no?
               | 
               | I guess you could say that in the same way that you can
               | say that a gambler who just won $10 is "winning", even
               | though their bank account is down $100,000. It only works
               | if you completely lose all perspective.
        
               | helen___keller wrote:
               | "winning" is subjective, but $10 > $0 is a fact. That's
               | all I'm trying to say here.
        
               | JohnFen wrote:
               | > GP's point is that it's ironic that the one cloud
               | provider willing to try to make a "less bad" scanning
               | option is the one most demonized in the media.
               | 
               | I think that may be because it's far from clear that
               | Apple's solution is "less bad".
        
               | JohnFen wrote:
               | > the privacy comes from the fact that an on-device model
               | results in 0 Apple servers ever examining your photos
               | (unless you have 30 CSAM flags on your iCloud account),
               | whereas an on-server model results in Apple servers
               | examining every single photo you upload.
               | 
               | Every single photo you upload is getting scanned -- it's
               | just that Apple is doing the scanning on "your" device
               | instead of their servers.
               | 
               | From the point of view of the privacy of your photos, I
               | fail to see what the difference between the two is. I
               | mean, if they did the exact same type of scanning on
               | their servers instead of your device, the level of
               | privacy would be identical.
               | 
               | In terms of general privacy risks, not to mention the
               | concept that you own your devices, there is an enormous
               | difference between the two, and on-device scanning is
               | worse.
        
               | helen___keller wrote:
               | > From the point of view of the privacy of your photos, I
               | fail to see what the difference between the two is.
               | 
               | Good point. The question is privacy "from whom". For me,
               | privacy "from apple" means mostly from malicious
               | individuals working for Apple (indeed, if you have an
               | iPhone running proprietary Apple software, you could
               | never truly have privacy from Apple the corporation).
               | 
               | There are documented cases of employees at various cloud
               | services[0] using their positions of power to spy on
               | users of said services. Performing on-server scanning
               | implies that servers regularly decrypt users' encrypted
               | data as a routine course (or worse, never encrypt it in
               | the first place), which provides additional attack
               | vectors for such rogue employees.
               | 
               | On the other hand, taking the on-device scanning as
               | described, the on-device scanning process couldn't
               | possibly be used as an attack vector by a rogue employee,
               | since Apple employees do not physically control your
               | device. Maybe an attack vector here involves changing the
               | code and pushing a malicious build, which is a
               | monumentally difficult task (and already an attack vector
               | today).
               | 
               | [0]
               | https://www.telegraph.co.uk/news/2021/07/12/exclusive-
               | extrac...
        
               | JohnFen wrote:
               | > The question is privacy "from whom".
               | 
               | For me, privacy means that I have control over who I
               | disclose what to. But context matters. If I'm in my own
               | house, I (should) have almost total control over
               | disclosure. When I'm in someone else's house, I have very
               | little control as I'm subjecting myself to their rules.
               | 
               | A smartphone is probably the most intimate, personal
               | device most people will ever own, and it's the equivalent
               | of their house. However, if you're using cloud services,
               | then you're in someone else's house and are subject to
               | their rules.
               | 
               | That's why, in my view, doing the scanning on-device is
               | not only dangerous, but unethical. Doing the scanning on
               | the servers is neither of those things.
               | 
               | I get the argument about rogue employees, but I don't
               | find it persuasive. I'm told that Apple keeps your data
               | encrypted on their servers, although they hold the keys.
               | If that's so, then "rogue employees" are something that
               | Apple can, and should, control.
        
           | baggy_trough wrote:
           | That "only" is doing a huge amount of work in your comment.
           | Turning my device into something that snitches on me is a
           | huge difference from scanning my uploads.
        
             | jodrellblank wrote:
             | If you send any of those illegal photos to Facebook,
             | Facebook will snitch on you. If you send them to iCloud
             | (under this described system), Apple will snitch on you. It
             | is no different to "scanning your uploads" because it /is/
             | "scanning your uploads". That it's the CPU in the phone
             | doing most of the work and iCloud servers making the
             | connection, vs Facebook's CPU and cloud doing all the work,
             | makes zero practical difference to anything.
             | 
             | Arguing about where in the combined system the processing
             | happens is shifting the deck-chairs on the Titanic, it's
             | not making one part of the system more or less responsible
             | than any other part.
        
         | sneak wrote:
         | Not only that, but identifying those storing existing, known
         | CSAM means that, unless they have previously unknown CSAM also,
         | no additional harm has been done, as it's just file sharing.
         | 
         | It's only production of new/novel CSAM that harms children.
         | Sharing of existing, known CSAM (what this system detects)
         | doesn't harm anyone.
        
         | burnished wrote:
         | Regardless of the larger conversation, why do you think that
         | adding friction wouldn't stop people from engaging in an
         | activity? I mean, generally speaking A/B testing is used to
         | find the path to most engagement and doesn't get this kind of
         | pushback... but as soon as you want to make something harder
         | you get people saying "but that won't do anything". It
         | demonstrably does! 'Trivial to circumvent', in a world where
         | technicians regularly get called out just to plug something
         | back in, and where "have you checked the power button and
         | power cord" has been a common trope for over two decades!
         | 
         | I'm genuinely curious about your thoughts but to be clear my
         | focus is on this very narrow, nigh nitpick tangent.
        
           | foxfluff wrote:
           | > Regardless of the larger conversation, why do you think
           | that adding friction wouldn't stop people from engaging in an
           | activity?
           | 
           | See war on drugs or piracy, or alcohol prohibition for
           | instance.
           | 
           | Now there's a million ways to share pictures online in a
           | manner that bypasses the few big platforms' interference, and
           | you really don't need to be a genius to use (say) password-
           | protected archives. That's how a lot of casual online piracy
           | happens.
           | 
           | This thing does _very little_ to prevent spreading CSAM
           | pictures, and it does nothing to prevent actual child abuse.
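           | 
           | To make the bypass point concrete: any client-side
           | encryption step scrambles the bytes (and breaks image
           | decoding entirely), so catalogue-style hash matching has
           | nothing left to match. A toy illustration with made-up data:
           | 
           |     import hashlib, os
           | 
           |     data = os.urandom(1024)  # stand-in for a file's bytes
           |     key = os.urandom(1024)   # toy one-time-pad key
           |     enc = bytes(a ^ b for a, b in zip(data, key))
           | 
           |     # the catalogued hash no longer matches anything
           |     print(hashlib.sha256(data).hexdigest())
           |     print(hashlib.sha256(enc).hexdigest())
           | 
           | A perceptual hash fares no better here, since the encrypted
           | blob no longer decodes as an image at all.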
        
             | poetaster wrote:
             | The password-protected zip would be scanned? GPG-encrypted
             | files decrypted? I think you are missing the parent's
             | point.
        
             | burnished wrote:
             | Well, those things were actually pretty effective at
             | stopping people. People clearly circumvented those
             | restrictions, but from my perspective the number of people
             | that smoke weed probably went up after it became easy to
             | access in my state, and the quality and variety of products
             | increased enormously. They weren't effective at achieving
             | any broader aim, like stopping or slowing drug production
             | but frankly that wasn't even the point. Regardless, I think
             | we can both agree that these broad bans decreased the
             | number of people participating in that behavior. I think
             | the shift in our society's viewpoints on them came about as
             | an awareness that they were ineffective at achieving any
             | moral or justifiable aim. I think the correct criticism of
             | them isn't that they failed to prevent or impede a given
             | behavior, but that they weren't worth the negative
             | externalities (if you'll forgive the understatement).
             | 
             | Which brings me to your example of a password protected
             | archive. I'm ignorant on specifics so I'm just going to
             | sketch a broad argument and trust that you'll correct me if
             | my premise is incorrect or my argument otherwise flawed.
             | Essentially, if something is opt-in instead of opt-out, a
             | non-trivial portion of the population won't do it.
             | Especially if it is even slightly technical, there are just
             | a lot of people who will just stop thinking as soon as they
             | encounter a word they don't already know. So if the
             | preventative measure is something that is not automatic and
             | built in to whatever tool they are using to share images,
             | then that preventative measure will not protect most of
             | them. So to bring it full circle, I think it would do much
             | to prevent the spreading of CSAM because other bans have
             | been effective and I don't think most people have the
             | technical literacy to even be aware of how to protect
             | themselves from surveillance. As you say, you don't have to
             | be a genius, but I'd suggest you'd need to be above
             | average, which gives you over half the population.
             | 
             | Also thanks for responding, I hope I'm not coming across as
             | unpleasantly argumentative, I mean all this in the most
             | truth-seeking sense of a discussion (I was going to say
             | 'sporting sense of a debate' when I realized I had never
             | been part of a formal debate team and that might not mean
             | what I thought it meant, heh).
        
               | foxfluff wrote:
               | Sorry I haven't had the time to write a thoughtful reply.
               | 
               | Are you talking about a state where cannabis was de-
               | criminalized?
               | 
               | I agree that ease of access does have an effect on
               | people, but the effect of iCloud scanning is so marginal
               | in the grand scheme of things that it's almost like
               | fighting drugs by installing surveillance cameras in
               | malls. They just go trade and smoke elsewhere. The
               | friction is virtually zero, but the privacy concern of
               | scanning on half a billion Apple devices is much worse.
               | 
               | It's worth keeping in mind that CSAM is already highly
               | illegal and banned, whether Apple scans iCloud photos
               | makes no difference on that front. So it's nothing like
               | the difference between weed being de-criminalized or not.
               | 
               | Also, fact is you already have to jump through hoops to
               | obtain CSAM. It's _very_ rare to stumble upon it being
               | casually shared online (I think the last I witnessed it
               | must've been around 15 years ago on 4chan, and somewhere
               | between 5 and 10 years ago in a spam attack on freenode).
               | Trying to search for it on the clearnet is mostly going
               | to yield nothing.
               | 
               | In general, people also tend to know when they're doing
               | something highly illegal. And yet they still do it, just
               | taking the steps to try avoid being caught. No difference
               | with CSAM. They will jump through hoops, and "don't store
               | child porn on iCloud" is the tiniest of hoops to jump for
               | real.
               | 
               | Password protected archives were meant to be _just one
               | example_ of how to bypass scanning on cloud platforms,
               | and one that happens to be widely used among casual
               | pirates. Google drive might be one of the biggest pirate
               | services around these days. The bigger point I was trying
               | to make is just that there are countless ways to share
               | files without exposing their contents to scanning.
               | Nothing for people who are willing to jump through hoops
               | to get CSAM.
               | 
               | Finally, one point I've had to try make over and over
               | again is that detecting the storage or distribution of
               | old (catalogued) CSAM photos is only _very tangentially_
               | related to actual abuse of children. Unfortunately that
               | keeps happening even if you destroy the internet and make
               | sure no photo is ever stored in the cloud again.
               | 
               | I've said it before: child abuse and violence existed
               | before cameras and internet, and will continue to exist.
               | Detecting images of abuse or violence is not going to
               | stop abuse or violence.
               | 
               | And if someone makes a system that is efficient at
               | detecting all catalogued (=old) images of CSAM, then that
               | might just create a larger market for "fresh"
               | (uncatalogued) child abuse. Credit to nullc for realizing
               | this.
               | 
               | And on the protecting-the-children front, there are much
               | bigger problems than stashes of old CSAM. Like grooming,
               | or chatrooms where child prostitutes are forced to stream
               | for an audience...
        
         | tgv wrote:
         | I'm betting a sizeable group has images from the database. They
         | don't know which images are known, I think. But the fact that
         | it's nearly trivial to circumvent does make me wonder.
        
         | Salgat wrote:
         | I am so confounded as to why they are burning through so much
         | goodwill on something they don't even seem to really benefit
         | from. Has anyone explained this?
        
           | hannasanarion wrote:
           | It might only help a little bit, but it also won't hurt
           | anyone so there's not really a reason not to.
           | 
           | Stupid child abusers who put known pornographic images in
           | their iCloud accounts are still child abusers. The fact that
           | fruit is low-hanging isn't a reason not to pick it.
        
           | Drew_ wrote:
           | Pressure from governments, of course. Apple would rather
           | control how this works than cede control to
           | governments/legislation. Problem is that this still doesn't
           | stop governments from stepping in anyway. Apple has simply
           | laid their own framework instead.
        
       | threshold wrote:
       | This technology could have led to the arrests of hardcore
       | criminals and spared some poor children a very difficult life.
       | Well, now we know where this generation's priorities and values
       | stand.
        
         | vnchr wrote:
         | We could just imprison everyone and prevent all future crime...
         | 
         | Valuing and prioritizing privacy is not advocating for crimes
         | against children.
        
           | threshold wrote:
           | In this case it is. That's the choice today. Have a more
           | private cell phone -or- take many serious offenders off the
           | streets.
        
         | yomly wrote:
         | As a criminal you can just opt out of Apple hardware, and/or
         | implement separately encrypted content. Better - maintain your
         | facade of having an Apple device for your daytime phone and use
         | a burner by night. Has DRM really stopped piracy?
         | 
         | All this did was make it more annoying for criminals and maybe
         | make it easier to catch dumb criminals - who would have been
         | caught by other means anyway.
        
         | stjohnswarts wrote:
         | Not true. Those people would just have encrypted it before it
         | ever hit their iPhone. They're not stupid. This was a blatant
         | attempt to soften the blow of Apple acting as a spy for the
         | government on your phone. If that isn't far scarier than "think
         | of the children" then I don't know what is.
        
           | threshold wrote:
           | Actually they're pretty stupid. Look at what they're doing.
           | Not exactly an intellectual pursuit. And if Apple implements
           | their system it will catch many, many offenders and yes - it
           | might save some children. When did taking care of our
           | children become an afterthought? You're more worried about
           | Apple looking through your phone than letting these nuts walk
           | around the community?
        
         | roamerz wrote:
         | Privacy doesn't seem to matter to you.
         | 
         | What happens when Apple adds brainwave detection to their
         | Airpods? Now all I have to do is have a fleeting thought about
         | something that is deemed unacceptable by the ruling class and
         | the authorities show up and arrest me.
         | 
         | So in your perfect little world, where do you draw the line?
         | Is it there, or at the next logical step, where an AI deems
         | that based on its algorithm you are going to break some law
         | tomorrow, and now they are going to preemptively take action?
         | 
         | Privacy matters.
        
           | threshold wrote:
           | That's science fiction nonsense. You let us know when the
           | shuttle lands; then we'll talk about human trafficking.
        
         | nullc wrote:
         | Facebook makes 20 million reports a year, yet very few cases
         | are filed.
         | 
         | Worse-- this system only would detect old widely circulated
         | images. Arguably, it may increase the incentive to create
         | undetectable novel imagery for closed circulation, creating new
         | instances of abuse in the process.
         | 
         | Police making random nighttime sweeps of homes in your
         | community would also likely catch some serious criminals. Our
         | decisions have to be based on more than just the potential of a
         | narrow benefit. It's not sufficient to reduce one ill in the
         | world if the result is a world less worth living in overall.
        
       | krsdcbl wrote:
       | This reeks of "WhatsApp ToS update".
       | 
       | Taking some time to let the news wave blow ove... errr,
       | reconsider and make improvements, I mean
        
       | dehrmann wrote:
       | Did they really announce this the Friday before a three-day
       | weekend?
        
       | TheRealPomax wrote:
       | "critically important child safety features"? Heinous as those
       | crimes may be, no numbers means no ground to stand on.
        
       | roody15 wrote:
       | Apple "Lets wait another 3-4 months and work on a better PR
       | campaign"
       | 
       | No plans to drop these features
        
       | buu700 wrote:
       | Great news! Now let's see if we can also get Google to reverse
       | course on backdooring every new Android app:
       | https://news.ycombinator.com/item?id=27695486
        
       | zzo38computer wrote:
       | > The (unauditable) database of processed CSAM images will be
       | distributed in the operating system (OS)
       | 
       | How much disk space will this require?
        
       ___________________________________________________________________
       (page generated 2021-09-04 23:01 UTC)