[HN Gopher] Apple Photos phones home on iOS 18 and macOS 15
       ___________________________________________________________________
        
       Apple Photos phones home on iOS 18 and macOS 15
        
       Author : latexr
       Score  : 1126 points
       Date   : 2024-12-28 19:22 UTC (1 day ago)
        
 (HTM) web link (lapcatsoftware.com)
 (TXT) w3m dump (lapcatsoftware.com)
        
       | sillywalk wrote:
        | I don't care that the check uses homomorphic encryption and
        | differential privacy. It is turned on by default. That is what
        | angers me.
        
         | guzik wrote:
         | Clearly, in Cupertino, 'enhancing user experiences' without
         | consent is the top priority.
        
         | bigwordsehboy wrote:
         | "homomorphic encryption and differential privacy"
         | 
          | It is new. It is fancy. It is not clear where HE/DP is being
          | used; it depends on whether the code is written using the
          | Swift toolkit, and even that has paths for exfiltration if
          | used incorrectly. They claim they are using DP in Photos, as
          | stated in the article here:
         | 
         | https://machinelearning.apple.com/research/scenes-differenti...
         | 
         | But the fact remains they are looking at your pictures. I do
         | not trust them for one fleeting fart in the wind on this. Think
         | about it for a hot second: HE/DP allows you to perform
         | operations on the data without knowing the data, but what if
         | someone goofs an operation and it ends up returning the actual
         | data?
         | 
         | Sorry, not buying it. Crypto is hard to get right, and when it
         | is monetized like this for "new features", it is wildly
         | unnecessary and exposes users to more risk.
        
           | ricardobeat wrote:
           | > what if someone goofs an operation and it ends up returning
           | the actual data
           | 
           | That's kind of like saying "they can take a picture of your
           | face and transmit it, but what if someone goofs an operation
           | and sends your actual head instead".
           | 
           | Any encrypted data the server has cannot 'accidentally' be
           | decrypted, and as someone else explained in this thread they
           | only send encrypted vectors describing features of the image
           | (building, tree, etc) and not the actual image. It's
           | certainly not a fact that "they are looking at your pictures"
           | [1].
           | 
           | [1] "they" being Apple; the _other guys_ could have
           | backdoored the whole OS and Apple employee 's computers for
           | all I know
        
       | gigel82 wrote:
       | > From my own perspective, computing privacy is simple: if
       | something happens entirely on my computer, then it's private,
       | whereas if my computer sends data to the manufacturer of the
       | computer, then it's not private, or at least not entirely
       | private. Thus, the only way to guarantee computing privacy is to
       | not send data off the device.
       | 
       | +1
        
       | zaroth wrote:
        | I don't even use iCloud Photos and this was on by default.
        | Very bad move by Apple to ship my photos off my device without
        | my permission, in any shape or form. I don't care.
        
         | threeseed wrote:
         | If you don't use iCloud Photos your photos are not shipped off
         | your device.
        
           | zaroth wrote:
           | I wish it were so! That was true until they turned this
           | feature on by default.
        
         | mike_d wrote:
         | That isn't what is happening. The author of this blog post has
         | absolutely no idea what he is talking about.
        
           | JTyQZSnP3cQGa8B wrote:
           | > That isn't what is happening
           | 
           | Then why is it happening to me?
        
         | newZWhoDis wrote:
         | It's funny how the wordsmiths come out to defend Apple here.
         | 
         | Your "photos" aren't shipped off-device without your knowledge,
         | just an arbitrary blob of "metadata" that you can't audit
         | describing everything about that photo. :)
         | 
         | It's sort of like "I don't want my WiFi router uploading my
         | house online!" And someone replying "it's not your physical
         | house, just a map of the house and the real time location of
         | everyone in it! The house never moves!"
        
       | hu3 wrote:
       | So it sends your photos to be indexed on Apple servers. Turned on
       | by default.
       | 
       | This is probably done to compete with Google Photos which has a
       | great photo search by word feature.
       | 
       | With that said, Apple can use whatever privacy measures to
       | protect user data. But at the end of the day, a subpoena can
       | easily force them to hand over data.
       | 
        | The best privacy measure is to just not have the data. I guess
        | indexing photos offline on the phone is not very feasible yet.
        
         | thought_alarm wrote:
         | It does not send your photos to be indexed on Apple servers.
        
           | mentalgear wrote:
            | It literally says it uses a global search index on the
            | label below the checkbox. It seems more than likely that
            | (now, or at least in the long run) they will use user data
            | to enhance this index.
        
             | 0points wrote:
             | It's funny how one can use "literally" while almost
             | demonstrating illiteracy.
             | 
             | It's right there in the help text:
             | 
             | > Allow this device to privately match places in your
             | photos with a global index maintained by Apple so you can
             | search by almost any landmark or point of interest.
             | 
             | They are saying that they match your photo against their
             | global index.
             | 
             | They are not saying that they will add your photo to their
             | global index.
        
               | hu3 wrote:
               | > They are not saying that they will add your photo to
               | their global index.
               | 
               | Nobody said that except you.
        
         | tredre3 wrote:
         | > This is probably done to compete with Google Photos which has
         | a great photo search by word feature.
         | 
         | Apple Photos is already capable of searching your photos by
         | word/content fully offline on-device.
         | 
          | Google Photos is unable to search your photos by word/content
          | fully offline on-device. Hell, it can't even search by _file
          | name_ if you don't sync your photos to the cloud!
         | 
         | I don't think Apple has to worry about Google at all.
        
           | hu3 wrote:
           | Neither is Apple, FTA.
        
         | accrual wrote:
         | > I guess indexing photos offline in phone is not very feasible
         | yet.
         | 
         | There has been a form of on device indexing since at least iOS
         | 12. My understanding is it performs a basic indexing typically
         | overnight when the phone is charging and allows one to perform
         | searches like "dog" or "car" and pull up matching images.
         | 
         | https://support.apple.com/guide/iphone/search-in-photos-iph3...
        
           | hu3 wrote:
           | That's nice but clearly not enough to compete with Google
           | Photos.
           | 
           | Hence why Apple started sending photo hashes to their servers
           | for indexing by default.
           | 
           | There's only so much offline computing power.
        
       | walterbell wrote:
       | Was this not announced during the iOS18 'Glowtime' event? Gosh.
       | Thanks for the neon! https://www.cnet.com/tech/mobile/everything-
       | announced-at-app...
       | 
        | https://www.internetsociety.org/resources/doc/2023/client-si...
        | The Internet Society makes the following recommendations based
        | on the European Commission's proposal:
        | 
        |   - That the European Commission introduce safeguards for
        |     end-to-end encryption.
        |   - That the European Commission prohibit the use of scanning
        |     technologies for general monitoring, including client-side
        |     scanning.
        
         | bogantech wrote:
         | Should I have to watch marketing events to make sure I'm not
         | being spied on?
        
           | walterbell wrote:
           | Maybe just the ones with bright neon "Glow-foo" canary
           | branding..
        
       | scosman wrote:
       | "I don't understand most of the technical details of Apple's blog
       | post"
       | 
       | I do:
       | 
       | - Client side vectorization: the photo is processed locally,
       | preparing a non-reversible vector representation before sending
       | (think semantic hash).
       | 
        | - Differential privacy: a decent amount of noise is added to
        | the vector before sending it, enough to make reverse lookup of
        | the vector impossible. The noise level here is epsilon = 0.8,
        | which is quite strong privacy. (Rough sketch after this list.)
       | 
        | - OHTTP relay: it's sent through a 3rd party so Apple never
        | learns your IP address. The contents are encrypted so the 3rd
        | party doesn't learn anything either (some risk of exposing "IP
        | X is an Apple Photos user", but nothing about the content of
        | the library).
       | 
       | - Homomorphic encryption: The lookup work is performed on server
       | with encrypted data. Apple can't decrypt the vector contents, or
       | response contents. Only the client can decrypt the result of the
       | lookup.
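        | 
        | For a concrete picture of the DP step, a minimal sketch in
        | Python (illustrative clipping and noise parameters, not
        | Apple's actual pipeline):
        | 
        |   import numpy as np
        | 
        |   rng = np.random.default_rng()
        | 
        |   def privatize(vec, epsilon=0.8):
        |       # Clip to the unit L1 ball (sensitivity <= 2), then
        |       # add Laplace noise calibrated to that sensitivity.
        |       vec = vec / max(1.0, np.abs(vec).sum())
        |       return vec + rng.laplace(scale=2.0 / epsilon,
        |                                size=vec.shape)
        | 
        |   embedding = rng.normal(size=128)  # stand-in photo embedding
        |   noisy = privatize(embedding)      # only this leaves the device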
       | 
        | This is what a good privacy story looks like: multiple layers
        | of privacy protection, where any one of the latter three alone
        | should be enough to protect privacy.
       | 
       | "It ought to be up to the individual user to decide their own
       | tolerance for the risk of privacy violations." -> The author
       | themselves looks to be an Apple security researcher, and are
       | saying they can't make an informed choice here.
       | 
        | I'm not sure what the right call is here. But the conclusion
        | "Thus, the only way to guarantee computing privacy is to not
        | send data off the device." isn't true. There are other tools
        | that provide privacy (DP, homomorphic encryption) while still
        | using services. They are immensely complicated, and users can't
        | realistically evaluate the risk. But if you want features that
        | require larger-than-disk datasets, or frequently changing
        | content, you need tools like this.
        
         | Gabriel54 wrote:
         | I appreciate the explanation. However, I think you do not
         | address the main problem, which is that my data is being sent
         | off my device by default and without any (reasonable) notice.
         | Many users may agree to such a feature (as you say, it may be
         | very secure), but to assume that everyone ought to be opted in
         | by default is the issue.
        
           | JustExAWS wrote:
           | Do you use iCloud to store your photos?
        
             | latexr wrote:
             | I'm not the person you asked, but I agree with them. To
             | answer your question: No, I do not use iCloud to store my
             | photos. Even if I did, consent to store data is not the
             | same as consent to scan or run checks on it. For a company
             | whose messaging is all about user consent and privacy, that
             | matters.
             | 
             | This would be easily solvable: On first run show a window
             | with:
             | 
             | > Hey, we have this new cool feature that does X and is
             | totally private because of Y [link to Learn More]
             | 
             | > Do you want to turn it on? You can change your mind later
             | in Settings
             | 
             | > [Yes] [No]
        
               | JustExAWS wrote:
                | When iCloud syncs between devices, how do you think
                | that happens without storing some type of metadata?
                | 
                | You don't use iCloud for anything? When you change
                | phones, do you start fresh or use your computer for
                | backups? Do you sync bookmarks? Browsing history?
               | 
               | Do you use iMessage?
        
               | Gabriel54 wrote:
               | In response to your question in the parent comment, no, I
               | do not use iCloud. And I do not sync any of the things
                | you mentioned here. If someone already consented to
                | using iCloud to store their photos then I would not
                | consider the service mentioned in this post to be such
                | a big issue, because Apple would already have the data
                | on their servers with the user's consent.
               | 
               | edit: I will just add, even if we accept the argument
               | that it's extremely secure and impossible to leak
               | information, then where do we draw the line between
               | "extremely secure" and "somewhat secure" and "not secure
               | at all"? Should we trust Apple to make this decision for
               | us?
        
               | JustExAWS wrote:
               | Do you start fresh with an iOS installation after each
               | upgrade or do you back up your iPhone using your computer
               | and iTunes?
        
               | Gabriel54 wrote:
               | I do not have anything backed up on any cloud servers on
               | any provider. If I had to buy a new phone I would start
               | from a fresh installation and move all of my data
               | locally. It's not that I'm a "luddite", I just couldn't
               | keep track of all of the different ways each cloud
               | provider was managing my data, so I disabled all of them.
        
               | JustExAWS wrote:
               | If only Apple had a centralized backup service that could
               | store everything automatically at a click of a button so
               | you wouldn't have to juggle multiple cloud providers...
        
               | NikolaNovak wrote:
               | I kinda was somewhat with you until this point.
               | 
                | Apple IS just another cloud provider / centralized
                | backup service. It's not fundamentally different from
                | the others, and if you're not in the select group of
                | (whatever the respectful term is for) those who stay
                | strictly inside the Apple ecosystem, you will have
                | multiple clouds and multiple data sets and multiple
                | backups that all interact with each other and your
                | heterogeneous devices in unpredictable ways. iCloud
                | will not help you with that any more than Google cloud
                | or Samsung cloud etc. They all want to own all of your
                | stuff; none of them is simply a hyper-helpful neutral
                | director.
        
               | JustExAWS wrote:
               | The "fundamental difference" is that it's better
               | integrated with your device and can backup the internal
               | state of your device and the apps.
               | 
                | Even if you use Microsoft Office or GSuite and save
                | using the standard file picker, you can save to iCloud.
                | iCloud has a native app for Windows and plug-ins on
                | Windows to sync browser bookmarks for Chrome, Edge and
                | Firefox.
                | 
                | And the alternative people are proposing is four or
                | five self-hosted solutions?
        
               | NikolaNovak wrote:
                | Again, I think there's an assumption of single-device /
                | ecosystem loyalty in your statement? I have an Android
                | phone and an iOS phone and three Android tablets and a
                | bunch of laptops with various operating systems.
                | 
                | The iPhone is "just another device". I don't feel
                | iCloud is any better integrated with my Samsung Note
                | than Google is integrated with my iPhone - in fact, the
                | opposite. Google, for example, CAN sync my photos
                | across iPhone and Android and Windows devices. Whereas
                | my wife knows the primeval scream from the home office
                | every 6 months when I try to claw photos out of Apple's
                | greedy, selfish hands :-)
                | 
                | For people who JUST use an iPhone, sure, iCloud is the
                | boss, just like the Samsung cloud is awesome for people
                | who JUST use, e.g., a Samsung Galaxy. But that's not a
                | high bar. I feel we are still lacking empathy here for
                | people like the original poster, who may have more than
                | one device in their lives.
        
               | JustExAWS wrote:
                | And none of these can sync your bookmarks, iOS settings
                | or store the internal state of apps on your iPhone.
                | 
                | And I wouldn't have the same arguments if they were
                | using Google cloud. But they are concerned about
                | "privacy" and trust _Google_?
                | 
                | But my argument is about people thinking that Apple or
                | Google should care about the minuscule number of people
                | who are hosting their own syncing services.
        
               | oarsinsync wrote:
               | Not all apps support Apple's backup solution. Threema and
               | Signal come to mind.
        
               | JustExAWS wrote:
               | And that is because of policy choices by Signal.
        
               | oarsinsync wrote:
               | So because of policy choices made by app developers, you
               | have to manage multiple cloud solutions.
               | 
               | Or as the GP suggested, forego the cloud entirely. iCloud
               | and Apple's built in iOS backup is not a magic bullet
               | unfortunately.
        
               | JustExAWS wrote:
                | By one lone outlier who decides, for "security", that
                | they don't want to support the platform's backup
                | solution. The app didn't have to do anything besides
                | store information locally in its sandbox.
        
               | warkdarrior wrote:
               | Does Signal allow the user to opt-in/opt-out into their
               | policy? Or are they forcing this policy on their users?
        
               | JustExAWS wrote:
                | No. They do not allow users to opt in.
        
               | 4ad wrote:
               | > If someone already consented to using iCloud to store
               | their photos then I would not consider the service
               | mentioned this post to be such a big issue, because Apple
               | would already have the data on their servers with the
               | user's consent.
               | 
               | No, if you enable Advanced Data Protection for iCloud[1],
               | the photos stored in Apple Photos are end to end
               | encrypted.
               | 
               | [1] https://support.apple.com/en-us/108756
        
               | walterbell wrote:
               | Some iOS apps synchronize data with standard protocols
               | (e.g. IMAP, WebDAV, CalDAV) to cloud or self-hosted
               | services.
        
               | JustExAWS wrote:
               | And that doesn't help with internally stored data within
               | apps, settings, which apps you have installed on what
               | screen, passwords, etc
        
               | walterbell wrote:
               | iOS supports local device backups.
        
               | latexr wrote:
               | None of that is relevant to my point. You seem to be
               | trying to catch people in some kind of gotcha instead of
               | engaging honestly with the problem at hand. But alright,
               | I'll bite.
               | 
               | Yes, I always start with clean installs, both on iOS and
               | on macOS. Sometimes I even restart fresh on the same
               | device, as I make sure my hardware lasts. I don't sync
               | bookmarks, I keep them in Pinboard and none of them has
               | any private or remotely identifiable information anyway.
               | I don't care about saving browser history either, in fact
               | I have it set to periodically auto-clear, which is a
               | feature in Safari.
        
               | JustExAWS wrote:
                | No, I am trying to say that with a connected device
                | using online services, the service provider is going to
                | have access to the data you use to interact with them.
               | 
               | To a first approximation, everyone in 2024 expects their
               | data and settings to be transferred across devices.
               | 
               | People aren't working as if it is 2010 when you had to
               | backup and restore devices via iTunes. If I'm out of town
               | somewhere and my phone gets lost, damaged or stolen, I
               | can buy another iPhone, log into my account and
               | everything gets restored as it was.
               | 
               | Just as I expect my watch progress to work when I use
               | Netflix between my phone, iPad, Roku devices etc.
        
               | latexr wrote:
               | And that should rightfully be your _informed choice_.
               | Just like everyone else should have the right to know
               | what data their devices are sending _before it happens_
               | and be given the _informed choice_ to refuse. People
               | shouldn't have to learn that from a random blog post
               | shared on a random website.
        
               | JustExAWS wrote:
               | In what world is Netflix for instance not going to know
               | your watch history?
               | 
               | How many people are going to say in 2024 that they don't
               | want continuous cloud backup? You want Windows Vista
               | style pop ups and permissions?
        
               | latexr wrote:
               | How many times are you going to shift the goalposts? This
               | is getting tiresome, so I'll make it my last reply.
               | 
               | I don't have Netflix but neither is that relevant to the
               | point, you're obviously and embarrassingly grasping at
               | straws.
               | 
               | No one is arguing against continuous cloud backups,
               | they're arguing about _sending data without consent_.
               | Which, by the way, is something Apple used to understand
               | not to do.
               | 
               | https://www.youtube.com/watch?v=39iKLwlUqBo
               | 
                | Apple's OSes are already filled with Windows Vista
                | style popups and permissions for inconsequential crap;
                | people have been making fun of them for that for years.
        
               | scarface_74 wrote:
               | If you are doing continuous cloud backups and using Apple
               | services - you are already giving Apple your data and
               | your solution is to add even more permissions? You are
               | not going to both use any Apple service that requires an
               | online component and keep Apple from having your data.
               | 
               | Isn't it bad enough that I have a popup every time I copy
               | and paste between apps?
        
               | doublerabbit wrote:
               | > Isn't it bad enough that I have a popup every time I
               | copy and paste between apps?
               | 
               | For me, not really no. It reminds me I am copying
               | information and not from some phishing app, I find it
               | informative.
               | 
                | And I'm probably one of the few who actually clicks
                | "Reject" on the cookie pop-ups, having to click no on
                | 3742 legitimate consents.
                | 
                | The simple answer is everything should be opt-in, not
                | opt-out: I'll opt in if I require it because frankly,
                | regardless of how Fort Knox my data is, $CORP still
                | cannot be trusted.
        
               | JustExAWS wrote:
                | If that's the case, you aren't using email or messaging
                | either?
        
               | doublerabbit wrote:
                | Strictly Signal via self-hosted VPN for messages. My
                | email web client is provided by my email server
                | (Zimbra), which is hosted on colocated servers. 3CX for
                | calls via self-hosted PBX.
                | 
                | Video conferencing instead of FaceTime is done via
                | self-hosted Jitsi and, if I may brag, all running on
                | FreeBSD.
                | 
                | Of Apple and Google I trust neither; however, I align
                | with Apple more than Google. It's as close as I can get
                | to not having data collected by mongrels.
        
               | ryandrake wrote:
               | As a general principle, I think computers should execute
               | commands that users issue, and then wait for the next
               | command. That's it.
               | 
               | Computers should not be sneakily doing things in the
               | background without my commanding them to do so. But if
               | they insist that the only way they can work is by doing
               | things in the background, then I expect the computer to
               | at the very least obtain my consent before doing those
               | things. And computers should _definitely_ not be
               | exfiltrating anything over to the network without my
               | explicit command to do so. This shit world we are living
               | in where your computer just does whatever the application
               | developer wants it to do rather than what the user wants
               | it to do has to come to an end!
        
               | fragmede wrote:
                | Netflix being unable to know your watch history on
                | their service is exactly the goal of homomorphic
                | encryption. The technology to make that work at that
                | scale does not exist; however, for smaller bits of
                | data, e.g. phone numbers, it's entirely possible!
                | 
                | With PIR, an Apple phone receiving a phone call queries
                | Apple's database with that phone number, but because
                | it's using homomorphic encryption, Apple doesn't know
                | the number that called despite looking it up in their
                | database to provide caller ID info, so they can't tie
                | your phone number and the caller's phone number
                | together.
               | 
               | https://machinelearning.apple.com/research/homomorphic-
               | encry...
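                | 
                | A toy sketch of the PIR idea in Python (a two-server
                | XOR scheme rather than Apple's single-server HE one,
                | but it shows a database answering a query without
                | learning which record was asked for):
                | 
                |   import secrets
                | 
                |   db = [b"spam", b"dentist", b"unknown", b"pizza"]
                | 
                |   def make_queries(index, n):
                |       # Client: two random bit vectors that XOR to a
                |       # one-hot vector at `index`. Each server alone
                |       # sees only uniformly random bits.
                |       mask = [secrets.randbelow(2) for _ in range(n)]
                |       hot = [int(i == index) for i in range(n)]
                |       return mask, [m ^ h for m, h in zip(mask, hot)]
                | 
                |   def answer(bits, database):
                |       # Server: XOR together the selected records.
                |       width = max(map(len, database))
                |       out = bytes(width)
                |       for b, rec in zip(bits, database):
                |           if b:
                |               rec = rec.ljust(width, b"\x00")
                |               out = bytes(x ^ y
                |                           for x, y in zip(out, rec))
                |       return out
                | 
                |   q1, q2 = make_queries(1, len(db))
                |   a1, a2 = answer(q1, db), answer(q2, db)
                |   rec = bytes(x ^ y for x, y in zip(a1, a2))
                |   print(rec.rstrip(b"\x00"))  # b'dentist'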
        
             | stackghost wrote:
             | I hate this type of lukewarm take.
             | 
             | "Ah, I see you care about privacy, but you own a phone! How
             | hypocritical of you!"
        
               | latexr wrote:
               | You're describing Matt Bors' Mister Gotcha.
               | 
               | https://thenib.com/mister-gotcha/
        
               | JustExAWS wrote:
                | If you care about your "privacy" and no external
                | service providers having access to your data - that
                | means you can't use iCloud at all, or any messaging
                | service, or any backup service; you have to use Plex
                | and your own hosted media, not use a search engine,
                | etc.
        
               | stackghost wrote:
               | Do you use a phone?
        
               | JustExAWS wrote:
               | Yes. I also don't use Plex, have my own file syncing
               | service running, run my own email server, etc.
               | 
                | I also don't run a private chat server that people log
                | into - I'm like most of the iPhone- and Android-using
                | world.
        
               | stackghost wrote:
               | Maybe lay off the sanctimonious attitude then.
        
           | Msurrow wrote:
            | I think it does address the main problem. What he is
            | saying is that multiple layers of security are used to
            | ensure (mathematically, with theoretical proofs) that
            | there is no risk in sending the data, because it is
            | encrypted and sent in such a way that Apple or any third
            | party will never be able to read/access it (again, based
            | on theoretically provable math). If there is no risk there
            | is no harm, and then there is a different need for 'by
            | default', opt in/out, notifications etc.
            | 
            | The problem with this feature is that we cannot verify
            | that Apple's implementation of the math is correct and
            | without security flaws. Everyone knows there are security
            | flaws in all software, and this implementation is not open
            | (i.e. we cannot review the code, and even if we could
            | review the code we cannot verify that the provided code
            | was the code used in the iOS build). So, we have to trust
            | that Apple did not make any mistakes in their
            | implementation.
        
             | latexr wrote:
             | Your second paragraph is exactly the point made in the
             | article as the reason why it should be an informed choice
             | and not something on by default.
        
               | Gigachad wrote:
               | If you don't trust Apple to do what they say they do, you
               | should throw your phone in the bin because it has total
               | control here and could still be sending your data even if
               | you opt out.
        
               | chikere232 wrote:
               | Oh yeah, the well known "blind trust" model of security.
               | Never verify any claims of any vendor! If you don't trust
               | them, why did you buy from them?!
        
               | latexr wrote:
               | Bugs have nothing to do with trust. You can believe
               | completely that someone's intentions are pure and still
               | get screwed by their mistake.
        
             | Gabriel54 wrote:
             | As someone with a background in mathematics I appreciate
             | your point about cryptography. That said, there is no
             | guarantee that any particular implementation of a secure
             | theoretical algorithm is actually secure.
        
               | threeseed wrote:
               | There is also no guarantee that Apple isn't lying about
               | everything.
               | 
               | They could just have the OS batch uploads until a later
               | point e.g. when the phone checks for updates.
               | 
               | The point is that this is all about risk mitigation not
               | elimination.
        
               | dylan604 wrote:
               | > There is also no guarantee that Apple isn't lying about
               | everything.
               | 
               | Other than their entire reputation
        
               | echelon wrote:
               | Maybe your threat model can tolerate an "oopsie woopsie".
               | Politically exposed persons probably cannot.
        
               | parasubvert wrote:
               | If you don't personally write the software stack on your
               | devices, at some point you have to trust a third party.
        
               | freedomben wrote:
               | Agreed, but surely you see a difference between an open
               | source implementation that is out for audit by anyone,
               | and a closed source implementation that is kept under
               | lock & key? They could both be compromised intentionally
               | or unintentionally, but IMHO one shows a lot more good
               | faith than the other.
        
               | bolognafairy wrote:
               | No. That's your bias as a nerd. There are countless well-
               | publicised examples of 'many eyeballs' not being remotely
               | as effective as nerds make it out to be.
        
               | jrvieira wrote:
               | can you provide a relevant example for this context?
        
               | dylan604 wrote:
                | How long did the log4j vulnerability exist?
               | 
               | https://www.csoonline.com/article/571797/the-apache-
               | log4j-vu...
               | 
               | What was the other package that had the mysterious .?
        
               | timschmidt wrote:
               | And yet they were found. How many such exploits lurk
               | unexamined in proprietary codebases?
        
               | dylan604 wrote:
                | Yet you say this like Apple or Google or Microsoft has
                | never released an update to address a security vuln.
        
               | timschmidt wrote:
               | Apple[1], Google[2], and Microsoft[3] you say?
               | 
               | You say this as if being shamed into patching the
               | occasional vuln is equivalent to security best practices.
               | 
                | Open code which can be independently audited is only a
                | baseline for trustworthy code. A baseline none of those
                | three meet. And one which by itself is insufficient to
                | counter a "Reflections on Trusting Trust"-style attack.
                | For that you need open code, diverse open build
                | toolchains, and reproducible builds. None of which is
                | being done by those three.
               | 
               | Are you getting your ideas about security from the
               | marketing department?
               | 
               | 1: https://arstechnica.com/security/2024/03/hackers-can-
               | extract... 2: https://www.wired.com/story/google-android-
               | pixel-showcase-vu... 3:
               | https://blog.morphisec.com/5-ntlm-vulnerabilities-
               | unpatched-...
        
               | dylan604 wrote:
               | Go ahead and put that cup of kool-aid down for a minute.
               | There are so so many OSS packages out there that have
               | never been audited? Why not? Because people have better
               | things to do. How many packages have you audited?
               | Personally, I don't have the skillz to do that. The
               | people that do expect to be compensated for their
               | efforts. That's why so many OSS packges have vulns that
               | go unnoticed until after they are exploited, which is the
               | same thing as closed source.
               | 
               | OSS is not the panacea that everyone touts it to be.
        
               | timschmidt wrote:
                | > There are so, so many OSS packages out there that
                | have never been audited. Why not? Because people have
                | better things to do.
               | 
               | I'm not aware of any major open source projects that
               | haven't experienced some level of auditing. Coverity
               | alone scans everything you're likely to find in a
               | distribution like Debian or Fedora:
               | https://scan.coverity.com/o/oss_success_stories
               | 
               | > How many packages have you audited?
               | 
               | Several on which I depend. And I'm just one pair of
               | eyeballs.
               | 
               | > Personally, I don't have the skillz to do that.
               | 
               | Then why are you commenting about it?
               | 
               | > OSS is not the panacea that everyone touts it to be.
               | 
               | I don't know who's touting it as a panacea, seems like a
               | strawman you've erected. It's a necessary pre-requisite
               | without which best practices aren't possible or
               | verifiable.
        
               | hn3er1q wrote:
               | That was an entire body of research at the University of
               | Minnesota and the "hypocrite commits" weren't found until
               | the authors pointed people to them.
               | 
               | https://www.theverge.com/2021/4/30/22410164/linux-kernel-
               | uni...
        
               | beeflet wrote:
                | The developer-to-user trust required in the context of
                | open-source software is substantially less than in
                | proprietary software. This much is evident.
        
               | fijiaarone wrote:
               | I'm stealing your information.
               | 
               | Hey! That's wrong.
               | 
               | But I promise I won't do anything wrong with it.
               | 
               | Well ok then.
        
               | bolognafairy wrote:
               | This is still a very dishonest representation of what's
               | actually happening.
        
               | kalleboo wrote:
                | > _There is also no guarantee that Apple isn't lying
                | about everything._
               | 
               | And at that point all the opt-in dialogs in the world
               | don't matter and you should not be running iOS but
               | building some custom Android ROM from scratch.
        
             | nox101 wrote:
              | Except for the fact (?) that quantum computers will
              | break this encryption, so if you wanted to you could
              | hoard the data, just wait a few years, and then decrypt?
        
               | NavinF wrote:
               | Quantum computers don't break Differential Privacy. Read
               | the toy example at
               | https://security.googleblog.com/2014/10/learning-
               | statistics-...
               | 
               | >Let's say you wanted to count how many of your online
               | friends were dogs, while respecting the maxim that, on
               | the Internet, nobody should know you're a dog. To do
               | this, you could ask each friend to answer the question
               | "Are you a dog?" in the following way. Each friend should
               | flip a coin in secret, and answer the question truthfully
               | if the coin came up heads; but, if the coin came up
               | tails, that friend should always say "Yes" regardless.
               | Then you could get a good estimate of the true count from
               | the greater-than-half fraction of your friends that
               | answered "Yes". However, you still wouldn't know which of
               | your friends was a dog: each answer "Yes" would most
               | likely be due to that friend's coin flip coming up tails.
        
               | throw0101d wrote:
                | > _Except for the fact (?) that quantum computers will
                | break this encryption_ [...]
                | 
                | Quantum computers will make breaking RSA and
                | Diffie-Hellman public key encryption easier. They will
                | not affect things like AES, nor things like hashing:
               | 
               | > _Client side vectorization: the photo is processed
               | locally, preparing a non-reversible vector representation
               | before sending (think semantic hash)._
               | 
               | And for RSA and DH, there are algorithms being deployed
               | to deal with that:
               | 
               | * https://en.wikipedia.org/wiki/NIST_Post-
               | Quantum_Cryptography...
        
               | fragmede wrote:
                | Quantum computers don't meaningfully exist yet and
                | won't for a while, and once they do exist, they still
                | won't be able to crack this. Quantum computers aren't
                | the magical "the end is nigh" gotcha to everything,
                | and unless you're that deep into the subject, the
                | bigger question you've got to ask yourself is: why is
                | a magic future technology so important to you that you
                | just had to post your comment?
               | 
               | Anyway, back to the subject at hand; here's Apple on that
               | subject:
               | 
               | > We use BFV parameters that achieve post-quantum 128-bit
               | security, meaning they provide strong security against
               | both classical and potential future quantum attacks
               | 
               | https://machinelearning.apple.com/research/homomorphic-
               | encry...
               | 
               | https://security.apple.com/blog/imessage-pq3/
        
             | emchammer wrote:
             | Hypothetical scenario: Theo de Raadt and Bruce Schneier are
             | hired to bring Apple products up to their security
             | standards. They are given a public blog, and they are not
             | required to sign an NDA. They fix every last vulnerability
             | in the architecture. Vladimir Putin can buy MacBooks for
             | himself and his generals in Moscow, enable Advanced Data
             | Protection, and collaborate on war plans in total
             | confidence.
             | 
             | Where are the boundaries in this scenario?
        
               | saurik wrote:
               | I am pretty sure that if we had those people in charge of
               | stuff like this there would be no bar above which "opt in
               | by default" would happen, so I am unsure of your point?
        
               | astrange wrote:
               | Theo de Raadt is less competent than Apple's security
               | team (and its external researchers). The main thing
               | OpenBSD is known for among security people is adding
               | random mitigations that don't do anything because they
               | thought them up without talking to anyone in the
               | industry.
        
               | stevenAthompson wrote:
                | Freedom of speech cannot exist without private
                | communications. It is an inalienable right, and
                | therefore privacy is as well.
        
             | fragmede wrote:
             | You're welcome to check their implementation yourself:
             | 
             | https://github.com/apple/swift-homomorphic-encryption
        
           | scosman wrote:
           | I think I'm saying: you're not sending "your data" off
           | device. You are sending a homomorphically encrypted locally
           | differentially private vector (through an anonymous proxy).
           | No consumer can really understand what that means, what the
           | risks are, and how it would compare to the risk of sending
           | someone like Facebook/Google raw data.
           | 
            | I'm asking: what does an opt-in for that really look like?
            | You're not going to be able to give the user enough info
            | to make an educated decision. There's a ton of risk of
            | "privacy washing" ("we use DP" but at a very poor epsilon,
            | or "we use E2E encryption" with side-channel data
            | gathering).
            | 
            | There's no easy answer. "Ask the user", when the question
            | requires a PhD-level understanding of stats to evaluate
            | the risk, isn't a great answer. But I don't have another
            | one.
        
             | latexr wrote:
             | Asking the user is perfectly reasonable. Apple themselves
             | used to understand and champion that approach.
             | 
             | https://www.youtube.com/watch?v=39iKLwlUqBo
        
             | Gabriel54 wrote:
              | In response to your second question, opt-in would look
              | exactly like this: don't have the box checked by default,
              | with an option to enable it: "use this to improve local
              | search; we will create an encrypted index of your data to
              | send securely to our servers, etc..." A PhD is not
              | necessary to understand the distinction between storing
              | data locally on a machine vs. on the internet.
        
               | ryandrake wrote:
               | Exactly. It's the height of arrogance to insist that
               | normal users just can't understand such complex words and
               | math, and therefore the company should not have to obtain
               | consent from the user. As a normal lay user, I don't want
               | _anything_ to leave my device or computer without my
                | consent. Period. That includes personal information,
                | user data, metadata, private vectors, homomorphic this
                | or locally differential that. I don't care how private
                | Poindexter assures me it is. Ask. For. Consent.
                | 
                |  _Don't do things without my consent!!!_ How hard is it
                | for Silicon Valley to understand this very simple
                | concept?
        
               | simondotau wrote:
               | The average smartphone is probably doing a hundred things
               | you didn't knowingly consent to every second.
               | 
               | Should Apple insist that every end user consents to the
               | user agent string sent on every HTTP request?
        
               | ryandrake wrote:
               | > The average smartphone is probably doing a hundred
               | things you didn't knowingly consent to every second.
               | 
                | You've succinctly identified a (maybe _the_) huge
               | problem in the computing world today. Computers should
               | not do _anything_ without the user 's command/consent.
               | This seems like a hopeless and unachievable ideal only
               | because of how far we've already strayed from the light.
               | 
               | Even Linux, supposedly the last bastion of user
               | control... it's a mess. Do a fresh install and type ps ax
               | at a shell. You'll see dozens of processes in the
               | background doing god knows what. I didn't consent to any
               | of this! The distribution's maintainer simply decided on
               | my behalf that I want the computer to be running all
               | these processes. This is totally normalized!
               | 
               | I don't expect my computer to ask for consent again and
               | again for every byte sent over the network, but I do
               | expect it to obtain my consent before generally accessing
               | the network and sending _bytes_ over the network.
        
               | parasubvert wrote:
               | "The light" you claim is that users should have the
               | knowledge and discernment to consent to what a computer
               | does.
               | 
               | To me, there's never been a case, except maybe in the
               | first decade or so of the hobby/tinkering PC movement,
               | where most users had this ability.
               | 
               | Should we just not use computers?
        
               | JTyQZSnP3cQGa8B wrote:
               | > Should we just not use computers?
               | 
               | I don't think "should we just give up?" is a reasonable
               | question to anything.
        
               | simondotau wrote:
               | > You've succinctly identified a (maybe the) huge problem
               | in the computing world today.
               | 
               | And getting downvoted for saying it, which is a
               | fascinating incongruity.
        
               | ryandrake wrote:
               | It's amazing how hostile Silicon Valley (and HN
               | commenters) are to the basic idea of consent. It's as if
               | simply asking the user for permission is a grave insult
               | to these technologists. "I shouldn't have to ask
               | permission! It implies I'm doing something bad!" they
               | might be thinking.
               | 
               | If the world was a nightclub, "Silicon Valley" would be a
               | creepy guy who walks up to every woman and says "You're
               | now dating me. To stop, you need to opt out using a form
               | that I will do my best to make sure you can't read."
        
               | simondotau wrote:
               | You're inverting morality and infantilising the consumer.
               | Apple is a corporation. Corporations don't owe you moral
               | anything, except as required by law.
               | 
               | Choosing an Apple product _is consent_ to trusting Apple.
               | Continued use their products represents ongoing consent.
               | This is an objective fact about all complex connected
               | devices and it cannot possibly be otherwise.
        
               | skydhash wrote:
               | Corporation are driven by people. They're not a separate
               | entity that decides to do things while their owners are
               | sleeping. Every actions have someone that suggested it
               | and someone that gave the green light.
        
               | walterbell wrote:
               | _> incongruity_
               | 
                | Or a signal of unnamed stakeholders.
        
               | grahamj wrote:
               | > I didn't consent to any of this!
               | 
                | Yes you did. You purchased a computer, put this
                | software on it and executed it. If you didn't want it
                | to do whatever it's doing, you should have determined
                | what it would do beforehand and chosen not to run it.
        
               | ryandrake wrote:
               | > whatever it's doing
               | 
               | Even assuming that running the software implies my
               | consent (which I would dispute), how do I make the
               | decision about whether I should execute the software if I
               | don't know what it is doing?
               | 
               | This all-or-nothing approach is also problematic. I
               | should not have to allow the developer free rein to do
               | whatever he wants, as a condition of using the software.
               | This is why operating systems are slowly building
               | granular permissions and consent checks.
        
               | grahamj wrote:
               | Installing and booting Linux absolutely implies consent
               | to let it do what it does. It's open source, you can
               | evaluate what it does before booting it. You know it's
               | comprised of many processes, you know it has a networking
               | stack, you connected it to a network. You can't then ask
               | OMG why didn't it ask before sending something?
               | 
               | I agree that all-or-nothing is problematic but even with
               | a flexible permission system the best you can hope for is
               | for all the things apps do to be itemized and set to sane
               | defaults. But even then sanity is subjective. For every
               | person like you (and me fwiw) who values privacy there
               | are 1000 people who will never find the settings, don't
               | care about privacy, and will wonder why stuff isn't
               | working.
               | 
               | Ultimately privacy is similar to security in that it
               | comes down to trust. If you don't trust your OS you're
               | screwed. Your choices are try to exert as much control
               | over it as possible, or don't use it.
        
               | oneeyedpigeon wrote:
               | > I do expect it to obtain my consent before generally
               | accessing the network and sending bytes over the network.
               | 
               | How would that make any difference in this case?
               | Presumably, you'll have long-ago checked the "allow
               | general access to the network" setting, so you've given
               | consent to the "send my photo data" action. Heck, surely
               | connecting to the internet in the first place is implicit
               | consent that you want to send stuff over the network?
        
               | ryandrake wrote:
               | If I were actually given the choice, I would not check
               | any checkbox allowing an application broad, unfettered
               | access to the network. But, in most cases I'm not even
               | given that choice!
        
               | brians wrote:
               | Every TCP session leaks some PRNG state for the ISN. That
               | might leak information about key material.
               | 
               | Every NTP session leaks time desync information, which
               | reveals--on modern hardware--relativistic travel,
               | including long airplane trips.
               | 
               | Every software update leaks a fortune about what you run
               | and when you connect.
               | 
               | I don't think it's reasonable to ask that people consent
               | to these; I don't think they can. I absolutely agree that
               | photo metadata is _different_ and at a way higher level
               | of the stack.
        
               | rkagerer wrote:
               | This, 1000x. Thank you for voicing the absurdness of
               | their approach to 'consent'.
        
               | scosman wrote:
                | Even here with the HN crowd: it's not an index, it's not
               | stored on a server, and it's not typical send-securely
               | encryption (not PK or symmetric "encrypted in transit",
               | but homomorphic "encrypted processing"). Users will think
               | that's all gibberish (ask a user if they want to send an
               | index or vector representation? no clue).
               | 
               | Sure, you can ask users "do you want to use this". But
               | why do we ask that? Historically it's user consent
               | (knowingly opting in), and legal requirements around
               | privacy. We don't have that pop up on any random new
               | feature, it's gated to ones with some risk. There are
               | questions to ask: does this technical method have any
               | privacy risk? Can the user make informed consent? Again:
               | I'm not pitching we ditch opt-in (I really don't have a
               | fix in mind), but I feel like we're defaulting too
               | quickly to "old tools for new problems". The old way is
               | services=collection=consent. These are new privacy
               | technologies which use a service, but the privacy is
               | applied locally before leaving your device, and you don't
               | need to trust the service (if you trust the DP/HE
               | research).
               | 
               | End of the day: I'd really like to see more systems like
               | this. I think there were technically flawed statements in
               | the original blog article under discussion. I think new
               | design methods might be needed when new technologies come
               | into play. I don't have any magic answers.
        
               | lapcat wrote:
               | > I think there were technically flawed statements in the
               | original blog article under discussion.
               | 
               | Such as?
        
               | fragmede wrote:
               | The third choice, after opt-in and opt-out is to force
               | the user to choose on upgrade before they can use their
               | device again. "Can we use an encrypted, low-resolution
               | copy of your photos that even we ourselves can't see?"
        
               | scratchyone wrote:
               | Okay except "encrypted, low-resolution copy of your
               | photos" is an incredibly bad explanation of how this
               | feature works. If nobody on HN so far has managed to find
               | an explanation that is both accurate and understandable
               | to the average consumer, any "hey can we do this" prompt
               | for this feature is essentially useless anyways. And,
               | IMO, unnecessary since it is theoretically 100%
               | cryptographically secure.
        
               | fragmede wrote:
               | I think it's sufficiently accurate, why don't you think
               | it is? I don't think the vector vs low-res aspect is
               | particularly material to understanding the key fact that
               | "even we ourselves can't see?"
        
               | detourdog wrote:
                | I think the best response is to make it work the way
                | iCloud storage does: the option is to keep my stuff on
                | the local device or use iCloud.
        
             | sensanaty wrote:
             | I don't care if all they collect is the bottom right pixel
             | of the image and blur it up before sending it, the sending
              | part is the problem. I don't want _anything_ sent from MY
              | device without my consent, whether it's plaintext or
             | quantum proof.
             | 
              | You're presenting it as if you have to explain elliptic
              | curve cryptography in order to toggle a "show password"
              | dialogue, but that's disingenuous framing; all you have to
              | say is "Allow Apple to process your images", simple as
              | that. Otherwise you can argue many things can't possibly
              | be made into options. Should location data always be sent,
              | because satellites are complicated and hard to explain?
              | Should we take away the choice to turn wifi on or off,
              | because you'd have to explain IEEE 802.11 to them?
        
               | simondotau wrote:
               | > I don't want anything sent from MY device without my
               | consent
               | 
               | Then don't run someone else's software on your device.
               | It's not your software, you are merely a licensee. Don't
               | delude yourself that you are morally entitled to absolute
               | control over it.
               | 
               | The only way to have absolute control over software is
               | with an RMS style obsession with Free software.
        
               | Koshkin wrote:
               | The "moral entitlement" has nothing to do with this. The
               | software is _legally_ required to abide by its license
               | agreement (which, by the way, _you_ are supposed to have
               | read, understood, and accepted prior to using said
               | software).
        
               | simondotau wrote:
               | I honestly can't tell if you're being sarcastic. A
               | license grants the end user permission to use the
               | software. It is not a series of obligations for how the
               | software operates. This would be excruciatingly obvious
               | if you read any software license.
        
               | Koshkin wrote:
               | A license agreement is, well, an _agreement_ between the
               | manufacturer and the consumer which may include a
               | requirement to acknowledge certain aspects of how the
               | software operates (e.g. the user may be required to agree
               | to "share" some data).
        
               | simondotau wrote:
               | Some commercial software licenses may include various
               | disclaimers which exist to ward away litigious assholes.
               | They only serve to protect the vendor against legal
               | complaints, and do not impart responsibilities upon the
               | vendor. Such disclaimers are not necessary but corporate
                | lawyers have a _raison d'être,_ and at a certain scale
               | assholes become inevitable.
        
               | int_19h wrote:
                | They might not be _legally_ entitled to it, but that's
                | just because of our shitty "intellectual property" laws.
                | Morally speaking, OP is absolutely entitled to have a
                | device they own not spy on them.
        
               | simondotau wrote:
               | Regardless of one's opinion of intellectual property
               | laws, nobody is morally entitled to demand that someone
                | else build the exact product they want. In fact it is
               | immoral to demand that of other people -- and you
               | certainly wouldn't like it if other people could demand
               | that of you.
               | 
               | Want a phone that doesn't spy on you? Make it yourself.
               | If you can't, find some like-minded people and
               | incentivise them (with money or otherwise) to make it for
               | you. If they can't (or won't) perhaps contemplate the
               | possibility that large capitalist enterprises might be
               | the only practical way to develop some products.
        
               | Nullabillity wrote:
               | This is just "might makes right" bullshit with slightly
               | prettier framing.
        
               | simondotau wrote:
               | This has absolutely nothing to do with "might makes
               | right". If a fast food store decides to offer a
               | Vietnamese Peanut Burger and Sugar Cane Juice combo, nut
                | allergy sufferers are not "morally entitled" to a nut-free
               | option and diabetics are not "morally entitled" to a
               | sugar-free juice option. This applies whether the fast
                | food store is a small family-run business, or McDonald's.
               | 
               | To suggest that customers are "morally entitled" to a
               | Samsung phone with zero tracking and zero telemetry is
               | similarly absurd. If you don't like Samsung's product,
               | don't buy it.
        
               | Nullabillity wrote:
               | > If a fast food store decides to offer a Vietnamese
               | Peanut Burger and Sugar Cane Juice combo, nut allergy
                | sufferers are not "morally entitled" to a nut-free option
               | and diabetics are not "morally entitled" to a sugar-free
               | juice option.
               | 
               | Why not? What gives McD the right to make such a decision
               | unilaterally, other than might?
               | 
               | In fact, this is how disability legislation (for example)
               | already tends to work. You don't get to tell disabled
               | people to just go somewhere else, you _have to_ make
                | reasonable accommodations for them.
        
               | simondotau wrote:
               | > What gives McD the right to make such a decision
               | unilaterally
               | 
               | This cannot be a serious question.
        
               | JTyQZSnP3cQGa8B wrote:
                | > nut allergy sufferers are not "morally entitled" to a
                | > nut-free option
                | 
                | Restaurants have a legal obligation to warn customers.
                | AKA "opt-in", which is NOT what Apple is doing. And
                | that's the whole issue with their behavior.
        
               | simondotau wrote:
               | Apple's food scientists have verified the food safety of
               | their new recipe, and they are sufficiently confident
               | that nobody will suffer any allergic reaction. Nobody has
               | disputed their assessment.
               | 
               | That doesn't stop consumers from engaging in _Info Wars_
               | style paranoia, and grandstanding about the
               | aforementioned paranoia.
        
               | Teever wrote:
               | That's absurd.
               | 
               | We can regulate these problems.
               | 
               | If the EU can regulate away the lightning connector they
               | can regulate away this kind of stuff.
        
               | simondotau wrote:
               | You're seriously arguing that it's absurd for customers
               | to have "absolute control" over all software?
               | 
               | No EU regulation could regulate away all "moral" concerns
                | over software. More specifically, the EU _could_
               | regulate, but the overwhelming majority of software
               | companies would either strip significant features out for
               | EU customers, or exit the market altogether.
        
               | LtWorf wrote:
               | Lol, they keep threatening that but they still like the
                | money of the Europeans.
        
               | simondotau wrote:
               | The EU hasn't threatened granting consumers "absolute
               | control" over all software.
        
               | LtWorf wrote:
               | I'd vote for a party that said the only legal license is
               | AGPL :D
        
             | Vegenoid wrote:
             | There is significant middle ground between "do it without
             | asking" and "ask about every single thing". A reasonable
             | option would be "ask if the device can send anonymized data
             | to Apple to enable such and such features". This setting
             | can apply to this specific case, as well as other similar
             | cases for other apps.
        
             | Nullabillity wrote:
             | If you can't meaningfully explain what you're doing then
             | you can't obtain informed consent. If you can't obtain
             | informed consent then that's not a sign to go ahead anyway,
              | it's a sign that you _shouldn't do it_.
             | 
             | This isn't rocket surgery.
        
               | scosman wrote:
               | +100 for "rocket surgery".
               | 
               | I mostly agree. I'm just annoyed "this new privacy tech
               | is too hard to explain" leads to "you shouldn't do it".
               | This new privacy tech is a huge net positive for users.
               | 
               | Also: from other comments sounds like it might have been
               | opt-in the whole time. Someone said a fresh install has
               | it off.
        
               | Nullabillity wrote:
               | > This new privacy tech is a huge net positive for users.
               | 
               | It's a positive compared to doing the same "feature"
               | without the privacy tech. It's not necessarily a positive
               | compared to not forcing the "feature" on the user at all.
               | 
               | The privacy tech isn't necessarily a positive as a whole
               | if it leads companies to take more liberties in the name
               | of "hey you don't need to be able to turn it off because
               | we have this magical privacy tech (that nobody
               | understands and may or may not actually work please don't
               | look into it too hard)".
        
           | brookst wrote:
           | Notice is always good and Apple should implement notice.
           | 
           | However, "my data is being sent off my device" is incorrect,
           | as GP explained. Metadata, derived from your data, with noise
           | added to make it irreversible, is being sent off your device.
           | It's the equivalent of sending an MD5 of your password
           | somewhere; you may still object, but it is not factually
           | correct to say your password was transmitted.
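            | 
            | The "noise added" part is the differential privacy step. A
            | minimal sketch of the idea (Python; not Apple's actual
            | pipeline, and the vector and parameters are invented):
            | 
            |     import random
            | 
            |     def laplace_noise(b):
            |         # Difference of two iid exponentials ~ Laplace(0, b)
            |         return b * (random.expovariate(1.0)
            |                     - random.expovariate(1.0))
            | 
            |     def privatize(vec, epsilon=1.0, sensitivity=1.0):
            |         # Noise scale calibrated to sensitivity / epsilon
            |         b = sensitivity / epsilon
            |         return [v + laplace_noise(b) for v in vec]
            | 
            |     embedding = [0.12, -0.87, 0.44]  # image feature vector
            |     print(privatize(embedding))      # noisy, not invertible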
        
             | amelius wrote:
             | > It's the equivalent of sending an MD5 of your password
             | somewhere; you may still object, but it is not factually
             | correct to say your password was transmitted.
             | 
             | Hackers love to have MD5 checksums of passwords. They make
             | it way easier to find the passwords in a brute force
             | attack.
             | 
             | https://en.wikipedia.org/wiki/Rainbow_table
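              | 
              | To spell it out, reversing an unsalted hash of a weak
              | password is just a table lookup. A toy sketch in Python
              | (password and wordlist invented):
              | 
              |     import hashlib
              | 
              |     leaked = hashlib.md5(b"hunter2").hexdigest()
              | 
              |     # Precompute hash -> password for common passwords;
              |     # a rainbow table is a compressed form of this idea.
              |     words = [b"123456", b"password", b"hunter2"]
              |     table = {hashlib.md5(w).hexdigest(): w for w in words}
              | 
              |     print(table.get(leaked))  # b'hunter2' -- recovered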
        
               | throw0101d wrote:
                | >> _It's the equivalent of_ [...]
               | 
               | > _Hackers love to have MD5 checksums of passwords._
               | 
               | Hackers love not understanding analogies. :)
        
               | robocat wrote:
               | Hackers love to make defective analogies (especially
               | redundant recursive ones) and invite sarcastic
               | corrections to them.
        
               | icehawk wrote:
                | Nobody is responding seriously to this because you seem to
               | have missed the part where GP said "with noise added to
               | make it irreversible" and the third sentence in that
               | wikipedia article.
        
               | brookst wrote:
               | Hackers don't know about salts yet?
        
               | defrost wrote:
               | Bath salts yes, security salts, not so much.
        
             | 0points wrote:
             | > It's the equivalent of sending an MD5 of your password
             | somewhere
             | 
                | a) MD5 is reversible, it just costs GPU time to brute force
             | 
             | b) It is unproven that their implementation is irreversible
        
               | fragmede wrote:
                | BFV has a formal security proof (it reduces to the
                | hardness of the RLWE problem), and Apple open sourced
                | their Swift library implementing it, so it's not
                | _totally_ unproven.
               | 
               | https://github.com/apple/swift-homomorphic-encryption
        
             | Drew_ wrote:
             | > However, "my data is being sent off my device" is
             | incorrect, as GP explained. Metadata, derived from your
             | data, with noise added to make it irreversible, is being
             | sent off your device.
             | 
             | Sounds like my data is being sent off my device.
             | 
             | > It's the equivalent of sending an MD5 of your password
             | somewhere
             | 
             | Sounds even worse lol
        
               | Klonoar wrote:
               | It does not sound like that at all.
               | 
               | There is plenty of data on your device that isn't "your
               | data" simply due to existing on your device.
        
             | thegrim33 wrote:
             | Well that's what you're told is happening. As it's all
             | proprietary closed source software that you can't inspect
             | or look at or verify in any manner, you have absolutely
             | zero evidence whether that's what's actually happening or
             | not.
        
               | astrange wrote:
               | If you can't inspect it that just means you don't know
               | how to use Ghidra/Hopper. ObjC is incredibly easy to
               | decompile and Swift isn't much harder.
        
             | mvdtnz wrote:
              | If the information being sent from my device cannot be
             | derived from anything other than my own data then it is my
             | data. I don't care what pretty dress you put on it.
        
           | mensetmanusman wrote:
           | When your phone sends out a ping to search for cellular
           | towers, real estate brokers collect all that information to
           | track everywhere you go and which stores you visit.
           | 
           | Owning a phone is a privacy failure by default in the United
           | States.
        
             | mike_d wrote:
             | > When your phone sends out a ping to search for cellular
             | towers, real estate brokers collect all that
             | 
             | Care to provide a pointer to what device they are using? I
             | would absolutely get my real estate license for this.
        
             | talldayo wrote:
             | You are being downvoted because you're so painfully
             | correct. It's not an issue _exclusive_ to the United
             | States, but American intelligence leads the field far-and-
             | away on both legal and extralegal surveillance. The
             | compliance forced by US Government agencies certainly helps
             | make data tracking inescapable for the average American.
             | 
             | Unfortunately, the knee-jerk reaction of many defense
             | industry pundits (and VCs, for that matter) is that US
             | intelligence is an unparalleled moral good, and the virtues
             | of privacy aren't worth hamstringing our government's work.
             | Many of these people will try to suppress comments like
             | yours because it embarrasses Americans and American
             | business by association. And I sympathize completely - I'm
             | dumbfounded by the response from my government now that we
             | know China is hacking our telecom records.
        
               | 0points wrote:
                | FWIW, SS7 has had known flaws for a very long time.
                | 
                | It's apparent it has been kept in place because of all
                | the value it provides to the Five Eyes.
        
           | clint wrote:
           | Do you consider your data to include non-reversible hashes of
           | your data injected with random noise? I'm not sure I consider
            | that my data. It's also not even really metadata about my
            | data.
        
           | kemayo wrote:
           | I'm not sure I agree -- asking users about every single minor
           | feature is (a) incredibly annoying, and (b) quickly causes
           | request-blindness in even reasonably security-conscious
           | users. So restraining the nagging for only risky or
           | particularly invasive things makes sense to me.
           | 
           | Maybe they should lump its default state into something that
           | already exists? E.g. assume that if you already have location
           | access enabled for Photos (it does ask!), you've already
            | indicated that you're okay with something identifying being
            | sent to Apple whenever you take a picture.
           | 
           | My understanding is that Location Services will, among other
           | things, send a hash of local WiFi network SSIDs and signal
           | strengths to a database Apple maintains, and use that to
           | triangulate a possible position for you. This seems loosely
           | analogous to what's going on here with the compute-a-vector
           | thing.
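            | 
            | Roughly, the positioning idea is (a sketch with invented
            | numbers; the real service is more involved and, as noted,
            | works on hashed identifiers):
            | 
            |     # Server-side survey: access point id -> (lat, lon)
            |     known_aps = {
            |         "ap-1": (48.8584, 2.2945),
            |         "ap-2": (48.8590, 2.2950),
            |     }
            |     scan = {"ap-1": -40, "ap-2": -70}  # on-device dBm
            | 
            |     # Weight each AP by signal strength (stronger = closer)
            |     w = {a: 1 / abs(s) for a, s in scan.items()
            |          if a in known_aps}
            |     t = sum(w.values())
            |     est = [0.0, 0.0]
            |     for a, x in w.items():
            |         est[0] += known_aps[a][0] * x / t
            |         est[1] += known_aps[a][1] * x / t
            |     print(est)  # crude position estimate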
        
             | lapcat wrote:
             | > Maybe they should lump its default state into something
             | that already exists?
             | 
             | It could be tied to iCloud Photos, perhaps, because then
             | you already know that your photos are getting uploaded to
             | Apple.
        
               | kemayo wrote:
                | Insofar as the photos _aren't_ getting uploaded to Apple
               | for this, that seems a bit extreme.
               | 
               | (We could argue about it, but personally I think some
               | kind of hash doesn't qualify.)
        
               | lapcat wrote:
               | What's the Venn diagram of people who both (1)
               | deliberately refrain from enabling iCloud Photos but
               | nonetheless (2) want the Photos app to phone home to
               | Apple in order to identify landmarks in locally stored
               | photos?
        
               | kemayo wrote:
               | It's probably a pretty large set of people, perhaps even
                | the majority, since I'd suspect that most people _don't_
               | pay for additional iCloud storage and can't fit their
               | photo library into 5GB.
               | 
               | In fact, I'm willing to bet that if they'd added this
               | feature and gated it behind iCloud Photos being enabled,
               | we'd have different articles complaining about Apple
               | making a cash grab by trying to get people to pay for
               | premium storage. :P
        
               | lapcat wrote:
               | > It's probably a pretty large set of people, perhaps
               | even the majority
               | 
               | As the article notes, this new feature is so "popular"
               | that neither Apple nor the Apple media have bothered to
               | mention it. AFAICT it's not even in Apple's document
                | listing all the new features of iOS 18:
                | https://www.apple.com/ios/ios-18/pdf/iOS_18_All_New_Features...
        
               | kemayo wrote:
               | True, but I don't see how that relates to anything? You
               | asked for a hypothetical set of people who'd have iCloud
               | Photos disabled but would accept metadata being sent to
               | Apple for better search. I can't help you if you want to
               | move the goalposts after I give you that.
        
               | lapcat wrote:
               | > You asked for a hypothetical set of people who'd have
               | iCloud Photos disabled but would accept metadata being
               | sent to Apple for better search.
               | 
               | No, I didn't ask for a hypothetical set. I wanted the
               | actual set of people.
        
               | kemayo wrote:
               | Well, neither of us have any way of surveying the public
               | about that, do we? My claim that those people would be
               | okay with it has as much weight as yours that they
               | wouldn't.
               | 
               | I can try to turn down my natural inclination towards
               | cautious phrasing, if you'd like? Get us on the same
               | level. :D
        
             | sumedh wrote:
             | "asking users about every single minor feature is (a)
             | incredibly annoying"
             | 
             | Then why lie and mislead customers that your data stays
             | local?
        
               | kemayo wrote:
               | I don't think that's a fair characterization of what
               | they're doing.
        
               | FireBeyond wrote:
                | No? There are _literal_ billboards linked on this thread
               | that say "what happens on your iPhone stays on your
               | iPhone."
               | 
               | Apple patting itself on the back.
        
               | scratchyone wrote:
               | I'm sure you know that the point of that billboard is to
               | state that your iPhone protects your privacy. That is
               | generally true, Apple is by far the most privacy-focused
               | major phone and software company. Advertising isn't
               | literal, if we're going to be pedantic here the photons
               | emitted by your iPhone's screen technically leave your
               | iPhone and definitely contain private information.
        
               | asadotzler wrote:
               | It's not pedantic to call out misleading advertising
               | unless you're a shill for the ones doing the misleading.
        
               | sgammon wrote:
               | If you one-way encrypt a value, and that value leaves the
               | phone, with no way to recover the original value, then
               | the original data never left the phone.
        
             | KennyBlanken wrote:
             | Especially for a company which heavily markets about how
             | privacy-focused it is,
             | 
              | 1) sending my personal data to them _in any way_ is not a
             | "feature." It's especially not a feature because what it
             | sets out to do is rather unnecessary because every photo
             | has geotagging, time-based grouping, and AI/ML/whatever on-
             | device keyword assignments _and_ OCR. I can open up my
             | phone right now and search for every picture that has grass
             | in it. I can search for  "washington" and if I took a
             | picture of a statue of george washington that shows the
             | plaque, my iPhone already OCR'd that and will show the
             | photo.
             | 
             | 2)"minor" is not how I would ever describe _sending data
             | based off my photos to them_ , regardless of how much it's
             | been stuffed through a mathematical meat grinder.
             | 
              | 3) Apple is usually very upfront about this sort of thing,
             | and also loves to mention the most minor, insignificant,
             | who-gives-a-fuck feature addition in the changenotes for
             | "point" system updates. We're talking things like "Numbers
             | now supports setting font size in chart legends" (I'm
             | making that up but you get the point.)
             | 
             | This was very clearly an "ask for forgiveness because the
             | data we want is absolutely priceless and we'll get lots of
             | it by the time people notice / word gets out." It's along
             | the lines of Niantic using the massive trove of photos from
             | the pokemon games to create 3d maps of everywhere.
             | 
             | I specifically use iOS because I value my privacy (and
             | don't want my cell phone data plan, battery power, etc to
             | be a data collection device for Google.) Sending data based
             | off my photos is a hard, do-not-pass-go-fuck-off-and-die
             | line in the sand for me.
             | 
             | It's especially shitty because they've gated a huge amount
             | of their AI shit behind owning the current iPhone
             | model....but apparently my several generation old iPhone is
             | more than good enough to do some AI analysis on all my
             | photos, to upload data for them?
             | 
              | Fuck everyone at Apple who was involved in this.
        
               | kalleboo wrote:
               | > _This was very clearly an "ask for forgiveness because
               | the data we want is absolutely priceless and we'll get
               | lots of it by the time people notice / word gets out._
               | 
                | It's very clearly not, since they've gone to huge lengths
                | to make sure they can't actually see the data themselves;
                | see the grandparent post.
        
               | grahamj wrote:
               | > It's especially shitty because they've gated a huge
               | amount of their AI shit behind owning the current iPhone
               | model....but apparently my several generation old iPhone
               | is more than good enough to do some AI analysis on all my
               | photos
               | 
               | Hear hear. As if they can do this but not Visual
               | Intelligence, which is just sending a photo to their
               | servers for analysis. Apple has always had artificial
               | limitations but they've been getting more egregious of
               | late.
        
             | isodev wrote:
             | > asking users about every single minor feature
             | 
             | Then perhaps the system is of poor design and needs further
             | work before being unleashed on users...
        
           | uoaei wrote:
           | I guess it depends on what you're calling "your data" --
           | without being able to reconstruct an image from a noised
           | vector, can we say that that vector in any way represents
           | "your data"? The way the process works, Apple makes their own
           | data that leaves your device, but the photo never does.
        
             | nottorp wrote:
             | It's the same as the CSAM initiative. It doesn't matter
             | what they say they send, you cannot trust them to send what
             | they say they send or trust them not to change it in the
             | future.
             | 
             | Anything that leaves my devices should do so with my opt-IN
             | permission.
        
               | timmytokyo wrote:
               | Even if they implemented the feature with opt-in
               | permissions, why would you trust this company to honor
               | your negative response to the opt-in?
        
           | matthewdgreen wrote:
           | I'm a cryptographer and I just learned about this feature
           | today while I'm on a holiday vacation with my family. I would
           | have loved the chance to read about the architecture, think
           | hard about how much leakage there is in this scheme, but I
           | only learned about it in time to see that it had already been
           | activated on my device. Coincidentally on a vacation where
           | I've just taken about 400 photos of recognizable locations.
           | 
           | This is not how you launch a privacy-preserving product if
           | your intentions are good, this is how you slip something
           | under the radar while everyone is distracted.
        
             | makeitdouble wrote:
              | To play Apple's advocate: this system will probably never
              | be perfect or stand up to full scrutiny from everyone on
              | the planet. And they also need as many people as possible
              | activated, as it's an adversarial feature.
              | 
              | The choice probably looks to them like:
              | 
              |     A - play the game, give everyone a heads up, respond
              |         to all feedback, and never ship the feature
              |     B - YOLO it, weather the storm, have people forget
              |         about it after the holiday, and go on with their
              |         life.
              | 
              | Whether B works is up for debate, but from their POV that
              | was probably their only chance to ship it.
        
               | walterbell wrote:
               | Did a variation of A already happen in 2022, with
               | "client-side scanning of photos"?
        
               | makeitdouble wrote:
               | Yes. That also was a thoroughly botched version of A, but
               | I think even a good version of A won't see them ship
               | anything within this century.
               | 
                | IMO giving up on having it widely used and just shipping
                | it turned off would be the best choice. But since that's
                | so obvious, there must be other critical reasons (good or
                | bad) why it's not an option.
        
               | chikere232 wrote:
               | To give you feedback in your role as Apple's advocate:
               | 
               | "we had to sneak it out because people wouldn't consent
               | if we told them" isn't the best of arguments
        
               | makeitdouble wrote:
                | Agreed. These past two or three years in particular,
                | there have been more instances where what's best for
                | Apple hasn't been what's best for their users.
        
             | calf wrote:
              | In engineering we distinguish the "how" of verification
              | from the "why" of validation; it looks like much of the
              | disagreement in these comments is about the premise of
              | whether ANY outgoing data counts as a privacy consent
              | issue. It's not a technical issue, it's a disagreement
              | over premises, and that can be hard to explain to the
              | other side.
        
           | theshrike79 wrote:
           | How would you explain client side vectorization, differential
           | privacy and homomorphic encryption to a layman in a single
           | privacy popup so that they can make an informed choice?
           | 
           | Or is it better to just trust that mathematics works and thus
           | encryption is a viable way to preserve privacy and skip the
           | dialog?
        
           | sgammon wrote:
           | "Your data" is not actually being sent off your device,
           | actually, it is being scrambled into completely unusable form
           | for anyone except you.
           | 
           | This is a much greater level of security than what you would
           | expect from a bank, for example, who needs to fully decrypt
           | the data you send it. When using your banking apps over HTTPS
           | (TLS), you are trusting the CA infrastructure, you are
           | trusting all sorts of things. You have fewer points of
           | failure when a key for homomorphic encryption resides only on
           | your device.
           | 
           | "Opting-in by default" is therefore not unsafe.
        
           | jeffybefffy519 wrote:
          | The big mistake here is that ownership of your Apple devices
          | is an illusion...
        
         | talldayo wrote:
         | The right call is to provide the feature and let users opt-in.
         | Apple knows this is bad, they've directly witnessed the
         | backlash to OCSP, lawful intercept and client-side-scanning.
          | There is no world in which they did not realize the problem;
          | they decided to enable it by default anyway, knowing full
          | well that users aren't comfortable with this.
         | 
         | People won't trust homomorphic encryption, entropy seeding or
         | relaying when none of it is transparent and all of it is
         | enabled in an OTA update.
         | 
         | > This is what a good privacy story looks like.
         | 
         | This is what a coverup looks like. Good privacy stories never
         | force third-party services on a user, period. When you see that
         | many puppets on stage in one security theater, it's only
         | natural for things to feel off.
        
           | oneplane wrote:
            | It's not that binary. Nobody is forcing anything: you can
            | choose not to buy a phone, or not to use the internet.
            | Heck, you can even choose not to install any updates!
           | 
           | What is happening, is that people make tradeoffs, and decide
           | to what degree they trust who and what they interact with.
           | Plenty of people might just 'go with the flow', but putting
           | what Apple did here in the same bucket as what for example
           | Microsoft or Google does is a gross misrepresentation.
            | Presenting it all as equal just kills the discussion, and
            | doesn't inform anyone any better.
           | 
           | When you want to take part in an interconnected network, you
           | cannot do that on your own, and you will have to trust other
           | parties to some degree. This includes things that might
           | 'feel' like you can judge them (like your browser used to
           | access HN right here), but you actually can't unless you
           | understand the entire codebase of your OS and Browser, all
           | the firmware on the I/O paths, and the silicon it all runs
           | on. So you make a choice, which as you are reading this, is
           | apparently that you trust this entire chain enough to take
           | part in it.
           | 
           | It would be reasonable to make this optional (as in, opt-in),
           | but the problem is that you end up asking a user for a ton of
           | "do you want this" questions, almost every upgrade and
           | install cycle, which is not what they want (we have had this
           | since Mavericks and Vista, people were not happy). So if you
           | can engineer a feature to be as privacy-centric yet automated
           | as possible, it's a win for everyone.
        
             | talldayo wrote:
             | > What is happening, is that people make tradeoffs, and
             | decide to what degree they trust who and what they interact
             | with.
             | 
             | People aren't making tradeoffs - that's the problem. Apple
             | is making the tradeoffs for them, and then retroactively
             | asking their users "is this okay?"
             | 
                | Users shouldn't need to buy a new phone to circumvent
             | arbitrary restrictions on the hardware that is their legal
             | property. If America had functional consumer protections,
             | Apple would have been reprimanded harder than their
             | smackdowns in the EU.
        
               | oneplane wrote:
               | People make plenty of tradeoffs. Most people trade most
               | of their attention/time for things that are not related
               | to thinking about technical details, legal issues or
               | privacy concerns. None of this exists in their minds.
               | Maybe the fact that they implicitly made this tradeoff
               | isn't even something they are aware of.
               | 
               | As for vectorised and noise-protected PCC, sure, they
               | might have an opinion about that, but people rarely are
               | informed enough to think about it, let alone gain the
               | insight to make a judgment about it at all.
        
               | cortesoft wrote:
               | They can just not enable the feature if they don't want
               | this?
        
               | latexr wrote:
               | The feature is enabled by default. That's the only reason
               | we're having this discussion at all.
        
             | gigel82 wrote:
             | I don't want my photos to take part in any network; never
             | asked for it, never expected it to happen. I never used
             | iCloud or other commercial cloud providers. This is just
             | forceful data extraction by Apple, absolutely egregious
             | behavior.
        
               | oneplane wrote:
               | Your photos aren't taken. Did you read the article at
               | all?
        
               | gigel82 wrote:
               | The "network" mention was in reply to your comment about
               | "participating in a network" which was never the case for
               | one's personal photos (unless explicitly shared on a
               | social network I guess).
               | 
               | I did read the article, yes :) Maybe our photos are not
               | sent bit-by-bit but enough data from the photos is being
               | sent to be able to infer a location (and possibly other
               | details) so it is the same thing: my personal data is
               | being sent to Apple's servers (directly or indirectly,
               | partially or fully) without my explicit consent.
               | 
               | At least the last time they tried to scan everyone's
               | photos (in the name of the children) they pinky promised
               | they'd only do it before uploading to iCloud, now they're
               | doing it for everyone's photos all the time - it's
               | disgusting.
        
               | oneplane wrote:
               | No, your photos aren't sent, also not 'pieces' of it.
               | They are creating vector data which can be used to create
               | searchable vectors which in turn can be used on-device to
               | find visual matches for your search queries (which are
               | local).
               | 
               | You can imagine it as hashes (created locally), some
               | characters of that hash from some random positions being
               | used to find out if those can be turned into a query
               | (which is compute intensive so they use PCC for that). So
               | there is no 'data' about what it is, where it is or who
               | it is. There isn't even enough data to create a picture
               | with.
               | 
               | Technically, everything could of course be changed, heck,
               | Apple could probably hire someone with binoculars and spy
               | on you 24/7. But this is not that. Just like baseband
               | firmware is not that, and activation is not that, yet
               | using them requires communication with Apple all the
               | same.
        
               | jazzyjackson wrote:
                | My understanding, as the blog laid it out, is that the
                | cloud service does the vector similarity search against
                | a finite database of landmark feature vectors, but
                | performs that computation under homomorphic encryption,
                | such that the result of the comparison can only be read
                | with a key that never left your device. So it just adds
                | a tag like "Eiffel Tower" that only you see. The feature
                | vector is sent off device, but it can never be read by
                | another party.
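                | 
                | A plaintext sketch of that matching step (names and
                | vectors invented; in the real protocol the query vector
                | is encrypted and the scoring runs on ciphertexts the
                | server cannot read):
                | 
                |     import math
                | 
                |     landmarks = {  # server-side database
                |         "Eiffel Tower": [0.9, 0.1, 0.3],
                |         "Golden Gate":  [0.2, 0.8, 0.5],
                |     }
                | 
                |     def cosine(a, b):
                |         dot = sum(x * y for x, y in zip(a, b))
                |         return dot / (math.hypot(*a) * math.hypot(*b))
                | 
                |     query = [0.88, 0.15, 0.28]  # on-device embedding
                |     scores = {k: cosine(query, v)
                |               for k, v in landmarks.items()}
                |     print(max(scores, key=scores.get))  # "Eiffel Tower"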
        
               | oneplane wrote:
               | Yep. It's essentially an implementation of remote
               | attestation "the other way around". Normally the edge
               | device is untrusted and needs to attest a variety of
               | things before compute is done and the result is accepted,
               | but PCC is the other way where the edge device holds the
               | keys (technically octagon works that out, but it's backed
               | by the on-device SEP).
               | 
               | So it does it multiple ways:
               | 
               | - Finite sets and added noise, doesn't hurt performance
               | too much but does make it nearly impossible to ID/locate
               | a photo
               | 
               | - Encryption at rest and in transit
               | 
               | - Transit over hops they don't own
               | 
               | - Homomorphic Encryption during remote compute
               | 
               | The data it finds was available in two ways: the "result"
               | and the vector embedding. Not sure which one you end up
               | consuming since it also has to work on older models that
               | might not be able to load the embeddings and perform
               | adequately, but it doesn't matter since the data itself
               | will be unique so you can't do parallel reconstruction,
               | but it is also uniquely meaningless to anyone without a
               | key. They are essentially computing on needles that
               | aren't in a haystack, but in a needle stack.
               | 
               | The primitives all this is built on have been around for
               | quite a while, including their HBONE implementation, the
               | cryptographically hidden data distribution and the SEP.
               | So far, it has been the only one of its kind outside of
               | disjointed options like buying and operating your own
               | HSM, a large TOR network and a yet to-be-invented self-
               | hosted PCC solution (AMD was supposed to release
               | something but they failed at that, just not as bad as
               | Intel messed up with SGX).
               | 
               | Technically, even with everything else removed, just some
               | good TLS 1.2+ and homomorphic encryption would have been
               | more than any other mass market manufacturer has ever
                | done in an effective way. But adding the additional
                | factors, such as degrees of separation so they couldn't
                | get in themselves (without breaking it for everyone in
                | the process), is what makes this so much more robust.
        
               | gigel82 wrote:
               | That is incorrect. If everything was local they wouldn't
               | need HE and OHTTP and everything else.
               | 
               | I would be ok with this being a local feature, where I
               | can download the signature database to my device and run
               | the search locally (as you say), but as it stands some
               | information about my photos (enough to detect places at
               | least, possibly more in the future) is being sent out of
               | my device. I want zero information about my photos to
               | leave my device.
        
               | talldayo wrote:
               | > Just like baseband firmware is not that, and activation
               | is not that, yet using them requires communication with
               | Apple all the same.
               | 
               | I mean, this is just wrong. Baseband firmware and carrier
               | activation _can_ be managed entirely independently of
               | Apple, they just choose to manage it themselves. The
               | number of places where Apple chooses to insert their own
               | services as arbitrary middlemen has been a perennially
                | worrying topic among Apple enthusiasts. It's not just
               | disrespectful to people that pay a premium for fewer
               | service advertisements, it's downright unsafe and does
               | not reflect the sort of forward-thinking security that
               | people in the industry respect.
               | 
               | There was a time when Apple focused on real and
               | innovative product differentiation, but I'll be damned if
               | you can give me a post-Wozniak example that isn't under
               | antitrust scrutiny. Apple relies on marketing and
                | branding to make people feel safe in a fundamentally
               | insecure system - I don't respect that as a proponent of
               | innovation and competitive digital markets.
        
               | oneplane wrote:
               | Baseband firmware and OS activation have nothing to do
               | with the carrier, just like it didn't on RIM devices back
               | in the day (which is probably the only somewhat
               | comparable version of this).
               | 
               | Perhaps you are thinking about subscription activation
               | (be it GSM or CDMA) and parameters for cell networks
               | (which can indeed be consumed by the baseband, which will
               | be running firmware supplied by the manufacturer,
               | sometimes re-packaged in system images as done in OEM
               | feature phones and many android phones).
               | 
               | Either way, macOS devices do the same thing (activation)
               | as do iPads without cell networking. Same goes for radio
               | firmware loading and updates. You'll find most wintel
               | laptops doing the same for things like WiFi (regardless
               | of softmac/halfmac/hardmac chips).
        
               | cortesoft wrote:
               | Then don't enable this feature?
        
               | layer8 wrote:
               | The article and this whole thread is about the fact that
               | the feature is enabled by default without notifying the
               | user that the feature even exists.
        
           | latexr wrote:
           | > This is what a coverup looks like.
           | 
           | That's starting to veer into unreasonable levels of
           | conspiracy theory. There's nothing to "cover up", the feature
           | has an off switch right in the Settings and a public document
           | explaining how it works. It should not be on by default but
           | that's not a reason to immediately assume bad faith. Even the
           | author of the article is concerned more about bugs than
           | intentions.
        
             | CapcomGo wrote:
             | Sure it is. This isn't a feature or setting that users
             | check often or ever. Now, their data is being sent without
             | their permission or knowledge.
        
               | latexr wrote:
               | Which is wrong but doesn't make it a coverup, which _by
               | definition_ assumes trying to _hide_ evidence of
               | wrongdoing.
        
               | talldayo wrote:
               | It is a coverup. Apple is overtly and completely aware of
               | the optics surrounding photo scanning - they know that an
               | opt-in scheme cannot work as they found out previously.
               | 
               | Since they cannot convince users to enable this feature
               | in good-faith, they are resorting to subterfuge. We know
               | that Apple is vehement about pushing client-side scanning
               | on users that do not want it, I do not believe for a
               | second that this was a mistake or unintended behavior. If
               | this was a bug then it would have been hotfixed
               | immediately to prevent the unintended behavior from
               | reaching any more phones than it already had.
        
               | threeseed wrote:
               | > they are resorting to subterfuge
               | 
               | This is illogical. If Apple wanted to engage in
               | subterfuge they would simply compromise the OS.
               | 
                | When a company controls the entire stack, either you trust
               | everything they do. Or nothing.
        
               | latexr wrote:
               | Exactly. It is absolutely bonkers that people are
               | claiming that Apple is trying to cover up something for
               | which they have a settings toggle and public
               | documentation.
        
               | kemayo wrote:
               | Yeah, we might quibble about what the default value of
               | the toggle should be, but them adding settings for minor
               | features like this is absolutely a good thing, and very
               | much a sign that they're not trying to hide things.
               | 
               | If anything, the lesson Apple might take from this could
               | be "adding the settings toggle was a bad idea, because
               | nobody would have cared about this at all otherwise".
        
             | gigel82 wrote:
             | Would you feel the same if Microsoft turned on Recall on
             | all Windows PCs everywhere with an update?
             | 
             | They worked very hard on security these past few months, so
             | it should be all good, right?
        
               | latexr wrote:
               | That is not the point at all and you either didn't try to
               | understand one iota of it or are outright arguing in bad
               | faith.
               | 
               | I am not claiming for one moment that enabling this by
               | default is OK. In fact, I have _explicitly_ said it is
               | not.
               | 
               | What I am saying is that it is ignorant to call this a
               | cover up, _because a cover up requires subterfuge_. This
               | feature has a freaking settings toggle _and public
               | documentation_. Calling it a cover up is the type of
               | uneducated rhetoric that makes these issues being brushed
               | off by those in power as "it's just a bunch of loonies
               | conspiracy theorists complaining".
        
               | gigel82 wrote:
               | Got it, so if Windows Defender (that is enabled by
               | default on all Windows PCs) pushes an update that scans
               | all your files on all connected drives and uploads hashes
               | to the mothership, enables this by default and proceeds
               | to execute the scan and upload immediately after update,
               | _but_ also includes a setting that lets you turn it off
               | when you find out about its existence from some 3rd party
               | article, that is all perfectly fine? (since there is no
               | subterfuge)
        
               | latexr wrote:
               | > Got it
               | 
               | Clearly you have not. If you did, you wouldn't continue
               | to give an example which is not equivalent.
               | 
               | No, it would not be "perfectly fine", I just said it
               | wouldn't. You can do something wrong without it being a
               | cover up.
        
             | matheusmoreira wrote:
             | > the feature has an off switch right in the Settings and a
             | public document explaining how it works
             | 
             | Irrelevant.
             | 
             | This is Apple's proprietary software, running on Apple's
             | computers, devices which use cryptography to prevent you
             | from inspecting it or running software they don't control.
             | Very few people have any idea how it actually works or what
             | it actually does.
             | 
             | That there's some "public document" describing it is not
             | evidence of anything.
             | 
             | > that's not a reason to immediately assume bad faith
             | 
             | The mere existence of this setting is evidence of bad
             | faith. The client side scanning nonsense proved
             | controversial despite their use of children as political
             | weapons. That they went ahead and did this despite the
             | controversy removes any possible innocence. It tells you
             | straight up that they cannot be trusted.
             | 
             | > Even the author of the article is concerned more about
             | bugs than intentions.
             | 
             | We'll draw our own conclusions.
        
           | threeseed wrote:
           | > This is what a coverup looks like
           | 
           | This is a dumb take. They literally own the entire stack
           | Photos runs on.
           | 
           | If they really wanted to do a coverup we would _never_ know
           | about it.
        
             | talldayo wrote:
             | Why wouldn't this mistake be addressed in a security hotfix
             | then? Apple has to pick a lane - this is either intended
             | behavior being enabled against users' wishes, or unintended
             | behavior that compromises the security and privacy of
             | iPhone owners.
        
               | threeseed wrote:
               | Or option (c).
               | 
                | This is a useful, harmless feature that does not
                | compromise security or privacy in any way.
        
               | talldayo wrote:
               | If it's a useful, harmless feature, then let users opt-
               | in. You'd think Apple would have learned their lesson
                | after a decade of people bitching about U2 being secretly
                | added to their iTunes library, but apparently not.
               | 
               | Users are smart enough to decide for themselves. Let
               | them.
        
               | kalleboo wrote:
               | All I've seen this year is people complaining about all
               | the security dialogs Apple are throwing at them when they
               | upgrade macOS ARE YOU SURE ARE YOU SURE ARE YOU SURE
        
         | cma wrote:
         | Quantum computing makes the homomorphic stuff ineffective in
         | the mid-term. All they have to do is hold on to the data, and
         | in maybe 10-25 years they can get the results of the lookup
         | table computation. Shouldn't be on by default.
        
           | oneplane wrote:
           | What makes you think that this is the biggest problem if
           | things like AES and RSA are suddenly breakable?
           | 
           | If someone wanted to get a hold of your cloud hosted data at
           | that point, they would use their capacity to simply extract
           | enough key material to impersonate a Secure Enclave. At
           | that point, you "are" the device and as such you "are" the
           | user. No need to make it more complicated than that.
           | 
           | In theory, Apple and other manufacturers would already use
           | PQC to prevent such scenarios. Then again, QC has been
           | "coming soon" for so long, it's doubtful that any information
           | that is currently protected by encryption will still be
           | valuable by the time it can be cracked. Most real-world
           | process implementations don't rely on some "infinite
           | insurance", but assume it will be breached at some point and
           | just try to make it difficult or costly enough to run out the
           | clock on confidentiality, which is all that really matters.
           | Nothing that exists really needs to be confidential forever.
           | Things either get lost/destroyed or become irrelevant.
        
             | cma wrote:
             | This is ostensibly for non-cloud data, derivatives of
             | which are auto-uploaded after an update.
        
         | latexr wrote:
         | > The author themselves looks to be an Apple security
         | researcher
         | 
         | They're not. Jeff Johnson develops apps (specifically Safari
         | extensions) for Apple platforms and frequently blogs about
         | their annoyances with Apple, but they're not a security
         | researcher.
        
         | gigel82 wrote:
         | This may be a "good" privacy story but a way better one is to
         | just not send any of your data anywhere, especially without
         | prior consent.
        
         | jazzyjackson wrote:
         | This must be the first consumer or commercial product
         | implementing homomorphic encryption, is it not?
         | 
         | I would be surprised if doing noisy vector comparisons is
         | actually the most effective way to tell if someone is in front
         | of the Eiffel Tower. A small large language model could caption
         | it just as well on device. My spider sense tells me someone saw
         | an opportunity to apply bleeding-edge, very cool tech so that
         | they can gain experience and do it bigger and better in the
         | future, but they're fumbling their reputation by doing this
         | kind of user data scanning.
        
           | do_not_redeem wrote:
           | > This must be the first consumer or commercial product
           | implementing homomorphic encryption is it not?
           | 
           | Not really, it's been around for a bit now. From 2021:
           | 
           | > The other major reason we're talking about HE and FL now is
           | who is using them. According to a recent repository of PETs,
           | there are 19 publicly announced pilots, products, and proofs
           | of concept for homomorphic encryption and federated analytics
           | (another term for federated learning) combined. That doesn't
           | seem like a lot ... but the companies offering them include
           | Apple, Google, Microsoft, Nvidia, IBM, and the National
           | Health Service in the United Kingdom, and users and investors
           | include DARPA, Intel, Oracle, Mastercard, and Scotiabank.
           | Also, the industries involved in these early projects are
           | among the largest. Use cases are led by health and social
           | care and finance, with their use in digital and crime and
           | justice also nontrivial (figure 1).
           | 
           | https://www2.deloitte.com/us/en/insights/industry/technology.
           | ..
           | 
           | I do wonder why we don't hear about it more often though.
           | "Homomorphic encryption" as a buzzword has a lot of headline
           | potential, so I'm surprised companies don't brag about it
           | more.
        
             | fragmede wrote:
             | But what are the _products_ from them that implement HE and
             | that consumers are using? Microsoft, IBM, Intel, and Google
             | have all released libraries for HE, and there's Duality
             | SecurePlu, but as far as actual consumer products go,
             | Apple's caller ID phone number lookup and other features
             | in iOS 18 are very possibly the first.
             | 
             | As far as why it's not more of a buzzword, it's far too in
             | the weeds and ultimately consumers either trust you or they
             | don't. And even if they don't trust you, many of them are
             | still going to use Apple/Google/Facebook systems anyway.
        
             | lovemenot wrote:
             | >> I do wonder why we don't hear about it more often though
             | 
             | homomorphobia?
        
           | m463 wrote:
           | It seems apple might be using it for live caller id lookup?
        
           | Klonoar wrote:
           | Apple themselves have already used it in the past (Caller ID)
        
         | lysace wrote:
         | > - OHTTP relay: it's sent through a 3rd party so Apple never
         | knows your IP address. The contents are encrypted so the 3rd
         | party never learns anything either (some risk of
         | exposing "IP X is an apple photos user", but nothing about the
         | content of the library).
         | 
         | Which 3rd party is that?
        
           | gigel82 wrote:
           | The NSA, the CCP, etc. depending on jurisdiction. (joking,
           | but not really)
        
           | oneplane wrote:
           | I don't have a list on hand, but at least Cloudflare and
           | Akamai are part of the network hops. Technically you only
           | need 2 hops to make sure no origin or data extraction can be
           | done.
        
             | jazzyjackson wrote:
             | Oh good, Cloudflare gets one more data point on me, a ping
             | every time I add a photo to my library.
        
               | threeseed wrote:
               | a) Cloudflare doesn't know about you. It sees an IP
               | address.
               | 
               | b) If we follow your tortured logic then every hop along
               | the path from your phone to Apple will have one more data
               | point on you. That's thousands of companies a day.
        
               | jazzyjackson wrote:
               | I'm just griping that Cloudflare has many eyeballs and
               | sees most of the traffic on the internet at this point.
               | How many websites have I used with Cloudflare DDoS
               | protection that checks if I'm a bot by fingerprinting my
               | device? They know plenty about me.
               | 
               | I'm also griping that "the data is encrypted!" is not a
               | good enough excuse seeing as how we've known for years
               | that the metadata is a bigger pot of gold for
               | intelligence agencies. That my MAC address is taking a
               | photo and hitting a particular cell tower is pretty
               | detailed information, even without knowing the content of
               | the photo.
        
         | homofermic wrote:
         | Regarding HE: since the lookup is generated by the requestor,
         | it can be used as an adversarial vector, which can result in
         | exfiltration by nearest neighbor (closest point to vector)
         | methods. In other words, you can change what you are searching
         | for, and much like differential power analysis attacks on
         | crypto, extract information.
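         | 
         | A toy sketch of the concern (every vector, label, and number
         | below is invented for illustration, not taken from Apple's
         | scheme): whoever crafts the query vector controls what a
         | nearest-neighbor lookup reveals, so a malicious requestor can
         | probe for content the user never searched for.
         | 
         |     import math
         | 
         |     # Hypothetical embedding database; "protest_sign" stands
         |     # in for a sensitive cluster an adversary might probe for.
         |     db = {
         |         "eiffel_tower": [0.9, 0.1, 0.0],
         |         "protest_sign": [0.1, 0.9, 0.1],
         |         "golden_gate":  [0.0, 0.2, 0.9],
         |     }
         | 
         |     def cosine(a, b):
         |         dot = sum(x * y for x, y in zip(a, b))
         |         return dot / (math.hypot(*a) * math.hypot(*b))
         | 
         |     probe = [0.1, 0.9, 0.1]  # adversary-chosen query vector
         |     print(max(db, key=lambda k: cosine(db[k], probe)))
         |     # -> "protest_sign": the match itself leaks information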
        
           | brookst wrote:
           | Does the noise addition not help? Is that a mitigation for a
           | different attack?
        
         | ustad wrote:
         | You're presenting a false dichotomy between "perfect user
         | understanding" and "no user choice." The issue isn't whether
         | users can fully comprehend homomorphic encryption or
         | differential privacy - it's about basic consent and
         | transparency.
         | 
         | Consider these points:
         | 
         | 1. Users don't need a PhD to understand "This feature will send
         | data about your photos to Apple's servers to enable better
         | search."
         | 
         | 2. The complexity of the privacy protections doesn't justify
         | removing user choice. By that logic, we should never ask users
         | about any technical feature.
         | 
         | 3. Many privacy-conscious users follow a simple principle: they
         | want control over what leaves their device, regardless of how
         | it's protected.
         | 
         | The "it's too complex to explain" argument could justify any
         | privacy-invasive default. Would you apply the same logic to,
         | say, enabling location services by default because explaining
         | GPS technology is too complex?
         | 
         | The real solution is simple: explain the feature in plain
         | language, highlight the benefits, outline the privacy
         | protections, and let users make their own choice. Apple already
         | does this for many other features. "Default off with opt-in" is
         | a core principle of privacy-respecting design, regardless of
         | how robust the underlying protections are.
        
           | scosman wrote:
           | I don't believe I said or implied that anywhere: 'You're
           | presenting a false dichotomy between "perfect user
           | understanding" and "no user choice."'? Happy to be corrected
           | if wrong.
           | 
           | Closest I came to presenting an opinion on the right UX
           | was "I'm not sure what the right call is here". The thing I
           | disagreed with was a technical statement "the only way to
           | guarantee computing privacy is to not send data off the
           | device.".
           | 
           | Privacy respecting design and tech is a passion of mine. I'm
           | pointing out "user choice" gets hard as the techniques used
           | for privacy exceed the understanding of users. Users can
           | intuitively understand "send my location to Google
           | [once/always]" without understanding GPS satellites. User's
           | can't understand the difference between "send my photo" and
           | "send homomorphicly encrypted locally differentially private
           | vector of e=0.8" and "send differentially private vector of
           | e=50". Your prompt "send data about your photos..." would
           | allow for much less private designs than this. If we want to
           | move beyond "ask the user then do it", we need to get into
           | the nitty gritty details here. I'd love to see more tech like
           | this in consumer products, where it's private when used, even
           | when opted-in.
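           | 
           | To make the epsilon comparison concrete, a rough sketch: for
           | the Laplace mechanism with sensitivity 1, the noise scale is
           | 1/e (the epsilons are the ones from my comment above; nothing
           | here is Apple's actual configuration).
           | 
           |     # Laplace-mechanism noise scale is 1/epsilon, so the two
           |     # epsilons differ enormously in the protection they give.
           |     for eps in (0.8, 50):
           |         print(eps, "-> noise scale", 1 / eps)
           |     # 0.8 -> noise scale 1.25 (heavily perturbed)
           |     # 50  -> noise scale 0.02 (barely perturbed, far less
           |     #                          private)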
        
             | lapcat wrote:
             | The choice is between "use an online service" or "don't use
             | an online service". That's simple enough for anyone to
             | understand.
             | 
             | Apple can try to explain as best it can how user data is
             | protected when they use the online service, and then the
             | user makes a choice to either use the service or not.
             | 
             | In my case, I don't even have a practical use for the
             | new feature, so it's irrelevant how private the online
             | service is. As it is, though, Apple silently forced me to
             | use an online service that I never wanted.
        
             | ustad wrote:
             | I appreciate your passion for privacy-respecting technology
             | and your clarification. You make good points about the
             | nuances of privacy-preserving techniques. However, I think
             | we can separate two distinct issues:
             | 
             | 1. The technical excellence of Apple's privacy protections
             | (which you've explained well and seem robust)
             | 
             | 2. The ethical question of enabling data transmission by
             | default
             | 
             | Even with best-in-class privacy protections, the principle
             | of user agency matters. A simplified prompt like "This
             | feature will analyze your photos locally and send secure,
             | anonymized data to Apple's servers to enable better search"
             | would give users the basic choice while being technically
             | accurate. The technical sophistication of the privacy
             | measures, while commendable, doesn't override the need for
             | informed consent.
        
             | calf wrote:
             | This is not a matter of respect, it is a matter of ethics.
             | Otherwise you will just end up rationalizing
             | technocratic, unethical technology. No amount of passion
             | will justify that.
        
         | yapyap wrote:
         | > "It ought to be up to the individual user to decide their own
         | tolerance for the risk of privacy violations." -> The author
         | themselves looks to be an Apple security researcher, and are
         | saying they can't make an informed choice here
         | 
         | I don't think that's what the author is saying at all. I
         | think he's saying that Apple should let the user decide for
         | themselves if they want to send all this shit to Apple:
         | freedom for the individual. They're not saying "I dunno".
        
         | omolobo wrote:
         | So what? Why should the application talk over the Internet to
         | begin with? And why isn't that functionality off by default
         | under a settings option that clearly warns the user of the
         | consequences? I think you're missing the forest for the trees
         | here.
         | 
         | And the claims that this is good privacy/security are not at
         | all obvious either. And who are those third-parties anyway? Did
         | you verify each one of them?
        
         | jncfhnb wrote:
         | > if you want features that require larger-than-disk datasets,
         | or frequently changing content, you need tools like this.
         | 
         | Well I want them to fuck off.
         | 
         | Hidden in your commentary here is the fact that the vector
         | representation of the image is _the contents of the image_. It
         | very well may be that they cannot reverse the exact image. But
         | it's still a representation of the image that has to be good
         | for something. Without being too familiar I would be willing to
         | hazard a guess that this could include textual labels and
         | classifications of what is in the image.
         | 
         | I don't give a shit how good your internal controls are. I
         | don't have anything particularly interesting to hide. I still
         | do not want you taking my pictures.
        
           | scratchyone wrote:
           | If you read the research you'd know that they don't have
           | access to the vector either. They _never_ decrypt the data.
           | All operations on their server are done directly on the
           | encrypted data. They get 0 information about your photos.
           | They cannot even see which landmark your vector was closest
           | to.
        
             | jncfhnb wrote:
             | I don't care! I do not want them helping themselves to
             | representations of my data. I don't care if it's encrypted
             | or run through a one way hash. I don't care if they only
             | interact with it via homomorphic methods. They can, again,
             | fuck the hell off.
             | 
             | A private corporation has no business reading my files for
             | its own benefit.
        
         | griomnib wrote:
         | I'm deeply familiar with all of these techniques; the core
         | issue here is informed consent, which they have not obtained.
         | 
         | Furthermore, Apple's privacy stance is generally a sham, as
         | their definition of "human rights" doesn't extend to China.
         | Which either means Apple doesn't respect human rights, or they
         | don't view Chinese people as human.
        
           | iammattmurphy wrote:
           | That's not really fair; Apple's in a sticky wicket when it
           | comes to the Chinese government, and they're not the only
           | ones.
           | 
           | The Chinese government are debatably inhuman. They've
           | literally censored the word "censorship." (Then they censored
           | what people used euphemistically for
           | censorship--"harmonious.") It's funny from the outside but
           | also a miserable state of affairs in 2024.
        
             | griomnib wrote:
             | It's _very_ fair, Apple has historically been very happy to
             | be the sponsor of horrible human rights violations in their
             | supply chain, only marginally paying attention to suicides
             | in their factories when the PR got too bad.
             | 
             | Apple's version of "human rights" includes suicide nets as
             | an alternative to treating people humanely. That's why
             | their stance is pure marketing - they have blood on their
             | hands.
             | 
             | And guess what? You can't use Google in China, and while
             | Google isn't by any means perfect, they aren't Apple.
        
               | yladiz wrote:
               | Oh come on, it's not like Google is better. 1. Google's
               | absence from China isn't for some moral reason; they were
               | in fact available in the past, and Google has attempted
               | to go back to China as a search engine, etc. before. They
               | aren't available because China hacked into their systems,
               | took code, and got access to certain accounts, and at the
               | time Google essentially decided it wasn't worth offering
               | services there. There's no moral reason Google wouldn't
               | start working in China again. 2. Google works directly
               | with the US Department of Defense, so as much as Apple
               | has blood on their hands, so does Google.
        
               | griomnib wrote:
               | A) Google refused to censor in the mainland and that's
               | why they got blocked, B) Google employees had a walkout
               | when they tried to go back to China. C) I've never seen
               | an Apple employee protest _anything_.
               | 
               | Now, that happened under a different type of Google (I'm
               | sure they'll go back now that they torched the culture),
               | but Apple employees who walk around SV like they are
               | better than everybody else - which people on their
               | privacy teams very much _do_ - are pathetically deluded.
               | 
               | All the megacorps are evil, and while Apple likes to put
               | on a holier-than-thou act, it's just as much bullshit for
               | them to claim they value human rights as it is for Google
               | to say they value privacy.
               | 
               | They value money, that's it.
        
           | slashdave wrote:
           | Apple follows the law. First you need to get the Chinese
           | government to respect those rights. The only other choice is
           | to stop doing business entirely in the country.
        
             | griomnib wrote:
             | A choice many companies have made. Apple is in China to
             | make money, which is what a corporation is set up to do. My
             | point is them claiming the moral high ground of a human
             | rights defender is utterly laughable bullshit.
        
         | thefounder wrote:
         | The best you can hope for is integrity and security until your
         | information reaches its destination, but you must be foolish
         | to assume that Apple or the U.S. government cannot decipher
         | the information you sent, or use it against you (e.g. set a
         | person of interest as a "landmark" and find out whose iPhone
         | matches that "landmark").
         | 
         | It's no longer a conspiracy. I think we are all past that
         | point (i.e. with Snowden and WikiLeaks). We live in a
         | surveillance world, and "They're guarding all the doors and
         | holding all the keys".
        
         | StanislavPetrov wrote:
         | >There are other tools to provide privacy (DP, homomorphic
         | encryption), while also using services. They are immensely
         | complicated, and user's can't realistically evaluate risk.
         | 
         | It is simple for any user to evaluate the risk of their
         | data being breached on 3rd party servers when their data isn't
         | being sent off the device - there is none. It is only when
         | corporations insist that they are going to send the data off
         | your device whether you like it or not that evaluating risk
         | becomes necessary.
        
         | throwaway81523 wrote:
         | > This is what a good privacy story looks like.
         | 
         | A good privacy story actually looks like not sending any info
         | to anyone else anywhere at any time.
        
           | bobsomers wrote:
           | Sure, but if we follow that line of thinking to its logical
           | conclusion, we must move to a cabin in the woods, 100 miles
           | from the nearest civilization, growing our own food and never
           | connecting our computing devices to anything resembling a
           | network.
        
             | ketralnis wrote:
             | No? You can have a photos app that doesn't phone home while
             | not having to move to a cabin in the woods. See: every
             | photos app that doesn't phone home, and I currently don't
             | live in a cabin in the woods.
        
             | eptcyka wrote:
             | It is a spectrum, and I'd love to be able to draw my own
             | line. I have my own photo storage solution. I do not need
             | Apple's.
        
               | dialup_sounds wrote:
               | This entire post is about a setting in Apple's Photos
               | app.
        
             | codys wrote:
             | It's not charitable to construct a contrived situation no
             | one is talking about and place that into the mouth of a
             | commenter.
        
             | nimih wrote:
             | I've read the post you're responding to like 3 times, and
             | after pondering it deeply, I'm pretty sure the conclusion
             | of their line of thinking pretty definitively stops at
             | "Apple should not be sending data off the device without
             | the user requesting it." If you think otherwise, you should
             | maybe provide more of an argument.
        
               | nmilo wrote:
               | The line of thinking is right there: "not sending any
               | info to anyone else anywhere at any time"
               | 
               | There are way more egregious privacy concerns than
               | sending non-reversibly encrypted noisy photos to Apple.
               | Why draw the line here and not the far worse things
               | happening on your phone and computer right now?
        
               | kelseyfrog wrote:
               | Demanding consistency of the human psyche is a fool's
               | errand.
        
               | threeseed wrote:
               | Because the conclusion is not workable.
               | 
               | Almost every single app today interacts with the network
               | in some way.
               | 
               | You would be constantly annoying the user with prompt
               | after prompt if you wanted to get consent for sending any
               | relatively harmless data off the device.
        
               | rkagerer wrote:
               | It's telling that Android, for example, has all sorts of
               | granular permissions you can set for an app, but "Network
               | Access" is not one of them.
               | 
               | My Calculator app does _not_ need to call home.
               | 
               | A good portion of the apps I use fall into this category,
               | and a straightforward mechanism to opt them out of access
               | would be welcome.
        
               | botanical76 wrote:
               | FWIW, a "Network Access" app permission is one of the
               | features that GrapheneOS provides. It is a setting
               | offered to the user on every single app install. It should
               | be in base AOSP, and I have to wonder why it isn't
               | already.
        
               | LtWorf wrote:
               | Ah the GDPR complaint. Just don't collect the data and
               | you won't be annoying anyone!
        
               | nicce wrote:
               | It is probably reasonable for the average end-user to
               | expect that landmark-based search works without enabling
               | an extra setting.
               | 
               | They have the option to disable it if they care.
        
               | z3phyr wrote:
               | The initiative is for the user to command their computer
               | to communicate or not with the information of their
               | choosing.
               | 
               | "Computer, I command thee to send this and only this
               | information over the channel of my choosing, using
               | following encryption scheme, for here be my seal of
               | approval for anyone who might want to verify, and here be
               | the key"
               | 
               | "Sicut Vult"
        
               | nicce wrote:
               | I understand the enthusiasm but from the business
               | perspective it does not matter. Many businesses would
               | fail if they go too deep on this. Their only audience
               | would be people who are experts in the area. Other people
               | are confused and disappointed since things are not
               | working as they expect.
               | 
               | On Apple's scale, most people care about the things they
               | can do, not about how it happens. For that reason,
               | the default matters when the option only concerns the
               | internal processing pipeline and privacy.
               | 
               | As a result, it is enough that, if some expert
               | investigates the matter, they can show that privacy is
               | considered at a reasonable level.
               | 
               | Maybe some day in the future these things will be common
               | knowledge, but I fear that the knowledge gap will just
               | keep increasing.
        
             | satvikpendem wrote:
             | Slippery slope fallacy. Nothing you said derives from not
             | wanting to send information to remote servers, it's a false
             | dichotomy.
        
             | ab_testing wrote:
             | Or just use Linux where no component phones home.
        
               | parasubvert wrote:
               | Plenty of Linux distributions phone home, in some way, by
               | default.
        
               | LtWorf wrote:
               | They send user data? Source?
               | 
               | Debian has popcon, which must be enabled explicitly and
               | that is it. It sends the list of the installed packages.
        
           | st3fan wrote:
           | Your answer shows how we all have a very different idea of
           | what our own desired privacy level is. Or what privacy even
           | means.
        
             | throwaway81523 wrote:
             | If you think that sending data to a remote server is
             | equally private to not sending it, then you are the one who
             | doesn't know what privacy means.
             | 
             | Of course it's fine to not desire privacy, or to desire a
             | privacy level that is less than private. That's up to you.
             | I liked the privacy of my old Canon digicam that had no
             | internet. A photo app on a phone that sends stuff over the
             | network might bring some useful functionality in return,
             | but it can only be considered a regression in terms of
             | privacy.
        
               | theshrike79 wrote:
               | Privacy isn't a binary option. There are levels of
               | privacy between "private" and "not private".
               | 
               | What Apple has implemented is a LOT closer to "private"
               | than "not private"
        
         | hahahacorn wrote:
         | Thank you for this comment. I found the author's ignorance to
         | be fairly discrediting, and was surprised to find so many
         | follow-up comments equally railing on Apple.
         | 
         | Between the quote you pointed out and:
         | 
         | "One thing I do know, however, is that Apple computers are
         | constantly full of privacy and security vulnerabilities, as
         | proved by Apple's own security release notes" which just reeks
         | of survivorship bias.
         | 
         | I think the final call of what is right here _shouldn't_ be
         | informed by the linked article.
         | 
         | IMO, enabled by default without opt-in is absolutely the right
         | call when judging between 1: Feature value 2: Security risk 3:
         | Consent Fatigue.
         | 
         | If you're data-conscious enough to disagree with my prior
         | statement, you should consider having lockdown mode enabled.
         | 
         | If you disagree with my prior statement because of how Apple
         | locks you into Photos, :shake_hands:.
         | 
         | If Enhanced Visual Search is still enabled by default in
         | lockdown mode, then I think that's worth a conversation.
        
           | lapcat wrote:
           | > I found the author's ignorance to be fairly discrediting
           | 
           | Why in the world am I supposed to be an expert on homomorphic
           | encryption? How many people in the world are experts on
           | homomorphic encryption?
           | 
           | > which just reeks of survivorship bias.
           | 
           | What does that even mean in this context?
           | 
           | > 1: Feature value
           | 
           | What is the value of the feature? As the article notes, this
           | new feature is flying so low under the radar that Apple
           | hasn't bothered to advertise it, and the Apple media haven't
           | bothered to mention it either. You have to wonder how many
           | people even wanted it.
           | 
           | > If you're data-conscious enough to disagree with my prior
           | statement, you should consider having lockdown mode enabled.
           | 
           | That's ridiculous. Apple itself has said, "Lockdown Mode is
           | an optional, extreme protection that's designed for the very
           | few individuals who, because of who they are or what they do,
           | might be personally targeted by some of the most
           | sophisticated digital threats. Most people are never targeted
           | by attacks of this nature." https://support.apple.com/105120
           | 
           | Lockdown mode is basically for famous people and nobody else.
        
             | hahahacorn wrote:
             | > Why in the world am I supposed to be an expert on
             | homomorphic encryption? How many people in the world are
             | experts on homomorphic encryption?
             | 
             | No one, at any point, implied you had to be an expert on
             | homomorphic encryption. But if you're going to evaluate the
             | security risk of a feature, and then end up on the front
             | page of HN for said security risk, I think it's fair to
             | criticize your lack of objectivity (or attempt at
             | objectivity) by way of not even trying to understand the
             | technical details of the blog.
             | 
             | I will say I think my word choice was unnecessarily harsh,
             | I'm sorry. I think I meant more indifference/inattention.
             | 
             | > What does that even mean in this context?
             | 
             | Apple's list of Security releases is long and storied. By
             | comparison, the Solana Saga Web3 phone's list of security
             | releases is short and succinct. Therefore, the Solana Saga
             | must be more secure and has better security than an Apple
             | device!
             | 
             | > What is the value of the feature? As the article notes,
             | this new feature is flying so low under the radar that
             | Apple hasn't bothered to advertise it, and the Apple media
             | haven't bothered to mention it either. You have to wonder
             | how many people even wanted it.
             | 
             | The marketability of a feature is not necessarily
             | correlated with its value. Some features are simply
             | expected and would be silly to advertise, e.g. the ability
             | to check email or text friends. For other features it is
             | difficult to evaluate efficacy, so you release and collect
             | feedback instead of advertising and setting false
             | expectations.
             | 
             | > Lockdown mode is basically for famous people and nobody
             | else.
             | 
             | Similar to feature value, the audience of that statement
             | is your average person (read: does not read/post on Hacker
             | News). Based on your pedigree, I feel as though you
             | probably know better, and given your "no tolerance for
             | risk" for such a feature, it's something worth at least
             | considering, and definitely isn't ridiculous.
             | 
             | I think it's great you started this conversation. I
             | disagree with your opinion, and that's okay!! But I don't
             | think it's particularly beneficial to any discourse to 1.
             | Imply that you are evaluating security risk 2. Be given a
             | well written technical article so that you are able to make
             | an informed decision (and then share that informed
             | decision) 3. Ignore relevant information from said article,
             | make an uninformed decision 4. Be surprised when someone
             | says you made an uninformed decision 5. Imply the only way
             | to make an informed decision would be to be an expert in
             | the relevant fields from the technical article
             | 
             | Anyway - thanks for writing and replying. Creating and
             | putting yourself out there is hard (as evidenced by my
             | empty blog that I promised I'd add to for the past 2
             | years). And my criticism was too harsh.
        
               | lapcat wrote:
               | > if you're going to evaluate the security risk of a
               | feature
               | 
               | I wouldn't characterize that as the point of my blog
               | post. It's primarily about user consent, or lack thereof.
               | 
               | > and then end up on the front page of HN for said
               | security risk
               | 
               | I have no control over that. I didn't even submit the
               | article to HN.
               | 
               | > Apple's list of Security releases is long and storied.
               | By comparison, the Solana Saga Web3 phone's list of
               | security releases is short and succinct. Therefore, the
               | Solana Saga must be more secure and has better security
               | than an Apple device!
               | 
               | This is a red herring. I wasn't comparing Apple security
               | to any other company's security. I was merely pointing
               | out the possibility of bugs and vulnerabilities in
               | Apple's new feature.
               | 
               | > Other features are difficult to evaluate efficacy, so
               | you release and collect feedback instead of advertising
               | and setting false expectations.
               | 
               | Well, I've now given my feedback on the new feature.
               | 
               | > Similar to feature value, the audience of that
               | statement is your average person (read: does not
               | read/post on Hacker News). Based on your pedigree, I
               | feel as though you probably know better
               | 
               | I'm not sure I understand. Are you claiming that Apple,
               | in its support document, is deliberately
               | mischaracterizing Lockdown Mode?
               | 
               | > But I don't think it's particularly beneficial to any
               | discourse to 1. Imply that you are evaluating security
               | risk
               | 
               | As I've said above, I wasn't.
               | 
               | > 3. Ignore relevant information from said article
               | 
               | I didn't ignore the relevant information from said
               | article. I read the article, but some of the technical
               | details are beyond my current knowledge.
               | 
               | > make an uninformed decision
               | 
               | What uninformed decision are you talking about?
               | 
               | > 4. Be surprised when someone says you made an
               | uninformed decision
               | 
               | I'm surprised because I have no idea what "uninformed
               | decision" you mean.
               | 
               | > 5. Imply the only way to make an informed decision
               | would be to be an expert in the relevant fields from the
               | technical article
               | 
               | I didn't imply that at all. To the contrary, I insisted
               | that the decision to enable the feature should be up to
               | the user, not up to Apple.
        
               | hahahacorn wrote:
               | I don't think you're trying to understand what I'm
               | saying, e.g.:
               | 
               | > 3. Ignore relevant information from said article
               | 
               | "I didn't ignore the relevant information from said
               | article. I read the article, but some of the technical
               | details are beyond my current knowledge."
               | 
               | > make an uninformed decision
               | 
               | "What uninformed decision are you talking about?"
               | 
               | I don't think I need to specify that by uninformed
               | decision I mean evaluating the security risk of the
               | feature. I think I criticized too harshly, and you're
               | (understandably) not engaging with me fairly in
               | retaliation. If you actually want to engage with me and
               | discuss this further, feel free to shoot me an email (in
               | my about section). Otherwise, obligatory
               | https://www.paulgraham.com/vb.html.
        
               | lapcat wrote:
               | > I don't think you're trying to understand what I'm
               | saying
               | 
               | I'm trying, but obviously I'm failing.
               | 
               | > I don't think I need to specify that by uninformed
               | decision I mean evaluating the security risk of the
               | feature.
               | 
               | For the third time, that wasn't what I was trying to do
               | with the blog post.
               | 
               | > you're (understandably) not engaging with me fairly in
               | retaliation
               | 
               | I don't think you're understanding me either. I'm not
               | retaliating. I was trying to clarify.
        
           | pama wrote:
           | Enhanced Visual Search was enabled despite my having
           | Lockdown Mode on. I worry about enhanced visual search
           | capabilities much less than several of the other risky
           | features that Lockdown Mode disables, but was a bit
           | surprised by the default opt-in on my Lockdown Mode phone.
        
             | hahahacorn wrote:
             | Yeah that's genuinely surprising and I disagree with that
             | decision.
        
         | amiga386 wrote:
         | > This is what a good privacy story looks like.
         | 
         | A good privacy story starts with "Do you consent" and not
         | transmitting a byte if you answer "no"
        
         | blindriver wrote:
         | This sounds exactly like that CSAM "feature" they wanted to
         | add, which created a huge outrage because of how incredibly
         | invasive it was.
         | 
         | It sounds like it only needs a few extra lines of code to get
         | exactly what they wanted before, they just packaged it
         | differently and we all fell for it like frogs getting boiled in
         | water.
        
           | nntwozz wrote:
           | A frog that is gradually heated will actually jump out.
           | 
           | https://en.m.wikipedia.org/wiki/Boiling_frog
        
             | userbinator wrote:
             | Let's hope that part of the analogy also applies.
        
           | theshrike79 wrote:
           | Oh ffs, not this one again.
           | 
           | The CSAM filtering was a best-in-class implementation. I'm
           | pretty sure I'm one of maybe a dozen people who actually read
           | the spec before throwing a hissy-fit about "muh privacy!"
           | 
           | The only actual "flaw" was that maybe a state-level actor
           | could make it scan for bad stuff on your device.
           | 
           | BUT they can do it to your cloud data _today_. And if you
           | disabled cloud uploads, the local scanning was disabled too.
        
             | 4ad wrote:
             | > BUT they can do it to your cloud data _today_.
             | 
             | No, they can't if you enable ADP[1].
             | 
             | [1] https://support.apple.com/en-us/108756
        
         | schappim wrote:
         | For context, @scosman is self-described as "Formerly Apple
         | Photos" in his Twitter bio.
        
         | AJRF wrote:
         | > This is what a good privacy story looks like.
         | 
         | Not at all. A good privacy story is not sending this data
         | anywhere.
        
         | thousand_nights wrote:
         | this is what gaslighting looks like
         | 
         | how about they don't send anything about my photos to their
         | servers and i get to keep my shit on my own device
         | 
         | i suppose we're past that to the point where techbros like you
         | will defend personal data exfiltration because.. uhh idk?
         | trillion dollar corporation knows best?
        
         | matheusmoreira wrote:
         | > I'm not sure what the right call is here.
         | 
         | I am sure.
         | 
         | The right call is to never send _any_ data from the device to
         | _anyone_ unless the user _explicitly_ tells the device to do
         | it.
         | 
         | The only thing the device should do is whatever its user tells
         | it to do.
         | 
         | The user didn't tell it to do this. Apple did.
         | 
         | > But the conclusion "Thus, the only way to guarantee computing
         | privacy is to not send data off the device." isn't true
         | 
         | Irrelevant. It was never about privacy to begin with. It was
         | always about power, who owns the keys to the machine, who
         | commands it.
         | 
         | Vectorization, differential privacy, relays, homomorphic
         | encryption, none of it matters. What matters is the device is
         | going behind the user's back, doing somebody else's bidding,
         | protecting somebody else's interests. That they were careful
         | about it offers little comfort to users who are now aware of
         | the fact "their" devices are doing things they weren't supposed
         | to be doing.
        
           | parasubvert wrote:
           | Complete nonsense. *All networked devices do things behind
           | their users' backs* at this point, and have for years, and do
           | not ask for consent for most of it. And users would REJECT
           | granular opt-in as a terrible UX.
           | 
           | Let's look at the primary alternative, Android. It generally
           | does not provide you this level of granular control on
           | network access either without rooted hacks. Apps and the
           | phone vendor can do whatever they want with far less user
           | control unless you're a deep Android nerd and know how to
           | install root-level restriction software.
        
             | matheusmoreira wrote:
             | Yes, apps going behind people's backs and exfiltrating
             | personal information has become normal. That's not an
             | argument, it's merely a statement of fact. This shouldn't
             | be happening at all. The fact it got to this point doesn't
             | imply it shouldn't be stopped.
             | 
             | No one's advocating for granular opt in either. There are
             | much better ways. We have to make it so that data is toxic
             | to corporations. Turn data into expensive legal liabilities
             | they don't want to deal with. These corporations should not
             | even be thinking about it. They should be scrambling to
             | forget all about us the second we are done with them, not
             | covertly collecting all the data they possibly can for
             | "legitimate" purposes. People should be able to use their
             | computers without _ever_ worrying that corporations are
             | exploiting them in any way whatsoever.
             | 
             | The Android situation is just as bad, by the way. Rooting
             | is completely irrelevant. You may think you can hack it but
             | if you actually do it the phone fails remote attestation
             | and the corporations discriminate against you based on
             | that, usually by straight up denying you service. On a very
             | small number of devices, Google's phones ironically, you
             | can access those keys and even set your own. And it doesn't
             | matter, because the corporations don't trust your keys,
             | they only trust the keys of other corporations. They don't
             | care to know your device is secure, they want to know it's
             | fully owned by Google so that you can't do things the
             | corporations don't like.
             | 
             | It's not something that can be solved with technology.
             | Computer freedom needs to become law.
        
         | user3939382 wrote:
         | > This is what a good privacy story looks like.
         | 
         | What a good privacy story looks like is that my photos aren't
         | sent anywhere in any way shape or form without explicit opt in
         | permission.
        
           | notyourwork wrote:
           | Do you not sync to iCloud?
        
             | ternnoburn wrote:
             | That sounds like "with opt in permission".
        
               | EA-3167 wrote:
               | You aren't wrong, but... it's odd coming here to HN and
               | seeing people talk about privacy like we aren't in the
               | nth generation of people trading theirs away for a
               | pittance. I think the market for the sort of privacy
               | envisioned by some here is incredibly small, incredibly
               | niche, and honestly one of the least likely to buy an
               | iPhone in the first place.
               | 
               | Most people broadcast their lives on social media,
               | happily opt in to all sorts of schemes that track them
               | just for minor conveniences. For people like that, the
               | idea that the privacy protection outlined by the OP isn't
               | enough rings really hollow.
               | 
               | Or to put it bluntly, at some point this really stops
               | feeling like a practical debate, and more of an
               | ideological one.
        
             | chatmasta wrote:
             | You can enable Advanced Data Protection and all your iCloud
             | data will be stored as encrypted blobs.
        
               | sneak wrote:
               | ADP sends hashes of the plaintext and filenames to Apple
               | effectively in the clear (non e2ee).
               | 
               | If only you and three other people have a unique file,
               | Apple knows you are a group.
        
             | timcobb wrote:
             | No
        
               | 7402 wrote:
               | me neither.
        
             | giancarlostoro wrote:
             | You can choose to do so, or not do so.
        
           | scosman wrote:
           | Your photos aren't sent anywhere in this system.
        
             | hu3 wrote:
             | Metadata is data.
        
               | theshrike79 wrote:
               | This is hardly even metadata. It can't be traced to you
               | nor can it be reversed.
        
         | pton_xd wrote:
         | > This is what a good privacy story looks like.
         | 
         | I have an idea: send an encrypted, relayed, non-reversible,
         | noised vector representation of your daily phone habits and
         | interactions. That way you can be bucketed, completely
         | anonymously of course, with other user cohorts for tracking,
         | advertising, and other yet-to-be discovered purposes.
         | 
         | It's a great privacy story! Why would you have a problem with
         | that?
        
           | 1123581321 wrote:
           | What would be the value to the user in your scenario? In the
           | Photos app's real scenario, it's to enable a search feature
           | that requires pairing photos with data not on the phone. (I
           | understand you're being sarcastic.)
        
             | pton_xd wrote:
             | Maybe we can do some analysis and optimize phone battery
             | life based on your cohorts usage patterns.
             | 
             | I don't know, I'm sure we'll figure something out once we
             | have your data!
        
               | giancarlostoro wrote:
               | The entire point is that you don't actually have the
               | data, only the client can decrypt any of it.
        
               | 1123581321 wrote:
               | That doesn't make sense, and the other user is right that
               | you can't give up personal data with this scheme. Perhaps
               | focus on the real privacy leaks from cell phones like
               | tower connections and sign-ins to Instagram.
        
           | jrk wrote:
           | They don't "have your data," even at an aggregated and noised
           | level, due to the homomorphic encryption part.
           | 
           | Restating the layers above, in reverse:
           | 
           | - They don't see either your data or the results of the query
           | (it's fully encrypted _even from them while they compute the
           | query_ -- this is what homomorphic encryption means)
           | 
           | - Even if they broke the encryption and had your query data /
           | the query result, they don't know who "you" are (the relay
           | part)
           | 
           | - Even if they had your query hash and your identity, they
           | couldn't reverse the hash to identify which specific photos
           | you have in your library (the client-side vectorization +
           | differential privacy part), though by this point they could
           | know which records in the places database were hits. So
           | they could know that you took a photo of a landmark, _but
           | only if the encryption and relay were both broken_.
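           | 
           | A runnable stand-in for that layering (every primitive below
           | is a fake -- a hash is not encryption -- and all names are
           | invented, not Apple's; the point is only the order of
           | operations and what the server-side party ever sees):
           | 
           |     import hashlib, random
           | 
           |     def embed(photo):               # client-side vectorization
           |         random.seed(photo)          # deterministic fake embedding
           |         return [random.random() for _ in range(4)]
           | 
           |     def add_noise(vec, scale=0.1):  # stand-in for the DP step
           |         return [v + random.uniform(-scale, scale) for v in vec]
           | 
           |     def encrypt(vec, key):          # stand-in for the HE step
           |         return hashlib.sha256((key + repr(vec)).encode()).hexdigest()
           | 
           |     query = encrypt(add_noise(embed("IMG_0001.HEIC")), "client-key")
           |     print(query)  # all the server sees; the relay hides the IP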
        
         | matthewdgreen wrote:
         | The devil is in the differential privacy budget. I am in Japan
         | and I've taken hundreds of photos this week. What does that
         | budget cover?
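         | 
         | Under basic sequential composition, per-query epsilons simply
         | add, which is why the budget question matters. A back-of-the-
         | envelope sketch with made-up numbers:
         | 
         |     eps_per_lookup = 0.8            # illustrative, not Apple's
         |     photos = 300                    # a week of travel photos
         |     print(eps_per_lookup * photos)  # 240.0: a total that large
         |     # is meaningless unless queries are capped or the noise is
         |     # amortized across them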
        
         | waynecochran wrote:
         | Can this be verified?
        
         | SoftTalker wrote:
         | It's a nice story but Apple can change the implementation any
         | time they want or are forced to.
        
         | themaninthedark wrote:
         | I am a bit confused: data is being sent to Apple in such a
         | way that it cannot be traced back to the user. Apple does some
         | processing on it. Then, somehow magically, the pictures on your
         | phone are updated with tags based on Apple's processing... but
         | Apple doesn't know who you are...
        
           | kalleboo wrote:
           | There is a way to perform processing on encrypted data such
           | that the result is also encrypted, and the person doing the
           | processing never learns anything about the data being
           | processed or about the result (which can only be decrypted
           | by the user with the original encryption keys):
           | 
           | https://en.wikipedia.org/wiki/Homomorphic_encryption
           | 
           | > _Homomorphic encryption is a form of encryption that allows
           | computations to be performed on encrypted data without first
           | having to decrypt it. The resulting computations are left in
           | an encrypted form which, when decrypted, result in an output
           | that is identical to that produced had the operations been
           | performed on the unencrypted data. Homomorphic encryption can
           | be used for privacy-preserving outsourced storage and
           | computation. This allows data to be encrypted and outsourced
           | to commercial cloud environments for processing, all while
           | encrypted_
           | 
           | And the way the data comes back to you is via the third-party
           | relay which knows your IP but nothing else
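           | 
           | To make it concrete, here is a toy Python demo using
           | textbook RSA's multiplicative property. This is insecure
           | and nothing like the lattice-based scheme Apple actually
           | uses; it only illustrates "computing on ciphertexts":
           | 
           |     # Enc(a) * Enc(b) mod n decrypts to a * b
           |     n, e, d = 3233, 17, 2753   # toy keypair (61 * 53)
           | 
           |     def encrypt(m): return pow(m, e, n)
           |     def decrypt(c): return pow(c, d, n)
           | 
           |     a, b = 6, 7
           |     c = (encrypt(a) * encrypt(b)) % n  # "server" multiplies
           |     assert decrypt(c) == a * b         # client sees 42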
        
             | themaninthedark wrote:
             | Ok, that's the step that was missing. I couldn't figure out
             | how there was a benefit to the users without data being fed
             | back, and data can't be fed back without knowing some ID.
             | 
             | So, while Apple doesn't know the ID of the person sending
             | the data, they have a 'room number' that links back to an
             | ID.
             | 
             | If Apple were to decide to scan photos for pictures of
             | 'lines of white powder' they couldn't tell the police your
             | name but they could say that the 3rd party knows who you
             | are.
        
           | raincole wrote:
           | You joked, but you accidentally described what homomorphic
           | encryption does (if implemented correctly).
           | 
           | > Then somehow magically, the pictures on your phone are
           | updated with tags based on Apple's processing....but Apple
           | doesn't know who you are.....
           | 
           | Yes, this is the whole point.
        
         | dannyw wrote:
         | Isn't this 95% the same as Apple's CSAM scanning paper?
        
         | chikere232 wrote:
         | The nearest neighbour search is sharded, which Apple's blog
         | admits is a privacy issue, which is why they're running the DP
         | and OHTTP parts.
         | 
         | If Apple were to add additional clusters that match "sensitive"
         | content and endeavour to put them in their own shards, distinct
         | from landmarks, they would defeat the point of the homomorphic
         | encryption while still technically doing it.
         | 
         | The DP part can be defeated with just statistics over time;
         | someone with any volume of sensitive content will hit these
         | sensitive clusters with a higher likelihood than someone
         | generating noise-injected fake searches.
         | 
         | The OHTTP part can be defeated in several ways, the simplest of
         | which is just having a clause in a non-public contract allowing
         | Apple to request logs for some purpose. They're paying them and
         | they can make up the rules as they go.
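         | 
         | A back-of-envelope sketch of that statistics-over-time point,
         | using randomized response as a stand-in for whatever noise
         | mechanism is actually used (all numbers invented):
         | 
         |     import random
         | 
         |     def noisy(hit, flip=0.25):  # report a lie 25% of the time
         |         return (not hit) if random.random() < flip else hit
         | 
         |     # queries from photos hitting a "sensitive" shard 40% of
         |     # the time, vs. pure noise-injected fake queries (~0%)
         |     user = [noisy(random.random() < 0.4) for _ in range(10000)]
         |     base = [noisy(False) for _ in range(10000)]
         | 
         |     print(sum(user) / 10000)  # ~0.45
         |     print(sum(base) / 10000)  # ~0.25 -- distinguishable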
        
         | geoffmac wrote:
         | This answer should be much higher. Thank you
        
       | jazzyjackson wrote:
       | I'm moving my family out of Apple Photos; self-hosted options
       | have come a long way. I landed on immich [0] and a Caddy plugin
       | that allows PKI certificate authentication for accounts while
       | still allowing public shared URLs [1]*
       | 
       | There's also LibrePhotos, which is packed with features but
       | doesn't have as much polish as immich. They do, however, have a
       | list of Python libraries that can be used for offline/local
       | inference, for things like having an image model create a
       | description of a photo to benefit the full-text search. [2]
       | 
       | [0] https://immich.app/
       | 
       | [1] https://github.com/alangrainger/immich-public-
       | proxy/blob/mai...
       | 
       | [2] https://docs.librephotos.com/docs/user-guide/offline-setup
       | 
       | * Haven't actually tried this plugin yet; my weekend project is
       | setting up my Caddy VPS to tunnel to the immich container running
       | on my Synology NAS
        
         | nrki wrote:
         | Immich is amazing, truly great software, and it's oss
        
           | layer8 wrote:
           | It's still beta though and often has breaking changes that
           | can require updating the server software in sync with the
           | mobile app versions.
        
         | sneak wrote:
         | Sure, I use Ente and have iCloud Photos turned off, but TFA is
         | referring to what happens in the Apple Photos app, which
         | happens even if you have iCloud Photos turned off.
        
       | walterbell wrote:
       | _> On macOS, I can usually prevent Apple software from phoning
       | home by using Little Snitch. Unfortunately, Apple doesn't allow
       | anything like Little Snitch on iOS._
       | 
       | On Android, NetGuard uses a "local VPN" to firewall outgoing
       | traffic. Could the same be done on iOS, or does Apple network
       | traffic bypass VPNs? Lockdown mentions ads, but not Apple
       | servers, https://lockdownprivacy.com/.
       | 
       | Apple does publish IP ranges for different services, so it's
       | theoretically possible to block 17.0.0.0/8 and then open up
       | connections just for notifications and security updates,
       | https://support.apple.com/en-us/101555
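       | 
       | For example, checking whether an address falls inside that block
       | is a one-liner with Python's ipaddress module:
       | 
       |     import ipaddress
       | 
       |     apple = ipaddress.ip_network("17.0.0.0/8")
       |     print(ipaddress.ip_address("17.253.144.10") in apple)  # True
       |     print(ipaddress.ip_address("1.1.1.1") in apple)        # False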
        
         | dev_tty01 wrote:
         | >On Android, NetGuard uses a "local VPN" to firewall outgoing
         | traffic. Could the same be done on iOS, or does Apple network
         | traffic bypass VPNs? Lockdown mentions ads, but not Apple
         | servers, https://lockdownprivacy.com/.
         | 
         | Why is NetGuard more trustworthy than Apple?
        
           | walterbell wrote:
           | NetGuard firewall doesn't run on iOS, so there's no point in
           | comparing to Apple. For those on Android, NetGuard is open-
           | source, https://github.com/M66B/NetGuard
           | 
           | On iOS, Lockdown firewall is open-source,
           | https://github.com/confirmedcode/Lockdown-iOS
        
         | varenc wrote:
         | An iOS "local VPN" could definitely block all traffic to Apple
         | IP ranges. But it lacks the ability to associate traffic with
         | the originating process/framework. Like if, for example, I
         | wanted to only allow iMessage to talk to Apple but nothing
         | else. This is what Little Snitch and other software gives you
         | on macOS/Linux/etc.
         | 
         | But even blanket blocking of all Apple IP ranges probably
         | wouldn't do anything here. As documented, your device sends
         | noise injected image vectors to OHTTP relays and doesn't
         | contact Apple directly. By definition those relays are operated
         | by 3rd parties. So if you consider this type of data "phoning
         | home" you'll need to find the IPs of all of OHTTP relays iOS
         | uses. (or block the traffic that looks up the OHTTP relays).
        
           | walterbell wrote:
           | Apple's "enterprise networking" guide lists 3rd-party CDNs as
            | subdomains of apple.com, which usually resolve to Akamai or
            | Cloudflare subdomains. This allows those dynamic IPs to be
           | blocked via dnsmasq ipset rules. In theory, they could use
           | similar subdomain resolution for the OHTTP relays.
           | 
           | Since iOS was derived from macOS, perhaps Apple could restore
           | the link between network traffic and process.
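           | 
           | A rough sketch of the subdomain-resolution idea above; the
           | hostname is purely illustrative, and the real CDN list would
           | come from Apple's guide:
           | 
           |     import socket
           | 
           |     blocked = set()
           |     for host in ["updates.cdn-apple.com"]:  # example only
           |         for info in socket.getaddrinfo(host, None):
           |             blocked.add(info[4][0])  # current resolved IPs
           |     print(sorted(blocked))  # feed into dnsmasq/ipset rules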
        
         | mihaaly wrote:
         | That would only make sense if Little Snitch had some high-
         | level manager agent inside Apple, manipulating the company
         | into making these kinds of sneaky attacks on customers'
         | privacy to drive the sales of Little Snitch. In the end they
         | would also make Apple buy Little Snitch for many millions or
         | billions to eliminate it, so they could attack customers
         | freely afterwards. Little Snitch's hidden agents are smart!
         | 
         | I do not assume that Apple managers are such degenerate
         | idiots as to push through trust-eroding marginal idiocy like
         | this.
        
       | timsneath wrote:
       | More on homomorphic encryption here:
       | https://www.swift.org/blog/announcing-swift-homomorphic-encr...
       | 
       | Summary: "Homomorphic encryption (HE) is a cryptographic
       | technique that enables computation on encrypted data without
       | revealing the underlying unencrypted data to the operating
       | process. It provides a means for clients to send encrypted data
       | to a server, which operates on that encrypted data and returns a
       | result that the client can decrypt. During the execution of the
       | request, the server itself never decrypts the original data or
       | even has access to the decryption key. Such an approach presents
       | new opportunities for cloud services to operate while protecting
       | the privacy and security of a user's data, which is obviously
       | highly attractive for many scenarios."
        
       | geococcyxc wrote:
       | Another setting that surprised me by being turned on by default,
       | apparently on macOS 15: System Settings - Spotlight - "Help
       | Apple Improve Search": "Help improve Search by allowing Apple to
       | store your Safari, Siri, Spotlight, Lookup, and #images search
       | queries. The information collected is stored in a way that does
       | not identify you and is used to improve search results."
        
         | dev_tty01 wrote:
         | No, this is not on by default. After system install at first
         | boot it asks if you want to help improve search and it
         | describes how your data will be handled, anonymized, etc. If
         | you clicked yes, it is on. There is a choice to opt out.
        
           | nechuchelo wrote:
           | It was on by default for me, both after the macOS 14 -> 15
           | upgrade and after installing macOS 15 cleanly. I wonder if
           | they ask for consent in some regions only.
        
             | scratchyone wrote:
             | Yeah this seems to be on by default for me as well, I
             | definitely take issue with that.
        
           | newZWhoDis wrote:
           | I always say no to all analytics/improvement prompts.
           | 
           | Both the search and photos settings were on for me on both
           | iOS and macOS. Completely unacceptable, Apple.
        
           | JTyQZSnP3cQGa8B wrote:
           | You're lying, it is on by default.
        
         | 2OEH8eoCRo0 wrote:
         | I love the, "help <megacorp> do x" options. Help? Your market
         | cap is over $3tn. Help yourself! And stop asking how my call
         | quality was.
        
       | dan-robertson wrote:
       | To me it seems like a reasonable feature that was, for the most
       | part, implemented with great consideration for user privacy,
       | though maybe I'm too trusting of the description. I mostly think
       | this article is rage-bait and one should be wary of 'falling for
       | it' when it shows up on hacker news in much the same way that one
       | should be wary when rage-bait articles show up in tabloids or on
       | Facebook.
       | 
       | It seems likely to me that concerns like those of the article or
       | some of the comments in this thread are irrelevant to Apple's
       | bottom line. A concern some customers may actually have is data
       | usage, but I would guess the feature is off when the device is
       | in Low Data Mode.
       | 
       | I wonder if this particular sort of issue could be solved by
       | some 'privacy defaults' setting, where journalists, activists,
       | corporate IT departments, and people who write articles like the
       | OP could have OS updates default settings to values that talk
       | less on the network. It seems hard to make such a UI
       | understandable. There is already a 'lockdown mode' for iOS; I
       | don't know whether it affects this setting.
        
         | amatecha wrote:
         | Literally all Apple needed to do was not have it enabled by
         | default. Sending stuff over the network without asking is why
         | trust in Apple is reduced further and further.
        
           | dan-robertson wrote:
           | Not enabling something by default is pretty close to not
           | having it at all. Accessibility is a reasonable exception
           | where it makes sense to have the features even though they
           | are off by default.
           | 
           | I mostly think the reaction to this article is overblown
           | because it appeals to popular ideas here about big tech. I think
           | one should be wary of Apple's claims about privacy: the
           | reason is competition with Google and so they want users to
           | be distrustful of the kinds of features that Google are
           | better at implementing (I don't want to say Apple isn't
           | trying to do the right thing either - if you look at
           | accessibility, the competition was very bad for a lot of
           | things for a long time and Apple was good despite the lack of
           | commercial pressure). But I think one should also be wary of
           | articles that make you angry and tell you what you suspected
           | all along. (eg see the commenter elsewhere who doesn't care
           | about the details and is just angry). It's much easier to
           | spot this kind of rage-bait piece when it is targeting
           | 'normal people' rather than the in-group.
        
             | lapcat wrote:
             | > But I think one should also be wary of articles that make
             | you angry and tell you what you suspected all along. (eg
             | see the commenter elsewhere who doesn't care about the
             | details and is just angry). It's much easier to spot this
             | kind of rage-bait piece when it is targeting 'normal
             | people' rather than the in-group.
             | 
             | The article was published by an Apple developer and user,
             | i.e., myself, on my personal blog, which is followed mainly
             | by other Apple developers and users. My blog comprises my
             | own personal observations, insights, and opinions. If you
             | see any rage, it would be my own personal rage, and not
             | "bait". Bait for what?
        
               | dan-robertson wrote:
               | I'm not interested in telling you what to put on your
               | blog. Do whatever you like.
               | 
               | The headline is defensible but, in my opinion, quite
               | sensationalised. People are likely to interpret it as
                | belonging to an article making much stronger claims than the
               | article actually does. I think a lot of the interactions
               | people had with this submission, especially early on,
               | were because the headline made them mad, rather than
               | because of its contents. I think if one is interacting
               | with some submission here due to a maddening headline,
               | one should be wary about such interactions being driven
               | by emotion, leading to poor discussion that is not
               | particularly anchored to the topic of the article, rather
               | than being driven by curiosity.
        
               | lapcat wrote:
               | > The headline is defensible but, in my opinion, quite
               | sensationalised.
               | 
               | How would you write the headline?
               | 
               | There's always a criticism of headlines, but headlines
               | are necessarily short. It's like critics want the entire
               | article text to appear in the headline, which is
               | impossible.
               | 
               | I don't know what defensible but sensationalized is
               | supposed to mean.
               | 
               | > I think a lot of the interactions people had with this
               | submission, especially early on, were because the
               | headline made them mad, rather than because of its
               | contents.
               | 
               | That's pure speculation on your part, because the
               | headline is very short and vague. In any case, it's not
               | my fault if people read the headline but not the article.
               | I _want_ people to read the article, not just the
               | headline.
        
               | dan-robertson wrote:
               | I would probably aim for a title like 'what does the
               | "Enhanced Visual Search" feature do in iOS 18 and MacOS
               | 15?' Or 'how much data is sent to Apple for the new
               | "Enhanced Visual Search" in Photos'.
               | 
                | I think the early comments on this submission were made
                | because the headline made people mad; they were incurious
                | and not particularly related to the topic of the article
                | - most could be under any article about Apple Photos. One
                | early comment was about switching to alternatives to
                | Photos and another was, paraphrasing, 'I'm so angry. I
                | don't care about homomorphic encryption
               | or differential privacy'. Others seemed to follow the
               | theme of the latter. After a while a comment attempted an
               | overview of some of the technical details, which I
               | thought was better.
               | 
               | Perhaps a more precise complaint is that many of the
               | early comments didn't really depend on the contents of
               | the article - they could go under many Apple submissions
               | - and I think it's more likely for comments like that to
               | be written while incensed.
               | 
               | I don't think you're to blame for the comments people
               | choose to leave.
        
               | lapcat wrote:
               | > I would probably aim for a title like 'what does the
               | "Enhanced Visual Search" feature do in iOS 18 and MacOS
               | 15?' Or 'how much data is sent to Apple for the new
               | "Enhanced Visual Search" in Photos'.
               | 
               | In my opinion, these are actually misleading headlines
               | that don't represent the content of the article. In fact,
               | I've never used the Enhanced Visual Search feature, nor
               | do I know exactly how much data is sent to Apple.
               | 
               | My article was never intended as a kind of feature
               | introduction that you might see in the news media. The
               | main point was always about user consent for uploading
               | data to Apple. To be clear, my article is and has always
               | been a _complaint_. Thus, I think the headline is
               | accurate. I am complaining, and readers should know that.
               | 
               | > another was, paraphrasing, 'I'm so angry. I don't care
                | about homomorphic encryption or differential privacy'.
               | 
               | > many of the early comments didn't really depend on the
               | contents of the article
               | 
                | The headline of the article didn't mention homomorphic
               | encryption or differential privacy, so they must have
               | read the contents.
        
             | JTyQZSnP3cQGa8B wrote:
             | > Not enabling something by default is pretty close to not
             | having it at all
             | 
             | And I care "why" exactly? It was turned on by default on my
             | phone without my consent, it's a privacy violation, nothing
             | else matters in that case.
        
           | Klonoar wrote:
           | Apple already communicates home by default. They never even
           | fixed the macOS app signature check that they said they
           | would, and yet people still choose to use the OS.
           | 
           | (And to be clear I'm not even bothered by the signature
           | check)
           | 
           | At a certain point you have to figure that they realize it
           | doesn't matter short of some government entity forcing them
           | to stop. At the very least the protections they put in place
           | (homomorphic encryption, etc) are more than I think most
           | other companies would ever bother doing.
        
         | kllrnohj wrote:
         | if anyone else had done this then yes probably it's reasonable
         | feature done reasonably. The problem is Apple has spent tens if
         | not hundreds of millions of dollars advertising that they don't
         | do things like this. That stuff stays on your iPhone unlike
          | that other OS run by the yucky advertising company. Apple would
          | never siphon your data, because they _care_ and you aren't the
         | product.
         | 
         | Shit like this, reasonable in isolation or not, undermines that
         | story completely. If they are so willing to just outright _lie_
         | on a massive billboard, what else will they do when profits
         | demand it?
        
         | layer8 wrote:
         | It's a reasonable feature, but should nevertheless require opt-
         | in by the user. The opt-ins could certainly be bundled at
         | install/upgrade time to reduce annoyance.
        
         | wejick wrote:
         | One thing that's particularly unclear to me is whether iOS
         | scans all data on the phone and sends it to be part of a
         | public index or not. From how the feature works in the UI, it
         | seems it doesn't. If the feature is activated by user action,
         | does this still constitute phoning home?
        
       | ProllyInfamous wrote:
       | At this point, Mac Mini M4's are _cheap enough_ and _capable
       | enough_ to just purchase two: one for off-line use, another on-.
       | 
       | Perhaps this is marketing genius (from an AAPL-shareholder POV)?
       | 
       | ----
       | 
       | I'm laughing at the insanity of all this interconnectivity, but
       | an NDA prevents me from typing the greatest source of my ironic
       | chuckles. Described in an obtuse way: a privacy-focused hardware
       | product ships with an undisclosed phone-home feature, letting the
       | feds see every time you use the product (to produce a
       | controversial product, at home).
       | 
       | Kick in my fucking door / sue me: it'll just reinforce that I'm
       | correct about concessionary-allowances...
        
         | doctorpangloss wrote:
         | Can you be more clear?
        
           | ProllyInfamous wrote:
           | Computers are inexpensive enough to own both on- & off-line
           | hardware.
           | 
           | ----
           | 
           | Even privacy-focused hardware manufacturers will allow
           | undisclosed usage tracking (in order to continue existing,
           | themselves). In my example, the OEM delivers a physical
           | product which allows users to make tangible objects, at home.
           | Every time you launch the hardware's control software (to
           | make more controversial objects), it phones home.
        
         | threeseed wrote:
         | > letting the feds see every time you use the product
         | 
         | This does not happen.
        
           | Lammy wrote:
           | Wrong. _Anything_ that makes _any_ network request is by
           | definition a privacy leak. The network itself is always
           | listening, and the act of making any connection says that you
           | are using the computer, and where, and when.
        
             | threeseed wrote:
              | And so, in this context:
             | 
             | You are saying that _every_ network request is going to a
             | target controlled by the US government.
        
               | Lammy wrote:
               | Or their friends.
               | 
               | https://en.wikipedia.org/wiki/Five_Eyes
               | 
               | https://en.wikipedia.org/wiki/Room_641A
        
               | ProllyInfamous wrote:
               | >https://en.wikipedia.org/wiki/Room_641A
               | 
               | Why yes, I did happen to work in US data centers
                | throughout the 2010s... including during Snowden's [then
                | adamantly-denied] revelations.
               | 
               | "If you're taking flak, you're near the target."
        
               | ProllyInfamous wrote:
               | Yes, this is what I'm saying.
               | 
               | Particularly among "controversial" product manufacturers.
               | 
               | ----
               | 
               | Entirely unrelated: I recently visited my favorite
               | hometown (Austin) museum, UT's Harry Ransom Center. Their
                | P.E.N. exhibit has an exposé on how various mid-20th
                | century authors were in fact funded by The Ford
                | Foundation (a known CIA "investment vehicle"), including
                | Aldous Huxley (author of Brave New World). One of the
                | exhibits features multiple author signatories _adamantly
                | denying this CIA funding, throughout the 1950s_...
               | ultimately this conspiracy theory ended up being
               | true/revealed. Adds an entirely new dimension to The
               | Savage's [BNW's main character] decision to hang himself.
        
             | dangus wrote:
             | If every app and system is making network connections by
             | default without the user's knowledge or intervention then
             | it doesn't really prove that you are using the computer at
             | all.
        
               | Lammy wrote:
               | Completely missing the point, but you're right that I
               | should have said "using or in possession" of the
               | computer. The point is that they reveal your physical
               | location to the network, so if they make requests while
               | not directly in use then it's actually even worse.
               | 
               | For context, stationary desktop machines comprise single-
               | digit percentages of Mac sales:
               | https://appleinsider.com/articles/24/03/06/macbook-pro-
               | and-m...
               | 
               |  _Every other product_ sold by Apple (Mac notebooks,
                | iPhones, watches, etc.) is designed to be carried around
               | with you, happily beaconing your presence to every
               | network and to every other Apple device that will listen.
               | I have seen this with my own eyes where my M3 MBP showed
               | up on my WiFi AP status page as soon as I got home even
               | though it was supposedly """asleep""" in my bag.
        
               | dangus wrote:
               | Your Mac connected to your trusted home network when it
               | want turned off? Wow, scandalous.
               | 
               | You can randomize your MAC address. Built in OS feature.
        
               | dangus wrote:
               | "wasn't turned off" is what I meant to say.
               | 
               | What I mean is, an M3 MacBook has essentially the same
               | processor as a cell phone. Of course it's able to perform
               | activities when the system is "asleep." It even has
               | specific online behavior labeled as a feature that can be
               | turned on/off (Power Nap).
               | 
               | https://support.apple.com/guide/mac-help/turn-power-nap-
               | on-o...
               | 
               | This is behavior that is desired by Apple's users.
               | 
               | I sometimes wish that more Apple users who aren't
               | satisfied would stop complaining while continuing to
               | throw money at Apple and buy themselves a Linux laptop.
               | If you want a laptop with special features like a WiFi
               | kill switch you should buy one of those, not buy a
               | consumer appliance device that is marketed toward
               | convenience-oriented home users.
        
           | fmajid wrote:
           | Traffic analysis by the government totally happens:
           | https://lifehacker.com/how-cia-director-david-petraeuss-
           | emai...
           | 
           | Now they probably don't care about you personally, but you'd
           | be surprised how many people are connected indirectly to a
           | person of interest. Google "Maher Arar" for the horrific
           | consequences of false positives.
        
             | ProllyInfamous wrote:
             | >you'd be surprised how many people are connected
             | indirectly to a person of interest.
             | 
             | I get called "paranoid," often, but I'm being honest when
             | responding to people requesting my phone number: "you don't
             | want my number in your phone, because it'll associate you
             | with people you probably don't want to be associated with"
              | [see _Five Eyes' 3-Degrees of Separation_].
        
       | ustad wrote:
       | Holy crap! Enabled by default! Thank you for letting everyone
       | know.
       | 
       | "Enhanced Visual Search in Photos allows you to search for photos
       | using landmarks or points of interest. Your device privately
       | matches places in your photos to a global index Apple maintains
       | on our servers. We apply homomorphic encryption and differential
       | privacy, and use an OHTTP relay that hides IP address. This
       | prevents Apple learning about the information in your photos. You
       | can turn off Enhanced Visual Search at any time on your iOS or
       | iPadOS device by going to Settings > Apps > Photos. On Mac, open
       | Photos and go to Settings > General."
        
       | TYPE_FASTER wrote:
       | I think the user should be prompted to enable new functionality
       | that sends data to the cloud. I think there should be a link to
       | details about which data is being sent, how it is being
       | transmitted, if it is stored, if it is provided to 3rd-parties,
       | and what is being done with it.
       | 
       | Maybe this is exactly what the GDPR does. But it is what I would
       | appreciate as a user.
       | 
       | I have seen how sending metrics and crash dumps from a mobile
       | device can radically improve the user experience.
       | 
       | I have also seen enough of the Google bid traffic to know how
       | much they know about us.
       | 
       | I want to enable sending metrics, but most of the time it's a
       | radio button labeled "metrics" and I don't know what's going on.
        
       | xbar wrote:
       | Now "What happens on your iPhone stays on your iPhone" seems like
       | it deserves the Lindt chocolates defense: "exaggerated
       | advertising, blustering, and boasting upon which no reasonable
       | buyer would rely."
        
       | amatecha wrote:
       | Completely, 100% agreed:
       | 
       | > the only way to guarantee computing privacy is to not send data
       | off the device.
       | 
       | > It ought to be up to the individual user to decide their own
       | tolerance for the risk of privacy violations. [...] By enabling
       | the "feature" without asking, Apple disrespects users and their
       | preferences. I never wanted my iPhone to phone home to Apple.
       | 
       | Regardless of how obfuscated or "secure" or otherwise "privacy-
       | protecting" the feature is, the fact is that some information
       | derived from one's personal content is transmitted, without prior
       | consent. Even if the information is protected, all network
       | queries are information. A timestamp that proves you took a
       | certain action at a certain time (like taking a photo, assuming
       | stuff is sent to this service immediately upon adding a new
       | photo), from a certain location (by correlating your location
       | information at that time), etc. And that's just the tip of
       | the iceberg. Transmitting information from a user's device
       | without their explicit consent is a violation of their privacy.
        
         | dmix wrote:
         | How much space would it take to store a model of every known
         | location in the world and common things?
         | 
         | For ex: I sent a friend a photo of my puppy in the bathtub and
         | her AirPods (via iPhone) announced "(name) sent you a photo of
         | a dog in a bathtub". She thought it was really cool and so did
         | I. That's a useful feature. IDK how much that requires
         | going off-device though.
        
           | tigershark wrote:
           | I'm not an expert, but I would say _extremely_ small.
           | 
            | For comparison, Hunyuan Video encodes a shit-ton of videos and
           | rudimentary real world physics understanding, at very high
           | quality in only 13B parameters. LLAMA 3.3 encodes a good
           | chunk of _all the knowledge available to humanity_ in only
           | 70B parameters. And this is only considering open source
            | models; the closed-source ones may be even more efficient.
        
             | Aachen wrote:
             | Maybe we have different understandings of what _extremely_
             | small is (including that emphasis) but an LLM is not that
              | by definition (the first L). I'm no expert either, but the
              | smaller value mentioned is 13e9. If these things are 8-bit
              | integers, that's 13 GB of data (more for a normal integer
              | or a float). That's a significant percentage of long-term
              | storage on a phone (especially Apple models), let alone
              | fitting in RAM on even most desktops, which is afaik
              | required for useful speeds. Taking this as an upper bound and
             | saying it must be extremely small to encode only landmarks,
             | idk. I'd be impressed if it's down to a few dozen
             | megabytes, but with potentially hundreds of such mildly
             | useful neural nets, it adds up and isn't so small that
             | you'd include it as a no-brainer either
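             | 
             | The arithmetic behind those figures, for reference:
             | 
             |     params = 13e9            # 13B parameters
             |     print(params * 1 / 1e9)  # int8: ~13 GB
             |     print(params * 2 / 1e9)  # fp16: ~26 GB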
        
           | SteveNuts wrote:
           | > That's a useful feature.
           | 
           | I'm really curious how this feature is considered useful.
           | It's cool, but can't you just open the photo to view it?
        
             | inkyoto wrote:
             | It is a notification summary.
             | 
             | There is a large number of people out there who receive
             | hundreds of notifications (willingly but mostly
             | unwillingly) _daily_ from apps they have installed (not
              | just messengers), and nearly all of them can't cope with
             | the flood of the notifications. Most give up on tending to
             | the notifications altogether, but some resort to the
             | notification summaries which alleviate the cognitive
             | overload (<<oh, it is a picture of a dog that I will check
             | out when I get a chance to, and not a pic of significant
             | other who got themselves into a car accident>>).
             | 
             | The solution is, of course, to not allow all random apps to
             | spam the user with the notifications, but that is not how
             | laypeople use their phones.
             | 
             | Even the more legit apps (e.g. IG) dump complete garbage
             | upon unsuspecting users throughout the day (<<we thought
             | you would love this piece of trash because somebody paid
             | us>>).
        
               | unethical_ban wrote:
                | Hundreds of notifications daily is not "unwilling", nor
               | is it healthy.
               | 
               | You can disable IG trash notifications, for example.
        
               | dewey wrote:
               | The power of defaults is strong.
               | 
               | Most people using the device will never go into the
                | notification settings. If an app on install tells you it
                | needs notifications to tell you about your driver
                | arriving... and then sends you marketing notifications on
                | the same channel, most people will just accept that.
        
               | inkyoto wrote:
               | My IG notifications are disabled in perpetuity precisely
               | for that reason, even if it has come at a cost of not
               | receiving IM arrival notifications from a couple of
               | friends who message me exclusively via IG (another enigma
               | - why, as both have my contact details on proper
               | messaging platforms).
               | 
               | Frankly, hundreds of daily notifications is the reality
               | we can't change, and the question is: why do people
               | choose to succumb to the misery of it rather than
                | avail themselves of the proper implements so readily
               | at their disposal?
        
               | skydhash wrote:
               | My armchair explanation is that people can't cope with
               | being bored. All those notifications are little bits of
               | dopamine hits each time they arrive. It's not even FOMO,
               | just waiting for something that may be interesting.
        
         | azinman2 wrote:
         | > like taking a photo, assuming stuff is sent to this service
         | immediately upon adding a new photo
         | 
         | So you jumped to a conclusion based on an incorrect premise.
         | It is easy to see that this does not happen immediately after
         | taking a photo: one, the network traffic would show it (and it
         | doesn't); two, homomorphic encryption is expensive, so it can't.
         | Photos classically doesn't sync on demand, as most iPhone users
         | will know by way of it telling you so in the Photos app when
         | it does sync. Most expensive operations are queued up for when
         | the device is plugged in (and on WiFi) because it'll otherwise
         | drain battery.
        
           | chikere232 wrote:
           | you're splitting a very fine hair while ignoring the larger
           | privacy implication of the feature. So the timestamp might or
           | might not be delayed a bit from being perfectly accurate? So
           | what? It still is data approximating when the photo was
           | taken, even if the resolution were as bad as "within a few
           | days"
        
             | hmottestad wrote:
             | The blog post outlines how Apple goes about disconnecting
             | metadata, so at best they would know that someone took a
             | photo at a rough point in time.
        
         | Aachen wrote:
         | So Signal messages aren't secure because they're transmitted
         | and so their "obfuscation" isn't enough to protect your data?
         | Have you read what the author cited (and then admitted to not
         | understanding), i.e. what Apple says they actually do to the
         | data before transmission?
         | 
         | I could see an argument in the metadata (though there are
         | multiple assumptions involved there, not least that they don't
         | truly do OHTTP but instead conspire to learn at what timestamp
         | a user took a picture), but if you already don't trust in what
         | is essentially math, I'm not sure where the uncertainty and
         | doubt ends
        
           | Krasnol wrote:
           | The obvious difference is that by sending your photos with
           | Signal, you are doing it willingly. You let it encrypt and
           | decrypt willingly. You decide who gets it.
           | 
           | Here, Apple does that for you.
        
             | hmottestad wrote:
             | Your ISP, a bunch of routers and switches and the servers
             | run by Signal can also see your encrypted photo. You don't
             | really get to decide who sees the encrypted photo. You do
             | get to decide which photo you encrypt and send though.
        
               | Krasnol wrote:
               | All of those are parts of the network infrastructure.
               | They neither "see" the photo, edit it or need it. They
               | don't even know if it's a photo.
               | 
               | Everybody knows that there is a network infrastructure
               | where your content flows through. You willingly accept
               | that as a connected device user because it is necessary
               | to be connected.
               | 
               | What Apple did is not necessary and users don't know
               | about it.
        
               | hmottestad wrote:
               | I agree that this isn't necessary and that Apple should
               | have asked for consent from the user.
               | 
               | I do want to point out that Apple doesn't get to see your
               | photo. Homomorphic encryption is really cool in that way.
               | Apple isn't able to decrypt the photo and the results
               | they produce are also encrypted. That's the beauty of
               | homomorphic encryption. I can send you an encrypted
               | spreadsheet and you can compute the sum of a column for
               | me. I'm the only one who can see the actual sum since you
               | can only see the encrypted sum.
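                | 
                | A runnable toy of that spreadsheet example, using the
                | Paillier cryptosystem (additively homomorphic). The tiny
                | primes are hopelessly insecure, and this is not the
                | lattice-based scheme Apple uses; it just shows summing
                | a column without the server seeing the numbers:
                | 
                |     from math import gcd
                |     import random
                | 
                |     p, q = 61, 53
                |     n = p * q; n2 = n * n; g = n + 1
                |     lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
                |     L = lambda x: (x - 1) // n
                |     mu = pow(L(pow(g, lam, n2)), -1, n)
                | 
                |     def enc(m):
                |         r = random.randrange(1, n)
                |         while gcd(r, n) != 1:
                |             r = random.randrange(1, n)
                |         return (pow(g, m, n2) * pow(r, n, n2)) % n2
                | 
                |     def dec(c):
                |         return (L(pow(c, lam, n2)) * mu) % n
                | 
                |     col = [120, 45, 300]          # the "column"
                |     csum = 1                      # Enc(0)
                |     for v in col:                 # multiplying ciphertexts
                |         csum = (csum * enc(v)) % n2   # adds plaintexts
                |     assert dec(csum) == sum(col)  # only I can read 465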
        
           | _Algernon_ wrote:
            | The difference being that the Signal message is sent with
            | consent: you literally press a button to send it, so there
            | is a clear causal relationship between clicking the button
            | and the message being sent.
        
         | avianlyric wrote:
         | These issues are all addressed in the Apple blog post that
         | talks about how this feature is implemented. Two steps are
         | taken to deal with these risks:
         | 
         | 1) iOS creates additional fake queries, and all queries pass
         | through a scheduler that ensures you can't use time-of-lookup
         | to either discriminate real queries from fake ones, or identify
         | when a photo was taken.
         | 
         | 2) All queries are performed anonymously, with the use of a
         | third-party relaying service. So there's no way for Apple to
         | tie a specific query back to a specific device, or even IP
         | address.
         | 
         | Between those two mitigating features, getting hold of an
         | individual's personal data using this feature requires you to
         | first compromise the target's phone, to disable the fake
         | queries, and then compromise the relaying party to correlate
         | queries back to a specific IP address.
         | 
         | If you can manage all that, then quite frankly you're a fool
         | for expending all that effort. When you could just use your iOS
         | compromise to have the device send you its location data
         | directly. No need to faff about waiting for your target to take
         | photos, then track multiple landmark lookups, carefully
         | collecting a few bits of additional data per query, until you
         | finally have enough to identify the location of your target or
         | targets.
         | 
         | The whole thing reminds me of XKCD 538.
         | 
         | https://machinelearning.apple.com/research/homomorphic-encry...
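         | 
         | A toy sketch of mitigation (1); entirely hypothetical, since
         | Apple's actual scheduler isn't public:
         | 
         |     import random, time
         | 
         |     def upload_via_relay(payload):
         |         print("relay <-", payload)  # stand-in for OHTTP upload
         | 
         |     def send_batch(real_queries, dummies_per_real=3):
         |         batch = [(q, "real") for q in real_queries]
         |         batch += [("noise", "fake") for _ in
         |                   range(len(real_queries) * dummies_per_real)]
         |         random.shuffle(batch)  # ordering reveals nothing
         |         for payload, _ in batch:
         |             time.sleep(random.uniform(0, 2))  # jitter timing
         |             upload_via_relay(payload)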
        
           | randunel wrote:
           | Is there a way to verify the claims of obfuscation, security
           | and privacy? Or is the only verifiable fact the sending of
           | unknown data to apple by the photos app?
        
       | varenc wrote:
       | This whole thing is reminding me of the outrage over Apple and
       | Google's privacy-preserving 'Exposure Notification System'
       | from the Covid years. It defies intuition that they can alert you
       | to exposure without also tracking you, but indeed that's what the
       | technology lets you do.
       | 
       | Similarly here, it feels like the author is leaning into a
       | knee-jerk reaction about invasion of privacy without really
       | trying to
       | evaluate the effectiveness of the technologies here (client side
       | vectorization, differential privacy, OHTTP relays, and
       | homomorphic encryption).
       | 
       | Though I 100% agree Apple should ask the user for consent first
       | for a feature like this.
        
         | naruhodo wrote:
         | I would love to evaluate the privacy of these technologies.
         | 
         | Someone reply with a link to the source code so I can see
         | exactly what it is doing, without having to take an internet
         | rando's word for it.
         | 
         | Better yet, let me compile it myself.
        
           | ok_dad wrote:
           | You had better build your own silicon chips and phone
           | hardware as well in that case.
        
             | do_not_redeem wrote:
             | Don't let perfect be the enemy of good.
        
               | Schiendelman wrote:
               | That's what rallying against Apple in the name of privacy
               | is already...
        
               | do_not_redeem wrote:
               | Are you saying we shouldn't hold Apple accountable for
               | privacy encroachments then?
        
               | Schiendelman wrote:
               | I don't think this is. I think this is well within the
               | search improvement config that started in like iOS 15.
        
             | naruhodo wrote:
             | So my options are unreservedly trust Apple or etch my own
             | transistors? Who pays _your_ salary, Rando?
        
               | ok_dad wrote:
               | The comment I was replying to was stating that source
                | code was necessary to solve privacy; I just said you'd
               | need to get down to silicon if you're going that far.
               | Don't be rude. I'm unemployed right now.
        
           | jmb99 wrote:
           | If you have the capability to actually skillfully analyze
           | this type of crypto, disassembling the binaries from your
           | device (or at the very least, an ipsw for your device) should
           | be trivial.
           | 
           | After all, you wouldn't actually be trusting the source code
           | given to you to match what's running on your device, would
           | you?
        
             | realusername wrote:
              | Reverse engineering is a separate skillset of its own, on
              | top of the ones you need to read source code, and good
              | developers aren't necessarily good at it.
             | 
             | > After all, you wouldn't actually be trusting the source
             | code given to you to match what's running on your device,
             | would you?
             | 
              | That's why the best practice in the industry is
              | reproducible builds.
        
           | xenic wrote:
            | You can start with this:
            | https://github.com/apple/swift-homomorphic-encryption
           | 
           | Of course it is not the whole technology stack, but it is
           | something at least. If your evaluation leads to potential
            | problems, you can create issues right there on the GitHub
           | project!
        
         | WA wrote:
         | That COVID feature was opt-in. Author is complaining about a
          | lack of opt-in now.
        
       | yesfitz wrote:
       | What "places" would Apple be able to match that aren't either
       | obvious or irrelevant? (e.g. "The Grand Canyon" is obvious. The
       | interior of a home is irrelevant.) Maybe the interior of a
       | restaurant?
       | 
       | I hate to jump to conspiracy theories, but the mechanism seems
       | like a more effective version of their prior CSAM scanning
       | scheme[1]. Best part? It doesn't even require iCloud usage.
       | 
       | 1: https://www.macworld.com/article/1428633/csam-photo-
       | scanning...
        
       | amelius wrote:
       | Why does Apple keep insisting that I want anything to do with
       | them after I bought the device?
       | 
       | Let go of the freaking tether.
       | 
       | And let me choose my own photo editing service provider.
        
         | dangus wrote:
         | You literally can choose your own photo software. You don't
         | have to use Photos at all. Just get another app.
        
         | Hilift wrote:
         | Perhaps it's the legion of kool-aid drinkers that preceded you
         | and influenced their design.
        
         | slashdave wrote:
         | > Why does Apple keep insisting that I want anything to do with
         | them after I bought the device?
         | 
         | Um, because that is what their customers are asking for?
        
       | globalnode wrote:
       | What's the 'feature' in the app that requires sending a hash
       | off-device?
        
         | fmajid wrote:
         | Apparently, if you take a photo of the Taj Mahal, it will find
         | photos other people took of the Taj Mahal so you can feel
         | inadequate.
        
         | 0points wrote:
         | It's called "Enhanced Visual Search" and they described it like
         | this:
         | 
         | > Allow this device to privately match places in your photos
         | with a global index maintained by Apple so you can search by
         | almost any landmark or point of interest.
         | 
         | I guess it's useful if you get very lost and don't recognize
         | the pyramids right in front of you, and forgot you have a GPS
         | in the phone.
        
       | sss111 wrote:
       | I am on the latest iOS and just discovered this setting; it
       | wasn't turned on by default.
        
         | gwill wrote:
         | odd, i'm also on latest but it was enabled
        
         | blindriver wrote:
         | Mine was turned on by default, I just checked.
        
           | Schiendelman wrote:
           | Did you upgrade from a previous version?
        
         | accrual wrote:
         | Maybe the new setting checks a certain combination of other
         | settings first, like:
         | 
         |     if (setting_1 && setting_2 && !setting_3) {
         |         new_setting = true
         |     }
         | 
         | Mine was also enabled by default.
        
           | sss111 wrote:
           | I have a Photos Shared Library set up; one of the members of
           | the library is a child account -- I wonder if that's got
           | anything to do with it.
        
       | paxys wrote:
        | So basically - you take a picture. Apple encrypts it and uploads
       | it to their server. The server matches the (still encrypted)
       | picture to a database and tells your device "this picture
       | contains the Eiffel Tower". Later when you search for Eiffel
       | Tower on your device the photo pops up.
       | 
       | Is the complexity and security risk really worth it for such a
       | niche feature?
       | 
       | It's also funny that Apple is simultaneously saying "don't worry
       | the photo is encrypted so we can't read it" and "we are
       | extracting data from the encrypted photo to enhance your
       | experience".
        
         | RandallBrown wrote:
         | Is this a niche feature? I use this kind of search very often
         | in my photos.
        
           | aeyes wrote:
           | What are some example keywords? I have never searched for
           | landmarks, I only search for location.
           | 
           | How many landmarks are there to justify sending your data
           | off? Can't the database be stored on the device?
        
             | sprayk wrote:
             | cat, pizza, dog, red car, Tokyo, dolphins, garden
             | 
             | The usual context is "oh I saw some dolphins forever ago.
             | let me see if I can find the photos..."
        
               | aeyes wrote:
               | All of these worked in the past without sending any data
               | off the device.
               | 
               | But now you suddenly need the cloud to have it recognize
               | the Eiffel tower.
        
             | mp05 wrote:
             | One time in Paris I received a creme brulee that strongly
             | resembled the face of Jesus Christ, so naturally I took a
             | picture before devouring it.
             | 
             | Last night I was able to find the image using the single
             | word "creme". Definitely saved the story.
        
         | IncreasePosts wrote:
          | Not really. It's more like Apple runs a local algorithm that
          | takes your picture of the Eiffel Tower, and outputs some text
          | "Eiffel Tower, person smiling", and then encrypts that text and
          | sends it securely to Apple's servers to help you when you
         | perform a search.
        
           | internetter wrote:
           | OP was wrong, but this is even wronger
           | 
           | Locally, a small ML model identifies potential POIs in an
           | image.
           | 
           | Another model turns these regions into a series of numbers (a
           | vector) that represent the image. For instance, one number
           | might correlate with how "skyscraper-like" the image is. (We
           | don't actually know the definition of each dimension of the
            | vector, but we can turn an image that we _know_ is the Eiffel
            | Tower into a vector, and measure how closely our reference
           | image and our sample image are located)
           | 
           | The thing is, we aren't storing this database with the
           | vectors of all known locations on our phone. We _could_ send
            | the vector we made on-device off to Apple's servers. The
            | vector is lossy, after all, so Apple wouldn't have the image.
            | If we did this, however, Apple would know that we have an
            | image of the Eiffel Tower.
           | 
           | So, this is the magic part. The device encrypts the vector
           | using a private key known only to it, then sends this
           | unreadable vector off to the server. Somehow, using
           | Homomorphic Encryption and other processes I do not
           | understand, mathematical operations like cosine similarity
           | can be applied to this encrypted vector _without_ reading the
           | actual contents of the vector. Each one of these operations
           | changes the value, which is still encrypted, but we do not
           | know _how_ the value changed.
           | 
            | I don't know if this is exactly what Apple does (I think they
            | have more efficient ways), but theoretically what you could
            | do is apply each row in your database to this encrypted
            | value, in such a way that the encrypted value becomes the
            | name of the POI of the best match, or otherwise junk is
            | appended (completely changing the encrypted value). Again,
            | the server has not _read_ the encrypted value; it does not
            | know which row won out. Only the client will know, when it
            | decrypts the new value.
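            | 
            | A minimal plaintext sketch of that matching step (the
            | vectors are made up and the encryption is ignored entirely;
            | this only shows the nearest-neighbor idea):
            | 
            |     import numpy as np
            |     
            |     def cosine_similarity(a, b):
            |         # 1.0 = same direction, ~0.0 = unrelated
            |         na, nb = np.linalg.norm(a), np.linalg.norm(b)
            |         return float(np.dot(a, b) / (na * nb))
            |     
            |     # Hypothetical reference vectors for known landmarks
            |     landmarks = {
            |         "Eiffel Tower":  np.array([0.91, 0.12, 0.40]),
            |         "Tokyo Skytree": np.array([0.88, 0.45, 0.07]),
            |     }
            |     
            |     # Hypothetical vector computed on-device from a photo
            |     query = np.array([0.90, 0.15, 0.38])
            |     
            |     best = max(landmarks, key=lambda name:
            |                cosine_similarity(query, landmarks[name]))
            |     print(best)  # -> Eiffel Tower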
        
         | goodluckchuck wrote:
          | How can it tell the picture contains the Eiffel Tower if the
          | image is not decrypted?
        
           | ls612 wrote:
            | Because it turns out that mathematicians and computer
            | scientists have devised schemes that allow certain
            | computational operations to be performed on encrypted data
            | without revealing the data itself. The intuition here: you
            | can compute a+b=c without revealing anything about what a
            | and b are. This had been mostly confined to the realm of
            | theory and mathematics until very recently, but Apple has
            | operationalized it for the first time.
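            | 
            | A toy illustration of that intuition (an additive one-time
            | pad, not real homomorphic encryption like BFV/CKKS, but it
            | shows computing on data you cannot read):
            | 
            |     import random
            |     
            |     N = 2**61 - 1  # toy modulus
            |     
            |     def enc(m, k):  # ciphertext reveals nothing about m
            |         return (m + k) % N
            |     
            |     def dec(c, k):
            |         return (c - k) % N
            |     
            |     k1, k2 = random.randrange(N), random.randrange(N)
            |     a, b = 17, 25
            |     
            |     # The server sees only two random-looking numbers...
            |     ca, cb = enc(a, k1), enc(b, k2)
            |     # ...and can add them without ever learning a or b.
            |     c = (ca + cb) % N
            |     # Only the key holder recovers the true sum.
            |     assert dec(c, (k1 + k2) % N) == a + b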
        
             | CodeWriter23 wrote:
              | And then the system does the computation to determine
              | your location? (Wait, what?)
        
               | dwaite wrote:
                | The phone has intelligence to detect things that look
                | like landmarks; it does cropping/normalization and
                | converts them to a mathematical form.
               | 
               | Apple has a database trained on multiple photos of each
               | landmark (or part of a landmark), to give a likelihood of
               | a match.
               | 
               | Homomorphic encryption means that the encrypted
               | mathematical form of a potential landmark from the phone
               | can be applied to the encrypted set of landmark data, to
               | get an encrypted result set.
               | 
               | The phone can then decrypt this and see the result of the
               | query. But anyone else sees this as noise being
               | translated to new noise, including Apple's server.
               | 
                | The justification for this approach is storage - the
                | landmark data set can only get larger as it becomes more
                | comprehensive. Imagine trying to match photos from
                | inside castles, cathedrals and museums, for example.
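                | 
                | Rough arithmetic (all numbers invented purely for
                | illustration) shows why that database stays server-side:
                | 
                |     entries = 50_000_000  # landmarks and partial views
                |     dim = 128             # embedding dimensions
                |     bytes_each = 1        # 8-bit quantized components
                |     gb = entries * dim * bytes_each / 1e9
                |     print(gb, "GB")       # 6.4 GB before any index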
        
             | Klonoar wrote:
              | It's not the first time Apple operationalized it; they did
              | it for Caller ID a while back.
        
         | sinuhe69 wrote:
          | They don't send the photos. Nobody sends your photos anywhere;
          | only certain metadata and similarity vectors are sent, for
          | matching purposes.
        
         | Arn_Thor wrote:
          | They don't send the photo. They send some encrypted metadata
          | to which some noise is added. The metadata can be loosely
          | understood as "I have this photo that looks sort of like
          | this". The server then takes that encrypted data from the
          | anonymized device, responds with something like "that looks
          | like the Eiffel Tower", and sends it back to the device. The
          | actual photo never goes to the server.
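          | 
          | The "noise" is in the spirit of differential privacy. A toy
          | Laplace-mechanism sketch (the parameters are hypothetical,
          | and Apple's actual mechanism may differ):
          | 
          |     import numpy as np
          |     
          |     def privatize(vec, sensitivity=1.0, epsilon=0.5):
          |         # smaller epsilon = more noise = more privacy
          |         scale = sensitivity / epsilon
          |         return vec + np.random.laplace(0.0, scale, vec.shape)
          |     
          |     query = np.array([0.90, 0.15, 0.38])
          |     print(privatize(query))  # perturbed before it leaves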
        
           | dwaite wrote:
           | With the added caveat that HE is magic sauce - so the server
           | cannot see the metadata (cropped/normalized image data), and
           | doesn't know how much it does or does not look like the
           | Eiffel Tower.
        
       | shepherdjerred wrote:
       | It sounds like Apple implemented a feature and went out of
       | their way to preserve privacy. For most users the correct
       | choice here is to enable the feature by default.
        
       | xyst wrote:
       | Got to feed their piss poor "Apple Intelligence", somehow.
        
       | nixpulvis wrote:
       | I'm disappointed in the discourse around Homomorphic Encryption
       | and Differential Privacy here. As someone who briefly studied
       | adjacent subjects, I find that these tools excite me more than
       | they scare me.
       | 
       | We trust TLS with our SSNs and CC numbers; hopefully one day we
       | can trust secure multiparty computation too.
        
         | 1shooner wrote:
         | >We trust TLS
         | 
         | Is this really an apt comparison? I understood the trust in TLS
         | to be built on open RFCs and implementation stacks. Even then,
         | whenever I send private data, I take specific steps to verify I
         | am using that trusted stack. That is not the experience
         | described in the article.
        
           | warkdarrior wrote:
           | > I take specific steps to verify I am using that trusted
           | stack
           | 
           | I would be very interested to hear what these specific steps
           | are. How do you make sure that this TLS stack really does
           | implement the RFC? How do you know that each connection is
            | indeed encrypted, and that it doesn't start sending plaintext
            | after, say, 30 days of usage?
        
       | quyleanh wrote:
       | If they advertise "What happens on your iPhone, stays on your
       | iPhone.", these kinds of options must be off by default. Since
       | they are on, I consider this statement, and their other
       | statements about privacy, to be lies.
        
       | jchw wrote:
       | What I want is very simple: I want software that doesn't send
       | anything to the Internet without _some_ explicit intent _first_.
       | All of that work to try to make this feature plausibly private is
        | cool engineering work, and there's absolutely nothing wrong with
       | implementing a feature like this, but it should absolutely be
       | opt-in.
       | 
       | Trust in software will continue to erode until software stops
       | treating end users and their data and resources (e.g. network
       | connections) as the vendor's own playground. Local on-device data
       | shouldn't be leaking out of radio interfaces unexpectedly,
       | period. There should be a user intent tied to any feature where
       | local data is sent out to the network.
       | 
        | So why didn't Apple simply ask for user permission to enable
        | this feature? My cynical opinion is that Apple knows some
        | portion of users would instantly disallow it if prompted, but
        | they feel they know better than those users. I don't like this
        | attitude, and I suspect it is the same reason there is
        | increasing discontent with opt-out telemetry, too.
        
         | sneak wrote:
          | Developers of software want, and feel entitled to, the data on
          | your computer: both your usage within the app and the things
          | you do outside of the app (such as where you go and what
          | you buy).
         | 
         | Software will continue to spy on people so long as it is not
         | technically prohibited or banned.
        
           | callc wrote:
            | I don't.
            | 
            | I highly suggest everyone else does their darnedest not to
            | either. Don't do it in your own software. Refuse and push
            | back against it at $dayJob.
            | 
            | I realize that my contribution as a privacy- and data-
            | respecting SWE is extremely small, but if we all push back
            | against the MBAs telling us to do these things, the world
            | will be better off.
        
             | fragmede wrote:
             | Why do you assume it's MBA driven? As a software developer,
             | I like knowing when my software crashes so that I can fix
             | it. I don't care or even want to know who you are, your IP
             | address, or anything that could be linked back to you in
             | any way, but I can't fix it if I don't know that it's
             | crashing in the first place.
        
               | walterbell wrote:
                | Customers can (optionally) submit crash logs via email or
                | support portal.
                | 
                | Apple iOS provides crash logs via the following
                | navigation path:
                | 
                |     Privacy & Security
                |       Analytics Data
                |         AppName-date-time.ips
                | 
                | Notice Apple's choice of top-level menu for crash logs?
        
               | criddell wrote:
               | Makes sense to me. Many exploits use a crash as their
               | wedge into a system.
        
               | fragmede wrote:
               | And of course you've reported every single crash you've
               | encountered via email or support portal?
               | 
                | Normal people don't email support with crash logs; they
                | just grumble about it to their coworkers and don't help
                | fix the problem. You can't fix a problem you don't know
                | about.
        
               | AlexandrB wrote:
               | > You can't fix a problem you don't know about.
               | 
               | And yet we don't have home inspectors coming into our
               | homes unannounced every week just to make sure everything
               | is ok. Why is it that software engineers feel so entitled
               | to do things that no other profession does?
        
               | fragmede wrote:
                | Because software is digital and different from the
                | physical world, and someone like you understands that.
               | It's intellectually dishonest to pretend otherwise. How
               | hard is it to make a copy of your house including all the
               | things inside of it? Can you remove all personally
               | identifying features from your house with a computer
               | program? Analogies have their limitations and don't
               | always lead to rational conclusions. Physicists had to
               | contend with a lot of those after Stephen Hawking wrote
               | his book about black holes with crazy theories that don't
               | make sense if you know the math behind them.
               | 
                | Downloading a torrent isn't the same thing as going to
                | the record store and physically stealing a CD, and
                | regular people can also understand that there's a
                | difference between the invasiveness of a human being
                | entering your house and someone not doing that. So either
                | people can understand that torrenting isn't the same as
                | going into a store and physically stealing something, and
                | that anonymized crash logs aren't the same thing as a
                | home inspector coming into your house, or Napster and
                | torrenters actually owe the millions that the RIAA and
                | MPAA want them to.
               | 
               | I'm not saying that all tracking is unequivocally good,
                | or even okay; some of it is downright _bad_. But let's
                | not treat people as idiots who can't tell the difference
                | between the digital and physical realms.
        
               | sneak wrote:
               | Once it is running on a computer you don't own, it is no
               | longer your software.
               | 
               | To put it in the language of someone who mistakenly
               | thinks you can own information: data about crashes on
               | computers that aren't yours simply doesn't belong to you.
        
               | fragmede wrote:
               | If you don't feel a responsibility for inflicting pain on
               | other people, that's on you. I'm not a sociopath.
        
               | AlexandrB wrote:
               | The pain of a possible future panopticon dwarfs the
               | "pain" of some software crashing. Considering long-term
               | outcomes does not make you a sociopath - quite the
               | opposite.
        
               | fragmede wrote:
               | Ah, the slippery slope fallacy!
        
             | caseyy wrote:
             | So long as a significant portion of companies harvest user
             | data to provide "free" services, no well-meaning business
             | can compete with their paid apps. Not in a real way.
             | 
             | It's the prisoner's dilemma, but one vs many instead of one
             | vs one. So long as someone defects, everyone either defects
             | or goes out of business.
             | 
             | It's the same as with unethical supply chains. A business
             | using slave labour in their supply chain will out-compete
             | all businesses that don't. So well-meaning business owners
             | can't really switch to better supply chains as it is the
             | same as just dissolving their business there and then.
             | 
              | Only universal regulation can fix this. If everyone is
              | forced not to defect, we can win the prisoner's dilemma. But
             | so long as even 10% of big tech defects and creates this
             | extremely lucrative business of personal data trade that
             | kills every company not participating, we will continue to
             | participate more and more.
             | 
             | Read Meditations on Moloch for more examples.
        
             | thrance wrote:
             | You sadly can't fix a systemic issue by telling individual
             | workers what not to do. There's too much money on the line.
        
               | namtab00 wrote:
                | And in software dev, especially the US flavor, individual
                | workers are directly incentivized to hit that next
                | earnings target via vesting programs...
        
               | saagarjha wrote:
               | Individual developers have very limited impact on
               | earnings targets.
        
           | j45 wrote:
            | Only recently. If someone has grown up in a world knowing
            | only this, it might be part of why it doesn't stand out as
            | much.
        
           | Gigachad wrote:
           | In the OP article it seems more like users demand to search
           | their photos by text, and Apple has put in a huge effort to
           | enable that without gaining access to your photos.
        
             | j45 wrote:
             | Seems?
             | 
             | It's important to read, and not skim.
             | 
              | This is extracting location-based data, not content-based
              | data from an image (like searching for all photos with a
              | cup in them).
             | 
             | "Enhanced Visual Search in Photos allows you to search for
             | photos using landmarks or points of interest. Your device
             | privately matches places in your photos to a global index
             | Apple maintains on our servers. "
        
           | neverartful wrote:
           | Years ago I developed for iOS as an employee. In my case, it
           | was the product managers that wanted the data. I saw it as a
           | pattern and I hated it. I made my plans to leave that space.
        
         | joe_the_user wrote:
          | _and there's absolutely nothing wrong with implementing a
         | feature like this, but it should absolutely be opt-in_
         | 
         | This feature is intended to spy on the user. Those kinds of
          | features can't be opt-in. (And yeah, the homomorphic "privacy
          | preserving" encryption song-and-dance - I read about that when
          | it came out, etc.)
        
           | throw10920 wrote:
           | This is an incredibly shallow dismissal that states the
           | opposite of Apple's claim with zero evidence or reasoning and
           | hand-waves away the very real and well-researched field of
           | homomorphic encryption.
        
         | raincole wrote:
         | > Trust in software will continue to erode
         | 
         | > there is an increasing discontent growing towards opt-out
         | telemetry
         | 
         | Really? That's news to me. What I observed is people giving up
         | more and more privacy every year (or "delegating" their privacy
         | to tech giants).
        
           | isodev wrote:
           | Do you honestly believe people understand what they're doing?
           | 
            | Nowhere in the marketing materials, or in what passes for
            | documentation on iOS, do we see an explanation of the risks
            | and what it means for one's identity to be sold off to data
            | brokers. It's all "our 950 partners to enhance your
            | experience" bs.
        
             | raincole wrote:
             | > Do you honestly believe people understand what they're
             | doing?
             | 
             | No.
        
           | jchw wrote:
           | Absolutely! The important bit is that users have no choice in
           | the matter. They're pushed into agreeing to whatever ToS and
           | updating to whatever software version.
           | 
           | The backlash against Microsoft's Windows Recall should serve
           | as a good indicator of just how deeply people have grown to
           | distrust tech companies. But Microsoft can keep turning the
           | screws, and don't you know it, a couple years from now
           | everyone will be running Windows 11 anyways.
           | 
           | It's the same for Android. If you really want your Android
           | phone to be truly private, you can root it and flash a custom
           | ROM with microG and an application firewall. Sounds good! And
           | now you've lost access to banking apps, NFC payments, games,
           | and a myriad of other things, because your device no longer
           | passes SafetyNet checks. You can play a cat-and-mouse game
           | with breaking said checks, but the clock is ticking, as
           | remote attestation will remove what remains of your agency as
           | soon as possible. And all of that for a notably worse
            | experience with fewer features and more problems.
           | 
           | (Sidenote: I think banking apps requiring SafetyNet passing
           | is the dumbest thing on planet earth. You guys know I can
           | just sign into the website with my mobile browser anyways,
           | right? You aren't winning anything here.)
           | 
           | But most users are never going to do that. Most users will
           | boot into their stock ROM, where data is siphoned by default
           | and you have to agree to more data siphoning to use basic
           | features. Every year, users will continue to give up every
            | last bit of agency and privacy so long as tech companies
            | are allowed to continue taking it.
        
             | spoaceman7777 wrote:
             | fwiw, on Android, you can install a custom certificate and
             | have an app like AdGuard go beyond just DNS filtering, and
             | actually filter traffic down to a request-content level. No
             | root required. (iOS forbids this without jailbreaking
             | though :/)
        
               | randunel wrote:
                | Both Android and iOS allow root certificates, but most
               | apps nowadays use SSL pinning, so that's no longer an
               | option, either.
        
             | mikae1 wrote:
              | If you accept Android as an option, then GrapheneOS
              | probably checks a lot of your boxes on an OS level.
              | GrapheneOS developers sit between you and Google and make
              | sure that shit like this isn't introduced without the
              | user's knowledge. They actively strip out crap that goes
              | against users' interests and add features that empower us.
             | 
             | I find that the popular apps for basic operation from
             | F-Droid do a very good job of not screwing with the user
             | either. I'm talking about DAVx5, Etar, Fossify Gallery,
             | K-9/Thunderbird, AntennaPod etc. No nonsense software that
             | does what I want and nothing more.
             | 
             | I've been running deGoogled Android devices for over a
             | decade now for private use and I've been given Apple
              | devices from work during all those years. I still find
             | the iOS devices to be a terrible computing experience.
             | There's a feeling of being reduced to a mere consumer.
             | 
             | GrapheneOS is the best mobile OS I've ever tried. If you
             | get a Pixel device, it's dead simple to install via your
             | desktop web browser[1] and has been zero maintenance.
             | Really!
             | 
             | [1] https://grapheneos.org/install/web
        
               | doublebind wrote:
               | That's great... for the HN reader.
               | 
               | However, how is that supposed to work for your
               | significant other, or your mother, or your indifferent-
               | to-technology friend?
               | 
               | Don't get me wrong, I also strive to keep my device's
               | information private but, at the same time, I realize this
               | has no practical use for most users.
        
               | mikae1 wrote:
               | We can act on two levels. We (as a society) can work for
               | regulation and we (computery people) can take direct
               | action by developing and using software and hardware that
               | works in the user's interest. One does not exclude the
               | other.
               | 
               | That said. You can order a Pixel with GrapheneOS pre-
               | installed and Google Apps and services can be isolated.
        
               | prmoustache wrote:
                | You install it for them. Past the initial install, they
                | get OTA updates.
                | 
                | Having said that, it doesn't prevent them from checking
                | the "enable network" option when installing apps.
        
               | mort96 wrote:
               | Running a custom ROM locks you out of almost all decent
               | phone hardware on the market since most have locked
               | bootloaders, and it locks you out of a ton of apps people
               | rely on such as banking and money transfer apps. You must
               | recognise that it's not a practical solution for most
               | people.
        
               | machinestops wrote:
                | Graphene mitigates the locked bootloader issue by only
                | supporting one line of phones (Pixel), whose bootloaders
                | can be unlocked.
               | 
               | A large amount of work has been put into making Graphene
               | specifically work with banking apps. Mine does, for
               | instance.
        
               | UnreachableCode wrote:
               | Are Calyx or Lineage worth a look? It's a tough choice
               | between the 3.
        
               | machinestops wrote:
               | I've used Lineage. I'd say it's worth a look, yes. I got
               | a Pixel for Graphene, though.
        
               | opan wrote:
               | I've happily used LineageOS without gapps for years
               | across several OnePlus devices. If I ever need a new
               | phone I check their supported devices list to pick, and
               | the stock ROM on my new device gets overwritten the day
               | it arrives. Currently using a OnePlus 8T. When I move on
               | from this device as my primary someday, I may put
               | postmarketOS on it to extend its usefulness.
        
               | chickenfeed wrote:
               | Don't forget Anom. (A cautionary tale.)
               | https://darknetdiaries.com/episode/146/
        
               | mikae1 wrote:
               | _> Running a custom ROM locks you out of almost all
               | decent phone hardware on the market since most have
               | locked bootloaders_
               | 
               | GrapheneOS only works on Pixel devices. Pixel devices are
               | fine. We have reached a point where just about every mid-
               | tier device is fine, really. I run my devices until they
               | are FUBAR or can't be updated due to EOL. EOL for Android
               | (and GrapheneOS) is ~7 years from the release date now.
               | 
               |  _> it locks you out of a ton of apps people rely on such
               | as banking and money transfer apps._
               | 
               | These _can_ be installed and isolated using work or user
               | profiles in GrapheneOS. Also as
               | https://news.ycombinator.com/item?id=42538853 points out,
               | a lot of work has been put into making Graphene work with
               | banking apps[1].
               | 
               |  _> You must recognise that it 's not a practical
               | solution for most people._
               | 
               | Of course I do. We can act on two levels. We (as a
               | society) can work for regulation and we (computery
               | people) can take direct action by developing and using
               | software and hardware that works in the user's interest.
               | One does not exclude the other.
               | 
               | [1] https://privsec.dev/posts/android/banking-
               | applications-compa...
        
               | prmoustache wrote:
                | You don't need tons of choice, just sufficient
                | availability of a decent enough option. The Google Pixel
                | line supported by GrapheneOS is one.
                | 
                | My budget didn't allow me to buy a brand new one, but I
                | could buy a second-hand Pixel 6a for 200EUR.
                | 
                | Having said that, you can also use an older phone with
                | /e/OS or LineageOS and avoid apps that track you by
                | limiting yourself to Android apps without telemetry,
                | available on F-Droid.
        
               | Nullabillity wrote:
               | As the GP already mentioned, F-Droid (as great as it is)
               | won't help you access your bank account.
        
             | mormegil wrote:
             | Completely agree, just one minor point:
             | 
             | > I think banking apps requiring SafetyNet passing is the
             | dumbest thing on planet earth. You guys know I can just
             | sign into the website with my mobile browser anyways,
             | right?
             | 
              | No, you can't. For logging in, you need a mobile app used
              | as an authentication token. Do not pass go, do not collect
              | $200... (The current state of affairs in Czechia, at least;
              | in most banks you still _do_ have the option of not using
              | the app _for now_, using password + SMS OTP, but you need
              | to pay for each SMS and there is significant pressure to
              | migrate you away from it. The option will probably be
              | removed completely in the future.)
        
               | jchw wrote:
               | Right now I don't think there's anything like this in the
               | United States, at the very least. That said, virtually
               | every bank here _only_ seems to support SMS 2FA, which is
               | also very frustrating.
        
               | chickenfeed wrote:
               | It's actually a real drag. I live in a rural area and the
                | mobile signal is up and down. Sometimes SMSes arrive
                | hours or even a day late.
        
             | latexr wrote:
             | > Absolutely! The important bit is that users have no
             | choice in the matter.
             | 
             | If people don't have a choice, then they're not _giving_ up
             | privacy, like the person you're agreeing with said, it's
             | being taken away.
        
               | tjoff wrote:
                | Opt-out is portrayed as a choice when it barely is,
                | because it is very tiresome to always research what
                | avenues exist, explicitly opt out of them, and then
                | constantly review that option to make sure it isn't
                | flipped in an update or that another switch hasn't
                | appeared that you also need to opt out of.
                | 
                | Maybe you need to set an environment variable. Maybe that
                | variable changes. It is pretty exhausting, so I can
                | understand people giving up on it.
                | 
                | Is that really giving up on it, though? Or are they
                | coerced into it?
                | 
                | If you do anything on the radio without the user's
                | explicit consent, you are actively user-hostile. Blaming
                | users for not exercising their rights because they didn't
                | opt out is weird.
        
           | Gigachad wrote:
           | Apple seems to be the best option here too. They seem to have
           | put in a huge effort to provide features people demand
           | (searching by landmarks in this case) without having to share
           | your private data.
           | 
           | It would have been so much easier for them to just send the
           | whole photo as is to a server and process it remotely like
           | Google does.
        
           | codedokode wrote:
            | One of the reasons is that telemetry and backdoors are
            | invisible. If the phone showed a message like "sending your
            | data to Cupertino", users would be better aware of it.
            | Sadly, I doubt there will ever be a legal requirement to do
            | this.
        
             | j45 wrote:
             | Anything is possible through lobbying for regulation and
             | policy.
             | 
              | It's the same way that bills come out that crack people's
              | privacy.
              | 
              | Only people don't always know they can demand the opposite
              | so it never gets messed with again; instead they get roped
              | into the fatigue of reacting to technology bills written by
              | non-technology people.
        
           | latexr wrote:
           | > What I observed is people giving up more and more privacy
           | every year (or "delegating" their privacy to tech giants).
           | 
           | Are people _giving_ up their privacy? Looks to me it's being
           | _taken_ without consent, via enormous legalese and techniques
           | of exhaustion.
        
             | j45 wrote:
             | Totally.
             | 
              | Individuals who grew up primarily as consumers of tech
              | have also consented to a relationship of being consumed,
              | bought, and sold as the product themselves.
              | 
              | Those who grew up primarily as creators with tech have
              | often experienced the difference.
              | 
              | This potentially creates a really big blind spot.
        
           | m463 wrote:
           | Come on, being forced to give up privacy is eroding privacy
           | and increasing discontent.
           | 
            | "Forced" can also mean no privacy by default and dark
            | patterns everywhere.
        
           | saghm wrote:
            | Whether or not people in general are aware of this issue and
            | care about it, I think it's pretty disingenuous to
            | characterize people as willfully giving up their privacy
            | because they own a smartphone. When stuff like this is
            | happening on both iOS and Android, it's not feasible to avoid
            | it without just opting out of having a smartphone entirely,
            | and representing it as a binary choice of "choose privacy or
            | choose not to care about privacy" is counterproductive,
            | condescending, and a huge oversimplification.
        
             | hn3er1q wrote:
             | Maybe not privacy in general but this is about location
             | privacy.
             | 
             | If you have a smartphone in your pocket, then, for better
             | or worse, you're carrying a location tracker chip on your
             | person because that's how they all work. The cell phone
             | company needs to know where to send/get data, if nothing
             | else.
             | 
             | It seems disingenuous to put a tracker chip in your pocket
             | and be up in arms that someone knows your location.
             | 
             | Unless this kerfuffle is only about Apple.
        
         | colordrops wrote:
         | Use a rooted Android phone with AFWall+ installed, with default
         | block rules. Even just LineageOS allows you to set granular
         | network settings per app, though it's not preemptive like
         | AFWall.
        
           | mdmsknrjjdndd wrote:
           | "Use a rooted..."
           | 
           | Aaaaand no.
        
             | pmlnr wrote:
             | So you don't want to actually own your devices?
        
               | highwaylights wrote:
                | This line of thinking ignores a whole bunch of legitimate
                | reasons why people knowledgeable enough to root their
                | phone still choose not to, not least of which is that I
                | would have to exchange trusting a large corporation with
                | a financial incentive to keep my device secure
                | (regulations, liability) for trusting an Internet anon
                | with an incentive to do the opposite (no direct
                | compensation, but access to banking apps on the user's
                | device).
                | 
                | Even in the case where I'm willing to risk trusting the
                | developer, they have literally zero resources to pen-test
                | the software I'll be running my banking apps on, and in
                | the case of Android ROMs I would need to run known-
                | vulnerable software (out-of-support, source-unavailable
                | binary blobs for proprietary hardware that were never
                | open-sourced).
                | 
                | The same argument was made about TPMs on PCs and against
                | Windows 11 for years (that they should just be
                | disabled/sidestepped). It only holds water if you don't
                | understand the problem the device solves or have a
                | suitable alternative.
        
           | gtdawg wrote:
            | Can't run various banking apps and can't run PagerDuty on a
            | rooted device due to the Google Play Integrity API check. The
            | ecosystem is closing in on any option to not send telemetry,
            | and Google is leading the way in the restrictions on freedom.
        
             | pmlnr wrote:
             | You can, with magisk.
        
               | mschuster91 wrote:
               | That's a cat and mouse game, and some banking apps still
               | manage to detect rooting, and so does Netflix and Google
               | Pay.
        
               | colordrops wrote:
                | Not really though. I've been using LineageOS+Magisk for
                | at least 6 years and haven't had an app that worked
                | suddenly stop working. I'm using all my banking apps and
                | everything else without issue, and have been for a long
                | time. The app devs don't seem hellbent on blocking those
                | willing to use Magisk.
        
             | sofixa wrote:
             | > Google is leading the way in the restrictions on Freedom.
             | 
             | They're the ones allowing you to root your phone or flash a
             | custom ROM in the first place, so that's not a fair
             | characterisation. Banks have a vested interest in reducing
             | fraud, and a rooted Android might allow for easier and
             | additional attack vectors into their apps and thus systems.
        
             | colordrops wrote:
              | Naw, using Magisk and its Zygisk denylist, it usually
              | works. I haven't been blocked by an app yet, including
              | PagerDuty.
        
         | jaza wrote:
         | Arrogant Apple always knows best! Which is why I've always
         | said, and I'll continue saying, fuck Apple.
        
         | miki123211 wrote:
          | Opt-in doesn't work; it never did.
          | 
          | The vast majority (>95%) of users do not understand what
          | those pop-ups say, seem fundamentally incapable of reading
          | them, and either always accept, always reject, or always
          | click the more visually appealing button.
         | 
         | Try observing a family member who is not in tech and not in the
         | professional managerial class, and ask them what pop-up they
         | just dismissed and why. It's one of the best lessons in the
         | interactions between tech and privacy you can get.
        
           | Longhanks wrote:
            | Well, then >95% of users won't be using $FEATURE. Simple as
            | that. The fact that users for some reason do not consent to
            | $FEATURE the way corporations/shareholders would want them to
            | does not give anyone the right to stop asking for consent in
            | the first place.
        
           | jermaustin1 wrote:
           | > Try observing a family member who is not in tech
           | 
            | This is everyone; it is universal. I've met many people "in
            | tech" who also click the most "visually appealing" button
            | because they are trying to dismiss everything in their way
            | to get to the action they are trying to complete.
            | 
            | The microcosm that is HN users might not dismiss things at
            | the 95%+ rate, but that is because we are fed, every day,
            | stories of how our data is being misappropriated at every
            | level. I think outside of these tiny communities, even
            | people in tech are just clicking the pretty button and
            | making the dialog go away.
        
           | mihaaly wrote:
            | Opt-in works!
            | 
            | It works if you do not want the feature - and that is >90%
            | of people, who never asked for it and never requested it,
            | but had these 'enriched' lies and exposure to corporate
            | greed forced upon them.
        
           | jasoneckert wrote:
           | When looked at from another angle, opt-in does work.
           | 
            | By adding that extra step forcing users to be aware of (and
            | optionally decline) the vendor's collection of personal
            | data, it adds a disincentive for collecting the data in the
            | first place.
            | 
            | In other words, opt-in can be thought of as a way to
            | encourage vendors to change their behaviour. Consumers who
            | don't see an opt-in prompt will eventually know that the
            | vendor, unlike others, isn't collecting their information,
            | and will trust the product more.
        
           | SamInTheShell wrote:
           | Informed consent is sexy. In the Apple ecosystem, we're
           | literally paying customers. This is ridiculous. This line you
           | parroted is ridiculous. This needs to stop.
        
           | spookie wrote:
           | The issue really isn't opt-in itself but how the option is
           | presented.
           | 
           | I agree that a lot of people don't read, or attempt to
           | understand the UI being presented to them in any meaningful
           | manner. It really is frustrating seeing that happen.
           | 
           | But, think about the "colorful" option you briefly mentioned.
           | Dark patterns have promoted this kind of behaviour from
           | popups. The whole interaction pattern has been forever
           | tainted. You need to present it in another way.
        
           | jchw wrote:
           | As much as I hate cookie consent dialogs everywhere, the fact
           | is that it is clearly working. Some companies are going as
           | far as to force users to pay money in order to be able to opt
           | out of data collection. If it wasn't so cumbersome to opt-
           | out, I reckon the numbers for opt-out would be even higher.
           | And if companies weren't so concerned about the small portion
           | of users that opt-out, they wouldn't have invested in finding
           | so many different dark patterns to make it hard.
           | 
           | It is definitely true that most users don't know what they're
           | opting out of, they just understand that they have basically
           | nothing to gain anyway, so why opt-in?
           | 
           | But actually, that's totally fine and working as intended. To
           | be fair to the end user, Apple has done something extremely
           | complicated here, and it's going to be extremely hard for
           | anyone except for an expert to understand it. A privacy-
           | conscious user could make the best call by just opting out of
           | any of these features. An everyday user might simply choose
            | to not opt in because they don't really care about the
            | feature in the first place. I suspect that's the real reason
            | why many people opt out: you don't need to understand
            | privacy risks to know you don't give a shit about the
            | feature anyway.
        
         | that_guy_iain wrote:
         | > Trust in software will continue to erode until software stops
         | treating end users and their data and resources (e.g. network
         | connections) as the vendor's own playground. Local on-device
         | data shouldn't be leaking out of radio interfaces unexpectedly,
         | period. There should be a user intent tied to any feature where
         | local data is sent out to the network.
         | 
          | I find that there is a specific niche group of people who care
          | very much about these things, but the rest of the world
          | doesn't. They don't want to think about all these little
          | settings; they're just "Oh cool, it knows it's the Eiffel
          | Tower". The only people becoming distrustful of software are a
          | specific niche group, and I highly suspect they're always
          | going to be mad about something.
         | 
          | > So why didn't Apple simply ask for user permission to
          | enable this feature?
         | 
          | Because most people don't even care to look at the new
          | features of a software update. And let's be serious, that
          | includes most of us here; otherwise this feature would have
          | been obvious. So why create a feature that no one will use? It
          | doesn't make sense. So you enable it for everyone, and those
          | who don't want it opt out.
        
         | pjmlp wrote:
          | Most people nowadays use web-based apps, which don't even need
          | to ask anything; who knows what the server side is doing.
          | 
          | Which is kind of ironic in places like HN, where so many
          | advocate for Chromebooks.
        
           | j45 wrote:
            | Your location data, encoded in the photos you take with the
            | phone's camera, being extracted by Apple is what this
            | article is about.
            | 
            | How many people use a web-based camera or web-based photo
            | album app?
        
             | pjmlp wrote:
              | GeoIP is in every Web request, unless a VPN is being used
              | alongside scrambled MAC addresses.
        
         | avianlyric wrote:
          | > So why didn't Apple simply ask for user permission to
          | enable this feature?
         | 
          | That's an interesting question. Something to consider: iOS
          | Photos has allowed you to search for photos using the address
          | the photo was taken at. To do that, the Photos app takes the
          | lat/long of a photo's location and does a reverse-geo lookup
          | to get a human-understandable address, something that pretty
          | much always involves querying a global reverse-geo service.
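          | 
          | To make concrete what such a lookup ships off the device,
          | here is a sketch against OpenStreetMap's public Nominatim
          | service (purely illustrative; Apple runs its own service):
          | 
          |     import json, urllib.request
          |     
          |     def reverse_geocode(lat, lon):
          |         # The raw coordinates necessarily go to the server.
          |         url = ("https://nominatim.openstreetmap.org/reverse"
          |                f"?lat={lat}&lon={lon}&format=json")
          |         req = urllib.request.Request(
          |             url, headers={"User-Agent": "demo-script"})
          |         with urllib.request.urlopen(req) as resp:
          |             return json.load(resp)["display_name"]
          |     
          |     print(reverse_geocode(48.8584, 2.2945))  # Eiffel Tower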
         | 
         | Do you consider this feature to be a violation of your privacy,
         | requiring an opt-in? If not, then how is a reverse-geo lookup
         | service more private than a landmark lookup service?
        
           | lapcat wrote:
           | This would work only if you've already given the Camera app
           | permission to geotag your photos, which I haven't, so it may
           | be a nonissue.
        
             | brians wrote:
             | It works if you use the Photos app to look at any image
             | with a geo EXIF tag.
             | 
             | But thank you for one more demonstration that even the HN
             | crowd can't reliably give or deny informed consent here.
        
               | lapcat wrote:
               | And how, pray tell, do geotagged images magically get
               | into your Photos library?
               | 
               | I actually couldn't get Photos address search to work
               | right in my testing before writing my previous comment,
               | even with a geotagged photo that I just took. So I'm not
               | sure whether I have some setting disabled that prevents
               | it.
               | 
               | The only match was via character recognition of a printed
               | form that I had photographed.
               | 
               | To be clear, I meant that it was a nonissue for me,
               | because I don't geotag my photos (except in that one
               | test). Whether it's an issue for other people, I don't
               | know.
               | 
               | One of the problems with iPhone lockdown is that it's a
               | lot more difficult to investigate how things work
               | technically than on the Mac.
        
               | heartbreak wrote:
               | > And how, pray tell, do geotagged images magically get
               | into your Photos library?
               | 
               | By saving them from any source other than the camera app
               | you've configured to not use geo-tagging.
        
               | lapcat wrote:
               | I don't do that.
               | 
               | My Photos library contains camera snapshots and
               | screenshots, nothing else.
        
               | heartbreak wrote:
               | The point still stands. That's how geo-tagged images get
               | in your photos, and the search function still works.
               | 
               | For what it's worth, I'm surprised that you never save
               | photos to your phone that you didn't take yourself. Do
               | people not send you interesting pictures? Pictures of
               | yourself?
        
               | lapcat wrote:
               | I don't actually use my phone for much. Taking photos,
               | making phone calls. I'm not the kind of person who lives
               | on their phone. I live on my laptop (which has Location
               | Services disabled entirely).
               | 
               | I don't even use Photos app on Mac anymore. They've
               | ruined it compared to the old iPhoto. I just keep my
               | photos in folders in Finder.
        
               | asadotzler wrote:
                | You started out suggesting it wasn't possible, and when
                | presented with a common case showing it was not only
                | possible but likely, you moved the goalposts and now say
                | "well, I don't do that." Weak sauce.
        
               | lapcat wrote:
               | I admit that I hadn't considered the possibility of
               | importing outside geotagged photos into your own Photos
               | library, because I don't do that. But I also said
               | already, "To be clear, I meant that it was a nonissue for
               | me, because I don't geotag my photos (except in that one
               | test). Whether it's an issue for other people, I don't
               | know."
               | 
               | I don't personally have a strong feeling for or against
               | this particular feature, since I don't use geotagging at
               | all, so it doesn't affect me. I'm neither endorsing nor
                | condemning it offhand. I'll leave it to other people to
               | argue over whether it's a privacy problem. It could be! I
               | just lack the proper perspective on this specific issue.
        
               | j45 wrote:
               | It's all about soundbite replies.
               | 
               | The issue is much deeper for anyone who has remotely
               | worked with EXIF data for any creative or professional
               | work they do.
        
           | hn3er1q wrote:
           | > Do you consider this feature to be a violation of your
           | privacy, requiring an opt-in?
           | 
            | I suppose in some sense it is, as it is a reverse-geo lookup
            | service, but it's also nowhere near the front line in the
            | location privacy war.
            | 
            | Cell phone providers basically know your exact position at
            | all times when you have your phone on you, credit card
            | companies know basically everything, cars track driving
            | directly, etc. etc.
            | 
            | I can see why some people would be up in arms, but for me
            | this one doesn't feel like missing the forest for the trees;
            | it feels like missing the forest for the leaves.
        
             | avianlyric wrote:
              | I very much agree with your position. There are legitimate
              | questions to be asked about this feature being opt-in,
              | although we may find that you implicitly opt in if you
              | enable Apple Intelligence or similar.
              | 
              | But the argument that this specific feature represents some
              | new beachhead in a great war against privacy strikes me as
              | little more than clickbait hyperbole. If Apple really
              | wanted to track people's locations, it would be trivial for
              | them to do so, without all this cloak-and-dagger nonsense
              | people seem to come up with. Equally, if a state entity
              | wanted to track your location (or even track people's
              | locations at scale), there is a myriad of trivially easy
              | ways for it to do so, without resorting to forcing Apple to
              | spy on its customers via a complex computer-vision landmark
              | lookup system.
        
           | doctorpangloss wrote:
            | You're right. But anyone in IT or tech, thinking deeply
            | about the raw facts, knows it always boils down to trust,
            | not technology.
            | 
            | The interesting thing is that Apple has created a cathedral
            | of seemingly objective, sexy technical details that feel
            | like security. But since it's all trust, feelings matter!
            | 
            | So my answer is: if it feels like a privacy violation, it
            | is. Your technical comparison will be more persuasive if
            | you present it in Computer Modern in a white paper, or if
            | you are an important Substack author or reply guy, or maybe
            | take a cue from the shawarma guy on Valencia Street and do
            | a hunger strike while comparing two ways to get location
            | info.
        
             | benwaffle wrote:
             | Apple chose to implement things like OHTTP and homomorphic
             | encryption when they could easily have done without it.
             | Doesn't that count for something?
        
               | j45 wrote:
                | Nope. It's still taking the user's data away without
                | informing them, and saying "trust us, we encrypted it
                | super well."
                | 
                | Apple is building a location database, for free, from
                | users' photos and saying it's anonymized.
                | 
                | It's not a service I want, nor one I authorize. Nor are
                | my photos licensed to Apple to extract that information
                | from me.
                | 
                | Encryption is only as good as the computational power
                | available to break it, to the many or the few.
                | 
                | The computational power to break today's encryption
                | usually becomes generally available within 10, 20, or 30
                | years, however unimaginably hard breaking it seems in the
                | present. I don't have any interest in taking technical
                | bait from the conversation at hand; determined groups
                | with resources could find ways. This results in no
                | security or encryption.
        
               | avianlyric wrote:
                | > Apple is building a location database, for free, from
                | users' photos and saying it's anonymized.
                | 
                | Where on earth did you get that from? The Photos app is
                | sending an 8-bit embedding for its lookup query; how are
                | they going to build a location database from that?
                | 
                | Even if they were sending entire photos, how do you
                | imagine someone builds a location database from that? You
                | still need something to figure out what the image is, and
                | if you already have that, why would you need to build it
                | again?
                | 
                | > Encryption is only as good as the computational power
                | available to break it, to the many or the few.
                | 
                | > Determined groups with resources could find ways. This
                | results in no security or encryption.
                | 
                | Tell me, do you sell tin foil hats as a side hustle or
                | something? If this is your view on encryption, why are
                | you worried about a silly photos app figuring out what
                | landmarks are in your photos? You basically believe that
                | digital privacy of any variety is effectively impossible,
                | and that this is a meaningful threat to "normal" people.
                | The only way to meet your criteria for safe privacy is to
                | eschew all forms of digital communication (which would
                | include Hacker News, FYI). So either you're knowingly
                | making disingenuous hyperbolic arguments, you're a
                | complete hypocrite, or you like to live "dangerously".
        
             | avianlyric wrote:
             | > So my answer is, if it feels like a privacy violation, it
             | is. Your technical comparison will be more persuasive if
             | you presented it in Computer Modern in a white paper, or if
             | you are an important Substack author or reply guy, or maybe
             | take a cue from the shawarma guy on Valencia Street and do
             | a hunger strike while comparing two ways to get location
             | info.
             | 
              | They're broadly similar services, both provided by the
              | same entity. Either you trust that entity or you don't.
              | You can't be happy with an older, less private feature
              | that can't be disabled, while simultaneously criticising
              | the same entity for creating a new feature (carrying all
              | the same privacy risks) that's technically more private
              | and can be completely disabled.
             | 
             | > The interesting thing is that Apple has created a
             | cathedral of seemingly objective sexy technical details
             | that feel like security. But since it's all trust, feelings
             | matter!
             | 
             | This is utterly irrelevant, you're basically making my
             | point for me. As above, either you do or do not trust Apple
             | to provide these services. The implementation is kinda
             | irrelevant. I'm simply asking people to be a little more
             | introspective, and take a little more time to consider
             | their position, before they start yelling from the rooftops
             | that this new feature represents some great privacy
             | deception.
        
           | j45 wrote:
            | It's a complete violation if a new or changed setting
            | defaults to on, when the user's previous state was not
            | having the feature at all.
            | 
            | Something to consider: location is already geo-encoded
            | into photos and doesn't need to be uploaded to Apple's
            | servers. Location search can be done locally, on device.
            | 
            | Apple even goes as far as offering a setting to strip the
            | geocoding from photos when sharing them.
            | 
            | A new feature should be offered as opt-in.
            | 
            | Unfortunately, against my better wishes, this only erodes
            | trust and confidence in Apple: if this is happening
            | visibly, what could be happening that is unknown?
        
         | mvkel wrote:
         | This mindset is how we got those awful cookie banners.
         | 
         | Even more dialogs that most users will blindly tap "Allow" to
         | will not fix the problem.
         | 
          | Society has collectively decided (spiritually) that it is
          | OK to sign over data-access rights to third parties. Adding
          | friction to this punishes the 98% of people in service of
          | the 2% who aren't going to use these services anyway.
          | 
          | Sure, a more educated populace might tip the scales. But
          | that's not reality, and the best UX reflects reality.
        
           | myspy wrote:
           | I always click disallow.
           | 
            | And if you design software that uses tracking and
            | whatnot: go fuck yourself.
        
           | diggan wrote:
           | > This mindset is how we got those awful cookie banners.
           | 
           | The only thing I've found awful is the mindset of the people
           | implementing the banners.
           | 
            | That you feel frustrated that every company has a cookie
            | banner is exactly the goal. A company could decide that
            | it isn't worth frustrating the user over something
            | trivial like website analytics, which it could get
            | without showing a cookie banner at all.
            | 
            | But no, they want all the data, even though they most
            | likely don't use all of it, and are therefore forced to
            | show the cookie banner.
            | 
            | Then you as a user see that banner, and instead of
            | thinking "what a shitty company that doesn't even do the
            | minimal work to avoid showing me a cookie banner", you
            | end up thinking "what a bad law, forcing the company to
            | inform me about what they do with my data". It sounds
            | backwards, but you're not the first with this sentiment,
            | so it seems the companies' PR departments have succeeded
            | in redirecting the blame...
        
             | necovek wrote:
             | Seconded: and we need to have worthy competitors spring up
             | without those bad practices and lousy cookie banners, and
             | people to flock to them.
             | 
             | Once that happens, the "originals" will feel the pressure.
        
               | bevan wrote:
               | Not using those "bad practices" of third party analytics
               | can be an existential threat to small businesses,
               | unfortunately.
        
               | diggan wrote:
               | Not really. You can still get metrics and analytics, you
               | just don't include PII in it. There are tons of privacy-
               | respecting platforms/services (both self-hosted and not)
               | you can use, instead of just slapping Google Analytics on
               | the website and having to show the banner.
               | 
                | But even so, I'd argue that a small business does
                | much better with qualitative data than quantitative:
                | with so little traffic, it's hard to make choices
                | based on a small amount of data. Instead, conduct
                | user-experience studies with real people, and you'll
                | get a ton of valuable data.
               | 
               | All without cookie banners :)
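                | 
                | For the curious, the trick most privacy-respecting
                | analytics services use is roughly this (a minimal
                | Python sketch with made-up names, not any particular
                | product): count daily uniques from a salted hash,
                | store no PII, set no cookies.
                | 
                |     import hashlib, secrets
                | 
                |     # Rotating salt: regenerated (and then
                |     # forgotten) daily, so visitor IDs can't
                |     # be linked across days.
                |     salt = secrets.token_bytes(16)
                | 
                |     def visitor_id(ip, ua):
                |         raw = salt + ip.encode() + ua.encode()
                |         return hashlib.sha256(raw).hexdigest()
                | 
                |     uniques = set()
                | 
                |     def record(ip, ua, path):
                |         uniques.add(visitor_id(ip, ua))
                | 
                |     record("203.0.113.7", "Mozilla/5.0", "/x")
                |     print("uniques today:", len(uniques))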
        
             | zo1 wrote:
             | In my experience, most people that have semi or full
             | decision-making control over this kind of thing have
             | absolutely no idea if they even need cookie consent
             | banners. They just fall for the marketing speak of every
              | single SaaS product that sells cookie-consent/GDPR stuff
             | and err on the side of caution. No one wants to be the guy
             | that says: "hey, we're only logging X, Y and not Z. And
             | GDPR says we need consent only if we log Z, so therefore we
             | don't need cookie consent." For starters, they need a
             | lawyer to tell them it's "A OK" to do it this way, and
             | secondly it's plain old cheaper and a lot less political
             | capital to just go with the herd on this. The cost of the
             | banner is off-loaded outside of the company and, for the
             | time being, the users don't seem to mind or care.
             | 
              | This is why half the web has cookie-consent banners. No
              | amount of developers who know the details screaming up
              | the ladder will fix this. The emergent behavior, put in
              | place by the legal profession and corporate politics,
              | favors the SaaS companies that sell GDPR cookie-banner
              | products and libraries. Even if a company is in the
              | right, there is a greater-than-zero chance that doing
              | the "wrong" thing lands it in court, or at least forces
              | it to defend itself. And even if that defense succeeds,
              | the lawyers still need to be paid, and the company will
              | blame "that fucking moron Joe from the website
              | department" who caused all their hassles and countless
              | hours of lost productivity by being a "smart ass".
        
               | diggan wrote:
               | > have absolutely no idea if they even need cookie
               | consent banners
               | 
               | > This is why half the web has cookie-consent banners
               | 
               | Agree, but we as developers can have an impact in this,
               | especially in smaller companies. I've managed to "bark up
               | the ladder" sufficiently to prevent people from
               | mindlessly adding those popups before, and I'm sure
               | others have too.
               | 
               | But those companies have all been companies where user
               | experience is pretty high up on the priority ladder, so
               | it's been easy cases to make.
        
             | xp84 wrote:
             | It's odd that you think the people implementing the banners
             | want them so they can get more data. They want them because
             | they provide a shield from litigation. I don't know about
             | you, but in the past year, most of my ads on Facebook are
             | from law firms with headlines like "have you browsed
             | (insert random minor e-commerce site) in the past two
             | years? Your data may have been shared. You may be entitled
             | to compensation." If I'm a random mom and pop e-commerce
             | site and I do not add a cookie banner, and I use any form
             | of advertising at all, then I am opening myself up to a
             | very expensive lawsuit - and attorneys are actively
             | recruiting randos to serve as plaintiffs despite them never
             | being harmed by "data collection."
             | 
              | It's that simple. That's the situation with CCPA; I'm
              | not sure what exact form GDPR penalties take because
              | I'm not European. But it's not a complicated issue: you
              | have to display some stupid consent thing if you're
              | going to run the code you're required to run in order
              | to buy ads which take people to your website.
             | 
             | Note that plenty of these cookie banner products don't
             | actually work right, because they're quite tricky to
             | configure correctly, as they're attempting to solve a
             | problem within the webpage sandbox that should be solved in
             | the browser settings (and could _easily_ be solved there
             | even today by setting it to discard cookies at close of
             | browser). However, the legal assistants or interns at the
             | law firm pick their victims based on who isn't showing an
             | obvious consent screen. When they see one, it's likely that
             | they will move onto the next victim because it's much
             | easier to prove violation of the law if they didn't even
             | bother to put up a cookie banner. A cookie banner that
             | doesn't work correctly is pretty easy to claim as a
             | mistake.
        
               | diggan wrote:
               | > If I'm a random mom and pop e-commerce site and I do
               | not add a cookie banner, and I use any form of
               | advertising at all, then I am opening myself up to a very
               | expensive lawsuit
               | 
                | Nope, that's not how it works. But your whole comment
                | is a great showcase of how these myths persist, even
                | though the whole internet is out there, filled with
                | knowledge you could slurp up at a moment's notice.
        
           | necovek wrote:
            | Nope, collective indifference to subpar user experiences
            | has gotten us those lousy cookie banners.
            | 
            | Websites could legally use cookies for non-tracking
            | purposes without cookie banners, but the fact that people
            | have not stopped visiting sites despite the fugly
            | click-through banners makes the banners a failure.
           | 
           | All it takes is for 50% of the internet users to stop
           | visiting web sites with them, and web site authors will stop
           | tracking users with external cookies.
        
             | chickenfeed wrote:
              | I read an article that said something along these
              | lines: people aren't prepared to pay for apps, so
              | instead we get app-store silos of advert-supported
              | crapware. And if it's not the apps, it's clickbait
              | making fractional gains from ad networks. Some of us,
              | but not all of us, recoil from that.
        
             | GlacierFox wrote:
             | "All it takes is for 50% of the internet users to stop
             | visiting web sites with them..."
             | 
             | You've written that like it's a plausible and likely
             | scenario.
        
               | xp84 wrote:
               | Yeah, this is an insane proposal. I know GP may be
               | imagining a smart populace walking away from Big Evil
               | Facebook and X with heads held high, but the other 99% of
               | sites are also doing the same cookie banner stupidity
               | because it is roughly mandatory due to useless EU law
               | (unless you're not engaging at all in advertising even as
               | an _advertiser_ ). So, no more accessing your bank, power
               | utility, doctor, college, etc. That'll show those pesky
               | cookie banner people!
               | 
               | "The Internet" to someone boycotting cookie banners would
               | basically just be a few self-hosted blogs.
        
               | necovek wrote:
               | I am happy to learn what I may have been imagining:
               | thanks for that!
               | 
                | The law has turned out to be useless, agreed -- or at
                | least, it has driven the hard-to-navigate UX we live
                | with today. With some care, the intent could have
                | taken us in a different direction (i.e. mandating a
                | clear, no-dark-pattern opt-out/opt-in ahead-of-time
                | option a la the DoNotTrack header that similarly
                | failed): if web clients (browsers) were required to
                | pass visitors' preferences, and if the list of
                | data-sharing partners had to be machine readable in
                | an exact format (so browsers could build nice UIs),
                | _maybe_ we'd get somewhere.
        
               | Pooge wrote:
               | You do not need to show a banner and ask for consent if
               | every cookie is to make the website work (e.g. for
                | authentication and settings). GDPR didn't create
                | these banners; websites that use needless cookies and
                | phone home to Big Tech did.
        
               | necovek wrote:
               | I think it's very unlikely. "All it takes" was tongue in
               | cheek.
               | 
               | (If it was likely, it would have already happened)
        
             | deivid wrote:
             | I think it's significantly less than 50% -- Your comment
             | made me think of The dictatorship of the small minority[0],
             | which places the value at 3~4%
             | 
             | [0]: https://medium.com/incerto/the-most-intolerant-wins-
             | the-dict...
        
             | YetAnotherNick wrote:
             | > All it takes is for 50% of the internet users to stop
             | visiting web sites with them, and web site authors will
             | stop tracking users with external cookies.
             | 
              | How would the content creators or news sites earn money
              | then? The web is built on ads, and ads are built on
              | tracking, since untargeted ads pay significantly less
              | than targeted ones.
        
           | 2Gkashmiri wrote:
            | I use Firefox Focus on Android, and Firefox with uBO and
            | other extensions.
            | 
            | On desktop and in the Firefox app I only browse in
            | private windows, so cookies are mostly irrelevant: the
            | session ends as soon as all windows close.
        
             | kQq9oHeAz6wLLS wrote:
             | But you're an edge case, and an extreme one at that.
        
           | csomar wrote:
            | No. A significant number of people care about privacy,
            | which is why 1. Apple was targeting them with ads and 2.
            | AdBlock did _hurt_ Google's business. Also, caring is
            | different from going to war (as in installing Linux and
            | manually setting up a privacy shield + Tor + only
            | transacting in Monero). Some people do that out of
            | principle. Many people want the privacy features but
            | with the ease of use.
        
             | mvkel wrote:
             | Define "significant," and do you have a source?
             | 
             | I'd bet if you ask people "do you care about privacy?"
             | Close to 100% would say yes.
             | 
             | If you ask "you have to give up privacy to be able to log
             | in to your email automatically. Are you ok with that?"
             | Close to 100% would say yes.
             | 
             | If you ask "we will give you this email service for free
             | but in exchange we get to squeeze every ounce of juice that
             | we can out of it to persuade you to buy things you don't
             | need. Are you ok with that?" Close to 100% would say yes.
             | 
             | It doesn't matter what people say they care about. Their
             | actions say otherwise, if the privacy-friendly option is in
             | any way less convenient.
        
               | robertlagrant wrote:
               | Sometimes I think the real power of Gmail is the Log In
               | With Google button.
        
           | nfriedly wrote:
            | uBlock Origin's annoyances filters take care of the
            | cookie banners, giving the best of both worlds: no
            | banners and a minimal amount of tracking.
            | 
            | (The "I don't care about cookies" extension is similarly
            | effective, but since I'm already running uBlock Origin,
            | it makes more sense to me to enable its filter.)
        
             | diggan wrote:
             | > u-Block Origin's annoyances filters take care of the
             | cookie banners, giving the best of both worlds: no banners
             | and a minimal amount of tracking.
             | 
             | Word of caution though, that might silently break some
             | websites. I've lost count of the times some HTTP request
             | silently failed because you weren't meant to be able to get
             | some part of the website, without first rejecting/accepting
             | the 3rd party cookies.
             | 
              | Usually, disabling uBlock, rejecting/accepting the
              | cookies and then re-enabling it solves the problem. But
              | the first time it happened, it caught me by surprise,
              | because why in holy hell would a site validate the
              | consent state on the server side?!
        
           | andrepd wrote:
           | Not really. A mandatory opt-in option at the browser level
           | would be the correct way to do it, but legislation forced
           | instead those cookie banners onto the webpage.
        
             | elondaits wrote:
              | No, legislation (the GDPR) doesn't say anything about
              | cookie pop-ups. It says that private data (of any kind)
              | can only be used with opt-in consent, given freely,
              | with no strings attached, with the ability to be
              | withdrawn, and that the data will be kept secure and
              | deleted when no longer needed for the original purpose,
              | etc. All very reasonable stuff. Tracking cookies are
              | affected, but the legislation covers all private data
              | (IP, email address, your location, etc.)... And if
              | browsers agreed on a standard to grant and withdraw
              | opt-in consent, it would be compatible with what the
              | legislation requires.
        
           | jchw wrote:
           | Actually, if my mindset were leading, we wouldn't have cookie
           | consent banners because we would've just banned non-essential
           | tracking altogether.
        
           | no_wizard wrote:
            | I'm spitballing here, but wouldn't another way to handle
            | it be to return dummy/null responses by redirecting
            | telemetry calls to something that does exactly that?
            | 
            | This would have the added benefit of being configurable,
            | and it would work on a bunch of apps at once instead of
            | just one at a time.
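            | 
            | Something like this minimal Python sketch, with the
            | telemetry hostnames pointed at 127.0.0.1 via /etc/hosts
            | or local DNS (it only works for plain-HTTP telemetry, or
            | HTTPS if you can install a local CA; pinned certificates
            | will just fail):
            | 
            |     from http.server import (
            |         BaseHTTPRequestHandler, HTTPServer)
            | 
            |     class NullSink(BaseHTTPRequestHandler):
            |         def _ok(self):
            |             # Pretend the upload succeeded.
            |             self.send_response(200)
            |             self.send_header(
            |                 "Content-Length", "0")
            |             self.end_headers()
            | 
            |         do_GET = do_POST = do_PUT = _ok
            | 
            |         def log_message(self, fmt, *args):
            |             pass  # stay quiet
            | 
            |     HTTPServer(("127.0.0.1", 8080),
            |                NullSink).serve_forever()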
        
           | timeon wrote:
           | Better to have cookie banners than made-up 'collective
           | decision'.
        
           | gameman144 wrote:
           | With cookie banners, legislation said that every website
           | needed to ask for consent -- a thousand sites, a thousand
           | banners.
           | 
           | Operating system level controls, though, provide a single
           | control plane. One can very easily imagine OS-level toggles
           | per application of:
           | 
           | [No Internet, No Internet outside your own app-sandbox, Ask
           | me every time, Everything is allowed].
           | 
           | No opt in from apps required -- they might break if the
           | network is disabled, but the user is still in control of
           | their data.
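            | 
            | A toy sketch of what that control plane could look like
            | (Python; every name here is hypothetical):
            | 
            |     from enum import Enum, auto
            | 
            |     class NetPolicy(Enum):
            |         NO_INTERNET = auto()
            |         SANDBOX_ONLY = auto()
            |         ASK = auto()
            |         ALLOW_ALL = auto()
            | 
            |     # One policy per app; default is to ask.
            |     policies = {
            |         "com.example.photos": NetPolicy.ASK,
            |     }
            | 
            |     def may_connect(app, dest, prompt_user):
            |         p = policies.get(app, NetPolicy.ASK)
            |         if p is NetPolicy.NO_INTERNET:
            |             return False
            |         if p is NetPolicy.SANDBOX_ONLY:
            |             return dest.startswith(
            |                 "sandbox://" + app)
            |         if p is NetPolicy.ASK:
            |             return prompt_user(app, dest)
            |         return True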
        
           | nazcan wrote:
           | I think it's clear that users should be able to have their
           | own agents that make these decisions. If you want an agent
           | that always defers to you and asks about Internet access,
           | great. If you want one that accepts it all great. If you want
           | one that uses some fancy logic, great.
        
           | RedComet wrote:
           | The cookie banners are an example of malicious compliance.
        
             | jonas21 wrote:
             | How so? Even the EU itself has a cookie banner:
             | 
             | https://european-union.europa.eu/
             | 
             | Unless you think the EU is maliciously complying with their
             | own regulation.
        
           | AgentOrange1234 wrote:
           | Why does it have to be more friction?
           | 
           | Users had a global way to signal "do not track me" in their
           | browser. I don't know why regulators didn't mandate
           | respecting that instead of cookie consent popups.
           | 
           | Apple IDs could easily have global settings about what you
           | are comfortable with, and then have their apps respect them.
        
         | sandworm101 wrote:
         | >> Trust in software will continue to erode until software
         | stops treating end users and their data and resources
         | 
         | Trust in closed-source _proprietary_ software. In other words:
         | trust in corporate entities. Trust in open-source software is
         | going strong.
        
           | m463 wrote:
           | Not a given though. Ubuntu phones home a lot by default.
           | 
           | Try disabling the motd stuff - it's quite pernicious by
           | design.
           | 
           | And removing the ubuntu-advantage package disables the
           | desktop. lol.
        
         | diggan wrote:
         | > I want software that doesn't send anything to the Internet
         | without some explicit intent first
         | 
         | I want this too, but when even the two most popular base OSes
         | don't adhere to this, I feel like it's an impossible uphill
         | battle to want the software running on those platforms to
         | behave like that.
         | 
         | "Local-first" just isn't in their vocabulary or best-interest,
         | considering the environment they act in today, sadly.
        
         | mgraczyk wrote:
         | Would you mind giving an example of something bad that could
         | happen to somebody as a result of Apple sending this data to
         | itself? Something concrete, where the harm would be realized,
         | for example somebody being hurt physically, emotionally,
         | psychologically, economically, etc
        
           | withinboredom wrote:
           | Once upon a time, I worked for a pretty big company (fortune
           | 500ish) and had access to production data. When a colleague
           | didn't show up at work as they were expected, I looked up
           | their location in our tracking database. They were in the
           | wrong country -- but I can't finish this story here.
           | 
           | Needless to say, if an Apple employee wanted to stalk someone
           | (say an abusive partner, creep, whatever), the fact that this
           | stuff phones home means that the employee can deduce where
           | they are located. I've heard stories from the early days of
           | Facebook about employees reading partner's Facebook messages,
           | back before they took that kind of stuff seriously.
           | 
           | People work at these places, and not all people are good.
        
             | mgraczyk wrote:
             | Your first story sounds like a good outcome.
             | 
              | I doubt Apple employees could deduce location from the
              | uploaded data. Having worked at FB, I know that doing
              | something like that would very quickly get you fired
              | post-2016.
        
           | acka wrote:
           | Easy, consider a parent taking pictures of their kid's
           | genitals to send to their doctor to investigate a medical
           | condition, the pictures getting flagged and reported to the
           | authorities as being child pornography by an automated
           | enforcement algorithm, leading to a 10-month criminal
           | investigation of the parent. This exact thing happened with
           | Google's algorithm using AI to hunt for CP[1], so it isn't
           | hard to imagine that it could happen with Apple software,
           | too.
           | 
           | [1] https://www.koffellaw.com/blog/google-ai-technology-
           | flags-da...
        
             | mgraczyk wrote:
             | Good example. I think this is worth the tradeoff
        
         | amelius wrote:
         | Even with opt-in a vendor will keep harassing the user until
         | they tap "yes" in an inattentive moment.
         | 
         | And I've been in situations where I noticed a box was checked
         | that I'm sure I didn't check. I want to turn these things off
         | and throw away the key. But of course the vendor will never
         | allow me to. Therefore I use Linux.
        
           | reaperducer wrote:
           | _I want to turn these things off and throw away the key. But
           | of course the vendor will never allow me to. Therefore I use
           | Linux._
           | 
           | I hate to break it to you, but these things happen in Linux,
           | too.
           | 
           | It's not the operating system that's the problem. It's that
           | the tech industry has normalized greed.
        
             | jchw wrote:
             | It is true that there are not absolutely zero instances of
             | telemetry or "phoning home" in Linux, but Desktop Linux is
             | not a similar experience to Windows or macOS in this
             | regard, and it isn't approaching that point, either. You
             | can tcpdump a clean install of Debian or what-have-you and
             | figure out all of what's going on with network traffic.
             | Making it whisper quiet typically isn't a huge endeavor
             | either, usually just need to disable some noisy local
             | networking features. Try Wiresharking a fresh Windows
             | install, after you've unchecked all of the privacy options
              | _and_ run some settings through Shutup10 or whatever.
              | There's still so much crap going _everywhere_. It's
              | hard to even stop Windows from sending the text you
              | type into the start menu back to Microsoft: there's no
              | option, you need to mess with Group Policy and hope
              | they don't change the feature enough to need a
              | different policy later to disable it again. macOS is
              | probably still better
             | (haven't checked in a while), but there are still some
             | features that basically can't be disabled that leak
             | information about what you're doing to Apple. For example,
             | you can't stop macOS from phoning home to check OCSP status
             | when launching software: there's no option to disable that.
             | 
             | The reason why this is the case is because while the tech
             | industry is rotten, the Linux desktop isn't really directly
             | owned by a tech industry company. There are a few tech
             | companies that work on Linux desktop things, but most of
              | them only work on it as a complement to other things
              | they do.
             | 
             | Distributions may even take it upon themselves to "fix"
             | applications that have unwanted features. Debian is
             | infamous for disabling the KeepassXC networking features,
             | like fetching favicons and the browser integration,
             | features a lot of users actually _did_ want.
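              | 
              | To put numbers on the "tcpdump a clean install" point,
              | a quick-and-dirty tally of destination IPs (Python
              | wrapping tcpdump; the interface name is an example and
              | you need capture privileges):
              | 
              |     import collections, re, subprocess
              | 
              |     counts = collections.Counter()
              |     cmd = ["tcpdump", "-l", "-n",
              |            "-i", "eth0", "tcp or udp"]
              |     proc = subprocess.Popen(
              |         cmd, stdout=subprocess.PIPE,
              |         text=True)
              |     pat = re.compile(
              |         r"> (\d+\.\d+\.\d+\.\d+)\.\d+:")
              |     try:
              |         for line in proc.stdout:
              |             m = pat.search(line)
              |             if m:
              |                 counts[m.group(1)] += 1
              |     except KeyboardInterrupt:
              |         proc.terminate()
              | 
              |     for ip, n in counts.most_common(20):
              |         print(n, ip)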
        
               | halfcat wrote:
               | Are there any tools that enable capturing traffic from
               | outside the OS you're monitoring, that still allow for
               | process-level monitoring?
               | 
               | Meaning, between the big vendors making the OS, and
               | state-level actors making hardware, I wouldn't
               | necessarily trust Wireshark on machine A to provide the
               | full picture of traffic from machine A. We might see this
               | already with servers running out-of-band management like
               | iDRAC (which is a perfectly fine, non-malicious use case)
               | but you could imagine the same thing where the NIC
               | firmware is phoning home, completely outside the
               | visibility of the OS.
               | 
               | Of course, it's not hard to capture traffic externally,
               | but the challenge here would be correlating that external
               | traffic with internal host monitoring data to determine
               | which apps are the culprit.
        
               | jchw wrote:
               | Curiosity has led me to check on and off if the local
               | traffic monitoring is missing anything that can be seen
               | externally a few times, but so far I've never observed
               | this happening. Though obviously, captures at different
               | layers can still yield some differences.
               | 
               | Still, if you were extra paranoid, it wouldn't be
               | unreasonable or even difficult to check from an external
               | vantage point.
               | 
               | > Are there any tools that enable capturing traffic from
               | outside the OS you're monitoring, that still allow for
               | process-level monitoring?
               | 
               | Doing both of these things at once would be hard, though.
               | You can't really trust the per-process tagging because
                | that processing _has_ to be done on the machine
                | itself. Still, it isn't entirely implausible: at the
                | very least, you could probably devise a scheme to
                | split the traffic for _specific_ apps into different
                | VLANs. (For Linux I would try to do this using
                | netns.)
        
             | amelius wrote:
             | Yes, but if it happens at least there is no greedy intent
             | and it will be corrected by the community.
        
           | jchw wrote:
           | For what it's worth, I use Linux, too, but as far as phones
           | go, stock phones that run Linux suffer from too many
           | reliability and stability issues for me to daily drive them.
           | I actually did try. So, as far as phones go, I'm stuck with
           | the Android/iOS duopoly like anyone else.
        
             | amelius wrote:
             | Well, I was speaking in general, not about phones.
        
         | Spooky23 wrote:
         | Consent for complex issues is a cop out for addressing privacy
         | concerns. Users will accept or reject these things without any
         | understanding of what they are doing either way. Apple seems to
         | have taken a middle ground where they de-risked the process and
         | made it a default.
         | 
         | This is a "look at me, Apple bad" story that harvests
         | attention. It sets the premise that this is an unknown and
         | undocumented process, then proceeds to explain it from Apple
         | documentation and published papers.
        
         | j45 wrote:
         | > So why didn't Apple just simply ask for user permission to
         | enable this feature? My cynical opinion is because Apple knows
         | some portion of users would instantly disallow this if
         | prompted, but they feel they know better than those users. I
         | don't like this attitude, and I suspect it is the same reason
         | why there is an increasing discontent growing towards opt-out
         | telemetry, too.
         | 
          | I'm just not sure why Apple needed to activate this by
          | default, other than to avoid drawing attention to it... as
          | if that mattered more than the user's right to the privacy
          | they believe they are purchasing with their device.
          | 
          | I don't care what convenience I'm being offered or sold.
          | If the user has decided what they want, given the premium
          | they are paying Apple, that decision must be respected.
         | 
          | This makes me wonder if there is an app that can monitor
          | all settings on an iPhone, both for changes between
          | updates and for new features enabled by default that
          | compromise the user's known wishes.
         | 
         | All this AI, and this is still overlooked.
         | 
         | I'm hoping it was an oversight.
        
         | eleveriven wrote:
         | You're absolutely right! And the decision to make this opt-out
         | feels dismissive
        
       | grishka wrote:
       | > It ought to be up to the individual user to decide their own
       | tolerance for the risk of privacy violations.
       | 
        | It's a recurring theme in the modern IT industry that the
        | user can somehow never be trusted to take responsibility for
        | anything. Like, can I _please_ not have the "added security"
        | on my Android phone where it sometimes asks me for the
        | pattern instead of the fingerprint? I know my own risks,
        | thank you very much, Google. Same for macOS occasionally
        | making me enter the password instead of letting me unlock
        | with Touch ID. Same for a bunch of other random software
        | treating me as a complete technologically illiterate idiot,
        | with no way to opt out of this hell.
       | 
       | Remember how our devices used to serve us, not the other way
       | around?
        
         | userbinator wrote:
         | It's the rise of authoritarianism, and its appearance in IT
         | mirrors what's happening in the rest of society too.
        
       | mrshadowgoose wrote:
       | Is this just a smokescreen around slowly sneaking CSAM scanning
       | back in after the pushback last time? The "default on" behavior
       | is suspect.
       | 
       | [1] https://www.wired.com/story/apple-photo-scanning-csam-
       | commun...
        
         | amatecha wrote:
         | Yup, this is their way of injecting the "phone home" element
         | via an innocuous rationale, "location matching". The global
         | index will of course also match against other markers they deem
         | worthy of matching, even if they don't return that to the user.
        
           | Thorrez wrote:
           | But wouldn't the homomorphic encryption prevent Apple's
           | servers from knowing if there was a match or not?
        
             | amatecha wrote:
             | The server must know what it's matching at some point, to
             | be able to generate a response:
             | 
             | > The server identifies the relevant shard based on the
             | index in the client query and uses HE to compute the
             | embedding similarity in this encrypted space. The encrypted
             | scores and set of corresponding metadata (such as landmark
             | names) for candidate landmarks are then returned to the
             | client.
             | 
             | Even with the server supposedly not knowing the identity of
             | the client, the response could simply include extra
             | metadata like some flag that then triggers an instant send
             | of that photo to Apple's (or law enforcement's) servers
             | unencrypted. Who knows?
             | 
             | [0] https://machinelearning.apple.com/research/homomorphic-
                | encry...
        
               | fragmede wrote:
               | > The server must know what it's matching at some point,
               | to be able to generate a response
               | 
               | The entire point of homomorphic encryption is that it
               | doesn't.
               | 
               | The homomorphic encrypted Wikipedia lookup example is
               | pretty neat.
               | 
               | https://spiralwiki.com/
               | 
               | https://news.ycombinator.com/item?id=31668814
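                | 
                | A toy example of the principle, additive-only and
                | with hopelessly small demo primes (this is Paillier,
                | not the lattice-based scheme Apple actually uses),
                | just to show a server computing on data it cannot
                | read:
                | 
                |     import math, random
                | 
                |     p, q = 293, 433   # demo-only primes
                |     n = p * q
                |     n2 = n * n
                |     g = n + 1
                |     lam = math.lcm(p - 1, q - 1)
                | 
                |     def L(x):
                |         return (x - 1) // n
                | 
                |     mu = pow(L(pow(g, lam, n2)), -1, n)
                | 
                |     def enc(m):
                |         r = random.randrange(1, n)
                |         while math.gcd(r, n) != 1:
                |             r = random.randrange(1, n)
                |         c = pow(g, m, n2) * pow(r, n, n2)
                |         return c % n2
                | 
                |     def dec(c):
                |         return L(pow(c, lam, n2)) * mu % n
                | 
                |     ca, cb = enc(17), enc(25)
                |     # Server multiplies ciphertexts it
                |     # cannot read...
                |     cs = ca * cb % n2
                |     # ...client decrypts the sum: 42.
                |     print(dec(cs))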
        
               | amatecha wrote:
               | Oh, if that's the case, they really could have explained
               | that better. The language used in Apple's article doesn't
               | explain that the server cannot know the query or result
               | (it implies as such, but doesn't make this clear, nor
               | explain how/why)
        
               | lloeki wrote:
               | > they really could have explained that better. The
               | language used in Apple's article
               | 
               | This one?
               | 
               | https://machinelearning.apple.com/research/homomorphic-
               | encry...
               | 
               | I find that description perfectly clear for someone who
               | doesn't already know what homomorphic encryption means:
               | 
               | > One of the key technologies we use to do this is
               | homomorphic encryption (HE), a form of cryptography that
               | enables computation on encrypted data (see Figure 1). HE
               | is designed so that a client device encrypts a query
               | before sending it to a server, and the server operates on
               | the encrypted query and generates an encrypted response,
               | which the client then decrypts. The server does not
               | decrypt the original request or even have access to the
               | decryption key, so HE is designed to keep the client
               | query private throughout the process.
               | 
               | Later:
               | 
               | > HE excels in settings where a client needs to look up
               | information on a server while keeping the lookup
               | computation encrypted.
               | 
               | And there's more perfectly layman-understandable in PIR
               | and PNSS sections, complete with real-life examples and
               | simple diagrams.
               | 
                | One just has to _read the goddamn thing_, which
                | apparently is an insurmountably tall order these days
                | for content that is more than 250 characters.
        
               | newZWhoDis wrote:
                | The setup for "that wasn't _real_ homomorphic
                | encryption!" is already in place for when, in 2-4
                | years, it comes out that they were doing this exact
                | thing.
                | 
                | The entire concept of a homomorphic encryption system
                | is a land mine outside of obscure academic
                | discussions. In practice, systems marketed to the
                | public as "homomorphic encryption" will result in
                | user-data exfiltration; mark my words.
        
             | NooneAtAll3 wrote:
              | Not if you need access from multiple devices
              | (otherwise, what's the point of this feature?).
              | 
              | In that case, the source of the common key for "the
              | same account" becomes the threat.
              | 
              | And now you have to trust... a megacorporation with a
              | closed-garden ecosystem... not to access _its own
              | servers_ in your place?
        
           | avianlyric wrote:
           | Honestly, why the hell would Apple bother with such a
           | contrived and machiavellian strategy to spy on their users?
           | 
           | They literally own the code to iOS. If they wanted to
           | covertly track their customers, they could just have their
           | devices phone home with whatever data they wanted to collect.
            | Realistically there would be no way to know if this was
            | actually happening, because modern devices emit so much
            | encrypted data anyway that it wouldn't be hard to hide
            | something nefarious in all the noise.
            | 
            | Tim Cook isn't some Bond villain sitting in a giant
            | chair, stroking a white cat, plotting to take over the
            | world by lulling everyone into a false sense of privacy
            | (I mean, Zuckerberg already did that). Apple is just a
            | giant corporation that wants to make money, and is
            | pretty damn open
           | about that fact. They clearly think that they can make more
           | money by doubling down on more privacy, but that doesn't work
           | if you don't actually provide the privacy, because
           | ultimately, people are really crap at keeping secrets,
           | especially when a media group would happily pay for a story,
           | even at Apple.
        
         | gojomo wrote:
          | My thoughts exactly: "we've got this crafty image
          | fingerprinting; the CSAM-detection use proved too
          | controversial to roll out, but let's get the core flows
          | into something that sounds useful for users, so the code
          | stays alive, keeps improving, and is ready for future
          | expansion."
         | 
         | Whether such fingerprinting can reliably be limited to public
         | "landmarks" is an interesting question, dependent on unclear
         | implementation details.
         | 
         | Even if the user-visible _search_ is limited to  'landmarks',
         | does the process pre-create (even if only on-device)
         | fingerprints of many other things as well? If so, it suddenly
         | becomes possible for briefly-active non-persistent malware to
         | instantly find images of interest without the wider access &
         | additional processing it'd otherwise take.
        
           | NooneAtAll3 wrote:
           | > let's get the core flows into something that sounds useful
           | for users
           | 
           | is it even that?
           | 
           | I don't see the benefit of this whatsoever
        
             | swores wrote:
             | The search feature is useful at times, and while local
             | processing is good enough to find (some of the) photos I've
             | taken that match a search term like "table", it can't
              | currently find a photo from a search term like
              | "specific neighbourhood in my city" or "name of a
              | specific mountain I climbed years ago" - so if
              | processing on their servers allows them to do that, it
              | would be genuinely beneficial.
             | 
             | But not beneficial enough to make up for the loss of
             | privacy, so I've disabled it without finding out how useful
             | or not the functionality is.
        
         | gigel82 wrote:
         | Very likely yes. Why else would they add a feature that incurs
         | costs for them as an update, at no cost to the users (and not
         | even make a fuss about it)?
         | 
         | It is obvious they are monetizing this feature somehow. Could
         | be as innocuous as them training their AI dataset, or feeding
         | into their growing ad business (locations and other things
         | identified in the photos), or collaboration with law
         | enforcement for various purposes (such as notifying the CCP
         | about people's Winnie-the-Pooh memes), or a lot more ominous
         | things.
        
           | avianlyric wrote:
           | > Very likely yes. Why else would they add a feature that
           | incurs costs for them as an update, at no cost to the users
           | (and not even make a fuss about it)?
           | 
            | Erm, you're aware of the whole Apple Intelligence thing,
            | right? An entire product that costs Apple money, provided
            | at "no cost" to the user (if you had an iPhone 15). Also,
            | every feature in an OS update has a cost associated with
            | it, and iOS updates haven't cost money for the best part
            | of a decade now.
           | 
            | Has it occurred to you that the reason Apple includes
            | new features in their updates is to provide customers
            | with more reasons to buy more iPhones? Just because
            | features are provided at "no cost" at the point of
            | consumption doesn't mean Apple won't make money in the
            | long run, and selling user data isn't the only way to
            | monetise these features. Companies were giving out
            | "freebies" for centuries before the internet existed,
            | when the possibility of large-scale data collection and
            | trading wasn't even imaginable.
        
         | EagnaIonat wrote:
          | The reaction to that whole incident was so misinformed.
          | 
          | CSAM scanning takes place in the cloud with all the major
          | players. The database only has hashes for the worst of the
          | worst stuff out there.
          | 
          | What Apple (and others) do today is allow the file to be
          | scanned unencrypted on the server.
          | 
          | The feature Apple wanted to add scanned the files on the
          | device and flagged anything that got a match.
          | 
          | A flagged file would be able to be decrypted on the server
          | and checked by a human. Everything else was encrypted in a
          | way that it could not be looked at.
          | 
          | If you had iCloud disabled, it could do nothing.
          | 
          | The intent was to protect data and children, and to reduce
          | the amount of processing done on the server end to analyse
          | everything.
          | 
          | Everyone lost their minds, yet it was all clearly laid out
          | in the papers Apple released on it.
        
           | Hackbraten wrote:
            | That technology of perceptual hashes could have failed
            | in numerous ways, ruining the lives of law-abiding users
            | along the way.
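            | 
            | For anyone unfamiliar, a minimal "average hash" sketch
            | shows why (Python with Pillow; Apple's NeuralHash is a
            | neural embedding, not this, but the failure mode is the
            | same): similar-looking images get similar bits, so
            | unrelated images can land close together.
            | 
            |     from PIL import Image
            | 
            |     def ahash(path, size=8):
            |         im = Image.open(path).convert("L")
            |         im = im.resize((size, size))
            |         px = list(im.getdata())
            |         avg = sum(px) / len(px)
            |         bits = 0
            |         for i, p in enumerate(px):
            |             if p > avg:
            |                 bits |= 1 << i
            |         return bits
            | 
            |     def hamming(a, b):
            |         return bin(a ^ b).count("1")
            | 
            |     # A "match" = hashes differing in only
            |     # a few bits (file names illustrative):
            |     # h1, h2 = ahash("a.jpg"), ahash("b.jpg")
            |     # print(hamming(h1, h2))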
        
           | privacyking wrote:
            | Apple sells their products in oppressive regimes which
            | force them to implement region-specific features. E.g.
            | China has its own iCloud, presumably so it can be easily
            | snooped on.
            | 
            | If they were to add this anti-CSAM feature, it is not
            | unreasonable to think that Apple would be forced to add
            | non-CSAM material to the database in those countries,
            | e.g. anything against the local dictatorship. Adding the
            | feature would only catch the low-hanging CSAM fruit, at
            | the cost of great privacy harm and probably human life.
            | If it were going to stop CSAM once and for all, it could
            | possibly be justified, but that's not the case.
        
             | foldr wrote:
             | If China can force Apple to do that stuff, then it can do
             | that regardless of whether or not they add this feature.
        
           | flik wrote:
            | Yes, this is better than uploading the entire photo.
            | Just as virus scanning can be done entirely on device,
            | can flagging be local? If homomorphic encryption allows
            | similarity matching, it does not seem entirely private.
            | Can people be matched?
        
           | amatecha wrote:
           | "It only has hashes for the worst of the worst stuff out
           | there." [citation needed]
           | 
           | I know someone whose MS account was permabanned because they
           | had photos of their own kid in the bathtub. I mean, I guess
           | the person could have been lying, but I doubt they would even
           | have been talking about it if the truth was less innocuous.
        
             | dwaite wrote:
             | Sure, and they do that because Microsoft's CSAM detection
             | product (which other providers like Google supposedly use)
             | operates by having unencrypted data access to your files in
             | the cloud.
             | 
             | What Apple wanted to do is do those operations using
             | homomorphic encryption and threshold key release so that
             | the data was checked while still encrypted, and only after
             | having a certain number of high likelihood matches would
             | the possibility exist to see the encrypted data.
             | 
             | So the optimistic perspective was that it was a solid win
             | against the current state of the industry (cloud accounts
             | storing information unencrypted so that CSAM products can
             | analyze data), while the pessimistic perspective was that
             | your phone was now acting as a snitch on your behavior
             | (slippery slope etc.)
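              | 
              | The threshold part is the interesting bit. A generic
              | illustration of the idea (Shamir secret sharing in
              | Python; not Apple's actual construction): any t shares
              | recover the key, while t-1 reveal nothing.
              | 
              |     import random
              | 
              |     P = 2**127 - 1  # prime field
              | 
              |     def make_shares(secret, t, n):
              |         cs = [secret] + [
              |             random.randrange(P)
              |             for _ in range(t - 1)]
              |         def f(x):
              |             return sum(
              |                 c * pow(x, i, P)
              |                 for i, c in enumerate(cs)) % P
              |         return [(x, f(x))
              |                 for x in range(1, n + 1)]
              | 
              |     def recover(shares):
              |         total = 0
              |         for xi, yi in shares:
              |             num = den = 1
              |             for xj, _ in shares:
              |                 if xj != xi:
              |                     num = num * -xj % P
              |                     den = den * (xi - xj) % P
              |             inv = pow(den, -1, P)
              |             total = (total
              |                      + yi * num * inv) % P
              |         return total
              | 
              |     sh = make_shares(424242, t=3, n=5)
              |     print(recover(sh[:3]))  # any 3 of 5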
        
           | troupo wrote:
           | > Everyone lost their mind yet it was clearly laid out in the
           | papers Apple released on it.
           | 
           | And people working with CSAM and databases of CSAM have said
           | it was a very bad idea.
        
           | cebert wrote:
           | > The intent was to protect data, children and reduce the
           | amount of processing done on the server end to analyse
           | everything.
           | 
           | If it's for the children, then giving up our civil liberties
           | is a small price to pay. I'd also like to give up liberties
           | in the name of "terrorism".
           | 
           | When we willingly give up our rights out of fear, these evil
           | people have won.
        
           | rzimmerman wrote:
            | I can't believe how uninformed, how angry, and yet how
            | willing to argue people were over this. The whole point
            | was a
           | very reasonable compromise between a legal requirement to
           | scan photos and keeping photos end-to-end encrypted for the
           | user. You can say the scanning requirement is wrong, there's
           | plenty of arguments for that. But Apple went so above and
           | beyond to try to keep photo content private and provide E2E
           | encryption while still trying to follow the spirit of the
           | law. No other big tech company even bothers, and somehow
           | Apple is the outrage target.
        
             | ec109685 wrote:
             | There isn't a law that requires them to proactively scan
             | photos. That is why they could turn the feature back off.
        
               | Scion9066 wrote:
               | A law by the government requiring proactive scanning of
               | photos would in fact make the whole situation worse in
               | the US because there would need to be a warrant if the
               | government is requiring the scan. As long as it's
               | voluntary by the company and not coerced by the
               | government, they can proactively scan.
        
             | mrshadowgoose wrote:
             | > a legal requirement to scan photos
             | 
             | Can you provide documentation demonstrating this
             | requirement in the United States? It is widely understood
             | that no such requirement exists.
             | 
             | There's no need to compromise with any requirement, this
             | was entirely voluntary on Apple's part. That's why people
             | were upset.
             | 
             | > I can't believe how uninformed
             | 
             | Oh the irony.
        
           | mrshadowgoose wrote:
           | > What the feature Apple wanted to add was scan the files on
           | the device and flag anything that gets a match.
           | 
           | This is not the revelation you think it is. Critics
           | understood this perfectly.
           | 
           | People simply did not want their devices scanning their
           | content against some opaque uninspectable government-
           | controlled list that might send you to jail in the case of a
           | match.
           | 
           | More generally, people usually want their devices working for
           | their personal interests only, and not some opaque government
           | purpose.
        
             | Scion9066 wrote:
             | From my understanding, it didn't scan all of the files on
             | the device, just the files that were getting uploaded to
             | Apple's iCloud. It was set up to scan the photos on the
             | device because the files were encrypted before they were
             | sent to the cloud and Apple couldn't access the contents
             | but still wanted to try to make sure that their cloud
             | wasn't storing anything that matched various hashes for bad
             | content.
             | 
             | If you never uploaded those files to the cloud, the
             | scanning wouldn't catch any files that are only local.
        
               | mrshadowgoose wrote:
               | Your understanding is correct, as was/is the
               | understanding of people critical of the feature.
               | 
               | People simply don't want their device's default state to
               | be "silently working against you, unless you are
               | hyperaware of everything that needs to be disabled".
               | Attacks on this desire were felt particularly strongly
               | due to Apple having no legal requirement to implement
               | that functionality.
               | 
               | One also can't make the moral argument that the "bad
               | content" list only included CSAM material, as that list
               | was deliberately made opaque. It was a "just trust me
               | bro" situation.
        
         | larusso wrote:
          | Yes, my thoughts as well. I guess the tech was so expensive to
          | build that they needed to test / run it to prove it's private?
          | I mean, couldn't the model that finds landmarks in your photos
          | run locally as well? OK, I'm not 100% sure here.
        
           | hmottestad wrote:
           | I assume that the model couldn't run locally for some reason.
           | Probably either uses too much power or needs too much memory.
        
         | KerrAvon wrote:
         | No, it is not. Whatever their other failings, Apple doesn't
         | think that way.
         | 
         | The cynical reason: consider that you can't plan engineering
         | features of this scale without written documentation, which
         | will always surface in court.
         | 
         | The prima facie reason: Apple genuinely wants to provide useful
         | features requiring server participation.
        
           | pkkkzip wrote:
           | This is incredibly naive and curiously defensive.
           | 
            | If this were a feature on its own, it would not be popular.
            | 
            | Some danger, such as national security, will be cited to
            | justify its existence.
            | 
            | Apple alone does not control and dictate what goes in. Once
            | you reach their level of size and wealth, exceeding that of
            | even developed countries, you ultimately cannot be the
            | controller of your own destiny purely as a profit-oriented
            | corporation.
            | 
            | e.g. Meta, Microsoft, Google
        
       | kittikitti wrote:
       | Here's a direct link to the paper that describes this as
       | mentioned in the post,
       | 
       | https://machinelearning.apple.com/research/homomorphic-encry...
       | 
       | The homomorphic code is available in Swift
       | 
       | https://www.swift.org/blog/announcing-swift-homomorphic-encr...
       | 
       | I have concerns about how women would be affected by law
       | enforcement who might use the location data to target abortions.
        
         | dwaite wrote:
         | > I have concerns about how women would be affected by law
         | enforcement who might use the location data to target
         | abortions.
         | 
         | I believe this risk already exists in the form of LE request
         | with court order to see the contents of an iCloud account, such
         | as conversations and photo metadata. Advanced Data Protection
         | exists to eliminate this particular vector - Apple no longer
         | has account recovery information like private keys escrowed, so
         | there's no way they can provide it if requested.
         | 
         | This would not increase such a risk since the network traffic
         | is effectively VPN'd for anonymization, clients send false data
         | periodically to break profiling or behavioral analysis, and the
         | requests and corresponding results are not comprehensible to
         | Apple's server due to the HE.
        
         | sgammon wrote:
         | > I have concerns about how women would be affected by law
         | enforcement who might use the location data to target
         | abortions.
         | 
         | What? How?
        
       | kjreact wrote:
       | Let's step back and go through the thought process of the team
       | that's implementing this feature. If they leave the feature
        | disabled by default, most likely casual users will never use it,
        | because they won't be able to find the setting buried under all
        | the menus. Thus, after adding all their privacy layers, the team
        | felt that it should be safe enough to enable by default while
       | remaining true to the company's ethos of privacy.
       | 
       | Now what would you have done differently if you were in charge of
       | rolling out such a feature? While I don't like Apple phoning home
       | without my consent, in this case my photos are not being sent to
        | Apple; only anonymized hashes used to match well-known landmarks.
       | 
       | The author of the article goes on to show his bias against Apple
       | with phrases like "You don't even have to hypothesize lies,
       | conspiracies, or malicious intentions on the part of Apple to be
       | suspicious of their privacy claims" or "Apple's QA nowadays is
       | atrocious."
       | 
       | Or this rich quote "I never wanted my iPhone to phone home to
       | Apple." What smartphone or computing device never communicates
       | back to their company servers? Even when I use open source
       | libraries I have to communicate with repositories to check for
       | dependencies. Does the author hide every online transaction he
       | makes? Never using email or text messages or cloud services which
       | will leak his private information? Unlikely. He just wants to
       | grind his axe against Apple.
       | 
        | So let's step back, look at it reasonably, and see if Apple is
        | trying to erode our privacy with this feature. I personally don't
        | see this particular feature as being harmful, but I will thank
        | the overzealous author for bringing it to my attention, and I'll
        | be disabling the feature since I don't need it. This feature is
        | nowhere near as invasive as the proposed CSAM detection tool,
        | which did warrant critical discussion.
       | 
        | Let's let the team, undoubtedly one of many with the unenviable
        | task of making Apple Intelligence relevant, who rolled out the
        | feature get their yearly bonus, and move on to discuss more
        | enlightening topics, such as the merits of keeping tabs on the
        | urination habits of the crew on the ISS via the Mac menubar.
        
         | sneak wrote:
         | If Apple can prompt me 4700 times after I've said no each time
         | to enable Apple TV and Apple Arcade and Apple Fitness
         | subscriptions, they can prompt users to enable a new feature if
         | they wish.
        
         | lapcat wrote:
         | > The author of the article goes on to show his bias against
         | Apple
         | 
         | My bias against Apple, as someone who exclusively uses and
         | develops software for Apple products? I've been a loyal Mac
         | user for over 20 years.
         | 
         | It's so ridiculous when people make an accusation like this.
         | For better or worse, I'm married to Apple platforms.
        
       | gerash wrote:
       | I don't mind the whole use of ML based features even if it
       | requires elaborate schemes to run some computation in the cloud
       | in some privacy preserving manner.
       | 
        | However, I do see that Apple's marketing campaign has tried to
        | fool people into thinking that non-Apple products are somehow
        | not private and that only Apple provides privacy.
        
         | CodeWriter23 wrote:
         | This obviously requires location services to be enabled. Why
         | does this need ML? Hit a GIS lookup API with the lat/long of
         | the photo.
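          |
          | If the geotag alone were enough, the lookup could even stay
          | entirely on-device. A minimal sketch, assuming a local landmark
          | table (the names, coordinates, and radius are invented):
          |
          |   from math import radians, sin, cos, asin, sqrt
          |
          |   LANDMARKS = [
          |       ("Statue of Liberty", 40.6892, -74.0445),
          |       ("Eiffel Tower", 48.8584, 2.2945),
          |   ]
          |
          |   def haversine_km(lat1, lon1, lat2, lon2):
          |       # Great-circle distance in kilometers.
          |       lat1, lon1, lat2, lon2 = map(
          |           radians, (lat1, lon1, lat2, lon2))
          |       a = (sin((lat2 - lat1) / 2) ** 2
          |            + cos(lat1) * cos(lat2)
          |            * sin((lon2 - lon1) / 2) ** 2)
          |       return 2 * 6371 * asin(sqrt(a))
          |
          |   def nearest(lat, lon, max_km=5.0):
          |       name, d = min(
          |           ((nm, haversine_km(lat, lon, la, lo))
          |            for nm, la, lo in LANDMARKS),
          |           key=lambda t: t[1])
          |       return name if d <= max_km else None
          |
          |   print(nearest(40.69, -74.04))  # -> Statue of Liberty
          |
          | Of course, that only identifies where the camera stood, not
          | what it was pointed at, which is presumably where the image
          | model comes in.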
        
       | mesmertech wrote:
        | Looks like it's not even available in the EU; I'll call that a win.
        
         | pndy wrote:
          | Where are you? I'm in Poland, and I just disabled it on both
          | phone and tablet - the toggle is "conveniently" placed at the
          | bottom of the Photos settings page.
        
       | supriyo-biswas wrote:
        | The referenced Apple blog post[1] is pretty clear on what this
        | feature does, and I wish the author at lapcatsoftware (as well as
        | folks here) had read it too, instead of taking the lapcat post at
        | face value.
       | 
       | Apple has implemented homomorphic encryption[2], which they can
       | use to compute distance metrics such as cosine similarity without
       | revealing the original query/embedding to the server. In the case
       | of photos, an on-device model is first used to detect if a
       | landmark may be possibly present in the image, and then sends an
       | encrypted embedding[3] for the image region containing said
       | landmark, which is evaluated on the server using HE techniques
       | and then the encrypted results are sent back.
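        |
        | To make that concrete, here is a toy sketch (Python) of the key
        | trick: the server computes a similarity score on an encrypted
        | embedding without ever seeing it. This is not Apple's protocol
        | (their post describes a lattice-based BFV scheme); textbook
        | Paillier with tiny, insecure primes is used below only because
        | its additive homomorphism fits in a few lines, and the vectors
        | and fixed-point scale are invented.
        |
        |   # Toy Paillier: Enc(a)*Enc(b) = Enc(a+b) (mod n^2) and
        |   # Enc(a)^k = Enc(k*a). Needs Python 3.8+ for pow(x, -1, m).
        |   from math import gcd
        |   import random
        |
        |   p, q = 1009, 1013   # demo primes; real keys are ~2048 bits
        |   n, n2 = p * q, (p * q) ** 2
        |   lam = (p - 1) * (q - 1)
        |   g = n + 1
        |   mu = pow(lam, -1, n)
        |
        |   def enc(m):
        |       r = random.randrange(1, n)
        |       while gcd(r, n) != 1:
        |           r = random.randrange(1, n)
        |       return pow(g, m, n2) * pow(r, n, n2) % n2
        |
        |   def dec(c):
        |       return (pow(c, lam, n2) - 1) // n * mu % n
        |
        |   SCALE = 100  # fixed-point encoding of float embeddings
        |   client = [int(x * SCALE) for x in (0.1, 0.8, 0.3)]
        |   server = [int(x * SCALE) for x in (0.2, 0.7, 0.4)]
        |
        |   # The client sends only ciphertexts.
        |   enc_client = [enc(x) for x in client]
        |
        |   # The server computes Enc(dot(client, server)) from the
        |   # ciphertexts and its own plaintext vector alone.
        |   enc_dot = 1
        |   for c, w in zip(enc_client, server):
        |       enc_dot = enc_dot * pow(c, w, n2) % n2
        |
        |   # Only the client holds the key to read the score.
        |   assert dec(enc_dot) == sum(
        |       a * b for a, b in zip(client, server))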
       | 
        | I'm sure someone will come along and say Apple does none of these
        | things; in which case, said commenter should probably not use
        | Apple devices, since there is no reason to trust the toggle for
        | "Enhanced Visual Search", and perhaps more importantly, the one
        | for "Advanced Data Protection" either. However, I rarely see any
        | other major company researching ML and HE together, so that alone
        | is a sufficient positive indicator in my eyes.
       | 
       | (It's interesting that this is also being downvoted. Please go
       | ahead, I can't stop you :P but please also write the parts that
       | you disagree with. Thank you.)
       | 
       | [1] https://machinelearning.apple.com/research/homomorphic-
       | encry...
       | 
       | [2] https://en.wikipedia.org/wiki/Homomorphic_encryption
       | 
       | [3] https://mlr.cdn-apple.com/video/HE_Fig_3_PNNS_889f3a279c.m4v
        
         | j45 wrote:
          | I was thinking less about the data than about the metadata,
          | direct and indirect.
        
         | penguinburglar wrote:
         | Thank you for posting the apple blog post. As usual, it's
         | really interesting research, and it's fascinating to see how
         | they solve potentially gnarly privacy issues.
        
           | lapcat wrote:
           | > Thank you for posting the apple blog post.
           | 
           | The very same blog post is linked in the submitted article.
        
         | gigel82 wrote:
         | That's not the point of the outrage though (at least not for
         | me). They enabled by default a feature that analyzes my
         | pictures (which I never upload to iCloud) and sends information
         | about them to their (and others') servers. That is a gross
         | violation of privacy.
         | 
         | To be clear, I don't care about any encryption scheme they may
         | be using, the gist is that they feel entitled to reach into
         | their users' most private data (the photos they explicitly said
         | they don't want to upload to iCloud) and "analyze" them.
         | 
          | This is the same as that time Microsoft enabled OneDrive "by
          | mistake" and started slurping people's private documents and
          | photos saved in default locations (this case is arguably worse,
          | since no one takes pictures with their PC's webcam).
        
           | supriyo-biswas wrote:
           | If the data is encrypted, does the concern still apply?
           | 
            | You bring up the example of OneDrive, but there is no use of
            | e2e encryption or HE techniques there.
        
             | gigel82 wrote:
             | Yes, of course, the concern is the data being exfiltrated
             | to begin with. Like someone else in this thread mentioned,
             | if they upload a single pixel from my image without my
             | consent, that is too much data being uploaded without my
             | consent.
        
               | scratchyone wrote:
               | If they sent a completely randomly generated integer from
               | your phone without consent, would that be okay with you?
               | Genuine question.
        
               | amiga386 wrote:
               | I'm not who you asked, but no, I wouldn't be.
               | 
               | I'd want an explanation of _why_ they want to send this
               | data. They need to seek _informed_ consent, and the
               | default needs to be _no_ data collection. Opt-in, not
               | opt-out.
               | 
                | If I do opt in, I can _withdraw_ that consent at any
               | time.
               | 
               | I can also expect them to delete any collected data
               | within a reasonable (to me) time frame, tell me what they
               | do with it, who they share it with, supply me with any
                | personally identifying data they have collected, and
               | allow me to correct it if it's wrong. And if they use the
               | data to make important decisions automatically, e.g. bank
               | loan yes/no, I have the right to make them use human
               | reasoning to reconsider.
               | 
               | There is no reason to let businesses non-consensually
                | collect _any_ data from you, even if that's their entire
               | business model. Don't let their self-serving lies about
               | "you can trust us" or "it's normal and inevitable"
               | swindle you out of your privacy.
               | 
               | Incidentally,"a completely randomly generated integer"
               | could describe Apple's Advertising Identifier, which
               | allows third parties to track the bejesus out of you.
        
               | JTyQZSnP3cQGa8B wrote:
               | > If they sent [...] without consent, would that be okay
               | with you?
               | 
               | No. That would never be acceptable to me.
        
             | isodev wrote:
             | > does the concern still apply?
             | 
             | Yes it does and the blogpost specifically explains why.
             | 
             | In short, both iOS and macOS are full of bugs, often with
             | the potential of exposing sensitive information.
             | 
              | Also, it's on by default - nobody in their right mind would
              | want bits of their photos uploaded somewhere, regardless of
              | "we promise we won't look".
             | 
             | Finally, Photos in iOS 18 is such a bad experience that it
             | seems the breach of privacy was fundamentally unjustified
             | as no meaningful improvement was introduced at all.
        
               | Thorrez wrote:
               | >regardless of "we promise we won't look".
               | 
               | AIUI, even if Apple's servers tried to look, they cannot,
               | because of the encryption.
        
               | isodev wrote:
                | Encryption does not automatically mean secure.
                | Encryption schemes can and will be broken. Any flaw in
                | their implementation (which nobody can verify) would
                | render the encryption useless...
        
               | supriyo-biswas wrote:
               | I wonder where you'd draw the line.
               | 
               | Do you also distrust TLS for example, and therefore
               | refuse to use the internet? What about AES/Chacha for
               | full-disk encryption?
        
               | isodev wrote:
               | The line is very simple - my content stays on device,
               | secured (locally) with the current modern and practical
               | tools.
        
               | Thorrez wrote:
               | Sure, but it's more than a promise that they won't look.
               | Apple currently believes it's impossible to look.
        
               | isodev wrote:
               | Apple is not a person, they don't "believe" anything.
               | 
               | As a corporation, they "calculate" what they can say
               | without getting into more trouble than they're willing to
               | entertain.
        
             | jchw wrote:
             | > If the data is encrypted, does the concern still apply?
             | 
             | Yes! For so many reasons!
             | 
             | If an adversary is able to intercept encrypted
             | communications, they can store it in hopes of decrypting it
             | in the future in the event that a feasible attack against
             | the cryptosystem emerges. I don't know how likely this is
             | to happen against homomorphic encryption schemes, but the
             | answer is not zero.
             | 
             | I'm not suggesting everyone should spend time worrying
             | about cryptosystems being cracked all day long, and I'm not
             | saying Apple's encryption scheme here will prove insecure.
              | Even if _this particular scheme_ is cracked, it's very
             | possible it won't reveal much of great interest anyways,
             | and again, that is simply not the point.
             | 
             | The point is that _the correct way to guarantee that your
             | data is private is to simply never transmit it or any
             | metadata related to it over a network in any form_. This
              | definitely limits what you can do, but it's a completely
             | achievable goal: before smartphones, and on early
             | smartphones, this was the _default behavior_ of taking
              | pictures with any digital camera, and it's pretty
             | upsetting that it's becoming incredibly hard to the point
             | of being _nearly impractical_ to get modern devices to
             | behave this way and not just fling data around all over the
             | place willy-nilly.
             | 
             | And I know people would like Apple to get credit for at
             | least attempting to make their features plausibly-private,
             | but I feel like it's just the wrong thing right now. What
             | we need today is software that gives agency back to the
             | user, and the first part of that is not sending data off to
             | the network without some form of user intent, without dark
             | patterns to coerce said intent. At best, I can say that I
             | hope Apple's approach to cloud services becomes the new
              | baseline _for cloud services_, but in my opinion, it's not
             | the future of privacy. The future of privacy is turning the
             | fucking radio off. Why the fuck should we all buy mobile
             | devices with $1000 worth of cutting edge hardware just to
             | offload all of the hard compute problems to a cloud server?
             | 
             | I'd also like to ask a different question: if there's no
             | reason to ever worry about this feature, then why is there
             | even an option to turn it off in the first place?
             | 
             | I worry that what Apple is _really_ doing with pushing out
             | all these fancy features, including their maligned CSAM
             | scanning initiative, is trying to get ahead of regulations
             | and position themselves as the baseline standard. In that
              | future, there's a possibility that options to turn off
             | features like these will disappear.
        
               | scratchyone wrote:
               | > I'd also like to ask a different question: if there's
               | no reason to ever worry about this feature, then why is
               | there even an option to turn it off in the first place?
               | 
               | I mean for one, because of people like you that are
               | concerned about it. Apple wants you to have the choice if
               | you are against this feature. It's silly to try to use
               | that as some sort of proof that the feature isn't safe.
               | 
               | My iPhone has a button to disable the flash in the camera
               | app. Does that imply that somehow using the camera flash
               | is dangerous and Apple is trying to hide the truth from
               | us all? Obviously not, it simply means that sometimes you
               | may not want to use the flash.
               | 
                | They likely chose to make it opt-out because their
                | research shows that this is truly completely private,
                | including being secure against future attacks by quantum
                | computers.
               | 
               | > If an adversary is able to intercept encrypted
               | communications, they can store it in hopes of decrypting
               | it in the future in the event that a feasible attack
               | against the cryptosystem emerges. I don't know how likely
               | this is to happen against homomorphic encryption schemes,
               | but the answer is not zero.
               | 
               | Also, if you're going to wildly speculate like this it is
               | at least (IMO) worth reading the research press release
               | since it does answer many of the questions you've posed
               | here[0].
               | 
               | > it's pretty upsetting that it's becoming incredibly
               | hard to the point of being nearly impractical to get
               | modern devices to behave this way and not just fling data
               | around all over the place willy-nilly.
               | 
               | And honestly, is turning off a single option in settings
               | truly impractical? Yes, it's opt-out, but that's because
               | their research shows that this is a safe feature. Not
               | every feature needs to be disabled by default. If most
               | users will want something turned on, it should probably
               | be on by default unless there's a very strong reason not
               | to. Otherwise, every single iPhone update would come with
               | a 30 question quiz where you have to pick and choose
               | which new features you want. Is that a reasonable
               | standard for the majority of non tech-savvy iPhone users?
               | 
               | Additionally, the entire purpose of a phone is to send
               | data places. It has Wi-Fi, Bluetooth, and Cellular for a
               | reason. It's a bit absurd to suggest that phones should
               | never send any data anywhere. It's simply a question of
               | what data should and should not be sent.
               | 
               | [0]
               | https://machinelearning.apple.com/research/homomorphic-
               | encry...
        
               | jchw wrote:
               | > I mean for one, because of people like you that are
               | concerned about it. Apple wants you to have the choice if
               | you are against this feature. It's silly to try to use
               | that as some sort of proof that the feature isn't safe.
               | 
               | If they know some people will be against the feature, why
               | not ask instead of enabling it for them?
               | 
               | > My iPhone has a button to disable the flash in the
               | camera app. Does that imply that somehow using the camera
               | flash is dangerous and Apple is trying to hide the truth
               | from us all? Obviously not, it simply means that
               | sometimes you may not want to use the flash.
               | 
               | Do you really not see how this is not a good faith
               | comparison? I'm not going to address this.
               | 
               | > They likely chose to make it opt-out because their
               | research shows that this is truly completely private,
               | including being secure against future post-quantum
               | attacks.
               | 
               | So basically your version of this story is:
               | 
               | - Apple knows some users will not like/trust this
               | feature, so they include an option to turn it off.
               | 
               | - But they don't bother to ask if it should be turned on,
               | because they are sure they know better than you anyway.
               | 
               | I agree. And it's this attitude that needs to die in
               | Silicon Valley and elsewhere.
               | 
               | > Also, if you're going to wildly speculate like this it
               | is at least (IMO) worth reading the research press
               | release since it does answer many of the questions you've
               | posed here[0].
               | 
                | I don't _need_ to; what I said generalizes to all
                | cryptosystems trivially. The only encryption technique
               | that provably can never be cracked is one-time pad, with
               | a key of truly random data, of size equal to or greater
               | than the data being encrypted. No other cryptosystem in
               | any other set of conditions has ever been proven
               | impossible to crack.
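                |
                | That one provably-secure case is, for what it's
                | worth, a few lines to demonstrate (Python; the
                | message is arbitrary):
                |
                |   import secrets
                |
                |   msg = b"meet at the usual place"
                |   # Truly random key, same length, never reused.
                |   key = secrets.token_bytes(len(msg))
                |   ct = bytes(m ^ k for m, k in zip(msg, key))
                |   pt = bytes(c ^ k for c, k in zip(ct, key))
                |   assert pt == msg
                |   # Without the key, every plaintext of this
                |   # length is equally likely: nothing to "crack".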
               | 
               | Homomorphic encryption is very cool, but you can't just
               | overwhelm the user with cryptosystem design and
               | mathematics and try to shrug away the fact that it is not
               | proven to be unbreakable. The fact that homomorphic
               | encryption is not proven to be unbreakable is absolutely
               | _not_ wild speculation, it is _fact_.
               | 
               | > And honestly, is turning off a single option in
               | settings truly impractical?
               | 
                | We all just learned about it today! We don't even need to
                | _speculate_ about whether it is impractical; we _know_ it
                | can't be done before you know the feature exists, and
                | that's before we consider that loss of privacy and agency
                | over devices is a death-by-a-thousand-cuts situation.
               | 
               | > Additionally, the entire purpose of a phone is to send
               | data places. It has Wi-Fi, Bluetooth, and Cellular for a
               | reason. It's a bit absurd to suggest that phones should
               | never send any data anywhere. It's simply a question of
               | what data should and should not be sent.
               | 
               | Clearly I don't think the internet is useless and I don't
               | disable networking on all of my devices because I'm
               | talking to you right now. But the difference is, when I
               | reply to you here, I'm never surprised about what is
               | being sent across the network. I'm typing this message
               | into this box, and when I hit reply, it will send that
               | message over the network to a server.
               | 
                | The difference here is agency. Steve Jobs had a quote
                | about computers being "bicycle[s] for the mind". Well, if
                | you just found out _today_ that your device was sending
                | metadata about your private photos over the network, you
                | would be right to feel like it's not you controlling the
                | bike anymore. The answer to this problem is not throwing
                | a bunch of technical information in your face and telling
                | you it's safe.
        
               | scratchyone wrote:
               | Honestly I'm a little tired so I'm not gonna
               | completely/perfectly address everything you said here but
               | 
               | > If they know some people will be against the feature,
               | why not ask instead of enabling it for them?
               | 
               | Honestly I would just say this is because you can only
               | ask so many things. This is a hard to explain feature,
               | and at some point you have to draw a line on what you
               | should and shouldn't ask for consent on. For many people,
               | the default reaction to a cookie popup is to hit "accept"
               | without reading because they see so many of them. Consent
               | fatigue is a privacy risk too. Curious where you'd choose
               | to draw the line?
               | 
               | > Do you really not see how this is not a good faith
               | comparison? I'm not going to address this.
               | 
               | Yes, my point is that your reasoning isn't good faith
               | either. We both know it's silly and a bit conspiratorial
               | to imply that Apple adding a setting for it means they
               | know a feature is secretly bad. If they wanted to hide
               | this from us, neither of us would be talking about it
               | right now because we wouldn't know it existed.
               | 
               | > We all just learned about today! We don't even need to
               | speculate about whether it is impractical, we know it
               | can't be done, and that's before we consider that loss of
               | privacy and agency over devices is a death-by-a-thousand-
               | cuts situation.
               | 
               | That's fair, but there's honestly no perfect answer here.
               | Either you appease the HN crowd on every feature but
               | overwhelm most non-technical users with too many popups
               | to the point they start automatically hitting "yes"
               | without reading them, or you make features that you truly
               | consider to be completely private opt-out but upset a
               | small subset of users who have extremely strict privacy
               | goals.
               | 
               | How do you choose where that dividing line is? Obviously
               | you can't ask consent for every single feature on your
               | phone, so at some point you have to decide where the line
               | between privacy and consent fatigue is. IMO, if a feature
               | is genuinely cryptographically secure and doesn't reveal
               | any private data, it probably should be opt-out to avoid
               | overwhelming the general public.
               | 
               | Also, how would you phrase the consent popup for this
                | feature? Remember that it has to be accurate, be
                | understandable to the majority of the US population, and
                | correctly state the privacy risks and benefits. That's
               | really hard to do correctly, especially given "21 percent
               | of adults in the United States (about 43 million) fall
               | into the illiterate/functionally illiterate category"[0].
               | 
               | [0] https://www.libraryjournal.com/story/How-Serious-Is-
               | Americas...
        
               | jchw wrote:
               | > Honestly I would just say this is because you can only
               | ask so many things. This is a hard to explain feature,
               | and at some point you have to draw a line on what you
               | should and shouldn't ask for consent on. For many people,
               | the default reaction to a cookie popup is to hit "accept"
               | without reading because they see so many of them. Consent
               | fatigue is a privacy risk too. Curious where you'd choose
               | to draw the line?
               | 
               | Cookie consent fatigue is both good and bad. Before
               | cookie consent, you just simply had no idea all of this
               | crap was going on; cookie consent took something
               | invisible and made it visible. I agree it sucks, but
               | learning that basically everything you use collects and
               | wants to continue to collect data that isn't essential
               | has opened a lot of people's eyes to how absurdly wide
               | data collection has become.
               | 
               | > Yes, my point is that your reasoning isn't good faith
               | either. We both know it's silly and a bit conspiratorial
               | to imply that Apple adding a setting for it means they
               | know a feature is secretly bad. If they wanted to hide
               | this from us, neither of us would be talking about it
               | right now because we wouldn't know it existed.
               | 
                | Apple doesn't add options for no reason. It's _useful_ to
                | control whether the camera flash goes off, for
                | potentially many reasons. Similarly, if this feature were
                | absolutely bullet-proof, it wouldn't need an option. The
                | option exists because having derivatives of private data
                | flowing over to servers you don't control is not ideal
                | practice for privacy, and Apple knows this.
               | 
               | And of course Apple isn't trying to hide it, they're
               | currently trying to sell it to everyone (probably mainly
               | to regulators and shareholders, honestly) that it's the
               | best thing for user privacy since sliced bread.
               | 
               | > That's fair, but there's honestly no perfect answer
               | here. Either you appease the HN crowd on every feature
               | but overwhelm most non-technical users with too many
               | popups to the point they start automatically hitting
               | "yes" without reading them, or you make features that you
               | truly consider to be completely private opt-out but upset
               | a small subset of users who have extremely strict privacy
               | goals.
               | 
               | > How do you choose where that dividing line is?
               | Obviously you can't ask consent for every single feature
               | on your phone, so at some point you have to decide where
               | the line between privacy and consent fatigue is. IMO, if
               | a feature is genuinely cryptographically secure and
               | doesn't reveal any private data, it probably should be
               | opt-out to avoid overwhelming the general public.
               | 
               | > Also, how would you phrase the consent popup for this
               | feature? Remember that it has to be accurate, be
               | understandable to the majority of the US population, and
               | correct state the privacy risks and benefits. That's
               | really hard to do correctly, especially given "21 percent
               | of adults in the United States (about 43 million) fall
               | into the illiterate/functionally illiterate category"[0].
               | 
                | I honestly think the answer is simple: all of this cookie
                | consent bullshit exists because the _real_ answer is
                | relatively simple, but it's inconvenient. If online
                | behavioral tracking is _so_ bad for our privacy, and we
                | can't actually trust companies to handle our data
                | properly, we should full-on ban it under (almost?) any
                | circumstances. There. Cookie consent fixed. You can't
                | track users for "non-essential" purposes anymore. And no,
                | they don't _need_ to. There was a time before this was
                | normal, and if we can help it, there will be a time
                | _after_ it was normal too. Protecting someone's stupid
                | business model is not a precondition for how we define
                | our digital rights.
               | 
               | This is exactly why I worry about Apple's intentions. For
               | what it's worth, I don't believe that the engineers or
               | even managers who worked on this feature had anything but
               | good intentions, and the technology is very cool.
               | Obviously, nobody is denying that Apple is good at making
               | these "privacy" technologies. But Apple as a whole seems
               | to want you to think that the root cause of the privacy
               | problem is just that our technology isn't private enough
               | and it can be fixed by making better technology.
               | Conveniently, they sell products with that technology. (I
               | do not think that it is any shocker that the first uses
               | of this technology are as innocuous as possible, either:
               | this is a great strategy to normalize it so it can
               | eventually be used for lucrative purposes like
               | advertising.)
               | 
                | But that's wrong, and Apple knows it. The privacy problem
                | is _mostly_ just an effect of the loss of agency people
                | have over their computers, and the reason is that the end
                | user is not the person that software, hardware, and
                | services are designed for anymore; they're designed for
                | shareholders. We barely regulate this shit, and users
               | have next to no recourse if they get pushed and coerced
               | to do things they don't want to. You just get what you
               | get and you have to pray to the tech gods that they don't
               | turn the screws even more, which they ultimately will if
               | it means they can bring more value to the shareholders.
               | Yes, I realize how cynical this is, but it's where we're
               | at today.
               | 
               | So yes, it's nice to not inundate the user with a bunch
               | of privacy prompts, but the best way to do that is to
               | remove and replace features that depend on remote
                | services. And hell, Apple already _did_ do a lot of that;
                | it's Google who would have absolute hell if they had to
                | put an individual prompt on every feature that harms
                | your privacy. Apple devices don't have very many privacy
                | prompts at all, and in this case that's to the point of
                | being a fault.
               | 
                | (P.S.: I know _I_ used the term and not Apple, but even
                | calling it "privacy" technology feels a bit misleading.
                | It's not actually improving your privacy, it's just
                | making a smaller impact on your privacy posture than the
                | leading alternative of just sending shit to cloud
                | services raw. It's a bit like how electric vehicles
                | aren't really "green" technology.)
        
               | sgammon wrote:
               | > And I know people would like Apple to get credit for at
               | least attempting to make their features plausibly-
               | private, but I feel like it's just the wrong thing right
               | now.
               | 
               | Appeal to bandwagon; opinion discarded
        
           | s3p wrote:
           | If you really didn't want your photos to be analyzed, would
           | you be using an iPhone? Or any modern smartphone? Google
           | photos doesn't have nearly the privacy focus and no HE
           | whatsoever but I rarely see that mentioned here. It almost
           | seems like Apple gets held to a higher standard just
           | _because_ they have privacy preserving initiatives. Do you
           | use a keyboard on your iphone? You may not have heard but
           | apple is tracking which emojis you type most often [0] and
           | they get sent to apple servers.
           | 
           | [0] https://www.apple.com/privacy/docs/Differential_Privacy_O
           | ver...
        
             | WA wrote:
             | > Google photos doesn't have nearly the privacy focus and
             | no HE whatsoever but I rarely see that mentioned here. It
             | almost seems like Apple gets held to a higher standard just
             | because they have privacy preserving initiatives.
             | 
             | What is so surprising about this? If you make big claims
             | about anything, you are held to your own standards.
        
             | dwaite wrote:
             | > It almost seems like Apple gets held to a higher standard
             | just because they have privacy preserving initiatives
             | 
              | It doesn't seem this way at all. It is rare to see someone
              | talking about current behavior; they are always talking
             | about the slippery slope - such as landmark detection
             | obviously being the first step in detecting propaganda on a
             | political dissident's phone.
             | 
             | This isn't driven by trying to hold them to a higher
             | standard; it is an emotional desire of wanting to see them
             | caught in a lie.
        
             | ctxc wrote:
              | Like the parent mentioned - I don't upload photos to Google
              | Photos, and I assume the parent doesn't upload photos to
              | iCloud.
             | 
             | Should photo info be sent to Apple/Google in this case?
        
         | joe_the_user wrote:
          | Just from memory, from when the scheme came up in earlier
          | discussion:
         | 
          | The system essentially scans for the signatures of some known
          | set of abuse images, aiming to capture abusers who would
          | naively keep just these images on their machines. (It can't
          | determine whether a new image is abusive, notably.)
         | 
         | It's conceivable some number of (foolish and abusive) people
         | will be caught this way and those favoring a long dragnet for
         | this stuff will be happy. But this opens the possibility that a
         | hacker could upload an image to an innocent person's computer
         | and get that person arrested. Those favoring the long dragnet
         | will naturally say the ends justify the means and you can't
         | make an omelet without cracking a few eggs. Oh, "think of the
         | children".
         | 
         | Edit: Also worth adding that once a company is scanning user
         | content to try to decide if the user is bad, it makes it that
         | much easier to scan all kind of content in all kind of ways for
         | all kinds of reasons. "for the good", naturally.
        
           | Thorrez wrote:
           | >The system is essentially scanning for the signature for
           | some known set of images of abuse
           | 
           | Huh? The system is scanning for landmarks, not images of
           | abuse.
           | 
           | >people will be caught this way
           | 
           | Due to the homomorphic encryption, I don't think Apple even
           | knows whether the image matches a landmark in Apple's server
           | database or not. So even if Apple put some images of abuse
           | into its server database (which Apple claims only contains
           | pictures of landmarks), I don't think Apple would know
           | whether there was a match or not.
        
             | zmgsabst wrote:
             | Does Apple explicitly say that?
             | 
             | Or only that they don't know which landmark it matched?
        
               | scratchyone wrote:
               | Fundamentally, vector search like this doesn't have a
               | concept of something "matching" or "not matching". It's
               | just a cosine similarity value. To determine if an image
               | "matches", you have to check if that similarity is within
               | some given threshold. If the results of the cosine
               | similarity operation are encrypted (they are with HE),
               | that wouldn't be possible to determine.
               | 
               | The bigger privacy risk would be that the device routes
               | the request to a specific database shard based on
               | whichever has a center-point closest to the image
               | embedding on the device. They take steps to protect this
               | information such as third-party proxying to hide user IP
               | addresses, as well as having devices send fake requests
               | so that the server cannot tell which are real user data
               | and which are fake data.
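                |
                | In code, the plaintext version of what I'm
                | describing looks something like this (the
                | vectors, centers, and threshold are invented):
                |
                |   from math import sqrt
                |
                |   def cosine(a, b):
                |       dot = sum(x * y for x, y in zip(a, b))
                |       na = sqrt(sum(x * x for x in a))
                |       nb = sqrt(sum(y * y for y in b))
                |       return dot / (na * nb)
                |
                |   # "Matching" is just a policy threshold laid
                |   # on top of the raw similarity score.
                |   def is_match(q, cand, threshold=0.95):
                |       return cosine(q, cand) >= threshold
                |
                |   # Client-side shard routing: pick the shard
                |   # whose center is most similar. This choice is
                |   # the leak that the proxying and fake queries
                |   # are meant to cover.
                |   def route(q, centers):
                |       return max(range(len(centers)),
                |                  key=lambda i: cosine(q, centers[i]))
                |
                |   q = [0.1, 0.8, 0.3]
                |   centers = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.2]]
                |   print(route(q, centers))        # -> 1
                |   print(is_match(q, centers[1]))  # -> True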
        
           | MBCook wrote:
           | That was the CSAM thing that they have announced they gave up
           | on.
           | 
           | This is totally different.
        
             | sureIy wrote:
             | Are you sure? Publicly yeah, but the same technology can
             | easily be used for the same purpose.
        
               | supriyo-biswas wrote:
               | As pointed out in a sibling comment, the result set is
               | also encrypted, so matches with abuse images, even if
               | there are some in Apple's POI database, can't be used to
               | implement the scheme as you suggest.
        
               | dwaite wrote:
               | Only in a very broad sense that they use HE to prevent
               | the server from seeing what happened.
               | 
               | The CSAM tech detected matches against particular photos
               | captured by law enforcement, and provided external
               | evidence of the match (e.g. enough positive matches
               | reconstructed a private key). It was not meant to do
               | topical matches (e.g. arbitrary child in a bathtub), and
               | had some protections to make it significantly harder to
               | manufacture false positives, e.g. noise manipulated in
               | kitten photos to cause them to meet the threshold to
               | match some known image in the dataset.
               | 
               | This gives a statistical likelihood of matching a cropped
               | image of a landmark-like object against known landmarks,
               | based on sets of photos of each landmark (like "this is
               | probably the Eiffel Tower"), and that likelihood is only
               | able to be seen by the phone. There's also significantly
                | less risk around abuse (someone making a kitten photo
                | come up as "The Great Wall of China").
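                |
                | That "enough positive matches reconstructed a
                | private key" step is threshold secret sharing. A
                | minimal Shamir sketch (toy parameters, not
                | Apple's exact construction):
                |
                |   import random
                |
                |   P = 2**127 - 1  # prime field modulus
                |
                |   def make_shares(secret, t, n):
                |       # Random degree t-1 polynomial with
                |       # f(0) = secret; each share is one point.
                |       cs = [secret] + [random.randrange(P)
                |                        for _ in range(t - 1)]
                |       def f(x):
                |           y = 0
                |           for c in reversed(cs):  # Horner
                |               y = (y * x + c) % P
                |           return y
                |       return [(x, f(x)) for x in range(1, n + 1)]
                |
                |   def reconstruct(shares):
                |       # Lagrange interpolation at x = 0; fewer
                |       # than t shares reveal nothing.
                |       s = 0
                |       for xi, yi in shares:
                |           num = den = 1
                |           for xj, _ in shares:
                |               if xj != xi:
                |                   num = num * -xj % P
                |                   den = den * (xi - xj) % P
                |           s = (s + yi * num * pow(den, -1, P)) % P
                |       return s
                |
                |   key = 0xC0FFEE
                |   shares = make_shares(key, t=3, n=5)
                |   assert reconstruct(shares[:3]) == key
                |   assert reconstruct(shares[2:]) == key  # any 3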
        
         | charles_f wrote:
         | > please also write the parts that you disagree with. Thank you
         | 
         | The problem invoked by the article is that data is being sent
         | back to Apple _by default_. Saying  "it's fine because it's
         | encrypted" and "don't use apple if you're not fine with that"
         | doesn't help.
         | 
          | The post complains that a product that used to keep sensitive
          | customer content local now sends data about that content to
          | Apple, and given the combination of privacy abuses and
          | horrendous, generalized security failures that we've seen
          | across the industry, those concerns seem genuine. Your comment
          | is very dismissive of these concerns, which would explain why
          | it's being downvoted.
        
           | dwaite wrote:
           | Apple also makes a mandatory API call to captive.apple.com
           | from every device with a web view, just about every time they
           | connect to a network.
           | 
           | If someone is willing to take a hardline stance that a vendor
           | should inform the user and require opt-in consent for every
           | type of exchange, they are going to have to run platforms
           | specifically targeted toward that mindset.
        
         | zaroth wrote:
         | Everything about what they did was absolutely fantastic and
         | amazing, until they turned it on by default.
        
         | ustad wrote:
         | You're extensively describing the technical implementation
         | while missing the fundamental issue: Why is Apple enabling this
         | feature by default for what is, essentially, a luxury photo
         | search feature?
         | 
         | Let's break this down:
         | 
         | 1. Nobody is questioning whether Apple's technical
         | implementation is sophisticated or secure. It clearly is.
         | 
         | 2. Nobody is suggesting the privacy toggles don't work. They
         | do.
         | 
         | 3. The issue is about Apple deciding that automatically sending
         | data about users' photos (regardless of how securely) is an
         | acceptable default for a feature many users may never want or
         | need.
         | 
         | Consider the value proposition here: Apple invested significant
         | engineering resources into complex homomorphic encryption and
         | differential privacy... so users can search for landmarks in
         | their photos? And they deemed this feature so essential that it
         | should be enabled by default?
         | 
         | This feels like using a golden vault with military-grade
         | security to store grocery lists. Yes, the security is
         | impressive, but that's not the point. The point is: Why are my
         | grocery lists being moved to the vault without asking me first?
         | 
         | A privacy-respecting approach would be: "Hey, would you like to
         | enable enhanced landmark search in your photos? We've built
         | some really impressive privacy protections to make this
         | secure..."
         | 
         | Instead of: "We've already started analyzing your photos for
         | landmarks because we built really impressive privacy
         | protections..."
         | 
         | The sophistication of the technology doesn't justify making it
         | the default for what is ultimately an optional convenience
         | feature.
        
           | slashdave wrote:
           | > users may never want or need
           | 
           | Are you assuming that Apple did not perform a market analysis
           | when implementing this feature? I think that is unlikely,
           | considering the effort.
        
       | hmage wrote:
       | Reading comments here feels like being on Twitter, Reddit and
       | 4chan combined - a lot of people not listening to each other.
       | 
       | What happened to old HN?
        
         | WuxiFingerHold wrote:
          | I've noticed that too. I think it depends on the topic. Many
          | are biased when it comes to Apple (me too). Some even defend
          | their misbehavior. My emotional reaction to Apple misbehaving
          | is usually anger, because I somehow feel disappointed, even
          | betrayed, by Apple. "Apple, how can you build such gorgeous
          | hardware and still act so unethically?" This is of course
          | irrational: Apple is a company that does everything to grow.
        
         | amazingamazing wrote:
         | Too mainstream, regression to mean.
        
         | Klonoar wrote:
          | My view is that a large group of people interested in building
          | companies, tools, etc. moved on, and only come back when
          | there's a PR issue that pushes them to comment. What's left
          | behind is basically the same old /.-esque crowd from the
          | internet of old.
        
       | sitkack wrote:
       | I don't understand why this feature needs to exist at all.
        | Ostensibly, if someone wanted to use this, they'd also have
        | location services on; the phone not only knows its location but
        | also which direction it is pointed, via the magnetometer.
        | 
        | I understand it is doing feature matching on the image, but if
        | you were taking a picture of the Statue of Liberty, the phone
        | would already know that from existing signals.
        
       | rzimmerman wrote:
       | If your core concern is privacy, surely you'd be fine with "no
       | bytes ever leave my device". But that's a big-hammer way to
       | ensure no one sees your private data. What about external
       | (iCloud/general cloud) storage? That's pretty useful, and if all
       | your data is encrypted in such a way that only you can read it,
       | would you consider that private? If done properly, I would say
       | that meets the goal.
       | 
       | What if, in addition to storage, I'd like to use some form of
       | cloud compute on my data? If my device preprocesses/anonymizes my
       | data, and the server involved uses homomorphic encryption so that
       | it also can't read my data, is that not also good enough? It's
        | frustrating to see how far above and beyond Apple has gone with
        | this simple service to actually preserve user privacy.
       | 
        | I get that enabling things by default reopens some old wounds.
       | But I can understand the argument that it's okay to enable off-
       | device use of personal data IF it's completely anonymous and
       | privacy preserving. That actually seems very reasonable. None of
       | the other mega-tech companies come close to this standard.
        
         | zmgsabst wrote:
         | No -- users should be the ones to decide if "encrypted on
         | remote storage" is a beneficial trade off for them and their
         | particular situation.
         | 
         | I think there's some weird impulse to control others behind
         | these decisions -- and I oppose that relationship paradigm on
         | its own grounds, independent from privacy: a company has no
         | business making those choices for me.
         | 
         | You are free to use such services if you wish; others are free
         | not to use those services.
        
         | maronato wrote:
         | iCloud is opt in. This should be too. A lot of people are fine
         | with keeping their photos offline-only and syncing with their
         | computers through a cable.
         | 
         | Making it "private" with clever encryption is their job since
         | Apple wants to sell privacy. They aren't doing it because they
         | are nice or care about us. Plus, code is written by people and
         | people write bugs. How can you tell this is truly bug-free and
         | doesn't leak anything?
         | 
          | Ultimately, making it opt-in would be painless and could be
          | enabled with a simple banner explaining the feature after the
          | update or on first boot, like all their opt-in features. Making
          | it opt-out is careless with their brand at best and sketchy
          | toward their users at worst, no matter how clever they say it
          | is.
        
       | Aachen wrote:
       | > I don't understand most of the technical details of Apple's
       | blog post.
       | 
       | I did understand the cited bits, and sorry to say but this could
       | have been an optimistic post ("look at this cool new thing!")
       | 
       | I dislike Apple's anti-hacker (in the HN sense of the word)
       | practices as much as the next person and don't own any Apple
       | device for that and other reasons, but saying "it doesn't matter
       | how you solved the privacy problem, I feel it's not private"
        | doesn't make it true. And since most other people won't
        | understand the cited words either, if they even read that far
        | down, this seems like unfair criticism.
        
         | rmbyrro wrote:
          | The main criticism is about sending private and sensitive data
          | to Apple without consent or warning.
        
           | jl6 wrote:
           | I imagine Apple might argue that no private information is
           | sent thanks to the use of homomorphic encryption. But Apple's
           | explanation rings hollow without the user having the ability
           | to verify that this system is working as described.
        
             | kccqzy wrote:
             | It's not just users having the ability to verify it but
             | also the users comprehending it in the first place. Sending
             | something somewhere without the recipient being able to do
             | anything they please with that information is highly
              | _unintuitive_, and I don't see homomorphic encryption
             | becoming popular anytime soon.
             | 
             | In a bit of personal news, in a previous job I once worked
              | on doing something similarly private with users' browsing
              | history; that is, the browsing history is sent to the
             | server without the server being able to capture or store
             | it. I was the tech lead for writing a prototype, but the
             | whole idea was then vetoed by a VP.
        
             | eleveriven wrote:
             | Users have no meaningful way to verify that their data is
             | genuinely protected
        
         | tgv wrote:
         | How can you trust something you don't understand? That must
         | come from "authority" (some person or org that you trust to
         | know about such matters). That authority isn't Apple for many
         | people. While I have cautious trust in Apple's privacy
         | policies, many people don't, and not without reason. Hence, not
         | understanding Apple's technical explanation of an Apple feature
         | that shares personal data you never opted in to sharing
         | increases the feeling of privacy violation (which in turn leads
         | to more distrust).
         | 
         | So would it be unfair criticism?
        
           | fguerraz wrote:
           | So you understand your own device's security? You have no
           | more reasons to trust the security of the Apple device in
           | your pocket than you do of an Apple device in a datacenter
           | IMHO.
        
           | avianlyric wrote:
           | > Hence, not understanding Apple's technical explanation of
           | an Apple feature you didn't opt in to sharing personal data
           | 
           | But this is the fundamental issue. The author has no idea if
           | personal data is being shared; they've made an assumption
           | based on their lack of understanding. It's entirely possible
           | (and arguably likely) that all this service does is provide
           | a private way for your phone to query a large database of
           | landmark fingerprints, then locally try to match those
           | fingerprints to your photos.
           | 
           | It doesn't require sending up any private data. The phone
           | could perform large geographic queries (the size of
           | countries) for batches of fingerprints to be cached locally
           | for photo matching. The homomorphic encryption just provides
           | an added layer of privacy, allowing the phone to make those
           | queries in a manner that makes it impossible for Apple to
           | know what regions were queried for.
           | 
           | iOS Photos already uses databases to convert a photo location
           | into an address, so you can do basic location-based
           | searching. That involves doing lookups in Apple's global
           | address database; do you consider that a violation of
           | people's privacy?
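           | 
           | To make that hypothetical concrete, here is a sketch of the
           | "fetch a regional batch, match locally" flow. Everything
           | below is invented for illustration - it is not Apple's
           | documented pipeline:
           | 
           |     # Hypothetical flow: server returns landmark
           |     # "fingerprints" (feature vectors) for a whole coarse
           |     # region; matching your photos happens on-device.
           |     import math
           | 
           |     def cosine(a, b):
           |         dot = sum(x * y for x, y in zip(a, b))
           |         na = math.sqrt(sum(x * x for x in a))
           |         nb = math.sqrt(sum(x * x for x in b))
           |         return dot / (na * nb)
           | 
           |     # Pretend server response for one large region.
           |     cached = {
           |         "Golden Gate Bridge": [0.9, 0.1, 0.3],
           |         "Alcatraz": [0.2, 0.8, 0.4],
           |     }
           | 
           |     def match_locally(vec, thresh=0.95):
           |         name, fp = max(cached.items(),
           |                        key=lambda kv: cosine(vec, kv[1]))
           |         if cosine(vec, fp) >= thresh:
           |             return name
           |         return None
           | 
           |     print(match_locally([0.88, 0.12, 0.31]))
           |     # -> "Golden Gate Bridge"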
        
         | hmottestad wrote:
         | Homomorphic encryption is something I heard about through a
         | research paper a few years ago.
         | 
         | Back then I understood that an operation like SUM would be able
         | to compute the sum of a list of numbers where each number was
         | encrypted. The way the encryption worked made it possible to
         | add all the values together without decrypting them, and the
         | result ended up being encrypted too in such a way that the
         | owner could decrypt it and have a number with a certain known
         | accuracy.
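         | 
         | As a toy illustration of that idea, here is Paillier's
         | additively homomorphic scheme with insecure toy parameters -
         | to be clear, not the scheme Apple uses, just the same
         | "sum without decrypting" property:
         | 
         |     # Multiplying Paillier ciphertexts adds the plaintexts,
         |     # so a server can total numbers it cannot read.
         |     from math import gcd, lcm
         |     import random
         | 
         |     p, q = 293, 433                # toy primes, NOT secure
         |     n, n2 = p * q, (p * q) ** 2
         |     g = n + 1
         |     lam = lcm(p - 1, q - 1)
         |     mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
         | 
         |     def encrypt(m):
         |         r = random.randrange(1, n)
         |         while gcd(r, n) != 1:
         |             r = random.randrange(1, n)
         |         return (pow(g, m, n2) * pow(r, n, n2)) % n2
         | 
         |     def decrypt(c):
         |         return ((pow(c, lam, n2) - 1) // n) * mu % n
         | 
         |     total = 1
         |     for v in [12, 30, 7]:
         |         total = (total * encrypt(v)) % n2
         |     assert decrypt(total) == 49    # 12 + 30 + 7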
         | 
         | If Apple is using homomorphic encryption correctly then there
         | should be no way for them to see the data they get from your
         | phone. The other things they mention in the post are ways to
         | prevent leaking other information through metadata or side
         | channels.
         | 
         | The fact that this feature was enabled by default isn't exactly
         | great. Definitely should have been something that the user
         | should have been asked if they wanted to enable after
         | upgrading.
        
           | pdpi wrote:
           | One specific use Apple is making of homomorphic encryption as
           | of iOS 18 (I think) is for spam callers. You get a phone
           | call, your phone sends Apple the encrypted phone number, they
           | run it against their spam caller database, and you get the
           | encrypted spam/not spam response back. They published a bunch
           | of open source code around this functionality a while back.
           | 
           | https://www.swift.org/blog/announcing-swift-homomorphic-
           | encr...
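           | 
           | For contrast, a much simpler private-lookup pattern is
           | prefix k-anonymity, as used by the Have I Been Pwned
           | password API. To be clear, this is not the homomorphic
           | scheme above - just a sketch of the same "check against a
           | server without revealing the query" goal, with invented
           | numbers:
           | 
           |     # Client reveals only a short hash prefix; the server
           |     # returns every entry under it; matching is on-device.
           |     import hashlib
           | 
           |     SPAM_DB = {"+15551234567", "+15559876543"}  # invented
           | 
           |     def server_lookup(prefix):
           |         hashes = (hashlib.sha1(x.encode()).hexdigest()
           |                   for x in SPAM_DB)
           |         return [h for h in hashes if h.startswith(prefix)]
           | 
           |     def client_is_spam(number):
           |         h = hashlib.sha1(number.encode()).hexdigest()
           |         return h in server_lookup(h[:5])
           | 
           |     print(client_is_spam("+15551234567"))  # True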
        
             | hmottestad wrote:
             | That's really cool! Thanks for sharing :)
        
         | lapcat wrote:
         | > sorry to say but this could have been an optimistic post
         | 
         | > don't own any Apple device
         | 
         | So you don't have any skin in the game, but you're criticizing
         | someone who does?
         | 
         | My blog post is written from the perspective of an Apple user
         | whose trust has been violated. It's nice that you think--from a
         | safe distance--the technology is neat, and maybe it is, but
         | that's irrelevant to the main issue, which is the lack of user
         | consent.
        
           | talldayo wrote:
           | Hacker News unfortunately does not respond to this logic
           | unless it is a company they are trained to hate. We could run
           | the same story reporting Google and Meta's opt-out abuses,
           | and it would also reach the frontpage with just as many
           | comments. Except those comments would be violent
           | condemnation, not apologetics and hand-wringing over
           | whitepaper quotes.
           | 
            | It's tragic, because computing is in a profoundly imperfect
            | place right now. Digital privacy is under fire, many payments
            | see a 30% digital service surcharge that is wholly arbitrary,
            | and revolutionary cross-platform standards are being
            | supplanted with proprietary and non-portable solutions that
            | do not benefit any user.
           | 
           | As an American, I am ashamed that our government's
           | dysfunction extends to consumer protection.
        
       | franczesko wrote:
       | We need to go back to local-first software. I'd be happy to pay
       | a premium for a device which is truly mine, without big tech
       | sniffing all my data or deciding what I can do with it.
        
       | bambax wrote:
       | Apple TOS:
       | 
       | > _To uphold our commitment to privacy while delivering these
       | experiences, we have implemented a combination of technologies to
       | help ensure these server lookups are private, efficient, and
       | scalable._
       | 
       | Efficiency and scalability have nothing to do with "upholding
       | one's commitment to privacy". This shows they're insincere.
       | 
       | But, is privacy achievable today? I doubt it. People desperately
       | want (or think they want) features that are opposite to privacy,
       | and if you don't give it to them, they're unhappy. I think
       | Apple's fault is in promising something they can't deliver. (That
       | also goes for Brave, Kagi, Duck Duck Go, etc.)
       | 
       | Scott McNealy famously said "You have zero privacy. Get over it."
       | This was in January 1999. 25 years ago!
        
         | supermatt wrote:
         | > Efficiency and scalability have nothing to do with "upholding
         | one's commitment to privacy". This shows they're insincere.
         | 
         | "private, efficient, and scalable" means "private AND efficient
         | AND scalable". What makes you think they are being insincere
         | about the privacy aspect?
        
           | bambax wrote:
           | If they said _To uphold our commitment to privacy while
           | delivering these experiences, we have implemented a
           | combination of technologies to help ensure these server
           | lookups are private_ then it would be fine. It would also be
           | a tautology.
           | 
           | When they add that server lookups are also "efficient and
           | scalable", it means that they have had to weigh the privacy
           | aspects against technical concerns regarding efficiency and
           | scalability, and that therefore, privacy is compromised.
           | 
           | I think a fair reading of this sentence would be: "we provide
           | a version of 'privacy' that we feel is acceptable, within
           | reasonable efficiency and scalability constraints".
           | 
           | They're not going to dedicate a server per customer for
           | example. Would it make sense to do it? No. But it would be
           | honest to say "because of efficiency and scalability limits,
           | the 'privacy' we provide is relative and subject to
           | breaches". (I actually think that's exactly what their
           | wording is trying to say.)
        
             | supermatt wrote:
             | > I actually think that's exactly what their wording is
             | trying to say.
             | 
             | And herein lies the problem, because they are literally
             | saying what they say - private AND efficient AND scalable.
             | You are the one adding hypothetical caveats.
             | 
             | They are talking about offloading functionality that cannot
              | occur on device. The scalability and efficiency are not
              | being used as a reason to take privacy away; they are
              | simply additional requirements of how they must meet
              | their pledge.
             | 
             | The quote you are referencing is literally about how these
             | features are implemented using homomorphic encryption to
             | perform lookups on anonymously accessed black boxes:
             | https://machinelearning.apple.com/research/homomorphic-
             | encry... . Which part of that is them sacrificing privacy
             | to make it scalable and efficient, and how would using a
             | dedicated server per user increase privacy? Is there some
             | specific privacy issue you can see with their
             | implementation?
        
               | bambax wrote:
               | To "uphold one's commitment to privacy" the only thing to
               | do is just that: uphold one's commitment to privacy. One
               | doesn't need to make sure the lookups are "efficient" or
               | "scalable" (or "clean" or "polite" or what have you).
               | That is absolutely not needed for privacy.
               | 
                | Why would they need to add these qualifiers, if not to
                | explain/justify that privacy isn't the only concern and
                | that there are technical matters to consider as well?
        
               | supermatt wrote:
               | Not needed for privacy, but not an obstruction to it
               | either.
               | 
               | They have thoroughly documented the architecture, so if
               | you have a concern about that, then state it.
               | 
               | At the moment you are saying that because it's efficient
               | and scalable it must be at the cost of privacy - but have
               | stated no reason for that to be true.
        
       | synecdoche wrote:
       | Apple decided to siphon photo-derived user data without prior
       | consent. We are expected to take their word for the purpose,
       | capacity, and risk of said data being in others' hands, all
       | while they neither informed users in advance nor trusted users
       | to decide for themselves whether to start the siphoning. By
       | doing this they have violated some users' trust.
        
       | stefanve wrote:
       | I wonder how many of the people who are upset about an option
       | that recognizes landmarks, and that is built in a privacy-minded
       | way, are using ChatGPT, Windows, Google, or social media.
       | 
       | The writer of the post admits that he doesn't understand Apple's
       | very tech-lite explanation, nor has he read about Apple AI in
       | general and the way it was set up. A lot of people are upset
       | that (maybe, but unlikely) Apple knows you took the 10 billionth
       | photo of the Golden Gate Bridge, but continue using all sorts of
       | services, living in security-camera-heavy cities, etc.
       | 
       | Not an Apple fan, but of all the big tech corporations they have
       | the least interest in mining user data from a business
       | perspective.
        
         | makeitdouble wrote:
         | You're ignoring the part where it's opt-out.
         | 
         | We're way past the point where things are accidentally opt-out
         | instead of opt-in; if you want to argue the point, it's
         | something you should address.
        
         | RyanHamilton wrote:
         | >>Not an apple fan but from all big tech corporations they have
         | the least amount of interest in mining user data from a
         | business perspective
         | 
         | I think it's more that they realised they couldn't beat Google
         | at that game and hoped to see personal or governmental push-
         | back, like the GDPR in the EU, which would then give them an
         | advantage. The day they stop having that belief, they will
         | invade your privacy.
        
       | JBiserkov wrote:
       | > Remember this advertisement? "What happens on your iPhone,
       | stays on your iPhone."
       | 
       | 1. Take the biggest downside of your product
       | 
       | 2. Advertise the opposite
       | 
       | 3. Profit
        
       | sgammon wrote:
       | Apple: _Does backflips to protect your data_
       | 
       | Apple's competition: _Steals your data outright_
       | 
       | Most of this thread: How could Apple act this way
        
         | bergie wrote:
         | > Apple: Does backflips to protect your data
         | 
         | That's what their marketing says. I'm not sure how far you can
         | actually trust that. They're an ad and media company as well.
        
           | sgammon wrote:
           | I am merely comparing homomorphic encryption, client-side
           | vectorization and ML, and so on, to not doing those things.
           | Nothing to do with marketing. Read the manual
        
             | RyanHamilton wrote:
             | >>Apple: Does backflips to protect your data
             | 
             | Yet they enabled this feature by default. Disabled by
             | default would be easier than a backflip?
        
               | sgammon wrote:
               | The feature is not unsafe in the first place, so turning
               | it on by default grants useful functionality without
               | risk.
        
         | ghjfrdghibt wrote:
         | You're kidding yourself if you think Apple is in the business
         | of protecting your data beyond marketing itself as such. Just
         | like all its competition, it's in the business of using your
         | data.
        
           | sgammon wrote:
           | > You're kidding yourself if you think apple is in the
           | business of protecting your data beyond marketing itself as
           | such.
           | 
           | It is entirely possible to build a business on user demands
           | that others ignore - like, say, privacy. Apple does not have
           | a massive advertising incentive to process user data as
           | others do. I am not a huge Apple fan (typing this from my
           | Windows machine), but I am not ashamed to admit that some
           | companies are good at some things, and others at other
           | things (insane idea, I know).
           | 
           | Apple does not _have_ to homomorphically encrypt your data.
           | They do so because they have  "made privacy a feature," as
           | they say in their marketing. You can also just read the
           | research papers or code if you want, since they have
           | published those, too. Have you read those?
        
             | ghjfrdghibt wrote:
             | No I've not read them, I don't see why I would as I'm not
              | an Apple customer. But I do wonder whether Apple is in
              | charge of / controls the encryption keys to your
              | encrypted data...
        
               | sgammon wrote:
               | They are not, that is what makes it secure and why you
               | should read the manual.
        
               | ghjfrdghibt wrote:
               | So no third party, including law enforcement can access
               | data held by apple? Including apple? At any point?
               | 
               | I'm asking these questions because I'm definitely not
               | going to read the manual, for the reason I've already
               | said, you seem to have read the manual, and I find it
                | hard to believe. I'm only aware that Apple offers
                | Advanced Data Protection for your uploaded data, which
                | you have to opt into; that might be what you're talking
                | about...?
        
       | chikere232 wrote:
       | As trivia, on mac os, the photoanalysisd service will run in the
       | background and look through your photos, even if you never open
       | Apple Photos. It can't be disabled unless you disable SIP (system
       | integrity protection) which requires a complicated dance of
       | reboots and warnings. It will re-enable itself if you turn SIP
       | back on.
       | 
       | It seems Apple are very passionate about analysing your photos
       | for some reason, regardless of whether you yourself are.
        
         | hmottestad wrote:
         | Isn't that running locally on your Mac though?
        
         | m463 wrote:
         | All kinds of nonsense runs and phones home throughout the OS.
         | The thing that annoyed me the most is that trying to create an
         | account will phone home to Apple, even when setting up a local
         | smtp/imap server on the local network.
        
         | snowwrestler wrote:
         | You may also be shocked to learn that Spotlight looks through
         | every single file on your Mac.
        
           | talldayo wrote:
           | I was. First by the md_worker processes that mysteriously
           | started pinning all of my CPU cores after a git clone. Then
           | by the realization that MacOS had built a full-text index of
           | millions of lines of source code (it only took a few hours of
           | my Mac being too hot to touch).
           | 
           | A lot of Apple's defaults are just plain bizarre. Why the
           | hell is Spotlight seeing source code mimetypes and feeding it
           | to the search index?
        
             | asdff wrote:
             | For real! When I search for a term, the last thing I want
             | is some esoteric plist file, but it'll return dozens of
             | those hits. Is there a way to exclude these, I wonder?
             | Limit it to what I have in the default home directory
             | structure, let's say, and not go into my launch agents.
        
               | st3fan wrote:
               | Look in the Spotlight settings. Not only can you
               | include/exclude default search types. But you can also
               | specify folders not to index.
               | 
               | Why is this even a question, do people not look at the
               | settings at all?
        
               | talldayo wrote:
               | > Why is this even a question, do people not look at the
               | settings at all?
               | 
               | No, I expect defaults that aren't asinine from a company
               | that bills themselves as a premium experience for
               | software developers. It should be common sense, for the
               | purposes of providing software to both ordinary users and
               | software developers, to omit source files that aren't
               | relevant search results. It's one of the most annoying
               | features on Windows too, you'd _hope_ Apple would see the
               | writing on the wall and just fix it.
               | 
               | Harboring hope was my mistake, though. Sold my Mac in
               | 2018 and haven't felt a desire to daily-drive MacOS
               | since.
        
             | st3fan wrote:
             | I love it that it indexes source code. It allows me to find
             | things easily.
        
               | talldayo wrote:
               | I have never once, in all of my years owning a Mac, used
                | Spotlight search to find a source file based on its text
               | contents. By comparison, I have absolutely wasted
               | probably close to an hour of my life cumulatively mashing
               | the down arrow to find a relevant result that wasn't a
                | CMake file.
        
           | chikere232 wrote:
           | Local search indexing is somewhat more defensible as a
           | system-level service, but yeah, it would be nice if that was
           | also up to me as a user.
        
         | rsync wrote:
         | "It seems Apple are very passionate about analysing your photos
         | for some reason, regardless if you yourself are."
         | 
         | Isn't this fragile to pollution from end users?
         | 
         | What if we all ran a local image generator trained on our own
         | photos... but slightly broken... and just flooded their photo
         | hash collection with garbage?
         | 
         | Now what?
         | 
         | This would be a very good flushing action. Lot would be learned
         | by seeing who got angry about this and how angry they got...
        
           | ricardobeat wrote:
           | No. The analysis in question is fully local, used for
           | indexing photos by categories in the Photos app. It is
           | unrelated to any cloud features and not something shared
           | across users.
           | 
           | They are also not using your personal photos to feed the
           | location database, most likely public sources and/or Apple
           | Maps data. If they are relying on GPS-tagged public photos
           | alone, you could probably mess up a system like this by
           | spoofing GPS locations en masse and posting them online for
           | years, but for what purpose?
        
             | rsync wrote:
             | Like I said, a good candidate for a "flushing action":
             | 
             | https://kozubik.com/items/FlushingAction/
             | 
             | Flood that system with garbage and let's see who gets upset
             | and how they communicate their upset.
             | 
             | If the system is architected as advertised it shouldn't be
             | a problem ...
        
         | slashdave wrote:
         | > for some reason
         | 
         | It's directly tied to features they provide in their photo app.
         | This is hardly obscure.
        
           | chikere232 wrote:
           | Why make it extremely hard to disable? Photos is hardly a
           | system level app
        
       | thinkingemote wrote:
       | So what can Apple do with this, assuming they can send anything,
       | not just "landmarks"?
       | 
       | What metadata about the image do they get?
       | 
       | Are images from the web or social media or device screenshots
       | accessible too?
       | 
       | Seems at the very least they are getting a database of things,
       | objects and shapes. Can cohorts of users be identified? Useful
       | for ad targeting?
       | 
       | They should be able to determine how many devices have a photo of
       | a particular person. Or have a photo of a piece of art. Useful
       | for capturing wanted people and stolen goods! They should see how
       | many times users take a photo of another phone. When and how many
       | users take a photo of a protest. Which side has which colour. Do
       | people take selfies of themselves when sick with a novel
       | pandemic.
       | 
       | Do people take more photos of dogs at Christmas than cats.
       | 
       | Anything that can be vectorized can be identified. But what data
       | do they get and what could they do with it?
       | 
       | It's like telemetry plus a database of images.
        
         | dan-robertson wrote:
         | I think the differential privacy and encryption are meant to
         | help with that.
        
       | kureikain wrote:
       | Just get rid of iCloud and Apple Photos. Slow as f*ck, crashes
       | all the time once the library gets big enough on my Mac. Damn
       | slow to scroll in the Photos app with lots of pictures.
       | 
       | I switched to https://immich.app/docs/features/command-line-
       | interface and imported all of my photos into it. 1.3TB and it's
       | damn fast with all the face detection and ML features.
        
         | jitl wrote:
         | I have 60k photos using 1.2TB and Photos.app feels faster than
         | Google Photos or Lightroom to me ¯\_(ツ)_/¯
        
       | renewiltord wrote:
       | This is exactly the kind of feature I have this app for.
        
       | sgammon wrote:
       | It is frankly nothing short of amazing that Apple ships things
       | like homomorphic encryption, and differential privacy, and
       | client-side vectorization, and encrypted vectors, at the scale
       | that they inhabit... and they still get a bad report card back
       | from consumers about privacy.
       | 
       | Comparing Apple and Google, or Apple and Microsoft, it seems to
       | me that Apple's track record on these issues is actually not as
       | bad as public opinion might suggest. Meta doesn't even make the
       | list for comparison, and neither does Amazon.
       | 
       | It makes me wonder if picking privacy as a strategy is workable
       | in the first place. People trust TLS; people use banking apps on
       | their phone now without thinking. I remember in 2008 or so when
       | people still didn't _quite_ trust SSL.
       | 
       | I'm not sure if Apple will be able to bridge the gap here,
       | though, if _all_ of their competition simply chooses not to ship
       | those features. Do customers know the difference? Do they...
       | care? In theory they want their data to be private, yes. But if
       | they are not willing to educate themselves to perform their
       | counterparty obligation in fulfilling "informed consent," there
       | may be no action Apple could take to avoid catching the same bad
       | rap everyone else does.
        
         | jchw wrote:
         | > It is frankly nothing short of amazing that Apple ships
         | things like homomorphic encryption, and differential privacy,
         | and client-side vectorization, and encrypted vectors, at the
         | scale that they inhabit... and they still get a bad report card
         | back from consumers about privacy.
         | 
         | Personally, I don't shy away from criticizing Google, but
         | that's not the point. Apple makes big claims about their
         | privacy practices that neither Google nor Microsoft make, it
         | would be bizarre to hold Google or Microsoft to claims and
         | standards that Apple set for themselves.
         | 
         | > Comparing Apple and Google, or Apple and Microsoft, it seems
         | to me that Apple's track record on these issues is actually not
         | as bad as public opinion might suggest. Meta doesn't even make
         | the list for comparison, and neither does Amazon.
         | 
         | > It makes me wonder if picking privacy as a strategy is
         | workable in the first place. People trust TLS; people use
         | banking apps on their phone now without thinking. I remember in
         | 2008 or so when people still didn't _quite_ trust SSL.
         | 
         | > I'm not sure if Apple will be able to bridge the gap here,
         | though, if _all_ of their competition simply chooses not to
         | ship those features. Do customers know the difference? Do
         | they... care? In theory they want their data to be private,
         | yes. But if they are not willing to educate themselves to
         | perform their counterparty obligation in fulfilling "informed
         | consent," there may be no action Apple could take to avoid
         | catching the same bad rap everyone else does.
         | 
         | I've said this elsewhere, but what I really dislike about
         | Apple's strategy regarding privacy is that they treat privacy
         | as a purely technological problem that could be solved if only
         | we had better technology, but they ignore that a huge part of
         | why users have been subjected to so many flagrant privacy
         | violations is that they have zero visibility and zero real
         | control over their computing experiences. Apple would very much
         | like to retain their iron grip on what users are allowed to do
         | on their platforms, as they make a _ton_ of money off of their
         | control of the platforms in various ways, so they have a huge
         | incentive to make sure we 're all arguing about whether or not
         | Apple is better or worse than Google or Microsoft. Because
         | sure, I do believe that if we held Google to the same privacy
         | standards that Apple currently has, it would probably kill
         | Google. However, if Apple and Google were both forced to give
         | more transparency and control to the users somehow, they'd both
         | be in a lot of trouble.
         | 
         | Despite all of this effort, I think that user trust in the
         | privacy of cloud computing and pushing data out to the internet
         | will only ever go down, because attacks against user privacy
         | and security will only ever continue to get more and more
         | sophisticated as long as there are resourceful people who have
         | a reason to perform said attacks. And there certainly always
         | will be those resourceful people, including in many cases our
         | own governments, unfortunately.
        
           | sgammon wrote:
           | "Ask App Not To Track" would like a word
        
         | Klonoar wrote:
         | Consumers seem to think well of their pro-privacy standpoints,
         | even if the devil is in the details with regards to how
         | effective it might be.
         | 
         | The people giving "poor reports" are often the hardcore tech
         | users (or LARPers) who grew up in a different era and mindset
         | and are slowly being shoved out.
        
       | BlackFly wrote:
       | This discussion got quite long without anyone mentioning the
       | novel technical implementation paper "Scalable Private Search
       | with Wally". Kudos to Apple for researching this, but shame on
       | them for enabling this by default.
       | 
       | As a somewhat homomorphic encryption scheme, each query releases
       | some bits of information on what you are searching for, to avoid
       | touching the whole database. Subsequent queries from a given user
       | will generally be correlated enough to build a tracer.
       | Governments or other powerful enough actors can pierce the proxy
       | veil; heck, the tracer will be able to deanonymize you with
       | enough queries recorded.
       | 
       | How many queries? For me it is too tedious to work out the math
       | from the differential privacy definitions and I already know the
       | landmarks around me: I don't want such a feature.
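       | 
       | For a rough sense of scale, basic composition (per-query
       | epsilons simply add) gives a back-of-envelope estimate. The
       | parameters below are invented, since the actual per-query
       | budget isn't restated here:
       | 
       |     # Under basic composition the privacy budget is additive,
       |     # and an adversary's posterior odds can shift by up to a
       |     # factor of e^epsilon in total. Illustrative numbers only.
       |     eps_per_query = 0.8     # assumed per-query budget
       |     queries_per_day = 20    # assumed usage
       | 
       |     for days in (1, 7, 30):
       |         total = eps_per_query * queries_per_day * days
       |         print(days, "days -> total epsilon ~", total)
       |     # e.g. 30 days -> total epsilon ~ 480.0; the formal
       |     # guarantee is effectively vacuous at that point.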
        
         | josh2600 wrote:
         | Hi,
         | 
         | Very curious here as I haven't seen any papers demonstrating
         | attacks against the differential privacy systems proposed by
         | Apple or Google that successfully deanonymize data. Such an
         | attack in even a test database would be very interesting.
         | 
         | Do you have any papers you can cite about this entropic leakage
         | you're describing?
        
           | BlackFly wrote:
           | The very difference between somewhat and full homomorphic
           | encryption hinges on this leakage as explained in the paper.
           | The definition of differential privacy as well. They directly
           | admit to leaking a certain amount of information by stating
           | that they apply differential privacy with those given
           | parameters. The issue I am talking about is that such
           | concerns are applied on a single query but correlations
            | across queries (the things that actually happen with metadata)
           | aren't considered in the delta-epsilon differential privacy
           | model, by definition.
           | 
           | So if you are curious, just read up in general about
           | differential privacy, or this classic:
           | https://gwern.net/death-note-anonymity Discussed here
           | (possibly in other places too):
           | https://news.ycombinator.com/item?id=9553494
        
       | t0bia_s wrote:
       | I'm genuinely surprised that there are many software developers
       | that defend Apple products as great tools for privacy even
       | though we are not able to inspect their closed code.
       | 
       | Am I missing something or is it just hypocrisy?
        
         | Argonaut998 wrote:
         | It's just hypocrisy. I don't know how people can defend Apple
         | as a privacy company when they phone home more than their
         | Google counterparts.
         | 
         | People will say that at least Apple don't have a vested
         | interest in their data since they are not an advertising
         | company like Google or Meta. But Apple collects an absurd
         | amount of data, it is phoning home every second. Some Apple
         | traffic is even hard-coded to bypass VPNs.
         | 
         | Apple has already experimented with ads in Maps. Shareholders
         | demand infinite growth, so it is only a matter of time before
         | Apple dives into ads themselves.
         | 
         | The only valid point is that Apple does protect your privacy -
         | only towards other companies.
         | 
         | I switched to Apple around 3 years ago because I don't like
         | having my data coupled to Google. But I am regretting my
         | decision; I will probably switch back to Android + Linux
         | instead of iOS + Mac at this rate.
        
           | t0bia_s wrote:
           | Degoogled Android (like LineageOS) is a real thing that
           | many use on a daily basis.
        
       | ghjfrdghibt wrote:
       | Every time one of these articles pops up about this sort of
       | behaviour by one of the big tech companies, I'm always struck by
       | the differences in the comments you get. I wonder if Google
       | would be getting the benefit of the doubt like Apple currently
       | is, were Google to implement the exact same thing as an opt-out
       | feature.
        
         | criddell wrote:
         | It doesn't seem odd to me. I'm more suspicious of Google's
         | motives and am even more suspicious of companies like TikTok,
         | Amazon, Xiaomi, and Facebook. There's a privacy spectrum.
        
         | dan-robertson wrote:
         | I think sometimes the reactions are different because the facts
         | are different. Your counterfactual seems less likely to me than
         | eg Google photos analysing and indexing each image.
        
         | Hilift wrote:
         | A short while back, Google implemented what I call "Constantly
         | Opt-Out". Google Photos frequently prompting you to "backup
         | those photos to ...".
        
       | Argonaut998 wrote:
       | I think I just noticed a similar thing for search that I'm pretty
       | sure was not there before iOS 18.
       | 
       | Going into Settings -> Search there's an option now for "Help
       | Apple Improve Search", enabled by default.
       | 
       | >Help improve Search by allowing Apple to store the searches you
       | enter into Safari(!!), Siri and Spotlight in a way that is not
       | linked to you. Searches include lookups of general knowledge, and
       | requests to do things like play music and get directions.
       | 
       | If it was there before then it was switched on again.
        
         | fn-mote wrote:
         | > allowing Apple to store the searches you enter into Safari
         | [...] in a way that is not linked to you
         | 
         | From deanonymization work even a decade ago it was clear that
         | your search history will completely unmask you.
         | 
         | I would need lots of details before I believed that their
         | method of storing the data reliably protected my privacy. (But
         | of course that is not what the quote claims.)
        
         | fudged71 wrote:
         | You're right. This is another setting turned on by default.
        
         | replwoacause wrote:
         | Oh damn you're right. This was turned on by default for me, I
         | don't think I would have opted into this.
        
         | ThouYS wrote:
         | thanks man
        
       | talkingtab wrote:
       | Apple can do this because there are absolutely no _significant_
       | consequences. In the classical legal world your only recourse to
       | this invasion of your privacy, along with false advertising, is
       | a "class action suit". It is clear from past class action suits
       | (remember the battery thing) that this does not prevent Apple
       | from doing the same thing again.
       | 
       | The problem is the granularity. How do millions of people
       | recover damages when a corporation knowingly (it knows or should
       | know) acts to enrich itself in a significant manner, in small
       | per-person amounts?
       | Let us suppose that an abstraction of the damage from this
       | offense can be quantified at $1 per customer. A significant
       | question is whether this action has any _possible_ benefit to
       | Apple. If it does not then once notified of this action, Apple
       | would immediately (as in the next update of iOS or Sequoia)
       | remedy this feature.
       | 
       | So step #1 is someone to send an official letter, perhaps with
       | just a link to this article. Or perhaps someone from Apple is
       | reading these comments and can inform us whether they are aware.
       | 
       | Next state is that Apple is aware of this problem (knows or
       | should know) and ignores it. So are you helpless? Or could you
       | file a claim in small claims court seeking to recover $1 for each
       | photo that Apple has uploaded, requesting that Apple delete all
       | photos as well as all data derived from the photos, plus filing
       | fee.
       | 
       | Next state. You comment on this post as to how to improve the
       | process and make it faster and easier. [And if you oppose this
       | idea, could you please explain your position in a way that helps
       | others understand?]
        
       | cacao-sip wrote:
       | Apple is doing a best-in-class job on privacy, and you guys
       | would rather go after it than Google, Meta, and others who have
       | significantly more massive issues?
        
       | ngcc_hk wrote:
       | Tbh, whilst this should be improved, it is more like those who
       | live and work in free democratic countries with market access
       | complaining, by and large, that their country's capitalism is
       | imperfect, undemocratic, and not social enough. Maybe it is.
       | 
       | But by and large I do not think so.
       | 
       | Keep on fighting and keep on pushing. At least Apple has a
       | slogan you can hold it to.
        
       | neycoda wrote:
       | At least Google told me on my phone in a pop-up that it would be
       | doing something like this.
        
       | andybak wrote:
       | Users of my (free, open-source) app seem surprised to learn that
       | we've got zero insight into usage patterns. There are situations
       | where a small amount of anonymous telemetry would be extremely
       | helpful but I'm not going to touch it with a barge-pole.
       | 
       | Opt-in makes the data useless - not just in terms of the huge
       | drop in quantity but because of the fact it introduces a huge
       | bias in the data selected - the people that would opt-in are
       | probably not a good sample of "typical users".
       | 
       | Opt-out - no matter what safeguards or assurances I could
       | provide - is unacceptable to a subset of users, and they will
       | forcefully communicate this to you.
       | 
       | Don't get me wrong - I understand both the ease at which bad
       | actors abuse telemetry and the ease in which "anonymous data" can
       | prove to be nothing of the kind in a multitude of surprising
       | ways.
       | 
       | But it's hard not to feel a little sad in a "this is why we can't
       | have nice things" kind of way.
        
         | spacemanspiff01 wrote:
         | Would there be a way to do the stats gathering on device, then
         | once every few months send a popup with statistics?
         | 
         | Not sure what bias it adds
         | 
         | Like
         | 
         | "hey, we make this app, and we care about privacy, here is the
         | information we have gathered over your usage for the past
         | month, can we send this to ourselves, so that we can use it to
         | improve the app?"
         | 
         | And then show human readable form of what data was collected.
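         | 
         | A minimal sketch of that flow, assuming a hypothetical app
         | and invented metrics:
         | 
         |     # Aggregate counters locally, show the user exactly what
         |     # would be sent, and transmit only on an explicit yes.
         |     import json
         | 
         |     usage_stats = {"sessions": 118, "exports": 42,
         |                    "crashes_seen": 1}
         | 
         |     def ask_to_share(stats):
         |         print("Everything collected this month:")
         |         print(json.dumps(stats, indent=2))
         |         reply = input("Send to the developers? [yes/NO] ")
         |         return reply.strip().lower() == "yes"
         | 
         |     if ask_to_share(usage_stats):
         |         print("sending...")  # e.g. POST to the project server
         |     else:
         |         print("discarded locally; nothing sent")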
        
           | sofixa wrote:
           | You'd still have extremely biased data - people who blindly
           | click OK on every pop-up are not representative of your
           | typical user; people who get nightmares after hearing the
           | word "telemetry" and will gather the pitchforks at any hint
           | of it will always refuse, but depending on your app, they
           | might be your typical user (e.g. for a self-hosted picture
           | sync and catalogue app, the target audience is people who
           | don't trust Apple/Google/Amazon/Dropbox to store their
           | images privately).
        
             | afcool83 wrote:
             | I do find myself on the "private first" side...but also
             | keep in mind that those who grab for pitchforks in defense
             | of privacy aren't a representative sample of the typical
             | user either. (A purely statistical statement).
             | 
             | It's very easy to confuse 'loud protest from a small
             | minority' and the majority opinion. If a plurality of users
             | chose to participate in an analytics program when asked and
             | don't care to protest phone-home activities when they're
             | discovered, then that's where the majority opinion likely
             | lies.
        
             | magnat wrote:
             | > people who blindly click OK on every pop up are not
             | representative of your typical user
             | 
             | You could unbias the data by including the metric
              | determining how long it took them to click "Ok" and
             | whether they actually reviewed the data before agreeing.
        
           | diggan wrote:
           | Just as a reference of existing implementations of this: This
           | is essentially how Valve/Steam collects hardware details from
           | users/clients. Every now and then, a popup appears asking the
           | user if they'd like to participate in the "Hardware Survey",
           | together with all the data that would be submitted if they
           | accept.
           | 
           | Seems to me like a great implementation.
        
           | pbhjpbhj wrote:
           | The podcast app I use, AntennaPod (far better for me than
           | other apps, available on F-Droid, no affiliation!) just gave
           | me a local-only year in review. I thought it was a great
           | touch, and would be happy to have then shared the data from
           | that with the app's makers.
        
           | galleywest200 wrote:
           | This sort of sounds like the Steam Hardware Survey. They do
           | not collect the data willy-nilly, they ask you every few
           | months if you want to participate in a one-time check.
           | 
           | I have an incentive to see if the Linux desktop share has
           | increased, so I usually run the survey for them to get my
            | data point in. I also suppose the "gamer" crowd likes to
           | show off how powerful their "rig" is, so I would imagine they
           | commonly also run the survey for that reason as well.
        
         | GlumWoodpecker wrote:
         | This can only ever be opt-in if you want to stay on the legal
         | side of the GDPR (and equivalents in other jurisdictions). You
         | can ask, but the default needs to be "no" if no answer is
         | given.
         | 
         | I provide telemetry data to KDE, because they default to
         | collecting none, and KDE is an open-source and transparent
         | project that I'd like to help if I can. If I used your app, I
         | would be likely to click yes, since it's open-source. Part of
         | the problem I have with projects collecting user data is the
         | dark patterns used or the illegal opt-out mechanism, which will
         | make me decline sending telemetry every time, or even make me
         | ditch it for an alternative. An app that asks:
         | 
         |     Can we collect some anonymized data in order to improve
         |     the app?
         | 
         |     [Yes] [No]
         | 
         | ...with equal weight given to both options, is much more likely
         | to have me click Yes than one where a button is big and blue
         | whilst the other choice is in a smaller font and "tucked away"
         | underneath it (or worse, in a corner or hidden behind a
         | sub-menu).
         | 
         | Plus, I would think that SOME data would be better than NO
         | data, even if there's an inherent bias leaning towards privacy-
         | minded/power users.
        
           | Aaargh20318 wrote:
           | > This can only ever be opt-in if you want to stay on the
           | legal side of the GDPR
           | 
           | The GDPR only applies to personal data. You can collect
           | things like performance data without opt-in (or even an opt-
           | out option) as long as you are careful to not collect any
           | data that can be used to identify an individual, so no unique
           | device IDs or anything like that. Of course, you should be
           | transparent about what you collect. You also have to be
           | careful about combinations of data points that may be
           | innocuous on their own but can be used to identify a person
           | when combined with other data points.
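           | 
           | A sketch of that "careful what you collect" point (field
           | names invented; whether a given payload is truly non-
           | personal under the GDPR is a legal judgment, not a code
           | property):
           | 
           |     # Strip identifiers and coarsen values before sending,
           |     # so rare combinations can't single out one person.
           |     raw_event = {
           |         "device_id": "A1B2-C3D4",  # identifying: drop it
           |         "app_version": "2.3.1",
           |         "render_ms": 143,
           |         "timestamp": "2024-12-28T19:22:31Z",
           |     }
           | 
           |     def scrub(e):
           |         return {
           |             "app_version": e["app_version"],
           |             # bucket the measurement to blunt fingerprinting
           |             "render_ms_bucket":
           |                 (e["render_ms"] // 50) * 50,
           |             # coarsen the timestamp to a day
           |             "day": e["timestamp"][:10],
           |         }
           | 
           |     print(scrub(raw_event))
           |     # {'app_version': '2.3.1', 'render_ms_bucket': 100,
           |     #  'day': '2024-12-28'}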
        
         | yndoendo wrote:
         | Consent is the key issue binding all of this. There is a
         | complete lack of consent when there is no opt-out, and a great
         | degradation of it when the default is opt-out. Trust is the
         | only means to consent.
         | 
         | 1) Opt-in, opt-survey, opt-out is the only ternary that builds
         | trust. A survey is an active validator of trust and assists in
         | low-bandwidth communication. The question should be presented
         | to the end user the first time they use the application, or
         | the next time it starts after the feature was added.
         | 
         | 2) Provide the exact analytical information you want to the
         | end user so they can parse it too. The means to self-evaluate
         | the information being shared, by providing the reports or
         | views, improves trust.
         | 
         | 3) A known privilege tied to trust leads to more consent.
         | Priority support for features and bugs could be aligned with
         | those that opt in. Analytical history / performance may assist
         | in solving the recent bug that was reported.
         | 
         | Apple, Microsoft, Google, and the rest keep their analytical
         | sharing ambiguous, with no details on how they use it and can
         | abuse it. Most don't even provide an opt-out. I don't trust
         | these organizations, but I must engage with them through my
         | life. I don't have to use Facebook or Twitter, and don't. I
         | accept the Steam survey.
         | 
         | An RFC with an agreed-upon analytics standard could be a step
         | toward solving the lack of analytical information the open
         | source community would benefit from: both parties consenting
         | to agreed-upon communication.
         | 
         | *My point of view: metadata is still personal data. Without
         | the user, the data and the metadata would not exist. Since the
         | end user is the source of the metadata's entropy, they own
         | both the metadata and the data.
        
           | andybak wrote:
           | Yes - I understand but in many (or even most) cases, opt-in
           | makes the data worthless. There's literally no point
           | collecting it.
        
         | hamilyon2 wrote:
         | Data collection and telemetry is sadly a lemon-market type of
         | situation. The most trustworthy developers are precisely the
         | ones who don't collect data from users.
        
         | mgraczyk wrote:
         | Yes, this is one of the main reasons people mostly build on
         | web. It's very difficult to make desktop software better, and
         | especially Linux users are hostile to patterns that would make
         | improvements possible
        
         | withinboredom wrote:
         | I can't remember where I saw this before. However, there was a
         | site that collected analytics data client side in a circular
         | buffer (or something), and there was a menu in the settings to
         | send it back one-time or always, or download it yourself. If
         | you experienced an error, they would pop up in a toast to share
         | the analytics data with them so they could help fix the
         | problem. You could, of course, decline.
         | 
         | That was probably the best system I'd seen, but I can't
         | remember what site it was.
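         | 
         | A minimal sketch of that pattern, assuming a hypothetical
         | send() helper:
         | 
         |     # Keep analytics in a bounded local buffer; only offer
         |     # to share it when an error actually occurs.
         |     from collections import deque
         | 
         |     events = deque(maxlen=500)  # circular: old entries drop
         | 
         |     def log(event):
         |         events.append(event)
         | 
         |     def send(payload):          # hypothetical upload helper
         |         print(f"uploading {len(payload)} events...")
         | 
         |     def on_error(err):
         |         print(f"Error: {err}")
         |         ans = input("Share recent diagnostics? [y/N] ")
         |         if ans.strip().lower() == "y":
         |             send(list(events))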
        
         | starbugs wrote:
         | > Opt-in makes the data useless - not just in terms of the huge
         | drop in quantity but because of the fact it introduces a huge
         | bias in the data selected - the people that would opt-in are
         | probably not a good sample of "typical users".
         | 
         | Why? I don't think that's obvious. It may also be related to
         | the way the opt-in is presented. In general, I would expect
         | this to be a workable solution. Even if the opt-in group
         | deviates from the "typical user", it's the best data you can
         | get in an honest and ethically sound way. This should certainly
         | be better than no data at all?
         | 
         | For any website/app that presents an opt-in cookie consent
         | banner this is implicitly already the case.
        
         | npteljes wrote:
         | >Opt-in makes the data useless
         | 
         | Hardly. It just has some issues with regards to what you also
         | pointed out, bias for one. But it still provides valuable
         | insight into usage patterns, systemic issues, and enables
         | tracking effects of developments over time. Correcting the bias
         | is not a bigger task than it is now - I'm sure you already have
         | an idea about feedback to different features according to
         | reviews, user reports, discussions, and so on. Opt-in is the
         | same, just much better.
        
         | eleveriven wrote:
         | Maybe the solution lies in radical transparency: explaining
         | exactly how and why telemetry would help, then letting users
         | decide. But even that requires trust...
        
       | chickenfeed wrote:
       | I have no idea what apps are sharing with what. On Android
       | network access is so ambiguous. There's such fuzzy wording. Like
       | when you are asked for location permission to use bluetooth. Even
       | perms like file system access. I don't know what this extends to.
       | Have no idea what it is doing. I recently set up a new ipad. I
       | failed to even work out where Photos ended up. Couldn't work out
       | what was backed up, and what wasn't. How to opt out of everything
       | and opt in piecemeal. Whether the OS/gadget had AI enhancements,
       | what they were or are, whether the apps would have cloud access
       | or not. In fact for an apple device it bugged me with dialogs
       | from the get go. And bluetooth kept turning itself back on. I
       | would say I am technically savvy, but I was pretty clueless. I
       | was quite keen to try out some of the AI photo tools, like
       | finding pictures of such and such, but I didn't get that far, as
       | confusion abounded.
        
       | xenophonf wrote:
       | Lovely.
       | 
       | And this blog post is how I find out the Photos app mysteriously
       | turned iCloud Photos back on. What the fuck? At least "Keep
       | Originals" was still set.
        
       | atmosx wrote:
       | I have replaced "Apple Photos" with photoprism. I am happy so
       | far.
       | 
       | Related blog post:
       | https://www.convalesco.org/notes/2024/09/29/setup-photoprism...
        
       | daneel_w wrote:
       | It's been doing this since long before iOS 18. I first observed
       | this behavior already on my 1st gen. iPhone SE with iOS 11 or
       | maybe iOS 12. Every now and then, especially after having taken
       | new pictures with the camera, I'd see the network activity
       | spinner get going whenever starting Photos - and I've _never_
       | enabled iCloud backups, but rather explicitly always disabled it
       | everywhere I find a setting to do so. With Wi-Fi it'd be a short
       | one second affair, and on cellular it'd take a few seconds more.
       | I also observed the duration of the activity to vaguely extend
       | with the number of new pictures taken between launches of Photos.
       | Curiously, when disabling cellular data permission for Photos and
       | having no Wi-Fi connected, the network activity spinner would
       | still pop up and just get stuck indefinitely without ever
       | finishing whatever it's reporting.
       | 
       | My gut feeling has always been that it's returning digested
       | machine learning feedback for Photos' facial/object recognition.
       | 
       | (add.: since moving from the iPhone 8 and iOS 16 to a modern
       | iPhone and iOS 17/18 this behavior has gone away - at least the
       | network activity spinner doesn't gossip anymore...)
        
       | anonymous344 wrote:
       | apple: we respect your...
       | 
       | LOL No you don't
        
       | karaterobot wrote:
       | Apple also does this thing where, after a major version update
       | (or, seemingly, random incremental updates) it'll just go ahead
       | and reset your app settings for you. So, make sure to check every
       | single privacy option once a month or so, or just give in and let
       | Apple do whatever it wants.
        
       | dishsoap wrote:
       | You can't even turn on and use a new Apple product without
       | phoning home and asking Apple for permission. I'm not sure
       | why this is newsworthy. Apple products are known to phone
       | home more than just about anything else. It would be more
       | surprising if Apple Photos _didn't_ phone home.
        
       | 0_____0 wrote:
       | It sounds like Apple Photos "phones home" by fetching some
       | indices? It doesn't sound like photo data leaves the device,
       | but I think Apple's problem is that they should have been
       | clearer about how this feature works and what data is and
       | isn't transmitted.
       | 
       | If it works the way I think it works, I actually don't see
       | an issue here. I don't think anyone here expects their phone
       | not to "phone home" to check for OS updates by default.
        
       | jamieplex wrote:
       | Also, by the time you find that button to turn it off, Apple
       | already has the info it wanted...
        
       | akho wrote:
       | Most Apple Photos users keep their actual photos on Apple's
       | computers anyway, so there are few people who even need to
       | react to this.
       | 
       | Sadly, I'm one of them.
        
       | thih9 wrote:
       | Related:
       | 
       |     Settings -> Siri -> Apps (Siri app access) ->
       |     [an app] -> Learn from this app
       | 
       | This is on by default for all apps and it "Allows Siri to
       | learn from how you use [an app] to make suggestions across
       | apps". I found no global setting for all apps.
       | 
       | More details:
       | https://support.apple.com/en-mn/guide/iphone/iph6f94af287/io...
        
       | benterix wrote:
       | Anyone using a hardware firewall with Apple devices,
       | explicitly blacklisting Apple's IP address range? If so, any
       | interesting issues to watch for?
        
       | mattfrommars wrote:
       | I kept telling people here and on Reddit that Apple is using
       | the whole idea of "Privacy First" on Apple devices as a
       | marketing gimmick. The company has begun introducing
       | bloatware into its products, rivaling Windows bloatware.
       | 
       | Yet people's perception of Apple bloatware here seems to be,
       | "that's fine".
        
       | mass_and_energy wrote:
       | > Allow this device to privately match places in your photos with
       | a global index maintained by Apple so you can search by almost
       | any landmark or point of interest.
       | 
       | Couldn't they just build a k/v database with the coordinates
       | of known landmarks, then pass the lat/long from the
       | picture's EXIF in an API call to look it up when the user
       | wants to use this feature? Isn't this part of why we have
       | geotagging?
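       | 
       | Roughly this sort of thing (a toy sketch in Python with
       | made-up data and hypothetical names - though it only works
       | for photos that actually carry a geotag):
       | 
       |     import math
       |     
       |     # Toy table; a real index would hold millions of entries.
       |     LANDMARKS = {
       |         "Eiffel Tower": (48.8584, 2.2945),
       |         "Golden Gate Bridge": (37.8199, -122.4783),
       |         "Sydney Opera House": (-33.8568, 151.2153),
       |     }
       |     
       |     def haversine_km(lat1, lon1, lat2, lon2):
       |         # Great-circle distance between two points, in km.
       |         p1, p2 = math.radians(lat1), math.radians(lat2)
       |         dp = math.radians(lat2 - lat1)
       |         dl = math.radians(lon2 - lon1)
       |         a = (math.sin(dp / 2) ** 2
       |              + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
       |         return 2 * 6371.0 * math.asin(math.sqrt(a))
       |     
       |     def lookup_landmark(lat, lon, max_km=1.0):
       |         # Nearest known landmark within max_km, else None.
       |         name, (blat, blon) = min(
       |             LANDMARKS.items(),
       |             key=lambda kv: haversine_km(lat, lon, *kv[1]))
       |         d = haversine_km(lat, lon, blat, blon)
       |         return name if d <= max_km else None
       |     
       |     # EXIF lat/long from a photo taken near the Eiffel Tower:
       |     print(lookup_landmark(48.8590, 2.2940))  # Eiffel Tower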
        
       | oblio wrote:
       | Womp womp...
       | 
       | Who would have thought the amoral money making machine would want
       | to make more money? :-)
        
       | low_tech_punk wrote:
       | Going off on a tangent, I wonder if the market reveals
       | survivorship bias: companies and products that did respect
       | privacy in practice (e.g. by asking for explicit permission)
       | were not able to harness enough user data to compete with
       | the bad players, and as a result, any company would
       | eventually either end up like Facebook or go out of
       | business.
       | 
       | Sadly, privacy is not a marketable feature, or at least it
       | does not have the ROI that Apple originally believed. I feel
       | the only way to level the playing field is to reconsider our
       | regulatory framework and treat privacy as a fundamental
       | benefit for consumers.
        
       | k3nx wrote:
       | Maybe this setting exists so you can turn this off?
       | 
       | Years ago I went to a concert with some friends. We took pictures
       | standing in front of the stage while the staff was setting up.
       | There were some additional pictures of the show. Nothing in any
       | of the pictures had any information regarding the band names.
       | This was taken with an iPhone 6s, so no "intelligence". In the
       | past year or two I had a new "memories" album. The album title
       | was the artist name that we went to see. At first I thought this
       | was cool, then I was concerned, like, wait, how did they get this
       | information?
       | 
       | My guess, the "intelligence" used the GPS info to pull up the
       | venue and with the date figured out who the artist was. The album
       | was the headliner, the group photo was of one of the opening
       | bands.
       | 
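       | Something like this toy join would be enough (Python,
       | made-up data and hypothetical names - pure speculation about
       | what Apple actually does):
       | 
       |     from datetime import date
       |     
       |     # Places index: rounded (lat, lon) -> venue name.
       |     VENUES = {(37.78, -122.39): "Example Arena"}
       |     # Events source: (venue, date) -> headliner.
       |     EVENTS = {("Example Arena", date(2016, 8, 17)): "Example Band"}
       |     
       |     def guess_artist(lat, lon, taken_on):
       |         # EXIF coords -> venue, then venue + date -> artist.
       |         venue = VENUES.get((round(lat, 2), round(lon, 2)))
       |         return EVENTS.get((venue, taken_on)) if venue else None
       |     
       |     print(guess_artist(37.7817, -122.3893, date(2016, 8, 17)))
       |     # -> "Example Band"
       | 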
       | I'm still on iOS 17, and I don't see any other way to turn this
       | feature off. I have an option to reset suggested memories, but
       | that's it.
       | 
       | It sounds more like users are given a choice now, which
       | would be a good thing. If it was enabled pre-iOS 18, it kind
       | of makes sense that it's enabled by default now.
        
       | nisten wrote:
       | Whatever embeddings and search history you produce on your
       | phone that are "generated locally" go straight back to Apple
       | by default.
       | 
       | They make it impractical for anyone NOT to use iCloud.
       | 
       | It's all reality-distortion-field marketing BS.
        
       | eleveriven wrote:
       | A silent default opt-in erodes trust!
        
       ___________________________________________________________________
       (page generated 2024-12-29 23:00 UTC)