[HN Gopher] Apple Photos phones home on iOS 18 and macOS 15
___________________________________________________________________
Apple Photos phones home on iOS 18 and macOS 15
Author : latexr
Score : 121 points
Date : 2024-12-28 19:22 UTC (3 hours ago)
(HTM) web link (lapcatsoftware.com)
(TXT) w3m dump (lapcatsoftware.com)
| sillywalk wrote:
| I don't care if the device checks it using homomorphic
| encryption and differential privacy. This is turned on by
| default. This angers me.
| guzik wrote:
| Clearly, in Cupertino, 'enhancing user experiences' without
| consent is the top priority.
| bigwordsehboy wrote:
| "homomorphic encryption and differential privacy"
|
| It is new. It is fancy. It is not clear where HE/DP is being
| used; it depends on whether the code is written using the Swift
| toolkit, and even that has paths for exfiltration if used
| incorrectly. They claim they are using DP in Photos, as stated
| in the article here:
|
| https://machinelearning.apple.com/research/scenes-differenti...
|
| But the fact remains they are looking at your pictures. I do
| not trust them for one fleeting fart in the wind on this. Think
| about it for a hot second: HE/DP allows you to perform
| operations on the data without knowing the data, but what if
| someone goofs an operation and it ends up returning the actual
| data?
|
| Sorry, not buying it. Crypto is hard to get right, and when it
| is monetized like this for "new features", it is wildly
| unnecessary and exposes users to more risk.
| gigel82 wrote:
| > From my own perspective, computing privacy is simple: if
| something happens entirely on my computer, then it's private,
| whereas if my computer sends data to the manufacturer of the
| computer, then it's not private, or at least not entirely
| private. Thus, the only way to guarantee computing privacy is to
| not send data off the device.
|
| +1
| zaroth wrote:
| I don't even use iCloud Photos and this was on by default. Very
| bad move by Apple to ship my photos off my device without my
| permission, in any shape or form; I don't care.
| threeseed wrote:
| If you don't use iCloud Photos your photos are not shipped off
| your device.
| hu3 wrote:
| So it sends your photos to be indexed on Apple servers. Turned on
| by default.
|
| This is probably done to compete with Google Photos, which has a
| great photo search-by-word feature.
|
| With that said, Apple can use whatever privacy measures it wants
| to protect user data. But at the end of the day, a subpoena can
| easily force them to hand over data.
|
| The best privacy measure is to just not have the data. I guess
| indexing photos entirely on the phone is not very feasible yet.
| thought_alarm wrote:
| It does not send your photos to be indexed on Apple servers.
| walterbell wrote:
| Was this not announced during the iOS 18 'Glowtime' event? Gosh.
| Thanks for the neon! https://www.cnet.com/tech/mobile/everything-
| announced-at-app...
|
| https://www.internetsociety.org/resources/doc/2023/client-si...
| The Internet Society makes the following recommendations based on
| the European Commission's proposal:
|
| - That the European Committee introduce safeguards for end-to-end
| encryption.
|
| - That the European Committee prohibit the use of scanning
| technologies for general monitoring, including client-side
| scanning.
| bogantech wrote:
| Should I have to watch marketing events to make sure I'm not
| being spied on?
| walterbell wrote:
| Maybe just the ones with bright neon "Glow-foo" canary
| branding..
| scosman wrote:
| "I don't understand most of the technical details of Apple's blog
| post"
|
| I do:
|
| - Client side vectorization: the photo is processed locally,
| preparing a non-reversible vector representation before sending
| (think semantic hash).
|
| - Differential privacy: a decent amount of noise is added to the
| vector before sending it, enough to make it impossible to
| reverse-lookup the vector. The noise level here is e = 0.8, which
| is quite good privacy (a toy sketch of this step follows the list
| below).
|
| - OHTTP relay: it's sent through a 3rd party so Apple never knows
| your IP address. The contents are encrypted so the 3rd party
| never learns anything either (some risk of exposing "IP X is an
| Apple Photos user", but nothing about the content of the
| library).
|
| - Homomorphic encryption: The lookup work is performed on server
| with encrypted data. Apple can't decrypt the vector contents, or
| response contents. Only the client can decrypt the result of the
| lookup.
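|
| A minimal sketch of the local DP step in Python, illustrative
| only and not Apple's mechanism: the Laplace noise, the
| sensitivity of 1.0 and the 128-dimensional vector are
| assumptions, with e = 0.8 taken from the figure above.
|
|     import numpy as np
|
|     def add_local_dp_noise(vec, epsilon=0.8, sensitivity=1.0):
|         # Laplace mechanism: per-coordinate noise with
|         # scale = sensitivity / epsilon.
|         scale = sensitivity / epsilon
|         return vec + np.random.laplace(0.0, scale, size=vec.shape)
|
|     # Stand-in for an on-device photo embedding; a real pipeline
|     # would produce this with a local vision model.
|     embedding = np.random.rand(128)
|     noised = add_local_dp_noise(embedding)
|     print(float(np.max(np.abs(noised - embedding))))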
|
| This is what a good privacy story looks like: multiple layers of
| privacy protection, when any one of the latter three should be
| enough on its own to protect privacy.
|
| "It ought to be up to the individual user to decide their own
| tolerance for the risk of privacy violations." -> The author
| themselves looks to be an Apple security researcher, and are
| saying they can't make an informed choice here.
|
| I'm not sure what the right call is here. But the conclusion
| "Thus, the only way to guarantee computing privacy is to not send
| data off the device." isn't true. There are other tools to
| provide privacy (DP, homomorphic encryption), while also using
| services. They are immensely complicated, and users can't
| realistically evaluate risk. But if you want features that
| require larger-than-disk datasets, or frequently changing
| content, you need tools like this.
| Gabriel54 wrote:
| I appreciate the explanation. However, I think you do not
| address the main problem, which is that my data is being sent
| off my device by default and without any (reasonable) notice.
| Many users may agree to such a feature (as you say, it may be
| very secure), but to assume that everyone ought to be opted in
| by default is the issue.
| JustExAWS wrote:
| Do you use iCloud to store your photos?
| latexr wrote:
| I'm not the person you asked, but I agree with them. To
| answer your question: No, I do not use iCloud to store my
| photos. Even if I did, consent to store data is not the
| same as consent to scan or run checks on it. For a company
| whose messaging is all about user consent and privacy, that
| matters.
|
| This would be easily solvable: On first run show a window
| with:
|
| > Hey, we have this new cool feature that does X and is
| totally private because of Y [link to Learn More]
|
| > Do you want to turn it on? You can change your mind later
| in Settings
|
| > [Yes] [No]
| JustExAWS wrote:
| When iCloud syncs between devices how do you think that
| happens without storing some type of metadata?
|
| You don't use iCloud for anything? When you change phones
| do you start fresh or use your computer for backups? Do you
| sync bookmarks? Browsing history?
|
| Do you use iMessage?
| Gabriel54 wrote:
| In response to your question in the parent comment, no, I
| do not use iCloud. And I do not sync any of the things
| you mentioned here. If someone already consented to using
| iCloud to store their photos then I would not consider
| the service mentioned in this post to be such a big issue,
| because Apple would already have the data on their
| servers with the user's consent.
|
| edit: I will just add, even if we accept the argument
| that it's extremely secure and impossible to leak
| information, then where do we draw the line between
| "extremely secure" and "somewhat secure" and "not secure
| at all"? Should we trust Apple to make this decision for
| us?
| JustExAWS wrote:
| Do you start fresh with an iOS installation after each
| upgrade or do you back up your iPhone using your computer
| and iTunes?
| Gabriel54 wrote:
| I do not have anything backed up on any cloud servers on
| any provider. If I had to buy a new phone I would start
| from a fresh installation and move all of my data
| locally. It's not that I'm a "luddite", I just couldn't
| keep track of all of the different ways each cloud
| provider was managing my data, so I disabled all of them.
| JustExAWS wrote:
| If only Apple had a centralized backup service that could
| store everything automatically at a click of a button so
| you wouldn't have to juggle multiple cloud providers...
| NikolaNovak wrote:
| I kinda was somewhat with you until this point.
|
| Apple IS just another cloud provider / centralized backup
| service. It's not fundamentally different from the others,
| and if you're not in the select group of (whatever the
| respectful term is for) those who stay strictly inside the
| Apple ecosystem, you will have multiple clouds and
| multiple data sets and multiple backups that all interact
| with each other and your heterogeneous devices in
| unpredictable ways. iCloud will not help you with that
| any more than Google Cloud or Samsung Cloud etc. They all
| want to own all of your stuff; none of them is simply a
| hyper-helpful neutral director.
| oarsinsync wrote:
| Not all apps support Apple's backup solution. Threema and
| Signal come to mind.
| walterbell wrote:
| Some iOS apps synchronize data with standard protocols
| (e.g. IMAP, WebDAV, CalDAV) to cloud or self-hosted
| services.
| JustExAWS wrote:
| And that doesn't help with internally stored data within
| apps, settings, which apps you have installed on what
| screen, passwords, etc
| walterbell wrote:
| iOS supports local device backups.
| JustExAWS wrote:
| I asked that repeatedly. Do they use iTunes for backups?
| walterbell wrote:
| Apple iTunes, iMazing (3rd party), Linux imobiledevice
| (OSS).
| latexr wrote:
| None of that is relevant to my point. You seem to be
| trying to catch people in some kind of gotcha instead of
| engaging honestly with the problem at hand. But alright,
| I'll bite.
|
| Yes, I always start with clean installs, both on iOS and
| on macOS. Sometimes I even restart fresh on the same
| device, as I make sure my hardware lasts. I don't sync
| bookmarks, I keep them in Pinboard and none of them has
| any private or remotely identifiable information anyway.
| I don't care about saving browser history either, in fact
| I have it set to periodically auto-clear, which is a
| feature in Safari.
| JustExAWS wrote:
| No, I am trying to say that with a connected device using
| online services, the service provider is going to have
| access to the data you use to interact with them.
|
| To a first approximation, everyone in 2024 expects their
| data and settings to be transferred across devices.
|
| People aren't working as if it is 2010, when you had to
| back up and restore devices via iTunes. If I'm out of town
| somewhere and my phone gets lost, damaged or stolen, I
| can buy another iPhone, log into my account and
| everything gets restored as it was.
|
| Just as I expect my watch progress to work when I use
| Netflix between my phone, iPad, Roku devices etc.
| latexr wrote:
| And that should rightfully be your _informed choice_.
| Just like everyone else should have the right to know
| what data their devices are sending _before it happens_
| and be given the _informed choice_ to refuse. People
| shouldn't have to learn that from a random blog post
| shared on a random website.
| JustExAWS wrote:
| In what world is Netflix for instance not going to know
| your watch history?
|
| How many people are going to say in 2024 that they don't
| want continuous cloud backup? You want Windows Vista-style
| pop-ups and permissions?
| latexr wrote:
| How many times are you going to shift the goalposts? This
| is getting tiresome, so I'll make it my last reply.
|
| I don't have Netflix but neither is that relevant to the
| point, you're obviously and embarrassingly grasping at
| straws.
|
| No one is arguing against continuous cloud backups,
| they're arguing about _sending data without consent_.
| Which, by the way, is something Apple used to understand
| not to do.
|
| https://www.youtube.com/watch?v=39iKLwlUqBo
|
| Apple's OSes are already filled with Windows Vista-style
| popups and permissions for inconsequential crap; people
| have been making fun of them for that for years.
| scarface_74 wrote:
| If you are doing continuous cloud backups and using Apple
| services - you are already giving Apple your data and
| your solution is to add even more permissions? You are
| not going to both use any Apple service that requires an
| online component and keep Apple from having your data.
|
| Isn't it bad enough that I have a popup every time I copy
| and paste between apps?
| stackghost wrote:
| I hate this type of lukewarm take.
|
| "Ah, I see you care about privacy, but you own a phone! How
| hypocritical of you!"
| latexr wrote:
| You're describing Matt Bors' Mister Gotcha.
|
| https://thenib.com/mister-gotcha/
| JustExAWS wrote:
| If you care about your "privacy" and no external service
| providers having access to your data - that means you
| can't use iCloud - at all, any messages service, any back
| up service, use Plex and your own hosted media, not use a
| search engine, etc.
| stackghost wrote:
| Do you use a phone?
| JustExAWS wrote:
| Yes. I also don't use Plex, don't run my own file syncing
| service, don't run my own email server, etc.
|
| I also don't run a private chat server that people log
| into. I'm like most of the iPhone- and Android-using
| world.
| stackghost wrote:
| Maybe lay off the sanctimonious attitude then.
| JustExAWS wrote:
| I'm saying that completely sarcastically, as if Apple
| should add another popup for the 50 people who care about
| Apple sending non-PII metadata to its servers to provide
| a service.
| Msurrow wrote:
| I think it does address the main problem. What he is saying
| is that multiple layers of security are used to ensure
| (mathematically and theoretically proven) that there is no
| risk in sending the data, because it is encrypted and sent in
| such a way that Apple or any third party will never be able
| to read/access it (again, based on theoretically provable
| math). If there is no risk there is no harm, and then the
| need for "by default", opt in/out, notifications etc. is
| different.
|
| The problem with this feature is that we cannot verify that
| Apple's implementation of the math is correct and without
| security flaws. Everyone knows there are security flaws in all
| software, and this implementation is not open (i.e. we cannot
| review the code, and even if we could, we could not verify
| that the provided code was the code used in the iOS
| build). So, we have to trust Apple did not make any mistakes
| in their implementation.
| latexr wrote:
| Your second paragraph is exactly the point made in the
| article as the reason why it should be an informed choice
| and not something on by default.
| Gabriel54 wrote:
| As someone with a background in mathematics I appreciate
| your point about cryptography. That said, there is no
| guarantee that any particular implementation of a secure
| theoretical algorithm is actually secure.
| threeseed wrote:
| There is also no guarantee that Apple isn't lying about
| everything.
|
| They could just have the OS batch uploads until a later
| point e.g. when the phone checks for updates.
|
| The point is that this is all about risk mitigation not
| elimination.
| scosman wrote:
| I think I'm saying: you're not sending "your data" off
| device. You are sending a homomorphically encrypted locally
| differentially private vector (through an anonymous proxy).
| No consumer can really understand what that means, what the
| risks are, and how it would compare to the risk of sending
| someone like Facebook/Google raw data.
|
| I'm asking: what does an opt in for that really look like?
| You're not going to be able to give the user enough info to
| make an educated decision. There's ton of risk of "privacy
| washing" ("we use DP" but at very poor epsilon, or "we use
| E2E encryption" with side channel data gathering).
|
| There's no easy answer. "ask the user", when the question
| requires a phd level understanding of stats to evaluate the
| risk isn't a great answer. But I don't have another one.
| latexr wrote:
| Asking the user is perfectly reasonable. Apple themselves
| used to understand and champion that approach.
|
| https://www.youtube.com/watch?v=39iKLwlUqBo
| Gabriel54 wrote:
| In response to your second question, opt-in would look exactly
| like this: don't have the box checked by default, with an
| option to enable it: "use this to improve local search, we
| will create an encrypted index of your data to send
| securely to our servers, etc..." A PhD is not necessary to
| understand the distinction between storing data locally on
| a machine vs. on the internet.
| sensanaty wrote:
| I don't care if all they collect is the bottom right pixel
| of the image and blur it up before sending it, the sending
| part is the problem. I don't want _anything_ sent from MY
| device without my consent, whether it 's plaintext or
| quantum proof.
|
| You're presenting it as if you have to explain elliptic
| curve cryptography in order to toggle a "show password"
| dialogue, but that's disingenuous framing; all you have to
| say is "Allow Apple to process your images", simple as
| that. Otherwise you can argue many things can't possibly be
| made into options. Should location data always be sent,
| because satellites are complicated and hard to explain?
| Should we let them choose whether they can turn wifi on or
| off, because you have to explain IEEE 802.11 to them?
| talldayo wrote:
| The right call is to provide the feature and let users opt-in.
| Apple knows this is bad, they've directly witnessed the
| backlash to OCSP, lawful intercept and client-side-scanning.
| There is no world in which they did not realize the problem; they
| decided to enable it by default anyway, knowing full well that
| users aren't comfortable with this.
|
| People won't trust homomorphic encryption, entropy seeding or
| relaying when none of it is transparent and all of it is
| enabled in an OTA update.
|
| > This is what a good privacy story looks like.
|
| This is what a coverup looks like. Good privacy stories never
| force third-party services on a user, period. When you see that
| many puppets on stage in one security theater, it's only
| natural for things to feel off.
| oneplane wrote:
| It's not that binary. Nobody is forcing anything: you can choose
| not to buy a phone, or not to use the internet. Heck, you can
| even skip installing updates!
|
| What is happening is that people make tradeoffs, and decide
| to what degree they trust who and what they interact with.
| Plenty of people might just 'go with the flow', but putting
| what Apple did here in the same bucket as what, for example,
| Microsoft or Google does is a gross misrepresentation.
| Presenting it all as equal just kills the discussion and
| doesn't inform anyone any better.
|
| When you want to take part in an interconnected network, you
| cannot do that on your own, and you will have to trust other
| parties to some degree. This includes things that might
| 'feel' like you can judge them (like your browser used to
| access HN right here), but you actually can't unless you
| understand the entire codebase of your OS and Browser, all
| the firmware on the I/O paths, and the silicon it all runs
| on. So you make a choice, which as you are reading this, is
| apparently that you trust this entire chain enough to take
| part in it.
|
| It would be reasonable to make this optional (as in, opt-in),
| but the problem is that you end up asking a user for a ton of
| "do you want this" questions, almost every upgrade and
| install cycle, which is not what they want (we have had this
| since Mavericks and Vista, people were not happy). So if you
| can engineer a feature to be as privacy-centric yet automated
| as possible, it's a win for everyone.
| talldayo wrote:
| > What is happening, is that people make tradeoffs, and
| decide to what degree they trust who and what they interact
| with.
|
| People aren't making tradeoffs - that's the problem. Apple
| is making the tradeoffs for them, and then retroactively
| asking their users "is this okay?"
|
| Users shouldn't need to buy a new phone to circumvent
| arbitrary restrictions on the hardware that is their legal
| property. If America had functional consumer protections,
| Apple would have been reprimanded harder than their
| smackdowns in the EU.
| oneplane wrote:
| People make plenty of tradeoffs. Most people trade most
| of their attention/time for things that are not related
| to thinking about technical details, legal issues or
| privacy concerns. None of this exists in their minds.
| Maybe the fact that they implicitly made this tradeoff
| isn't even something they are aware of.
|
| As for vectorised and noise-protected PCC, sure, they
| might have an opinion about that, but people rarely are
| informed enough to think about it, let alone gain the
| insight to make a judgment about it at all.
| gigel82 wrote:
| I don't want my photos to take part in any network; never
| asked for it, never expected it to happen. I never used
| iCloud or other commercial cloud providers. This is just
| forceful data extraction by Apple, absolutely egregious
| behavior.
| oneplane wrote:
| Your photos aren't taken. Did you read the article at
| all?
| gigel82 wrote:
| The "network" mention was in reply to your comment about
| "participating in a network" which was never the case for
| one's personal photos (unless explicitly shared on a
| social network I guess).
|
| I did read the article, yes :) Maybe our photos are not
| sent bit-by-bit, but enough data from the photos is being
| sent to be able to infer a location (and possibly other
| details), so it is the same thing: my personal data is
| being sent to Apple's servers (directly or indirectly,
| partially or fully) without my explicit consent.
|
| At least the last time they tried to scan everyone's
| photos (in the name of the children) they pinky promised
| they'd only do it before uploading to iCloud; now they're
| doing it for everyone's photos all the time - it's
| disgusting.
| oneplane wrote:
| No, your photos aren't sent, and neither are 'pieces' of them.
| They are creating vector data which can be used to create
| searchable vectors, which in turn can be used on-device to
| find visual matches for your search queries (which are
| local).
|
| You can imagine it as hashes (created locally), with some
| characters of that hash, taken from random positions, being
| used to find out whether those can be turned into a query
| (which is compute-intensive, so they use PCC for that). So
| there is no 'data' about what it is, where it is or who
| it is. There isn't even enough data to create a picture
| with.
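|
| As a toy illustration of that analogy in Python (the analogy
| only, not Apple's actual pipeline; the input bytes and the six
| positions are arbitrary):
|
|     import hashlib, random
|
|     photo_bytes = b"stand-in for raw image data"
|     digest = hashlib.sha256(photo_bytes).hexdigest()  # local "hash"
|
|     # Share only a few characters from random positions: far too
|     # little to reconstruct the photo, but usable as a lookup hint.
|     positions = sorted(random.sample(range(len(digest)), k=6))
|     partial = "".join(digest[i] for i in positions)
|     print(partial)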
|
| Technically, everything could of course be changed, heck,
| Apple could probably hire someone with binoculars and spy
| on you 24/7. But this is not that. Just like baseband
| firmware is not that, and activation is not that, yet
| using them requires communication with Apple all the
| same.
| jazzyjackson wrote:
| My understanding, as the blog laid it out, was that the
| cloud service is doing the vector similarity search
| against a finite database of landmark feature vectors,
| but it is performing that mathematical function under
| homomorphic encryption, such that the result of the vector
| comparison can only be read with a key that never left
| your device. So it's just adding a tag like "Eiffel tower"
| that only you see. The feature vector is sent off device;
| it's just never able to be read by another party.
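|
| To make the homomorphic part concrete, here is a toy sketch in
| Python using Paillier (an additively homomorphic scheme) with
| tiny demo primes. It only illustrates the idea that a server can
| score an encrypted query against a plaintext landmark vector
| without ever seeing the query or the score; Apple's system uses
| a different, lattice-based scheme with real security parameters,
| so this is not their implementation:
|
|     import math, random
|
|     def keygen(p=1789, q=1861):           # demo-sized primes only
|         n = p * q
|         lam = math.lcm(p - 1, q - 1)
|         # With generator g = n + 1, mu is the inverse of lam mod n.
|         mu = pow(lam, -1, n)
|         return n, (lam, mu, n)
|
|     def encrypt(n, m):
|         r = random.randrange(2, n)
|         while math.gcd(r, n) != 1:
|             r = random.randrange(2, n)
|         return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)
|
|     def decrypt(priv, c):
|         lam, mu, n = priv
|         return (((pow(c, lam, n * n) - 1) // n) * mu) % n
|
|     n, priv = keygen()
|
|     query = [3, 1, 4]             # toy client-side feature vector
|     enc_query = [encrypt(n, x) for x in query]
|
|     # Server side: E(a)^k = E(k*a) and E(a)*E(b) = E(a+b), so this
|     # is an encrypted dot product with a plaintext landmark entry.
|     landmark = [2, 5, 7]
|     enc_score = encrypt(n, 0)
|     for c, w in zip(enc_query, landmark):
|         enc_score = (enc_score * pow(c, w, n * n)) % (n * n)
|
|     print(decrypt(priv, enc_score))     # 39 = 3*2 + 1*5 + 4*7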
| latexr wrote:
| > This is what a coverup looks like.
|
| That's starting to veer into unreasonable levels of
| conspiracy theory. There's nothing to "cover up", the feature
| has an off switch right in the Settings and a public document
| explaining how it works. It should not be on by default but
| that's not a reason to immediately assume bad faith. Even the
| author of the article is concerned more about bugs than
| intentions.
| CapcomGo wrote:
| Sure it is. This isn't a feature or setting that users
| check often or ever. Now, their data is being sent without
| their permission or knowledge.
| latexr wrote:
| Which is wrong but doesn't make it a coverup, which _by
| definition_ assumes trying to _hide_ evidence of
| wrongdoing.
| talldayo wrote:
| It is a coverup. Apple is overtly and completely aware of
| the optics surrounding photo scanning - they know that an
| opt-in scheme cannot work as they found out previously.
|
| Since they cannot convince users to enable this feature
| in good-faith, they are resorting to subterfuge. We know
| that Apple is vehement about pushing client-side scanning
| on users that do not want it, I do not believe for a
| second that this was a mistake or unintended behavior. If
| this was a bug then it would have been hotfixed
| immediately to prevent the unintended behavior from
| reaching any more phones than it already had.
| threeseed wrote:
| > This is what a coverup looks like
|
| This is a dumb take. They literally own the entire stack
| Photos runs on.
|
| If they really wanted to do a coverup we would _never_ know
| about it.
| talldayo wrote:
| Why wouldn't this mistake be addressed in a security hotfix
| then? Apple has to pick a lane - this is either intended
| behavior being enabled against users' wishes, or unintended
| behavior that compromises the security and privacy of
| iPhone owners.
| cma wrote:
| Quantum computing makes the homomorphic stuff ineffective in the
| mid-term. All they have to do is hold on to the data and they can
| get the results of the lookup table computation, in maybe 10-25
| years. Shouldn't be on by default.
| oneplane wrote:
| What makes you think that this is the biggest problem if
| things like AES and RSA are suddenly breakable?
|
| If someone wanted to get a hold of your cloud hosted data at
| that point, they would use their capacity to simply extract
| enough key material to impersonate a Secure Enclave. That
| that point, you "are" the device and as such you "are" the
| user. No need to make it more complicated than that.
|
| In theory, Apple and other manufacturers would already use
| PQC to prevent such scenarios. Then again, QC has been
| "coming soon" for so long, it's doubtful that any information
| that is currently protected by encryption will still be
| valuable by the time it can be cracked. Most real-world
| process implementations don't rely on some "infinite
| insurance", but assume it will be breached at some point and
| just try to make it difficult or costly enough to run out the
| clock on confidentiality, which is all that really matters.
| Nothing that exists really needs to be confidential forever.
| Things either get lost/destroyed or become irrelevant.
| cma wrote:
| This is ostensibly for non-cloud data, with derivatives of it
| auto-uploaded after an update.
| latexr wrote:
| > The author themselves looks to be an Apple security
| researcher
|
| They're not. Jeff Johnson develops apps (specifically Safari
| extensions) for Apple platforms and frequently blogs about
| their annoyances with Apple, but they're not a security
| researcher.
| gigel82 wrote:
| This may be a "good" privacy story but a way better one is to
| just not send any of your data anywhere, especially without
| prior consent.
| jazzyjackson wrote:
| This must be the first consumer or commercial product
| implementing homomorphic encryption, is it not?
|
| I would be surprised if doing noisy vector comparisons is
| actually the most effective way to tell if someone is in front
| of the Eiffel tower. A small large language model could caption
| it just as well on device. My spider sense tells me someone saw
| an opportunity to apply bleeding-edge, very cool tech so that
| they can gain experience and do it bigger and better in the
| future, but they're fumbling their reputation by doing this
| kind of user data scanning.
| do_not_redeem wrote:
| > This must be the first consumer or commercial product
| implementing homomorphic encryption is it not?
|
| Not really, it's been around for a bit now. From 2021:
|
| > The other major reason we're talking about HE and FL now is
| who is using them. According to a recent repository of PETs,
| there are 19 publicly announced pilots, products, and proofs
| of concept for homomorphic encryption and federated analytics
| (another term for federated learning) combined. That doesn't
| seem like a lot ... but the companies offering them include
| Apple,7 Google, Microsoft, Nvidia, IBM, and the National
| Health Service in the United Kingdom, and users and investors
| include DARPA, Intel, Oracle, Mastercard, and Scotiabank.
| Also, the industries involved in these early projects are
| among the largest. Use cases are led by health and social
| care and finance, with their use in digital and crime and
| justice also nontrivial (figure 1).
|
| https://www2.deloitte.com/us/en/insights/industry/technology...
|
| I do wonder why we don't hear about it more often though.
| "Homomorphic encryption" as a buzzword has a lot of headline
| potential, so I'm surprised companies don't brag about it
| more.
| m463 wrote:
| It seems Apple might be using it for Live Caller ID Lookup?
| lysace wrote:
| > - OHTTP relay: it's sent through a 3rd party so Apple never
| knows your IP address. The contents are encrypted so the 3rd
| party never learns anything either (some risk of
| exposing "IP X is an Apple Photos user", but nothing about the
| content of the library).
|
| Which 3rd party is that?
| gigel82 wrote:
| The NSA, the CCP, etc. depending on jurisdiction. (joking,
| but not really)
| oneplane wrote:
| I don't have a list on hand, but at least Cloudflare and
| Akamai are part of the network hops. Technically you only
| need 2 hops to make sure no origin or data extraction can be
| done.
| jazzyjackson wrote:
| Oh good, Cloudflare gets one more data point on me: a ping
| every time I add a photo to my library.
| homofermic wrote:
| Regarding HE: since the lookup is generated by the requestor,
| it can be used as an adversarial vector, which can result in
| exfiltration by nearest neighbor (closest point to vector)
| methods. In other words, you can change what you are searching
| for, and much like differential power analysis attacks on
| crypto, extract information.
| ustad wrote:
| You're presenting a false dichotomy between "perfect user
| understanding" and "no user choice." The issue isn't whether
| users can fully comprehend homomorphic encryption or
| differential privacy - it's about basic consent and
| transparency.
|
| Consider these points:
|
| 1. Users don't need a PhD to understand "This feature will send
| data about your photos to Apple's servers to enable better
| search."
|
| 2. The complexity of the privacy protections doesn't justify
| removing user choice. By that logic, we should never ask users
| about any technical feature.
|
| 3. Many privacy-conscious users follow a simple principle: they
| want control over what leaves their device, regardless of how
| it's protected.
|
| The "it's too complex to explain" argument could justify any
| privacy-invasive default. Would you apply the same logic to,
| say, enabling location services by default because explaining
| GPS technology is too complex?
|
| The real solution is simple: explain the feature in plain
| language, highlight the benefits, outline the privacy
| protections, and let users make their own choice. Apple already
| does this for many other features. "Default off with opt-in" is
| a core principle of privacy-respecting design, regardless of
| how robust the underlying protections are.
| scosman wrote:
| I don't believe I said or implied that anywhere: 'You're
| presenting a false dichotomy between "perfect user
| understanding" and "no user choice."'? Happy to be corrected
| if wrong.
|
| The closest I came to presenting an opinion on the right UX
| was "I'm not sure what the right call is here." The thing I
| disagreed with was the technical statement "the only way to
| guarantee computing privacy is to not send data off the
| device."
|
| Privacy-respecting design and tech is a passion of mine. I'm
| pointing out that "user choice" gets hard as the techniques used
| for privacy exceed the understanding of users. Users can
| intuitively understand "send my location to Google
| [once/always]" without understanding GPS satellites. Users
| can't understand the difference between "send my photo" and
| "send homomorphically encrypted locally differentially private
| vector of e=0.8" and "send differentially private vector of
| e=50". Your prompt "send data about your photos..." would
| allow for much less private designs than this. If we want to
| move beyond "ask the user then do it", we need to get into
| the nitty gritty details here. I'd love to see more tech like
| this in consumer products, where it's private when used, even
| when opted-in.
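|
| To make the difference between those two epsilons concrete, a
| rough sketch in Python (standard Laplace mechanism, with a
| sensitivity of 1.0 assumed purely for illustration):
|
|     # scale = sensitivity / epsilon: larger epsilon, less noise.
|     sensitivity = 1.0
|     for epsilon in (0.8, 50.0):
|         scale = sensitivity / epsilon
|         print(f"epsilon={epsilon:>4}: Laplace noise scale {scale:.3f}")
|     # epsilon= 0.8: Laplace noise scale 1.250  (meaningful noise)
|     # epsilon=50.0: Laplace noise scale 0.020  (barely perturbed)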
| lapcat wrote:
| The choice is between "use an online service" or "don't use
| an online service". That's simple enough for anyone to
| understand.
|
| Apple can try to explain as best it can how user data is
| protected when they use the online service, and then the
| user makes a choice to either use the service or not.
|
| In my case, I don't even have a practical use for the
| new feature, so it's irrelevant how private the online
| service is. As it is, though, Apple silently forced me to
| use an online service that I never wanted.
| jazzyjackson wrote:
| I'm moving my family out of Apple Photos; self-hosted options
| have come a long way. I landed on immich [0] and a Caddy plugin
| that allows PKI certificates for account access while still
| allowing public shared URLs [1]*
|
| There's also LibrePhotos, which is packed with features but
| doesn't have as much polish as immich. They do, however, have a
| list of Python libraries that can be used for offline/local
| inference, for things like having an image model create a
| description of a photo that can benefit the full-text search. [2]
|
| [0] https://immich.app/
|
| [1] https://github.com/alangrainger/immich-public-
| proxy/blob/mai...
|
| [2] https://docs.librephotos.com/docs/user-guide/offline-setup
|
| * Haven't actually tried this plugin yet; my weekend project is
| setting up my Caddy VPS to tunnel to the immich container running
| on my Synology NAS.
| walterbell wrote:
| _> On macOS, I can usually prevent Apple software from phoning
| home by using Little Snitch. Unfortunately, Apple doesn't allow
| anything like Little Snitch on iOS._
|
| On Android, NetGuard uses a "local VPN" to firewall outgoing
| traffic. Could the same be done on iOS, or does Apple network
| traffic bypass VPNs? Lockdown mentions ads, but not Apple
| servers, https://lockdownprivacy.com/.
|
| Apple does publish IP ranges for different services, so it's
| theoretically possible to block 17.0.0.0/8 and then open up
| connections just for notifications and security updates,
| https://support.apple.com/en-us/101555
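|
| A quick sanity check in Python of which addresses such a block
| rule would cover (the address below is just an example, not a
| claim about any specific Apple service):
|
|     import ipaddress
|
|     apple_block = ipaddress.ip_network("17.0.0.0/8")
|     addr = ipaddress.ip_address("17.253.144.10")   # example only
|     print(addr in apple_block)                     # True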
| dev_tty01 wrote:
| >On Android, NetGuard uses a "local VPN" to firewall outgoing
| traffic. Could the same be done on iOS, or does Apple network
| traffic bypass VPNs? Lockdown mentions ads, but not Apple
| servers, https://lockdownprivacy.com/.
|
| Why is NetGuard more trustworthy than Apple?
| walterbell wrote:
| NetGuard firewall doesn't run on iOS, so there's no point in
| comparing to Apple. For those on Android, NetGuard is open-
| source, https://github.com/M66B/NetGuard
|
| On iOS, Lockdown firewall is open-source,
| https://github.com/confirmedcode/Lockdown-iOS
| timsneath wrote:
| More on homomorphic encryption here:
| https://www.swift.org/blog/announcing-swift-homomorphic-encr...
|
| Summary: "Homomorphic encryption (HE) is a cryptographic
| technique that enables computation on encrypted data without
| revealing the underlying unencrypted data to the operating
| process. It provides a means for clients to send encrypted data
| to a server, which operates on that encrypted data and returns a
| result that the client can decrypt. During the execution of the
| request, the server itself never decrypts the original data or
| even has access to the decryption key. Such an approach presents
| new opportunities for cloud services to operate while protecting
| the privacy and security of a user's data, which is obviously
| highly attractive for many scenarios."
| geococcyxc wrote:
| Another setting that surprised me by apparently being turned on by
| default on macOS 15 is System Settings - Spotlight - "Help
| Apple Improve Search": "Help improve Search by allowing Apple to
| store your Safari, Siri, Spotlight, Lookup, and #images search
| queries. The information collected is stored in a way that does
| not identify you and is used to improve search results."
| dev_tty01 wrote:
| No, this is not on by default. After system install, at first
| boot it asks if you want to help improve Search and describes
| how your data will be handled, anonymized, etc. If you clicked
| yes, it is on. There is a choice to opt out.
| dan-robertson wrote:
| To me it seems like a reasonable feature that was, for the most
| part, implemented with great consideration for user privacy,
| though maybe I'm too trusting of the description. I mostly think
| this article is rage-bait and one should be wary of 'falling for
| it' when it shows up on Hacker News in much the same way that one
| should be wary when rage-bait articles show up in tabloids or on
| Facebook.
|
| It seems likely to me that concerns like those of the article or
| some of the comments in this thread are irrelevant to Apple's
| bottom line. A concern some customers may actually have is data
| usage, but I would guess the feature is likely off in Low Data
| Mode.
|
| I wonder if this particular sort of issue would be solved by some
| setting for 'privacy defaults' or something, where
| journalists/activists/some corporate IT departments/people who
| write articles like the OP can choose something that causes OS
| updates to set settings to values that talk less on the network.
| It seems hard to make a UI for that which is understandable.
| There is already a 'lockdown mode' for iOS. I don't know if it
| affects this setting.
| ProllyInfamous wrote:
| At this point, Mac Mini M4s are _cheap enough_ and _capable
| enough_ to just purchase two: one for off-line use, another on-.
|
| Perhaps this is marketing genius (from an AAPL-shareholder POV)?
|
| ----
|
| I'm laughing at the insanity of all this interconnectivity, but
| an NDA prevents me from typing the greatest source of my ironic
| chuckles. Described in an obtuse way: a privacy-focused hardware
| product ships with an undisclosed phone-home feature, letting the
| feds see every time you use the product (to produce a
| controversial product, at home).
|
| Kick in my fucking door / sue me: it'll just reinforce that I'm
| correct about concessionary-allowances...
| doctorpangloss wrote:
| Can you be more clear?
| threeseed wrote:
| > letting the feds see every time you use the product
|
| This does not happen.
| ustad wrote:
| Holy crap! Enabled by default! Thank you for letting everyone
| know.
|
| "Enhanced Visual Search in Photos allows you to search for photos
| using landmarks or points of interest. Your device privately
| matches places in your photos to a global index Apple maintains
| on our servers. We apply homomorphic encryption and differential
| privacy, and use an OHTTP relay that hides IP address. This
| prevents Apple learning about the information in your photos. You
| can turn off Enhanced Visual Search at any time on your iOS or
| iPadOS device by going to Settings > Apps > Photos. On Mac, open
| Photos and go to Settings > General."
| TYPE_FASTER wrote:
| I think the user should be prompted to enable new functionality
| that sends data to the cloud. I think there should be a link to
| details about which data is being sent, how it is being
| transmitted, if it is stored, if it is provided to 3rd parties,
| and what is being done with it.
|
| Maybe this is exactly what the GDPR does. But it is what I would
| appreciate as a user.
|
| I have seen how sending metrics and crash dumps from a mobile
| device can radically improve the user experience.
|
| I have also seen enough of the Google bid traffic to know how
| much they know about us.
|
| I want to enable sending metrics, but most of the time it's a
| radio button labeled "metrics" and I don't know what's going on.
| xbar wrote:
| Now "What happens on your iPhone stays on your iPhone" seems like
| it deserves the Lindt chocolates defense: "exaggerated
| advertising, blustering, and boasting upon which no reasonable
| buyer would rely."
___________________________________________________________________
(page generated 2024-12-28 23:00 UTC)