[HN Gopher] Homomorphic encryption in iOS 18
___________________________________________________________________
Homomorphic encryption in iOS 18
Author : surprisetalk
Score : 312 points
Date : 2025-01-11 16:35 UTC (4 days ago)
(HTM) web link (boehs.org)
(TXT) w3m dump (boehs.org)
| sillysaurusx wrote:
| I was going to make my usual comment about FHE being nice in
| but too slow in practice, and then the article points out that
| there's now SHE (somewhat homomorphic encryption). I wasn't aware
| that the security guarantees of FHE could be relaxed without
| sacrificing them. That's pretty cool.
|
| Is there any concrete info about noise budgets? It seems like
| that's the critical concern, and I'd like to understand at what
| point precisely the security breaks down if you have too little
| (or too much?) noise.
| bawolff wrote:
| I'm not an expert on this, but my understanding is that
| "noise" is less a security breakdown and more a breakdown of
| the entire system. That's where the "somewhat" comes in:
| unlike "full" HE, where the system can do (expensive) things
| to get rid of noise, in somewhat HE the noise just accumulates
| until the system stops working. (Definitely talking out of my
| hat here.)
| sillysaurusx wrote:
| Interesting! Are there any security concerns with SHE? If
| not, it _sounds_ like all of the benefits of FHE with none of
| the downsides, other than the noise overwhelming the system.
| If that's true, and SHE can run at least somewhat
| inexpensively, then this could be big. I was once super hyped
| about FHE till I realized it couldn't be practical, and this
| has my old excitement stirring again.
| Ar-Curunir wrote:
| Most FHE schemes are constructed out of SHE schemes. Also,
| there's nothing preventing FHE from being practical, it's
| just that existing constructions are not as fast as we would
| like them to be.
| bawolff wrote:
| My impression is that SHE schemes are still relatively
| expensive: not as crazy as FHE, but still slow enough to
| preclude many use cases. And the noise breakdown can happen
| relatively quickly, making them unworkable for most
| algorithms people want to use FHE for.
| j2kun wrote:
| Wait until you see all the ASICs getting taped out by
| various companies right now.
| ruined wrote:
| It's incredibly algorithm-dependent. If you look into the
| thesis that originated the 'bootstrapping' technique to
| transform SHE algorithms into FHE, they determine the noise
| limit of their specific algorithm in section 7.3 and then
| investigate expanding the noise limit in sections 8 and 10.
|
| (written in 2009) http://crypto.stanford.edu/craig/craig-thesis.pdf
|
| Some newer FHE schemes don't encounter a noise limit or don't
| use the bootstrapping technique.
| Ar-Curunir wrote:
| All known FHE schemes use bootstrapping
| ruined wrote:
| I expected that, but a search turned up several things
| claiming to implement FHE without bootstrapping. I didn't
| investigate, and I can't say I'm familiar, so maybe they're
| bogus.
| Ar-Curunir wrote:
| Correction: all known even-remotely-practical schemes
| rely on bootstrapping. See
| https://crypto.stackexchange.com/questions/103341/fully-homo...
| slow_typist wrote:
| As always, there is no free lunch.
| j2kun wrote:
| I did see an arXiv paper a while back that claimed to use
| category theory to do this, but my best guess was that it was
| not secure.
| Ar-Curunir wrote:
| SHE doesn't relax security guarantees, it relaxes the class of
| supported computations
| 3s wrote:
| SHE vs FHE has nothing to do with security. Instead, it's about
| how many operations (eg homomorphic multiplications and
| additions) can be performed before the correctness of the
| scheme fails due to too much noise accumulating in the
| ciphertext. Indeed all FHE schemes are also SHE schemes.
|
| What typically makes FHE computationally expensive is a
| "bootstrapping" step that removes the noise accumulated after
| X operations, before it threatens correctness. After
| bootstrapping you can do another X operations. Rinse and
| repeat until you finish the computation you want to perform.
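The noise-budget bookkeeping described above can be sketched with a toy model. The numbers here are invented for illustration; real schemes such as BGV/BFV measure noise differently, but the shape is the same: SHE fails once the budget is spent, FHE pays for bootstrapping to keep going.

```python
# Toy model of ciphertext noise growth in a somewhat-homomorphic scheme.
# All constants are illustrative, not taken from any real scheme.

NOISE_BUDGET = 100    # bits of noise headroom in a fresh ciphertext
ADD_COST = 1          # additions grow noise slowly
MUL_COST = 20         # multiplications grow noise quickly
BOOTSTRAP_FLOOR = 10  # bootstrapping resets noise to near-fresh (at a cost)

def run(ops, bootstrap=False):
    """Report whether a sequence of ops stays correct, and how many
    bootstrapping steps were needed along the way."""
    noise, bootstraps = 0, 0
    for op in ops:
        cost = MUL_COST if op == "mul" else ADD_COST
        if noise + cost > NOISE_BUDGET:
            if not bootstrap:
                return ("failed", bootstraps)  # SHE: correctness is lost
            noise = BOOTSTRAP_FLOOR            # FHE: pay to reset the noise
            bootstraps += 1
        noise += cost

    return ("ok", bootstraps)

# Five multiplications fit in the budget (5 * 20 <= 100) ...
print(run(["mul"] * 5))                   # ('ok', 0)
# ... but a deeper circuit fails as SHE and needs bootstrapping as FHE.
print(run(["mul"] * 50))                  # ('failed', 0)
print(run(["mul"] * 50, bootstrap=True))  # ('ok', 12)
```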
| twodave wrote:
| Not sure whether they use FHE or not (the literature I'm
| looking at simply says "homomorphic"), but we use ElectionGuard
| at our company to help verify elections for our customers. So
| there are definitely some practical uses.
| ted537 wrote:
| It's cool how neural networks, even convolutional ones, are
| one of the few applications that you can compute through
| homomorphic encryption without hitting a mountain of
| noise/bootstrapping costs. Minimal depth hurrah!
| j2kun wrote:
| I don't think Apple is doing this. They compute the embedding
| on device in the clear, and then just do the nearest-neighbor
| part in HE (which does lots of dot products but no neural
| network).
|
| There are people doing NNs in HE, but most implementations do
| indeed require bootstrapping, and for that reason they usually
| use CKKS.
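The nearest-neighbor step described above ("lots of dot products but no neural network") can be sketched with a toy additively homomorphic scheme. Textbook Paillier with tiny parameters stands in here for the lattice-based BFV scheme Apple actually uses; the vectors and database values are made up.

```python
import math
import random

# Toy Paillier keypair. Additively homomorphic: multiplying ciphertexts
# adds their plaintexts, and raising a ciphertext to the power k
# multiplies its plaintext by k. Real keys are ~2048 bits, not 17.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # since L(g^lam mod n^2) = lam for g = n + 1

def enc(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

def he_dot(enc_query, row):
    """Server side: dot product of an encrypted query vector with one
    plaintext database row, computed entirely on ciphertexts."""
    acc = enc(0)
    for c, w in zip(enc_query, row):
        acc = (acc * pow(c, w, n2)) % n2  # adds c's plaintext times w
    return acc

# Client encrypts its embedding; the server scores every row blind;
# only the client can decrypt the scores and pick the nearest neighbor.
query = [3, 1, 4]
db = [[1, 0, 2], [2, 2, 2]]
enc_query = [enc(x) for x in query]
scores = [dec(he_dot(enc_query, row)) for row in db]
print(scores)  # [11, 16]
```

Decryption appears here only to check the arithmetic; in the real protocol it happens on the client, so the server never sees the query or the scores.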
| timsneath wrote:
| More here: https://www.swift.org/blog/announcing-swift-homomorphic-encr...
| GeekyBear wrote:
| > One example of how we're using this implementation in iOS 18,
| is the new Live Caller ID Lookup feature, which provides caller
| ID and spam blocking services. Live Caller ID Lookup uses
| homomorphic encryption to send an encrypted query to a server
| that can provide information about a phone number without the
| server knowing the specific phone number in the request.
|
| Privacy by design is always nice to see.
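The lookup flow quoted above can be sketched as a tiny PIR-style protocol: the client sends an encrypted one-hot selection vector, and the server returns a single ciphertext without learning which entry was read. Textbook Paillier with toy parameters stands in for the lattice-based BFV scheme Apple actually uses, and the eight-entry table is a stand-in for a real keyword-PIR structure over phone numbers.

```python
import math
import random

# Toy Paillier keypair (additively homomorphic); illustrative sizes only.
p, q = 293, 433
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)

def enc(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

SPAM_SCORES = [0, 0, 7, 0, 1, 0, 0, 9]  # server's table, indexed by bucket

def private_lookup(bucket):
    # Client: encrypt a one-hot vector selecting `bucket`. Every
    # ciphertext looks random, so the server can't tell which slot is 1.
    enc_select = [enc(1 if i == bucket else 0)
                  for i in range(len(SPAM_SCORES))]
    # Server: homomorphic dot product, sum over i of select_i * score_i.
    reply = enc(0)
    for c, score in zip(enc_select, SPAM_SCORES):
        reply = (reply * pow(c, score, n2)) % n2
    # Client: decrypts the single reply; the server never saw `bucket`.
    return dec(reply)

print(private_lookup(2))  # 7
print(private_lookup(7))  # 9
```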
| gus_massa wrote:
| I don't understand how it can work. I assume the spam list is
| shared by all users, otherwise it would not be useful at all:
|
| Let's suppose Apple is evil (or they receive an order from a
| judge) and they want to know who is calling 5555-1234
|
| 1) Add a new empty "spam" numbers encrypted database to the
| server (so there are now two encrypted databases in the
| system)
|
| 2) Add the encrypted version of 5555-1234 to it.
|
| 3) When someone checks, reply with the correct answer from
| the real database, and also check in the second one and send
| the reply to the police.
| colejohnson66 wrote:
| Encryption uses per-device keys.
| GeekyBear wrote:
| > they receive an order from a judge
|
| You can't be forced to hand over customer data after you
| have designed a system so that your servers don't ever have
| that information stored in the first place, court order or
| no.
| jakelazaroff wrote:
| How would they decrypt the answer from the database?
| j16sdiz wrote:
| > The two hops, the two companies, are already acting in
| partnership, so what is there technically in the relay setup to
| stop the two companies from getting together--either voluntarily
| or at the secret command of some government--to compare notes, as
| it were, and connect the dots?
|
| The OHTTP scheme does not _technically_ prevent this. It
| increases the number of parties that need to cooperate to
| extract this information, in the hope that the attempt would
| be caught somewhere in the pipeline.
| chikere232 wrote:
| And if a government has, say, already forced ISPs to collect
| metadata about who connects to whom and when, I imagine it
| doesn't even need to bother getting data from the relay hosts.
| eviks wrote:
| > There is no trust me bro element.
|
| > Barring some issue being found in the math or Apple's
| implementation of it
|
| Yes, if you bar the "trust me bro" element in your definition,
| you'll by definition have no such element.
|
| Reality, though, doesn't care about your definition, so in
| reality this is exactly the "trust me bro" element that exists
|
| > But we're already living in a world where all our data is up
| there, not in our hands.
|
| If that's your real view, then why do you care about all this
| fancy encryption at all? It doesn't help if everything is already
| lost
| rpearl wrote:
| I mean if you'd like, you could reimplement the necessary
| client on an airgapped computer, produce an encrypted value,
| take that value to a networked computer, send it to the server,
| obtain an encrypted result that the server could not possibly
| know how to decrypt, and see if it has done the computation in
| question. No trust is required.
|
| You could also observe all bits leaving the device from the
| moment you initialize it and determine that only encrypted bits
| leave and that no private keys leave, which only leaves the gap
| of some side channel at the factory, but you could perform the
| calculation to see that the bits are only encrypted with the
| key you expect them to be encrypted with.
| Krasnol wrote:
| How is this comment useful to the OPs valid arguments?
| sneak wrote:
| > _This should be fine: vectorization is a lossy operation. But
| then you would know that Amy takes lots of pictures of golden
| retrievers, and that is a political disaster._
|
| This downplays the issue. Knowing that Alice takes lots of
| screenshots of Winnie the Pooh memes means that Alice's family
| gets put into Xinjiang concentration camps, not just a political
| disaster.
|
| (This is a contrived example: iCloud Photos is already NOT e2ee
| and this is already possible now; but the point stands, as this
| would apply to people who have iCloud turned off, too.)
| troad wrote:
| Agreed. And for a less contrived example, people may have
| photos of political protests that they attended (and the faces
| of others present), screenshots that include sensitive
| messages, subversive acts, etc.
|
| It's worth noting though that it's now possible to opt in to
| iCloud Photo e2ee with "Advanced Data Protection". [0]
|
| [0] https://support.apple.com/en-us/102651
| sneak wrote:
| iCloud Photo e2ee still shares hashes of the plaintext with
| Apple, which means they can see the full list of everyone who
| has certain files, even if they all have e2ee enabled. They
| can see who had it first, and who got it later, and which
| sets of iCloud users have which unique files. It effectively
| leaks a social graph.
|
| It's also not available in China.
| saagarjha wrote:
| That's the joke/implication.
| vrtx0 wrote:
| Is the Apple Photos feature mentioned actually implemented using
| Wally, or is that just speculation?
|
| From a cursory glance, the computation of centroids done on the
| client device seems to obviate the need for sending embedding
| vectors of potentially sensitive photo details -- is that
| incorrect?
|
| I'd be curious to read a report of how on-device-only search
| (using latest hardware and software) is impacted by disabling the
| feature and/or network access...
| aeontech wrote:
| According to this post on Apple's Machine Learning blog, yes,
| Wally is the method used for this feature.
|
| https://machinelearning.apple.com/research/homomorphic-encry...
| vrtx0 wrote:
| Thank you! This is exactly the information the OP seems to
| have missed. It seems to confirm my suspicion that the
| author's concerns about server-side privacy are unfounded --
| I think:
|
| > The client decrypts the reply to its PNNS query, which may
| contain multiple candidate landmarks. A specialized,
| lightweight on-device reranking model then predicts the best
| candidate...
|
| [please correct me if I missed anything -- this used to be my
| field, but I've been disabled for 10 years now, so grain of
| salt]
| chikere232 wrote:
| The devil is in the proprietary details though.
| vrtx0 wrote:
| Sorry, what do you mean by "proprietary details"?
| sbuk wrote:
| They are alluding to the fact that the implementation is
| closed source, and therefore "untrustworthy". It's a
| trite point, of course, but not without some merit.
| vrtx0 wrote:
| I don't see any merit, honestly. That would assume one is
| able to audit every bit of code they run, including
| updates, and control the build system.
|
| I mean, the Wally paper contains enough information to
| effectively implement homomorphic encryption for similar
| purposes. The field was almost entirely academic ~12
| years ago...
|
| I miss talking shop on HN. Comments like that are why we
| can't have nice things.
| sbuk wrote:
| I do agree that everything is politicized. I'd have liked
| to have seen an explanation for laypeople and perhaps the
| option being opt-in. To me, there is some merit in that
| stance. It is a side-note. It is a shame that we can't
| talk about these things openly without people getting
| offended because of it.
| chikere232 wrote:
| You have to be quick if you want to disable the feature, as the
| scan starts on OS install, and disabling requires you to
| actively open the Photos app and turn it off.
| rkagerer wrote:
| This would be even more exciting if there were some way to
| guarantee your phone, the servers, etc. are running untampered
| implementations, and that the proxies aren't colluding with
| Apple.
| avianlyric wrote:
| If someone or something can tamper with your phone, then nobody
| needs to collude with proxies or Apple. They can just ask your
| phone to send them exactly what they want, without all the
| homomorphic encryption dance.
|
| The idea that Apple is going to use this feature to spy on you,
| completely misses the fact that they own the entire OS on your
| phone, and are quite capable of directly spying on you via your
| phone if they wanted to.
| timsneath wrote:
| That is exactly the goal of Private Cloud Compute, part of the
| fabric of Apple Intelligence:
| https://security.apple.com/blog/private-cloud-compute/#:~:te....
| cryptonector wrote:
| Upgrades have to be possible. What you want probably is
| attestation that you're running a generally available version
| that other users run too as opposed to one specially made for
| you, but since a version could be made for all those subject to
| surveillance this wouldn't be enough either.
|
| I'm not sure there's a way out of this that doesn't involve
| open source and reproducible builds (and watch out for
| Reflections on Trusting Trust).
| antman wrote:
| Is there any library for e.g. FHE hash table lookup or
| similar? I have seen papers but haven't seen a consensus on
| what a useful implementation is.
| hoppp wrote:
| OpenFHE could be what you're looking for.
|
| https://openfhe.org/
| chikere232 wrote:
| > You are Apple. You want to make search work like magic in the
| Photos app, so the user can find all their "dog" pictures with
| ease.
|
| What if you're a user who doesn't care about searching for
| "dog" in your own photos, and might not even use the Photos
| app, but Apple still scans all your photos and sends data
| off-device without asking you?
|
| Perhaps this complicated dance works, perhaps they have made no
| mistakes, perhaps no one hacks or coerces the relay host
| providers... they could still have just asked for consent the
| first time you open Photos (if you ever do) before starting the
| scan.
| voidUpdate wrote:
| Android does this too. I don't really want all my photos
| indexed like that, I just want a linear timeline of photos, but
| I can't turn off their "memories" thing or all the analysis
| they do to them.
| buran77 wrote:
| The "memories" part can be trivially done locally and
| probably is, it's really just reading the picture's "date
| taken", so it's conceptually as easy as a "sort by date". My
| old Android with whatever Photos app came with it (not
| Google's) shows this despite being disconnected for so long.
|
| There's nothing stopping either Apple or Google from giving
| users an option to just disable these connected features,
| globally or per-app. Just allow a "no cloud services" toggle
| switch in the Photos app, get the warning that $FEATURES will
| stop working, and be done.
|
| I know why Google isn't doing this, they're definitely
| monetizing every bit of that analyzed content. Not really
| sure about Apple though, might be that they consider their
| setup with HE as being on par with no cloud connectivity
| privacy wise.
| voidUpdate wrote:
| "memories" constantly given me notifications about "similar
| shots" at random, so I'm assuming it is trying to analyse
| the content of the photos. I managed to disable the
| notifications, but not the actual analysis
| Someone wrote:
| > The "memories" part can be trivially done locally and
| probably is, it's really just reading the picture's "date
| taken", so it's conceptually as easy as a "sort by date".
|
| It's more. It also can create memories "trip to New York in
| 2020", "Cityscapes in New York over the years", or "Peter
| over the years" (with Peter being a person added to Photos)
| Aachen wrote:
| No Android phone I've ever owned automatically uploaded your
| photos without asking. What exactly do you mean that it does
| too?
| ranguna wrote:
| Uninstall Google Photos and install a dumb photos app. I
| think most Android phones don't even come with Google Photos
| preinstalled.
| TheSpiceIsLife wrote:
| _Dumb Photo App_ by Nefarious DataExfiltration Co & Son
| tcfhgj wrote:
| Fossify gallery
| ThePowerOfFuet wrote:
| This is what the "Allow Network permission" checkbox in
| the app installation dialog on GrapheneOS is for.
| y04nn wrote:
| I don't think Android does that. It's only Google Photo and
| only if you upload them to the cloud, if you don't
| sync/upload them, you can't search them with specific terms.
| lucideer wrote:
| Android doesn't do this. Everything is opt-in.
|
| Granted they require you to opt-in in order for the photos
| app to be usable & if you go out of your way to avoid opting
| in they make your photo-browsing life miserable with prompts
| & notifications. But you do still have to opt-in.
| nine_k wrote:
| A number of good third-party photo-browsing apps make it
| non-miserable, even if you never open Google Photos or even
| uninstall it.
| lucideer wrote:
| I've seen a lot of people saying this generally but no
| specific recommendations.
|
| I've used Simple Gallery Pro before but it's not very
| good.
|
| Currently using Immich but that's not really a general
| photo app - it's got a narrow use case - so I still use
| the Google Photos app alongside it quite often.
|
| Specific alternative recommendations that aren't malware
| welcome.
| marmight wrote:
| It depends which features you need, but interestingly Google
| has _another_, lighter-weight gallery app called Google
| Gallery that does not have any cloud features built in.
| ckae wrote:
| Fossify Gallery (on F-Droid or the Google store) works quite
| nicely for me as a simple photo viewer and management app.
| nine_k wrote:
| Simple Gallery Pro is what I use, and it seems fine to
| me. What do you think should be added to it, or altered?
| Just curious how other people see UX.
| pertique wrote:
| I can't personally vouch for it as I'm still stuck in
| Google Photos and would prefer to self-host it, but Ente
| may interest you. Open source, end-to-end encrypted,
| self-host or cloud.
| lucideer wrote:
| I'm really happy with Immich & not looking for a
| replacement. Evaluated it vs Ente in the past & went with
| it instead - as far as I could tell their apps have the
| same features & limitations (focus on remote backup &
| display rather than on local on-device photo management &
| basic markup/editing).
|
| If (like me) you don't need e2e I can highly recommend
| Immich for its use-case though.
| dvngnt_ wrote:
| > I've used Simple Gallery Pro before but it's not very
| good.
|
| It's rock solid for me. You can browse folders, move, copy,
| hide, make small edits. You can't search 'dog', which is a
| plus, and it doesn't scan faces.
| Ghoelian wrote:
| > or even uninstall it
|
| Unfortunately google's camera app will only open google
| photos if you click the image preview after taking one.
| Just doesn't respect the default gallery app setting at
| all.
| alex7734 wrote:
| Google loves doing this.
|
| If you dare turn off Play Protect for example, you will be
| asked to turn it on every time you install or update
| anything. Never mind that you said no the last thousand
| times it asked.
| Enginerrrd wrote:
| It says it's "opt in" but as someone who hasn't opted in, I
| still get the notifications and I can see a split second
| preview of all the stuff they're not supposed to have
| computed before it asks me to opt in. So there's DEFINITELY
| shenanigans occurring.
| numpad0 wrote:
| Uninstall (disable) the stock Google Photos app and install
| `gallery2.apk`. You can download one from sketchy GitHub
| repos, or I think you can alternatively extract it from an
| emulator image.
| nine_k wrote:
| Why, install a non-sketchy _open-source_ gallery app from
| F-Droid.
| AshamedCaptain wrote:
| Samsung at least does these "dog" cataloguing & searches
| entirely on-device, as trivially checked by disabling all
| network connectivity and taking a picture. It may ping home
| for several other reasons, though.
| TeMPOraL wrote:
| Does or doesn't. You can't really tell if and when it does
| any cataloguing; best I've managed to observe is that you
| can increase chances of it happening if you keep your phone
| plugged in to a charger for extended periods of time.
|
| That's the problem with all those implementations: no
| feedback of any kind. No list of recognized tags. No
| information of what is or is to be processed. No nothing.
| Just magic that doesn't work.
| reaperman wrote:
| With embeddings, there might not be tags to display.
| Instead of labeling the photo with a tag of "dog", it
| might just check whether the embedding of each photo is
| within some vector distance of the embedding of your
| search text.
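The tagless search described above can be sketched in a few lines: a photo "matches" a query when its embedding is close enough to the embedding of the search text. The vectors here are invented stand-ins for what a trained image/text encoder would produce.

```python
import math

# Toy embedding-based search: no explicit "dog" tags anywhere, just a
# vector-distance check between query and photo embeddings.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

photo_embeddings = {
    "IMG_001.jpg": [0.9, 0.1, 0.0],  # (pretend) a dog photo
    "IMG_002.jpg": [0.8, 0.3, 0.1],  # (pretend) another dog photo
    "IMG_003.jpg": [0.0, 0.2, 0.9],  # (pretend) a sunset
}
query_embedding = [1.0, 0.2, 0.0]    # (pretend) encoding of the text "dog"

def search(query, photos, threshold=0.9):
    """Return photos whose embedding is within the similarity threshold."""
    return sorted(name for name, emb in photos.items()
                  if cosine(query, emb) >= threshold)

print(search(query_embedding, photo_embeddings))
# ['IMG_001.jpg', 'IMG_002.jpg']
```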
| TeMPOraL wrote:
| Yes and no. Embeddings can be used in both directions -
| if you can find images closest to some entries in a
| search text, you can also identify tokens or phrases
| closest in space to any image or cluster of images, and
| output that. It's a problem long solved in many different
| ways, including but not limited to e.g.:
|
| https://github.com/pythongosssss/ComfyUI-WD14-Tagger
|
| which uses specific models to generate proper booru tags
| out of any image you pass to it.
|
| More importantly, I know for sure they have this
| capability in practice, because if you tap the right way
| in the right app, when the Moon is in just the right
| phase, both Samsung Gallery and OneDrive Photos do (or in the
| case of OneDrive, used to):
|
| - Provide occasional completions and suggestions for
| predefined categories, like "sunset" or "outerwear" or
| "people", etc.;
|
| - Auto-tag photos with some subset of those (OneDrive,
| which also sometimes records it in metadata), or if you
| use "edit tag" options, suggest best fitting tags
| (Samsung);
|
| - Have a semi-random list of "Things" to choose from to
| categorize your photos, such as "Sunsets", "City",
| "Outdoors", "Room", etc. Google Photos does that one too.
|
| This shows they _do_ maintain a list of correct and
| recommended classifications. They just choose to keep it
| hidden.
|
| With regards to face recognition, it's even worse.
| There's zero controls and zero information other than
| occasionally matched (and often mismatched) face under
| photo properties, that you can sometimes delete.
| llm_nerd wrote:
| Apple also does the vast majority of photo categorization
| on device, and has for years over multiple major releases.
| Foods, drinks, many types of animals including specific
| breeds, OCRing all text on the image even when massively
| distorted, etc.
|
| This feature is some new "landmark" detection and it feels
| like it's a trial balloon or something as it simply makes
| zero sense unless what they are categorizing as landmarks
| is enormous. The example is always the Eiffel tower, but
| the data to identify most of the world's major landmarks is
| small relative to what the device can already detect, not
| to mention that such lookups don't even need photo
| identification and could instead (and actually already do
| and long have) use simple location data and nearby POIs for
| such metadata tagging.
|
| The landmarks thing is the beginning, but I feel like they
| want it to be much more detailed. Like every piece of art,
| model of car, etc, including as they change with new
| releases, etc.
| oulipo wrote:
| Well, not vouching for automated scanning or whatever, but
| the advantage of homomorphic encryption is that, besides the
| power usage for the computation and the bandwidth to transmit
| the data, Apple doesn't learn anything about what's in your
| photos; only you can. So even if you don't use the feature,
| the impact on you is minimal.
| zombot wrote:
| Exactly, I don't want my shit sent all across the internet
| without my explicit prior consent, period. No amount of
| explanation can erase Apple's fuck-up.
| api wrote:
| Not wrong, but it's interesting that Apple gets so much flak
| for this when Google and Microsoft don't even try. If
| anything they try to invade privacy as much as possible.
|
| Of course maybe that question has its own answer. Apple
| markets itself as the last personal computing company where
| you are the customer not the product so they are held to a
| higher standard.
|
| What they should do is do the processing locally while the
| phone is plugged in, and just tell people they need a better
| phone for that feature if it's too slow. Or do it on their
| Mac, if they own one, while that is plugged in.
| yard2010 wrote:
| In other words: don't hate the player, hate the game. But the
| point still stands.
| drawkward wrote:
| The game, unlike Apple's policy, is opt-in. Hate the
| player _and_ the game.
| okamiueru wrote:
| Whataboutisms aren't all that great, you know. Google and MS
| also get flak, and they also deserve it.
|
| But now that we're talking about these differences, I'd say
| that Apple users are notoriously complacent and defend
| Apple and their practices. So, perhaps in some part it is
| in an attempt to compensate for that? I'm still surprised
| how we've now accepted that Apple receives information
| pretty much every time we run a process (or rather, if it
| ran more than 12 hours ago, or has been changed).
| BlackFly wrote:
| Well when you are building a feature that can only be
| appreciated by a subculture of people (privacy advocates),
| and they complain about the most basic faux pas that you
| could do in their culture (not asking them before you phone
| home with data derived from their data)... you have invited
| these people to criticise you.
|
| Most people I know of wouldn't care about such a feature
| other than a breathless sort of "Wow, Apple tech!" So they
| are building something which is intended to win over
| privacy conscious people, kudos to them, everyone stands to
| benefit. But the most basic custom in that subculture is
| consent. So they built something really great and then
| clumsily failed on the easiest detail because it is so
| meaningless to everyone except that target audience. To
| that audience, they don't bother criticising google or
| microsoft (again) because it goes without saying that those
| companies are terrible, it doesn't need to be said again.
| ylk wrote:
| > a feature that can only be appreciated by a subculture
| of people (privacy advocates)
|
| Just because it can't be "appreciated" by all users
| doesn't mean it's only "for" a small sub-group.
|
| It seems to me they're just trying to minimise the data
| they have access to -- similar to private cloud compute
| -- while keeping up with the features competitors provide
| in a less privacy-respecting way. Them not asking for
| permission makes it even more obvious to me that it's not
| built for any small super privacy-conscious group of
| people but the vast majority of their customers instead.
| gigel82 wrote:
| "not asking them before you phone home with data" is a
| basic faux pas for privacy advocates? LOL; that's a
| fundamental breach of trust of the highest degree, not
| basic by any means.
| Dylan16807 wrote:
| Are you under the impression that "basic" and
| "fundamental" are not synonyms?
| victorbjorklund wrote:
| You can always find someone worse. That does not mean we
| should not criticise people/organizations.
|
| You think Trump is bad? Well, Putin is worse. You think
| Putin is bad? Kim Jong Un is worse.
| beretguy wrote:
| And who's worse than Kim?
| camjw wrote:
| Keir Starmer, if you ask Elon
| lapcat wrote:
| > just tell people they need a better phone for that
| feature if it's too slow. Or do it on their Mac if they own
| one while that is plugged in.
|
| The issue isn't slowness. Uploading photo library
| data/metadata is likely always slower than on-device
| processing. Apparently the issue in this case is that the
| world locations database is too large to be stored locally.
| phkahler wrote:
| >> Apparently the issue in this case is that the world
| locations database is too large to be stored locally.
|
| What kind of capacity can ROM chips have these days? And
| at what cost?
| j2kun wrote:
| FWIW, I work on homomorphic encryption at Google, and
| Google has all kinds of other (non-FHE) privacy enhancing
| tech, such as differential privacy, federated learning, and
| https://github.com/google/private-join-and-compute which
| are deployed at scale.
|
| Perhaps it's not as visible because Google hasn't defaulted
| to opt-in for most of these? Or because a lot of it is B2B
| and Google-internal (e.g., a differential-privacy layer on
| top of SQL for internal metrics)
|
| [edit]: this link was a very vague press release that
| doesn't say exactly how Google uses it:
| https://security.googleblog.com/2019/06/helping-organization...
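Of the techniques listed above, differential privacy is easy to sketch: release an aggregate plus noise scaled to sensitivity/epsilon, so the output reveals little about any single user. This is a generic Laplace-mechanism sketch with illustrative parameters, not tied to any Google API.

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Sample Laplace(0, scale) as the difference of two Exp(1) draws."""
    u = 1.0 - rng.random()  # in (0, 1], avoids log(0)
    v = 1.0 - rng.random()
    return scale * (math.log(u) - math.log(v))

def private_count(values, epsilon=1.0, rng=random):
    # Counting queries have sensitivity 1: adding or removing one user
    # changes the true count by at most 1, so Laplace(1 / epsilon)
    # noise yields epsilon-differential privacy.
    return len(values) + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(0)
noisy = private_count(range(1000), epsilon=0.5, rng=rng)
print(round(noisy))  # close to 1000, typically off by a few units
```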
| keeganpoppen wrote:
| uhhh yeah it's not visible because it's not used for
| anything. because it runs contrary to Google's entire
| raison d'etre. if it's not turned on by default, what is
| even the point of doing it at all other than to pacify
| engineers who are perfectly happy to miss the forest for
| the trees? it's kind of like saying that you have the
| power of invisibility, but it only works if no one is
| looking at you.
| butlike wrote:
| Doesn't Photos.app on iOS sync with iCloud OOTB?
| GeekyBear wrote:
| Apple does photo recognition on your device.
|
| Google, on the other hand, uploads photos to their server and
| does the analysis there.
|
| There is the infamous case of the parents who Google tried to
| have arrested after they used their Android device to seek
| medical assistance for their child during lockdown. Their
| doctor asked them to send images of the problem, and Google
| called the police and reported the parents for kiddie porn.
|
| > "I knew that these companies were watching and that privacy
| is not what we would hope it to be," Mark said. "But I
| haven't done anything wrong."
|
| The police agreed. Google did not.
|
| https://www.nytimes.com/2022/08/21/technology/google-surveil...
|
| Google refused to return access to his account even after the
| police cleared him of wrongdoing.
| throw10920 wrote:
| > Google refused to return access to his account even after
| the police cleared him of wrongdoing.
|
| This is why I constantly work to help people reduce their
| dependence on Google. Screw that. If anyone ever tells you
| that they rely on Google for anything, show them this
| article.
| Klonoar wrote:
| They are not sending your actual photo, as has been covered
| at length on numerous threads on this very site.
| gigel82 wrote:
| That's irrelevant if the information they do send is
| sufficient to deduce "Eiffel tower" or "dog" out of it:
| that's too much information to send.
| GeekyBear wrote:
| They don't have to send anything since they do all the
| image recognition on the user's own device.
|
| Sending everything to a server is, however, how Google's
| service works.
| TeMPOraL wrote:
| What if you're a user and you're fed up with all the "magic"?
| What if you want a device that works reliably, consistently,
| and in ways you can understand from empirical observation if
| you pay attention?
|
| Apple, Google, Microsoft and Samsung all seem to be tripping
| over each other in an effort to make this whole thing as
| ass-backwards as possible. Here is how it should work, IMHO:
|
| 1) It scans stuff, detects faces and features. Locally or in
| the cloud or not at all, as governed by an explicit opt-in
| setting.
|
| 2) Fuck search. Search is not discoverable. I want to _browse_
| stuff. I want a _list_ of objects /tags/concepts it recognized.
| I want a list of faces it recognized and the ability to
| manually retag them, and manually _mark any that they missed_.
| And not just a list of 10 categories the vendor thinks are most
| interesting. All of them. _Alphabetically_.
|
| 3) If you insist on search, make it work. I type in a word, I
| want all photos tagged with it. I click on a face, I want all
| photos that have matching face on it. Simple as that. Not
| "eventual consistency", not "keep refreshing, every 5th refresh
| I may show you a result", or other such breakage that's a
| staple experience of OneDrive Photos in particular.
|
| Don't know about Apple, but Google, Microsoft and Samsung all
| refuse #2, and spectacularly fail at #3, and the way it works,
| I'm convinced it's intentional, as I can't even conceptualize a
| design that would exhibit such failures naturally.
|
| EDIT:
|
| 4) (A cherry on the cake of making a sane product that works)
| Recognition data is stored in photo metadata, whether directly
| or in a sidecar file, in any of a bunch of formats sane people
| use, and is both exported along with the photos, and adhered to
| when importing new photos.
| tempworkac wrote:
| It doesn't really matter if they ask you or not, ultimately you
| have to trust them, and if you don't trust Apple, why would you
| even use an iPhone?
| chikere232 wrote:
| As they didn't ask, I will trust them less
| tempworkac wrote:
| why use a device by someone you don't trust? honestly don't
| get it. I'd use an open source android distro
| drawkward wrote:
| I am merely a data scientist, so don't really know a ton
| about mainline programming beyond a few intro CS courses.
|
| Why would an open source android distro be more
| trustworthy?
| subjectsigma wrote:
| Here is my simplified take on it which will likely get me
| flamed.
|
| Trust has many meanings but for this discussion we'll
| consider privacy and security. As in, I trust my phone to
| not do something malicious as a result of outside
| influence, and I trust it to not leak data that I don't
| want other people to know.
|
| Open source software is not inherently more secure nor
| more private. However it can sometimes be more secure
| (because more people are helping find bugs, because that
| specific project prioritizes security, etc.) and is
| usually more private. Why? Because it (usually) isn't
| controlled by a single central entity, which means there
| is (usually) no incentive to collect user data.
|
| In reality it's all kind of a mess and means nothing.
| There's tons of bugs in open source software, and
| projects like Audacity prove they sometimes violate user
| privacy. HN-type people consider open source software
| more secure and private because you can view the source
| code yourself, but I guarantee you they have not
| personally reviewed the source of all the software they
| use.
|
| If you want to use an open-source Android distro I think
| you would learn a lot. You don't need to have a CS
| degree. However unless you made massive lifestyle changes
| in addition to changing your phone, I'm not confident it
| would meaningfully make you more secure or private.
| drawkward wrote:
| It was a bit of a strawman question anyway; as someone
| who could review the source myself but wont (because the
| pain-to-utility threshold is way too high) I am then
| required to place my trust in some ad-hoc entity (the
| open-source community), that doesn't actually have a
| financial disincentive to make sure things aren't bad.
|
| I have other reasons, perhaps, to prefer open source
| stuff, but I am not ready to assume it is inherently more
| private or secure.
| internetter wrote:
| To your point, you can't even trust the software if the
| hardware is untrusted
| chikere232 wrote:
| It doesn't have to be binary. I have some trust for
| apple. They've earned it in various ways by caring for
| privacy.
|
| When they start opting me into photo scanning I lose a
| bit of trust. The homomorphic encryption makes it less
| bad. The relative quiet around the rollout of the feature
| makes it worse. Apple's past attempt to start client side
| scanning makes it worse. Etc...
|
| The net result is I trust them a bit less. Perhaps not
| enough to set my apple devices on fire yet, but a bit.
| razemio wrote:
| How can you trust any mainstream "working" iPhone or Android
| device? You already mentioned open source android distros.
| You mean those where no banking or streaming device app works
| because you have to use a replacement for gapps and the root
| / open bootloader prevents any form of DRM? That is not
| really an option for most people. I would love to have a
| Linux phone even with terrible user experience as long as I
| do not lose touch with society. That however seems to be an
| impossible task.
| tempworkac wrote:
| I'm curious what functions, other than maybe depositing a
| check, require a banking app?
| bitdivision wrote:
| Depends where you live. In the US, probably not much, but
| in other countries where transfers are ubiquitous, being
| unable to use a banking app could be a real problem.
| tempworkac wrote:
| are there really countries where the bank doesn't have a
| website you can use to do a transfer, but you could do it
| through an app?
| bitdivision wrote:
| I don't know, though certainly the experience is a lot
| simpler without the 15 minute timeout, painful login, and
| extra security checks I see on web banking.
|
| Edit: Not to mention that many of the newer banks don't
| even have web banking. It's app only. Of course, it's your
| choice to open an account there, though.
| solarkraft wrote:
| In Germany, and I think the whole EU, two-factor
| authentication is mandatory, for which the favored
| implementation is an app. SMS TAN is out; the alternative
| is a secondary device you stick your card into.
| Dylan16807 wrote:
| Do you need a proprietary app for that? TOTP is fine, you
| can just pick your own.
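| (For the curious: TOTP is simple enough that a toy sketch
| fits in a few lines. Here's a minimal RFC 6238 HMAC-SHA1
| version in Python, checked against the RFC's published test
| vectors; a real app should of course use a vetted library.)

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    # RFC 6238 TOTP: HOTP (RFC 4226) applied to the 30-second time counter
    key = base64.b32decode(secret_b32)
    counter = int((t if t is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)          # counter as 8-byte big-endian
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                   # dynamic truncation offset
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF)
    return f"{code % 10**digits:0{digits}d}"

# RFC 6238 test-vector secret ("12345678901234567890" in base32), T=59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # prints "287082"
```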
| TeMPOraL wrote:
| Haven't seen a bank offering software TOTP in Poland.
| Over a decade ago, before smartphones became ubiquitous,
| I saw a bank offering a physical TOTP device. These days,
| as far as I've seen, it's either SMS codes, single-use
| codes on physical scratch cards (haven't seen one in 5
| years, though), or in-app confirmation.
| sbuk wrote:
| No, but there are bank accounts that are app only. Monzo
| in the UK is a popular example.
| lapcat wrote:
| Trust is never all or nothing. I trust Apple to an extent,
| but trust needs to be earned and maintained. I trust my mom,
| but if she suggested installing video cameras in my home for
| my "safety", or worse, she secretly installed video cameras
| in my home, then she would lose my trust.
|
| Likewise, you need to trust your spouse or significant other,
| but if there are obvious signs of cheating, then you need to
| be suspicious.
|
| An essential part of trust is not overstepping boundaries. In
| this case, I believe that Apple did overstep. If someone
| demands that you trust them blindly and unconditionally,
| that's actually a sign you shouldn't trust them.
| sbuk wrote:
| > If someone demands that you trust them blindly and
| unconditionally, that's actually a sign you shouldn't trust
| them.
|
| That's certainly a take, and one you're clearly entitled
| to. I don't disagree with the point that you make; this
| ought to have been opt in.
|
| What you _should_ do now is acknowledge this in your
| original post and then explain why they should have been
| more careful about how they released this feature.
| Homomorphic encryption of the data reframes what you wrote
| somewhat. Even though data _is_ being sent back, Apple
| never knows _what_ the data is.
| lapcat wrote:
| > What you _should_ do now is acknowledge this in your
| original post and then explain why they should have been
| more careful about how they released this feature.
| Homomorphic encryption of the data reframes what you
| wrote somewhat.
|
| Do you mean my original blog post? The one that not only
| mentions homomorphic encryption but also links to Apple's
| own blog post about it? I don't know how that can
| "reframe" what I wrote when it already framed it.
| abtinf wrote:
| So don't use the photos app. Just get an alternative camera app
| and you bypass all of this.
| thisislife2 wrote:
| It's enabled by default, so you can't "bypass" it unless
| you are aware that you can turn it off. If you don't turn
| it off, it will continue to scan your photos and upload the
| data to Apple, whether you use the Photos app or not. (And,
| by the way, if the option to "learn from this app" is
| enabled, which is again the default, iPadOS / iOS will also
| be intrusively collecting data on how you use that
| alternative camera app too.)
| plandis wrote:
| You can vote with your wallet and get a Pine Phone or something
| similar, I guess.
| t43562 wrote:
| I never even knew images could be searched this way on a phone
| and the iPhone users in my family don't either.
|
| A huge privacy-bruising feature for nothing in our case.
| OkGoDoIt wrote:
| I've been using it a lot recently. Multiple times even today
| while I've been trying to find just the right photos of my
| theater for a brochure I'm putting together. I have over
| 100,000 photos in Apple photos so even if I vaguely remember
| when I took a photo it's still difficult to find it manually.
|
| As a concrete example, someone on my team today asked me "can
| you send me that photo from the comedy festival a couple years
| ago that had the nice projections on the proscenium?". I
| searched apple photos (on my phone, while hiking through a
| park) for "sketchfest theater projection". It used the OCR to
| find Sketchfest and presumably the vector embeddings of theater
| and projection. The one photo she was referring to was the top
| result. It's pretty impressive.
|
| It can't always find the exact photo I'm thinking of the first
| time, but I can generally find any vaguely-remembered photo
| from years ago without too much effort. It is pretty magical.
| You should get in the habit of trying it out, you'll probably
| be pleasantly surprised.
| vrtx0 wrote:
| Do you mean the ability to search in Apple Photos is "privacy-
| bruising", or are you referring to landmark identification?
|
| If the latter, please note that this feature doesn't actually
| send a query to a server for a specific landmark -- your device
| does the actual identification work. It's a rather clever
| feature in that sense...
| hoppp wrote:
| It's using Concrete from Zama.
|
| I didn't like their license because it's BSD-3-Clause-Clear but
| then they state:
|
| "Zama's libraries are free to use under the BSD 3-Clause Clear
| license only for development, research, prototyping, and
| experimentation purposes. However, for any commercial use of
| Zama's open source code, companies must purchase Zama's
| commercial patent license"
|
| So it's not free: you need to pay for a patent license, and
| they don't disclose how much.
|
| I recommend OpenFHE as an alternative free, open source
| solution. I know it's C++ and not Rust, but there's no patent
| license, and it can do the same thing the blog post wants to
| do. It even has more features, like proxy re-encryption,
| which I think Concrete can't do.
| nine_k wrote:
| How is this "BSD-licensed but only for research" not self-
| contradictory?
|
| It's like saying: "FREE* candy! (Free to look at, eating is
| $6.95 / pound)"
| gus_massa wrote:
| They use the patent loophole. From
| https://www.zama.ai/post/open-source
|
| > _If a company open sources their code under BSD3-clear,
| they can sell additional licenses to use the patents included
| in the open source code. In this case, it still isn't the
| software that is sold, but rather the usage rights of the
| patented intellectual property it contains._
|
| Every day I like the Apache licence more.
| nabla9 wrote:
| BSD+"additional clause" is not BSD.
|
| Just like 3+1 is not 3.
| eli wrote:
| Wouldn't the patent still be a problem with the standard
| BSD license? BSD would grant you license to redistribute
| the software but not necessarily the patent rights to use
| it.
| commandersaki wrote:
| Source? I'm unconvinced. They have been posting stuff about
| implementing HE primitives in Swift as of last year.
| j2kun wrote:
| Zama has already hit competitors with (French) patent
| charges. Apple's HE implementation is in Swift and uses BFV,
| which is very different HE from anything Zama does and
| doesn't use their source.
| commandersaki wrote:
| Yeah that was what I thought. I've seen their engineers
| also push for an employment drive for more engineers in the
| HE space, so I assume they're going to expand its use where
| applicable, building from the ground up.
| k__ wrote:
| Don't they have TEE?
| j2kun wrote:
| Doing it on device would require too much data, and TEEs have
| side-channel vulnerabilities making it difficult to deploy
| securely in prod.
| RicoElectrico wrote:
| While this should been opt-in to avoid bad publicity... As usual
| HN commenters are out of touch (nothing changed since Dropbox
| launched, right?). Local AI-powered photo search is great! By far
| one of my favorite features introduced in smartphones lately.
| Even on Samsung which lags behind a bit. Text indexing, pet ID,
| quite detailed object classes. I use the camera as something of a
| diary and notepad and AI is amazing for organizing 10k+ photos.
| barnabee wrote:
| This is about additional new photo search capabilities that are
| enabled by default and powered by sending (encrypted) data
| derived from your photos to the cloud, not locally powered AI
| search.
| sbuk wrote:
| *Homomorphically encrypted. Be precise. Deliberately
| handwaving it away as "mere" encryption paints a worse
| picture than it actually is. Yes, possibly should have been
| opt-in, and a more human-friendly explanation of how it works
| would help. However, it's not nearly as bad as HN would have
| you believe. Very much a case of them pointing out that the
| emperor is naked, when in fact they have a few shirt buttons
| missing...
| solsane wrote:
| To be fair, it sounds like it's both. Local, AI powered
| 'neural hashing' + SHE
| williamtrask wrote:
| I love homomorphic encryption, but why can't they just do a
| neural search locally?
|
| - iOS Photos -> Vectors
|
| - Search Query "Dog photos" -> Vectors
|
| - Result (Cosine Similarity): Look, some dog photos!
|
| iPhones have plenty of local storage and compute power for doing
| this kind of thing when the phone is idle. And cosine similarity
| can work quickly at runtime.
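| (A toy sketch of the idea in Python/NumPy. The random vectors
| here are stand-ins for real image/text embeddings from a
| model like CLIP; a query embedding close to a photo's
| embedding ranks that photo first:)

```python
import numpy as np

rng = np.random.default_rng(0)

# 10,000 fake 512-dim photo embeddings, normalized to unit length
photo_vecs = rng.normal(size=(10_000, 512)).astype(np.float32)
photo_vecs /= np.linalg.norm(photo_vecs, axis=1, keepdims=True)

# Simulate a text query whose embedding lands near photo #42
query_vec = photo_vecs[42] + 0.1 * rng.normal(size=512).astype(np.float32)
query_vec /= np.linalg.norm(query_vec)

# For unit vectors, cosine similarity is just a dot product
scores = photo_vecs @ query_vec
top5 = np.argsort(scores)[::-1][:5]
print(top5[0])  # 42: the photo the query was derived from
```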
| orf wrote:
| That's what they do
| dialup_sounds wrote:
| Apparently they only do the cloud HE song and dance for
| landmarks, which is too big of a data set to realistically keep
| on-device.
| woadwarrior01 wrote:
| Here's an open source iOS app[1] that does that. Incidentally,
| it's built using Apple's own MobileCLIP[2] models.
|
| [1]: https://github.com/fguzman82/CLIP-Finder2 [2]:
| https://github.com/apple/ml-mobileclip
| jerf wrote:
| Because the blog post needs some sort of concrete example to
| explain, but all concrete examples of fully homomorphic
| encryption are generally done better locally at the moment due
| to the extreme costs of FHE.
| internetter wrote:
| I discuss this in the post:
|
| > This seems like a lot of data the client is getting anyway. I
| don't blame you for questioning if the server is actually
| needed. The thing is, the stored vectors that are compared
| against are by far the biggest storage user. Each vector can
| easily be multiple kilobytes. The paper discusses a database of
| 35 million entries divided across 8500 clusters.
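| Back-of-the-envelope, assuming ~3 KB per vector (the quote
| above only says each vector "can easily be multiple
| kilobytes"):

```python
vectors = 35_000_000          # database entries from the paper
bytes_per_vector = 3 * 1024   # "multiple kilobytes"; 3 KB assumed here
total_gb = vectors * bytes_per_vector / 1e9
print(f"~{total_gb:.0f} GB")  # on the order of 100+ GB: too big for a phone
```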
| AshamedCaptain wrote:
| There is another reason I dislike this, which is that now
| Apple has a pretext for "encrypted" data to be sent randomly,
| or at least every time you take a picture. If in the future
| they silently change the Photos app (a real risk that I have
| really emphasized in the past), they can now silently pass
| along a hash of the photo and no one would be the wiser.
|
| If an iPhone was not sending any traffic whatsoever to the
| mothership, at least it would ring alarm bells if it suddenly
| started doing so.
| commandersaki wrote:
| Isn't this the same argument that they can change any part of
| the underlying OS and compromise the security by exfiltrating
| secret data? Why specific to this Photos feature.
| cryptonector wrote:
| No. GP means that if the app was not already phoning home
| then seeing it phone home would ring alarm bells, but if the
| app is always phoning home if you use it at all then you
| can't see "phoning home" as an alarm -- you either accept it
| or abandon it.
|
| Whereas if the app never phoned home and then upon upgrade it
| started to then you could decide to kill it and stop using
| the app / phone.
|
| Of course, realistically <.00001% of users would even check
| for unexpected phone home, or abandon the platform over any
| of this. So in a way you're right.
| doublerabbit wrote:
| And they do silently change the applications. Maps has
| been updated for me via A/B testing. Messaging too.
| twodave wrote:
| Any app can do this really, just can't update the
| entitlements and a few other things. I would think it
| unlawful for Apple's own apps to have access to
| functionality/apis that others don't...
| JayShower wrote:
| This is so cool! I first learned about homomorphic encryption in
| the context of an election cybersecurity class, and it seemed so
| pie-in-the-sky, something unlikely to ever see general practical
| use outside of very niche areas. Seeing a big tech company apply
| it in a core product like this really does feel like a step in
| the right direction towards taking back some privacy.
| _verandaguy wrote:
| I'm not an expert in homomorphic encryption by any stretch (and
| I'm arguably the target audience for this blog post -- a curious
| novice), but there's one thing I don't quite get from this post.
|
| In the "appeal to cryptographers" section (which I really look
| forward to being fulfilled by someone, hopefully soon!), HE is
| equated to post-quantum cryptography. _As far as I know,_ most
| current post-quantum encryption focuses on the elimination of
| Diffie-Hellman schemes (both over finite fields and over elliptic
| curves), since those are vulnerable to Shor's algorithm.
|
| However, it's clear from the code samples later in the post (and
| not explained in the text, afaict) that a public key gets used to
| re-encrypt the resultant value of a homomorphic add or multiply.
|
| Is this a case of false equivalence (in the sense that HE !=
| post-quantum), or is it more the case that there's some new
| asymmetric cryptography scheme that's not vulnerable to Shor's?
| j2kun wrote:
| All modern HE schemes rely on post-quantum crypto. For example,
| the ring-LWE problem used by BFV is the same as Kyber (ML-KEM)
| but with different parameters.
|
| The twist in FHE is that the server also has an encryption of
| the user's secret key, which adds an assumption called
| "circular security", and that's needed to do some homomorphic
| operations like key switching.
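| (Not BFV or ring-LWE, but a toy plain-LWE, Regev-style sketch
| in Python shows the core trick: ciphertexts add component-wise,
| the hidden noise terms add too, and decryption only works while
| that accumulated noise stays small. Parameters are illustrative
| and nowhere near secure.)

```python
import random

q, n = 2**15, 16  # toy modulus and dimension (NOT secure)
s = [random.randrange(q) for _ in range(n)]  # secret key

def enc(bit):
    # Regev-style LWE encryption of one bit: b = <a, s> + noise + bit * q/2
    a = [random.randrange(q) for _ in range(n)]
    e = random.randrange(-4, 5)  # small noise term
    b = (sum(x * y for x, y in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def dec(ct):
    a, b = ct
    m = (b - sum(x * y for x, y in zip(a, s))) % q
    # what's left is noise + bit*q/2; round to the nearest multiple of q/2
    return 1 if q // 4 < m < 3 * q // 4 else 0

def add(c1, c2):
    # homomorphic addition: decrypts to the XOR of the plaintext bits,
    # but the noise terms add, so only so many additions are possible
    (a1, b1), (a2, b2) = c1, c2
    return [(x + y) % q for x, y in zip(a1, a2)], (b1 + b2) % q

print(dec(add(enc(1), enc(0))))  # 1 (= 1 XOR 0)
print(dec(add(enc(1), enc(1))))  # 0 (= 1 XOR 1)
```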
| _verandaguy wrote:
| Right on, thanks for the explanation!
|
| So what gets called the "public key" in the blog post is just
| the (self?-)encrypted secret key from the user?
|
| I'll read up on your other points after work -- appreciate
| the search ledes :)
| j2kun wrote:
| The public key is also just like a normal public key, but
| the encrypted secret key is often called an evaluation key
| or a key switching key, or some other names. (It's also
| public in the security sense)
| ge96 wrote:
| This is a neat topic I want to get into more myself
|
| Searching encrypted stuff is what I wondered about; in the past I
| had to decrypt everything before I could use a standard SQL LIKE
| search.
|
| Funny post today about cosine similarity
___________________________________________________________________
(page generated 2025-01-15 23:01 UTC)