[HN Gopher] Spiral's Homomorphic Encryption - Is This the Future...
       ___________________________________________________________________
        
       Spiral's Homomorphic Encryption - Is This the Future of Privacy?
        
       Author : lclc
       Score  : 90 points
       Date   : 2022-11-21 09:46 UTC (13 hours ago)
        
 (HTM) web link (www.21analytics.ch)
 (TXT) w3m dump (www.21analytics.ch)
        
       | a_c wrote:
       | Can someone give some pointers to how homomorphic encryption is
       | achieved? I know what it is from a high level, but would love to
        | learn more details. What types of operations does it support,
        | what kind of encryption is used, etc. Thanks!
        
         | j2kun wrote:
         | This is one detailed survey:
         | https://eprint.iacr.org/2021/1402.pdf
         | 
         | The underlying cryptosystems are often LWE (Learning with
         | Errors), RLWE (Ring Learning with Errors) and one called (R)GSW
         | (named after the authors of this paper:
         | https://eprint.iacr.org/2013/340) which is also based on
         | (R)LWE.
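          | 
          | To give a flavor of plain LWE, here is a toy sketch (tiny,
          | insecure parameters chosen only for illustration, not the
          | survey's notation) of encrypting one bit as
          | (a, <a,s> + e + bit*(q/2)):
          | 
          |     import random
          | 
          |     q, n = 3329, 16   # toy modulus and dimension (not secure)
          |     s = [random.randrange(q) for _ in range(n)]  # secret key
          | 
          |     def encrypt(bit):
          |         a = [random.randrange(q) for _ in range(n)]
          |         e = random.randint(-2, 2)   # small noise term
          |         b = (sum(x * y for x, y in zip(a, s))
          |              + e + bit * (q // 2)) % q
          |         return a, b
          | 
          |     def decrypt(a, b):
          |         # subtract <a, s>; what remains is noise + bit*(q/2)
          |         v = (b - sum(x * y for x, y in zip(a, s))) % q
          |         return 1 if q // 4 < v < 3 * q // 4 else 0
          | 
          |     assert decrypt(*encrypt(0)) == 0
          |     assert decrypt(*encrypt(1)) == 1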
        
       | bradleybuda wrote:
       | For decades FHE has been "not quite ready for primetime" and I
       | find it quite exciting that we're now reaching the stage where
        | it's starting to be commercially viable. I truly hope that it
       | doesn't somehow get lumped in with the technical toxic waste site
       | that is "blockchain" along the way.
        
       | mawise wrote:
       | I've been really excited about the potential use of HE for
       | private messaging. Today the most anyone does is end-to-end
       | encryption, which does a great job protecting what you're saying,
       | but it fails to protect who you are talking with. HE has the
       | potential to change that.
       | 
       | The Wikipedia demo[1] starts with a big download because you're
       | fetching an index of articles. Subsequent requests have you send
        | an encrypted one-hot[2] vector marking the article you want to
       | read. The server does an encrypted dot-product of the vector with
       | the vector of articles, returning just the encrypted article
       | you're looking for.
       | 
       | A messaging system could do the same thing, where your vector
       | selects member identifiers or public keys.
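        | 
        | A toy plaintext analogy of that lookup (the real protocol runs
        | the same arithmetic on an encrypted query, so the server never
        | learns which entry was selected; names here are made up, not
        | spiralwiki's actual code):
        | 
        |     articles = [101, 202, 303, 404]   # stand-ins for articles
        |     wanted = 2                        # index the client wants
        | 
        |     # client sends a one-hot selection vector (encrypted in PIR)
        |     query = [1 if i == wanted else 0
        |              for i in range(len(articles))]
        | 
        |     # server: dot product of the query with the whole database
        |     response = sum(q * a for q, a in zip(query, articles))
        | 
        |     assert response == articles[wanted]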
       | 
       | [1]: https://spiralwiki.com/
       | 
       | [2]: https://en.wikipedia.org/wiki/One-hot
        
         | sangel wrote:
         | This is basically what we did in our project:
         | https://www.cis.upenn.edu/~sga001/papers/pung-osdi16.pdf.
         | 
         | We never built it into a product because we couldn't figure out
         | a way to monetize it to pay for the servers.
        
         | nanomonkey wrote:
          | End to end encryption _can_ hide the recipient, at the cost
          | that each recipient has to attempt to decrypt every message
          | with their key to see if they were the intended recipient.
          | This is fairly fast on modern computers, and is how Secure
          | Scuttlebutt works. Note this is only feasible on gossip
          | protocols, pub-sub, or content-addressable hash stores where
          | you only look at the subset of users you follow, rather than
          | inspecting all traffic, which would require some other side
          | channel to indicate which messages should be inspected.
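          | 
          | A rough sketch of that trial decryption (a toy symmetric
          | stand-in, not Scuttlebutt's actual public-key "private box"
          | format): messages carry no recipient field, so a client just
          | tries its own key on everything and keeps what authenticates.
          | 
          |     import hmac, hashlib, os
          | 
          |     def seal(recipient_key, plaintext):
          |         # toy box: keystream from the key plus a MAC, so the
          |         # recipient can tell whether decryption succeeded
          |         nonce = os.urandom(16)
          |         stream = hashlib.sha256(recipient_key + nonce).digest()
          |         ct = bytes(p ^ k for p, k in zip(plaintext, stream))
          |         tag = hmac.new(recipient_key, nonce + ct,
          |                        hashlib.sha256).digest()
          |         return nonce, ct, tag
          | 
          |     def try_open(my_key, msg):
          |         nonce, ct, tag = msg
          |         want = hmac.new(my_key, nonce + ct,
          |                         hashlib.sha256).digest()
          |         if not hmac.compare_digest(tag, want):
          |             return None   # not addressed to me
          |         stream = hashlib.sha256(my_key + nonce).digest()
          |         return bytes(c ^ k for c, k in zip(ct, stream))
          | 
          |     alice, bob = os.urandom(32), os.urandom(32)
          |     feed = [seal(bob, b"hi bob"), seal(alice, b"hi alice")]
          |     # every client scans the whole feed with its own key
          |     print([try_open(bob, m) for m in feed])  # [b'hi bob', None]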
        
         | noam_k wrote:
         | I think I understand what you're suggesting, but keep in mind
         | that in the Wikipedia example the database is largely static
         | (the server decides when to update), while a messaging app
         | needs to support users updating the DB. There are a lot of
         | leakage scenarios that need to be taken into account (like no
         | push notifications).
         | 
         | Spiral has a video[1] where they dive into some of the details.
         | 
         | [1] https://youtu.be/T7RDEEJ5vQg
        
         | anonporridge wrote:
         | Hiding the metadata is indeed very important for the privacy
         | conscious.
         | 
         | To quote an ex-NSA chief "We kill people based on metadata".
         | https://abcnews.go.com/blogs/headlines/2014/05/ex-nsa-chief-...
         | 
         | Metadata is data.
        
       | wslh wrote:
        | How does Spiral differ from companies such as Duality
        | Technologies [1] and DPella [2]?
       | 
       | [1] https://dualitytech.com/
       | 
       | [2] https://www.dpella.io/
        
         | noam_k wrote:
         | Both Spiral and Duality use homomorphic encryption (HE).
         | DPella, however, uses differential privacy (DP), which allows
         | for a different set of applications.
         | 
          | As a rule of thumb, HE allows you to offload an intensive
          | computation (or one that requires a private model) to a server
          | that you don't trust. Only you can decrypt the results. DP, on
          | the other hand, lets the server analyze big data and use the
          | results. Privacy is achieved by making each input "noisy", so
          | little information is leaked about any individual. The
          | statistics still work because over a large data set the noise
          | cancels out.
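          | 
          | A minimal sketch of the DP side of that (local noise with an
          | arbitrary scale, not a calibrated epsilon): each value gets
          | Laplace noise before it is reported, yet the average over
          | many users stays close to the true one.
          | 
          |     import random
          | 
          |     true_vals = [random.gauss(50, 10) for _ in range(100_000)]
          | 
          |     def noisy(v, scale=20.0):
          |         # Laplace noise as a difference of two exponentials,
          |         # so one reported value says little about its owner
          |         noise = (random.expovariate(1 / scale)
          |                  - random.expovariate(1 / scale))
          |         return v + noise
          | 
          |     reported = [noisy(v) for v in true_vals]
          |     print(sum(true_vals) / len(true_vals),
          |           sum(reported) / len(reported))  # nearly identical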
        
       | olah_1 wrote:
       | This could be used with Nostr[1] to add a ton of privacy! The
       | added privacy would even be a reason to pay for using the server.
       | 
       | As of now, a lot of privacy is lost when you actually look at
       | Nostr events. There are servers that check to see if a user has
       | paid before they execute the request too[2].
       | 
       | 1: https://github.com/nostr-protocol/nostr
       | 
       | 2: https://github.com/fiatjaf/expensive-relay
        
         | blintz wrote:
         | Yeah, we think private lookup could make distributed retrieval
         | protocols like IPFS, BitTorrent, etc significantly more
         | private. Currently, they are in some ways much worse from a
         | privacy perspective than centralized alternatives, since they
         | involve broadcasting information about each retrieval to a
         | large number of peers; the ability to do private lookups could
         | really help fix this.
        
       | xphos wrote:
       | Does this limit the type of operation you can do. I could imagine
       | if you were allowed decision operators you would completely
       | defeat the encryption. This makes me think there is a a limit on
       | some of the algorithms that you can actually run. I saw in the
       | comments about Differential Privacy which sounds like it
       | overcomes this issue but I am curious on what more educated
       | people how to say about the operation sets available in a
       | Homomorphic environment
        
         | noam_k wrote:
         | When using Homomorphic Encryption you need to compile your
         | application to a circuit. This means that for branches you need
         | to evaluate both sides and multiply by a bit (like a
         | multiplexor). This way you preserve privacy at the price of
         | heavy computation on the server side.
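          | 
          | A plaintext illustration of that multiplexer trick (under HE
          | the same arithmetic runs on ciphertexts, and the server never
          | sees the condition bit):
          | 
          |     def branch_as_mux(cond_bit, x, y):
          |         # instead of `if cond: x + y else: x * y`, evaluate
          |         # both sides and blend them with the (secret) bit
          |         then_val = x + y
          |         else_val = x * y
          |         return cond_bit * then_val + (1 - cond_bit) * else_val
          | 
          |     assert branch_as_mux(1, 3, 4) == 7    # "then" side
          |     assert branch_as_mux(0, 3, 4) == 12   # "else" side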
        
           | j2kun wrote:
           | +1, and some compilers already exist to do that for you. See,
           | e.g., Google's compiler (which I work on).
           | https://github.com/google/fully-homomorphic-encryption
        
           | a1369209993 wrote:
           | > This means that for branches you need to evaluate both
           | sides
           | 
           | Not true in general, since you can reuse the multiplexor
           | multiple times _during_ the evaluation, to produce
           | essentially a circuit-wise least-common-multiple of the two
           | sides of the branch. Eg, if one side performs two
           | multiplications, and the other a multiplication and a
           | division, you only need to evaluate a division and two
           | multiplications, not a division and three multiplications. So
           | "evaluate both sides" is a worst-case upper bound on the
           | amount of computation.
           | 
           | Loops are still a pain in the ass, though.
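            | 
            | For instance, with that example (and glossing over the fact
            | that division under FHE is itself nontrivial), a sketch of
            | sharing the common multiplication between the two sides:
            | 
            |     def branch_shared(cond, x, y, z):
            |         # then-branch: (x*y)*z  -> two multiplications
            |         # else-branch: (x*y)/z  -> a multiply and a divide
            |         p = x * y              # shared by both sides
            |         then_val = p * z       # second multiplication
            |         else_val = p / z       # the single division
            |         return cond * then_val + (1 - cond) * else_val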
        
       | jchw wrote:
       | I'm a nerd without much academic background interested in
       | cryptographic techniques to improve privacy. For example,
       | techniques like PAKEs offer interesting privacy tradeoffs and
       | allow for E2EE as seen in software like password managers.
       | 
       | One big obstacle with E2EE, though, is that it relies on clients
       | to do basically all of the computations. But, among other things,
       | there are situations where you might imagine wanting to be able
       | to allow an operation to be completed without needing both
        | clients to actively participate, without revealing key material
        | directly.
       | 
       | Examples of FHE seem to stick to fairly simple things, but a lot
       | of the more modern demos show off more interesting capabilities.
       | What I wonder is, what is practical today using today's stacks?
       | For example, could a server blindly perform cryptographic
       | operations under the veil of FHE, potentially using parameters
       | from multiple parties?
       | 
       | It seems like, if FHE proves to be robust and sufficiently
       | secure, it has a lot of potential, and I really wonder what can
       | be done with it today. I've made some effort to explore, but not
       | being an academic a lot of it has been pretty difficult to grok.
        
         | blintz wrote:
         | I think FHE is so powerful that we tend to let the cool
         | possibilities distract from really practical and useful stuff
          | that is possible today. Today, I think just the ability to do
          | private lookups, while super simple from an academic lens,
          | could be really powerful. Plus, if it gets widely used, the
         | underlying tech will mature, and then the more exotic stuff
         | (multiple clients, more complex computations) becomes more
         | realistic.
         | 
         | We're currently building a service that will let you do private
         | lookups without needing to really mess with the underlying
         | cryptography or schemes. You'll be able to use it to deliver
          | even _stronger_ privacy than E2EE ("E2EE+"?). For example, DNS
         | that doesn't learn what you resolve (this is beyond something
         | like DNS-over-HTTPS), or a messaging service that doesn't learn
         | who you talk to.
         | 
         | As far as learning more, I wrote a blog post that tries to
         | cover the basics of doing private lookups:
         | https://blintzbase.com/posts/pir-and-fhe-from-scratch/.
        
       | oulipo wrote:
       | How does that differ from Zama.ai?
        
         | haarts wrote:
         | I was not familiar with it. Thank you.
        
       | motohagiography wrote:
        | The challenge I found with FHE in practice (where I have been
        | asked to anticipate its availability as part of system
        | architecture) is that even with demo code available, without
        | some certification body acknowledging proofs of its security and
        | a blessing from a risk-nullifying entity like NIST, it wasn't
        | going to get traction.
       | 
       | I like this article's crypto wallet use case, and it may be worth
       | codifying transactions that FHE protects. The {who, what, when,
       | where, why, how} of a transaction has a lot of data, and what
       | this SpiralDB does is protect {what}, although {who, when, where,
       | why, how} are available, so you need to articulate the use case.
       | 
       | The one I worked on was for health information, but that case is
       | essentially nullified now, as the pandemic was leveraged to
       | squeeze the data toothpaste out of the tube in major
       | jurisdictions, and so the data sets FHE was going to be a big
       | solution for have been accessed using a political/process
       | solution without the limitations of a technical one. The main use
        | case for FHE was to facilitate individual privacy, which is
       | essentially a limit on state discretion and powers that
       | facilitated data access through strict legal frameworks, but a
       | lot of data governance was completely compromised and gutted over
       | the pandemic, so I no longer foresee demand for FHE in this new
       | era of aggressively technocratic policy where the reason to use
        | FHE isn't enforced. The tech is inseparable from the policy in
       | this domain, and the rug has been pulled out from under the
       | policy, imo.
        
         | aftbit wrote:
         | >a lot of data governance was completely compromised and gutted
         | over the pandemic
         | 
         | Can you provide more information for those of us who are
         | interested in the intersection of health and data privacy but
         | don't work in the space?
        
           | motohagiography wrote:
           | It's a pretty niche field. I would recommend reading privacy
           | legislation in your state or country, and/or the syllabus for
           | the CIPP certifications, which are for privacy professionals.
        
         | blintz wrote:
          | Yeah, there is a bit of a cold-start problem with respect to
         | standards / certification. People kinda have to widely use
         | something before it seems worthwhile to standards bodies to
         | write standards, but often folks are sensibly cautious about
         | using non-standardized cryptography. The solution is to get
         | lots of eyeballs on it, get large organizations to really want
         | to use it, and use that push to get standards rolling.
         | 
         | Personally, I think highly regulated fields like health care
         | etc will adopt this technology extremely slowly. Academic
         | cryptographers really like health care applications but, as you
         | said, in practice, compliance is the main objective of health
         | care organizations.
         | 
         | We are more interested in applications where privacy is
         | actually a value add or a liability minimizer. For example, a
          | VPN using our service could differentiate itself or charge
          | users more by offering a completely private DNS option. A
          | crypto wallet could actually advertise (and perhaps even
          | prefer!) that it doesn't spy on you.
        
       | blintz wrote:
       | Hey, creators here, cool to see people excited about this! We're
       | in the YC W23 batch, happy to answer any questions folks have.
        
         | qayxc wrote:
         | Do you use any special hardware (like FPGAs) to mitigate the
         | increase in computational cost or do you rely on standard
         | hardware?
         | 
         | I'm very interested in FHE in the context of machine learning
         | models without requiring access to unencrypted data at any
         | stage (be that training or inference). So far, the performance
         | hit wouldn't make this practical, so I was wondering whether
         | maybe hardware solutions exist to deal with that.
        
       | aliqot wrote:
        | Are there any popular homomorphic encryption libraries out there
        | that are generally regarded as safe? I want to read how they're
        | implemented.
        
         | zcw100 wrote:
         | OpenFHE is one https://www.openfhe.org
        
           | aliqot wrote:
           | Thanks!
        
       | plonk wrote:
        | Interesting thread by Matthew Green about HE and privacy:
       | https://ioc.exchange/@matthew_d_green/109383332045800279
        
       | GGO wrote:
       | For a second I thought I completely lost track of time when I
       | read the date on the article.
        
         | EGreg wrote:
         | " Is This the Future of Privacy?
         | 
         | 31 Nov, 2022"
         | 
         | For now, it seems to be. LOL
        
       ___________________________________________________________________
       (page generated 2022-11-21 23:01 UTC)