[HN Gopher] Claude Shannon: The Mathematical Theory of Cryptography
___________________________________________________________________
Claude Shannon: The Mathematical Theory of Cryptography
Author : declain
Score : 179 points
Date : 2021-03-17 12:39 UTC (10 hours ago)
(HTM) web link (evervault.com)
(TXT) w3m dump (evervault.com)
| dwpdwpdwpdwpdwp wrote:
| This is a really good example of a paper that is both
| mathematically rigorous and actually readable.
| CliffStoll wrote:
| Just look at the people on the routing list!
| Hendrik Bode (Bode plot), Harry Nyquist (Nyquist
| frequency), Barney Oliver (pulse-code modulation &
| founded HP Labs), John Pierce (Pierce oscillator &
| science fiction), Ralph Hartley (Hartley transform!),
| Walter Shewhart (statistical quality control / Shewhart cycle)
| It was my honor to have met several of these people in sometimes
| odd circumstances.
| ShaneCurran wrote:
| Hey Cliff! We haven't met, but I'm the proud owner of a Klein
| Bottle (and a couple "Portraits of Gauss"!) :-)
|
| Thanks for reading! Some of these great innovators will be
| making an appearance in Evervault Papers in the near future.
| Stay tuned!
| CliffStoll wrote:
| Yikes! There's no hiding around here!
|
| I'm happy to see developers looking at security as pervasive:
| "everything encrypted" is important since an intruder may be
| inside the system. Or as Shannon wrote in this paper, "assume
| the enemy knows the system being used." (He then writes about
| the importance of key selection.)
|
| Warm wishes - Cliff
| lisper wrote:
| > Yikes! There's no hiding around here!
|
| Not with that user name there isn't :-)
| Isamu wrote:
| This 1945 paper was classified; the 1949 version was
| declassified. It is interesting to see how his thinking on
| information theory was influenced by the wartime needs
| surrounding cryptography. Bell Labs, of course, was involved
| in many wartime technologies.
| ShaneCurran wrote:
| Hi all, founder of Evervault here -- we're building encryption
| infrastructure for developers.
|
| Cryptography is at the core of what we do. Evervault Papers is
| our way of continuing the legacy of cryptography giants like
| Shannon.
|
| We're posting one new paper on evervault.com/papers each week and
| this is our first issue. Subscribe to get a cryptography paper in
| your inbox every Thursday!
| sohkamyung wrote:
| Providing an RSS feed would be nice.
| ShaneCurran wrote:
| Good suggestion! We'll set this up.
| ShaneCurran wrote:
| Hey, just to follow up on this: we shipped RSS for Papers.
|
| https://evervault.com/api/rss
|
| Let us know if you've any questions/issues with it!
| efxz wrote:
| Shane,
|
| I suggest you focus on the product and customers, not
| on the papers. With < 10 people you can't do much, and I
| personally think you take on too many tasks yourself...
| Hire people and keep growing! Wishing you success!
| LMYahooTFY wrote:
| Jesus that was quick lol.
|
| Great job!
| betterunix2 wrote:
| Can you explain what exactly you do, other than saying that
| cryptography is at its core? Your website is a bit light on
| details. You say you encrypt data and can process encrypted
| data -- are you talking about TEEs, MPC, FHE, or something else
| entirely?
| ShaneCurran wrote:
| Sure! We build tooling that lets developers encrypt data
| before it hits their infrastructure (Relay) and which lets
| them process that encrypted data at a later date (Cages).
|
| We manage keys, but we don't store data.
|
| All crypto operations and encrypted data processing happen
| inside TEEs (AWS Nitro Enclaves, specifically [0]). Using
| Relay, you can pass data on to trusted third parties over
| TLS. With Cages, you can deploy custom code inside a TEE
| which can process data in whichever way you need.
|
| For developers who don't want plaintext data on our
| infrastructure, we also provide SDKs which let them encrypt
| data using our PKI scheme -- on their own infrastructure.
|
| [0]: https://press.aboutamazon.com/news-releases/news-
| release-det...
| betterunix2 wrote:
| Thanks! I design MPC protocols for real-world applications
| and like to stay up to date on what else is happening out
| there.
| kyoji wrote:
| I've implemented a toy version of a 3+-party MPC protocol in
| graduate school, specifically private set intersection.
| Would you mind sharing what kind of MPC protocols you
| design and if you can for what types of applications? I
| don't often see this discussed on HN and my curiosity is
| piqued!
| betterunix2 wrote:
| Two-party set intersection and variants (intersection-
| sum, etc.), federated learning (secure aggregation) and
| its variants, and several things that are not yet public.
| I also did some work on anonymous trust tokens, which is
| kind of like a generalization of privacy pass that is
| meant to replace cookies for conveying e.g.
| whitelist/blacklist information. For the most part my
| work involves companies doing some kind of statistical
| analysis of joint data sets while maintaining some
| privacy constraint. Some of the work involves analyzing
| ad effectiveness, some involves public health, some
| involves machine learning, and there is a long tail of
| obscure applications that were deployed as a one-off.
| Resource constraints are the biggest technical challenge,
| but a bigger problem I and the rest of the people I work
| with face is lack of awareness or poor understanding of
| MPC (people often assume it is just a variant of DP, or
| that it is a blockchain something or other, or that it is
| totally impractical, etc.).
| kyoji wrote:
| This is super exciting for me, I am very interested in
| MPC/PSI but I haven't been introduced to much about it
| outside of academia. A ton of potential applications
| obviously but limited by computational power, as I
| understand it. Would you mind sharing what company(ies)
| you work with/for? If you can't or don't want to disclose
| publicly you can email me: kyoji1@gmail.com or
| jowens17@fau.edu. I would love to hear more!
|
| Here's my PSI project if interested:
| https://github.com/dowensagain/EfficientMultiPartyPSI
| benlivengood wrote:
| Anything worthwhile in fully homomorphic encryption yet?
| I keep seeing the tools get faster but security is still
| relatively unknown compared to modern
| symmetric/asymmetric ciphers. There are also several
| interesting papers on anonymous/garbled circuit
| evaluation that I'm assuming will lead to even better
| untrusted third-party computation services. What I'm
| waiting for is FHE/circuits/something that can
| selectively decrypt some of their own outputs.
| betterunix2 wrote:
| FHE security is reasonably well understood but not as
| well understood as EC or RSA/DH security. For the most
| part today's FHE systems are all based on the (R)LWE
| problem and the hardness of that problem is not in doubt
| _for the right parameter choices_ (though choosing the
| right parameters is a careful balancing act).
|
| It is unlikely (in my opinion) that "true" FHE
| applications will be deployed any time soon, but
| "leveled" FHE applications are already being deployed for
| a small number of levels (e.g. 2). Beyond quartic
| functions the performance is probably going to be too
| much of a problem for most applications. Homomorphic
| encryption in general is commonly used as a building
| block in larger MPC systems and you will probably see
| more widespread use of leveled FHE as such a building
| block too.
|
| As for selectively decrypting outputs, that sounds like
| functional encryption and it is still an active area of
| research (see also obfuscation, which was a hot topic a
| few years ago). I doubt you will see practical
| applications for a very long time.
| criddell wrote:
| Thank you for not making readers trade their email or sign up
| for a mailing list to get the paper.
| itcrowd wrote:
| On your homepage, evervault.com, you have an example of
| calculating the BMI in one of the images which is:
|
| > weight / Math.sqrt(height);
|
| In fact, the BMI is weight / (height)^2.
| edmundo wrote:
| That should be fixed now, thanks!
|
| PS: that's what happens when designers code :P
| ShaneCurran wrote:
| Ah, thanks for the heads up!
|
| We'll fix this ASAP
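For reference, the corrected formula can be sketched in plain JavaScript (the `bmi` helper is illustrative, not the site's actual code):

```javascript
// BMI is weight (kg) divided by the *square* of height (m) --
// the buggy example divided by Math.sqrt(height) instead of height ** 2.
function bmi(weightKg, heightM) {
  return weightKg / (heightM ** 2);
}

console.log(bmi(70, 1.75).toFixed(1)); // "22.9"
```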
| wombatmobile wrote:
| "My greatest concern was what to call it. I thought of calling it
| 'information', but the word was overly used, so I decided to call
| it 'uncertainty'. When I discussed it with John von Neumann, he
| had a better idea. Von Neumann told me, "You should call it
| entropy, for two reasons. In the first place your uncertainty
| function has been used in statistical mechanics under that name,
| so it already has a name. In the second place, and more
| important, nobody knows what entropy really is, so in a debate
| you will always have the advantage."
|
| -- Claude Shannon
| [deleted]
| neonological wrote:
| It's true. Basically, most people don't understand entropy.
| Both the scientific method and entropy are phenomena arising
| from the assumption that probability is real. If you
| understand that entropy is merely a consequence of
| probability, then you understand entropy.
|
| I think the thermodynamic law throws everyone off. Entropy is
| not the axiomatic law. The law is probability, and this same
| law also powers our science.
| ravi-delia wrote:
| Of course, depending on which direction you feel like drawing
| the arrows, probability and entropy (thermodynamic) are
| caused by entropy (information). God seems to care very much
| about preventing cheating in computation, for no particular
| reason at all.
|
| Or, conversely, God cares very much about the universe having
| an arrow of time without any notable time-asymmetric
| properties, and had to make up all sorts of information
| patches to make that happen. In that case, probability flows
| downstream from thermodynamics.
| lisper wrote:
| You might find this interesting:
|
| http://blog.rongarret.info/2014/10/parallel-universes-and-
| ar...
| neonological wrote:
| Information entropy is formally defined in terms of
| probability. Are you implying that some sort of inverse is
| possible and probability can be derived in terms of
| entropy?
|
| Could you write or cite what the formula would be in this
| case?
|
| It appears to me that the formula for entropy itself
| recursively suffers from rising entropy. Thus you cannot
| reverse it... You cannot derive individual probabilities of
| each state from just entropy. Source:
| https://en.wikipedia.org/wiki/Entropy_(information_theory)
|
| (in terms of the mathematical formula from Wikipedia, by
| saying you can't reverse it, I am saying that no individual
| P(xi) can be determined when you are just given H(X) as an
| input)
|
| Thus because of the reasoning above, probability must be
| the axiomatic source of entropy.
|
| It makes sense intuitively. Entropy is the macro phenomenon
| arising from the micro phenomenon of probability.
|
| The law that numerically higher-probability events are more
| likely to occur than numerically lower-probability events
| when following the arrow of time is the more fundamental
| explanation of what's going on here. Entropy is simply a
| macro-level numerical summary of these probabilistic events
| happening in aggregate.
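The irreversibility point can be made concrete with a small JavaScript sketch (the `entropy` helper is just a direct coding of the standard formula): two distinct distributions can share the same H(X), so the individual P(xi) are not recoverable from H(X) alone.

```javascript
// Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.
// Terms with p_i = 0 contribute 0 and are skipped.
function entropy(probs) {
  return -probs
    .filter((p) => p > 0)
    .reduce((sum, p) => sum + p * Math.log2(p), 0);
}

// Distinct distributions, identical entropy -- H(X) is a many-to-one map:
console.log(entropy([0.1, 0.9]).toFixed(4)); // "0.4690"
console.log(entropy([0.9, 0.1]).toFixed(4)); // "0.4690"
console.log(entropy([0.5, 0.5]));            // 1 (the maximum for two outcomes)
```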
| st_goliath wrote:
| > I think the thermodynamic law throws everyone off. Entropy
| is not the axiomatic law. The law is probability,
|
| Yes, and I think the other big blunder often committed is
| explaining entropy in thermodynamics as a measure of
| "disorderliness". You might get away with that if you want to
| explain diffusion processes to a class of high-school
| students, but even then the idea quickly falls flat on its
| face once they realize that in chemistry things crystallize
| out, often after _giving off_ thermal energy. (Also,
| "disorderliness" is a fairly subjective concept.)
| not2b wrote:
| No, that doesn't follow: entropy can decrease provided the
| reaction is exothermic enough. For a spontaneous reaction to
| occur, the relevant quantity isn't the entropy but the Gibbs
| free energy. So reactions that reduce entropy (because
| "things crystallize out") can take place if the enthalpy
| decreases enough to compensate.
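The Gibbs criterion being described fits in a few lines of JavaScript (the numbers are illustrative, not from any real reaction): ΔG = ΔH − T·ΔS, and the process is spontaneous when ΔG < 0.

```javascript
// Gibbs free energy change: dG = dH - T * dS (kJ/mol, T in K, dS in kJ/(mol*K)).
// A reaction is spontaneous when dG < 0, so an entropy decrease (dS < 0)
// can be outweighed by a sufficiently exothermic enthalpy change (dH < 0).
function gibbs(dH_kJ, T_K, dS_kJperK) {
  return dH_kJ - T_K * dS_kJperK;
}

// Crystallization-like case: entropy drops, yet the process is spontaneous.
console.log(gibbs(-100, 298, -0.1).toFixed(1)); // "-70.2"
```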
| neonological wrote:
| A system of loaded dice is a good counterexample to
| "disorder". Ten dice weighted to always roll 6 trend
| toward our intuitive notion of order, simply because 6 is
| the most likely outcome.
|
| In this case rolling the ten dice increases order, yet that
| order represents increasing entropy.
___________________________________________________________________
(page generated 2021-03-17 23:01 UTC)