[HN Gopher] Hell is overconfident developers writing encryption ...
___________________________________________________________________
Hell is overconfident developers writing encryption code
Author : zdw
Score : 103 points
Date : 2025-02-01 03:21 UTC (4 hours ago)
(HTM) web link (soatok.blog)
(TXT) w3m dump (soatok.blog)
| Aeolun wrote:
| The reason overconfident developers roll their own crypto is
| because the 'cryptography experts' are such asshats about it. Who
| wants to deal with someone that writes this?
|
| Hell, even identifying which 'cryptography expert' actually knows
| what they're talking about and which ones are windbags is often
| more trouble than it's worth.
| tptacek wrote:
| I don't love the post either but getting angry about it doesn't
| make it less right.
| MarcelOlsz wrote:
| >Hell, even identifying which 'cryptography expert' actually
| knows what they're talking about and which ones are windbags is
| often more trouble than it's worth.
|
| If they look like they spent too much time in the sun you can't
| trust them. This criterion has yet to fail me.
| YZF wrote:
| The more nuanced message is maybe don't write crypto code
| without having spent the right amount of time to understand the
| challenges. If you don't know about timing attacks, you aren't
| familiar with the many historical mistakes that are well known,
| you don't know any theory around ciphers or hashes, then you
| probably shouldn't be writing crypto code that is used to
| secure anything important. There are a lot of ways to get up to
| speed (Coursera's cryptography courses from Stanford, books
| etc.) to at least wrap your head around why this is a problem.
| Even the experts go through peer review cycles because they
| know how easy it is to make mistakes.
|
| In other news, don't print yourself a motorcycle helmet on your
| 3d printer, don't make your own car tires, don't make your own
| brake pads. I mean, if you really know what you're doing, maybe.
|
| EDIT: I have some personal experience with screwing up some
| crypto code. I won't get into all the details but I was
| overconfident. And this was after taking graduate level
| cryptography courses and having spent considerable time keeping
| up to date. I didn't write anything from scratch but I used
| existing algorithms/implementations in a problematic way that
| left some vulnerability. This was a type of system where just
| using something off the shelf was not an option.
| hn_throwaway_99 wrote:
| I just skimmed the article, but I generally agree. I mean, I
| get it - I think it's actually pretty easy to make an argument
| "Secure crypto code is exceedingly difficult to get right, even
| for experts. And unlike lots of other areas of code, there is
| no 'halfway broken' - crackable crypto is 'game over' in a lot
| of situations, which is why it's so dangerous to roll your
| own."
|
| But there's a huge gulf between that message and "Hell Is
| Overconfident Developers..." And importantly, I don't think
| that "overconfident developers" is the biggest problem. If
| anything, even when I very much don't want to "roll my own"
| crypto, I've found it difficult to be sure I'm using crypto
| libraries correctly and securely. Sometimes libraries are too
| low-level with non-obvious footguns, or they're too "black box"
| restrictive. I think a great example is the NodeJS crypto
| library. I think it's a great library, but using it correctly
| requires people to know and understand a fair amount about
| crypto. Whenever I use node crypto I pretty much always end up
| writing some utility methods around it (e.g. "encrypt this
| string using a key to get cipherText" and "decrypt this
| ciphertext using this key"). I think it would be better if
| those standard utility wrappers (that do things like use
| recommended default encryption algorithms/strengths, etc.) were
| also included directly in the crypto library.
|
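| A minimal sketch of the kind of wrapper I mean, assuming Node's
| built-in crypto module and AES-256-GCM (the function names here
| are just illustrative, not from any particular library):
|
|     import {
|       randomBytes, createCipheriv, createDecipheriv,
|     } from "crypto";
|
|     // Encrypt a UTF-8 string under a 32-byte key.
|     // Output layout: iv || auth tag || ciphertext, base64-encoded.
|     export function encryptString(text: string, key: Buffer): string {
|       const iv = randomBytes(12); // fresh random nonce per message
|       const cipher = createCipheriv("aes-256-gcm", key, iv);
|       const ct = Buffer.concat([
|         cipher.update(text, "utf8"), cipher.final(),
|       ]);
|       const tag = cipher.getAuthTag(); // valid only after final()
|       return Buffer.concat([iv, tag, ct]).toString("base64");
|     }
|
|     export function decryptString(payload: string, key: Buffer): string {
|       const buf = Buffer.from(payload, "base64");
|       const iv = buf.subarray(0, 12);
|       const tag = buf.subarray(12, 28);
|       const ct = buf.subarray(28);
|       const decipher = createDecipheriv("aes-256-gcm", key, iv);
|       decipher.setAuthTag(tag); // GCM verifies this tag in final()
|       return Buffer.concat([
|         decipher.update(ct), decipher.final(),
|       ]).toString("utf8");
|     }
|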
| I think most developers are actually scared of rolling their
| own crypto, but cryptography experts don't make it easy to give
| them a plug-and-play solution.
| Uptrenda wrote:
| Based comment, updooted. Also, I'm pretty sure that if nobody
| ever "rolled their own crypto" we would never have new
| innovations in cryptography. By OP's argument everyone in the
| blockchain industry is in a state of error since using applied
| cryptography to design new protocols is certainly novel. But I
| would say that these protocols are what made me interested in
| applied cryptography in the first place. They are astonishingly
| clever, novel, and show how beautiful cryptography can be. If
| everyone was too scared to work with cryptographic constructs
| then none of the innovation that took place around smart
| contracts would be possible.
|
| I see no problem with it if your work gets peer reviewed. The
| very last point they made, though, I kind of agree with. If you
| end up making a new algorithm from scratch, that is dangerously
| novel and potentially crackpot territory. I did like how they
| went to that trouble to make the hierarchy of what is meant by
| "rolling your own crypto."
| tptacek wrote:
| This is a relatively long post that is kind of beating around the
| bush: these developers believed that OpenSSL was a trustworthy
| substrate on which to build a custom cryptosystem. It's not; this
| is why libraries like Tink and Sodium exist. They don't really
| need much more lecturing than "stop trying to build on OpenSSL's
| APIs."
| arthurcolle wrote:
| tptacek I don't want to waste your time, but do you have any
| good recommendations for material that bridges the gap between
| modern deployed cryptosystems and the current SOTA quantum
| computing world in enough detail to be useful for an engineering
| practitioner preparing for the next 10 years?
| tptacek wrote:
| Nope. It's at times like these I'm glad I've never claimed I
| was competent to design cryptosystems. I'm a pentester that
| happens to be able to read ~30% of published crypto attack
| papers. My advice is: ask Deirdre Connolly.
|
| My standard answer on PQC, about the quantum threat, is:
| "rodents of unusual size? I don't think they exist."
| pclmulqdq wrote:
| I have become a bit of a cryptographer (after running a
| cryptography-related company for a while), and aside from
| joke thought experiments, I am one of the most conservative
| cryptographic programmers I know.
|
| I'm personally pretty skeptical that the first round of PQC
| algorithms have no classically-exploitable holes, and I
| have seen no evidence as of yet that anyone is close to
| developing a computer of any kind (quantum or classical)
| capable of breaking 16k RSA or ECC on P-521. The problem I
| personally have is that the lattice-based algorithms are a
| hair too mathematically clever for my taste.
|
| The standard line is around store-now-decrypt-later,
| though, and I think it's a legitimate one if you have
| information that will need to be secret in 10-20 years.
| People rarely have that kind of information, though.
| some_furry wrote:
| I agree. (I'm a quantum skeptic personally.)
| jasonjayr wrote:
| We were doing an integration with a partner for our customers,
| and the contact I was working with insisted on using some OpenSSL
| primitives that were exposed in PHP:
|
| (a) they reversed the public + private parts of the key, and were
| upset when I communicated the public part of the key in cleartext
|
| (b) they specced that the string being encrypted could not exceed
| 8 bytes ......
|
| I tried so very hard and very patiently to explain to them what
| they were doing wrong, but they confidently insisted on their
| implementation. To deter fellow devs from trying this, I left
| loud comments in our code:
|
| > So these guys are totally using RSA Crypto wrong. Though it's a
| PK Crypto system, they insist on using it backwards, and using
| signatures to send us encrypted values, and we send encrypted
| values back to them. It's dumb. I suspect someone read through
| the PHP openssl function list, spotted RSA_encrypt_private and
| RSA_decrypt_public and decided to get overly clever.
|
| > This consumes a public key, and uses it to 'decrypt' a
| signature to recover its original value.
|
| > To further deter use, I will not add additional documentation
| here. Please read and understand the source if you think you need
| to use this.
| jiggawatts wrote:
| Reminds me of a vendor providing an XML-RPC API with their
| enterprise product. The customer had a requirement that all PII
| information be encrypted in transit, and this was used to
| send personal information about minors.
|
| I expected them to simply turn on HTTPS like normal people.
|
| Instead after months of effort they came back with XML
| Encryption. No, _not the standardised one_, they cooked up
| their own bespoke monstrosity with hard-coded RSA keys with
| both the public and private parts published in their online
| documentation. The whole thing was base-64 encrypted XML
| _inside_ more XML.
|
| I flat rejected it, but was overruled because nobody involved
| had the slightest clue what proper encryption is about. It
| _looked_ complicated and thorough to a lay person, so it was
| accepted despite my loud objections.
|
| This is how things happen in the real world outside of Silicon
| Valley.
| tbrownaw wrote:
| There was at least _also_ TLS on top of that, yes?
| jiggawatts wrote:
| Of course not! Plain-text HTTP only, because "that's
| simpler to debug".
| crabmusket wrote:
| > The whole thing was base-64 encrypted XML
|
| Surely you mean "base64 encoded"
| 77pt77 wrote:
| And you just know everyone saw you as the out of touch, anal
| retentive technical nerd that has it out for the real team
| players.
| Volundr wrote:
| > they reversed the public + private parts of the key
|
| Is there a reason this is actually a problem? I always thought
| the public/private key was arbitrary and it was easier and less
| error prone to give people one key called public and one called
| private than hand them two keys and say "ok keep one of these a
| secret".
|
| Don't get me wrong, not defending anyone here, just curious if
| there's more to it I don't know.
| curling_grad wrote:
| I'd assume that the value intended for public exponent is
| used as private key exponent. Typically, public key exponent
| is very small compared to private key exponent. This means
| that the private key exponent is very small in their scheme,
| so attacks such as Wiener's attack[0] can be used to break
| the encryption.
|
| Also, I'd like to add that public exponent is usually fixed
| to some well-known constant such as 65537, so the attacker
| might just try brute-forcing when she knows the details of
| the scheme.
|
| [0]: https://en.wikipedia.org/wiki/Wiener%27s_attack
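|
| (Roughly, from memory: Wiener's attack recovers the private
| exponent d from the public key alone whenever d < (1/3) * N^(1/4),
| and a swapped-in d = 65537 is astronomically below that bound for
| any realistic modulus, so it could also simply be brute-forced.)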
| Sytten wrote:
| I didn't understand the point he was trying to make about trusting
| a public key from a remote server. At some point you need to trust
| some third party public key if you want to send them encrypted
| data and verifying ownership is kinda left to user. Hell even
| signal does that, who is really checking their contact security
| numbers to make sure the signal server didn't send you some
| bullshit...
| some_furry wrote:
| > didn't understand the point he was trying to make about
| trusting a public key from a remote server.
|
| What stops the server from swapping SyttenPK for NSA PK?
|
| The operative word in the quote was "just".
|
| > Hell even signal does that, who is really checking their
| contact security numbers to make sure the signal server didn't
| send you some bullshit...
|
| Their latest code commits include key transparency, which is
| one good way to address this problem.
| Sytten wrote:
| I do get that, but I am not so sure what the real solution
| is. The industry standard still is some form of "the server
| sends me a key and I trust it".
|
| Key transparency is not really solving anything for the
| average person.
| some_furry wrote:
| The problem isn't "the server sends me a public key".
|
| The problem is there's no way to know if the server is
| lying.
| voxelc4L wrote:
| ... or any code?
| devmor wrote:
| s/Encryption/Security
|
| Roll-your-own cryptography is definitely the worst of it, but
| even developers that use strong crypto libraries end up misusing
| them quite often.
|
| I have worked on a _lot_ of custom made web software, and I could
| count the number of in-house authentication or authorization
| systems that _didn't_ have glaring issues on one hand.
|
| After nearly 12 years of working in this field, I would consider
| myself extremely knowledgeable in web security and I still don't
| think I'd be comfortable writing a lot of these types of systems
| for production use without an outside analyst to double check my
| work. Unless you're a domain expert in security, you probably
| shouldn't either.
| userbinator wrote:
| The insecurity is freedom. Think of that again when you need to
| break out of Big Tech's control.
| tptacek wrote:
| What? Say more.
| Dibby053 wrote:
| The post links to this GitHub issue [1] where the critic explains
| his issues with the design and the programmer asks him to
| elaborate on how those crypto issues apply to his implementation.
| The critic's reply does not convince me. It doesn't address any
| points, and refers to some vague idea about "boring
| cryptography". In what way is AWS secrets manager or Hashicorp
| Vault more "obviously secure" than the author's 72-line
| javascript file?
|
| [1] https://github.com/gristlabs/secrets.js/issues/2
| rendaw wrote:
| And the critic's only argument is a link to their own blog...
| hatf0 wrote:
| Those aren't even the correct answer for the use-case in
| question, anywho. What they're looking for would actually be
| sops (https://github.com/getsops/sops), or age (made by the
| fantastic Filo Sottile: https://github.com/FiloSottile/age),
| or, hell, just using libsodium sealed boxes. AWS KMS or Vault
| is perhaps an even worse answer, actually.
| maqp wrote:
| >It doesn't address any points
|
| Taking some time to point out the vulnerability is already
| charity work. Assuming that's also a commitment to a free
| lecture on how the attacks work, and another hour of free
| consultation to look into the codebase to see if an attack
| could be mounted, is a bit too much to ask.
|
| Cryptography is a funny field in that cribs often lead to
| breaks. So even if the attack vector pointed out doesn't lead
| to a complete break immediately, who's to say it won't
| eventually if code is being YOLOed in.
|
| The fact that the author is making such a novice mistake as
| unauthenticated CBC shows they have not read a single book on
| the topic and should not yet be writing cryptographic code for
| production use.
| LPisGood wrote:
| > Taking some time to point out the vulnerability is already
| charity work
|
| Sure, but if you're not going to explain why the vulnerability
| you're pointing out is an issue, or respond well to questions,
| then it's almost as bad as doing nothing at all.
|
| A non-expert could leave the same comment for maintainers on
| many GitHub pages. Developers can't be expected to blindly
| believe every reply with a snarky tone and a blog link.
| danparsonson wrote:
| If the snarky comment is "your crypto implementation is
| bad", then, yes, I would always take that seriously. If I
| really know what I'm doing then I'll be able to refute the
| comment; if not, then I probably should be using an audited
| library anyway.
|
| Mistakes in crypto implementation can be extremely subtle,
| and the exact nature of a vulnerability difficult to pin
| down without a lot of work. That's why the usual advice is
| just "don't do it yourself"; the path to success is narrow
| and leads through a minefield.
| 1970-01-01 wrote:
| It probably stems from the fact that cryptography needs to
| be perfect code to guarantee total confidentiality with
| complete integrity. Without this goal of writing perfect
| code, code that always encrypts and decrypts but _only_ for
| the keyholder(s), they're simply begging to get things
| wrong _and_ waste time.
| maqp wrote:
| >Developers can't be expected to blindly believe every
| reply with a snarky tone and a blog link?
|
| Developers are adults with responsibility to know the
| basics of what they're getting into, and you don't have to
| get too far into cryptography to learn you're dealing with
| 'nightmare magic math that cares about the color of the
| pencil you write it with', and that you don't do stuff
| you've not read about and understood. Another basic
| principle is that you always use best practices unless you
| know why you're deviating.
|
| The person who replied to that issue clearly understands
| some of the basics, or they at least googled around, since
| they said "Padding oracle attacks -- doesn't this require
| the ability to repeatedly submit different ciphertext for
| decryption to someone who knows the key?"
|
| In what college course or book a padding oracle is described
| without mentioning how it's mitigated, I have no idea. Even
| the Wikipedia article on padding oracle attacks says it
| clearly: "The CBC-R attack will not work against an
| encryption scheme that authenticates ciphertext (using a
| message authentication code or similar) before decrypting."
|
| The way security is proven in cryptography is often to
| give the attacker more power than they have, and to show
| it's secure regardless. The best practices include the
| notion that you do things in a way that categorically
| eliminates attacks. You don't argue about whether a padding
| oracle is applicable to the scenario; you use message
| authentication codes (or preferably an AE scheme like GCM
| instead of CBC-HMAC) to show you know what you're doing and
| to show it's not possible.
|
| If it is possible and you leave it like that because the
| reporter values their time and won't bother, an attacker
| won't mind writing the exploit code; they already know from
| the open source that it's going to work.
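|
| For illustration, here is roughly what "authenticate before
| you decrypt" looks like with Node's built-in crypto
| (encrypt-then-MAC over CBC; the names are illustrative, and an
| AE mode like GCM, or libsodium, is still the better default):
|
|     import {
|       createHmac, createDecipheriv, timingSafeEqual,
|     } from "crypto";
|
|     // payload layout: 16-byte IV || ciphertext || 32-byte HMAC tag.
|     // Refuse to touch the ciphertext unless the MAC verifies first.
|     function decryptIfAuthentic(
|       payload: Buffer, encKey: Buffer, macKey: Buffer,
|     ): Buffer {
|       const tag = payload.subarray(payload.length - 32);
|       const authed = payload.subarray(0, payload.length - 32);
|       const expected = createHmac("sha256", macKey)
|         .update(authed).digest();
|       // Constant-time compare, so the check isn't a timing oracle.
|       if (tag.length !== 32 || !timingSafeEqual(tag, expected)) {
|         throw new Error("authentication failed");
|       }
|       const iv = authed.subarray(0, 16);
|       const ct = authed.subarray(16);
|       const decipher = createDecipheriv("aes-256-cbc", encKey, iv);
|       return Buffer.concat([decipher.update(ct), decipher.final()]);
|     }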
| SomaticPirate wrote:
| Wow, the smugness of that reply. Responding by calling someone
| naive and blowing them off despite there being real questions.
|
| The "insecure crypto " that they clearly link to (despite not
| wanting to put them on blast) was also a bit overdone. I guess
| we all are stuck hiring this expert to review our crypto
| code (under NDA, of course) and tell us we really should use AWS
| KMS.
| BigJono wrote:
| AWS KMS is great product branding. I've never seen another
| company so accurately capture how it feels to use their
| product with just the name before.
| block_dagger wrote:
| I agree with his comment and would like to add that the critic
| came across as rude and superior. Instead of answering the
| dev's question in good faith, they linked to their own blog
| entry that has the same tone. Is it a cryptographic expert
| thing to act so rude?
| 1970-01-01 wrote:
| Great question. AWS secrets and Hashicorp Vault have both been
| audited by a plethora of agencies (and have passed). GitHub
| code for someone's pet project very likely isn't going to pass
| any of those audits. When something goes wrong in prod, are you
| going to point to your copy of 'some JS code that someone put
| on the Internet' and still have a job?
|
| https://docs.aws.amazon.com/secretsmanager/latest/userguide/...
|
| https://www.hashicorp.com/trust/compliance/vault
| bagels wrote:
| Yeah, many probably wouldn't get fired for that, but small
| consolation for a breach.
| amluto wrote:
| The criticism in that issue is pretty bad, I agree. But the
| crypto in secrets.js is all kinds of bad:
|
| The use case is someone calling this tool to decrypt data
| received over an unauthenticated channel [0], and the author
| doesn't seem to get that. The private key will be used
| differently depending on whether the untrusted ciphertext
| starts with '$'. This isn't quite JWT's alg none issue, but
| still: never let a message tell you how to authenticate it or
| decrypt it. That's the _key's_ job.
|
| This whole mess does not authenticate. It should. Depending on
| the use case, this could be catastrophic. And the padding
| oracle attack may well be real if an attacker can convince the
| user to try to decrypt a few different messages.
|
| Also, for Pete's sake, it's 2025. Use libsodium. Or at least
| use a KEM and an AEAD.
|
| Even the blog post doesn't really explain any of the real
| issues.
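|
| To make the libsodium suggestion concrete, the sealed-box
| version of this use case is only a few lines. A sketch,
| assuming the libsodium-wrappers package (the variable names
| and values are illustrative):
|
|     import sodium from "libsodium-wrappers";
|
|     async function demo() {
|       await sodium.ready;
|       // Recipient generates a keypair once; only the public key
|       // ever needs to be shared.
|       const { publicKey, privateKey } = sodium.crypto_box_keypair();
|
|       // Sender: encrypt the secret to the recipient's public key.
|       const ciphertext = sodium.crypto_box_seal(
|         sodium.from_string("DATABASE_PASSWORD=hunter2"), publicKey);
|
|       // Recipient: opening fails if the ciphertext was tampered with.
|       const plaintext = sodium.crypto_box_seal_open(
|         ciphertext, publicKey, privateKey);
|       console.log(sodium.to_string(plaintext));
|     }
|
|     demo();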
|
| [0] One might credibly expect the public key to be sent with
| some external authentication. It does not follow that the
| ciphertext sent back is authenticated.
| rendaw wrote:
| But having bad crypto doesn't mean you have to be
| aggressive... in fact if the critic's goal is to actually
| improve the situation (and not just vent or demonstrate their
| superiority) then being polite and actually answering the
| questions might go a long way further to remedy it.
| borski wrote:
| You're right. The problem is that after repeating the same
| thing hundreds of times to different developers you can
| develop a bit of anger toward the situation, as you see
| the same mistakes play out over and over.
|
| I'm not defending it, but I can understand where it comes
| from.
| jhack wrote:
| Don't forget to rand(rand()).
| TheCleric wrote:
| I think one of the reasons developers roll their "own" crypto
| code is because there doesn't really seem to be a simple "do it
| this way" way to do it. OpenSSL has hundreds of ways of doing
| encryption. So it almost has pitfalls by default, because of the
| complexity of the choices.
| asciii wrote:
| I did not learn a single thing from this article. I did learn
| quite a bit from the programmer's stuff and situation.
| biimugan wrote:
| Only tangential to this post, but if you need a way to share
| secrets with your teams (or whoever), Hashicorp Vault is pretty
| decent. They don't even need login access. Instead of sharing
| secret values directly, you wrap the secret which generates a
| one-time unwrapping token with a TTL. You share this unwrapping
| token over a traditional communication method like Slack or
| e-mail or whatever, and the recipient uses Vault's unwrap tool to
| retrieve the secret. Since the unwrapping token is one time use,
| you can easily detect if someone intercepted the token and
| unwrapped the secret (by virtue of the unwrapping token no longer
| working). This hint tells you the secret was compromised and
| needs to be rotated (you just need to follow-up with the person
| to confirm they were able to retrieve the secret). And since you
| can set a TTL, you can place an expectation on the recipient too
| -- for example, that you expect them to retrieve the secret
| within 10 minutes or else the unwrapping token expires.
|
| All of this has the added benefit that you're not sharing
| ciphertext over open channels (which could be intercepted and
| stored for future decryption by adversaries).
| 1970-01-01 wrote:
| It really needs to be taught in comp sci classes at this point.
| Have an entire project dedicated to why rolling your own crypto
| is at best a waste of time. Students will listen if they need to
| pass a test on it.
| AlotOfReading wrote:
| Giving people more introduction to crypto is a good way to get
| more handwritten crypto. It's part of the fun family of purely
| mathematical algorithms, alongside hashes, RNGs, and checksums.
| People make essentially the same mistakes in all of these.
| necovek wrote:
| Do you not want to train some of those students to write the
| improved cryptography of the future?
|
| I do agree having a project where you demonstrate problems they
| introduced (by exploiting them directly, or have another
| student team be the exploiters) would highlight the risks, but
| school is about teaching, not about scaring people off.
| borski wrote:
| That's what the actual crypto classes are for. If people are
| intrigued, those are fantastic elective courses.
| tombert wrote:
| Nah, hell is being stuck writing Kubernetes configs.
| quotemstr wrote:
| I'm not sure the way to get developers to stop writing their own
| crypto is to turn it into a delicious forbidden fruit edible only
| by the most virtuous. APIs sometimes have a "Veblen good"
| character to them: the more difficult, the more use they attract,
| because people come to see use of the hard API as a demonstration
| of skill.
|
| The right way to stop people writing their own cryptography isn't
| to admonish them or talk up the fiendish difficulty of the field.
| The right way is to make it boring, not worth the effort one
| would spend to reinvent it.
| umvi wrote:
| What does "don't build your own crypto" even mean any more?
|
| I originally thought it meant "don't implement AES/RSA/etc
| algorithms yourself"
|
| But now it seems to mean "pay auth0 for your sign in solution or
| else you'll definitely mess something up"
|
| As an example, we have a signing server. Upload a binary blob,
| get a signed blob back. Some blobs were huge (multiple GB), so
| the "upload" step was taking forever for some people. I wrote a
| new signing server that just requires you to pass a hash digest
| to the server, and you get the signature block back which you can
| append to the blob. The end result is identical (i.e. if you
| signed the same blob with both services the result would be
| indistinguishable). I used openssl for basically everything. Did
| I roll my own crypto? What should I have done instead?
| LPisGood wrote:
| Cryptographic protocols are algorithms built on cryptographic
| primitives and they have plenty of footguns.
| EE84M3i wrote:
| Now it's possible for someone to ask the server to sign a blob
| that they only know the hash of. Is that an issue in your
| threat model? No idea.
| pclmulqdq wrote:
| It used to mean "use AES instead of rolling your own form of
| symmetric encryption." Then it became "use a library for AES
| instead of writing the code." It has now reached the obvious
| conclusion of "don't do anything vaguely security-related at
| all unless you are an expert."
| tptacek wrote:
| No it hasn't. The subtext of "don't roll your own crypto" is
| that "AES" doesn't do, by itself, what most developers think
| it does; there's a whole literature of how to make AES do
| anything but transform 16 unstructured bytes into 16
| different bytes indistinguishable from random noise; anything
| past that, including getting AES to encrypt a string, is
| literally outside of AES's scope.
|
| The shorter way of saying this is that you should not use
| libraries that expose "AES", but instead things like Sodium
| that expose "boxes" --- and, if you need to do things that
| Sodium doesn't _directly_ expose, you need an expert.
|
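| To make "boxes" concrete, a sketch with the libsodium-wrappers
| package (illustrative only, not a drop-in for any particular
| system):
|
|     import sodium from "libsodium-wrappers";
|
|     async function demo() {
|       await sodium.ready;
|       const key = sodium.crypto_secretbox_keygen();
|       const nonce = sodium.randombytes_buf(
|         sodium.crypto_secretbox_NONCEBYTES);
|
|       // One call composes the cipher and the authenticator.
|       const box = sodium.crypto_secretbox_easy(
|         sodium.from_string("attack at dawn"), nonce, key);
|
|       // Fails if the box was modified; no unauthenticated
|       // plaintext ever escapes.
|       const msg = sodium.crypto_secretbox_open_easy(box, nonce, key);
|       console.log(sodium.to_string(msg));
|     }
|
|     demo();
|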
| Contra what other people on this thread have suggested,
| reasonable security engineers do not in fact believe that
| ordinary developers aren't qualified to build their own
| password forms or permissions systems; that's a straw man
| argument.
| pclmulqdq wrote:
| Reasonable security engineers take a reasonable position on
| this. Many other developers (usually uninformed ones)
| believe this now means "don't make an auth system." Like it
| or not, this has become a sort of adage that people cargo-
| cult.
| tptacek wrote:
| If you can't keep the line between "ordinary developers
| shouldn't be working with AES" and "ordinary developers
| can write login systems" clear, you're doing people a
| disservice, because the two assertions are not alike in
| how fundamentally true they are.
| pclmulqdq wrote:
| By the way, I agree with you. My original point was that
| the "hivemind" somehow has shifted away from the
| reasonable conclusion.
| tptacek wrote:
| There's nothing personal about any of this, but I'm
| watching both the insane poles of this argument
| ("I'll be just fine inventing my own block cipher modes"
| guy, "Only a certified security engineer can write a
| permissions check" guy") go at each other, and wondering
| why we need to pay any attention to either of them.
|
| I haven't so much seen the latter kind of guy, the one
| saying you need a certified professional to safely
| output-filter HTML or whatever, but I see "lol block
| cipher modes whatever" people on HN all the time, on
| almost every thread about cryptography, dunking on anyone
| who says "don't roll your own cryptography".
| smolder wrote:
| The struggle was real in the early web with getting companies
| to do proper password storage and authentication but the fact
| that seasoned professionals turn to auth0 or okta (and have
| been bitten by this reliance!) nowadays strikes me as a
| little embarrassing.
| hoilmic wrote:
| > What should I have done instead?
|
| Security has largely to do with trust.
|
| When asked who I trust most in this space, the answer is always
| libsodium.
|
| I leave as much of the protocol as possible to their
| implementation.
|
| https://doc.libsodium.org/
| smolder wrote:
| I honestly think it's a meme born of propaganda. 'Don't roll
| your own' is good advice a lot of the time when there are
| hardened libraries at your disposal, but who does it serve if
| fewer people _know how_ to build useful cryptography? Three
| letter agencies mostly. I had no trouble implementing useful
| algorithms in college as part of my study of it, or
| understanding different potential weaknesses like timing or
| known plaintext attacks. We didn't cover side channels like
| power analysis but how often does that actually matter outside
| of a lab?
| xvector wrote:
| No, you can not competently roll your own crypto because you
| took an algorithms class in college. Even the assumption you
| could is absurd.
| smolder wrote:
| It wasn't 'an algorithms class' but grad level
| cryptography. We did implement boxes as a project and were
| graded on them. Where do you think people learn this stuff?
| Super secret cryptography school?
| xvector wrote:
| Yes, most of us have likely taken grad level
| cryptography. No, that's not enough to securely roll your
| own crypto.
|
| Just use libsodium.
| foobarkey wrote:
| I know people want to get paid but 90% of crypto is just about
| making sure the thing takes constant time, please stop acting
| like it is some elite knowledge
| AlotOfReading wrote:
| I have an entire crypto book on my shelf that only mentions
| constant time once in an offhand paragraph about timing
| attacks. They managed to write the rest of the book on the
| other 99% of crypto.
| RainyDayTmrw wrote:
| I have empathy for people who end up stuck in the in-between
| areas, where out-of-the-box building blocks solve part of their
| problem, but how to glue them together can still get tricky.
|
| For one example, you've got a shared secret. Can you use it as a
| secret key? Do you have to feed it through a KDF first? Can you
| use the same key for both encryption and signing? Do you have to
| derive separate keys?
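|
| (My understanding of the usual conservative answer to the last
| two questions: feed the shared secret through a KDF and derive
| one key per purpose. A sketch with Node's built-in HKDF; the
| labels and values are illustrative only:)
|
|     import { hkdfSync } from "crypto";
|
|     // Don't use a raw shared secret (e.g. an ECDH output)
|     // directly as a key. Derive independent keys, one per
|     // purpose, with distinct "info" labels.
|     const sharedSecret = Buffer.from("...raw shared secret...");
|     const salt = Buffer.alloc(32); // or a random, transmitted salt
|
|     const encKey = Buffer.from(hkdfSync(
|       "sha256", sharedSecret, salt, "myapp v1 encryption", 32));
|     const macKey = Buffer.from(hkdfSync(
|       "sha256", sharedSecret, salt, "myapp v1 authentication", 32));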
| max_ wrote:
| The reason why cryptography is so bad is this elitist culture
| in cryptography.
|
| A lot of algorithms are described incomprehensibly.
|
| Let me give you an example.
|
| You might get a well documented specification for implementation
| of ECDSA.
|
| But it will lack two very important concepts.
|
| 1. You should only use it with "safe curves"
|
| 2. It doesn't provide you with a list of "safe curves"
|
| Same goes for Shamir Secret Sharing. You can know how to
| implement the algorithm.
|
| But there is so much extra "insider knowledge" or "tribal
| knowledge" that isn't obvious to non-cryptographers and that
| you need for it to be secure.
|
| Such knowledge is often (if not always) not documented. It's in
| obscure forums, papers and blog posts like this one.
|
| There is no comprehensive encyclopedia for cryptography
| algorithms.
|
| Developers write bad encryption code because cryptographers have
| bad documentation.
|
| I would blame the cryptographers, not the developers.
| cowsaymoo wrote:
| What a coincidence! I was just browsing the Shamir's Secret
| Sharing Wikipedia page 30 seconds ago. There is a python
| implementation on it and I was worried about the same exact
| thing as you before opening HN. So maybe we could start with
| that one.
| Is that code implementation sufficiently secure and well
| documented?
|
| https://en.wikipedia.org/wiki/Shamir's_secret_sharing?wprov=...
| some_furry wrote:
| https://zkdocs.com has a whole chapter on Shamir's Secret
| Sharing.
|
| (Ironically, this stuff _is_ documented, cryptographers just
| aren't good at marketing.)
|
| https://www.zkdocs.com/docs/zkdocs/protocol-
| primitives/shami...
| cassonmars wrote:
| No, it is not.
|
| The arithmetic used is not constant time, meaning the actual
| computational steps involved leak information about the
| secret, were either the recombination of the shares or the
| initial splitting were observed via side channels.
|
| The arithmetic does not guard against party identifiers being
| zero or overflowing to zero, although it is not likely to
| occur when used this way.
| tptacek wrote:
| I don't understand what "blaming" the cryptographers gets you.
| Your system is either secure or it isn't. If you ship something
| insecure, that's on you.
| AlotOfReading wrote:
| There are definitely degrees of security that I'm having to
| constantly remind our security folks of when they aren't
| forced to deal with the costs of having expensive threat
| models.
| xvector wrote:
| If you're having to constantly remind your security
| engineers about "degrees of security," they may be trying
| to tell you that your threat model is unsustainably and
| dangerously lax.
| alfiedotwtf wrote:
| It's not insider knowledge, it's just the collective experience
| people have built up over time after discovering issues with
| classes of ciphers. It would be nice to have a standard
| checklist that everyone could refer to (think OWASP Top Ten)
| while reviewing a cryptosystem.
| markus_zhang wrote:
| My optimistic opinion is that since our information is being
| leaked left and right, it doesn't matter whether we roll the algo
| by ourselves or not. Might as well do it for practice...
|
| Quote from the Honorable MI5 head:
|
| > I mean, with Burgess and Maclean and Philby and Blake and Fuchs
| and the Krogers... one more didn't really make much more
| difference.
| m3kw9 wrote:
| Just add 2 to each char.
| sureglymop wrote:
| My question is, why does the library even support AES-CBC? I know
| that it's OpenSSL here but why can't we have an additional
| library then that ships only the recommended encryption schemes,
| safe curves, etc. and deprecates obsolete practices in an
| understandable process? Something that is aimed at developers and
| also comes with good documentation on why it supports what it
| does.
| ExoticPearTree wrote:
| I think what the author implies is that people should use trusted
| and vetted crypto libraries, and not roll out their own
| encryption primitives based on some documentation they read
| from who knows where.
|
| Look at OpenSSL for example, even though the people who work on
| the project know a lot about crypto, they still make big
| mistakes. Now imagine the average developer trying to build
| crypto from scratch.
___________________________________________________________________
(page generated 2025-02-01 08:00 UTC)