[HN Gopher] Global Encryption Day: Demand End-to-End Encryption ...
___________________________________________________________________
Global Encryption Day: Demand End-to-End Encryption in DMs
Author : pabs3
Score : 142 points
Date : 2022-10-22 08:44 UTC (14 hours ago)
(HTM) web link (blog.torproject.org)
(TXT) w3m dump (blog.torproject.org)
| anthk wrote:
| Mail and GPG. For IMs, Tox. Or Jabber+OMEMO.
| jonas-w wrote:
| Why not jabber+gpg?
| leetnewb wrote:
| Lacks PFS and doesn't necessarily support groups.
| jv0010 wrote:
| It's mind-blowing that E2EE is not a standard. I guess the
| equivalent is looking back and realising cars and homes did not
| have locks at one stage.
| sebzim4500 wrote:
| Given how everyone uses TLS nowadays, a better analogy would be
| for the dealership to retain a copy of the key to your car
| after you buy it. Which would probably upset people but isn't
| as bad as having no lock at all.
| Linosaurus wrote:
| The analogy breaks down a little here.
|
| We don't want the dealership to keep the key, but we _do_
| want them to give us a new one if we lose it.
|
| So afaik we trust the manufacturer to keep the info, and to
| keep sufficient logs that anyone that abuses it to steal cars
| can be caught.
|
| I don't think the way this works can be extended to
| communications apps though.
| [deleted]
| 100001_100011 wrote:
| How do you know that the Signal client running on your phone
| right now doesn't include a backdoor? Sure, it's open source. But
| how do you know how it was compiled?
|
| What if someone changed the open source before shipping it to the
| app store?
| Semaphor wrote:
| I would say it's the usual.
|
| If someone with skills X (where depending on your knowledge and
| precautions, X can range from script kiddie to nation state) is
| after you specifically, you can only make their job harder, but
| unless you are very serious about security, you'll probably get
| pwned.
|
| If you want general security, you can probably take it as given
| that someone checked the Signal build to be the one that the
| source is available for, and that no one intercepted just your
| download. But you still have to take some parts on faith,
| always, unless you build your own CPU and continue from there.
| i_am_proteus wrote:
| How do you know that the AES instruction set on your device's
| processor doesn't include a backdoor? Sure, the algorithm is
| public, but how do you know how it was implemented?
| upofadown wrote:
| AES takes a 16-byte block and encrypts it to exactly 16 bytes.
| So there is no good place to hide extra data. Even a single bit
| that did not meet the AES spec for the key in use would produce
| complete garbage at decryption. So attempts to leak the key would
| at least leave a mark.
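|
| A minimal sketch of that block-level behaviour (Python with the
| third-party cryptography package, using raw AES-128 in ECB mode
| purely for illustration):
|
|     import os
|     from cryptography.hazmat.primitives.ciphers import (
|         Cipher, algorithms, modes,
|     )
|
|     key = os.urandom(16)          # AES-128 key
|     block = b"exactly16bytes!!"   # one 16-byte plaintext block
|
|     enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
|     ct = enc.update(block) + enc.finalize()
|     print(len(ct))   # 16 -- same size as the input, nowhere to stash data
|
|     bad_key = bytes([key[0] ^ 1]) + key[1:]   # flip a single key bit
|     dec = Cipher(algorithms.AES(bad_key), modes.ECB()).decryptor()
|     print(dec.update(ct) + dec.finalize() == block)   # False -- garbage out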
| dividuum wrote:
| Unless it's a timing-based leak...
| bertman wrote:
| This is called "Reproducible Builds".
|
| https://signal.org/blog/reproducible-android/
| 100001_100011 wrote:
| Reproducible builds are for developers. As a user I didn't
| build the app on my phone.
|
| I have a phone with Signal on it. Tell me what I should do to
| verify it's running the open source Signal code.
| fsflover wrote:
| Reproducible builds benefit the user by allowing
| independent checks of the software.
| ghransa wrote:
| Not a perfect chain of custody, but you could upload it to
| VirusTotal (virustotal.com) and compare in a sandbox:
|
| https://play.google.com/store/apps/details?id=com.funnycat.
| v...
| bertman wrote:
| If you, as a user, are concerned about reproducibility, you
| are no longer an average user. Thus, if you want this extra
| security, you can be expected to check the APK on your
| phone.
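|
| As a minimal sketch of that check (plain Python; the file names
| are hypothetical, and in practice Signal's own instructions
| compare the APK contents while ignoring signing metadata rather
| than hashing whole files):
|
|     import hashlib
|
|     def sha256(path):
|         # Hash the file in chunks so a large APK need not fit in memory.
|         h = hashlib.sha256()
|         with open(path, "rb") as f:
|             for chunk in iter(lambda: f.read(1 << 20), b""):
|                 h.update(chunk)
|         return h.hexdigest()
|
|     # APK pulled from the phone vs. an APK built from the published source.
|     print(sha256("Signal-from-device.apk") == sha256("Signal-self-built.apk"))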
| 0xChain wrote:
| You should check out Session. Their CTO apparently uses his
| PGP key to sign every release
| https://twitter.com/session_app/status/1514108746854985730
| fgiox wrote:
| Maybe you could figure this out yourself, and share your
| findings, rather than demanding answers from others?
| RunSet wrote:
| That page says:
|
| > the Signal Android codebase includes some native shared
| libraries that we employ for voice calls (WebRTC, etc). At
| the time this native code was added, there was no Gradle NDK
| support yet, so the shared libraries aren't compiled with the
| project build.
|
| Also, assuming you trust the client, how do you tell if the
| Signal _server_ is running the published code, especially given
| Signal's track record of (not) publishing its source code?
|
| https://linuxreviews.org/Signal_Appears_To_Have_Abandoned_Th.
| ..
| kijiki wrote:
| The Signal server is explicitly untrusted in the Signal threat
| model, which it must be, since it is based in a country (like
| any other country) with laws that can be used to compel action
| from the server's owners. They publish the legal orders they
| receive and their responses.
| bigDinosaur wrote:
| Have a classic:
| https://www.usenix.org/system/files/1401_08-12_mickens.pdf
| Genghis_9000 wrote:
| Just read the code. Yes, it really is that simple. Don't use a
| language like C where it requires 15 years of study just to
| learn how to discern different dialects of the language. TODO:
| Make a language without tons of fancy HN-appeasement
| syntax/features so people can actually understand 100% what
| each semantic and syntactic element does without spending their
| life on the language and becoming a maligned evangelist
| defending what they spent their life studying.
| Ptchd wrote:
| The Briar Project appears to be a good E2EE messaging app.
|
| https://briarproject.org/how-it-works/
| upofadown wrote:
| Which is of course entirely incompatible with everything else.
| We have tons of good E2EE messaging apps, all doing their own
| thing...
| CJefferson wrote:
| As I understand it, if we had true end-to-end encryption, I would
| have to make sure I kept a set of keys, copied them between every
| computer and phone I used for chatting, and if I lost those keys
| I'd lose all my messages?
|
| Honestly, for most people I don't think that's functionality they
| would want, at least without us getting much better at interfaces
| and usability. Standard ways of storing keys, for example in a
| password manager, would be a good start (does this already exist
| and I've missed it?)
| jcynix wrote:
| > if I lost those keys I'd lose all my messages?
|
| Not necessarily, that just happens with bad implementations
| (i.e. most of them, sigh).
|
| If you get a confidential letter in the physical world, you
| open it, read it, and then store it in a safe, or a locked
| drawer, correct?
|
| The software world chose to "re-seal" the letter in its
| envelope instead. So if you lose your key, you lose access
| to the letter. The proper way to implement this would be to
| store the decrypted contents in the software equivalent of a
| lockable drawer, e.g. an encrypted disk, folder or whatever.
| Which could (and should) have sensible fallbacks to retrieve
| its contents.
| mort96 wrote:
| Most people don't have encrypted drives with fallbacks to
| retrieve their contents (whatever that means).
| jcynix wrote:
| Not having encrypted drives is some kind of misfeature
| nowadays, especially for mobile devices.
|
| Fallbacks can be key escrow (i.e. put a printout or a
| physical key into a sealed envelope and deposit it with a
| family member, friend, or notary), and backups, encrypted
| with a different (or more than one) key.
|
| https://en.wikipedia.org/wiki/Linux_Unified_Key_Setup for
| example allows the use of multiple keys, so a backup could
| use the same (primary) key as your drive and some secondary
| key(s) to access the backup if the primary key is lost
| somehow. As I mentioned in another comment here, keys
| should have become physical tokens long ago, but hardware
| vendors would have had to standardize on a general
| implementation and, as we know, they all like to
| "standardize" on exactly their way of doing things.
| mr_mitm wrote:
| > As I understand it, if we had true end-to-end encryption, I
| would have to make sure I kept a set of keys, copied them
| between every computer and phone I used for chatting, and if I
| lost those keys I'd lose all my messages?
|
| Unless you made an unencrypted backup, then yes, that's true
| and that is the reason why Telegram decided against E2EE by
| default. According to them their users prefer easy cloud access
| to their messages over security.
| rmnclmnt wrote:
| Using a password manager in a sensible manner already goes a
| long way! Most synchronized E2EE services only ask for a master
| passphrase to encrypt your keys before storing them server-side
| (e.g. Bitwarden, Keybase, ProtonMail). Then you only need your
| password manager to recall the passphrase when synchronizing a
| new device!
|
| People need to learn again how the devices in their pocket work
| and the risks for not doing things properly!
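|
| A minimal sketch of that pattern (Python with the third-party
| cryptography package; the parameters are illustrative, not any
| particular vendor's scheme):
|
|     import base64, os
|     from cryptography.fernet import Fernet
|     from cryptography.hazmat.primitives import hashes
|     from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
|
|     def wrap_key(private_key: bytes, passphrase: str):
|         # Derive a wrapping key from the master passphrase...
|         salt = os.urandom(16)
|         kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
|                          salt=salt, iterations=600_000)
|         wrapping_key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))
|         # ...and encrypt the private key with it; only this wrapped
|         # blob (plus the salt) is ever stored server-side.
|         return salt, Fernet(wrapping_key).encrypt(private_key)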
| sigwinch28 wrote:
| > and if I lost those keys I'd lose all my messages?
|
| Only if the developers of the software insist on encryption at
| rest in addition to transit encryption.
|
| For example, you don't lose the ability to open files you have
| already downloaded via HTTPS just because the client or server
| certificate later expires.
| upofadown wrote:
| The secret bits can be protected by a strong passphrase (which
| you can print out) or kept in a hardware device that also does
| enough of the cryptography to allow the secrets to stay on the
| hardware device. After that the problem becomes one of backup.
| You can scatter your encrypted keys all over the internet if
| you want (or just on a reliable server somewhere). You can have
| a bunch of hardware devices stored in various safe places and
| protected in various ways.
|
| You often hear advice that comes down to keeping, say, your PGP
| keys in one secure place, but that is a terrible idea... E2EE
| needs a good backup approach. If your system does not provide
| that in an easy-to-use way for the user then you have a bad
| system. E2EE _is_ hard but it doesn't help to leave all the
| hardness for the user.
| jcynix wrote:
| IMHO, for most people the best interface would be a physical
| key (with software keys on it). That is something like a
| smartcard or YubiKey, and even a way of obtaining multiple
| copies, like with physical keys.
|
| My employer rolled out a smartcard based PKI about 15 years
| ago, where you may own more than one card (optionally in sim
| card size for usb tokens) so, for example, you can have one in
| the office and have another one in your home office. Works
| quite well (modulo mobile) for now about 30K employees. And,
| combined with RFID on the cards, you can call elevators, open
| doors, pay for food, etc.
|
| Now if hardware manufacturers had standardized on appropriate
| hardware (chassis, keyboards, smart devices...) some 20 years
| ago, instead of trying (and failing) all kinds of software
| "solutions" ... and thinner and thinner hardware, so no such
| cards or sticks would fit anymore ...
| mr_mitm wrote:
| The difference is: If you lose your smartcard, it's pretty
| straightforward for your employer to verify your identity
| and issue a new one.
|
| How do I do that with WhatsApp? We all know how well these
| tech giants react if you get permanently locked out of your
| account, no matter whose fault it was.
|
| And even if I could get a new key, I would have to explain to
| all my contacts why my key changed and they would have to re-
| verify it over a secure channel. It doesn't scale.
|
| Now you might say: just keep multiple copies, but there
| _will_ be cases where people lose all copies, and depending
| on how much of our digital lives will be tied to these keys,
| we _will_ need a secure recovery plan for that.
| mr_mitm wrote:
| Remember when Zoom claimed that meetings were E2EE yet you could
| join the meeting by phone and no one batted an eye for at least
| one or two years? Noticed how no regular person cares when the
| "security code" of a chat partner changes in WhatsApp or Signal?
| Not to mention no regular person uses self-compiled apps for
| that, even if it were possible. E2EE is close to becoming a cargo
| cult, because done properly key management and identity
| verification are a usability nightmare. It will always remain a
| toy for a niche audience. Most people just don't care, and the
| non-technical people that do care are happy with a green lock
| appearing somewhere. I do hope to be proven wrong, though.
| Genghis_9000 wrote:
| matheusmoreira wrote:
| Cargo cult? No. It's not perfect but end-to-end encryption is
| demonstrably better than whatever we had before. WhatsApp
| encryption is the same as Signal's and it has already defeated
| police and courts here. Some judges got so pissed off they
| ordered WhatsApp to be blocked nation-wide because of it.
| twstdzppr wrote:
| So what you're saying is, we can just put a green lock in the
| UI and call it a day.
| rini17 wrote:
| We can put some thought into how to set up the third parties
| to be trusted with our keys. Currently it's very haphazard.
| And it's not avoidable. Even Phil Zimmerman, inventor of PGP,
| won't accept PGP encrypted mail because he claims to have
| lost his private key.
|
| The outrage "own your private keys or bust!" is much easier
| tho.
| twstdzppr wrote:
| Green lock it is.
| sleepless wrote:
| Do you happen to know which OS Phil was using when he lost
| his private key, and what the exact circumstances were? This
| is a nice story to be told, but without details it is not
| worth that much.
| rini17 wrote:
| Why does it matter? Private keys are being compromised or
| lost under all kinds of circumstances and regardless of
| the OS.
| pabs3 wrote:
| Kind of meaningless if you can't trust the software running on
| your device though, since it could be scanning locally or
| relaying to remote services.
| einpoklum wrote:
| Exactly. Demand open source for every encryption app - or at
| least those offered to the public en masse.
|
| It's not enough that a FOSS alternative _exists_; it needs to
| be the case that closed-source encryption is not considered an
| actual encryption "end".
| jstanley wrote:
| This is an instance of the trope "if you can't solve
| everything, you shouldn't solve anything".
|
| It is fallacious because you'll never get there if you're not
| allowed to make incremental advances.
| fsflover wrote:
| So run free software?
| sschueller wrote:
| Like Signal, which still refuses to put their client on F-Droid?
| btdmaster wrote:
| Like Element which does publish its client on F-Droid:
| https://f-droid.org/en/packages/im.vector.app/
|
| (Yes, F-Droid availability is a very good cutoff, I agree.)
| edf13 wrote:
| Are you confident enough to audit the free software yourself
| - or are you pushing the trust back to someone else?
| mindslight wrote:
| Our modern society couldn't exist without some trust, but
| there are huge differences in types of trust and the
| trustee's underlying motivations.
|
| Trusting the community to audit is like trusting the
| scientific method. Anyone can find and point out a flaw,
| which can then be verified by everyone. That's an idyllic
| description, and the process is quite imperfect, but it's
| the best we've got.
|
| Meanwhile, trusting a surveillance company to self police
| is like trusting a quack medicine healer.
| threeseed wrote:
| Every piece of free software will have dozens or even hundreds
| of transitive dependencies.
|
| It literally isn't possible for an ordinary person to audit
| all code.
|
| At some point you have to blindly trust.
| fsflover wrote:
| There is a huge difference between you alone trusting a
| piece of software and the whole community verifying it at
| random.
| threeseed wrote:
| Which is why many of us are firmly against RCS.
|
| End to end encryption is critically important and no messaging
| standard should exist that doesn't include it.
| fgiox wrote:
| RCS does include end-to-end encryption though.
|
| https://support.google.com/messages/answer/10262381
| sneak wrote:
| No, it does not. Google's proprietary implementation
| implements Google's proprietary encryption system. RCS does
| not.
| fgiox wrote:
| While this is currently only in Google's and Samsung's
| Messages apps, it's basically just the Signal protocol
| implemented as an RCS extension. It will most likely end up
| formalised in the next standard release of RCS.
| threeseed wrote:
| > It will most likely end up formalised in the next
| standard release of RCS
|
| Based on what evidence?
|
| If it hasn't been added after all these years it doesn't
| seem likely it will be added soon.
| LinuxBender wrote:
| What I find intriguing is that E2EE was significantly more common
| long ago than it is today. Multi-protocol chat clients would
| utilize the OTR libraries _meaning 'off the record'_ and even
| auto-negotiate with folks over AIM, MSN, ICQ, IRC and others to
| assist in showing fingerprints and sharing public keys which
| could be done over the platform or out of band if one so wished.
|
| I would have expected that by today not only would this be
| more common but that the technology would have significantly
| advanced by now. Instead the opposite appears to be true. People
| are instead using apps that pinky promise to E2EE things and
| depend on the chat service to handle the trust mechanism for them
| which in my view entirely defeats the intent and purpose of E2EE.
| I assume the motivation is the desire to capture and monetize
| every conversation, especially the private conversations and/or
| to comply with national security letters. I've heard people try
| to rationalize this with _because non-technical people_ but
| non-technical people were utilizing OTR just fine.
|
| What could _realistically_ be done to reverse this in a way that
| puts the control back into the individual's hands? i.e. Even if
| all the king's horses and all the king's men wanted your messages
| they could only pound sand again and again.
| georgyo wrote:
| I used OTR, but I would not say that it was _more_ common, as
| that was just a layer over a non-E2EE platform like the ones
| you listed.
|
| Using OTR meant using a non-standard client, installing the
| plugins for it, and configuring it. Worse, all parties you
| wanted to chat with had to do it as well. Which means it was
| almost always a novelty feature among nerdy friends and no one
| else.
|
| Because most people did not have it installed, having it auto-
| negotiate always was a non-starter. It meant people who didn't
| have it would receive cryptic messages from you at the start of
| each session and then complain to you.
|
| So most of the time it was a completely manual process of
| enabling it for _some_ chats, and this leaks a massive amount
| of data. It basically declares that for this particular chat
| something might have been interesting. Often the chat log would
| include a plain text message like "Let's switch to OTR".
|
| Unlike you I could not get any non-technical friends to care
| about OTR enough to install it. I have difficulty even today
| getting people interested in Signal. I still hear arguments
| like "I'm not that interesting, they can look" and "I've got
| nothing to hide"
| LinuxBender wrote:
| _Because most people did not have it installed_
|
| In my circle of friends, everyone used a non standard client
| so they could customize the skin of the client. Most of these
| people were either not technical or at least just technical
| enough to follow simple instructions. I personally knew many
| people that used the multi-protocol chat clients. Their
| primary interest was for the customization capabilities of
| the client and having all their messages in one place rather
| than encryption plugins. The vendor clients did not have a
| dark mode and were nowhere near as customizable. The E2EE
| aspect was just a _nice to have_ for some.
|
| I suspect this will never be a thing again as vendors might
| decide to ban accounts that use a client other than their own,
| and people would fear losing their social circles in the
| vendor-locked-in services. That is, outside of IRC. One can
| still use E2EE in IRC.
| Asmod4n wrote:
| ICQ and the like threatened to close your account when you used
| OTR.
| saddlerustle wrote:
| > What I find intriguing is that E2EE was significantly more
| common long ago than it is today.
|
| This is absurd. Today a large fraction of the world's
| population is using E2EE via WhatsApp.
| LinuxBender wrote:
| _Today a large fraction of the world's population is using
| E2EE via WhatsApp._
|
| That is a good example of the problem I am describing. People
| are using E2EE created, deployed and maintained by WhatsApp
| _in_ WhatsApp. That is a problem. The E2EE in WhatsApp is not
| truly E2EE if it is maintained by the very people providing
| the service in my unwavering opinion. True E2EE is entirely
| outside of the service transport that messages are traversing
| meaning that FB could not possibly intercept the messages
| even if their livelihood depended on it. Today people have to
| just trust that FB is not targeting people with custom
| intercept code or code that otherwise precludes E2EE for
| specific messages or recipients. That is what I call a _pinky
| promise_, sometimes also referred to as a Pinky Swear [1].
|
| I follow the logic of, people will do what people can do. If
| the application can be monkeyed with, it will be. Message
| encryption must be entirely outside of the purview of the
| application. Even OTR was somewhat at risk of interception.
| That is why I would have expected that by today this would
| have been a solved problem and highly evolved.
|
| [1] - https://en.wikipedia.org/wiki/Pinky_swear
| georgyo wrote:
| There are a few reasons why I think it has to be at the app
| itself.
|
| In order to be actually secure, all conversations must be
| encrypted, without exception.
|
| OTR is one method of encrypting a text channel, but it isn't
| the only method. For example, PGP over text messages is also
| available as a plugin for Pidgin. Competing standards mean your
| Venn diagram of people and chat protocols now gets an entirely
| new axis of encryption method.
|
| Metadata is data. Without seeing the message content, it is
| still valuable to see who is talking to who and when.
|
| There are always tradeoffs. While OTR may be more verifiably
| secure, its difficulty hinders adoption. A balance has to be
| reached between ease of use and security. If it is easy to get
| it wrong then people will have a false sense of security. That
| is strictly worse than no actual security.
| LinuxBender wrote:
| _There are a few reasons why I think it has to be at the
| app itself._
|
| I agree it needs to be in an app, just not the app that
| is created by the service the person is using. Missing
| today is a universal chat app that can speak to all the
| services using standard chat protocols and standard
| authentication mechanisms. All the popular apps today
| appear to be highly proprietary and in some cases the
| vendor will even state that using an unapproved client is
| forbidden.
| jll29 wrote:
| This may be due to policy activism on the side of governments.
|
| I think asking service providers to use more E2EE is not the
| right approach. Governments' intelligence agencies do not shy
| away from infiltrating commercial vendors (from Yahoo! to the
| infamous Crypto AG - https://en.wikipedia.org/wiki/Crypto_AG).
|
| 1. People should diversify across many, in particular smaller,
| platforms, so that surveillance cannot be done as it becomes a
| scale & long-tail problem.
|
| 2. People should use P2P E2EE protocols and GPG-encrypted
| emails. The single biggest step forward would be easier ways to
| set up public-key-encrypted email as part of e.g. Mozilla
| Thunderbird and other clients. But in parallel an attitude
| change is needed, as the receiving end needs to know how to
| open an encrypted email.
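|
| As a minimal sketch of what "easy" could look like (Python with
| the third-party python-gnupg wrapper around a local GnuPG
| install; the recipient address is hypothetical and their public
| key must already be imported):
|
|     import gnupg  # pip install python-gnupg
|
|     gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring
|
|     encrypted = gpg.encrypt("meet at noon",
|                             recipients=["alice@example.org"])
|     assert encrypted.ok, encrypted.status
|     # str(encrypted) is an ASCII-armored PGP MESSAGE, ready to
|     # paste into a mail body.
|     print(str(encrypted))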
___________________________________________________________________
(page generated 2022-10-22 23:02 UTC)