https://mjg59.dreamwidth.org/62598.html
Matthew Garrett
End-to-end encrypted messages need more than libsignal
Dec. 8th, 2022 02:51 pm
[personal profile] mjg59
(Disclaimer: I'm not a cryptographer, and I do not claim to be an
expert in Signal. I've had this read over by a couple of people who
are, so with luck there are no egregious errors, but any mistakes here
are mine)
There are indications that Twitter is working on end-to-end encrypted
DMs, likely building on work that was done back in 2018. This made
use of libsignal, the reference implementation of the protocol used
by the Signal encrypted messaging app. There seems to be a fairly
widespread perception that, since libsignal is widely deployed (it's
also the basis for WhatsApp's e2e encryption) and open source and has
been worked on by a whole bunch of cryptography experts, choosing to
use libsignal means that 90% of the work has already been done. And
in some ways this is true - the security of the protocol is probably
just fine. But there's rather more to producing a secure and usable
client than just sprinkling on some libsignal.
(Aside: To be clear, I have no reason to believe that the people who
were working on this feature in 2018 were unaware of this. This
thread kind of implies that the practical problems are why it didn't
ship at the time. Given the reduction in Twitter's engineering
headcount, and given the new leadership's espousal of political and
social perspectives that don't line up terribly well with the bulk of
the cryptography community, I have doubts that any implementation
deployed in the near future will get all of these details right)
I was musing about this last night and someone pointed out some prior
art. Bridgefy is a messaging app that uses Bluetooth as its transport
layer, allowing messaging even in the absence of data services. The
initial implementation involved a bunch of custom cryptography,
enabling attacks ranging from denial of service to extracting
plaintext from encrypted messages. In response to
criticism Bridgefy replaced their custom cryptographic protocol with
libsignal, but that didn't fix everything. One issue is the potential
for MITMing - keys are shared on first communication, but the client
provided no mechanism to verify those keys, so a hostile actor could
pretend to be a user, receive messages intended for that user, and
then reencrypt them with the user's actual key. This isn't a weakness
in libsignal, in the same way that the ability to add a custom
certificate authority to a browser's trust store isn't a weakness in
TLS. In Signal the app, key distribution is all handled via Signal's
servers, so if you're just using libsignal you need to implement the
equivalent yourself.
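As a rough illustration of the sort of thing a client has to provide
on top of libsignal, a verification flow might derive a short
fingerprint from both parties' identity keys and let the users compare
it over some other trusted channel. This is a minimal sketch, not
Signal's actual safety number algorithm - the key handling and digest
layout are assumptions made for clarity:

    # Minimal sketch, not Signal's actual safety-number algorithm:
    # derive a short, human-comparable fingerprint from both identity
    # keys so the two users can verify them out-of-band.
    import hashlib

    def fingerprint(own_key: bytes, peer_key: bytes) -> str:
        # Sort so both sides compute the same value regardless of
        # which direction the conversation was started in.
        first, second = sorted([own_key, peer_key])
        digest = hashlib.sha256(first + second).digest()
        # Render the first ten bytes as five groups of five digits.
        groups = [f"{int.from_bytes(digest[i:i + 2], 'big') % 100000:05d}"
                  for i in range(0, 10, 2)]
        return " ".join(groups)

    # Both users run this and compare the output in person or over a
    # call; a mismatch suggests someone substituted a key in transit.
    alice_key = b"\x01" * 32  # placeholder identity keys
    bob_key = b"\x02" * 32
    print(fingerprint(alice_key, bob_key))
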
The other issue was more subtle. libsignal has no awareness at all of
the Bluetooth transport layer. Deciding where to send a message is up
to the client, and these routing messages were spoofable. Any phone
in the mesh could say "Send messages for Bob here", and other phones
would do so. This should have been a denial of service at worst,
since the messages for Bob would still be encrypted with Bob's key,
so the attacker would be able to prevent Bob from receiving the
messages but wouldn't be able to decrypt them. However, the code to
decide where to send the message and the code to decide which key to
encrypt the message with were separate, and the routing decision was
made before the encryption key decision. An attacker could send a
message saying "Route messages for Bob to me", and then another
saying "Actually lol no I'm Mallory". If a message was sent between
those two messages, the message intended for Bob would be delivered
to Mallory's phone and encrypted with Mallory's key.
Again, this isn't a libsignal issue. libsignal encrypted the message
using the key bundle it was told to encrypt it with, but the client
code gave it a key bundle corresponding to the wrong user. A race
condition in the client logic allowed messages intended for one
person to be delivered to and readable by another.
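A minimal sketch of that race, with invented data structures rather
than Bridgefy's actual code: the buggy path picks the encryption key
based on whoever currently owns the route, while the fixed path
derives it from the intended recipient's identity, so a spoofed route
can only cost delivery rather than confidentiality:

    # Illustrative sketch of the race, not Bridgefy's actual code; the
    # data structures and names here are assumptions made for clarity.
    routes = {"bob": "device-123"}        # "deliver Bob's messages here"
    device_owner = {"device-123": "bob"}  # who that device claims to be
    identity_keys = {"bob": "bob-key", "mallory": "mallory-key"}

    def send_buggy(recipient, plaintext):
        device = routes[recipient]   # step 1: routing decision
        # ...window in which the attacker announces "actually I'm Mallory"...
        device_owner[device] = "mallory"
        # step 2: the key is picked from the device, not from the recipient
        key = identity_keys[device_owner[device]]
        return device, f"enc({plaintext}, {key})"  # readable by Mallory

    def send_fixed(recipient, plaintext):
        # Pick the key from the intended recipient's identity, never
        # from routing state; a spoofed route is then a DoS at worst.
        key = identity_keys[recipient]
        return routes[recipient], f"enc({plaintext}, {key})"

    print(send_buggy("bob", "hi"))  # ('device-123', 'enc(hi, mallory-key)')
    print(send_fixed("bob", "hi"))  # still encrypted with bob-key
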
This isn't the only case where client code has used libsignal poorly.
The Bond Touch is a Bluetooth-connected bracelet that you wear.
Tapping it or drawing gestures sends a signal to your phone, which
culminates in a message being sent to someone else's phone which
sends a signal to their bracelet, which then vibrates and glows in
order to indicate a specific sentiment. The idea is that you can send
brief indications of your feelings to someone you care about by
simply tapping on your wrist, and they can know what you're thinking
without having to interrupt whatever they're doing at the time. It's
kind of sweet in a way that I'm not, but it also advertised "Private
Spaces", a supposedly secure way to send chat messages and pictures,
and that seemed more interesting. I grabbed the app and disassembled
it, and found it was using libsignal. So I bought one and played with
it, including dumping the traffic from the app. One important thing
to realise is that libsignal is just the protocol library - it
doesn't implement a server, and so you still need some way to get
information between clients. And one of the bits of information you
have to get between clients is the public key material.
Back when I played with this earlier this year, key distribution was
implemented by uploading the public key to a database. The other end
would download the public key, and everything would work out fine. And
this doesn't sound like a problem, given that the entire point of a
public key is to be, well, public. Except that there was no access
control on this database, and the filenames were simply phone
numbers, so you could overwrite anyone's public key with one of your
choosing. This didn't let you cause messages intended for them to be
delivered to you, so exploiting this for anything other than a DoS
would require another vulnerability somewhere, but there are
contrived situations where this would potentially allow the privacy
expectations to be broken.
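One obvious mitigation - sketched here as a possible approach, not a
description of how the app was eventually fixed - is for the directory
to accept a replacement key only if the upload is signed by the key
already on file, so a stranger can't overwrite an existing entry:

    # Sketch of one possible mitigation, not how the app was actually
    # fixed: replacements must be signed by the currently registered
    # key. First registration is still trust-on-first-use and would
    # need to be tied to account authentication in a real deployment.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ed25519

    directory = {}  # phone number -> raw public key bytes

    def raw(pub):
        return pub.public_bytes(serialization.Encoding.Raw,
                                serialization.PublicFormat.Raw)

    def upload_key(phone, new_pub, signature=None):
        existing = directory.get(phone)
        if existing is not None:
            current = ed25519.Ed25519PublicKey.from_public_bytes(existing)
            try:
                current.verify(signature, new_pub)
            except (InvalidSignature, TypeError):
                raise PermissionError("not signed by the registered key")
        directory[phone] = new_pub

    alice = ed25519.Ed25519PrivateKey.generate()
    mallory = ed25519.Ed25519PrivateKey.generate()
    upload_key("+15550001111", raw(alice.public_key()))
    try:
        # Mallory tries to overwrite Alice's entry with her own key.
        upload_key("+15550001111", raw(mallory.public_key()))
    except PermissionError as exc:
        print("rejected:", exc)
    # Alice can rotate her key by signing the new one with the old one.
    new_alice = ed25519.Ed25519PrivateKey.generate()
    upload_key("+15550001111", raw(new_alice.public_key()),
               alice.sign(raw(new_alice.public_key())))
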
Another issue with this app was its handling of one-time prekeys.
When you send someone new a message via Signal, it's encrypted with a
key derived from not only the recipient's identity key, but also from
what's referred to as a "one-time prekey". Users generate a bunch of
keypairs and upload the public half to the server. When you want to
send a message to someone, you ask the server for one of their
one-time prekeys and use that. Decrypting this message requires using
the private half of the one-time prekey, and the recipient deletes it
afterwards. This means that an attacker who intercepts a bunch of
encrypted messages over the network and then later somehow obtains
the long-term keys still won't be able to decrypt the messages, since
they depended on keys that no longer exist. Since these one-time
prekeys are only supposed to be used once (it's in the name!) there's
a risk that they can all be consumed before they're replenished. The
spec regarding pre-keys says that servers should consider
rate-limiting this, but the protocol also supports falling back to
just not using one-time prekeys if they're exhausted (you lose the
forward secrecy benefits, but it's still end-to-end encrypted). This
implementation not only had no rate-limiting, making it easy to
exhaust the one-time prekeys, it also failed to fall back to running
without them. Another easy way to force a DoS.
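A sketch of the server-side behaviour the spec is pointing at, with
illustrative names and limits rather than the Signal server's actual
API: hand out each one-time prekey exactly once, rate-limit fetches,
and omit the one-time prekey from the bundle once the supply runs out
rather than failing outright:

    # Sketch of the server behaviour described above; names, limits,
    # and the bundle layout are illustrative, not Signal's actual API.
    import time
    from collections import defaultdict, deque

    one_time_prekeys = defaultdict(deque)  # user -> unused prekeys
    signed_prekeys = {}                    # user -> current signed prekey
    fetch_log = defaultdict(deque)         # requester -> recent fetch times
    MAX_FETCHES_PER_HOUR = 20

    def fetch_prekey_bundle(requester, target, now=None):
        now = time.time() if now is None else now
        # Rate-limit requesters so one client can't drain the prekeys.
        recent = fetch_log[requester]
        while recent and now - recent[0] > 3600:
            recent.popleft()
        if len(recent) >= MAX_FETCHES_PER_HOUR:
            raise RuntimeError("rate limited")
        recent.append(now)

        bundle = {"signed_prekey": signed_prekeys[target]}
        if one_time_prekeys[target]:
            # Each one-time prekey is handed out exactly once.
            bundle["one_time_prekey"] = one_time_prekeys[target].popleft()
        # If the queue is empty the bundle simply omits the one-time
        # prekey: less forward secrecy, but senders can still encrypt.
        return bundle

    # Bob uploads two one-time prekeys; three senders fetch bundles,
    # and the third still gets a usable (if weaker) bundle.
    signed_prekeys["bob"] = "bob-signed-prekey"
    one_time_prekeys["bob"].extend(["otp-1", "otp-2"])
    for sender in ["alice", "carol", "dave"]:
        print(sender, fetch_prekey_bundle(sender, "bob"))
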
(And, remember, a successful DoS on an encrypted communications
channel potentially results in the users falling back to an
unencrypted communications channel instead. DoS may not break the
encrypted protocol, but it may be sufficient to obtain plaintext
anyway)
And finally, there's ClearSignal. I looked at this earlier this year
- it's avoided many of these pitfalls by literally just being a
modified version of the official Signal client and using the existing
Signal servers (it's even interoperable with Actual Signal), but it's
then got a bunch of other weirdness. The Signal database (I /think/
including the keys, but I haven't completely verified that) gets
backed up to an AWS S3 bucket, identified using something derived
from a key using KERI, and I've seen no external review of that
whatsoever. So, who knows. It also has crash reporting enabled, and
it's unclear how much internal state it sends on crashes, and it's
also based on an extremely old version of Signal with the "You need
to upgrade Signal" functionality disabled.
Three clients all using libsignal in one form or another, and three
clients that do things wrong in ways that potentially have a privacy
impact. Again, none of these issues are down to libsignal itself -
they're all in the code that surrounds it. And remember
that Twitter probably has to worry about other issues as well! If I
lose my phone I'm probably not going to worry too much about the
messages sent through my weird bracelet app being gone forever,
but losing all my Twitter DMs would be a significant change in
behaviour from the status quo. But that's not an easy thing to do
when you're not supposed to have access to any keys! Group chats?
That's another significant problem to deal with. And making the
messages readable through the web UI as well as on mobile means
dealing with another set of key distribution issues. Get any of this
wrong in one way and the user experience doesn't line up with
expectations, get it wrong in another way and the worst case involves
some of your users in countries with poor human rights records being
executed.
Simply building something on top of libsignal doesn't mean it's
secure. If you want meaningful functionality you need to build a lot
of infrastructure around libsignal, and doing that well involves not
just competent development and UX design, but also a strong
understanding of security and cryptography. Given that Twitter has
lost most of its engineering staff and is led by someone who's
alienated all the cryptographers I know, I wouldn't be optimistic.
Tags:
* advogato,
* fedora
7 comments
no subject
Date: 2022-12-09 06:32 pm (UTC)
From: (Anonymous)
I believe that nostr, or something like it, will be the go-to protocol
for secure messaging in the near future. I love Signal but tech
changes.
https://github.com/nostr-protocol/nostr
no subject
Date: 2022-12-09 09:13 pm (UTC)
From: [personal profile] mjg59
nostr is a protocol for broadcasting (signed) plaintext messages -
Signal is a protocol for sending 1:1 encrypted messages. They're
different problems that have different solutions.
Nostr
Date: 2022-12-09 10:07 pm (UTC)
From: (Anonymous)
Anything done with Signal (1:1, group messages, voice chat) can be
done with Nostr as well.
https://github.com/emeceve/loquaz
Proof of concept voice chat on Nostr:
https://github.com/Giszmo/Nostr-Voice-Chat/blob/master/Readme.md
This is billed as "secure clipboard" but it's encrypted bidirectional
text messaging in real time:
https://sendstr.com
Re: Nostr
Date: 2022-12-10 09:38 am (UTC)
From: (Anonymous)
The terminology used here is very confusing. There are also Signal
the app and Signal the transport protocol, and neither is libsignal.
libsignal implements the Signal Double Ratchet (AKA Axolotl) and other
cryptographic protocols used by Signal.
A quick skim of loquaz led me to the protocol description Nip 4
Direct Messages of nostr:
https://github.com/nostr-protocol/nips/blob/743e43a8d4bf4a37022e3b6551524b12e7cc54a0/04.md
That not only
exhibits the pitfalls explained in this blog post, it also doesn't
even try to have the security goals typical of modern encrypted
messengers. E.g. Nip 4 provides no forward secrecy, nostr overall
does not yet have provisions for key rotation, nor an interaction
flow for verifying identities.
It might be a good idea for nostr to adopt libsignal or vodozemac or
something similar for encrypting messages, so that they can
concentrate on getting the "more than" parts right and innovate where
they can make a difference with their protocol.
People
Date: 2022-12-10 08:09 am (UTC)
From: (Anonymous)
I think the most interesting part is your implied claim that only a
few people in the world can design and audit a properly secure system
of this kind. And that they can't be bought with money. Did I get you
right on this?
Re: People
Date: 2022-12-10 08:19 am (UTC)
From: [personal profile] mjg59
I've no doubt there are people who could do this and who could be
bought, but they've mostly already been bought by other people.
Re: People
Date: 2022-12-10 09:16 am (UTC)
From: (Anonymous)
Haha.. nice