[HN Gopher] Twilio incident: What Signal users need to know
___________________________________________________________________
Twilio incident: What Signal users need to know
Author : input_sh
Score : 382 points
Date : 2022-08-15 17:12 UTC (5 hours ago)
(HTM) web link (support.signal.org)
(TXT) w3m dump (support.signal.org)
| g_sch wrote:
| This info gives us an interesting opportunity to estimate the
| rate at which Signal is adding new users. They've been very
| tight-lipped (understandably) about their usage stats but
| anecdotally they seem to be an increasingly common presence on my
| friends' phones, even the non-techies.
|
| As far as I can tell, Signal uses Twilio only to send SMS for
| phone number verification. Verification happens when a user
| registers a new number or changes the number on their existing
| account.
|
| The rate at which Signal is adding new users could be calculated
| by:
|
| 1900 * (proportion of new registrants among SMS recipients) /
| (length of Twilio incident)
|
| You could probably make some common-sense assumptions about the
| first variable. But I can't find any publicly available info on
| when Twilio was first compromised. Their press release only
| mentions that they discovered the intrusion on August 4, which is
| presumably close to the end date of the incident. Does anyone
| know what the estimated start of the incident might be?
| gsdofthewoods wrote:
| Signal's SMS registration codes expire after a few minutes, so
| you wouldn't even need to know the duration of the incident.
| Let's be conservative and say the codes expire after 5 minutes
| (it's probably shorter), then Signal is registering 380 devices
| a minute.
| g_sch wrote:
| My reading of the post is that they determined the "1,900
| users" figure by the number of users who had requested a code
| during the duration of the Twilio incident, as the attacker
| could have accessed their SMS messages at any point during
| the compromise:
|
| > During the window when an attacker had access to Twilio's
| customer support systems it was possible for them to attempt
| to register the phone numbers they accessed to another device
| using the SMS verification code. The attacker no longer has
| this access, and the attack has been shut down by Twilio.
| sgc wrote:
| At least number of requests + number of open unverified and
| unexpired requests. So you need to guess average length of
| time to verify and the abandon rate.
|
| It also seems like there is also a buffer period when the
| numbers are registered and not yet purged from the twilio
| system:
|
| > 1) their phone numbers were potentially revealed _as
| being registered_ to a Signal account, or 2) the SMS
| verification code used to register with Signal was
| revealed.
|
| We don't know how often Signal purges that, so, although
| unlikely, it could be a day or a week or more of
| registrations.
| chitowneats wrote:
| "We conducted an investigation into the incident and determined
| the following."
|
| Guys how is the punctuation here not a colon?!
|
| This is an extremely serious issue according to my English Major
| inclination towards pedantry.
| hammyhavoc wrote:
| I'm reading yet another argument against centralization.
|
| Matrix protocol, anybody?
| [deleted]
| lutoma wrote:
| How is this an argument against Signal and/or centralization?
| Pretty much the exact same thing could have happened with
| matrix 3pid servers.
|
| I really like Matrix (and Signal), I use it and even run a
| public Matrix home server, but jesus I can't stand obnoxious
| Matrix fanboys who feel the need to shoehorn some Matrix plug
| into literally every conversation involving Signal.
| hammyhavoc wrote:
| Because all of those users are exposed by the same SaaS
| fault. Too many digital eggs in one digital basket.
|
| What is there to plug? It's FOSS. Use it, or don't, who
| fucking cares? Whether you or anybody else use it or not is
| of absolutely zero consequence to me.
| uncomputation wrote:
| Yes, Signal's phone number requirement is bad. But, given that,
| the fact that they don't store any messages on their side and
| everything is client side is still a huge benefit over a lot of
| other apps and still a huge step forward for privacy! Criticism
| is definitely important but I just wanted to put that out there
| that all things considered, Signal is still very much a good
| thing.
| A4ET8a8uTh0 wrote:
| I will admit that this requirement always confused me. What is
| there to benefit from by requiring it?
| konschubert wrote:
| Without it, you wouldn't be able to see which of your
| contacts are on signal.
|
| And then nobody would use Signal.
|
| It's very unfashionable today, but they decided to not let
| perfect be the enemy of good.
| panick21_ wrote:
| While that is nice, I see no reason to require that. Some
| people just don't care about that feature.
| dcow wrote:
| What is this future UX you're imagining? How does the
| future solve the contacts book/short identifiers problem?
| fezfight wrote:
| Just like this website. Usernames. Easy peasy.
| dcow wrote:
| And how do you claim a username?
| [deleted]
| aaaaaaaaaaab wrote:
| By knowing the password?
| panick21_ wrote:
| I'm not saying its an amazing experience or solves the
| problem systematically. Again, some people simply don't
| need these features. You can literally just take part of
| the public key and that's it. That is totally fine for
| some use-cases.
| dcow wrote:
| Then use urbit. It already exists.
| panick21_ wrote:
| Yeah but the whole point of Signal is to allow secure
| very secure communication. With little effort they could
| allow this usecase. It would address a major criticism
| and they already have the underlying infrastructure.
|
| But I guess you can just keep moving the goal post.
| mbrubeck wrote:
| It means that Signal doesn't need you to create or upload a
| list of your contacts; it uses the existing contact list from
| your phone. This also lets you use Signal to replace the
| default text messaging app on Android, automatically
| upgrading conversations to be encrypted when possible. This
| in turn means that just using Signal to communicate with
| someone becomes a normal, everyday activity, and less of a
| sign of suspicious activity (from the point of view of law
| enforcement, etc.).
| fezfight wrote:
| I think the problem is that it's a requirement, not a
| feature you can choose to use. I'd be more inclined to use
| Signal if I choose to use only a user/pass. Just need a
| block function.
| dcow wrote:
| How would other people contact you?
| getcrunk wrote:
| With your username?
| dcow wrote:
| How do you text a username?
| Volundr wrote:
| By opening signal, putting in the username and sending a
| text. Signal only uses MMS as a fallback when
| communicating with someone not on signal. When both
| parties are on signal SMS/MMS is not used. Presumably
| they are OK with not being able to communicate with
| people not on signal.
| sofixa wrote:
| Why do you need to text? Sending messages without the
| arcane UX and unreliability of old school SMS or the
| tries at improving it(RCS) is simply better.
| tjoff wrote:
| You do know that sharing your contact list is optional?
| AareyBaba wrote:
| On an iPhone, what does 'sharing your contact list' imply ?
|
| Does the app get just name and phone numbers or all the
| meta data like address and personal notes that I put into
| my contacts ? I haven't been able to figure this out - does
| anyone know what Apple's policy is on this ?
| mercutio2 wrote:
| Contact Notes, specifically, require [0] a special
| entitlement to access them, so normal chat apps should
| never have access to them on iOS.
|
| All other fields, for all contacts, are accessible once
| Contacts access is granted.
|
| [0] https://developer.apple.com/documentation/bundleresou
| rces/en...
| adrr wrote:
| Using the phone number as the identity. You need to verify
| ownership of the phone number to prove your identity. I guess
| they could use email as an alternative.
| rakoo wrote:
| It's the easiest anti-spam measure, because it makes it
| expensive enough that spammers won't have a million accounts.
| Fake account detection is effectively put on the back of
| mobile carriers
| moontear wrote:
| Less spam. A phone number is a much better deterrent against
| opening a hundred accounts than let's say an email address.
| xorcist wrote:
| Beside the technical reasons, tech companies are valued from
| their access to contact information, and Signal has had huge
| investments made. This despite the obvious fact that it is
| highly unlikely that Signal would benefit directly from that
| data in any way. But much Signal's design is taken from the
| companies that came before, which did.
| dogecoinbase wrote:
| They trade your privacy for not having to figure out some
| technical issues.
|
| Also, when they rolled out their cryptocurrency payment
| system (after keeping the server-side source code secret for
| more than a year, during which Moxie, who is a paid advisor
| to that same cryptocurrency, denied they were working on a
| payment system), they got KYC for free.
| MobiusHorizons wrote:
| I believe they have covered this question many times before,
| but I don't see an answer on signal's website. From memory,
| it had to do with not wanting to own the user's contact list.
| Using a phone number allowed them to rely on a contact list
| on the users phone, which is not tied to the signal account.
| There was more nuance than that though.
| charcircuit wrote:
| Why not just store a contact list of usernames on the phone
| though?
| dcow wrote:
| What would this list contain? You don't have a signal
| username. If you did you'd have to claim it somehow
| (degenerates to email or phone verification). It's not
| that simple.
|
| Using phone numbers allows signal to plug into the
| existing state of the world and leverage it to upgrade
| the security of messaging for everyone who uses it. The
| one compromise is that it treats phone number as a short
| identifier (importantly, not cryptographic, it uses real
| crypto for that).
|
| If you don't use phone numbers, your product would look
| more like Keybase. You have to somehow facilitate key
| exchange between people in a way that's actually usable.
| Otherwise all your security benefits go out the window
| because nobody uses your product. Signal understands this
| nuance perfectly which is why they're a successful
| product.
| k1t wrote:
| If I don't use Signal, then I'm not going to keep a list
| of my friends' Signal usernames on my phone.
|
| If I subsequently sign-up for Signal, then I have no way
| to discover which of them use Signal - short of
| contacting them via some other method and asking for
| their Signal username, if any.
|
| By making the Signal username the same as the user's
| phone number, I actually DO have a list of Signal
| 'usernames' on my phone already. As soon as I sign-up, I
| can send my list of friends' phone numbers to Signal and
| they can tell me which of those people have Signal
| accounts.
| autoexec wrote:
| > it had to do with not wanting to own the user's contact
| list. Using a phone number allowed them to rely on a
| contact list on the users phone, which is not tied to the
| signal account.
|
| That doesn't make any sense. Signal did the total opposite.
| It started keeping sensitive user data in the cloud
| including your name, your photo, your phone number, and a
| list of your contacts. It stores that data on their servers
| permanently.
| MAGZine wrote:
| The list of your contacts bit is patently false, they've
| discussed in detail about how they securely organize
| contact lists: https://signal.org/blog/contact-discovery/
|
| Signal has always kept your name/pic/etc on their servers
| I believe, because otherwise you turn signal into a P2P
| application, which it is not. It's a fully encrypted
| application that stores minimal information. It is NOT
| P2P.
|
| For example, your messages are stored on their servers
| until they're delivered.
| autoexec wrote:
| > The list of your contacts bit is patently false,
|
| You are wrong and your blog post from 2014 doesn't take
| into account their new data collection practices. See:
| https://community.signalusers.org/t/proper-secure-value-
| secu...
|
| If this is the first time you're hearing about the data
| collection, that should tell you everything you need to
| know about how trustworthy Signal is.
|
| > Signal has always kept your name/pic/etc on their
| servers I believe
|
| Wrong again I'm afraid. There really was a time when
| Signal didn't collect and store any user data on their
| servers. They've repeatedly bragged about times when
| governments have come around asking them for data and
| they were able to turn the feds away because that data
| was never collected in the first place. That changed with
| the update which added pins. Today, Signal now collects
| that very same data.
| pabl8k wrote:
| I don't think this is true, do you have a source?
|
| They store _registered users phone numbers_ and allow
| discovery by making a request with a hashed version of
| the phone numbers on your contact list. They add an extra
| layer to allow attestation of the software doing this
| using Intel 's secure enclave. They give many examples of
| responding to warrants with only whether the number has
| been registered and the timestamp of registration, which
| they explain is the only information they hold.
|
| Private Contact Discovery:
| https://signal.org/blog/private-contact-discovery/
| autoexec wrote:
| Your 2017 blog post is outdated.
|
| See:
|
| https://community.signalusers.org/t/can-signal-please-
| update...
|
| and
|
| https://community.signalusers.org/t/dont-want-pin-dont-
| want-...
|
| See here for a discussion on how Intel's 'secure' enclave
| won't save you:
| https://community.signalusers.org/t/proper-secure-value-
| secu...
| dcow wrote:
| There's a horrible conflation of concepts here. A pretty
| big one.
|
| When people talk about cloud services, they generally
| mean part of an application that runs on the cloud that
| _participates_ as a trusted actor in the application 's
| trust model.
|
| What people in the linked thread are realizing is that
| "signal has a server" and they are confused because they
| thought signal didn't have a server, or something.
|
| So, what's important about Signals servers is that,
| outside of initial key exchange which is verified by two
| parties out of band, they are not a trusted entity, ever.
| When you send a message it goes through signals servers.
| When you sync your profile picture with other devices,
| same thing. The data transits signals servers. This is
| made possible because of cryptography. By encrypting the
| data in a way that is indecipherable by 3rd parties
| (Signal's servers included) your data is isomorphic to
| random noise. So, the only thing Signal needs to do is
| route the random noise to the right place. If it doesn't
| do that, it's a denial of service and about the only
| attack you're vulnerable to if you use Signal. Otherwise,
| the receiver gets the exact random noise that you sent,
| but only they can make sense of it because of the miracle
| of cryptography.
|
| If you're really doing to throw a fit because Signal
| syncs a profile picture between your devices using the
| same level of crypto as is used for messaging then you're
| honestly crazy.
|
| No. Signal did not "not have a cloud" and now they "have
| a cloud". Not by any reasonable interpretation of the
| events.
| autoexec wrote:
| Signal has a "cloud" a server where they collect and
| store your name, your phone number, your photo, and list
| of every person you've contacted using Signal. That data
| isn't some ephemeral encrypted string that is only
| present when you "sync your profile picture" or when you
| send a message. It is collected and stored on their
| server where it will sit for at least as long as you have
| an account.
|
| The justification for it was so that you could get a new
| device and have Signal download all of your info from
| your Signal's server down to your device. The data
| collection first takes place as soon as you set a pin or
| opt out of setting one (at which point a pin is assigned
| for you automatically).
|
| The data is encrypted, but that does not make it
| impossible for signal or for 3rd parties to access it.
| see: https://community.signalusers.org/t/proper-secure-
| value-secu...
|
| If you're a whistleblower or an activist, a list of every
| person you've been contacting using Signal is a highly
| sensitive data. No matter how you want to spin it, Signal
| is hosting that highly sensitive user data on their
| servers where Signal and 3rd parties alike could possibly
| gain access to them.
| dcow wrote:
| You should assume every bit of information sent on the
| internet is archived in a massive warehouse somewhere,
| because it is.
|
| Thus, we have to trust the cryptography itself. Sending
| an encrypted message to a peer is no different from
| sending an encrypted message to yourself (other than the
| use of symmetric vs asymmetric crypto). The fact that you
| send a message to yourself which is stored persistently
| on signal's server doesn't change anything (and it's even
| opt in AFAIU). Sure, there are concerns about the
| implementation, but until someone can decrypt the blobs
| in storage (the crypto is broken) I don't see reason for
| outrage.
|
| Pretty simply, if you don't trust the crypto then you
| have a very different threat model to pretty much
| everyone else. If you don't trust crypto you can't use
| the internet because you can't use TLS. You're relegated
| to networks where you trust every single node (where you
| don't _need_ crypto) and other such stuff. Most of us
| trust the crypto because it 's really the only practical
| option. I don't see the problem.
| autoexec wrote:
| > You should assume every bit of information sent on the
| internet is archived in a massive warehouse somewhere,
| because it is.
|
| Leaving aside the whataboutism here, you shouldn't assume
| that when you're using a secure messaging app that claims
| to be designed to never collect or store user data.
| Signal makes that claim at the start of their privacy
| policy and it is a lie. It started out true, but they
| begain colleting data and they refuse to update their
| policy.
|
| > Thus, we have to trust the cryptography itself.
|
| No one is suggesting we can't trust cryptography. The
| fact is that doesn't matter how strong your algprythm is
| when you're encrypting that data with a 4 digit number.
| You can 100% "trust the cryptography" and still
| acknollege that it won't take very long for someone to
| brute-force your pin and get your data plain text.
|
| > Sending an encrypted message to a peer is no different
| from sending an encrypted message to yourself... (and
| it's even opt in AFAIU).
|
| This has nothing to do with "sending data to yourself"
| and everything to do with Singal collecting data from you
| and storing it for itself. There is a massive difference
| between encrypting something yourself and sending that
| data to yourself and someone else copying data from you,
| encryping it, and saving it for themselves.
|
| This data collection is also not opt in. At all. You can
| opt out of setting a pin, but if you do one will be
| automatically generated for you and your data still gets
| silently uploaded to Singal servers to be stored. The
| community spent months begging for Signal to add a way to
| opt out of this data collection, but they were ignored.
|
| See:
|
| https://community.signalusers.org/t/dont-want-pin-dont-
| want-...
|
| https://community.signalusers.org/t/mandatory-pin-
| without-cl...
|
| > Pretty simply, if you don't trust the crypto then you
| have a very different threat model
|
| "The crypto" isn't the problem here. The problem is
| Signal collecting sensitive user data and permanently
| storing it on their servers in a manner that could allow
| it to be accessed by third parties and then not clearly
| disclosing that to their users and refusing to update
| their privacy policy to reflect the change.
| dcow wrote:
| Signal _can 't possibly read the data_. How is that for
| itself? Only _you_ can decrypt it! Signal doesn 't _have_
| your data. They have garbage bits of effectively random
| noise.
|
| You can prove it to yourself. Go take one of Signal's
| servers and try to find someone else's data there. You
| won't.
|
| Why would Signal update their privacy policy to reflect
| the desire of misguided fear mongers? I certainly
| wouldn't do that if I were them.
| andrepd wrote:
| This is absolutely NOT true. (1) Signal doesn't store
| your contacts, and (2) Signal only stores a name and a
| profile photo _if_ you want, and _in a secure way_
| https://signal.org/blog/signal-profiles-beta/
| autoexec wrote:
| I'm sorry to be the one to tell you, but Signal 100%
| stores your contacts. They keep your name, your photo,
| your phone number, and a list of every person you've
| contacted using Signal. That data is permanently stored
| on their servers.
|
| See: https://community.signalusers.org/t/can-signal-
| please-update...
|
| > "This should be updated for the recent changes where
| contacts are uploaded to Signal's servers and stored
| permanently along with Groups V2 and other data,
| protected by a 4-digit minimum PIN and Intel SGX - there
| have been concerns 5 raised 2 in these forums,
| particularly if one of your contacts chooses a brute-
| forceable PIN which in the context of an Intel SGX
| vulnerability 1 could leak a lot of contact data if
| hacked, even if you choose a strong password."
|
| See the two links sited in that comment for more
| information on why it isn't actually stored in "secure"
| way.
| nobody9999 wrote:
| >That doesn't make any sense. Signal did the total
| opposite. It started keeping sensitive user data in the
| cloud including your name, your photo, your phone number,
| and a list of your contacts. It stores that data on their
| servers permanently.
|
| This is the first I've heard of that. And if it's true,
| it's a big problem.
|
| Is there any documentation of this behavior that you can
| direct me to?
| autoexec wrote:
| You aren't alone. There are a ton of people who have no
| idea Signal has been collecting and storing sensitive
| user data on their servers. There was a ton of discussion
| about it when the update rolled out and a lot of backlash
| from their users, which they ignored. They've since
| refused to update their privacy policy as well which I
| personally see as a canary warning users to avoid their
| service.
|
| https://community.signalusers.org/t/proper-secure-value-
| secu...
|
| https://community.signalusers.org/t/what-contact-info-
| does-t...
|
| https://community.signalusers.org/t/can-signal-please-
| update...
|
| https://community.signalusers.org/t/dont-want-pin-dont-
| want-...
|
| https://community.signalusers.org/t/sgx-cacheout-sgaxe-
| attac...
| nobody9999 wrote:
| >You aren't alone. There are a ton of people who have no
| idea Signal has been collecting and storing sensitive
| user data on their servers. There was a ton of discussion
| about it when the update rolled out and a lot of backlash
| from their users, which they ignored. They've since
| refused to update their privacy policy as well which I
| personally see as a canary warning users to avoid their
| service.
|
| Edit: This bit is apparently not the case. And more's the
| pity.
|
| ====Section affected by edit=========
|
| I can't (and wouldn't try to) speak for anyone else, but
| if you disable the PIN functionality[0], Signal doesn't
| upload the information you're talking about.
|
| ==== End section affected by edit=========
|
| Which isn't a new change (IIUC, PIN disablement was
| introduced ~2 years ago). I'd say that using the PIN
| functionality should be opt-in rather than opt-out, so in
| that respect I agree.
|
| Further, Signal should probably update their policy
| documents to reflect the current state of affairs.
|
| But I stand by my previous comment[1].
|
| [0] https://support.signal.org/hc/en-
| us/articles/360007059792#pi...
|
| [1] https://news.ycombinator.com/item?id=32474579
| autoexec wrote:
| > I can't (and wouldn't try to) speak for anyone else,
| but if you disable the PIN functionality[0], Signal
| doesn't upload the information you're talking about.
|
| This is also incorrect. If you opt out of setting a pin,
| Signal creates a pin for you and uses that to encrypt the
| data it uploads to their servers. Again, not your fault.
| Signal has gone out of their way to avoid answering
| direct questions about this in a plain way.
|
| See: https://old.reddit.com/r/signal/comments/htmzrr/psa_
| disablin...
| CodesInChaos wrote:
| Making it more expensive to create spam accounts, I'd guess.
| TheCraiggers wrote:
| Besides fighting spam accounts, finding and connecting with
| others is easier. No more needing to ask somebody what their
| account name / friend code / ICQ number / whatever UID a
| system uses to add them to your contacts. It's not a good
| user experience to type in someone's sometimes insane
| username.
|
| Meanwhile, someone in my contacts that installs Signal
| automatically sees my name pop up and can start chatting. Far
| easier, and helps drive adoption.
| cmeacham98 wrote:
| There's no need to make it a requirement for this. Matrix
| has similar functionality, but doesn't _require_ that you
| add your phone, just makes it an option.
| Macha wrote:
| Except the reality is that I'm asking for a phone number to
| add people (well on whatsapp here in europe), and most
| screen names people chose can be communicated verbally
| while phone numbers often have people resorting to handing
| the phone displaying the number around.
| dijonman2 wrote:
| It also pushes fraud detection up the pipeline to mobile
| operators.
| [deleted]
| dcow wrote:
| I wouldn't even call it bad. In may ways it's good, actually.
| It's good because it allows signal to build a product that is
| relevant and usable. Phone numbers only connect people and are
| a bridge to allow all the perfect crypto to do the legwork. The
| knee jerk "phone bad" reaction is understandable, sure. But I
| don't think it's warranted for Signal. Signal would look like
| Keybase without phone numbers. Keybase (or their key exchange
| UX, at least) is great. But it's not the same product.
| baby wrote:
| People use telegram more and more which is based on
| usernames...
| [deleted]
| cookiengineer wrote:
| > don't store any messages on their side
|
| Google Play services are still required for the official builds
| because of the (unverifiable to be really) encrypted backups.
|
| > everything is client-side
|
| Signal's FOSS fork developers would disagree. They got outright
| legal problems after they wanted to implement an open source
| alternative. Most APIs in regards to contact management are
| server-side. There's Molly as a younger fork but I'm waiting
| for Signal to write them also a cease and desist letter.
|
| Honestly this is why I think that Signal should be treated the
| same like WhatsApp. Supposedly end to end encrypted, but only
| until you suddenly have the FBI with printed out chats in front
| of your door.
|
| As long as Signal uses proprietary services and contains
| proprietary blobs in their (default aka Play store-provided)
| app we have to treat it as an unsecure messaging system.
|
| Especially given the RCEs that it had in the past, where it was
| as simple as injecting an HTML with a script tag to install
| malware on your system.
| exyi wrote:
| we still don't know of anyone with their messages printed
| out. Signal probably doesn't have all the contacts hoarded on
| their server, while WA certainly does. For me, there is a big
| difference in "might be buggy" vs "certainly is privacy
| hostile"
|
| I still prefer Matrix, but Signal is clearly the next best
| thing for chats. And it also has quite a number of non-HN
| users :)
| MayeulC wrote:
| > Google Play services are still required for the official
| builds because of the (unverifiable to be really) encrypted
| backups.
|
| ??? I use the official build with encrypted backups without
| Google services, and have been doing so for at least 3 years.
| I've been forward-carying my backup since 2016, too.
| lrvick wrote:
| I refuse to use or recommend Signal due to blatantly bad design
| choices that put people that need privacy most at risk like
| security researchers, journalists, abortion seekers, or
| dissidents.
|
| If you learn a contact phone number then you can buy their
| location history. Requiring phone numbers and requiring you
| share them with everyone you contact is brain dead.
|
| This alone is bad enough to abandon Signal but then consider
| they have centralized control of client binaries, and metadata
| protection anchored on centralized SGX they can trivially
| access. This negligent design makes them vulnerable to coercion
| or even court orders if any judge realizes they actually -can-
| decrypt messages and dump metadata.
|
| Matrix supports Signal crypto but in a federated network with
| no PII requirements like Signal. Also no lock-in or central
| control of apps.
| Thorentis wrote:
| > abortion seekers
|
| Uhhh, not sure what koolaid you've swallowed, but including
| them in that list is almost laughable.
| HideousKojima wrote:
| I wonder how the people putting "abortion seekers" on such
| lists would feel if I included "self-defense rights
| advocates" for people 3d printing guns or smuggling them in
| from abroad on similar lists.
| Volundr wrote:
| I'd wonder if it's for self defense why you didn't buy
| your firearm legally, since, you know, it's legal to do
| so. I haven't done a deep dive, but as far as I can tell
| in most cases it's legal to 3d print too, though
| admittedly that's something that there are some semi-
| serious efforts to change.
|
| In other words I'd suspect the classification of "self
| defense advocate" to be a self serving branding effort
| since there are legal ways to accomplish the same, but I
| wouldn't doubt the need of this person for a secure
| messaging platform.
| HideousKojima wrote:
| >I'd wonder if it's for self defense why you didn't buy
| your firearm legally, since, you know, it's legal to do
| so.
|
| Outside of the United States, that's usually not the
| case. Even if countries do allow private gun ownership,
| the restrictions on how to obtain them (and what they can
| legally be used for, what kinds are available, etc.) are
| exceptionally onerous.
|
| And even within the United States, there are individual
| states that have attempted to severely curtail private
| firearm ownership. Were it not for certain Supreme Court
| decisions, handgun ownership would be outright illegal in
| the District of Columbia and likely in several other
| states.
| Volundr wrote:
| > Outside of the United States, that's usually not the
| case.
|
| A good and fair point. I'd fallen into the trap of being
| too US centric on HN.
|
| > Were it not for certain Supreme Court decisions,
| handgun ownership would be outright illegal in the
| District of Columbia and likely in several other states.
|
| Sure, were it not for the Supreme Court. But as there
| remains plenty of ways to legally obtain guns in the US,
| I'm still going to doubt that you've resorted to gun
| smuggling for "self defense"
| sofixa wrote:
| > Even if countries do allow private gun ownership, the
| restrictions on how to obtain them (and what they can
| legally be used for, what kinds are available, etc.) are
| exceptionally onerous
|
| Citation needed. I, and probably the majority of the
| citizens of those countries do not consider the standard
| test/psych eval/background check/random checks in the
| future to make sure you're following the rules to be
| "exceptionally onerous". And i think most non-Americans
| would agree that adding some friction to a fringe case
| (owning a personal firearm for protection or fun is not
| something most people do, even in the US) is worth it if
| it nearly eliminates blatant misuses of firearms - either
| making suicides easier and more terminal, enabling easier
| revenge murders, or making your average school/public
| place shooting easier.
|
| What would you consider a just middle ground between
| "onerous requirements" and "everyone can buy any weapon
| without any requirements but paying for it"?
| HideousKojima wrote:
| >Citation needed. I, and probably the majority of the
| citizens of those countries do not consider the standard
| test/psych eval/background check/random checks in the
| future to make sure you're following the rules to be
| "exceptionally onerous".
|
| Just because you've accepted the boot on your neck
| doesn't make it not a boot. When (not if) a currently
| free and democratic Western nation decides to be not so
| democratic anymore (whether due to invasion,
| international pressure from economic partners like Russia
| and China, or just that the assholes in power decided to
| seize even more power) the citizens (or rather subjects)
| of those countries will have no means of fighting back.
| You can already see it with several countries' response
| to covid.
|
| >What would you consider a just middle ground between
| "onerous requirements" and "everyone can buy any weapon
| without any requirements but paying for it"?
|
| My feelings on gun control can be summed up as "I want
| mail order rocket launchers delivered to my doorstep."
| The state should fear its people, not the other way
| around, and the best way to ensure that is to give the
| people the means to put a bullet (or several) into any
| would-be tyrants.
|
| And, regardless of what "the majority of citizens" feel
| about bootlicking and trampling on their own natural
| rights, advances in home manufacturing are quickly making
| any efforts to do so a pipedream.
| lrvick wrote:
| I would have no problem seeing that included on such
| lists either. I have friends who hunt and I myself enjoy
| shooting on a range once in a while. I also know single
| parents that live alone in sketchy areas that are well
| trained and level headed enough to trust with firearms
| for home self defense.
|
| There are almost always reasonable uses of many services
| and tools we tend to have knee-jerk-ban reactions to as a
| society.
| Volundr wrote:
| How do you figure? Several states had abortion laws that
| were never repealed and others have trigger laws on the
| books that have gone into effect or will shortly, so yes
| you can be prosecuted for abortion now. Hence the need for
| privacy.
| lostcolony wrote:
| Why is it laughable?
| https://www.npr.org/2022/08/12/1117092169/nebraska-cops-
| used...
| WaitWaitWha wrote:
| I agree none of this is laughable.
|
| Irrelevant of your position, please read the entire
| article that you referenced for facts (pre-RvW overturn,
| pregnancy at 23~28 (?) weeks, took Pregnot, buried in
| back yard, Nebraska law was at that time 20 weeks).
|
| The Vice article seems to have quite a lot more facts and
| references. https://www.vice.com/en/article/n7zevd/this-
| is-the-data-face...
| HideousKojima wrote:
| So Nebraska cops went through the proper channels and
| obtained a warrant in order to investigate a murder?
| Sounds perfectly reasonable to me.
| dcow wrote:
| How does signal allow you to learn someone's phone number
| from message history? As far as I understand the only thing
| one can learn by inspecting signal's protocol is that:
|
| 1. generally, a certain phone number _uses_ signal
|
| (1) happens once, upon registration of your phone number. You
| don't see history of which phone numbers are communicating,
| do you?
|
| In other words, you don't need Signal to buy someone's
| location history. You just need their phone number and Signal
| doesn't particularly provide that to you.
|
| Per my understanding, Signal provides subpoena/nation-state
| resistant level security and _is_ in fact used by many people
| with high security needs.
| andrepd wrote:
| I agree that Signal does have several questionable design
| decisions, but that's not one of them. You can get a sim,
| register with it, and take it back out. There, no location.
| Or even better, you can simply get a voip number.
|
| Bottom-line, despite Signal's issues it is still the #1 IM
| app that I recommend to "normal people" seeking to have
| private conversations. No, it's not perfect, yes, it's a
| massive improvement over
| facebook/instagram/whatsapp/telegram/etc.
| nr2x wrote:
| You need a government issued ID to get mobile service many
| places. You can't just get a SIM in the same way you get a
| burner email address.
| lrvick wrote:
| You can not buy a sim without KYC in almost all countries.
| Also most users will not realize these consequences and
| will just assume the defaults on Signal protect them with
| their every day phone number and SIM.
|
| Also facebook/instagram/whatsapp/telegram/etc are not
| trying to advertise themselves for the high risk use cases
| Signal is actively promoted for. I obviously do not
| recommend anyone use those either, regardless.
|
| Matrix is all I suggest for most people.
| pseudo0 wrote:
| > You can not buy a sim without KYC in almost all
| countries.
|
| I'd be curious to see stats on this. At least in the US,
| it is very easy to buy a SIM and sign up for a pre-paid
| plan with zero KYC.
| lrvick wrote:
| The US is actually the only exception I am aware of world
| wide which gives us a distorted view of this problem.
| jrockway wrote:
| > still the #1 IM app that I recommend to "normal people"
|
| What app do you recommend to HN types? (I'm getting ready
| to switch messaging platforms. All my friends use iMessage
| and I'm so tired of typing on my phone at them. They can be
| lured over to something else with the promise of
| encryption.)
| lrvick wrote:
| Try Element.
|
| Effectively the same crypto as Signal but you can be
| anonymous as needed. Also decentralized with many app
| options.
| mxuribe wrote:
| And if Element is not the desired application to use
| matrix, then there are plenty of others, and available
| across many device and OS platforms:
| https://matrix.org/clients/
|
| ...Of course, Element remains the oldest and likely still
| most feature-full app.
| jrockway wrote:
| I haven't really looked into Matrix. I appreciate the
| nudge!
| nobody9999 wrote:
| >I refuse to use or recommend Signal due to blatantly bad
| design choices that put people that need privacy most at risk
| like security researchers, journalists, abortion seekers, or
| dissidents.
|
| I understand your concerns, and if I was a security
| researcher, journalist, abortion seeker or dissident, I
| wouldn't use Signal either.
|
| But, like the vast majority of us, I am not any of those
| things. As such, for _my_ (and most others) use case, Signal
| is great.
|
| For those at risk from highly motivated and/or state-level
| actors, Signal isn't nearly enough. Nor, unless you _build_
| and run your own servers and clients ( _and_ never screw up
| your OpSec), is Matrix.
|
| Signal isn't perfect. However, for _most_ people, it 's _good
| enough_.
|
| Don't make perfect the enemy of the good. Because perfect
| doesn't exist.
| lrvick wrote:
| Those of us that do not need high privacy today might need
| it tomorrow, or maybe someone we frequently communicate
| with.
|
| We also have a responsibility to favor tools and practices
| that make those that really need privacy not stand out.
|
| Element or other Matrix clients are easy to use and lack
| the serious flaws I outlined for Signal.
| nobody9999 wrote:
| I'd point out that for _most_ people (I suppose that
| could change, and I wouldn 't be upset if such changes
| resulted in better privacy), messaging is _often_ phone-
| based and includes folks who use secure methods like
| Signal and Matrix _as well as_ those who use iMessage and
| OEM SMS clients.
|
| When it comes to that sort of messaging ("I'm running a
| few minutes late and will meet you inside the
| restaurant," or similar) I don't (and won't) separate
| those out. I just use Signal for all such messages.
|
| Which makes for inconvenience when (especially iPhone
| users) install Signal _and_ still use iMessage.
|
| I'd add that if I have something to discuss that I don't
| want recorded (don't forget that it's not just _your_
| device that puts you at risk, anyone who 's _received_
| such messages do so as well), I 'll use encrypted voice
| calls (with the assumption -- valid or not -- that the
| other participant(s) aren't recording that conversation)
| with Signal or Matrix.
|
| In both my personal and professional life, I've always
| made sure to only put _in writing_ that which I wouldn 't
| care if it was shared with the world.
|
| Which is no different than it's ever been. I'm not sure
| why _anyone_ thinks this is a _new_ thing or that somehow
| "technology" obviates the need for good OpSec. It never
| did and still doesn't.
| exyi wrote:
| With Matrix you can use F-Droid build of the client. And
| you don't really need to trust the server too much, right?
| Maybe it's not enough for Snowden, but it's better.
|
| I'm not saying "don't use Signal", in fact I still
| recommend it to non technical people, since it's just much
| simpler. But pointing at the flaws is a necessary
| requirement for them to be fixed
| MayeulC wrote:
| I like Matrix, but I admit its E2 EE rooms seem to leak
| more metadata (users in th room, reactionss, _maybe_
| replies, display names, avatars) than Signal.
| nobody9999 wrote:
| >With Matrix you can use F-Droid build of the client. And
| you don't really need to trust the server too much,
| right? Maybe it's not enough for Snowden, but it's
| better.
|
| And why should I trust F-Droid's build of Matrix over
| Signal's build of Signal?
|
| Please understand, I agree with your point. Mine was
| orthogonal: If you are under threat from motivated and/or
| state-level actors, using _someone else 's servers_ (or
| clients, for that matter) is a bad idea.
|
| And that includes Matrix. I run my own Matrix server and
| the users of that server can interact (especially via
| voice/video) without any fear of being intercepted --
| even by me.
|
| What's more, _I_ can 't decrypt the conversations folks
| have in Matrix "rooms" that don't include me without a
| long process of brute-forcing.
|
| No one is coming to my house to confiscate my server.
| That scenario is much more likely with a
| public/commercial service hosted at a data center/cloud
| provider.
|
| So yes, Matrix is likely _more_ secure than Signal _if,
| and only if_ you build and install your servers and
| clients from source with a compiler /linker you built
| yourself using a _trusted_ tool chain on hardware whose
| components you 've _personally_ confirmed to be free of
| compromise[0].
|
| [0] https://users.ece.cmu.edu/~ganger/712.fall02/papers/p
| 761-tho...
| ptsneves wrote:
| Aren't the apps reproducible? Meaning, if the open source
| part does not match the binaries the it could be a canary for
| compromise.
|
| Mind you, the last time I looked there are not alternate
| implementations of the signal protocol and even the usage of
| libsignald was frustrating due to continuous backwards
| compatibility breakage. I would love for a proper libpurple
| implementation.
| awinter-py wrote:
| the recommended fix here is to add a PIN + enable registration
| lock
|
| IIRC signal PIN was very controversial back in the day because
| they were 1) forcing users to do it and 2) forcing them to opt in
| to some data collection as part of creating a PIN. Signal backed
| down on requiring a PIN, but now it's unclear from their settings
| page whether setting a PIN will share data as well. The marketing
| copy on my droid device says:
|
| > PINs keep information stored with signal encrypted so only you
| can access it. your profile, settings and contacts will restore
| when you reinstall
|
| For a company with relatively good cred, this is a shady way of
| saying 'we will upload your contacts and encrypt it with a short
| string'
| maerF0x0 wrote:
| What I want to know is what is ultimate aim that comes from the
| overlapping targets of Twilio, 1900 (and 3 specific) Signal
| users, and CloudFlare? What product on cloudflare and their 1900
| (and 3) users were being targeted?
| atemerev wrote:
| I absolutely do not understand why I have to link my very
| sensitive Signal account to a very insecure and hard to change
| ID: my phone number (which can be traced to my identity in too
| many ways).
|
| Why Signal does not allow fully anonymous IDs (like Threema does)
| is a mystery to me.
|
| Signal is fine for most users, but it is inherently _unsafe_ for
| high-value sensitive communications where participants can expect
| targeted phishing attacks.
| brewdad wrote:
| Anonymity isn't part of Signal's risk model. If you need to
| stay anonymous, then there are more suitable options.
| baby wrote:
| It's not about that, it's pretty much the same as using a
| dynamic IP to authenticate you
| atemerev wrote:
| It is not about being anonymous (though this also could be
| nice in some situations), it is about identity theft and
| credentials theft. There are numerous ways to steal my phone
| number and then impersonate me on Signal. For me, it is not a
| big deal (though a dedicated hater can probably ruin my life
| with that). For many people in sensitive positions, this is
| literally a matter of life and death.
| tapoxi wrote:
| On average, stealing a phone number is much more difficult
| than stealing someone's password, because of the frequency
| of password reuse and data breaches.
|
| If someone were to do that, it would be blocked by
| registration lock (which it prompts you to do). If they
| were to guess that, all your contacts would be notified
| that your identity has changed.
| soziawa wrote:
| So that means for the duration of the attack active contacts and
| groups were exposed?
| [deleted]
| cowsup wrote:
| No. When you sign up to Signal, they send you a text message
| verification code. This is done via a service called "Twilio."
|
| The attackers were able to view outgoing Twilio messages, so
| they could enter your number on the registration screen, read
| the code that Twilio sent to you, then use that cod to complete
| the sign up process.
|
| Attackers were not able to view information about your current
| Signal account (if present) through this SMS service.
| LtWorf wrote:
| But attackers are able to impersonate you to your contacts
| no?
| tialaramex wrote:
| _If_ an attacker successfully registered your phone number,
| and _if_ your contact either never checked the Safety
| Number for your conversation+ or they ignore the fact that
| Signal says the number has changed, then yes, the attacker
| would be able to impersonate you to that contact.
|
| Twilio says 1900 Signal users are potentially affected
| (attackers saw the confirmation code or could have seen the
| confirmation code) so Signal disabled those accounts
| pending re-registration.
|
| + For any particular conversation pair, Signal has a large
| unique number it calls a Safety Number calculated from the
| long term cryptographic identities, if you got a new phone
| (or if I'm _pretending_ to be you on a new phone), messages
| you send will have a different number because the new phone
| won 't know the old phone's cryptographic keys. The phone
| app can display the number (so you can compare them) or
| scan a QR code from another phone to check they match
| without the boring work of comparing numbers.
| sethjr5rtfgh wrote:
| Varqu wrote:
| Is it not possible to use an Authenticator app for Signal, given
| their privacy setup?
| LtWorf wrote:
| They want to know the phone numbers. Despite everyone telling
| moxie that it's not secure from the day signal was started.
| pr0zac wrote:
| Signal uses phone numbers as (the only) unique identifier in
| their system currently so SMS (or phone call) is necessary to
| verify the device owns the number.
|
| They've been talking about moving away from phone numbers as
| identifiers for a while and have implemented features like
| account pins that head in that direction but it hasn't happened
| yet.
| alldayeveryday wrote:
| This incident points to something much more severe. What role was
| this employee(s) whose credentials were compromised? How did
| these credentials allow even an employee to get plain text auth
| codes being sent out to end users? Such a permission should be
| extremely limited in who it is granted to.
| jefftk wrote:
| I suspect many Twilio support reps need access to outgoing SMS,
| because manually looking over those will be an important
| component of handling a "someone is using your service for
| spamming" complaint.
| alldayeveryday wrote:
| I disagree. They would not need to access the full contents
| of outgoing SMS to perform this duty. For example they could
| see the auth codes masked.
| jefftk wrote:
| How would Twilio know what portion of the outgoing SMS was
| auth codes?
|
| Are you proposing they add an API where senders can
| annotate part of their message as private? (Not a bad
| idea...)
| dethi wrote:
| Totally agree. The message content should be private and not
| accessible by employees. Kind of scary when you think that so
| many 2FA codes are sent via Twilio.
| mikece wrote:
| "...Twilio, the company that provides Signal with phone number
| verification services..."
|
| Perhaps this is why Twilio (and Twilio-issued) VoIP numbers work
| so well for Signal when I don't want to use the number issued by
| my cellular carrier? Kinda hard to SIM-swap me if you don't know
| my real phone number.
| giancarlostoro wrote:
| Do they offer numbers you can use that way or do you just use
| their APIs with a minimal app?
| mikece wrote:
| I spin up a number and verify it can receive SMS (which it
| forwards to me via email). Since I need receive-only, this is
| fine. No need to futz with APIs or apps.
| bubblethink wrote:
| A lot of banks refuse to send SMSes to voip numbers. Google
| voice runs into this, and presumably twilio too.
| estiven2006 wrote:
| COGlory wrote:
| Hey look, the centralized nature of Signal has come back to bite
| it. Who could ever have predicted?
|
| Edit to try and make this less snarky and more productive: In n
| number of threads regarding Matrix, and the Matrix vs Signal
| controversy[0][1], the point is that Signal took a single, simple
| approach and tried to proclaim it was necessary because it allows
| them to be agile and responsive and integrate new features or
| security standards and not be weighted down. However, they jumped
| in and married themselves to a centralized service to do that,
| and have just essentially ignored that the centralized service
| may have faults. Whereas Matrix, while slower to evolve, is setup
| to be far more resilient to a single point of failure.
|
| [0] https://signal.org/blog/the-ecosystem-is-moving/
|
| [1] https://matrix.org/blog/2020/01/02/on-privacy-versus-freedom
| [deleted]
| berry_sortoro wrote:
| I was always opposed to this STUPID requirement that you NEED to
| tie Signal to your phone number. I never used it and AFAIK even
| for the Desktop client you need to have a phone number. Wire and
| other messengers that use the same or similar protocols do not
| have this issue.
|
| I am not the only one hating this, from the very start this was a
| huge critique on Signal by many people, but they never changed. I
| did not even know they used a 3rd party service to "verify" phone
| numbers. This is a HUGE issue. People who truly want to stay
| PRIVATE, politically hunted people who are in life and death
| situations should never ever use Signal.
| [deleted]
| beefee wrote:
| Please, stop using phone numbers. There is no reliable way to
| hold a phone number. The messaging protocols are insecure. If
| your service uses phone numbers or SMS, that means it's not
| secure or reliable.
| registeredcorn wrote:
| Not only that, I don't want any service that I use tied to a
| phone number. Partially for the reasons you listed, but also
| because there are better alternatives; email, authenticator
| apps, physical keys, cards, etc.
|
| I hate looking at my phone. I hate using my phone. I don't want
| to have even more reasons to keep my phone charged and in my
| hand. Phones suck.
| spurgu wrote:
| The Signal desktop app doesn't require your phone to be
| turned on (once it's been "paired") by the way, as opposed to
| for example Whatsapp.
| LtWorf wrote:
| By desktop app you mean a website cosplaying as a desktop
| app.
| spurgu wrote:
| I didn't know Signal had a web app.
|
| Yes, I'm aware the desktop app is made with Electron. So
| what? I keep it running almost all the time and I've
| never had any performance issues with it.
| LtWorf wrote:
| Well, it's slow, it has awful accessibility, you can't
| create accounts from a computer, you will have
| performance issues with it if you need the resources for
| something else.
|
| Also it won't work on linux phones.
| brewdad wrote:
| Whatsapp has finally gotten away from requiring your phone
| to be on. It works the same way as the Signal app now.
| spurgu wrote:
| Ah, didn't know that, but you're right (just tried it)!
| That's cool. I ran into this issue around a year ago when
| my phone broke.
| marshray wrote:
| What identity token would you prefer?
|
| Would it be bound to the mobile device in any way?
|
| Would it require that a canonical list of registered identities
| be stored server-side?
|
| How would you impose a cost on spam accounts without burdening
| users?
|
| Just a few considerations.
| thankful69 wrote:
| If they (Signal) care about privacy, they need to drop the need
| for phone numbers to use their service, there are many ways of
| dealing with spam (rate limiting, captchas, ...), a true
| private/secure messenger app should not require any user
| identifiable info. And the argument of "Signal was the first e2ee
| messenger app to go mainstream, so they can keep ignoring user's
| privacy, .... yada yada..." is naive at best; they should lead by
| example, right now there are many solutions way more private
| (Briar, SimpleX, Session, Wickr, ....). I user Signal, and I like
| it, is just a shame they soft-refuse("We are working on it...")
| to remove phone numbers from the equation.
| [deleted]
| cookiengineer wrote:
| I second recommending Briar as a messenger. Codebase is well
| maintained. Specifications and documents are audited, as well
| as the official clients.
| timbit42 wrote:
| Briar is nice but I wish it worked through Tor like Tox does.
| emmelaich wrote:
| The phishing against Twilio looks very much like the attempt on
| Cloudflare.
|
| I wonder how many other companies have been successfully phished
| that we don't know about.
| dont__panic wrote:
| It certainly makes me wonder: it Twilio _more_ competent
| because they noticed a phishing success? I assume there are
| plenty of people at every company who could fall for social
| engineering scams, so I think it 's safe to say most companies
| aren't totally safe -- they just haven't noticed a major
| breach.
| tialaramex wrote:
| Yes, Cloudflare pointed out the similarity and they suppose
| it's a single attacker with multiple targets.
|
| If you have access (I don't in my current role and I don't care
| enough to spend money to do this on my own account) you can ask
| a passive DNS system about names like cloudflare-okta.com that
| were used in the Cloudflare attack, identify patterns (same
| registrar, same hosting, that sort of thing) and also the IP
| addresses Cloudflare listed.
|
| You should probably assume that anywhere which doesn't actually
| have FIDO or similar and was actively targeted is screwed,
| because it only takes one lapse to let the bad guys in.
| throwawaycuriou wrote:
| Unrelated to this current focus on the weakness of using a phone
| number as a contact reference, I have experienced a different
| problem. If a user has ever used Signal previously and for
| whatever reason revert back to SMS, if another user sends them a
| message via Signal it is lost with no indication. This is a
| growing problem as people try out Signal and when they get a new
| phone just forget or don't care to install it again.
| [deleted]
| scottydelta wrote:
| Cloudflare was attacked with the same attack but they were able
| to prevent any harm since they use hardware keys for 2-step
| instead of OTP.
|
| This is a very simple phishing attack and I am surprised it has
| proven to be effective.
| jtdev wrote:
| jamespwilliams wrote:
| The attack Twilio suffered is almost identical to the recent
| attack against Cloudflare:
| https://blog.cloudflare.com/2022-07-sms-phishing-attacks/ (even
| down the wording of the text messages, which are nearly
| identical). Cloudflare's use of security keys prevented the
| attackers getting access to any accounts in that case.
|
| These attacks are sophisticated and are capable of bypassing TOTP
| or mobile-app-based MFA. If this is widespread, I'd be surprised
| if we didn't see a massive influx of breaches soon. The vast
| majority of companies are not well defended against this.
| maerF0x0 wrote:
| to be clear they are not able to "bypass" TOTP or mobile-app-
| based MFA in the way security folks think of that term. They
| were able to bypass humans[1], which are often the weakest link
| in security related matters.
|
| [1]: "Twilio became aware of unauthorized access to information
| related to a limited number of Twilio customer accounts through
| a sophisticated social engineering attack designed to steal
| employee credentials. This broad based attack against our
| employee base succeeded in fooling some employees into
| providing their credentials. The attackers then used the stolen
| credentials to gain access to some of our internal systems,
| where they were able to access certain customer data. "
| https://www.twilio.com/blog/august-2022-social-engineering-a...
| tialaramex wrote:
| I would consider myself "security folks" and while maybe I
| wouldn't choose the word "bypass" the effect is that TOTP is
| basically useless against phishing and always was, and I
| don't object to that word from lay people.
|
| At Cloudflare, or Google, or several other places that took
| this seriously, "fooling some employees into providing their
| credentials" doesn't get you anywhere. With WebAuthn your
| employees don't _have_ a way to give bad guys credentials the
| bad guys can use - no matter how badly they were fooled.
|
| TOTP is effective against credential stuffing, but it does
| nothing for phishing.
| saurik wrote:
| The way that Cloudflare attack was working out sounds similar
| in nature to the way MailChimp was attacked a few months ago:
|
| https://www.bleepingcomputer.com/news/security/hackers-breac...
| hn_throwaway_99 wrote:
| > These attacks are sophisticated and are capable of bypassing
| TOTP or mobile-app-based MFA.
|
| To be honest, I wish people would stop parroting that these
| attacks were "sophisticated". In my opinion, I'd call something
| like Pegasus spyware "sophisticated". I don't think these
| attacks were that sophisticated at all - they were just
| standard issue, MITM attacks using targeted text messages - and
| they just took advantage of what is always the weakest link in
| security: people. I think of myself as a general middle-of-the-
| road software developer but I think I could have easily
| replicated this attack myself.
| somehnguy wrote:
| I work for a very large retailer in the US and these examples
| look very similar to one's sent out by our security team
| recently. Must be making the rounds :/
| AdmiralAsshat wrote:
| Maybe this will make Signal re-think their hard requirement of a
| phone number to register for Signal.
|
| ...eh, who am I kidding?
| dylan604 wrote:
| which is a curious thing to me as the phone number i created a
| Signal account with is no longer my phone number. what happens
| if the person currently assigned that number tries to join
| Signal and what happens to me if they do?
| jraph wrote:
| It's unsafe. They could lock you out from your Signal account
| and impersonate you. Someone who does not know you changed
| your number, forgot about it or does not think about this
| could then send a message to the person who has your old
| phone number thinking it's you at the other end. Most people
| don't bother with the warning about the security number
| having changed. I also personally assume that the phone
| number of a contact in Signal is theirs and that I can send
| an SMS or call them with this phone number. This not being
| true is at best confusing even if you are not locked out.
|
| It'd also be nice to let the current user of your old number
| to be able to join Signal without issues.
|
| I'd say don't play with fire and migrate as soon as possible
| before having issues.
|
| You probably could follow https://support.signal.org/hc/en-
| us/articles/360007062012-Ch...
|
| Then the people with who you discussed on Signal will be
| notified of your number change. If you can't follow this,
| better decide to handle it the best way you can instead of
| having to deal with this when you have lost access.
| dylan604 wrote:
| Thanks for the link. I have been way too lazy after having
| the new number for nearly a year. I just couldn't not do it
| after someone on the internet did the heavy lifting for me.
| jraph wrote:
| Happy to know I had a (hopefully) positive impact :-)
| dylan604 wrote:
| Yes, you get your Good Deed For the Day merit badge!
| zatertip wrote:
| They'll create a new key and your contacts will be notified
| that your key has changed.
| input_sh wrote:
| They mention it, use registration lock:
| https://support.signal.org/hc/en-
| us/articles/360007059792-Si...
|
| Basically if someone tries to register to Signal with your
| phone number they'll need to enter that PIN Signal
| consistently reminds you of.
| CogitoCogito wrote:
| So if you have a phone number that someone else used to
| create an account, you can't use Signal?
| registeredcorn wrote:
| Close, but not quite.
|
| You see, if the other person didn't use registration
| lock, now you've got access to complete strangers
| account.
|
| Problem solved!
| Spivak wrote:
| Which is less scary than it sounds because a signal
| "account" is a phone number. Oh no your privacy!
|
| If you view Signal as "a service that allows you to send
| E2E messages to phone numbers" then this is fine. Your
| friends will even get a message that says the chat has
| been rekeyed once the new person sets up Signal.
|
| And if you're worried about government's compelling your
| cell carrier to turn over your phone number then rest
| assured that usernames wouldn't help you since they could
| just compel Signal to turn over your username. So much
| safer.
|
| As long as the source of identity is something other than
| a private key that is owned and controlled by the user
| and devices must have their keys signed by that key to be
| considered valid it will be the same issue.
| 0x457 wrote:
| Not exactly. Registering again will make an entirely new
| identity with different key pair. The new phone holder
| won't get access to your contacts or your message
| history. I believe your contacts will also know about
| signature change as well.
| [deleted]
| godelski wrote:
| I think they are. I just also think the problem is a lot harder
| than people give it credit for. If they just go with a standard
| username (as in some form of a database lookup) then I'll be
| upset. But I'll be upset because this effectively doesn't solve
| any issue, and introduces others that have big privacy impacts
| and requires Signal to be a trusted source (which is
| antithetical to Signal's proposed mission).
|
| I do wish Signal would be more transparent though. Given that
| this is such a difficult problem and they've been struggling
| with it for years, it reasons that it is time to seek help.
| This is like when a student just studies for hours and hours on
| end, spinning their wheels. They aren't learning anything. At
| some point you need a tutor, help from a friend, or asking the
| professor for help. (Or analogy in work if that's better)
|
| Signal, it is time to be transparent.
| 3np wrote:
| Here's an idea that's been floated many times in the past:
| Add e-mail as an alternative alongside phone numbers. Support
| in contacts in both Android and iOS. The only real difference
| these days is that one requires government ID (by law in an
| increasing number of countries) and one does not.
|
| I fail to see any fundamental difficulty here.
| autoexec wrote:
| > I do wish Signal would be more transparent though.
|
| You mean like updating their privacy policy to explain that
| they are keeping sensitive user data in the cloud? They
| refuse. There are people in this very discussion who are (or
| were at least) unaware that Signal is collecting and
| permanently storing user data on their servers. Signal's
| communication on what they're collecting and how has been a
| total joke. I cannot consider them trustworthy and at this
| point I suspect that refusing to update their privacy policy
| is a giant dead canary intended to warn users away from their
| product.
| sneak wrote:
| Please stop spreading objectively inaccurate FUD in the
| thread.
| autoexec wrote:
| What part of what I said was inaccurate?
| godelski wrote:
| What user data is being stored in the cloud? Can they
| decrypt it?
| notpushkin wrote:
| Signal is a trusted source already - you trust them telling
| you which number is which user.
| godelski wrote:
| Sorry, let me clarify. We should trust Signal as little as
| possible. That's how the design should work. Zero trust is
| very hard to create but let's minimize it.
|
| Opening up usernames (in the conventional sense) you will
| end up needing to verify and act as an arbiter. This is due
| to the fact that certain usernames have real world meaning
| behind them and you don't want to create an ecosystem where
| it is really easy to honeypot whistleblowing platforms (how
| do you ensure that CNN gets {CNN, CNN-News, CNNNews,
| etc}?). They've suggested that this might be the case given
| that the "Notes to self" and "Signal" users are verified
| with a blue checkmark. The issue is that verifying users
| not only makes Signal a more trusted platform but the act
| of verification requires obtaining more information about
| the user. It also creates special users. All things I'm
| very against. I'd rather hand out my phone number than have
| Signal collecting this type of information. So yeah, it
| isn't completely trustless, but I certainly don't want to
| move in the direction of requiring more trust.
| throwaway290 wrote:
| Signal does not tell me which number is which user. I know
| which number is who myself. The most Signal does is
| presumably warn me when the key associated with the number
| changed (eg. new phone).
|
| And that's where I have to trust Signal, but as a protocol
| not a "trusted source" of information.
| greyface- wrote:
| You aren't supposed to trust Signal on that; you are
| supposed to verify it out-of-band using Safety Numbers.
| the_other wrote:
| dvzk wrote:
| It was because of over-represented complaints about phone
| number requirements that Signal implemented the mistake that is
| SGX and server-side contact lists. Now the social graph of
| millions of Signal users is instead centrally protected by
| Intel's attestation obfuscation and a weak 4-digit PIN. All to
| eventually support usernames, which normies won't use.
| ajmurmann wrote:
| Why are server-side contact lists needed to support
| identities not linked to phone numbers?
| dvzk wrote:
| Basically there are many options, none of them perfect:
|
| 1. Support phone number contact discovery, with persistence
| provided by the contacts provider. This is seamless and
| causes the least amount of complaints.
|
| 2. Support username discovery, with persistence provided by
| passphrase-encrypted online storage. This is painful and
| risks backlash from people losing access to data. Also, now
| the threat model must account for or ignore weak derived
| keys (which is probably most of them).
|
| - 2a. Enforce strong passphrase requirements. Many users
| will abandon the product.
|
| - 2b. Sync usernames between linked devices (using a
| generated key). Requires multiple devices, risks people
| losing data, more complaints.
|
| - 2c. Sync usernames using custom contacts provider fields
| (e.g. email). Nobody is accustomed to doing this, but it
| might work. Automatic discovery rates would be low.
| Possibly requires an odd workflow for people adding Signal
| contacts by their email/username.
| lxgr wrote:
| Signal (or, more accurately, one of its predecessors) used
| to use client-side private set intersection for contact
| discovery, but this scales poorly [1].
|
| Now they use a solution based on Intel SGX and server-side
| trusted computing [2].
|
| [1] https://signal.org/blog/contact-discovery/
|
| [2] https://signal.org/blog/private-contact-discovery/
| 0x457 wrote:
| Because right now, Signal can use your contact list on the
| device to get your signal contacts. If you replace a phone
| number with a username, there will be no way to match
| signal's user to your contact without:
|
| - Having a server to hold your contacts
|
| - Or having signal app to maintain contact list and sync it
| across devices
| colordrops wrote:
| Do "normies" care about high quality encryption? And did
| signal ever turn on username support?
| dvzk wrote:
| I think developers have a moral responsibility to make
| their products as secure as possible, within reason and
| while still being usable. It doesn't matter if the users
| care about the benefits. To your second question: no, not
| yet.
| colordrops wrote:
| I fully agree with you, and my first question was asked
| somewhat sarcastically. For the second, my implication is
| that developers also have a moral responsibility to make
| their products as private as possible, and SMS
| verification aint it.
| johnchristopher wrote:
| > All to eventually support usernames, which normies won't
| use.
|
| But brogrammers will. /s
| tialaramex wrote:
| > weak 4-digit PIN
|
| Are you sure that it makes sense to _require_ people to pick
| something longer and non-numeric for a PIN?
|
| Or is your claim (wrongly) that people don't have longer non-
| numeric PINs (lots of us do) ?
| dvzk wrote:
| Most people will choose weak passwords given the option.
| And so I think it's the responsibility of the developer to
| enforce strong requirements ( _edit: when dealing with data
| encryption susceptible to brute-force attacks_ ): entropy
| estimations, 128(+)-bit static keys, etc. If any user has
| chosen a weak passphrase, and still believes it to be
| secure, the developer has likely failed.
| tialaramex wrote:
| > when dealing with data encryption susceptible to brute-
| force attacks
|
| The "brute-force attacks" imagined here are a bad guy
| somehow controls Signal's systems, or else the US
| government seizes them and then decides to try to brute
| force them, right ?
|
| But these are attacks where for various rival systems it
| was already game over. So your assumption is that
| Signal's casual users, people who maybe were also
| considering Whatsapp or iMessage or something, should be
| _required_ to have a cryptographically strong passphrase
| so as to defeat this unlikely circumstance, as a minimum?
|
| Moxie's whole deal is that this stuff only works when
| it's for everybody. If there are a five people in your
| country who use Signal, guess what, the Secret Police can
| round them up as suspected terrorists and execute them.
| Were they planning to bomb the President For Life? Or
| just organising a pizza party? Don't care, it's just good
| policy. But if there are five _million_ people who use
| Signal that 's a different matter.
|
| Even if all five million _are_ terrorists, that 's
| numbers where you're going to have to tear up your "no
| negotiating with terrorists" policy, 'cos there are just
| too many of them.
| daneel_w wrote:
| I've been complaining about the glaring privacy/integrity
| problem in their SMS-based account verification scheme for
| years. I don't think any snafu can make them reconsider. It
| would forfeit the valuable social network mapping they've
| already poured millions of dollars into through sending
| verification SMSes.
| tptacek wrote:
| It's not so much "valuable social network mapping" as it is
| "the only social network available to Signal", by design.
| Without phone numbers, they can't use clientside contact
| lists (they can build their own, of course, but if it's
| strictly clientside it won't sync, and so it won't work for
| most of their users). The alternative design, which HN would
| _wildly_ prefer, admits to usernames or email address
| accounts, but requires Signal to keep a database of contacts
| serverside. That 's untenable for Signal's threat model.
| daneel_w wrote:
| The sane alternative is that it could keep a client-side
| contact book that users would be responsible for managing
| entirely on their own, including when setting the app up on
| a new phone.
|
| Addendum: also, there is nothing preventing this type of
| contact book data from being backed-up/synced to a new
| phone, like any other data and settings of any other app.
| iOS has this feature since like 7 years now. Android, too,
| I'm sure.
| tptacek wrote:
| That's a good way to build a secure messaging app nobody
| ever uses.
| daneel_w wrote:
| It may very well be the case for the smartphone-flipping
| demographic that prefer WhatsApp and TikTok, but I think
| it's a misunderstanding/misrepresentation of the crowd
| that go for e.g. Signal and Telegram.
| detaro wrote:
| Signals very explicit goal is making something that's
| usable for everybody. If they'd wanted to make a nerd-
| messenger they'd make a different product.
|
| (I still consider it a major downside that the phone
| number is the only lookup key)
| daneel_w wrote:
| Sounds pretty drastic to me to draw the line between
| "everybody" and "nerds" at the point of the contact book
| being available or not.
| detaro wrote:
| Feel free to insert your word choice of "smartphone-
| flipping demographic" instead. The point is that if you
| argue based on "the crowd that goes for signal", that
| crowd being everyone is the clear aim of Signal, and thus
| they design around that goal.
| joshuamorton wrote:
| The majority of my signal contacts aren't particularly
| tech-literate. The crowd that go for signal and the crowd
| that go for telegram are _different_ crowds, in a large
| part because signal designed itself to be accessible to
| nonexperts.
| daneel_w wrote:
| So they are tech-literate enough to use a smartphone, and
| apps for it, and they are tech-literate enough to type in
| their Signal password reminder in a hidden text field
| (and probably also passwords on dozens of web pages
| because password/keychain apps are "hard") but typing in
| e.g. an anonymized "user token" to add a buddy would be
| too "tech" for them? I refuse to believe a word of what
| you're saying.
| joshuamorton wrote:
| > anonymized "user token" to add a buddy would be too
| "tech" for them? I refuse to believe a word of what
| you're saying.
|
| How do you transmit said anonymous user token securely?
| Using the secure messaging app you're already using?
| Meeting up in real life? Posting it on keybase? Each of
| these has downsides that are all solved by a phone
| number.
| daneel_w wrote:
| I don't see why the user token ("account name") has to be
| secret in every conceivable way. It just needs to be
| anonymous. What's wrong with meeting in real life, or
| exchanging account names in whatever way you initially
| exchanged phone numbers? You don't seem concerned over
| the security problem of _account activation codes being
| sent over SMS_ , so I don't see why you should be
| concerned over exchanging anonymous account names in the
| same or more secure ways.
| roughly wrote:
| Large swaths of my non-tech social group are on Signal
| because it offers a large amount of practical security
| and privacy without the kinds of sacrifices people seem
| to assume signal users want to take. Signal has
| successfully and dramatically increased the number of
| people enjoying privacy in their communications by not
| making that assumption.
| atemerev wrote:
| That anonymous IDs can be _optional_, in addition to
| phone numbers. Again, like Threema is already doing.
| 8ytecoder wrote:
| Signal still assumes you manage your contacts no? It
| happens via iCloud or a google account today.
| potatototoo99 wrote:
| Why would contacts need to be saved though? If I change my
| phone, I expect my contacts will be copied in some manner,
| and they'll be available on the new phone. Why would a
| messaging app need a persistent social network of contacts?
| 3np wrote:
| I feel we've had this conversation before. IIRC this is
| where we left off:
|
| E-mail as a complement should work fine and supported in
| all contact lists. It wouldn't change a thing wrt what
| you're describing.
| teraflop wrote:
| After countless discussions of Signal on HN, I have yet to
| see an explanation for why Signal can use phone numbers
| from a client-side contact list, but not email addresses
| from a client-side contact list. Surely, in either case the
| identifier can be treated as an opaque string, right?
|
| Or in other words: suppose the definition of "phone number"
| was expanded to include alphanumeric characters and @. What
| aspect of Signal's current design would break? By saying
| "without phone numbers, they can't use clientside contact
| lists", you seem to be suggesting that _something_ would
| break, but I can 't imagine how, unless it's as trivial as
| a database constraint that says "this field must contain
| only digits".
| spullara wrote:
| People don't have email addresses in their contact lists
| because all their email contacts are stored on Google
| servers.
| xorcist wrote:
| Having the ability to identify by other strings than
| "phone number" wouldn't take away any functionality, just
| add it. It would be possible to communicate with devices
| that have email but not phone numbers (children without
| SIM cards, for example).
|
| But this is all moot because phone numbers aren't just
| opaque binary strings. They are more useful than other
| forms of identification.
| spurgu wrote:
| > I have yet to see an explanation for why Signal can use
| phone numbers from a client-side contact list, but not
| email addresses from a client-side contact list
|
| OS functionality? There no "Grant access to Gmail
| contacts" (= email addresses) on Android/iOS, so that
| client-side list would have to be manually maintained,
| while (practically) everyone already has a contact list
| containing phone numbers of their friends.
|
| That said I don't see why a user would _have to_ have a
| stored social network at all, why can 't it simply be
| opt-out?
| teraflop wrote:
| At least on Android, any application that I give
| permission to access my contact list can see email
| addresses for the client-side contacts that are synced
| with my Gmail account. Maybe iOS behaves differently?
| tablespoon wrote:
| > Or in other words: suppose the definition of "phone
| number" was expanded to include alphanumeric characters
| and @. What aspect of Signal's current design would
| break?
|
| I feel like you're mis-analyzing a social problem or some
| other design goal as a low-level technical problem.
|
| I don't know their real reason, but I can say that my
| email contact list is _waaay_ messier and less curated
| than my phone contact list. It would probably be annoying
| is I 'd get a "So-and-so joined Signal!" notification for
| a bunch of randos I've emailed once and had their email
| auto-added to my address book.
| veeti wrote:
| Of course there is no real reason Signal should be
| spamming anybody with these crap "notifications" in the
| first place. Who wants to wake up at 1 AM to know some
| dude they texted years ago is now on Signal?
| teraflop wrote:
| That seems like a problem that could easily be solved by
| sending fewer notifications. Do I really need to know if
| somebody has joined Signal until I actually want to talk
| to them? Isn't it _better_ to have a larger pool of
| people with whom I can communicate securely using Signal?
|
| I'm mostly just confused because this is being
| _presented_ as a technical limitation: using email
| addresses would supposedly "require Signal to keep a
| database of contacts serverside". I don't understand how
| or why that's true.
| tablespoon wrote:
| > That seems like a problem that could easily be solved
| by sending fewer notifications. Do I really need to know
| if somebody has joined Signal until I actually want to
| talk to them?
|
| I don't know what the real reason is, what I said was
| just something that popped into my head. Another comment
| mentioned spam-prevention as a reason (by making it
| infeasibly expensive), and that actually makes more
| sense. Honestly, there probably isn't just one reason,
| but a cluster of tradeoffs.
|
| > Isn't it better to have a larger pool of people with
| whom I can communicate securely using Signal?
|
| IMHO, the number people who care deeply enough about the
| phone number thing to boycott Signal is vanishingly
| small; not even a rounding error. Sure they're loud on HN
| or maybe even Twitter, but giving tiny but loud
| minorities whatever they demand is bad policy.
| teraflop wrote:
| Sorry, just to clarify: I didn't mean "it's better if
| people who don't want to give out their phone number can
| use Signal" (although I happen to think that's also
| true).
|
| What I meant was: "it's better if I can use Signal to
| communicate with people even if all I know is their email
| address".
| verall wrote:
| Noone wants those messages
| stormbrew wrote:
| > I'd get a "So-and-so joined Signal!" notification for a
| bunch of randos
|
| One of the reasons I wish I could use something other
| than my phone number and access to my contact list to
| work with signal is these notifications creep me the fuck
| out and I would rather never get them, or have anyone get
| them about me.
|
| I'm fine with it being a feature for people who want it,
| but I don't. I want to make my own damn choices about who
| I talk to through it.
| daneel_w wrote:
| _> "It's not so much "valuable social network mapping" as
| it is "the only social network available to Signal", by
| design."_
|
| I don't understand why you state this, when you obvioulsy
| know that data is connectable and joinable across discrete
| sources. Being "the only social network available to
| service X" is the inherent case for every single online
| service on the entire planet when viewed as an isolated
| entity. But this isn't a case of anonymized UUIDs. It's a
| case of personal phone numbers.
| [deleted]
| xorcist wrote:
| That doesn't follow. A client side contact list does not
| need to consist of phone numbers.
|
| Apart from using the device contact list (which contains
| email addresses as well as phone numbers) the client can
| also keep a private contact list.
| [deleted]
| colinmhayes wrote:
| How do you deal with spam without requiring a phone number to
| register?
| ssizn wrote:
| rt4mn wrote:
| Give people the option to pay. I would gladly pay $100 one
| time fee if it meant I could avoid having a phone number
| associated. https://jmp.chat is a great work around but I
| would rather just have an email address or ideally _nothing_
| but a receipt directly associated with my signal account.
| croes wrote:
| Isn't email even worse for security?
| eitland wrote:
| You can trivially create as many emails as you want,
| anonymously and for free.
|
| In many countries, registering phone numbers anonymously
| is illegal and/or impossible.
| thaumasiotes wrote:
| > You can trivially create as many emails as you want,
| anonymously and for free.
|
| Where? Gmail and hotmail both don't allow this.
| carlhjerpe wrote:
| Depends, if you're able to poison the DNS of the
| mailprovider / hack the recipient mailserver or do a
| phising attack.
|
| I just want to be able to communicate without sharing my
| phone number (since my phone number is bound to Swedish
| "Swish") meaning someone can get my ID from my phone
| number here.
|
| This is why drug dealers use Wickr, Threema and others,
| because they don't expose identity, not because they're
| "safer".
|
| I have a contact on Threema who I've met many times, but
| I have no idea how to contact him outside of Threema,
| because I don't know his identity and we'd both like to
| keep it that way.
| rakoo wrote:
| "security" is vast and means different things to
| different people.
|
| Email can absolutely be used for with e2e encryption
| keeping the content of exchanges from external eyes.
|
| Email can absolutely not be used for hiding metadata of
| who talks to who.
| jussion_zoonist wrote:
| capableweb wrote:
| One option could be to not be able to send unsolicited
| messages in the first place. Make it required for everyone to
| "accept interaction" before messages can actually be sent
| between two parties. Add in rate limiting so you can only
| have N open "invitations" and spamming should be very
| limited.
| Aachen wrote:
| I don't see the difference between some Isabelle showing up
| as someone to accept or deny, or some Isabelle with a "I'm
| a hot single in your area, click this link" message so I'm
| sure it's spam.
|
| On Telegram this is rampant, on average probably one person
| per day. It shifted from e-gold scams to sex since a few
| months, but both are still present. People that aren't in
| big groups (where the spammers scrape user IDs) have zero
| problems, so the trick is revealing your random identifier
| only to those you want to contact you. Phone number
| identifiers are the antithesis to spam protection: we keep
| our ranges just full enough that we can't shorten it by a
| digit, but empty enough that we have small growth
| possibilities. You're very likely to hit a subscriber, by
| design, by trying random numbers.
| freedomben wrote:
| > _Add in rate limiting so you can only have N open
| "invitations" and spamming should be very limited._
|
| That also sounds like a good way to limit adoption as well,
| at least for anyone with more the N contacts, particularly
| >= 2N as that means likely a minimum waiting period before
| you can transfer over "more" contacts since some people
| will never accept/reject the invite because they don't use
| the app much.
|
| If it were me and I had to wait on others to accept or
| reject my invite before I can continue transferring
| contacts, I'm gonna move on.
| soziawa wrote:
| Threema seems to manage just well. I guess payment is the
| natural limiter for spam there.
| uoaei wrote:
| And accessibility.
| giancarlostoro wrote:
| Who is dealing with the spam SMS messages I get all year
| round? Phone numbers do not stop spam, they are used to
| distribute it. I know I received no spam when I had Google
| Talk... It is a solvable problem that I guess benefits nobody
| in power to solve.
| colinmhayes wrote:
| Spam is one of the major reasons people don't use sms. "The
| product Were replacing sucks so our product can suck too"
| Aachen wrote:
| I don't use SMS because I can't download an open source
| client to use on desktop, send pictures or other files,
| edit messages, encrypt conversations, share a live
| location, hold a poll, have group chats with some
| semblance of scale, it just doesn't work for more than
| receiving an occasional message as last resort.
|
| Spam via sms doesn't seem to really exist here, maybe two
| per year now, up from zero until three years ago.
| tptacek wrote:
| Signal doesn't ask for phone numbers simply to combat spam;
| the phone number isn't an elaborate captcha. Rather, as this
| article repeatedly points out, Signal doesn't keep your
| contact lists and other data available to its servers. It
| uses phone numbers because phones already have contact lists,
| stored clientside, keyed by those numbers.
|
| To replace the numbers with usernames, Signal users would
| have to either give up contact lists altogether (at which
| point nobody would use the service), or allow Signal to keep
| a serverside database of contacts ready at all times for
| users who log in. This is what other messaging services do,
| and the result is that the servers have a plaintext log of
| who talks to who on their service, which is the most valuable
| information a secure messaging service can make available to
| a state-level adversary.
| mercutio2 wrote:
| Are you basing your argument on assumptions that:
| A) Most contacts have phone numbers B) Most contacts
| don't have email addresses?
|
| I think you're assuming this (and as it happens, I agree,
| although the number of email-only contacts is still nonzero
| for a lot of people).
|
| Are you also assuming that it just adds a lot of complexity
| to be willing to search by _both_ phone numbers _and_
| emails?
|
| That's the part of your argument I'm not grasping. Signal
| has to be willing to intersect known-account-identifiers
| with this-device's-local-contact-handles, what's the
| problem with preferring phone numbers but allowing emails?
| Aachen wrote:
| > or allow Signal to keep a serverside database of contacts
| ready at all times for users who log in. This is what other
| messaging services do, and the result is that the servers
| have a plaintext log of who talks to who on their service,
| which is the most valuable information a secure messaging
| service can make available
|
| Why couldn't they just keep using the current system,
| alongside usernames, for phone number contact discovery? Of
| course for those who opt into it; imo that's what it should
| be.
| daneel_w wrote:
| Users having to add their contacts each time they set
| Signal up on a new phone, should the app keep its own
| client-side contact book, doesn't sound like hassle.
|
| Could you please explain how Signal _does not_ have a
| social network map, when 1) user accounts are equal to
| mobile phone numbers, and 2) Signal servers route messages
| between user accounts.
| niel wrote:
| The Signal protocol has had "sealed sender" since 2018 -
| Signal server does not know who sent a message, because
| the sender's identity is E2E encrypted along with the
| message.
|
| Even if Signal's server saves a message (they claim not
| to, once downloaded), Signal's server by design has no
| way of knowing who sent the message.
| daneel_w wrote:
| Every inbound message is authenticated, and credentials
| are stored _somewhere_. Correct me if I am wrong, but I
| 'm betting that it's with the same credential/channel as
| for logging-in a user (aka "sender").
|
| Also, wasn't "sealed sender" broken (again) earlier this
| year by a group of researchers?
| niel wrote:
| No, sealed sender messages are not authenticated. The
| sender's client uploads two things: 1) an encrypted
| message (with sender id encrypted), and 2) a zero-
| knowledge proof that the sender's client knows the
| recipient's delivery token.
|
| There is no authentication by the sender, and the sender
| does not upload any credentials.
| daneel_w wrote:
| I guess I have to rephrase myself: the _API calls_ are
| authenticated, because the API endpoints will not consume
| anonymous requests. I 'd be glad if you could point me to
| documentation proving that the messaging API uses
| completely different credentials than those for user
| login, and that the two are also disassociated.
| niel wrote:
| This doesn't _prove_ anything, but:
|
| > Without authenticating, hand the encrypted envelope to
| the service along with the recipient's delivery token.
|
| Source: https://signal.org/blog/sealed-
| sender/#:~:text=Without%20aut...
|
| The sender's client sends a certificate derived from the
| _recipient 's_ profile key.
|
| This certificate is sent to the server as the header
| "Unidentified-Access-Key" - you can see how this header
| is derived from the Signal clients' source.
|
| So yes, these API calls are authenticated, but not using
| the sender's credentials in any way.
| melgafin wrote:
| Good luck finding documentation about the protocols and
| APIs used by signal. While every random cryptocurrency
| has a cryptography whitepaper, it seems that Signal does
| not.
| niel wrote:
| Signal published detailed specifications of the protocol
| with reference implementations since at least Feb 2017
| (group messaging protocol was added later on):
| https://signal.org/docs/
|
| The server and clients are open source:
| https://github.com/signalapp
| tjoff wrote:
| You forgot the part where joining signal "conveniently"
| discloses that to everyone - with no way to opt out(!).
|
| Also, everyone not sharing their contacts with the signal
| app already have that UX. Minus the privacy benefits of
| course.
| bitexploder wrote:
| Signal has always prioritized message security and
| integrity over anonymity. If you want anonymity, Signal
| is not, has not, and probably never will be the tool for
| you.
| tjoff wrote:
| Huge difference between having a low profile and actively
| advertising out new registrations. Does not sit well with
| any conceivable notion of privacy. Which supposedly is
| one of their strongpoints.
| parineum wrote:
| > ...and the result is that the servers have a plaintext
| log of who talks to who on their service
|
| Is there a reason why a user's address book can't be
| encrypted as well?
| rt4mn wrote:
| > It uses phone numbers because phones already have contact
| lists, stored clientside, keyed by those numbers.
|
| This makes sense for contact discovery, which is important
| for normal people who just want a chat app that works.
|
| But there are a very important segment of signal users,
| people with an elevated threat model, who I would be
| willing to bet a good portion of them would gladly
| sacrifice contact lists if it meant not having to share
| their phone number.
| nisegami wrote:
| >Among the 1,900 phone numbers, the attacker explicitly searched
| for three numbers, and we've received a report from one of those
| three users that their account was re-registered.
|
| I wonder if this was a curious attacker trying to see what they
| could do with their access, or a targeted attack.
| AtNightWeCode wrote:
| I think someone might know that certain numbers belong to
| certain users and that they want to prove it. Happened a lot
| with Disqus accounts.
|
| You can btw use the password reset function on many sites to
| correlate it with notifications. Easy at public events.
| joosters wrote:
| The page is also quite vague about how the attacker got _these_
| 1900 phone numbers. It seems to imply that they were just the
| ones around when the attacker got access. But it doesn't
| actually state that clearly. Were they 1900 random numbers or
| were they chosen somehow? The latter is of course far worse.
|
| They also apparently have logs of the attacker searching out
| three specific accounts within these 1900. That seems odd.
| What's the chance that, out of all signal accounts, the three
| they are curious about just happen to be among the 1900 they
| got access to? (Perhaps signal/trillio don't have logs from
| failed searches? That would be pretty poor logging though)
| pr0zac wrote:
| Those 1900 phone numbers would be all the accounts that
| started the registration/re-registration process with Signal
| during the time the unauthorized access was available. That
| process is started on Signal's side and Twilio is only used
| at the midway point to send a device verification SMS.
|
| Any Signal accounts that did not start that process during
| that time would not be able to be intercepted or accessed
| since Twilio has no means to begin it. The three specific
| accounts mentioned would be the cases found that the
| verification message was accessed through Twilio to register
| the account on the attacker's device.
|
| So yes, in effect the 1900 were only the ones around when the
| attacker got access. Whether the specific three were targeted
| attacks or random messing around isn't clear though.
| Confiks wrote:
| The attacker sought out 3 specific numbers. The 1900 number
| is the amount of registrations that occurred during the time
| the attacker had access to re-register their Signal account -
| but likely mostly didn't.
| [deleted]
| eadmund wrote:
| It sure _feels_ like it was targeted. Is trying to re-register
| a Signal account the sort of thing an attacker is likely to do
| at random?
| JustSomeNobody wrote:
| > Is trying to re-register a Signal account the sort of thing
| an attacker is likely to do at random?
|
| Yes. I mean why not, you've got the number(s).
| nisegami wrote:
| Similar to the other reply, I imagine it would be along the
| lines of:
|
| "Holy shit are those Signal 2FA codes? That's wild"
|
| In my head, this is something a more teenager (e.g. Lapsus)
| might think?
| sbussard wrote:
| Nice try, FBI
| baby wrote:
| assigning accounts to numbers is the dumbest thing. I remember
| when I got a new phone number a few years ago I managed to login
| into someone else's venmo account. Numbers are like dynamic IPs,
| why would anyone use this to authenticate you is beyond me.
| hammyhavoc wrote:
| Yes, it's a horribly dated idea, as-is receiving any kind of
| 2FA code via SMS.
| dublinben wrote:
| >Numbers are like dynamic IPs
|
| Maybe for you. For other people who have had the same phone
| number for years or decades, they're the one of the most
| persistent forms of communication or identification available.
| ycombinator_acc wrote:
| >it was possible for them to attempt to register the phone
| numbers they accessed to another device using the SMS
| verification code
|
| That's a thing? If my number expires and gets reassigned to
| someone else, and they register for Signal, I'll get locked out
| of my account just like that? And they'll start getting all the
| messages that were addressed to me?
| ryukafalz wrote:
| Your account is tied to your phone number so pretty sure that's
| the case, yep!
| ycombinator_acc wrote:
| That sounds horrible. Would I be SoL even if I had ticked
| "Registration Lock" prior to that?
| lxgr wrote:
| Apparently as long as you use Signal at least once every 7
| days from a linked device, you should be good:
| https://support.signal.org/hc/en-
| us/articles/360007059792-Si...
|
| Still, given that your number is used as a primary
| identifier, I'd avoid using it in that way for an extended
| amount of time. Among other things, I'm not sure if it's
| possible to re-register using just your phone number and
| PIN (but not access to SMS-OTPs on the associated phone
| number) in case you lose your own device, for example.
| 8organicbits wrote:
| It's not your account any more. The new owner gets "your" SMS
| and phone calls too. The identity is backed by the ownership of
| the number, not your person.
|
| Importantly the safety number will change since it's a new
| device. If you care about stuff like this, verify the new
| device out of band and distrust any unexpected changes. Most
| people don't care and they still see a huge improvement over
| plain SMS.
| dcow wrote:
| This is a weird thread. There's a product that does secure
| messaging with usernames and only requires user/pass. It's called
| Keybase. If this is the product you want, then go use it. I don't
| understand why everyone wants Signal to be something it's not. I
| quite like Signal as they are and this "incident" demonstrates
| exactly what happens if a carrier gets compromised: nothing.
| Nothing happens. Signal decides not to trust any phone
| verifications from the period of compromise and requires affected
| numbers to reregister. All the important crypto has nothing to do
| with phone numbers in Signal's domain. And this is exactly why I
| use Signal. It lets me send secure messages to people using a
| tried and true UX: text messaging, but with its own secure
| application layer. It's really difficult to build a useable
| security product, and Signal has done it successfully.
| stormbrew wrote:
| > If this is the product you want, then go use it.
|
| This is great advice if your goal is to send messages to
| yourself. In the real world, though, a messaging app that
| you're the only one using is about as useful as a bag of ice in
| a snowstorm. People don't need "like signal but with
| usernames," they need "signal with usernames (or email
| addresses or...)" so they can communicate with people who use
| signal.
| dcow wrote:
| This doesn't make any sense. My assertion is that Signal
| would not _be_ Signal if it has usernames. The subtext that I
| did not state specifically is exactly the question of why
| more people don 't use Keybase regularly. Maybe it's not the
| winning UX?
|
| You don't get to look over at Signal and say "wow what a
| great user base I _need_ to be a part of that " and then draw
| the conclusion that "Signal _needs_ to support my idealogical
| aversion to using a phone number ". You're missing the
| possibility that Signing is the way it is _because_ it
| requires users to verify their phone number.
|
| If you can't use a phone number but need to talk to people
| who do, securely, then you _need_ to convince them to use a
| product that accommodates your niche. Why can 't you use PGP
| and email, or Keybase, or <insert one of the 10s of other
| products that let you send encrypted messages>?
|
| Sure, signal could add support for usernames. But how do you
| know there'd be anyone left after they did for you to talk
| to? Maybe it's not what Signal's users _need_.
|
| Anyway, if Signal found a way to support usernames that
| didn't compromise on all the reasons I use signal and also
| didn't open the network up for tons of spam and low quality
| content, I don't think I'd complain. But that's a big IF.
| stormbrew wrote:
| fwiw I am a user of signal and I am expressing my need.
| Allowing it access to my contact list and my phone number
| is a privilege I extend nearly uniquely to it among similar
| apps and I want that gone. Because I can't just "not use
| signal," because signal is where the people I need to talk
| to are. Users are a key feature of any social product, you
| can't just "all else equal" them away.
|
| It's not really my problem if it's hard. That's for them to
| figure out. Until they do I will continue to be an unhappy
| user of their product, and no amount of people on the
| internet willing to defend their choices as if they were
| their own is going to change that.
| tptacek wrote:
| Allowing the Signal client to access your contact list is
| literally the premise of Signal; it's the core security
| UX trade it makes: no durable logs of who's talking to
| who on the servers, and contact lists stored exclusively
| on the client.
| zajio1am wrote:
| These two things are not related in any way. You could
| clearly have a communicator that stores its contact lists
| exclusively on the client, but does not abuse identifiers
| and contact lists of different applications (PSTN calling
| software).
| dcow wrote:
| Let's concede that using other applications' identifiers
| is strictly bad. Probably everyone agrees. Now, how do I
| message you on this pristine application?
|
| Using phone numbers is a _compromise_ taken in order to
| enable a UX that actually wins users. Have we forgotten
| what that word means?
| stormbrew wrote:
| You've said something like this many many times and I
| just don't see the logic of the question. You're talking
| about a feature that you admit is a privacy compromise
| and then comparing it to an absolutely maximalist
| alternative, or a world where people only connect in
| literally one way (through their phone contact lists). Is
| it really so hard to imagine that other compromises may
| be possible, or even coexist?
|
| The answer is I give them my email or username. They give
| me theirs. We connect.
|
| Using phone contact lists shortcuts this process, but the
| exchange still had to happen at some point. Is it really
| so hard to believe some users might choose to do it
| again? Or, god forbid, with someone they'd rather not
| give a phone number to?
| stormbrew wrote:
| That may be their product management premise, but it's
| not why I use it. I use it because people I need to talk
| to are there and it has proper e2e messaging. I'm not
| beholden to their expectations of why I want to use their
| product.
|
| Also I'm not advocating for anything to be kept server
| side, nor do I see any reason why other identifiers
| couldn't be kept client side. An address book is just a
| list of identifiers, it's not magical just because it's
| phone numbers and already on my phone.
|
| We've had this conversation before though. I remain
| unconvinced.
| tptacek wrote:
| Signal replaces messaging services that were all keyed by
| phone number. Use something else. I don't think anybody
| can do better than explaining why Signal works this way,
| and what the benefits are, vs. the (amply articulated)
| liabilities.
|
| This is one of the most boring repeated conversations
| that occurs on HN. It's incessant. Avoiding these
| incessant superficial conversations is, in fact, part of
| the premise of HN.
| stormbrew wrote:
| I agree, it's an exhausting repeated conversation. It's
| almost as if there's a frustrating unmet need with signal
| as it stands for a lot of people that isn't actually
| placated by the repetition of an argument about how they
| grow as a ~~business~~ (sorry, as a non-profit).
|
| And again, signal is the only thing that can talk to
| people on signal so "use something else" is not helpful.
| tptacek wrote:
| They're not a business.
| dcow wrote:
| It is certainly fair to be frustrated. Respectfully, I'd
| challenge anyone who thinks they can build a successful
| secure messaging platform that concocts the perfect UX
| while being absolutely privacy preserving to do so. I'd
| give it a spin.
| stormbrew wrote:
| Except that like I said, users are a feature here. The
| perfect thing may exist but it doesn't matter if no one's
| using it. I don't know about you but I lost belief in the
| idea of a perfectly meritocratic world of social products
| a long time ago.
| password4321 wrote:
| I don't know whether or not it has always been the case,
| but Signal works fine without "access to my contact
| list". The Android app does seem pretty persistent in
| asking for it, though!
| petestream wrote:
| I've learned the hard, or at least slow, way that this
| discussion is mostly futile. All I can say is that a large
| part of the world doesn't use phone numbers like that
| anymore. One of the major benefits of messaging services is
| that they aren't tied to a country, carrier, area, address,
| personal identity or even your phone. It doesn't end up in
| random databases of shopping websites or advertising
| networks. You can share it with someone you briefly met,
| someone unknown or even have someone else share it for you.
|
| I've found, and I think more than me have, that the overlap
| between having an immediate need for security and wanting
| to share you phone number is surprisingly small. And even
| just a subset of those people are on Signal.
|
| It's just never been very useful for me when other services
| are.
| [deleted]
| [deleted]
| baby wrote:
| The reality is that there's room for a new messaging app that
| uses username, has good UI, and is secure. Perhaps wire is that
| app?
| robmusial wrote:
| It's interesting to me that you used Keybase as the example. My
| brain doing its guessing ahead thing assumed you were going to
| say Matrix. I've seen several popular instances of it, and run
| in to people actively using it at least monthly where I haven't
| seen anyone use Keybase in years (since the Zoom acquisition).
| Do you see a lot of people _actively_ using Keybase still?
| pzduniak wrote:
| I personally use it for LOTS of stuff, both personal and
| commercial (as a Slack replacement). Other than a couple bugs
| (pinch to zoom on Android, media playback), it's fine - I
| don't feel like I need any more features, though I'd love it
| to be a bit snappier. KBFS has been excellent for stuff like
| secrets in CI pipelines.
|
| Disclaimer: I'm one of the ex-Keybase, now Zoom people. I'm
| definitely in a bubble. The non-Keybase people I talk with
| are my consultancy's employees + a couple clients.
|
| Keybase's security model is excellent in protecting you from
| attacks like the one described in the OP. If you can't sign
| your device with another one, you can only recover a username
| if:
|
| - it's not in [lockdown
| mode](https://book.keybase.io/docs/lockdown)
|
| - it has a verified email / phone number
|
| - you either click a reset link in the email / SMS _or_ know
| the password
|
| - _and_ the user fails to cancel the reset over many days of
| warnings.
|
| And if you manage to go through all that trouble, all your
| contacts will get blasted with warnings about your identity.
| Fun!
| dcow wrote:
| I don't use Matrix a bunch so it might just be that I'm not
| as familiar and out of the loop. To me Keybase (despite all
| the drama) seems like the most isolated/pure example of a
| product that took the approach of username/password style
| accounts and applied it to application layer crypto to
| achieve secure messaging. Keybase later added all the
| network-y chat type features that make me think more of a
| product like Matrix. But if Matrix is good for 1:1 "chat up
| my contacts and groups thereof", then great. Matrix always
| seemed more like federated Discord or "crypto" IRC to me with
| the whole needing to join channels thing.
| MikeKusold wrote:
| I love Keybase, but I would never recommend it today.
|
| Zoom acqui-hired the team in 2020: https://blog.zoom.us/zoom-
| acquires-keybase-and-announces-goa...
| ianopolous wrote:
| If you're looking for a Keybase replacement, check out
| Peergos (https://peergos.org). Peergos is a P2P E2EE global
| filesystem and application protocol that's:
|
| * fully open source (including the server) and self hostable
|
| * has a business model of charging for a hosted version
|
| * designed so that you don't need to trust your server
|
| * audited by Cure53
|
| * fine-grained access control
|
| * identity proofs with controllable visibility
|
| * encrypted applications like calendar, chat, social media,
| text editor, video streamer, PDF viewer, kanban
|
| * custom apps - you can write your own apps for it (HTML5),
| which run in a sandbox which you can grant various
| permissions
|
| * designed with quantum resistance in mind
|
| You can read more in our tech book (https://book.peergos.org)
| or source (https://github.com/peergos/peergos)
|
| Disclaimer: co-founder here
| kitkat_new wrote:
| can communication happen across servers/vendors like in
| Matrix?
| ianopolous wrote:
| Yes, it's P2P. Anyone on any server can share and
| communicate with anyone on any other server.
|
| You can also migrate server unilaterally and keep your
| social graph without needing to tell everyone, all links
| to your stuff continue to work afterwards.
| schlauerfox wrote:
| It's kept updated, we use it to interact with the Chia
| Blockchain team heavily and you just can't substitute for its
| identity feature to know who you're talking to.
| Y-bar wrote:
| Barely so. looks more like bare minimum life support. The
| slump in code contributions can speak for themselves after
| the acquisition.
|
| https://github.com/keybase/client/graphs/contributors
| dcow wrote:
| I am aware. For one it still works just as well is it ever
| has, the Zoom acquisition didn't change anything there. So if
| you care about features, there shouldn't be any problem. For
| sure it seems to be in maintenance mode, but nothing they
| were doing of late with Lumens was that exciting anyway
| (trying to become a crypto wallet like everyone and their
| mothers).
|
| I would pay $/mo for a Keybase reboot with the goal of
| building a sustainable business like Signal did instead of
| taking VC money for a shot at the moon. Until someone does
| that, Keybase continues to work as a messaging app with
| usernames instead of phone numbers.
| ianopolous wrote:
| I've replied in a sibling comment about Peergos which is
| trying to do just that.
| dcow wrote:
| Definitely checking it out, thanks!
| rvz wrote:
| Yeah I'd rather use Keybase which has username / password
| than the disaster that Signal is right now. Especially when
| you have both Twitter and Twilio breaches, SS7 attacks, SIM
| swapping attacks, etc.
|
| Keybase still works and for a simple messaging app does the
| job better than Signal or any other messaging app that
| requires a phone number. This is a total disaster.
|
| > but nothing they were doing of late with Lumens was that
| exciting anyway (trying to become a crypto wallet like
| everyone and their mothers).
|
| Just like Signal did, with their own private crypto wallet
| and cryptocurrency that they have been working suspiciously
| in the background for a year after being questioned.
| Jenk wrote:
| where does Telegram fit in your opinion?
|
| genuine question from someone oblivious to messaging advances
| in the last decade.
| smegsicle wrote:
| telegram "supports" e2e encryption, but it is frustrating to
| use and is not enabled by default
| tptacek wrote:
| Last time I checked, it also doesn't work for group chats.
| Has that changed?
| hiimkeks wrote:
| The e2e encryption protocol is the definition of "let someone
| who has just learned about Diffie-Hellman roll their one
| crypto". It's called MTProto, and version 2 mostly updates
| padding and uses SHA256 instead of SHA1. Yes, SHA1 was
| deprecated before Telegram even existed. No, version 2 is not
| better.
|
| Cryptographers praise Signal because the protocol makes sense
| and because it's not run by someone as data-hungry as Meta or
| Alphabet (though I think it's hosted on AWS).
|
| Threema is a good alternative if you want username/password,
| but has less users (probably since it's a paid app) and less
| neat security properties (not even forward secrecy).
|
| I agree Signal is not perfect and has never played the Open
| Source game very well (even under Moxie reports from the
| community were largely ignored) and the MobileCoin move is
| weird. I also have not followed the direction the project has
| taken since Moxie left. However, the _entire_ code is open
| source (which iirc is not the case with Telegram) and the
| protocol makes sense (and has been extensively studied), and
| there is a lot of eyes on the development. I remember code
| changes that suggested a pivot to not using phone numbers as
| identifiers (i.e. maybe requiring them for registration but
| not showing it to everyone you talk to).
|
| I wonder whether MLS will go anywhere and actual projects
| will adopt it. Last time I checked it did require consensus
| on message ordering, which seems to make it less well-suited
| for non-centralized protocols like Matrix, but we'll see.
| pr0zac wrote:
| Telegram only provides e2e encryption for one-to-one
| conversations and only if you specifically create a "secret
| chat" largely because of usability and discoverability
| reasons with regard to their major point of focus. Its
| probably better discussed in comparison to other services
| like IRC, Matrix, Discord, or Slack that concentrate on
| feature rich group chat implementation with easy
| discoverability, organization, and mobility for which
| encryption either does not exist or is an opt-in or bolt-on
| feature.
|
| Services like Signal, Whatsapp, Keybase, or iMessage that
| provide e2e encryption for all chats, group or otherwise,
| (albeit with differing levels of implementation security)
| have chosen to do so at the expense of things like mobility
| of chat history across devices and the ability to easily
| discover and join new group chats and instead focus on a less
| organized, more ad-hoc form of messaging that's a rather
| different use case than Telegram's.
| theK wrote:
| Doesn't matrix also do all that with the added benefit of
| federated Id and pulling Chat history from an e2e group
| chat?
| laurex wrote:
| This article is worth a read on that front:
| https://www.wired.com/story/how-telegram-became-anti-
| faceboo...
| skybrian wrote:
| Isn't Keybase semi-abandoned? There hasn't been a blog post
| since 2020 when they were acquired by Zoom.
| ls15 wrote:
| Looks mostly abandoned to me:
|
| https://github.com/keybase/client/graphs/code-frequency
|
| https://github.com/keybase/client/graphs/contributors
| dcow wrote:
| I got a 6.0.1 update for Keybase like yesterday. I agree with
| the sentiment, though, feels like it's in maintenance mode.
| But its core value prop and feature has never stopped
| working. Point was that it's there and it works and if it's
| the UX model you prefer then by all means, use it at least
| until someone comes and reboots the concept.
| throwaway0x7E6 wrote:
| > I quite like Signal as they are and this "incident"
| demonstrates exactly what happens if a carrier gets
| compromised: nothing. Nothing happens. Signal decides not to
| trust any phone verifications from the period of compromise and
| requires affected numbers to reregister.
|
| cool, but entire carriers being compromised has never been a
| concern. it's state agencies forcing carriers to compromise
| individuals.
|
| >I don't understand why everyone wants Signal to be something
| it's not
|
| we don't. we just warn people against using it. it's not a
| privacy tool, it's a larp toy like a commercial VPN.
| [deleted]
| dcow wrote:
| Doesn't everyone get notified when your verification status
| changes? Don't you need to rescan people's security numbers
| or whatever they call it? If this is truly a gripe you have
| couldn't signal also add some sort of delay to the re-
| verification process so that device resets take weeks to be
| trusted and with lots of warning and opportunities for both
| parties to disengage before any hostile actor takes over?
|
| As far as I'm away, Signal is used by plenty of people who
| may be targeted by state agencies. Has there been even one
| "High value target apprehended because Signal" headline?
| hayst4ck wrote:
| I don't think it's stated enough just how easy signal is as a
| drop in replacement for WhatsApp, the main communication method
| for a significant portion of the world. The ability to install
| a new app, use your phones contact database, and be able to use
| the app nearly exactly the same way you used WhatsApp is an
| incredible feature. With almost zero effort you can
| significantly reduce (capitalist or nationstate) surveillance
| against you. It's not perfect but it's a lot of value for
| little effort.
|
| All of these feature requests require less knowledgeable users
| to do new things or weigh alternative options which involves
| time spent developing onboarding. Having "one way," an
| opinionated way, to do a particular type of thing is a very
| useful engineering value especially if you have limited
| engineering resources. Simplicity is an extremely underrated
| feature.
|
| Being 80% perfect for 20% of the work is laudable.
|
| What's even better about Signal is that Facebook's competitive
| data is the list of people you know. Facebook wins every time a
| person adds a friend without adding their contact info to their
| phone. That means Facebook is the source of truth for who you
| know and Facebook is the intermediary for communicating with
| someone else. That's why, in retrospect, whatsapp was an
| obvious competitor worth spending a lot of money acquiring.
| WhatsApp drove people to use their phones contact list as the
| source of truth for you who communicate with, not Facebook's
| friend list.
| hammyhavoc wrote:
| A drop-in replacement would mean that you can still
| communicate with people on WhatsApp. Matrix protocol allows
| you to bridge WhatsApp and many other SaaS comms platforms to
| a single client, truly making is a drop-in replacement for
| WhatsApp.
| Iv wrote:
| Matrix is the protocol I think one should go to if Signal's
| requirement of phone numbers is a turn down.
| tptacek wrote:
| Matrix is great. Just remember that the Matrix threat model
| isn't the Signal threat model: you're usually telling a
| Matrix instance --- or, really, anyone who can compromise or
| suborn that instance --- a lot more about your communication
| patterns than you are with Signal.
|
| Matrix, right now, is a lot more amenable to the kinds of
| messaging that people on HN tend to want to do than Signal
| is. The problematic thing is that HN people tend to believe
| that their workflows are (1) the most important and (2) the
| ones with the most sophisticated threat models. Neither are
| true; (2) is _very_ un-true.
|
| For, like, talking to team members about a shared dev
| project, I'd always use Matrix in preference to Signal --- of
| course, for that kind of work, what I'd really do is just use
| Slack or Discord. Which gets at something about what HN wants
| from Signal.
___________________________________________________________________
(page generated 2022-08-15 23:00 UTC)