[HN Gopher] Signal Server code on GitHub is up to date again
       ___________________________________________________________________
        
       Signal Server code on GitHub is up to date again
        
       Author : domano
       Score  : 247 points
       Date   : 2021-04-07 15:00 UTC (8 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | pmlnr wrote:
       | "Signal Server code on GitHub is up to date again - now with a
       | freshly added shitcoin!"
        
         | mtmail wrote:
         | The addition of micropayments to Signal is discussed separately
         | at https://news.ycombinator.com/item?id=26724237
        
           | [deleted]
        
           | hu3 wrote:
           | It's obviously related. The implication is that they pushed
            | code to GitHub just to gain public trust that can be
           | leveraged to market their cryptocurrency.
        
             | mnd999 wrote:
             | I think you're part correct. I suspect they didn't want to
             | go public with the shitcoin until it was done.
        
               | hvis wrote:
               | That makes a lot of sense, though.
               | 
               | It was probably apparent to them that adding the new
               | crypto payments feature would create at least some kind
               | of community pushback.
               | 
               | Waiting until the feature is reasonably complete and can
               | be judged on its merits is good from a business
               | perspective.
        
               | Slartie wrote:
               | It's also good from a very personal business perspective
               | to have the crucial inside information of an obscure coin
               | named "MobileCoin" (sitting on CoinMarketCap Rank #2378)
               | about to be pumped HARD once the news about inclusion
               | into a popular messaging app hits the airwaves.
               | 
               | It's especially good to have this information while said
               | coin is already publicly traded on some (albeit obscure)
               | exchanges. Because that allows you, as an insider, to
               | slowly accumulate a nicely sized position in that coin
               | while it's still cheap. "MobileCoin" wasn't publicly
               | traded until December of last year, there wasn't even a
               | live blockchain until that month, so if it's true that
               | the first commits in the Signal server repo hinting at
               | the crypto payment plans were created shortly after the
               | repo went silent in April 2020, it is obvious that
               | keeping these commits secret until at least December 2020
               | was crucial to successfully realize these good personal
               | business perspectives - a.k.a. insider trading, but
               | nobody cares about it if it's crypto.
        
               | jeltz wrote:
               | Obviously. Otherwise Signal employees would not be able
               | to do shitcoin insider trading.
        
       | schoolornot wrote:
       | Nice, if only they could be so kind after all this time to
       | provide instructions on how to run it. I don't get why they have
       | been dancing their reputation around Hacker News this way by not
       | releasing sources until there were a bunch of front page posts
       | about it.
        
         | fartcannon wrote:
         | If it looks like a duck, and quacks like a duck, it's probably
         | a duck.
        
       | a3w wrote:
        | Thank Fefe for that: https://blog.fefe.de/?ts=9e9221ad (second
        | update in the post, entirely in German). Coincidence or bad
        | press covfefe? ;)
        
       | newscracker wrote:
       | So it just took close to a year to dump thousands of private
       | commits into the public repo! Is there an official response as to
       | why they stopped sharing the code for so long and more
       | importantly, why they started sharing it publicly again? Who
       | gains what with the publication now? And seriously, why is it
       | even relevant anymore?
        
         | amyjess wrote:
         | My guess is that the whole public repo thing was just one
         | employee's idea, nobody else at the company cared, and that
         | employee either forgot about it or was too busy to do it for
         | months.
        
         | Sander_Marechal wrote:
         | > Is there an official response as to why they stopped sharing
         | the code for so long
         | 
          | Not officially, but see
         | https://news.ycombinator.com/item?id=26725117. They stopped
         | publishing code when they started on the cryptocurrency
         | integration.
        
         | est31 wrote:
         | The first commit that they omitted in April 2020 is related to
         | the payment feature they just announced. So the two events
         | coinciding (server code being published and payment feature
         | being announced) might not have been a coincidence. They
         | apparently didn't want to bother creating a private test server
         | running a private fork of the server code and just pushed their
         | experiments to production, just not releasing the source code
         | to prevent people from seeing the feature before an official
          | announcement. They necessarily built private test client
          | apps, because I couldn't find any old commit mentioning
          | payments in the client app git log.
         | 
         | https://news.ycombinator.com/item?id=26718134
        
           | theptip wrote:
           | This leaves a very bad taste in my mouth. Unclear how much
           | practical damage this caused (how many security analysts are
           | using the Signal server source to look for vulns?) but this
           | is damaging to the project's claims of transparency and
           | trustworthiness.
           | 
           | It's quite clear that this crypto integration provides a
           | perverse incentive for the project that points in the
           | opposite direction of security.
        
             | _dibly wrote:
             | Forgive me if this is a stupid question, but how exactly is
             | that the case?
             | 
             | It's been damaging to their claims of transparency for
             | almost a year now, if anything this should be the first
             | step in repairing that slight. How is dumping a year's
             | worth of private work into your public repo somehow doing
             | damage to their trustworthiness?
        
               | stjohnswarts wrote:
               | For one security through obscurity is a thing. Depending
               | on it as your primary "security measure" is stupid on all
               | levels but being part of your security is not a bad
               | thing. Before all someone could get would be your chat
               | history. Other than police, jilted lovers, and state
               | actors no one else gives a crap about that most likely
               | unless you are targeted as an individual. Now by adding
               | access to money that might be accessible via Signal adds
               | more incentive for hackers to not try to hack something
               | else and now make a beeline for Signal. Also it dilutes
               | the efforts of the Signal developers efforts to make a
               | better messaging app. Also crypto in and of itself is
               | questionable, but one that is 85% by one entity waiting
               | to liquidate has been questioned by many organizations as
               | well. The people who own that will expect fair value for
               | it and in essence become billionaires several times over
               | if this really comes to fruition.
        
               | theptip wrote:
               | You're right that the damage to trustworthiness was
               | always there. (I.e. they did the damage when they stopped
               | publishing their source code, and they compounded that
               | damage the longer they declined to publish their code).
               | My point was more that the damage now seems to be
               | directly attributable to the new payments integration.
               | 
               | Prior to seeing this post, I was already concerned that
               | adding a crypto/payments integration would damage the
               | Signal project, and this appears to be an immediate
               | example of the kind of harms/perverse incentives I was
               | concerned about.
               | 
               | (A counterargument to my theory here would perhaps be
               | "Signal was always doing stuff like declining to publish
               | their server code even prior to the payments
               | integration", I'm not familiar enough with the history of
               | the project to know the details there.)
        
               | _dibly wrote:
               | Reading the other article on HN definitely helped me
               | understand more. I think really it comes down to me not
               | understanding why they had so much trustworthiness to
               | begin with.
               | 
               | They've been obscuring their code for about a year and
               | even then, it's not like Signal has always come out and
               | said "we love the passion our fellow developers have for
               | our commitment to privacy and security". They just let
               | people sell their relatives on that promise and waited
               | until they had a massive userbase to start monetizing
               | their platform.
               | 
               | Thanks for your reply, I just wonder where all this
               | trustworthiness has been coming from for the last 12
               | months while they've been quietly working on the platform
               | without publishing any changes. It feels like a beta
               | tester for a game being mad that there were lootboxes in
               | the full release of the game when they weren't in the
               | beta. Even if you didn't know they were coming, you had
               | to assume something like it was inevitable given enough
               | traction.
        
               | mindslight wrote:
               | Signal's choices never really felt right, as their
               | justifications tended towards authoritarian paternalism -
                | eg willful reliance on Play services, keeping it out of
               | F-Droid (which while flawed as Signal pointed out, seems
               | to be the best we currently have), bottleneck centralized
               | servers, and phone numbers as primary identifiers (?!).
               | 
               | But the standard Free Software development/distribution
               | model does lack in some areas. And so Signal got a bunch
               | of community leeway for going against the grain, in the
               | hopes that a fresh approach would somehow bear fruit.
               | 
               | We're now apparently seeing some of the fruit from that
               | approach.
        
               | iudqnolq wrote:
               | I don't agree with adding cryptocurrencies, but I was
               | very sympathetic to the play services argument. Android
               | is very difficult to program for, and it's even more
               | difficult without play services.
               | 
               | For notifications the alternatives are noticably worse
               | (higher battery usage because you can't coordinate
               | request timings with other apps, an annoying permanent
               | notification), and the leakage is minimal. If you protect
               | your encrypted packets from Google the NSA will see them
               | anyway.
               | 
               | Your custom implementation will be quite complicated, and
               | if you only enable it for a small subset of your users
               | it'll be a pain to debug.
        
               | mindslight wrote:
               | I said willfully for a reason, as opposed to just
               | reluctantly.
               | 
               | I agree about the sorry state of non-Google notifications
               | on Android. I wish someone would make a common
               | notification framework for the Free world that would be
               | installed alongside system-level F-Droid. Although
                | F-Droid Conversations and Element notifications do work
                | fine for me, despite the purportedly worse battery life,
               | I can understand not everyone wants to make the same
               | choice.
               | 
               | However, I'm referencing more than the notifications
               | issue. I recall an early thread from Signal where they
               | touted the benefits of fully opting into the Google
               | ecosystem - the gist was that Google has expended all of
               | this effort on security and they wanted to take advantage
               | of it to bring security to the masses. And that simply
               | doesn't line up with my own threat model, in which Google
               | is one of the most significant attackers.
        
             | kelnos wrote:
             | The server being or not being secure is only important to
             | the people who operate it. You can examine the client code
             | and see that your messages are encrypted end to end.
             | Signal's entire security model revolves around the idea
             | that you don't need to trust the server.
        
               | theptip wrote:
               | There's no concern about metadata leakage?
        
               | outime wrote:
               | Even if you have access to an up-to-date source code it
               | doesn't guarantee at all they'd be running a completely
               | different version if so they wish. I mean this have just
               | happened yet this question kind of implies you'd still
               | trust such entity to run the server from the source code
               | you have access to. I hope this collective illusion dies
               | already.
        
             | ignoramous wrote:
              | It was called out as recently as 4 weeks ago [0] and was
              | voted to the front page but then weighted out, possibly
              | incorrectly, by mods (maybe because the top comment is
              | dismissive of the concerns raised [1]?) before a
              | discussion could flourish.
             | 
             | cc: @dang
             | 
             | [0] https://news.ycombinator.com/item?id=26345937
             | 
             | [1] _The title is the only thing worth reading in this pile
             | of speculation and hand waving._
        
         | jiveturkey wrote:
         | I think it's proof that security (and privacy) doesn't matter.
          | So it is very relevant. (As if Telegram as a competitor isn't
          | enough proof.)
         | 
         | The entirety of the signal "stack" depends on the SGX enclave.
         | The fact that no one, in all time, has bothered to notice that
         | the running code is different than the published code, is
         | telling.
         | 
         | There's actually a newer SGX exploit, and related mitigation,
          | that came to light at about the same time they released
         | their discovery protocol. Those mitigations were never
         | backported to the base signal functionality. That no one
         | audited and complained about this says quite a lot.
         | 
         | I've not looked at this code dump but perhaps the newer fixes
         | finally made their way in. Or have been there all along.
        
           | ajconway wrote:
           | > The fact that no one, in all time, has bothered to notice
           | that the running code is different than the published code
           | 
           | It's client apps who verify (via attestation) that the code
           | inside an SGX enclave is what they expect it to be, and
           | clients are open source.
           | 
           | > The entirety of the signal "stack" depends on the SGX
           | enclave
           | 
           | Only private contact discovery depends on trusting SGX.
        
             | jiveturkey wrote:
             | > It's client apps who verify (via attestation) that the
             | code inside an SGX enclave is what they expect it to be,
             | and clients are open source.
             | 
              | Attestation only tells us whether the running enclave
              | matches the published enclave code. So either there's a
              | missing mitigation, which no one has ever complained
              | about, or the running enclave code doesn't match the
              | source, which also no one has ever complained about.
              | Without independent audit there is no verification, and
              | we have established that independent parties do not care.
             | 
             | > Only private contact discovery depends on trusting SGX.
             | 
             | uh, no. this is demonstrably and obviously wrong.
        
               | feanaro wrote:
               | > uh, no. this is demonstrably and obviously wrong.
               | 
               | Yes? How?
        
               | tylersmith wrote:
               | Then please demonstrate.
        
         | nileshtrivedi wrote:
         | Here's a response by MobileCoin folks:
         | 
         | > Signal had to verify that MobileCoin worked before exposing
         | their users to the technology. That process took a long time
         | because MobileCoin has lots of complicated moving parts.
         | 
         | > With respect to price, no one truly understands the market.
         | It's impossible to predict future price.
         | 
         | - https://twitter.com/mobilecoin/status/1379830618876338179
         | 
         | Reeks of utter BS. As the reply on this tweet says, features
         | can be developed while being kept switched off with a flag.
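The feature-flag approach the reply alludes to can be sketched as follows (hypothetical names, not Signal's actual implementation): the payments code ships in the public repo but stays inert until a config flag enables it.

```python
class Config:
    """Holds server-side feature flags; everything defaults to off."""

    def __init__(self, flags=None):
        self.flags = flags or {}

    def is_enabled(self, name: str) -> bool:
        return self.flags.get(name, False)


def handle_request(config: Config, request: dict) -> dict:
    # The payments code path exists in the public source, but while the
    # flag is off the server behaves as if the feature doesn't exist.
    if request.get("type") == "payment":
        if not config.is_enabled("payments"):
            return {"status": 404, "error": "unknown request type"}
        return {"status": 200, "result": "payment accepted"}
    return {"status": 200, "result": "ok"}
```

Kept dark this way, the code can be developed, reviewed, and deployed publicly, then switched on at announcement time.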
        
           | dewey wrote:
           | > features can be developed while being kept switched off
           | with a flag
           | 
           | But maybe you don't want everyone to know about all the
           | features / announcements months in advance?
        
             | vinay427 wrote:
             | They already did this development privately. I don't think
             | anyone has a problem with building out a new feature before
             | it's announced. The problem people have, IMO
             | understandably, is that they pushed this code to production
             | servers instead of testing it privately.
        
         | nimbius wrote:
          | Better question yet: did we ever get a full post-mortem of
          | the six-day outage the service had, other than hand-waving
          | statements about user subscriptions? What fixes were made or
          | lessons learned?
        
           | ddevault wrote:
           | The Signal outage was SIX DAYS?
        
             | saurik wrote:
             | (All the news I'm finding is that it was just one day.)
        
               | uoaei wrote:
               | As I recall it was 1-2 days, yeah.
        
             | stjohnswarts wrote:
             | no it wasn't
        
         | mrtweetyhack wrote:
         | remove plain text passwords and China's message redirects
        
       | NotSammyHagar wrote:
        | I am disappointed they are supporting MobileCoin yet it's not
        | available for use in the US. I understand it has important
        | privacy and security features, but there seems little point to
        | it if it can't be used in most of the world (the US isn't that
        | big, but it's important financially). There were some confusing
        | comments that it could easily be supported one day. I guessed
        | that it wasn't supported because they somehow want to avoid US
        | financial regulation. So what is the reason for no US-capable
        | currencies here?
        
       | BTCOG wrote:
        | You can't even use MobileCoin in the US (not that I'm a Signal
        | user). Are they planning to focus on foreign markets, or to get
        | audited heavily by all the US financial agencies?
        
         | Vinnl wrote:
         | I think they're anticipating support for other coins, which
         | might work in the US, judging by their blog post [0]:
         | 
         | > The _first_ payments protocol we've added support for is a
         | privacy focused payments network called MobileCoin, which has
         | its own currency, MOB.
         | 
         | (Emphasis mine.)
         | 
         | [0] https://signal.org/blog/help-us-test-payments-in-signal/
        
       | varispeed wrote:
        | How long until it gets closed down because, you know, dealers
        | and stuff? On the other hand, is it possible to compile the
        | whole thing and run Signal on premises, end to end?
        
       | [deleted]
        
       | woah wrote:
       | A lot of these comments are just manifestations of the kneejerk
       | HN "crypto bad" reflex. Here's the deal:
       | 
       | - Whether or not Signal's server is open source has nothing to do
       | with security. Signal's security rests on the user's knowledge
       | that the open source client is encrypting messages end to end.
       | With that knowledge, the server code could be anything, and
       | Signal inc. would still not be able to read your messages. In
       | fact, having the server code open source adds absolutely nothing
       | to this security model, because no matter how open source and
       | secure the server code might be, Signal inc. could still be
       | logging messages upstream of it. The security rests only upon the
       | open source client code. The server is completely orthogonal to
       | security.
       | 
       | - Signal's decision to keep early development of the MobileCoin
        | feature set private was valid. Signal is not your weekend
        | Node.js module with two stars on GitHub. When changes get made
        | to the
       | repo, they will be noticed. This might mess up their marketing
       | plan, especially if they weren't even sure whether they were
       | going to end up going live with the feature. Signal is playing in
       | the big leagues, competing with messengers which have billions of
       | dollars in marketing budget, will never ever be even the smallest
       | amount open source, and are selling all your messages to the
       | highest bidder. They can't afford to handicap themselves just to
       | keep some guys on Hacker News happy.
       | 
       | - Signal's decision to keep development to the (private) master
       | branch, instead of splitting the MobileCoin integration into a
       | long-running feature branch is a valid choice. It's a lot of work
       | to keep a feature branch up to date over years, and to split
       | every feature up into the public and non-public components which
       | then get committed to separate branches. This would greatly
       | affect their architecture and slow down shipping for no benefit,
       | given that the open sourceness of the server is orthogonal to
       | security.
        
         | gojomo wrote:
         | Signal Foundation has legitimate _self-serving_ strategic
         | reasons to prefer such secrecy, sure.
         | 
         | But users also have legitimate reasons to want more
         | transparency into both source-code & strategy.
         | 
         | Whether such secrecy best serves the _users_ & the cause of
         | private messaging is an open question.
        
         | emptysongglass wrote:
         | You're apologizing for a project that has repeatedly damaged
         | user trust with excuses.
         | 
         | These are "valid" reasons for keeping the source code private
         | for a year? By whose book? Yours? Certainly not by mine. I
         | wouldn't let any other business abscond from its promise to
         | keep open source open source in spirit and practice, why would
         | I let Signal?
         | 
          | This is some underhanded, _sneaky_ maneuvering I'm more used
         | to seeing from the Amazons and the Facebooks of the world.
         | These are not the actions of an ethically Good organization.
          | And as has already been demonstrated by Moxie in his lust for
          | power, he's more than capable of deviance. On Wire vs Signal:
         | "He claimed that we had copied his work and demanded that we
         | either recreate it without looking at his code, or take a
         | license from him and add his copyright header to our code. We
         | explained that we have not copied his work. His behavior was
         | concerning and went beyond a reasonable business exchange -- he
         | claimed to have recorded a phone call with me without my
         | knowledge or consent, and he threatened to go public with
         | information about alleged vulnerabilities in Wire's
         | implementation that he refused to identify." [1]
         | 
         | These are not the machinations of the crypto-idealist, scrappy
         | underdog for justice we are painted by such publications as the
         | New Yorker. This is some straight up cartoon villain twirling
         | their moustache plotting.
         | 
          | So now I'm being sold on a business vision that was just so
          | hot the public's eyes couldn't bear it? We're talking about a
          | _pre-mined_ cryptocurrency whose inventors are laughing all
          | the way to the bank.
         | 
         | At least Pavel Durov of Telegram is honest with his users. At
         | least we have Element doing their work in the open for all to
         | see with the Matrix protocol. There are better, more ethical,
         | less shady organizations out there who we can and ought to be
          | putting our trust in, not this freakshow of a morally-
          | compromised shambles.
         | 
         | [1] https://medium.com/@wireapp/axolotl-and-
         | proteus-788519b186a7
        
           | selykg wrote:
           | Repeatedly? This is the first I'm aware of, what are the
           | others?
        
         | codethief wrote:
         | > Whether or not Signal's server is open source has nothing to
         | do with security
         | 
          | This is true only when you are exclusively concerned about
          | your messages' content but not about the metadata. As we all
          | know, though, the metadata is the valuable stuff.
         | 
         | There is a second reason it is wrong, though: These days, lots
         | of actual user data (i.e. != metadata) gets uploaded to the
         | Signal servers[0] and encrypted with the user's Signal PIN
         | (modulo some key derivation function). Unfortunately, many
         | users choose an insecure PIN, not a passphrase with lots of
         | entropy, so the derived encryption key isn't particularly
         | strong. (IMO it doesn't help that it's called a PIN. They
         | should rather call it "ultra-secure master passphrase".) This
         | is where a technology called Intel SGX comes into play: It
         | provides remote attestation that the code running on the
         | servers is the real deal, i.e. the trusted and verified code,
         | and not the code with the added backdoor. So yes, the server
         | code _does_ need to be published and verified.
         | 
         | Finally, let's not forget the fact that SGX doesn't seem
         | particularly secure, either[1], so it's even more important
         | that the Signal developers be open about the server code.
         | 
         | [0]: https://signal.org/blog/secure-value-recovery/
         | 
         | [1]: https://blog.cryptographyengineering.com/2020/07/10/a-few-
         | th...
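The attestation argument above can be sketched from the client's side. This is a hypothetical, heavily simplified illustration (real SGX attestation involves Intel-signed quotes, not a bare hash comparison): the client pins the measurement (MRENCLAVE) of the enclave build it trusts, which is only meaningful if the server code is published and reproducibly buildable.

```python
import hashlib

# Placeholder: in reality this would be the MRENCLAVE measurement of a
# reproducible build of the published enclave source.
TRUSTED_MRENCLAVE = hashlib.sha256(b"published enclave build").hexdigest()


def verify_attestation(reported_mrenclave: str) -> bool:
    # Accept the server only if it reports the measurement of the
    # trusted, publicly auditable enclave code; any backdoored build
    # would produce a different measurement.
    return reported_mrenclave == TRUSTED_MRENCLAVE
```

Without published source there is nothing to compute the trusted measurement from, which is the crux of the comment above.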
        
           | codethief wrote:
           | Addendum: Out of pure interest I just went into a deep dive
           | into the Signal-Android repository and tried to figure out
           | where exactly the SGX remote attestation happens. I figured
           | that somewhere in the app there should be hash or something
           | of the code running on the servers.
           | 
           | Unfortunately, `rg -i SGX` only yielded the following two
           | pieces of code:
           | 
           | https://github.com/signalapp/Signal-
           | Android/blob/master/libs...
           | 
           | https://github.com/signalapp/Signal-
           | Android/blob/master/libs...
           | 
           | No immediate sign of a fixed hash. Instead, it looks like the
           | code only verifies the certificate chain of some signature?
           | How does this help if we want to verify the server is running
           | a specific version of the code and we cannot trust the
           | certificate issuer (whether it's Intel or Signal)?
           | 
           | I'm probably (hopefully) wrong here, so maybe someone else
           | who's more familiar with the code could chime in here and
           | explain this to me? :)
        
           | wolverine876 wrote:
           | > These days, lots of actual user data (i.e. != metadata)
           | gets uploaded to the Signal servers[0] and encrypted with the
           | user's Signal PIN (modulo some key derivation function).
           | Unfortunately, many users choose an insecure PIN, not a
           | passphrase with lots of entropy, so the derived encryption
           | key isn't particularly strong.
           | 
           | If I understand what you are saying and what Signal says,
           | Signal anticipates this problem and provides a solution that
           | is arguably optimal:
           | 
           | https://signal.org/blog/secure-value-recovery/
           | 
           | My (limited) understanding is that the master key consists of
           | the user PIN plus c2, a 256 bit code generated by a secure
           | RNG, and that the Signal client uses a key derivation
           | function to maximize the master key's entropy. c2 is stored
           | in SGX on Signal's servers. If the user PIN is sufficiently
           | secure, c2's security won't matter - an attacker with c2
           | still can't bypass the PIN. If the PIN is not sufficiently
            | secure, as often happens, c2 stored in SGX might be the most
            | secure way to augment it while still making the data
            | recoverable.
           | 
           | I'd love to hear from a security specialist regarding this
           | scheme. I'm not one and I had only limited time to study the
           | link above.
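The scheme described above can be sketched as follows (simplified, with hypothetical parameter choices; Signal's actual implementation uses Argon2 and its own constructions): a low-entropy PIN is stretched by a slow KDF and combined with c2, a random 256-bit secret held server-side in SGX.

```python
import hashlib
import secrets


def derive_master_key(pin: str, c2: bytes) -> bytes:
    # Stretch the PIN with a slow KDF, then mix in c2. Without c2, an
    # attacker must still brute-force the PIN against the KDF; without
    # the PIN, c2 alone is useless.
    stretched = hashlib.pbkdf2_hmac("sha256", pin.encode(), b"svr-salt", 100_000)
    return hashlib.sha256(stretched + c2).digest()


c2 = secrets.token_bytes(32)  # generated once, stored inside the enclave
key = derive_master_key("1234", c2)
```

The security then rests on SGX refusing to release c2 (and rate-limiting PIN guesses), which is why the enclave code matters.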
        
             | codethief wrote:
             | > If I understand what you are saying and what Signal says,
             | Signal anticipates this problem and provides a solution
             | that is arguably optimal
             | 
             | Yep, this is what I meant when I said "This is where a
             | technology called Intel SGX comes into play". :)
             | 
             | And you're right, SGX is better than nothing if you accept
             | that people use insecure PINs. My argument mainly was that
             | 
             | - the UI is designed in the worst possible way and actually
             | encourages people to choose a short insecure PIN instead of
             | recommending a longer one. This means that security
             | guarantees suddenly rest entirely on SGX.
             | 
             | - SGX requires the server code to be verified and published
             | (which it wasn't until yesterday). Without verification,
             | it's all pointless.
             | 
             | > uses a key derivation function to maximize the master
             | key's entropy
             | 
             | Nitpick: Technically, the KDF is deterministic, so it
             | cannot change the entropy and - as the article says - you
             | could still brute-force short PINs (if it weren't for SGX).
             | 
             | > I'd love to hear from a security specialist regarding
             | this scheme. I'm not one and I had only limited time to
             | study the link above.
             | 
             | Have a look at link [1] in my previous comment. :)
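The nitpick can be made concrete: because the KDF is deterministic, the effective keyspace is exactly the PIN space, and a 4-digit PIN yields only 10,000 candidate keys. A toy sketch (illustrative parameters, and no SGX-enforced guess limits):

```python
import hashlib

def kdf(pin: str) -> bytes:
    # Deterministic: the same PIN always maps to the same key, so key
    # stretching slows guessing down but adds no entropy.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), b"salt", 1_000)

target = kdf("4821")  # key derived from an unknown 4-digit PIN

# Without rate limiting, exhausting all 10,000 possible PINs is trivial:
recovered = next(p for p in (f"{i:04d}" for i in range(10_000)) if kdf(p) == target)
```

This is exactly the attack that storing c2 inside SGX (with a guess counter) is meant to prevent.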
        
           | im3w1l wrote:
           | SGX is just the processor pinky swearing (signed with Intel
           | keys) that everything is totally legit. Nation State
           | Adversaries can and will take Intel's keys and lie.
        
             | codethief wrote:
             | SGX is also supposed to protect against Signal as a
             | potential adversary, though, as well as against hackers. Or
             | at least that's how I understood the blog article.
        
         | iudqnolq wrote:
         | Focussing on whether the changes directly make things insecure
         | is missing the point. Fundamentally this sort of security is
         | about trust.
         | 
         | While it's nice to try to have Signal be resilient to attacks
         | by the core team, there just aren't enough community-minded
          | independent volunteer code reviewers to reliably catch them.
         | I doubt the signal foundation gets any significant volunteer
         | efforts, even by programmers who aren't security experts.
         | 
         | That means I need to decide if I trust the Signal Foundation.
         | Shilling sketchy cryptocurrencies is indicative of loose
         | morals, which makes me think I was wrong to trust them in the
         | past.
        
           | cptskippy wrote:
           | > Shilling sketchy cryptocurrencies is indicative of loose
           | morals, which makes me think I was wrong to trust them in the
           | past.
           | 
           | Who decided it was sketchy?
           | 
           | The "I don't like change so I'm going to piss all over you"
            | attitude is what sinks a lot of good things.
           | 
           | How does Signal benefit from being a shill for this coin? Are
           | they being paid by MOB or do they get a % of the cut?
           | 
           | So far all I've read are people screaming their heads off
           | that MOB eats babies and how dare Signal stoop so low as to
           | even fart in their general direction, but I have yet to see
           | anyone explain why MOB is bad or how Signal is bad for giving
           | MOB a platform.
        
             | bostonsre wrote:
             | Yea, I'm bearish on cryptocurrencies, but I think moxie and
             | his team have built up an incredible amount of goodwill in
             | my book. Enough for me to hear out their solution before
             | making a decision. I'm assuming they didn't write dogecoin2
             | or even a bitcoin clone. It will be interesting to learn
             | about it.
        
             | iudqnolq wrote:
             | > How does Signal benefit from being a shill for this coin?
             | Are they being paid by MOB or do they get a % of the cut?
             | 
             | The CEO of signal messenger LLC was/is the CTO of MOB.
             | 
             | See https://www.reddit.com/r/signal/comments/mm6nad/bought_
             | mobil... and https://www.wired.com/story/signal-mobilecoin-
             | payments-messa...
        
         | CynicusRex wrote:
         | >kneejerk HN "crypto bad" reflex
         | 
         | You make it sound as if that's a bad thing. Reflexes are
          | beneficial when the threats are real. And the crypto"currency"
         | multi-level marketing pyramid schemes are very real. They do
         | nothing but induce greed, gambling, spam, and all-around toxic
         | behavior. It's a digital cancer that needs to end.
        
           | z3ncyberpunk wrote:
            | So you want us to stick with the usual paper money, I'm
            | sorry -- fiat currency, pyramid scheme? No thanks.
        
         | pmlnr wrote:
         | > A lot of these comments are just manifestations of the
         | kneejerk HN "crypto bad" reflex.
         | 
         | Nope. It's a reaction to "who the f* asked for this in a
         | messaging app?!".
        
           | neolog wrote:
           | Text isn't the only thing I want to be able to send to
           | people. I wish there were a universal "send thing" api that
           | could be implemented for text, images, money, whatever.
        
           | cptskippy wrote:
           | Unless you have your head thoroughly buried in the sand,
           | you'd understand that all the major players allow people to
           | send money AND people are using those platforms to send
           | money.
           | 
           | When people evaluate a new messaging client, the minimum
           | feature set required to be considered viable now includes
           | sending money for a lot of the population.
           | 
           | * removed insult
        
         | unhammer wrote:
         | > The security rests only upon the open source client code. The
         | server is completely orthogonal to security.
         | 
         | For Android at least, builds are reproducible
         | https://signal.org/blog/reproducible-android/ (would be neat if
         | there was one or more third party CI's that also checked that
         | the CI-built app reproduces the one on Google Play Store - or
         | maybe there already are?)
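A third-party check like the one suggested could be as simple as hashing the two artifacts and comparing digests. A sketch (the file paths are placeholders):

```python
import hashlib

def sha256_file(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# A CI job would build the APK from the tagged source, download the store
# APK, then fail loudly if the digests differ:
# assert sha256_file("ci-built.apk") == sha256_file("play-store.apk")
```

In practice the store APK's signature data differs and has to be excluded before comparing; the blog post linked above describes Signal's actual procedure.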
        
           | cptskippy wrote:
           | That's pretty neat, I wasn't aware that was possible.
        
         | lucideer wrote:
         | > _- Whether or not Signal 's server is open source has nothing
         | to do with security. [...] having the server code open source
         | adds absolutely nothing to this security model, [...] The
         | security rests only upon the open source client code. The
         | server is completely orthogonal to security._
         | 
         | The issue a lot of people have with Signal is that your
         | definition here of where security comes from is an extremely
         | narrow & technical one, and many would rather look at security
         | in a more holistic manner.
         | 
         | The problem with messaging security is that there's two ends,
         | and individually we only control one of them. Ok,
         | screenshotting & leaking your messages will always be a concern
         | no matter what technology we develop, but the other challenge
         | is just getting the other end to use Signal in the first place
         | and that's governed by the network effect of competitors.
         | 
         | Open Source is essential for security because one of the most
         | fundamental security features we can possibly hope to gain is
         | platform mobility. Signal doesn't offer any. If Signal gains
         | mass adoption and the server changes, we're right back to our
         | current security challenge: getting your contacts onto the new
         | secure thing.
        
           | kreetx wrote:
           | But now the server code is there, so we now have this
           | mobility, no?
        
             | lucideer wrote:
             | Yes and no.
             | 
             | Signal is not actually designed with mobility in mind (in
             | fact I would argue, based on Moxie's 36C3 talks, it was
             | designed to be and continues to be persistently kept anti-
             | mobility). That fact is independent of it being open- or
             | closed-source.
             | 
             | However, if the server is open-source, it opens the door
             | for future mobility in the event of org change. If it's
             | closed-source, you get what's currently happening with
             | WhatsApp.
             | 
             | In actuality, if we had something federated, with mobility
             | pre-baked in, having a closed-source server would be less
             | of a security-risk (the gp's comments on only needing to
             | trust the client would apply more strongly since mobility
             | removes the power to change from server maintainers)
             | 
             | Basically:
             | 
             | - with multi-server clients (e.g. Matrix/OMEMO), you have
             | no dependency on any orgs' server, so their being open-
             | source is less relevant (provided the protocol remains open
             | --this can still go wrong, e.g. with GChat/FBMessenger's
             | use of XMPP).
             | 
             | - with single-server clients (Telegram/WhatsApp/Signal),
             | you are dependent on a single server, so that server being
             | open-source is important to ensure the community can make
             | changes in the event of org change.
        
               | kreetx wrote:
               | So in principle we do have this mobility because you can
               | run your own servers. Perhaps it is not all that unlikely
                | that they will build a bridge to Matrix.
        
             | acrispino wrote:
             | Until they decide to go silent for another 11 months
        
               | kreetx wrote:
               | Most of the popular chat-app space is not open source.
               | What is it with Signal that people feel entitled to
                | condemn it for not having the latest commits on GitHub?
        
               | neolog wrote:
               | What is it with chat apps that people don't condemn them
               | for being closed source? Imagine if GCC hid their changes
               | for a year.
        
       | red_trumpet wrote:
       | Is there any mechanism to validate that the code running on
       | Signal's servers is the same as on Github?
        
         | gorkish wrote:
         | I am curious how this could even possibly be done.
         | 
          | As far as my understanding goes, it's hardly possible to even
          | verify that a compiled binary is a faithful translation of
          | the source instructions, let alone that it will execute that
          | way when run through a modern OS and CPU pipeline.
         | 
         | I would think the objective here is more about releasing server
         | code that can be run independently in a way that 1) doesn't
         | involve signal's infrastructure and 2) allows the client/server
         | interactions to be audited in a way that trust of the server
         | side is unnecessary, regardless of what code it may or may not
         | be running.
        
         | jiveturkey wrote:
         | Yes, via SGX remote attestation.
         | 
         | https://sgx101.gitbook.io/sgx101/sgx-bootstrap/attestation
        
         | [deleted]
        
         | knocte wrote:
         | No.
        
         | Someone1234 wrote:
         | How would that work? You'd be layering trust on trust, wherein
         | if they're willing to lie about one thing they're willing to
         | lie about confirmation of that same thing (or not).
         | 
         | Unless you're going to hire some independent auditor (that you
         | still have to trust) it seems logically problematic.
        
           | madars wrote:
           | SGX enclaves can attest to the code they are running, so you
           | don't exactly need to take Signal's word on faith.
        
             | eptcyka wrote:
             | Except SGX enclaves are horribly broken.
        
               | monocasa wrote:
                | Like, does an SGX enclave attest that Meltdown is patched
               | in microcode? That's one way to pull the keys out.
               | 
                | The recentish work to get read/write access to some
                | Intel CPUs' microcode can probably break SGX too. I wouldn't be
               | surprised if the ME code execution flaws could be used
               | that way too.
        
             | Someone1234 wrote:
             | That isn't a solution to the problem being discussed (a
             | provider's server code being verifiable by end users). I'm
             | quite confused by the suggestion that it could be/is.
        
         | codethief wrote:
         | As others have already mentioned there is Intel SGX and the
         | Signal developers indeed say they use it, see
         | 
         | https://news.ycombinator.com/item?id=26729786
        
         | Foxboron wrote:
         | There isn't, but people are working on getting us there. The
         | first project that comes to mind is "System Transparency".
         | 
         | https://system-transparency.org/
        
         | mikece wrote:
         | Seems there should be an API endpoint, similar to a health
         | check endpoint, that allows one to validate that the code on
         | the server matches what's in GitHub. How exactly that would
         | work is beyond me since I'm not a cryptographer but seems like
         | an easy way to let developers/auditors/the curious check to see
         | that the code on the server and GitHub match.
        
           | monocasa wrote:
            | validate_endpoint() {
            |     return hash_against_other_file_not_exe();
            | }
        
           | beaconstudios wrote:
           | if you assume that the server can lie to you, then it's
           | physically impossible. Any query could be answered by
           | interrogating a copy of the github version of the server and
           | returning the answer.
        
           | jhugo wrote:
           | How could that possibly work? The API endpoint of a malicious
           | modified server could just return whatever the API endpoint
           | of the non-malicious non-modified server returns.
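The impossibility argument in the comments above can be made concrete with a toy sketch (the class and method names are hypothetical): any self-reporting "verify yourself" endpoint can be answered by delegating to an untampered copy of the published code.

```python
class HonestServer:
    """Runs exactly the published source."""
    def verify(self) -> str:
        return "sha256:source-digest"  # whatever the real endpoint would report

class MaliciousServer:
    """Runs modified code, but keeps an honest copy around just for audits."""
    def __init__(self) -> None:
        self.decoy = HonestServer()

    def verify(self) -> str:
        return self.decoy.verify()  # indistinguishable from the honest answer

    def handle_message(self, ciphertext: bytes) -> bytes:
        return ciphertext  # ...while the real request path can do anything it likes
```

No client-side query can tell the two servers apart, which is why remote attestation has to be rooted in hardware (SGX) rather than in the server's own claims.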
        
         | monocasa wrote:
         | That's basically the same problem as DRM, so no, you can't
         | verify that someone is running only code you want them to run
         | against data you gave them, on hardware they own.
        
           | lxgr wrote:
           | Yet DRM does exist. (Yes, these schemes usually end up
           | getting broken at some point, but so does other software.)
           | 
           | The problem is more generally called trusted computing, with
           | Intel SGX being an implementation (albeit one with a pretty
           | bad track record).
        
         | danpalmer wrote:
         | I think the argument would be that with end-to-end encryption
         | this is unnecessary, which is good because it's impossible.
         | 
         | There's a counter-argument that there is still useful metadata
         | a server can glean from its users, but it's certainly minimised
         | with a good protocol... like the Signal protocol.
        
           | cortesoft wrote:
           | Wait, how would end-to-end encryption help this problem at
           | all? I agree that it is impossible (currently), but not sure
           | how E2E helps anything?
           | 
           | E2E encryption only helps you verify WHO you are connecting
           | to, not what they are doing with your connection once it is
           | established.
        
             | iudqnolq wrote:
             | Because the other end in E2E is your friend's phone, not a
             | server. We call end-to-server encryption "in-flight"
             | encryption.
        
               | cortesoft wrote:
               | Ah, ok I misunderstood
        
             | gsich wrote:
             | Because it doesn't matter what the server does in terms of
             | message content.
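A toy illustration of the distinction (a one-time pad standing in for the real Signal protocol; not actual Signal code): with end-to-end encryption the key lives only on the two endpoint devices, so the relay only ever sees ciphertext.

```python
import secrets

def xor(key: bytes, data: bytes) -> bytes:
    # Toy one-time pad; XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(k ^ b for k, b in zip(key, data))

shared_key = secrets.token_bytes(16)       # known only to Alice's and Bob's phones
ciphertext = xor(shared_key, b"hi bob!")   # encrypted on Alice's device

relayed = ciphertext                       # the server just moves opaque bytes
plaintext = xor(shared_key, relayed)       # decrypted on Bob's device
```

In-flight (client-to-server TLS) encryption would instead terminate at the server, which is why it offers no protection against the server itself.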
        
         | edejong wrote:
         | Yes.
         | 
         | Auditors
         | 
         | Trusted Enclaves (but then you trust Intel)
         | 
         | Signed chain of I/O with full semantics specified (blockchain
         | style).
        
         | hoophoop wrote:
         | No.
         | 
         | If Signal /was/ federated it would be a strong hint that the
         | server code stays the same.
         | 
         | And even if it's not the same, people would be able to run
         | their own trusted servers.
        
           | ViViDboarder wrote:
           | Federation pretty much guarantees the opposite. There would
           | likely be many servers running many different versions
           | whereby you'd have no way of knowing which are trusted or
           | not. It, by design, distributes trust. This means there are
           | more parties to trust.
           | 
           | Anyway, Signal is designed to handle all the private bits at
           | the client side with e2ee so you have to put as little trust
           | in the server as possible.
        
       | tlarkworthy wrote:
        | If you have a PhD, you might be able to verify from the
        | client side that the server does not matter. If you are into
        | blockchain, there might be another (but very expensive) way to
        | show a system can be trusted.
       | 
       | For normal development, I am advocating an always auditable
       | runtime that runs only public source code by design:-
       | https://observablehq.com/@endpointservices/serverless-cells
       | 
       | Before sending data to a URL, you can look up the source code
       | first, as the URL encodes the source location.
       | 
       | There is always the risk I decided to embed a trojan in the
       | runtime (despite it being open source). However, if I am a
       | service provider for 100k customers built upon the idea of a
       | transparent cloud, then compromising the trust of one customer
       | would cause loss of business across all customers. Thus, from a
       | game-theoretic perspective, our incentives should align.
       | 
       | I think running public source code, which does not preclude
       | injecting secrets and keeping data private, is something that
       | _normal_ development teams can do. No PhDs necessary, just normal
       | development.
       | 
       | Follow me on https://twitter.com/tomlarkworthy if you want to see
       | this different way of approaching privacy: always auditable
       | source available server-side implementations. You can trust
       | services implemented this way are safe, because you can always
       | see how they process data. Even if you cannot be bothered to
        | audit their source, the sheer fact that someone can inoculates
        | you against bad-faith implementations.
       | 
       | I am building a transparent cloud. Everything is encoded in
       | public notebooks and runs open-source
       | https://observablehq.com/collection/@endpointservices/servic...
       | There are other benefits, like being able to fork my
       | implementations and customize, but primarily I am doing this for
       | trust through transparency reasons.
        
         | mulmen wrote:
         | How do you prove the endpoint is running the code to which it
         | links?
        
           | tlarkworthy wrote:
            | Simple but not 100% foolproof: you can mutate your source
            | code and verify the changes propagate.
           | 
           | Note the endpoint does a DYNAMIC lookup of source code. So
           | you can kinda reassure yourself the endpoint is executing
           | dynamic code just by providing your own source code.
           | 
           | It might be more obvious the runtime does nothing much if you
           | see the runtime
           | https://github.com/endpointservices/serverlesscells
           | 
           | The clever bits that actually implement services are all in
           | the notebooks.
        
             | mulmen wrote:
             | That doesn't seem to provide any meaningful indication the
             | endpoint runs the code it claims. Can't I just create an
             | evil endpoint that links to legit code?
        
               | tlarkworthy wrote:
                | No, the endpoint is shared across all customers; the
                | service providers generally do not self-host. The
                | endpoint is the infra provider. Later I might try to
                | code-sign that and open up the cloud console for
                | visibility, but not short term.
        
             | yjftsjthsd-h wrote:
             | > Simple but not 100% foolproof, you can mutate your source
             | code and verify the changes propagate.
             | 
             | If I was evil, I wouldn't have a totally separate source
             | tree and binary that I shipped; I'd have my CI process
             | inject a patch file. As a result, everything would work as
             | expected - including getting any changes from the public
             | source code - but the created binaries would be backdoored.
        
               | tlarkworthy wrote:
                | Yeah, I can fix this with more work, but just getting
                | some users would be helpful first.
        
               | pluies wrote:
               | https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_
               | Ref... :)
        
       | [deleted]
        
       | random5634 wrote:
        | The crapcoin thing is RIDICULOUS!
        | 
        | THIS is what we set up nonprofits to support - some private
        | entity and their crapcoin?
        | 
        | How is this legal?
        | 
        | How is this not a conflict of interest?
        | 
        | How is this not private inurement?
        | 
        | The crapcoin is totally opaque, too, with premined coins.
        
         | jeltz wrote:
         | It probably isn't legal but people use non-profits all the time
          | for personal gain and almost always get away with it.
        
           | random5634 wrote:
           | Exploiting nonprofit status to generate a personal benefit is
            | not legal in the US. So if the board members, officers, or
            | whoever stand to benefit from this crapcoin taking
           | off, it's not legal for them to use nonprofit resources and
           | tax breaks to push for that.
        
       | jeffo_rulez wrote:
       | kinda crazy that the signal team doesn't GPG sign their commits.
        
       | MAGZine wrote:
        | Is it though? We're already 6 days without a commit. Who's to
        | say the history isn't frozen again until the next major
        | release?
        
       | Klwohu wrote:
       | Do you still need to give them your phone number though?
        
       | louwrentius wrote:
       | OK, good!
       | 
       | Now please remove that cryptocoin stuff from the app. Don't
        | create another avenue for money laundering, tax evasion and drug
       | sales (if not worse).
        
       | lvs wrote:
       | I read some speculation that the delay was to keep this
       | objectionable crypto payment development under wraps until they
       | were ready to launch.
        
         | prophesi wrote:
         | Yep. I posted this on a different Signal HN submission, but the
         | very next commit on April 22nd, 2020 was when they first began
         | working on the integration.
         | 
         | https://github.com/signalapp/Signal-Server/commit/95f0ce1816...
        
           | yunohn wrote:
           | Oh wow. That's incredibly suspicious...
        
             | yjftsjthsd-h wrote:
             | It _could_ just be an arguably-legitimate desire to keep
             | the hot new feature secret until the big announcement; this
              | particular bit is... sub-optimal... but it doesn't seem
             | like it needs to be nefarious.
        
               | yunohn wrote:
               | Open source trust is a big selling point. Arguably more
               | so than a new payment feature. Hard to understand the
               | desire to hide it all...
        
               | [deleted]
        
               | coolspot wrote:
               | Next time they will freeze code to keep NSA integration
               | secret.
        
         | olah_1 wrote:
         | The devious aspect to this is that nobody knew the development
         | was happening so we couldn't invest even if we wanted to.
         | 
         | It was kept under wraps for a grade A pump.
        
           | hvis wrote:
           | OTOH, announcing this development semi-privately on GitHub
           | but not to the public at large (including the current
            | MobileCoin owners) could be considered "insider trading",
            | and it's a criminal offense in the US.
        
             | olah_1 wrote:
             | Which is probably why they're avoiding the SEC at all
             | costs.
        
         | dbrgn wrote:
         | This might be a legitimate reason to keep the source code non-
         | public temporarily. However, the communication strategy by
         | Signal about this was horrible (or rather non-existent).
         | 
         | People in the user forum
         | (https://community.signalusers.org/t/where-is-new-signal-
         | serv...) and in other places on the internet were upset for
         | months, because the server wasn't being updated anymore. At the
          | same time, Signal regularly tweeted that "all they do is 100%
         | open source", even at a point in time where no source code was
         | released for almost a year.
         | 
         | Just 2 days ago this was getting picked up by some larger tech
         | news platforms:
         | 
         | https://www.golem.de/news/crypto-messenger-signal-server-nic...
         | 
         | https://www.androidpolice.com/2021/04/06/it-looks-like-signa...
         | 
         | It's normal that Signal ignores its users, but apparently they
         | didn't even reply to press inquiries about the source code. All
         | it would have taken is a clear statement like "we're working on
         | a cool new feature and will release the sources once that's
         | ready, please bear with us". Instead, they left people
         | speculating for months.
         | 
         | This communication strategy, combined with the cryptocurrency
         | announcement, may cause serious harm to Signal's reputation.
        
       | dtx1 wrote:
        | After people realized that WhatsApp, owned by Facebook, was
        | changing its privacy settings from terrible to slightly
        | differently terrible, people flocked to and were recommended
        | Signal by so-called experts. Yet no one at the time bothered to
        | point out that Signal has been opaque as fuck about just about
       | anything they do. On the other hand a free, self-hostable, highly
       | transparent, highly secure alternative exists in the form of
        | Matrix.
       | 
       | But of course tech media didn't recommend it because it requires
       | a modicum of thought and technical understanding. To me this
       | drives home one key issue: the bulk of users is either too stupid
       | or unwilling to invest even the tiniest amount of effort into
       | their privacy. If you don't want to be saved then fine, give up a
       | detailed profile of your life, interests and opinions into the
       | hands of large megacorps that sell you as their product to
       | advertisers, governments and whoever has the cash. But don't come
       | crying to the experts when your private information has been
        | leaked for the thousandth time and don't ever expect a fair
       | election again when political parties can microtarget you into
       | oblivion.
       | 
       | It's not a matter of not knowing anymore but willful ignorance.
       | Democracy and Freedom won't die with thunderous applause but the
       | silent callousness of an unthinking majority.
        
         | driverdan wrote:
         | > the bulk of users is either too stupid or unwilling to invest
         | even the tiniest amount of effort into their privacy
         | 
         | This is an awful attitude. Not everyone understands computers
         | at a technical level. Not everyone has the knowledge to install
         | and troubleshoot Matrix.
         | 
         | Most users are non-technical, full stop. You can't expect them
         | to use something that requires technical knowledge.
        
           | dtx1 wrote:
           | Then the reality is that those users cannot expect privacy
              | anymore.
        
             | jedberg wrote:
             | This part I disagree with. They can (and should) lobby for
             | laws that protect their privacy. Users don't want to give
             | up convenience, but if the majority are using the services
             | of BigCorp, and we push to force BigCorp to provide privacy
             | through legislation, then society still wins.
        
               | dtx1 wrote:
               | They should but rarely do and even if a tiny victory like
               | GDPR is won, it's immediately dismantled by big corps.
               | Facebook hasn't been fined any significant amount yet it
               | violates GDPR every day. As it stands right now, the
               | average user has no privacy.
               | 
               | Just try this: Use a default browser without
               | adblocking/tracking protection etc. for a week and look
               | how the ads are targeting you more and more. That's the
               | internet for the average user. We just don't see it as
               | much because we default to protect ourselves.
        
               | mindslight wrote:
               | Lobbying for privacy laws requires legal knowledge to
               | know which proposals can actually be effective, otherwise
               | one is merely lending their uninformed support to
               | whatever the media tells them is a good idea today.
               | 
               | It all comes down to self actualization, no matter how
               | you slice it. Installing Element instead of Signal (even
               | with its @weird%identity.syntax) is quite technically
               | easy, especially compared to politics! The issue is
               | almost entirely social, in that it requires pushing back
               | on friends who want you to install the proprietary stuff
               | to communicate, and pushing them to the Free solutions.
        
         | grumpyautist wrote:
          | Matrix is still broken though. I find XMPP to be sufficient,
          | at least until Matrix fixes its group encryption.
        
           | dtx1 wrote:
           | Got a source for matrix group encryption issues?
           | 
           | EDIT:
           | 
           | From their [FAQ](https://matrix.org/faq/)
           | 
           | > End-to-End Encryption is fully supported in Matrix. New
           | rooms have encryption enabled by default, and all existing
           | rooms can optionally have End-to-End Encryption turned on.
           | 
           | What exactly do you think is broken here?
        
             | grumpyautist wrote:
              | In group chats there are constantly members whose
              | messages you cannot decrypt.
        
             | omnimus wrote:
              | The UX is seriously broken. I run my own instance and
              | even I can't verify my encrypted sessions. I already
              | have some three security codes that I have no idea
              | when to use, and verification of my own devices often
              | randomly fails.
              | 
              | It's just a mess, and I think it only works properly
              | on the main instance.
              | 
              | It's nowhere near the encryption experience on Signal.
        
         | CivBase wrote:
          | I'm disappointed WhatsApp became the de facto solution for
          | secure communication. I'm also disappointed Signal became
          | the de facto alternative. I think Matrix is a better
          | solution and I'm rooting for it. But I think your response
          | is unreasonably pessimistic and, frankly, arrogant.
         | 
         | For all its problems, WhatsApp was an improvement to the status
         | quo. User messages went from being fully public to only being
         | accessible by Facebook. Signal was also an improvement. User
         | messages were finally encrypted from end-to-end, even though
         | Signal retained control of the infrastructure and kept some of
         | it opaque. Things are improving and there is no reason to
         | believe Signal will have any more staying power than WhatsApp
         | or any other platform.
         | 
          | As for _why_ Signal beat out Matrix, I think the technical
          | hurdles are a relatively minor factor. After all, signing
          | up with Element hardly requires any technical
          | understanding. I think Signal was just a better-known
          | brand with a larger established userbase, a streamlined
          | on-boarding process, and ubiquitous feature support across
          | many platforms.
         | 
         | I don't think it's appropriate to call Signal's users or the
         | techies who recommended it "stupid". I think most of them just
         | realized a communication platform is only valuable if the
         | people you communicate with _actually use it_ and that Signal
         | would be an easier short-term sell. I think the chaps at
         | Element realize that and are trying to position themselves as
         | the next logical alternative once Signal eventually has its own
         | mass-exodus.
         | 
         | There is still hope for Matrix. Just keep championing it and
         | show some patience to your fellow human.
        
           | dtx1 wrote:
            | That's such a nice response that I just want to thank
            | you for it. Thank you, it really made my day! I'm just
            | depressed from being stuck inside due to COVID for such
            | a long time; maybe that made me lose hope there.
        
             | CivBase wrote:
              | No problem. With all the bad news out there, it's easy
              | to get hung up on minor setbacks and lose track of how
              | things are genuinely getting better overall.
        
         | ForHackernews wrote:
         | This is an unhelpful attitude, IMHO.
         | 
         | Your best chance, as a privacy-conscious, tech-savvy individual
         | is to push for mass-market adoption of strong encryption and
         | good government privacy regulations that will help everyone.
         | 
         | Lacking those, you will stand out like a sore thumb as one of a
         | tiny number of "weirdos" using Matrix, or Brave, or Tor or
         | GrapheneOS or whatever other hardcore self-hosted, federated
         | niche tools you favour. Any interested spooks can focus their
         | considerable resources on the user base for these tools. Merely
         | having these things installed becomes suspicious in a way that
         | WhatsApp or Signal is not.
        
           | Vinnl wrote:
           | Especially if your interest is in not necessarily your
           | personal privacy, but that of lawyers, journalists,
           | whistleblowers, etc.
        
             | dtx1 wrote:
              | I wish I could show you what lawyers in Germany did
              | with their lawyer-specific mail service. I don't have
              | any non-German sources, but it is a security nightmare
              | and e2e encryption was deemed "unnecessary". Even
              | vulnerable groups don't care anymore.
        
           | dtx1 wrote:
           | > This is an unhelpful attitude, IMHO.
           | 
            | I agree. I gave up fighting for privacy for other people
            | because they don't care. I'm now fighting for my own
            | privacy, because it's a war that the normies don't care
            | to fight. They never did. Give them a shiny app and
            | they'll sign away their firstborn.
           | 
           | > Your best chance, as a privacy-conscious, tech-savvy
           | individual is to push for mass-market adoption of strong
           | encryption and good government privacy regulations that will
           | help everyone.
           | 
            | Governments, at least the one here in Germany, are
            | already pushing for mandatory backdoors in all
            | messengers, and so far they have been hugely successful
            | in pushing through all the anti-privacy, anti-freedom
            | laws they want, with the only hindrance being our
            | highest court. Just this week our NSA equivalent, the
            | BND, was granted sweeping new rights to hack messenger
            | services around the world, to install rootkits and
            | trojans on people's devices and to listen in on large
            | parts of the internet.
            | 
            | WE. FUCKING. LOST.
            | 
            | This was all done in the open; the mass media didn't
            | object, normies didn't object, no one cared. Heck, my
            | parents are still using WhatsApp despite my best
            | efforts, and my father is an IT professional himself.
           | 
           | > any interested spooks can focus their considerable
           | resources on the user base for these tools. Merely having
           | these things installed becomes suspicious in a way that
           | WhatsApp or Signal is not.
           | 
            | Bullshit. It's massively easier to force
            | Signal/WhatsApp/whatever to hand over user data than to
            | hack my GrapheneOS phone (yes, really, it's my main
            | phone), backdoor my Qubes OS notebook (yes, really, I'm
            | writing this comment on it) or attack my Raspberry
            | Pi-hosted Matrix instance. Sure, it's possible. If they
            | want to get me they don't have to do much, just threaten
            | me with a rubber hose. But they can't fuck me over by
            | legally attacking large cloud infrastructure that hosts
            | everyone.
           | 
            | I can't save those that don't care, but I can build a
            | fucking cyberbunker and host my own stuff in the vain
            | hope that it protects me from the worst, and it
            | certainly protects me from Facebook, Google and co.
            | Sure, the NSA can hack me if they want to. We lost that
            | fight before we knew we had to fight it. But at least
            | the cost of doing it is slightly higher than using the
            | government-mandated backdoors in WhatsApp, Android and
            | co.
        
             | exolymph wrote:
             | Cheers for fighting the good fight. Quixotic idealists
             | almost never get anything done... and yet they're still the
             | only ones who do manage it ;P
             | 
             | With high-variance results, but heck, that's nature.
        
           | JasonFruit wrote:
           | But if you are promoting something that is opaque and in some
           | situations worse than an already-existing alternative, is it
           | really the right path to recommend the problematic tool
           | because it is more popular?
        
         | jedberg wrote:
         | > the bulk of users is either too stupid or unwilling to invest
         | even the tiniest amount of effort into their privacy.
         | 
          | Those of us who have worked in security have known this
          | for years. There are countless examples of massive
          | security gains through minor inconvenience, and users
          | rejecting them. Two-factor auth is a big one. Yeah, it's a
          | slight hassle. It gives major security improvements. Even
          | security people don't like to use it. Even people who have
          | been scammed multiple times, and told that if they just
          | turned on 2FA it would help, would rather deal with being
          | scammed again than use 2FA (I saw this at eBay and PayPal
          | a few times, where the user rejected a free security token
          | despite having been scammed multiple times).
         | 
         | Users hate friction. If it's not as easy as "download app, put
         | in contacts" they aren't interested.
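The "2FA" codes in question are usually just TOTP (RFC 6238): an HMAC of a shared secret and the current 30-second time window, truncated to six digits. A minimal sketch using only the Python standard library, checked against the RFC 4226/6238 test vectors (the function names are illustrative):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the big-endian 8-byte counter (RFC 4226)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: read 4 bytes at an offset given by the last nibble
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    # Time-based variant: the counter is the current time window (RFC 6238)
    return hotp(secret, int(time.time()) // period, digits)

# RFC 4226 test secret; counter 1 yields the documented code 287082
print(hotp(b"12345678901234567890", 1))  # -> 287082
```

The friction jedberg describes is everything around this fifteen-line algorithm: enrolling the secret, keeping it on a second device, and typing the code in on time.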
        
           | Vinnl wrote:
           | And of course, that's exactly why people were recommending
            | Signal. Of all the practically frictionless apps, it
            | provided the best encryption.
        
             | dtx1 wrote:
              | Privacy isn't just encryption. Forcing users to link
              | their phone number as an identifier, like Signal does,
              | is just one of the myriad questionable choices Signal
              | has made. And with their coin-based payments they've
              | just painted a large bullseye on themselves.
        
           | dtx1 wrote:
           | > I saw this at eBay and PayPal a few times, where the user
           | rejected a free security token despite having been scammed
           | multiple times.
           | 
           | People here downvote me for this but this is the issue. Those
           | people cannot be helped and they don't want help. They chose
           | this and have to live with the consequences. There's no right
           | to easy computing.
        
             | jedberg wrote:
             | I think you got downvotes for the way you presented your
             | argument, not the content...
        
               | dtx1 wrote:
                | Yeah, I'm tired and hangry; you're right. I'm just
                | happy SOMEONE understood my argument and saw the
                | same issues I did.
        
         | hiq wrote:
         | > highly secure alternative exists in the form of matrix
         | 
         | At least for metadata, as of now, Signal seems to provide
         | better guarantees than Matrix.
         | 
         | I can imagine Matrix competing with Discord and Slack, but I
         | don't think they'll ever be able to compete with WhatsApp and
         | Signal. You can blame "stupid" users and the media all you
         | want, that won't change the path of least resistance. I really
         | like Matrix as an IRC replacement though.
        
           | Ploskin wrote:
           | > At least for metadata, as of now, Signal seems to provide
           | better guarantees than Matrix.
           | 
            | Agree here. Matrix servers log _everything_ by default.
            | If somebody cares about protecting metadata, I don't
            | know why they'd choose Matrix over Signal.
        
         | lxgr wrote:
         | > To me this drives home one key issue: the bulk of users is
         | either too stupid or unwilling to invest even the tiniest
         | amount of effort into their privacy.
         | 
         | Something tells me you've never given tech support to non-
         | technical family members. (And that's being generous - because
         | outright considering everyone non-technical "stupid" would be a
         | pretty sad worldview.)
        
           | dtx1 wrote:
           | > or unwilling to invest even the tiniest amount of effort
           | into their privacy.
        
         | subsubsub wrote:
         | Given that you have said...
         | 
         | "the bulk of users is either too stupid or unwilling to invest
         | even the tiniest amount of effort into their privacy."
         | 
         | I don't feel the need to pull my punches.
         | 
          | This is the most deluded, idiotic response I've seen on
          | Hacker News in a long time.
         | 
          | It seems unlikely that the average person (or even a non-
          | techie person of above-average intelligence, e.g. a
          | doctor) will be able to set up Matrix in a way that is
          | more secure than just installing Signal. Your security
          | relies not just on you but on the weakest node in your
          | network. Getting good security might require trade-offs.
          | Your all-or-nothing mindset will not achieve it. The
          | saying "perfect is the enemy of done" comes to mind.
          | Perfect security (or what you propose) is not one of the
          | options in a secure system that has to exist in the real
          | world.
          | 
          | Please remove your head from its dark cavernous home.
         | 
         | Love, Me
        
           | subsubsub wrote:
           | P.S. Goodbye HN, you don't matter any more.
        
           | exolymph wrote:
           | On the one hand, you're absolutely right about usability. On
           | the other hand...
           | 
           | > "the bulk of users is either too stupid or unwilling to
           | invest even the tiniest amount of effort into their privacy."
           | 
           | Based on my interactions with users writ large, this
           | assessment is on the money. Normies do not care and will
           | never care.
        
           | dtx1 wrote:
            | The average person doesn't use Signal; they use phones,
            | WhatsApp and Facebook. You can use the public Matrix
            | node and I can use my own node and we can talk. No setup
            | needed. If they care, they can buy a hosted Matrix
            | service package like this https://element.io/pricing
            | with their own instance run by people who probably know
            | what they are doing.
            | 
            | In before "but it's not free as in no cost": that's why
            | big corps will always fuck over the normies. As it
            | stands, one cannot use the internet without either
            | giving away their privacy or learning a lot about
            | computers, how they work and how to use them.
           | 
            | The majority chose the "I don't care, give me shiny app"
            | route, and they fucked us all over by doing so. There's
            | no right to easy, privacy-friendly computing. There's
            | only the harsh reality that behind friendly blue and
            | rainbow-colored companies sit people who will sell a
            | digital recreation of yourself to anyone who cares to
            | pay and give you a few gigs of free e-mail space and a
            | shiny app for it.
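For what it's worth, the "public node talks to my own node" setup works because the Matrix spec standardizes server discovery: a client fetches https://<domain>/.well-known/matrix/client and reads the homeserver's API base URL out of it. A minimal sketch of that parsing step (the example domain and helper name are illustrative):

```python
import json

def parse_well_known(document: str) -> str:
    # Per the Matrix client-server spec, the well-known document holds
    # the client API base URL under the "m.homeserver" key.
    data = json.loads(document)
    return data["m.homeserver"]["base_url"].rstrip("/")

# What a self-hosted server at example.org might serve at
# https://example.org/.well-known/matrix/client
doc = '{"m.homeserver": {"base_url": "https://matrix.example.org/"}}'
print(parse_well_known(doc))  # -> https://matrix.example.org
```

Any spec-compliant client can do this lookup, which is why a user on the public matrix.org instance and a user on a Raspberry Pi instance can talk without special setup.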
        
             | dtx1 wrote:
              | To prove a point here: use a browser without
              | adblocking or other extensions, like Chrome, for a
              | week and look at how they target you. That's what they
              | do to anyone unwilling to fight it with technical
              | knowledge. And that, sadly, is the majority.
        
       ___________________________________________________________________
       (page generated 2021-04-07 23:01 UTC)