[HN Gopher] F-Droid Fake Signer PoC
       ___________________________________________________________________
        
       F-Droid Fake Signer PoC
        
       Author : pabs3
       Score  : 217 points
       Date   : 2025-01-03 22:47 UTC (1 day ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | CountHackulus wrote:
       | Guess I'm immediately uninstalling F-Droid. That chain of events
       | looks really poor for them.
        
         | Larrikin wrote:
         | And using what instead?
        
           | donio wrote:
            | Not the parent, and I will continue to use F-Droid, but
            | Obtainium is a popular alternative. It allows you to
            | install APKs directly from various supported sources
            | (forges, F-Droid repos, etc.), so you typically use the
            | APK that the app maintainer has produced in their CI
            | pipeline rather than F-Droid's reproducible builds.
        
             | notpushkin wrote:
              | F-Droid would likely get APKs from the same place (if
              | reproducible builds are on for the app in question). If
              | this attack were carried out successfully, then that
              | place would be compromised as well, and Obtainium can't
              | do much here to detect that, I'm afraid.
              | 
              | Edit: on second thought, they could pin certificate
              | hashes like F-Droid does on the build server, but
              | verify them client-side instead. If implemented
              | correctly, this could indeed work. However, I think
              | F-Droid with reproducible builds is still a safer bet,
              | as an attacker would have to get write access to the
              | source repo as well _and_ hide their malicious code so
              | that F-Droid can build and verify it.
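              | 
              | A minimal sketch of that client-side pinning idea, in
              | Python (the pin table and function names here are
              | illustrative, not any real fdroidserver or Obtainium
              | API):

```python
import hashlib
import hmac

# Hypothetical pin list: app id -> expected SHA-256 (hex) of the
# DER-encoded signing certificate, as "apksigner verify --print-certs"
# would report it. The entry below is a placeholder digest.
PINNED_CERTS = {
    "org.example.app": "0" * 64,
}

def cert_matches_pin(app_id: str, cert_der: bytes) -> bool:
    """Return True only if the APK's signing cert hashes to the pin."""
    expected = PINNED_CERTS.get(app_id)
    if expected is None:
        return False  # no pin recorded: refuse rather than trust
    observed = hashlib.sha256(cert_der).hexdigest()
    # compare_digest avoids leaking how much of the digest matched
    return hmac.compare_digest(observed, expected.lower())
```

              | A real client would first have to extract the cert
              | from the APK's signing block before hashing it.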
        
           | CountHackulus wrote:
           | Nothing. I'll sideload what I need to. I didn't find it that
           | useful.
        
             | yjftsjthsd-h wrote:
             | Okay, but sideloading is worse? AFAICT the problem we're
             | discussing was in F-Droid doing extra verification
             | (somewhat incorrectly, apparently) of an APK before handing
             | it to Android to install. Regardless of F-Droid, Android
             | will check signatures on updates against the installed
             | version. So your response to F-Droid imperfectly checking
             | signatures as an extra verification on first install... is
             | to skip that entirely and do zero verification on first
             | install? That's strictly worse for your security.
        
             | wobfan wrote:
              | Sideloading sounds like a massively worse option than
              | using F-Droid even with this flaw. Humans are way more
              | likely to make mistakes, and you lose a lot of the
              | safeguards between you and the APK when you sideload.
              | Also, you don't get updates as fast, which is a whole
              | problem in itself.
              | 
              | So, IMO, we should not fall into the trap of
              | immediately removing apps that had a security flaw and
              | falling back to a far worse alternative (which
              | sideloading is) instead.
        
           | udev4096 wrote:
            | The closest you'll get is Aurora Store, if you don't
            | want to give in to the Play Store.
        
       | Retr0id wrote:
       | What's the practical impact?
        
         | udev4096 wrote:
          | A malicious APK can bypass the certificate pinning check
          | and get installed. More info:
         | https://gitlab.com/fdroid/fdroidserver/-/issues/1128
        
       | kuschku wrote:
        | While none of that applies to F-Droid's primary use case
        | (the primary F-Droid repo builds all apps from source
        | itself), it nonetheless looks like they failed to correctly
        | handle the issue.
       | 
       | The only reason this didn't turn into a disaster was pure luck.
        
         | gruez wrote:
         | >The only reason this didn't turn into a disaster was pure
         | luck.
         | 
          | Is it? Or is it a case of "It rather involved being on the
          | other side of this airtight hatchway"[1]? The APK
          | signature check done by fdroidserver seems totally
          | superfluous. Android is already going to verify the
          | certificate if you try to update an app, and presumably
          | whatever upload mechanism exists is already authenticated
          | some other way (e.g. API token or username/password), so
          | it's unclear what the signature validation adds, aside
          | from maybe preventing installation failures.
         | 
         | [1]
         | https://devblogs.microsoft.com/oldnewthing/20060508-22/?p=31...
        
           | ncr100 wrote:
           | > Android is already going to verify the certificate..
           | 
            | Will it if it's a non-Google distro of Android?
        
             | mid-kid wrote:
              | Yeah, you can't update an app with a different
              | signature than the one currently installed; that's a
              | feature of the OS.
        
             | gruez wrote:
             | The behavior is in AOSP, so it should be in "non Google
             | distro of Android" as well, unless the manufacturer decided
             | to specifically remove this feature.
        
           | Nullabillity wrote:
           | > The apk signature done by fdroidserver seems totally
           | superfluous. Android is already going to verify the
           | certificate if you try to update an app, and presumably
           | whatever upload mechanism is already authenticated some other
           | way (eg. api token or username/password), so it's unclear
           | what the signature validation adds, aside from maybe
           | preventing installation failures.
           | 
           | If you try to _update_ the app. Anyone installing the app
           | from scratch will still be vulnerable. Effectively, both
           | cases are Trust On First Use, but AllowedAPKSigningKeys moves
            | the First Use boundary from "the first time _you_
            | install the app" to "the first time F-Droid saw the
            | app". Izzy wrote a blog post about it a while ago.[0]
           | 
           | > and presumably whatever upload mechanism is already
           | authenticated some other way (eg. api token or
           | username/password)
           | 
           | IzzyOnDroid (and, I believe, F-Droid) don't have their own
           | upload UI or authentication, they poll the upstream repo
           | periodically.
           | 
           | [0]: https://f-droid.org/2023/09/03/reproducible-builds-
           | signing-k...
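            | 
            | The boundary move can be sketched as a tiny
            | trust-on-first-use store (illustrative Python, not
            | F-Droid's actual code):

```python
import hashlib

class TofuStore:
    """Trust-on-first-use store for APK signer digests (illustrative).

    The first time an app id is seen, its signer cert digest is
    recorded; every later install must present the same digest.
    Pinning via something like AllowedAPKSigningKeys amounts to
    pre-populating this store from the repo's metadata, moving
    "first use" from the user's device to the repo.
    """

    def __init__(self):
        self._seen = {}

    def check(self, app_id: str, cert_der: bytes) -> bool:
        digest = hashlib.sha256(cert_der).hexdigest()
        # setdefault records the digest on first sight, then the
        # comparison enforces it on every subsequent call
        recorded = self._seen.setdefault(app_id, digest)
        return recorded == digest
```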
        
             | gruez wrote:
             | >Effectively, both cases are Trust On First Use, but
             | AllowedAPKSigningKeys moves the First Use boundary from
             | "the first time you install the app" to "the first time
             | F-Droid saw the app".
             | 
             | 1. What you're describing would have to happen on the
             | f-droid app, but the vulnerability seems to be on
             | fdroidserver?
             | 
              | 2. Even if this actually affected the F-Droid app,
              | what you described seems like a very modest increase
              | in security. The attack this prevents (i.e. a
              | compromised server serving a backdoored APK with a
              | different signature) would also raise all kinds of
              | alarms from people who already have the app installed,
              | so in practice such an attack would be discovered
              | relatively quickly.
             | 
             | >IzzyOnDroid (and, I believe, F-Droid) don't have their own
             | upload UI or authentication, they poll the upstream repo
             | periodically.
             | 
             | Doesn't f-droid perform the build themselves and sign the
             | apk using their own keys? They might be pulling from the
             | upstream repo, but that's in source form, and before apks
             | are signed, so it's irrelevant.
        
         | wkat4242 wrote:
          | Yeah, that's the big benefit of F-Droid: reproducible
          | builds. It builds directly from GitHub. I like that aspect
          | of it a lot; it adds a lot of security that other app
          | stores don't have.
          | 
          | But yeah, other repos don't :(
        
         | FuturisticGoo wrote:
          | The primary F-Droid repo also hosts the app developer's
          | builds in the case of reproducible builds, where F-Droid
          | will first build from source and then compare the result
          | with the dev's build. If it's identical, it uses the dev
          | build in the repo, and if it's not, the build fails.
          | 
          | The use of AllowedAPKSigningKeys, AFAIK, is to compare
          | that key with the key used to sign the dev build. If it's
          | not the same, the dev build is rejected.
          | 
          | From what I've understood from this PoC, it's possible to
          | bypass this signature check. The only exploit I can think
          | of with this bypass is that someone who gets access to the
          | developer's release channel can host their own signed APK,
          | which will either get rejected by Android in the case of
          | an update (signature mismatch) or get installed in the
          | case of a first install. But in either case, it's still
          | the same reproducible build; only the signature is
          | different.
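          | 
          | F-Droid's actual comparison works by copying the dev's
          | signature onto its own build and diffing byte-for-byte; as
          | a simplified sketch of the idea, one can compare
          | everything except the signature files (helper names here
          | are mine, not fdroidserver's):

```python
import hashlib
import io
import zipfile

def content_digests(apk_bytes: bytes) -> dict:
    """SHA-256 of every zip entry except the signature files that
    live under META-INF/ (MANIFEST.MF, CERT.SF, CERT.RSA, ...)."""
    digests = {}
    with zipfile.ZipFile(io.BytesIO(apk_bytes)) as z:
        for name in z.namelist():
            if name.startswith("META-INF/"):
                continue
            digests[name] = hashlib.sha256(z.read(name)).hexdigest()
    return digests

def builds_match(our_build: bytes, dev_build: bytes) -> bool:
    """True if the two APKs agree on everything but the signature."""
    return content_digests(our_build) == content_digests(dev_build)
```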
        
           | rollcat wrote:
           | > The only exploit I can think of with this bypass is that
           | someone who gets access to the developer's release channel
           | can host their own signed apk, which [...] gets installed in
           | case of first install.
           | 
           | That still enables a supply chain attack, which should not be
           | dismissed - virtually all modern targeted attacks involve
           | some complex chain of exploits; a sufficiently motivated
           | attacker _will_ use this.
        
             | gruez wrote:
             | You missed the later part of the quote:
             | 
             | >But in either case, its still the same reproducible build,
             | only the signature is different.
             | 
              | That means the attacker still has to compromise the
              | source repo. If they don't and try to upload a
              | backdoored APK, that would cause a mismatch with the
              | reproducible build and be rejected. If someone can
              | compromise the source repo, you're already screwed
              | regardless; APK signature checks can't protect you
              | against that.
        
               | hnaccount_rng wrote:
                | In a previous post you said that, in the case of
                | matching builds, the dev's version is used. Why is
                | the "dev's" version relevant? And, assuming I'm
                | correct that it isn't, what is the added benefit vs.
                | just building from source (from a known-good state,
                | e.g. a blessed git hash)?
        
               | gruez wrote:
               | >In a previous post you said that - in case of matching
               | builds - the dev's version is used
               | 
               | Which post are you talking about?
               | https://news.ycombinator.com/item?id=42592150 was made by
               | FuturisticGoo, not me.
               | 
               | Also, the wording on f-droid suggests the version that
               | f-droid hosts is built by them, rather than a version
               | that's uploaded by the dev. If you go on any app and
               | check the download section, it says
               | 
               | > It is built by F-Droid and guaranteed to correspond to
               | this source tarball.
        
               | NotPractical wrote:
               | Android will block any update to an existing app that
               | wasn't signed with the same signature. The benefit of
               | using the developer's signature (even if the app is built
               | by F-Droid) is that the F-Droid release of the app is not
               | treated as a "different app" by the Android OS, and thus
               | it can be updated by other app stores or through direct
               | APK releases from the developer. If the user chooses to
               | stop using F-Droid in the future, they can still receive
               | updates through other means without uninstalling and
               | reinstalling the app.
               | 
                | It also allows the user to place a little less trust
                | in F-Droid, because both the developer and F-Droid
                | must confirm any release before it can be
                | distributed. (Now that I think of it, that probably
                | creates an issue where, if malware somehow slips in,
                | F-Droid has no power to remove it via an automatic
                | update. Perhaps they should have a malware response
                | or notification system?)
               | 
               | More: https://f-droid.org/2023/09/03/reproducible-builds-
               | signing-k...
        
         | NotPractical wrote:
         | From what I can understand the attack scenario is as follows:
         | 
         | 1. User downloads an app from F-Droid that supports
         | reproducible builds.
         | 
          | 2. The developer's account is compromised and the
          | attacker submits an app with a different-than-expected
          | signing key.
         | 
         | 3. A new user installs the app (existing users aren't affected
         | due to Android's enforcement of using the same signing key for
         | updates).
         | 
         | 4. This user is (external to the app) contacted by the attacker
         | and directed to install an update to the app from them. The
         | update contains malicious code.
         | 
         | F-Droid's response is concerning but this attack scenario seems
         | pretty unlikely to work in practice.
        
       | mrayycombi wrote:
       | Great work.
        
       | mappu wrote:
       | Is it as bad as they're making it out to be? The fdroidserver
       | get_first_signer_certificate can give a different result to
       | apksigner, but then fdroidserver calls apksigner anyway for
       | verification, and F-Droid mitigates the issue in various other
       | ways.
       | 
        | I think F-Droid was acting in the right up to that point;
        | and then the latest update (the regex newlines issue) is a
        | 0-day? Has there been a response from F-Droid about the
        | updates?
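        | 
        | For context on the newline angle: the exact fdroidserver
        | pattern isn't reproduced here, but the classic Python
        | pitfall is that "$" also matches just before a trailing
        | newline, so a naive whole-string check can be slipped past:

```python
import re

SAFE = re.compile(r"^[0-9A-Fa-f]+$")  # intended: hex digits only

def looks_safe_buggy(value: str) -> bool:
    # BUG: in Python, "$" also matches just before a trailing "\n",
    # so "deadbeef\n" passes this check despite the newline.
    return SAFE.match(value) is not None

def looks_safe_fixed(value: str) -> bool:
    # fullmatch requires the pattern to consume the entire string,
    # newline included, so the trailing "\n" is rejected.
    return re.fullmatch(r"[0-9A-Fa-f]+", value) is not None
```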
        
         | KennyBlanken wrote:
         | Well, this is pretty concerning all on its own:
         | 
         | > Instead of adopting the fixes we proposed, F-Droid wrote and
         | merged their own patch [10], ignoring repeated warnings it had
         | significant flaws (including an incorrect implementation of v1
         | signature verification and making it impossible to have APKs
         | with rotated keys in a repository).
         | 
         | This concerns me more than the vulnerabilities themselves. It's
         | a pretty serious failure in leadership and shows that F-Droid
         | is still driven by egos, not sound software engineering
         | practices and a genuine interest in doing right for the
         | community.
         | 
         | F-Droid has numerous issues:
         | 
         | * glacially slow to release updates even when security patches
         | are released
         | 
         | * not enforcing 2FA for developer accounts
         | 
         | * no automatic vulnerability or malware scanning
         | 
         | ...and more problems:
         | https://privsec.dev/posts/android/f-droid-security-issues/
        
       | bsimpson wrote:
       | Tangential, but:
       | 
       | I often wonder how secure these open source projects actually
       | are. I'm curious about using Waydroid in SteamOS, but it looks
       | like it only runs LineageOS (apparently a derivative of
       | CyanogenMod).
       | 
        | I know that people claim that open source is more secure
        | because anyone can audit it, but I wonder how closely its
        | security is actually interrogated. Seems like it could be a
        | massive instance of the bystander effect.
       | 
       | All of it gives me a bias towards using official sources from
       | companies like Apple and Google, who presumably hire the talent
       | and institute the processes to do things right. And in any case,
       | having years/decades of popularity is its own form of security.
       | You know anyone who cares has already taken shots at Android and
       | iOS, and they're still standing.
        
         | mid-kid wrote:
         | Google isn't gonna build a ROM for waydroid so someone's going
         | to have to make a build of Android, whom you'll have to trust.
         | Google doesn't build ROMs for anything but their own phones.
         | 
         | LineageOS is popular in this field because in essence it's a
         | derivative of AOSP (the Android project as shipped by Google)
         | with modest modifications to support a crapload of devices,
         | instead of the handful that AOSP supports. This makes it easier
         | to build and easier to support new platforms.
         | 
         | The bulk of the security in AOSP (and thus, LineageOS) comes
         | from all the mitigations that are already built into the system
         | by Google, and the bulk of the core system that goes
         | unmodified. The biggest issue is usually the kernel, which may
         | go unpatched when the manufacturer abandons it (just like the
         | rest of the manufacturer's ROM), and porting all the kernel
         | modifications to newer versions is often incredibly tricky.
        
           | tredre3 wrote:
           | > Google doesn't build ROMs for anything but their own
           | phones.
           | 
           | Are you suggesting that ROMs provided through Android
           | Studio's emulator are somehow not built by Google?
        
         | Dalewyn wrote:
         | >I know that people claim that open source is more secure
         | because anyone can audit it, but I wonder how closely its
         | security actually interrogated.
         | 
         | The answer is that, no, nobody _akshuarry_ audits anything.
         | This has been proven time and time again, especially in the
         | last few years.
         | 
         | >All of it gives me a bias towards using official sources from
         | companies like Apple and Google, who presumably hire the talent
         | and institute the processes to do things right.
         | 
          | What you get from commercial vendors is _liability_: you
          | get to demand they take responsibility because you paid
          | them cold hard cash. Free products have no such
          | guarantees; you are your own liability.
        
           | mid-kid wrote:
           | And we've seen time and time again how that liability "harms"
           | them when they whoopsie daisy leak a bunch of data they
           | shouldn't have gathered in the first place...
        
           | graemep wrote:
           | What liability? How do they take responsibility if there is a
           | security flaw?
        
             | fl0id wrote:
              | Especially as many licenses have liability
              | disclaimers. Sure, some enterprise offerings will have
              | stronger guarantees, but probably not by default.
        
           | yjftsjthsd-h wrote:
           | > The answer is that, no, nobody akshuarry audits anything.
           | This has been proven time and time again, especially in the
           | last few years.
           | 
           | Sooo how about the audits linked in
           | https://news.ycombinator.com/item?id=42592444 ?
        
         | okanat wrote:
          | I think most open source projects are inadequate from a
          | security PoV, but they are not in a position to do much
          | harm.
          | 
          | Android is extremely complex, so I think many of the
          | custom ROMs likely have some security rookie mistakes and
          | quite a few security bugs due to the mishmash of drivers.
          | Android is still better than most Linux distros due to its
          | architecture, though. The default setup of many distros
          | doesn't have much isolation, if any at all.
        
           | yjftsjthsd-h wrote:
           | > so I think many of the custom ROMs possibly have some
           | security rookie mistakes and quite a bit security bugs due to
           | mishmash of drivers
           | 
           | I would easily believe that many Android systems have
           | vulnerabilities owing to the horrific mess that is their
           | kernel situation. That said, I personally doubt that
           | aftermarket ROMs are worse than stock, as official ROMs are
           | also running hacked up kernels.
        
             | ignoramous wrote:
             | > _...owing to the horrific mess that is their kernel
             | situation._
             | 
              | Do you mean OEM drivers or the Android kernel,
              | specifically?
              | 
              | Google invests quite a bit in hardening the Android
              | Common Kernel, including compile-time/link-time and
              | runtime mitigations (both in hardware and software).
             | 
             | Ex: https://android-
             | developers.googleblog.com/2018/10/control-fl...
        
               | yjftsjthsd-h wrote:
               | The drivers; last I heard, literally every Android device
               | on the market was using a forked kernel in order to
               | support its hardware. And Google keeps trying things to
               | improve that situation, but...
               | https://lwn.net/Articles/680109/ was ~9 years ago and
               | since then not even Google themselves have managed to
                | ship a device running a mainline kernel. Supposedly
                | it should get better with their latest attempt to
                | just put drivers in user space, but 1. I haven't
                | heard of any devices actually shipping with an
                | unmodified kernel, probably because 2. AIUI that
                | doesn't cover all drivers anyway.
        
         | pserwylo wrote:
         | While this is true of many projects, F-Droid has a track record
         | of sourcing funding for security audits. To date there have
         | been at least three audits, in 2015, 2018, and 2022.
         | 
         | https://www.opentech.fund/security-safety-audits/f-droid/
         | 
         | https://f-droid.org/2018/09/04/second-security-audit-results...
         | 
         | https://f-droid.org/2022/12/22/third-audit-results.html
         | 
          | I was involved in addressing issues identified in the
          | first one in 2015. It was a great experience, much more
          | thorough than the usual "numerous static analysers and a
          | 100-page PDF full of false positives" that you often
          | receive.
        
           | udev4096 wrote:
            | I'm surprised that several audits didn't uncover this
            | signing issue. GrapheneOS devs do not recommend F-Droid.
            | Instead, the Play Store is the safest option for now,
            | followed by Aurora Store.
        
             | cenamus wrote:
              | But their goals are also kinda opposed: software
              | security, with not much concern paid to freedom.
        
               | udev4096 wrote:
               | What? That's so not true. They give heavy preference to
                | security because without it, your freedom and
                | privacy have no value
        
               | fl0id wrote:
                | Well yeah, so their goals are opposed. F-Droid is
                | FOSS-first, and would probably say a proprietary
                | illusion of security has no value ;)
        
               | t0bia_s wrote:
                | How can you trust proprietary software when you
                | cannot inspect the code? It's just blind trust.
        
               | gruez wrote:
                | You don't have to. On GrapheneOS, Google Play
                | services aren't given special privileges and are
                | sandboxed like any other normal app.
        
             | t0bia_s wrote:
              | Aurora Store downloads APK files directly from Google
              | Play's servers, so why should it be less safe than the
              | Play Store?
        
         | Idesmi wrote:
         | > CyanogenMod
         | 
         | Has been dead for 8+ years. LineageOS is its own thing by now.
         | 
         | > anyone who cares has already taken shots at Android and iOS
         | 
         | LineageOS is based on AOSP, plus some modifications that do not
         | affect security negatively.
        
         | LtWorf wrote:
          | They have a much better track record than Apple,
          | Microsoft, Google, and so on...
        
         | graemep wrote:
         | > I know that people claim that open source is more secure
         | because anyone can audit it, but I wonder how closely its
         | security actually interrogated. Seems like it could be a
         | massive instance of the bystander effect.
         | 
         | It depends on the software. Something widely used and critical
         | to people who are willing to put resources in is a lot more
         | likely to be audited. Something that can be audited has got to
         | be better than something that cannot be.
         | 
         | > All of it gives me a bias towards using official sources from
         | companies like Apple and Google, who presumably hire the talent
         | and institute the processes to do things right.
         | 
         | I am not entirely convinced about that, given the number of
         | instances we have of well funded companies not doing it right.
         | 
         | > You know anyone who cares has already taken shots at Android
         | and iOS, and they're still standing.
         | 
         | There has been quite a lot of mobile malware and security
         | issues, and malicious apps in app stores. Being more locked
         | down eliminates some things (e.g. phishing to install malware)
         | but they are far from perfect.
        
       | panny wrote:
        | I don't like people like this. They do the work of finding a
        | bug, but rather than try to fix it, they grandstand and
        | shout about how the thing they obviously enjoy is no good at
        | all. If I find a vulnerability in code I enjoy, I _work to
        | fix it_, and only after my ironclad fix is applied do I
        | mention that it existed and that I fixed it so it can never
        | be exploited again.
        | 
        | "Security researchers" IMO are the most cringe-worthy and
        | worst examples of community members possible. They do not
        | care about making things better; they only care about their
        | own brand: selling themselves, and climbing a ladder made of
        | embarrassed, hard-working people who do things for the love
        | of doing them.
        
         | Idesmi wrote:
         | Hey. As someone who cries tears of joy when I see the software
         | I support succeed, I share the sentiment.
         | 
         | (I am just trying to push the visibility of your comment ;) )
        
         | evujumenuk wrote:
         | Exactly. Ideally, we'd all follow the Benzite approach, which
         | is to withhold any and all information from one's peers until a
         | complete analysis has finished, and the best possible remedy to
         | the problem has already been applied. Because how can a
         | miscreant use a vulnerability if it hasn't even been published
         | yet?
         | 
         | As contributors, we enjoy a lot of trust, as we should. That's
         | why it's not a problem if we make seemingly random changes that
         | don't necessarily make a lot of sense, but seem relevant to
         | security, when they actually fix an issue in the code. After
         | all, it's necessary to prevent bad guys from gaining sensitive
         | information, and to keep your colleagues from being unduly
         | bothered with challenges they could possibly help with.
        
         | int_19h wrote:
          | Per the write-up, they only went public with details of
          | this exploit after F-Droid merged the "fix" that didn't
          | actually fix the problem, despite having been warned that
          | it would not, and despite being told what they actually
          | needed to do to fix it properly.
        
       | mschwaig wrote:
       | I really wish we would take defining what it means for an
       | artifact to be signed more seriously.
       | 
       | Which key(s) is it signed with? What is the hash of the
       | corresponding unsigned artifact?
       | 
       | Signature verification tools should have some option which prints
       | these things in a machine-readable format.
       | 
        | I did some work on the reproducibility of Android apps and
        | system images with Nix, and while defining a build step
        | which can automatically establish these relationships sounds
        | a bit goofy, it can make the issues with underspecified edge
        | cases visible by defining verification more strictly. I did
        | not do this to look for those edge cases, though.
        | 
        | I am still working on that type of stuff now, but on more
        | fundamental issues of trust that we could start addressing
        | with systems like Nix.
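        | 
        | A toy version of the machine-readable output I mean might
        | look like this (the record layout is invented for
        | illustration, not a proposal for any existing tool):

```python
import hashlib
import json

def signature_record(unsigned_artifact: bytes,
                     signer_cert_der: bytes) -> str:
    """One JSON line answering both questions: which key signed this,
    and what is the hash of the corresponding unsigned artifact."""
    record = {
        "unsigned_artifact_sha256":
            hashlib.sha256(unsigned_artifact).hexdigest(),
        "signer_cert_sha256":
            hashlib.sha256(signer_cert_der).hexdigest(),
    }
    return json.dumps(record, sort_keys=True)
```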
        
         | 1oooqooq wrote:
          | Blame browsers and the URL padlock's "cuz users are dumb"
          | attitude.
          | 
          | I still believe "PGP is too complex" was the most
          | successful CIA counter-action after they lost the crypto
          | wars to the people.
          | 
          | Solving this via Nix only works within the flawed
          | assumption that end users either fully trust Google or
          | F-Droid and are incapable of anything else.
        
           | 486sx33 wrote:
           | +1
        
           | rollcat wrote:
           | > "pgp is too complex"
           | 
            | PGP _is_ too complex. I've known my way around the
            | command line since before I learned how to hand-write,
            | and I have to look up the commands to fetch the keys
            | and/or verify the blob _every single time_. Keyservers
            | regularly fail to respond. There's no desktop
            | integration to speak of. The entire UX stinks of XKCD
            | 196.
            | 
            | Don't blame the CIA for obvious deficiencies in
            | usability.
        
             | Y_Y wrote:
             | I was with you right up until the end. I think the only
             | thing that would stop me from sabotaging a small project
             | like PGP (was in the early days) is moral aversion. FOSS
             | and academic circles where these things originate is
             | generally friendly and open, and there is plenty of money
             | and length of rubber hose for anyone who doesn't welcome
             | the mole into their project.
             | 
             | I'm not saying I have evidence that this happened to PGP
             | specifically, just that it doesn't seem at all implausible.
             | If the CIA told me my code was never to get too easy to
             | use, but otherwise I could live a long and happy life and
             | maybe get a couple of government contracts, it would be
             | hard to argue.
             | 
             | The fact that a mass-market interface never took off (GPG
             | and other descendants notwithstanding) may indicate that
             | the whole cryptographic idea is inherently not amenable to
             | user-friendliness, but I don't find that hypothesis as
             | compelling.
             | 
             | (It could also be an unlikely coincidence that a good
             | solution exists but was never found for lack of looking,
             | but that's even less plausible to me.)
        
               | rollcat wrote:
               | Then why are no such efforts being pursued for PGP (GPG)
               | nowadays?
               | 
               | signify[1] is approachable _at least_ for power users -
               | I could print that man page on a T-shirt. HTTPS is
               | ubiquitous _and_ easy, thanks to ACME & Let's Encrypt.
               | E2EE with optional identity verification is offered in
               | mainstream chat apps.
               | 
               | And of course there _are_ usability improvements to GPG
               | being made by third parties: Debian introduced package
               | verification a couple of decades ago, GitHub does commit
               | verification, etc. What's to stop e.g. Nautilus or
               | Dolphin from introducing similar features?
               | 
               | [1]: https://man.openbsd.org/signify
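As a concrete illustration of how small that interface is, here is the whole signify workflow, paraphrased from the OpenBSD man page (file names here are illustrative, not from the thread):

```shell
# Generate a key pair (signify prompts for a passphrase on the secret key).
signify -G -p key.pub -s key.sec

# Sign a file; this writes the detached signature release.tgz.sig.
signify -S -s key.sec -m release.tgz

# Verify the file against its signature using the public key.
signify -V -p key.pub -m release.tgz
```

That is essentially the entire surface area, which is the commenter's point about approachability.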
        
               | Y_Y wrote:
               | > Then why no such efforts are being pursued for PGP(GPG)
               | nowadays?
               | 
               | I wonder why there aren't more, but there are some, for
               | example Proton's efforts towards encrypted email.
               | 
               | https://proton.me/support/how-to-use-pgp
               | 
               | (I won't mention the relative shortcomings of HTTPS and
               | E2E chat apps here.)
        
               | exe34 wrote:
               | You'd think that if the CIA didn't want it to happen,
               | then somebody somewhere else would make it, though. It's
               | not like the CIA and FSB would collude - they serve
               | different oligarchs.
        
             | bscphil wrote:
             | > I have to look up the commands to fetch the keys and/or
             | verify the blob every single time.
             | 
             | I have no doubt that this is true, but I very much question
             | whether any alternate UX would solve this problem for you,
             | because the commands for these two tasks have very obvious
             | names: `gpg --receive-keys <keyIDs>` and `gpg --verify
             | <sigfile>`. There's no real way to make it easier than
             | that; you just have to use it more.
             | 
             | The tool also accepts abbreviations of commands to make
             | things easier, e.g. you could just blindly type `gpg
             | --receive <keyID>` and it would work.
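For readers who want to see those two commands in context, here is a minimal offline sketch of the sign/verify round trip. Local key generation stands in for the keyserver fetch, and all file names and the demo user ID are illustrative; it assumes a reasonably recent GnuPG (2.2+):

```shell
# Work in a throwaway keyring so nothing touches the real one.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a demo key. In real use you would instead fetch the signer's
# key with `gpg --receive-keys <keyID>`, which needs a keyserver.
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'demo@example.org' default default never

# Sign an artifact, producing the detached armored signature app.txt.asc.
echo 'release artifact' > app.txt
gpg --batch --pinentry-mode loopback --passphrase '' \
    --detach-sign --armor --local-user 'demo@example.org' app.txt

# Verify the signature against the file.
gpg --verify app.txt.asc app.txt
```

The verification step is the one users actually repeat, and it reports "Good signature" on success.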
        
             | graemep wrote:
             | For what purpose? Setting up PGP signing and encryption
             | for email in Thunderbird is dead simple. If only I knew
             | anyone else willing to use it!
             | 
             | I think you are right that the UI sucks in many cases, but
             | I think it's not intrinsic to PGP - it's fixable.
        
               | arccy wrote:
               | If only everyone used my preferred set of tools
               | (Thunderbird + PGP)...
        
               | exe34 wrote:
               | So it's their fault that every other tool maker refuses
               | to provide the facilities at the same level of
               | simplicity? They gave an example to show it was possible;
               | it doesn't mean that their example was the only way -
               | other developers decided that the public was too dumb to
               | use those kinds of tools.
        
               | jeroenhd wrote:
               | I know more people who use terminal user interfaces for
               | email than I know people who use Thunderbird, and I say
               | that as a techie.
               | 
               | The UI still sucks, though, because people ask me what
               | the .ASC attachments sent with all of my emails are and
               | if I've been hacked. When I explain that's for
               | encryption, they may ask how to set that up on their
               | phones if they care, but most of them just look at me
               | funny.
               | 
               | I do use email encryption at my job, through S/MIME, and
               | that works fine. Encryption doesn't need terrible UI, but
               | PGP needs support from major apps (including webmail) for
               | it to gain any traction beyond reporting bug bounties.
        
           | bolognafairy wrote:
           | "Users are dumb" is not and was never the attitude. On
           | average, people are average. You've just got completely
           | unrealistic expectations of people. You're asking for the
           | world to be built around your wants, needs, preferences, and
           | areas of expertise. Something this complex in the hands of
           | 99.99% of the population would be entirely useless.
        
             | alex7734 wrote:
              | A few years ago, everyone who had ever used a computer
              | knew what a file and a folder were and could move a
              | document to a USB drive.
              | 
              | Thanks to Google's efforts to "simplify" smartphones, the
              | average young person now couldn't find and double-click
              | a downloaded file if their life depended on it.
              | 
              | In the US, a manual car is considered an anti-theft
              | device. In Europe, basically everyone who isn't obscenely
              | rich has driven a manual car at some point.
             | 
             | People learn what they're expected to learn.
        
               | ANewFormation wrote:
               | Another example would be ctrl+alt+del, ctrl+c, ctrl+v,
               | etc, etc.
               | 
                | Like you said, people learn what they're expected to
                | learn.
        
               | johannes1234321 wrote:
                | Back then, the user base of computers was a lot
                | smaller.
                | 
                | However, WhatsApp/Signal show how E2EE can be done in a
                | user-friendly way. By default it simply exchanges keys
                | and shows a warning when the key is changed, and those
                | who need/want can verify identity.
                | 
                | What's missing there, of course, is openness.
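The warn-on-key-change behavior described above is the "trust on first use" (TOFU) pattern. A minimal hypothetical sketch of the idea (class and method names are invented for illustration; real messengers persist the pins and use proper key fingerprints):

```python
import hashlib


class TofuStore:
    """Pin the first key seen per contact; flag any later change."""

    def __init__(self):
        self.pins = {}  # contact name -> pinned key fingerprint

    def check(self, contact: str, public_key: bytes) -> str:
        # Fingerprint the key so we compare a short stable digest.
        fp = hashlib.sha256(public_key).hexdigest()
        pinned = self.pins.get(contact)
        if pinned is None:
            self.pins[contact] = fp   # first use: pin silently
            return "pinned"
        if pinned == fp:
            return "ok"               # same key as before
        return "warning: key changed"  # surface this to the user
```

Out-of-band identity verification (comparing safety numbers in person) is the optional step layered on top of this for those who need it.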
        
               | upofadown wrote:
               | > ... those who need/want can verify identity.
               | 
                | So the rest are actually OK with WhatsApp/Signal having
               | the opportunity to see their messages? I would submit
               | that most are not even aware of the issue...
               | 
               | The identity thing is basically _the_ usability issue for
                | E2EE messaging. If you don't solve that, then you have
                | not actually increased usability in a meaningful way. The
               | PGP community understood this and did things like
               | organize key signing parties. When is the last time
               | anyone did anything like that for any popular E2EE
               | capable instant messenger?
        
               | arccy wrote:
                | If anything, it's Apple / iOS that dumbs down users;
                | Google / Android provide a perfectly fine file picker /
                | file management app.
        
       ___________________________________________________________________
       (page generated 2025-01-04 23:01 UTC)