[HN Gopher] Trusted third parties are security holes (2005)
       ___________________________________________________________________
        
       Trusted third parties are security holes (2005)
        
       Author : greyface-
       Score  : 59 points
       Date   : 2023-11-03 06:51 UTC (2 days ago)
        
 (HTM) web link (web.archive.org)
 (TXT) w3m dump (web.archive.org)
        
       | pixl97 wrote:
       | Stepping outside the main thrust of the article...
       | 
       | > The functionality of personal property has not under normal
       | conditions ever depended on trusted third parties.
       | 
       | If so, it's the sign you have a low trust society in a more
       | general sense. In a high trust society it's easy to say "This is
       | my personal property, no one is going to mess with it". Things
       | like your car being out front the next day, or your goats not
       | getting stolen, and any number of social expectations are quietly
        | met every day. You don't have to police your property all
        | night and can get a good night's sleep. This allows you to
        | take a job as a specialist and make a good income, along with
        | all your neighbors, which brings more prosperity.
       | 
       | Trusting vs not trusting come with different costs that must be
       | calculated.
        
         | Crespyl wrote:
         | >If so, it's the sign you have a low trust society in a more
         | general sense.
         | 
          | Isn't that kind of orthogonal to the question? I can envision
          | both high and low trust societies that could have problems
          | with required 3rd parties attached to personal property.
         | 
         | Even the most well intentioned and highly trusted 3rd party is
         | still susceptible to malicious attacks, or just incompetence or
         | plain old accidents.
        
           | bluGill wrote:
            | Scale is the question. In some societies you can get by
            | with trusting others; once in a while you lose, but the
            | overall gain from being able to trust that your stuff will
            | still be there is much greater.
        
         | worksonmine wrote:
         | People trusted their shiny Teslas to be fine out front with the
         | key protected at home. Then thieves figured out that the keys
         | were most likely just behind the front door and used signal
         | boosters to unlock and start the car.
         | 
          | A paranoid owner keeping the key in a small Faraday-cage box
          | would not have that problem. With all the smart things we use
          | today the attack surface is not insignificant, and thinking
          | about it is just common sense.
        
         | vajrabum wrote:
          | Well, the context of the article is the then-recent, buggy
          | implementations of SSL and other security protocols that were
          | at the time unproven, plus government entities providing
          | black-box crypto with either explicit or occult back doors. If
          | by high-trust you mean I'm going to trust a government entity's
          | protocol or implementation of a security protocol that doesn't
          | provide decent forward secrecy or non-repudiation, well then,
          | even in a high-trust environment that sounds like a bad idea.
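          | 
          | A minimal sketch of what forward secrecy buys, assuming
          | Python's `cryptography` package and a made-up `info` label:
          | each side mixes a fresh ephemeral key pair into every session,
          | so a long-term key leaking later does not expose old traffic.
          | 
          |   from cryptography.hazmat.primitives import hashes
          |   from cryptography.hazmat.primitives.asymmetric import x25519
          |   from cryptography.hazmat.primitives.kdf.hkdf import HKDF
          |   
          |   # Fresh ephemeral key pair on each side, per session.
          |   a_eph = x25519.X25519PrivateKey.generate()
          |   b_eph = x25519.X25519PrivateKey.generate()
          |   
          |   # Both sides derive the same shared secret from their own
          |   # private key and the peer's public key.
          |   shared = a_eph.exchange(b_eph.public_key())
          |   assert shared == b_eph.exchange(a_eph.public_key())
          |   
          |   # Derive the session key; once the ephemeral keys are
          |   # discarded, old sessions can't be decrypted later, even
          |   # if a long-term signing key leaks.
          |   session_key = HKDF(
          |       algorithm=hashes.SHA256(), length=32, salt=None,
          |       info=b"session",
          |   ).derive(shared)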
        
         | nyolfen wrote:
         | trusting is indeed probably always more efficient than not-
         | trusting -- however, the crux of the matter is what your
         | options are when trusting is not possible. you can always
         | choose to trust in the presence of a trustless mechanism, but
         | it does not work the other way around.
        
       | staplers wrote:
        | Prescient article. No surprise it was written by Nick Szabo
        | (who ideated Bit Gold before the Bitcoin white paper was
        | released).
        
         | nonrandomstring wrote:
          | Still prescient in Europe, maybe, because of the EU's
          | proposal to stuff browsers with untrustworthy certificates?
         | 
         | Which raises the question, what does "trust" even mean if you
         | are _forced_ to trust?
         | 
          | If, as the NSA likes to say, "trust is the ability to do
          | harm", then the EU Commission is proposing to force potential
          | harm on the entire continent.
        
           | JanisErdmanis wrote:
            | Trust is a remedy for not being able to detect misbehaviour
            | and prevent it from happening. That's my raw take on it.
        
           | JohnFen wrote:
           | > what does "trust" even mean if you are forced to trust?
           | 
           | You can't be forced to trust anything. You can be forced to
           | use things you don't trust, though.
        
       | denton-scratch wrote:
       | Could have been written yesterday; it seems pretty fresh to me.
       | 
        | I regret TFA's conflation of the roles of CAs and DNS. The bug
        | with CAs is that the role exists at all; the bug with DNS is
        | down to implementation.
       | 
       | CAs are a solution to the problem of how to securely introduce
       | two parties that don't know one another. That's a fundamental
       | problem, and CAs aren't the only solution, but there aren't many.
       | IMO, CAs are a really awful solution. As TFA says, they are
       | unreliable and expensive to operate. WoT is a better solution,
       | but expensive to set up, which is perhaps why it's unsuccessful.
       | 
       | [Edit] WoT perhaps isn't a better solution; you still have to be
       | able to find the trusted certificate, and ultimately that means
       | you still need a trusted introducer. That means a TTP; or you
       | meet in person, which is in conflict with the demand that this is
       | someone you haven't met. But so what if you've met them? It comes
       | down to bank statements and driving licences, which amounts to
       | official government-issued ID. But suppose what I want to do or
       | say is unrelated to my official ID, and I'd prefer to certify
        | independently of government and officialdom? Suppose I'm
        | stateless, and have no government? I'd like self-signed
        | certificates to be treated by browsers as first-class citizens.
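        | 
        | As a minimal sketch of that route, assuming Python's
        | `cryptography` package (the name "example.invalid" is just a
        | placeholder): a key pair and a certificate where issuer ==
        | subject, with no CA or registrar in the loop.
        | 
        |   import datetime
        |   from cryptography import x509
        |   from cryptography.x509.oid import NameOID
        |   from cryptography.hazmat.primitives import hashes
        |   from cryptography.hazmat.primitives.asymmetric import ec
        |   
        |   # Key pair owned by its holder alone; no third party.
        |   key = ec.generate_private_key(ec.SECP256R1())
        |   name = x509.Name([
        |       x509.NameAttribute(NameOID.COMMON_NAME, "example.invalid"),
        |   ])
        |   now = datetime.datetime.now(datetime.timezone.utc)
        |   
        |   cert = (
        |       x509.CertificateBuilder()
        |       .subject_name(name)
        |       .issuer_name(name)  # issuer == subject: self-signed
        |       .public_key(key.public_key())
        |       .serial_number(x509.random_serial_number())
        |       .not_valid_before(now)
        |       .not_valid_after(now + datetime.timedelta(days=365))
        |       .sign(key, hashes.SHA256())
        |   )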
       | 
       | [Edit 2] Given certificates done right, everything else falls
       | away; you can do what DNS does using your certificate. You can
       | sign a DNS zone, or sign any other statement you want, and the
       | certificate itself can direct people to those signed statements.
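        | 
        | As a rough sketch of that signing step, again assuming Python's
        | `cryptography` package and a made-up zone record: anyone who
        | already holds the public key (e.g. from the certificate) can
        | verify the statement without consulting a registrar or CA.
        | 
        |   from cryptography.exceptions import InvalidSignature
        |   from cryptography.hazmat.primitives.asymmetric import ed25519
        |   
        |   key = ed25519.Ed25519PrivateKey.generate()
        |   
        |   # A hypothetical zone-style statement to publish.
        |   statement = b"example.invalid. 3600 IN A 192.0.2.1"
        |   signature = key.sign(statement)
        |   
        |   # Verification needs only the public key; no third party.
        |   try:
        |       key.public_key().verify(signature, statement)
        |       print("statement verified")
        |   except InvalidSignature:
        |       print("bad signature")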
       | 
       | DNS solves another problem: how to map names to [data], where
       | [data] is usually an address of some kind. It assumes a
       | particular naming system, and that system of names isn't
       | fundamental. It's a layer on top of lower-level addressing
       | systems, and could be replaced without affecting lower layers.
       | It's dirt-cheap to operate; what's expensive and buggy is the
       | market for names.
        
         | JohnFen wrote:
         | > IMO, CAs are a really awful solution.
         | 
         | I agree. But the only solutions that can work are ones everyone
         | agrees to use and, for right now anyway, everyone's settled on
         | that model. There isn't much choice.
        
           | rainsford wrote:
            | I disagree that the problem is a lack of choice, because
            | what realistic alternative choices actually exist? Web of
            | trust, a la PGP key-signing parties, is a ridiculous
            | solution for people who aren't super nerds (and even for
            | super nerds it's pretty bad). Trust on first use works
            | OK-ish in SSH, but it's hard to see that as a realistic
            | solution for trust as a whole if you try to scale it up to
            | every user needing to access arbitrary servers.
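            | 
            | As a rough sketch of what TOFU amounts to (plain Python,
            | with a made-up pin file): remember a key's fingerprint the
            | first time a host is seen, and balk if it ever changes.
            | 
            |   import hashlib
            |   import json
            |   import pathlib
            |   
            |   PINS = pathlib.Path("known_keys.json")  # hypothetical
            |   
            |   def key_ok(host: str, pubkey: bytes) -> bool:
            |       """Pin on first contact, reject a changed key later."""
            |       if PINS.exists():
            |           pins = json.loads(PINS.read_text())
            |       else:
            |           pins = {}
            |       fp = hashlib.sha256(pubkey).hexdigest()
            |       if host not in pins:
            |           pins[host] = fp  # first use: trust and remember
            |           PINS.write_text(json.dumps(pins))
            |           return True
            |       return pins[host] == fp  # later uses must match
            | 
            | The trouble is exactly the scaling: every user has to build
            | up pins like this for every server they ever touch, and the
            | very first contact is unauthenticated.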
           | 
           | Maybe CAs are a bad solution, but if so it's in the sense of
           | the quote about democracy being the worst form of government
           | except for all the other ones. At the end of the day, the CA
           | system has managed to deliver working cryptographic trust at
            | a scale that galactically surpasses anything else.
        
             | JohnFen wrote:
             | > what realistic alternative choices actually exist?
             | 
             | Yeah, that's a problem. I certainly can't think of one. But
             | even if there were, it would still be an enormous struggle
             | just to shift to using it at this point. That struggle also
             | means that there isn't a huge amount of effort being put
             | into finding a better way.
             | 
             | The CA system certainly has great benefits in terms of
             | delivering reasonable security in a relatively convenient
             | way, but that doesn't take away from the fact that it still
              | sucks in a ton of different ways that are inherent to the
             | idea. It also can't be used (at least not without great
             | struggle) in some situations.
        
             | denton-scratch wrote:
             | Yeah, I didn't mention TOFU because it's a non-starter if
             | you're trying to get a secure introduction to someone you
             | don't know.
             | 
             | The fundamental problem is that "identity" is problematic.
             | How do you know someone is who they say they are? Perhaps
             | you don't really know who you are. That's why I want a
             | space for a certified identity that isn't tied to a
             | government ID. Like, an instance of some software might
             | claim an ID. Arguably, a domain is an ID. I don't see
              | what's wrong with something or someone having multiple IDs
             | that aren't linkable.
        
       | gopher_space wrote:
       | This is a useful point of view I'll be adding to the toolbox.
       | Looking at testing and type as TTP entities is already fun. This
       | line:
       | 
       | > Making personal property functionality dependent on trusted
       | third parties (i.e. trusted rather than forced by the protocol to
       | keep to the agreement governing the security protocol and
       | property) is in most cases quite unacceptable.
       | 
       | sums up a line of thinking I'd like to explore a bit.
       | 
       | > These institutions have a particular way of doing business that
       | is highly evolved and specialized. They usually cannot "hill
       | climb" to a substantially different way of doing business.
       | Substantial innovations in new areas, e.g. e-commerce and digital
       | security, must come from elsewhere.
       | 
       | Work I've done in finance-related shops fits neatly into the TTP
       | model, and that model explains most of what I used to see as tech
        | debt. As a TTP, being able to guarantee a result was a core part
       | of our business, so processes would be set in stone once people
       | used them to do serious work. If the initial design was written
       | and implemented by people with a grasp on each domain it touched,
       | you'd need to have that all under your belt before thinking about
       | real changes.
       | 
        | The immutability + cost factor makes existing work look
        | foundational, and innovations in processing metabolite are easy
        | to spec out and calculate a return on.
        
       | JohnFen wrote:
       | Yep. There's no such thing as a trustworthy third party.
        
         | Arainach wrote:
         | And yet the world is far too large for even a significant
         | fraction of things to be first party, and transactions require
         | trust.
        
           | JohnFen wrote:
           | Yes, we have to rely on third parties for a lot of things.
           | That doesn't mean they're trustworthy, though.
        
       ___________________________________________________________________
       (page generated 2023-11-05 23:01 UTC)