[HN Gopher] The sins of the 90s: Questioning a puzzling claim ab...
___________________________________________________________________
The sins of the 90s: Questioning a puzzling claim about mass
surveillance
Author : ibotty
Score : 169 points
Date : 2024-10-28 15:34 UTC (7 hours ago)
(HTM) web link (blog.cr.yp.to)
(TXT) w3m dump (blog.cr.yp.to)
| hobs wrote:
| In a nutshell, I don't think we would have seen much change -
| corporations only engage in security insofar as they are required
| to. Even in this "metastatic SSL-enabled growth" we've basically
| sold security out to the lowest common denominator, and core
| actors in the industry just use these security features as a fig
| leaf to pretend they give a single crap.
|
| Now, would CERTAIN industries exist without strong cryptography?
| Maybe not, but commerce doesn't really care about privacy in most
| cases; it cares about money changing hands.
| InDubioProRubio wrote:
| I don't know; they sure make sure the paper trail is shredded
| and shed with the Azure Document 365 subscription. When it
| comes to security from liability, everything is top notch.
| Terr_ wrote:
| Right: So what we need to do is make organizations _liable_
| for mishandling data.
|
| Imagine if you could sue a company for disclosing your unique
| email address to spammers and scammers. (They claim it's the
| fault of their unscrupulous business partner? Then they can
| sue for damages in turn, not my problem.)
|
| There are some practical issues to overcome with that
| vision... but I find it rather cathartic.
| red_admiral wrote:
| Cryptocurrency, if you accept it and its ecosystem as an
| industry, would certainly not exist. And as for privacy, a
| fairy dies every time someone praises bitcoin for being
| anonymous.
| try_the_bass wrote:
| > And as for privacy, a fairy dies every time someone
| praises bitcoin for being anonymous.
|
| And yet countless thefts have happened and had the proceeds
| exfiltrated via Bitcoin, and the culprits never caught.
|
| If that's not effective/practical anonymity, I don't know
| what is.
| wmf wrote:
| A decent number have been caught and plenty of others are
| known but can't be held responsible because they're state
| supported.
| WaitWaitWha wrote:
| One key part is that the crypto wars were about _export_, lest
| we forget "PGP Source Code and Internals".
|
| If there had been no international business, crypto of _any
| strength_ could and would have been used.
| convolvatron wrote:
| There was a huge chilling effect on both product and protocol
| design. In the 90s I had to fill out a form and submit it to
| RSA in order to get a copy of their library, which I eventually
| got after waiting six months, but I had to agree not to
| redistribute it in any way.
|
| Efforts to design foundational cryptographic protocols were
| completely hamstrung by the spectre of ITAR and the real
| possibility that designs would have to be US-only. Right around
| the time that the US gave up, the commercial community was
| taking off, and it wasn't at all interested in further
| standardization except where it created moats for its business -
| which is why we're still stuck in the 90s as far as the network
| layer goes.
| rangestransform wrote:
| AFAIK the Zimmermann case was quietly dropped instead of ruled on.
|
| Tinfoil hat: it was dropped to prevent case law establishing that
| exporting code is 1A protected.
| RamAMM wrote:
| The missed opportunity was to provide privacy protection before
| everyone stepped into the spotlight. The limitations on RSA key
| sizes etc. (symmetric key lengths, 3DES limits) did not materially
| affect the outcomes, as we can see today. What did happen is that
| regulation was passed to allow 13-year-olds to participate online,
| much to the detriment of our society. What did happen was that
| businesses, including credit agencies, leaked ludicrous amounts of
| PII with no real harm to their bottom lines. The GOP themselves
| leaked the name, SSN, sex, and religion of over a hundred million
| US voters, again with no harm to the leaking entity.
|
| We didn't go wrong in limiting export encryption strength to the
| evil 7, and we didn't go wrong in loosening encryption export
| restrictions. We entirely missed the boat on what matters by
| failing to define and protect the privacy rights of individuals
| until nearly all that mattered was publicly available to bad
| actors through negligence. This is part of the human propensity
| to prioritize today over tomorrow.
| elric wrote:
| > What did happen is that regulation was passed to allow 13
| year olds to participate online much to the detriment of our
| society.
|
| That's a very hot take. Citation needed.
|
| I remember when the US forced COP(P?)A into being. I helped run
| a site aimed at kids back in those days. Suddenly we had to
| tell half of those kids to fuck off because of a weird and
| arbitrary age limit. Those kids were part of a great community,
| had a sense of belonging which they often didn't have in their
| meatspace lives, and had a safe space to explore ideas and
| engage with people from all over the world.
|
| But I'm sure that was all to the detriment of our society
| :eyeroll:.
|
| Ad peddling, stealing and selling personal information, _that_
| has been detrimental. Having kids engage with other kids on the
| interwebs? I doubt it.
| ryandrake wrote:
| Kids are not stupid, though. They know about the arbitrary
| age limit, and they know that if they are under that limit,
| their service is nerfed and/or not allowed. So, the end
| effect of COPPA is that everyone under 13 simply knows to use
| a fake birthdate online that shows them to be over the limit.
| elric wrote:
| Sure, it's one of the many rules that's bent and broken on
| a daily basis. Doesn't make it any less stupid. And it
| falls on the community owner to enforce, which is doubly
| stupid, as the only way to prove age is to provide ID,
| which requires a lot of administration, and that data then
| becomes a liability.
| dfxm12 wrote:
| _COP(P?)A_
|
| COPA [0] is a different law which never took effect. COPPA
| [1] is what you're referring to.
|
| _Ad peddling, stealing and selling personal information,
| that has been detrimental._
|
| I agree and what's good for the gander is good for the goose.
| Why did we only recognize the need for privacy for people
| under an arbitrary age? We all deserve it!
|
| 0 - https://en.wikipedia.org/wiki/Child_Online_Protection_Act
|
| 1 - https://en.wikipedia.org/wiki/Children%27s_Online_Privacy
| _Pr...
| bippihippi1 wrote:
| The issue with online kids isn't just the availability of the
| internet to kids but the availability of the kids to the
| internet.
| burningChrome wrote:
| >> Having kids engage with other kids on the interwebs? I
| doubt it.
|
| Unless those kids aren't interacting with kids at all, but
| instead with pedos masquerading as kids for nefarious reasons.
| Which, yes, has been VERY detrimental to our society.
| elric wrote:
| Nah. I'm not buying it. What's the rate of kids interacting
| with pedos instead of other kids?
|
| Knee-jerk responses like yours, and "what about the
| children"-isms in general are likely more detrimental than
| actual online child abuse. Something about babies and
| bathwater.
| Lammy wrote:
| Downside of trading privacy for security: _anything_ that makes a
| network connection creates metadata about you, and that metadata
| is the real danger, since it lets others analyze your social
| connections:
| https://kieranhealy.org/blog/archives/2013/06/09/using-metad...
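|
| (A toy illustration of that point, not from the linked post: the
| "call log" below is invented, but even this content-free metadata
| surfaces who talks to whom. The linked post does something
| similar with historical membership lists.)
|
|     from collections import Counter
|
|     # Metadata only: who contacted whom, and when. No message content.
|     call_log = [
|         ("alice", "bob",   "2024-10-01T09:00"),
|         ("alice", "carol", "2024-10-01T09:05"),
|         ("bob",   "carol", "2024-10-01T21:40"),
|         ("alice", "bob",   "2024-10-02T08:55"),
|         ("dave",  "erin",  "2024-10-02T02:13"),
|     ]
|
|     # Count undirected ties; the strongest relationships fall out
|     # immediately, without ever seeing a single message body.
|     ties = Counter(frozenset((a, b)) for a, b, _ in call_log)
|     for pair, n in ties.most_common():
|         print(sorted(pair), n)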
|
| The problem isn't about the big corporations themselves but about
| the fact that _the network itself_ is always listening and the
| systems the big corporations build tend to incentivize making as
| many metadata-leaking connections as possible, either in the name
| of advertising to you or in the name of Keeping You Safe(tm):
| https://en.wikipedia.org/wiki/Five_Eyes
|
| Transparent WWW caching is one example of a pro-privacy setup
| that used to be possible and is no longer feasible due to
| pervasive TLS. I used to have this kind of setup in the late
| 2000s when I had a restrictive Comcast data cap. I had a FreeBSD
| gateway machine and had PF tied in to Squid so every HTTP request
| got cached on my edge and didn't hit the WAN at all if I reloaded
| the page or sent the link to a roommate. It's still technically
| possible if one can trust their own CA on every machine on their
| network, but in the age of unlimited data who would bother?
|
| Other example: the Mac I'm typing this on phones home every app I
| open in the name of """protecting""" me from malware. Everyone
| found this out the hard way in November 2020 and the only result
| was to encrypt the OCSP check in later versions. Later versions
| also exempt Apple-signed binaries from filters like Little Snitch
| so it's now even harder to block. Sending those requests at all
| effectively gives interested parties the ability to run a "Hey
| Siri, make a list of every American who has used Tor Browser"
| type of analysis if they wanted to:
| https://lapcatsoftware.com/articles/ocsp-privacy.html
| kmeisthax wrote:
| Transparent HTTP caching as a way to avoid leaking metadata is
| not pro-privacy. It only works _because_ the network is always
| listening, to both metadata _and_ message content. The reason
| people worry about metadata is that it's a way to circumvent
| encryption (and the law). Metadata is holographic[0] to message
| content, so you need to protect it with the same zeal as the
| content itself.
|
| But letting everyone have the message content so that metadata
| doesn't leak isn't helpful. Maybe in the context it was
| deployed, where pervasive deep packet inspection was only
| something China wasted their CPU cycles on, your proxy made
| sense. But it doesn't make sense today.
|
| [0] X is holographic to Y when the contents of X can be used to
| completely reconstruct Y.
| SketchySeaBeast wrote:
| How is metadata holographic? Sure, you can know when I
| communicated to a particular individual, and even the format
| and size of the message, but it doesn't include the exact
| message, right?
| some_furry wrote:
| "We kill people based on metadata."
|
| https://www.justsecurity.org/10318/video-clip-director-
| nsa-c...
| SketchySeaBeast wrote:
| It's certainly powerful, but that wasn't the claim I'm
| asking about.
| wood_spirit wrote:
| Gordon Welchman first productionized "traffic analysis" in
| WW2 at Bletchley Park.
|
| When, in his retirement, he tried to write about it, it was
| his work on traffic analysis, more than his disclosure that
| the Allies had cracked Enigma, that most worried the NSA,
| which tried to stop him publishing.
|
| Traffic analysis is in many ways more valuable than the
| contents of the messages.
|
| https://en.m.wikipedia.org/wiki/Gordon_Welchman
| SketchySeaBeast wrote:
| I won't say that metadata isn't valuable, but I still
| don't think it's holographic. You can tell I WhatsApp my
| friend every day around noon, so we're probably talking
| about lunch, but you don't know that today I had a tuna
| sandwich.
| 2OEH8eoCRo0 wrote:
| I don't see metadata as a danger, I think it's a great
| compromise between police work and privacy.
|
| Some of the requirements I see here seem crazy. I want carte
| blanche access to the global network of other people's computers,
| and I want perfect privacy, and I want perfect encryption...
|
| Yeah, no
| agiacalone wrote:
| Maybe you don't, but for some people, it's lethal.
|
| https://www.justsecurity.org/10318/video-clip-director-
| nsa-c...
| 2OEH8eoCRo0 wrote:
| Good. I'm glad the NSA is doing its job. I don't want
| terrorists to feel safe while using our systems.
| ziddoap wrote:
| > _"We kill people based on metadata"_
|
| > _"metadata absolutely tells you everything about
| somebody's life. If you have enough metadata, you don't
| really need content."_
|
| Your response to the above quotes is so short-sighted
| that I don't even know where to begin.
|
| As long as it's the people you don't like dying, I guess
| it's cool.
|
| Good thing the NSA is the only group in the world that
| has access to metadata at scale.
| throwaway19972 wrote:
| If only the NSA or the people designating who terrorists
| are vs who our allies are had such pure, pro-human
| intentions.
| 2OEH8eoCRo0 wrote:
| A hypothetical problem that we can tackle when (or if)
| it's _actually_ a problem. Thanks for your metadata,
| regardless.
| A4ET8a8uTh0 wrote:
| Is it really a hypothetical at this point? I was under
| the impression that relevant cases have already been
| explored (to the extent that one can, given the nature of the
| IC). In cases like these, the moment it is actually a
| problem, it is likely already too late to make sensible
| adjustments.
| codethief wrote:
| Keep in mind that you don't decide who's a terrorist and
| who isn't. You might be "glad" about the NSA doing their
| job as long as your definition of terrorism aligns with
| the government's but what if that ceases to be the case?
| thadt wrote:
| One man's meta is another man's data. The classification of
| 'data' and 'metadata' into discrete bins makes it sound like
| metadata is somehow not also just 'data'.
|
| If every morning I got in my car and left for work and my
| neighbor followed me, writing down every place I went, what
| time I got there, how long I stayed, and the name of everyone I
| called, it would be incredibly intrusive surveillance data, and
| I'd probably be somewhat freaked out.
|
| If that neighbor were my cell phone provider, it would be
| Monday.
|
| What we allow companies and governments to do (and not do) with
| this data isn't something we can solve in the technical realm.
| We have to decide how we want our data handled, and then make
| laws respecting that.
| stouset wrote:
| I am struggling to comprehend how allowing everyone between you
| and the services you use to view not only the metadata but the
| _content_ as well could possibly be considered privacy-
| preserving.
| natch wrote:
| It's kind of an unorthodox take, but I'm guessing the idea is
| that if corporations perceived that they didn't have secure
| ways to protect stuff, they would refrain from gathering as
| much stuff, because they would be afraid of the liability.
| And btw the perception / reality distinction is important
| here in supporting this theory.
| janalsncm wrote:
| I disagree. What makes corporations afraid of liability are
| _laws enforcing liability_. We never got those, and I don't
| see why weaker encryption would've created them. We could,
| for example, have meaningful penalties when a company leaks
| passwords in plain text.
| ForHackernews wrote:
| I haven't seen the talk, but it sounds plausible to me: Technical
| people got strong crypto so they didn't worry about legislating
| for privacy.
|
| We still have this blind spot today: Google and Apple talk about
| security and privacy, but what they mean by those terms is making
| it so only they get your data.
| MattJ100 wrote:
| > Technical people got strong crypto so they didn't worry about
| legislating for privacy.
|
| The article debunks this, demonstrating that privacy was a
| primary concern (e.g. Cypherpunk's Manifesto) decades ago. Also
| that mass surveillance was already happening even further back.
|
| I think it's fair to say that security has made significantly
| more progress over the decades than privacy has, but I don't
| think there is evidence of a causal link. Rather, privacy
| rights are held back because of other separate factors.
| ForHackernews wrote:
| > demonstrating that privacy was a primary concern (e.g.
| Cypherpunk's Manifesto) decades ago. Also that mass
| surveillance was already happening even further back.
|
| How does that debunk it? If they were so concerned, why
| didn't they do anything about it?
|
| One plausible answer: they were mollified by cryptography.
| Remember when it was revealed that the NSA was sniffing
| cleartext traffic between Google data centers[0]? In
| response, rather than campaigning for changes to legislation
| (requiring warrants for data collection, etc.), the big tech
| firms just started encrypting their internal traffic. If
| you're Google and your adversaries are nation state actors
| and other giant tech firms, that makes a lot of sense.
|
| But as far as user privacy goes, it's pointless: Google is
| the adversary.
|
| [0] https://theweek.com/articles/457590/why-google-isnt-
| happy-ab...
| thecrash wrote:
| As you point out, decades ago privacy was a widespread social
| value among everyone who used the internet. Security through
| cryptography was also a widespread technical value among
| everyone (well at least some people) who designed software
| for the internet.
|
| Over time, because security and cryptography were beneficial
| to business and government, cryptography got steadily
| increasing technical investment and attention.
|
| On the other hand, since privacy as a social value does not
| serve business or government needs, it has been steadily de-
| emphasized and undermined.
|
| Technical people have coped with the progressive erosion of
| privacy by pointing to cryptography as a way for individuals
| to uphold their privacy even in the absence of state-
| protected rights or a civil society which cares. This is the
| tradeoff being described.
| anovikov wrote:
| How could that be relevant for more than a few more years? The
| world does not end with the US. Regardless of the ban, strong
| crypto would have been developed elsewhere, as open source, and
| proliferated to the point of making continuation of the ban
| impossible: by ~2005 or earlier, it would have been either the US
| closing itself off from the global Internet, becoming a digital
| North Korea of sorts, or allowing strong crypto.
| wmf wrote:
| Popular OSes and browsers have almost entirely come from the
| US. If people had a choice between IE with weak crypto and Opera
| with strong crypto, they absolutely would have chosen IE.
| ikmckenz wrote:
| This is a good article, and it thoroughly debunks the proposed
| tradeoff between fighting corporate vs. government surveillance.
| It seems to me that the people who concentrate primarily on
| corporate surveillance tend to want government solutions
| (privacy regulations, for example), and eventually get it into
| their heads that the NSA are their friends.
| g-b-r wrote:
| Aside from everything else, I don't understand what Whittaker's
| point was; she seemed to ultimately be advocating for something,
| but I can't understand what, exactly.
|
| It's probably in the talk's last sentences:
|
| > We want not only the right to deploy e2ee and privacy-
| preserving tech, but the power to make determinations about how,
| and for whom, our computational infrastructures work. This is the
| path to privacy, and to actual tech accountability. And we should
| accept nothing less.
|
| But who are "we" and "whom", and what "computational
| infrastructure" is she referring to?
| salawat wrote:
| I can fill that in for you I think. The "We" and "Whom" are
| you, me, the arbitrary host/admin/user.
|
| If you look at the regulatory trends developing around tech at
| the moment, there are a lot of pushes to slap obligations on the
| host to essentially toe the societal line of their geopolity. You
| will spy on your users. You will report this and that. You will
| not allow this group or that group.
|
| This tightening acts in part to encourage centralization, which
| is regulable by the state, and to discourage decentralization,
| which is at best only notionally so.
|
| The power of technologically facilitated networking has,
| prior to the Internet, largely been a luxury of the State, or
| of entities granted legitimacy by the State. With everyone
| having the potential to take their networks dark enough that
| State-level actors have to revert to physically compromising
| the infrastructure instead of just snooping the line, the
| currently extant edifice of power is threatened with a
| bottom-up inversion.
|
| No longer would the big boys in the current ivory tower be able
| to sit on high and spot threats purely by sitting on SIGINT and
| data processing and storage alone. The primitive of true
| communications and signalling sovereignty would be in the hands
| of every individual. Which, the establishment would like to
| cordially remind you, includes those dirty terrorists,
| pedophiles, communists, <group you are mandated to treat as an
| outgroup>. So therefore, everyone must give up this power and
| conduct affairs in a monitorable way to make those other people
| stand out. Because you're all "good" people. And "good" people
| have nothing to fear.
|
| You can't deplatform persona non grata from infra they've
| already largely built for themselves, which is a terrifying
| prospect to the current power structure.
|
| It's all about control.
| convivialdingo wrote:
| I used to work with the guy who was named by DJB in the crypto
| export case which removed the restrictions. IIRC, the NSA guy
| used to be his student!
| singron wrote:
| I think this article isn't considering wifi. Most early sites
| were pressured into using SSL because you could steal someone's
| session cookie on public wifi.
|
| Without cryptography, all wifi is public, and in dense areas, you
| would be able to steal so many cookies without having to actually
| get suspiciously close to anything.
|
| I'm guessing without crypto, we would only access financial
| systems using hard lines, and wifi wouldn't be nearly as popular.
| Mobile data probably wouldn't have taken off since it wouldn't
| have been useful for commerce.
| quietbritishjim wrote:
| I thought WiFi was somewhat secure from other clients, even if
| your connection is unsecured at the TCP layer, so long as
| they're not impersonating the hotspot. You're certainly not
| secure from the hotspot itself, of course.
| Amezarak wrote:
| Back in the day, we had a Firefox extension for stealing
| other people's cookies over WiFi.
|
| https://github.com/codebutler/firesheep
| https://en.wikipedia.org/wiki/Firesheep
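|
| (Rough sketch of the kind of passive capture Firesheep automated,
| assuming an open, unencrypted network and the third-party scapy
| library; the interface name and output format here are made up
| for the example.)
|
|     from scapy.all import IP, Raw, TCP, sniff
|
|     def show_cookie(pkt):
|         # Plaintext HTTP only: headers are readable on the wire.
|         if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw):
|             payload = pkt[Raw].load.decode("latin-1", errors="replace")
|             for line in payload.split("\r\n"):
|                 if line.lower().startswith("cookie:"):
|                     print(pkt[IP].src, "->", line)
|
|     # Requires root; on an open network every client's traffic is visible.
|     sniff(iface="wlan0", filter="tcp port 80", prn=show_cookie, store=False)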
| Rygian wrote:
| Unencrypted WiFi would be equal to broadcasting your
| unsecured TCP traffic for everyone to eavesdrop on.
|
| Early domestic WiFi would use WEP ("Wired Equivalent
| Privacy"), which was very vulnerable:
| https://library.fiveable.me/key-terms/network-security-
| and-f...
| eurleif wrote:
| Only if the WiFi network is password-protected, which causes
| connections to be encrypted. Pretty much all WiFi is
| password-protected nowadays -- if a cafe wants to enable
| public access to their WiFi, they'll write the password on
| the wall -- but that only became the case after Firesheep and
| other sniffing tools drew attention to this issue around
| 2010. In the old days, there were plenty of networks with no
| password (and hence, no encryption) at all.
|
| The GP specified "without cryptography", in reference to a
| counterfactual world where we weren't allowed to encrypt
| things.
| quietbritishjim wrote:
| Ah ok. I thought they were referring back to SSL in their
| first paragraph. Interesting, I had forgotten that WiFi
| networks once didn't all have passwords.
| m463 wrote:
| I think the social element is one of the roots of the problem.
|
| Basically, people don't understand privacy, and don't see what is
| going on, so they don't care about it. Additionally, most privacy
| intrusions are carefully combined with some reward or
| convenience, and that becomes the status quo.
|
| This leads to the people who stand up to this being ridiculed as
| tinfoil hat types, or ignored as nonconformist.
|
| Everything after that is just a matter of time.
| A4ET8a8uTh0 wrote:
| Yes. The weirdest example of this (and the most personally
| applicable) is the DNA data shared with 23andMe and the like.
| I did not subscribe to this. Neither did my kids (or the kids
| of the individual who did subscribe, I might add), but due to
| some shared geography and blood ties, whether I want to or not,
| I am now identifiable in that database.
|
| To your point, there is something in us that does not consider
| what information could do.
| 14 wrote:
| If you have nothing to hide what are you worried about? Or if
| you are not planning to be a criminal what are you worried
| about?
|
| I am 100% not serious and do not believe either statement
| above. I sadly am in the same boat as you: I had a black
| sheep of a brother who did some sort of crime and, as a
| condition, had his DNA taken, so by default I am in the system
| as well.
|
| I never could understand why people would willingly offer
| their DNA to these companies. Even if they are not selling
| that data, sooner or later it could leak, and the consequences
| could mean the difference between being able to afford life
| and medical insurance or not.
___________________________________________________________________
(page generated 2024-10-28 23:00 UTC)