[HN Gopher] Preparedness Paradox
___________________________________________________________________
Preparedness Paradox
Author : thunderbong
Score : 97 points
Date : 2022-08-15 06:02 UTC (16 hours ago)
(HTM) web link (en.wikipedia.org)
(TXT) w3m dump (en.wikipedia.org)
| arcticbull wrote:
| This also manifests in, for instance, how we treat testing in
| software engineering. Folks don't get as much credit for
| writing tests, because it's impossible to count the set of
| SEVs that didn't happen. On the other hand, you get outsized
| credit for the heroics of fixing the ones that do.
| eastbound wrote:
| Testing is a waste of time. It absorbs about 50% of the
| workforce, and projects that don't have it don't necessarily
| suffer.
|
| Also, ask an engineer whether the tests are complete, and
| he'll always tell you that we haven't tested anything yet. You
| need a cutoff at some point.
| arcticbull wrote:
| I think you're demonstrating exactly the fallacy that I
| identified.
|
| I know I've personally caught massive issues when unit
| testing my own code, so I know for a fact it's not a dead
| loss of time. I'm also not sure why you think it takes 50% of
| the workforce; that's never been my experience.
|
| The trick is knowing what to test, how much to test it, and
| how long to spend.
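|
| A minimal sketch of the kind of cheap unit test that catches
| such issues (assuming a C toolchain with assert; parse_port
| and its edge cases are hypothetical, for illustration only):
|
|     #include <assert.h>
|     #include <stdlib.h>
|
|     /* Hypothetical: parse a TCP port; -1 on bad input. */
|     static int parse_port(const char *s) {
|         char *end;
|         long v = strtol(s, &end, 10);
|         if (*s == '\0' || *end != '\0' || v < 1 || v > 65535)
|             return -1;
|         return (int)v;
|     }
|
|     int main(void) {
|         assert(parse_port("8080") == 8080);
|         assert(parse_port("0") == -1);     /* below range   */
|         assert(parse_port("65536") == -1); /* above range   */
|         assert(parse_port("80x") == -1);   /* trailing junk */
|         assert(parse_port("") == -1);      /* empty string  */
|         return 0;
|     }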
| motohagiography wrote:
| I generally don't prepare for anything, because the
| opportunity cost of being prepared for something almost
| necessarily comes at the expense of responding to the events
| I am not prepared for - and those are the ones you really
| have to worry about.
|
| The secret, I find, is to always be ready for the things you
| aren't prepared for.
| Trasmatta wrote:
| I'm having this struggle with projects at work. I've had to push
| back on all sorts of requests from our project manager and
| designer so that our team has the bandwidth to focus on some very
| important stability and security concerns for our next release.
| They want all sorts of additional fancy bells and whistles (that
| don't add much user value or functionality) that we just can't
| focus on right now, because it would come at the expense of
| making sure the feature is actually stable and secure.
|
| I'm almost positive there will be some amount of blowback
| when the thing releases and there are no stability or
| security problems - problems that were avoided only because I
| made sure we spent the necessary time on them.
| MattPalmer1086 wrote:
| Yep, the curse of doing things right. You can't prove it was
| needed.
|
| You need to demonstrate that you are fixing genuine problems,
| or you will eventually be replaced by someone who delivers
| faster, even if there are subsequent bugs.
|
| One way to do this is to negotiate with the business about
| what needs doing, using risk. If you think there is a risk of
| a security or stability issue, then you should be able to
| assess that risk. The business can then choose to accept the
| risk and add some features, or to fix it. It is essential
| that the owner of the system officially accepts the risks
| presented. You cannot own the risks.
|
| This lets the business prioritise the work according to its
| risk appetite. And if the shit hits the fan, you are not only
| covered but your reputation will increase.
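|
| One concrete way to frame that negotiation is expected loss.
| A minimal sketch (in C; the probability and cost figures are
| hypothetical, not from any real assessment):
|
|     #include <stdio.h>
|
|     int main(void) {
|         /* Hypothetical annualized risk assessment. */
|         double p_incident    = 0.10;     /* 10% chance/year */
|         double incident_cost = 500000.0; /* cleanup, downtime */
|         double fix_cost      = 20000.0;  /* engineering time */
|
|         double expected_loss = p_incident * incident_cost;
|         printf("expected annual loss: $%.0f\n", expected_loss);
|         printf("cost of the fix:      $%.0f\n", fix_cost);
|         /* If the expected loss exceeds the fix cost, fixing
|            wins; either way, the owner signs off on the risk. */
|         return 0;
|     }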
| Test0129 wrote:
| While this works with rational actors, my experience in the
| industry is often the opposite. In fact, the company I work
| for now is probably the only company I've worked for in the
| last decade that actually evaluates risk correctly. The
| average corporate drone overseeing the engineering org is
| typically the least rational actor in the entire org.
|
| Given the opportunity, most start-up and mid-tier businesses
| will prioritize speed over safety. Despite my many attempts
| to explain this trade-off using various methods - engineer-
| speak, business-speak, or some combination of the two - the
| need for money and the need to constantly impress investors
| trumps all. I have quite literally told people that the total
| cost of a half-fix will be more than double the cost in
| engineering hours of implementing a correct fix, and by and
| large the half-fix will be chosen because it "gets the
| feature out to users quicker". It's the most asinine thing
| I've heard, even though I fully understand the need to
| deliver on time and on budget.
|
| In the end, your ass is never covered. It will be your fault
| whether you suggested the fix and they said no, or they said
| yes. Your team will end up working the long hours to
| implement the obvious security and safety changes. The math
| for the other side is simple: if the cost of taking on the
| risk is less than the cost of implementing the fix, it will
| never get done. Companies use PagerDuty for free labor for a
| reason. It's the industry's most effective enabler of poor
| practices.
|
| Sure, something as simple as "we should really hash our
| passwords" might be so glaringly obvious that even the most
| dense business person would understand. But it's when you
| wander into the land of ambiguity that you really get burned.
| When the company is spending $XX,XXX/mo. on cloud storage
| because the ticket specifically said not to worry about
| lifecycle, it's going to be you in the office explaining why
| this wasn't fixed. Rarely will any business person take "it's
| your fault" as the answer. They'll happily assign you as many
| 60-hour weeks as you need to fix the problem, and in a large
| enough corporate-tier screw-up you may be the sacrificial
| lamb for the investors to feel like "the problem was solved".
|
| Call me cynical, but this is an unwinnable battle.
| Unfortunately, until software bugs start literally killing
| people, the desire to actually let engineers do their job
| will remain low.
| aaron695 wrote:
| Incoherent.
|
| How can there be a levee paradox?
|
| You can see the water it holds back.
|
| More people build because there are fewer floods.
|
| No idea what they are talking about with Fukushima.
|
| The Millennium Bug is a good prospect. But that's a debate
| in itself.
| jsight wrote:
| Good point about the levee issue. Apparently there's a little
| wikiwar going on with that one already.
|
| I couldn't follow their logic with Fukushima either. The
| wording was a little strange.
|
| The Year 2000 and COVID scenarios are great examples, IMO.
| The problem is that any great example is intrinsically
| going to be controversial, and that seems to be the paradox
| itself.
| enragedcacti wrote:
| For a real-world example, you can look at the hole in the
| ozone layer. This conservative commentator and roughly 42k
| Twitter users agree that we "suddenly just stopped talking
| about it", when in reality governments implemented bans on
| CFCs that mostly solved the problem.
|
| https://twitter.com/mattwalshblog/status/1549713211188027394
| [deleted]
| _int3_ wrote:
| These examples can be made up arbitrarily, for any
| situation. You can always say that if it weren't for x, y
| would be even worse.
|
| "If people didn't carry guns, there would be more violence."
|
| "If we didn't start climate talks , climate change would be
| even worse."
|
| etc...
| huetius wrote:
| Not doubting this, but it seems to cut both ways. I can just as
| easily justify an overreaction by claiming to have averted some
| worse outcome. It seems to be a general problem of
| counterfactuals.
| anonporridge wrote:
| This is why it's often good resource management to wait until
| something breaks before committing resources to fix it.
| Especially true in software systems.
|
| One might think that constant firefighting is a waste of
| resources, and we'd be better off solving problems before they
| happen. That's true if and only if you know for sure that the
| problem and eventual breakage is really going to happen AND
| that it's worth fixing. At least in my experience, it's more
| often true that people overestimate the risk of calamity and
| waste resources fixing things that aren't actually going to
| break catastrophically. Or they fix things we don't actually
| need, and only figure out that we don't need them when they
| finally break and we realize that the cost of fixing or
| replacing them outweighs whatever value they were providing.
|
| The engineer in me hates saying this, but sometimes things
| don't have to be beautifully designed and perfectly built to
| handle the worst. Duct tape and superglue often really are
| good enough.
|
| Of course, this doesn't apply to problems that are truly
| existential risks. If the potential systemic breakage is so bad
| that it irreparably collapses the system, then active
| preparedness can certainly be justified.
| thisisauserid wrote:
| You can't A/B test a lot of things without a time machine so you
| need to be good at assessing risks and tradeoffs.
| schoen wrote:
| A few months ago, I wrote a review of the book _A Libertarian
| Walks Into a Bear_. The book describes the Free Town Project, a
| kind of offshoot of the Free State Project in New Hampshire, in
| which people moved to a particular town in order to try to reduce
| the role of local government in their lives. The book notes that
| the town then had significant difficulty coordinating on wildlife
| control issues, as there were lots of bears in the nearby woods
| and the residents had trouble agreeing on what to do to keep them
| away from people.
|
| While the issues were somewhat complex and not solely the result
| of the Free Town Project, it seemed clear that the lack of
| governmental coordination and some residents' bear-attracting
| behaviors made the bears' presence a bigger problem than it had
| been before.
|
| One thing I thought several times while reading the book was that
| the preparedness paradox was a big part of the challenge
| (although I didn't remember that it was called that!).
| Specifically, it seemed like quite a few of the people involved
| sincerely thought that wildlife management or wildlife control
| wasn't "a thing" because they had only ever lived in places where
| it was already being handled well. So they didn't perceive any
| need to continue actively addressing it in their new environment,
| because it seemed like such a hypothetical or fanciful risk.
|
| Since then, I've thought that the question of understanding or
| evaluating what is a real risk that one needs to make a real
| effort to deal with gets _extremely_ clouded by all of the things
| that people and institutions are already doing in the name of
| risk mitigation. We've seen this most dramatically with measles
| vaccines (where people felt like measles was an incredibly remote
| risk, because they had never seen it occur at all in their
| environments, because other people had successfully mitigated it
| by vaccination and hygiene programs in earlier generations!). But
| I imagine that this comes up over and over in modern life: how do
| people get a clear sense of what is dangerous (and how dangerous
| it is) when they already live in settings where whatever degree
| of danger exists is already being dealt with well, so most people
| rarely or never witness its consequences?
| tunesmith wrote:
| I don't understand why it's called a paradox. It's just people
| having trouble understanding counterfactuals. Getting better
| at systems thinking is a great way to avoid this. At work
| I've learned to point out "we wouldn't need to spend time on
| this if we invested the time to implement X", so the product
| folks are more aware of the counterfactuals when it comes
| time to justify the investment.
| pdonis wrote:
| _> It's just people having trouble understanding
| counterfactuals._
|
| That's one issue, but another issue is how accurately we can
| estimate the counterfactual outcomes. In the case you
| described, where some up-front investment can reduce costs
| later on, the accuracy of the estimate of the counterfactual is
| usually fairly good. But when we talk about society-wide or
| planet-wide outcomes, our accuracy is much worse. Even in many
| cases where it seems fairly obvious that an up-front
| intervention mitigated significant harm, we really don't know
| that with a very high level of confidence. There are just too
| many uncontrolled and unmeasured variables.
| pbreit wrote:
| The reverse may also be true: that the "preparedness" truly was
| unnecessary. No one will ever know.
| tunesmith wrote:
| I guess that gets me closer to understanding it, thanks.
| Consider an example where the potential outcome is truly
| unknowable: if we don't prepare, it might happen; if we do
| prepare, we can't tell whether it ever would have. So in
| that sense, the Y2K bug isn't a good example, but perhaps
| preparing for catastrophic low-probability events like "AI
| paper-clip doom" is.
| coldtea wrote:
| > _I don't understand why it's called a paradox. It's just
| people having trouble understanding counterfactuals._
|
| So? Most paradoxes can be described as "people having trouble
| understanding X".
|
| The Liar's paradox is "people having trouble understanding
| meta-statements" (at least according to Russel's theory).
|
| Zeno's Achilles paradox is people not understanding
| convergent infinite series.
|
| The Potato paradox is people not understanding algebra.
|
| The Friendship paradox is people not understanding statistics.
|
| And so on...
| Jtsummers wrote:
| > It's just people having trouble understanding
| counterfactuals.
|
| You've just described most paradoxes. From the definition of
| "paradox":
|
| > a seemingly absurd or self-contradictory statement or
| proposition that when investigated or explained may prove to be
| well founded or true.
| tunesmith wrote:
| How odd, I've never come across that definition of paradox.
| I've always understood it to be purely self-contradictory,
| like: This sentence is false. If I take it to be false, it's
| true; if I take it to be true, it's false. The proper
| understanding is that it actually has no semantic meaning,
| but it certainly doesn't prove to be well-founded or true.
|
| Using "paradox" for something like this concept though is
| along the lines of also using it for the phenomenon of people
| appearing to vote against their self-interest. They keep
| doing it, we don't understand why - it might be that they're
| stupid, it might be that we don't understand enough of their
| perspective, but it just doesn't strike me as a paradox. Not
| unless every phenomenon we don't understand is also a
| paradox. Are software bugs paradoxes?
| Kranar wrote:
| Your notion of paradox is more precisely known as an
| antinomy:
|
| https://en.wikipedia.org/wiki/Antinomy
|
| Yes, all antinomies are paradoxes but not all paradoxes are
| antinomies.
| omnicognate wrote:
| An antinomy that isn't a paradox would be paradoxical
| indeed.
| bee_rider wrote:
| Yeah, this is something that has always bugged me a tiny
| bit. I was more familiar with the idea of a paradox as
| something like your definition -- containing an actual
| contradiction. But it seems to be used instead to describe
| any initially counterintuitive situation.
|
| It is tempting to attribute this to a technical/non-
| technical difference (similar to fallacy, which in non-
| technical discussion has been expanded to basically include
| almost any bad argument). But somehow the Birthday
| "Paradox" has managed to stick in probability.
| omnicognate wrote:
| Paradox isn't synonymous with contradiction. Some
| paradoxes are, or contain, logical contradictions (i.e.
| they effectively say both X and not X are true) but the
| term is much broader.
|
| Some of the earliest paradoxes are Zeno's, and they were
| referred to by that term at the time. For example the
| paradox that an object that moves towards a point must
| first cover half the distance, and then half the
| remaining distance, then half of the remainder, etc.
| Since this is an infinite number of steps, Zeno playfully
| argued that motion is impossible. There's no logical
| contradiction there, just a way of pointing out something
| counterintuitive about reality and maths.
| coldtea wrote:
| > _How odd, I've never come across that definition of
| paradox. I've always understood it to be purely self-
| contradictory, like: This sentence is false._
|
| That's just one kind of paradox in one domain (say, logic).
| There are well-known named paradoxes of several different
| types, belonging to several different domains...
| mrtesthah wrote:
| A core logical fallacy made by anti-vaxxers.
| [deleted]
| [deleted]
| paulpauper wrote:
| A problem is the media hyping things too much. If not for
| media hype, maybe this paradox would not be such a problem,
| or so prevalent. People's expectations are in part formed by
| the media.
| perrygeo wrote:
| I see this all the time in software development. No media hype
| involved.
|
| I worked with a senior engineer who had a brilliant knack for
| finding design flaws in review (usually security or performance
| issues) and would put in heroic efforts to fix them before they
| went to production. Someone privately called him out as an
| obstructionist: "He's constantly worried about BadThing
| happening, but it never does! He's just wasting time." I
| politely corrected them: "Did you ever consider that BadThing
| never happens BECAUSE he's constantly worried about it?"
| Jtsummers wrote:
| Related: https://web.mit.edu/nelsonr/www/Repenning=Sterman_CM
| R_su01_.... "Nobody Ever Gets Credit for Fixing Problems that
| Never Happened" by Repenning and Sterman.
| BitwiseFool wrote:
| I'm getting a 404, is there an archive link elsewhere?
|
| Edit: It's been fixed.
| Jtsummers wrote:
| Fixed the link.
| Swenrekcah wrote:
| Nobody got credit for the site being up, so the sysadmin
| quit.
| jewayne wrote:
| No doubt this existed before the media. Think of when you
| were a kid, and your parents were always making you pick up
| your things, saying people would trip on them. But you knew
| how dumb they were, because nobody ever actually tripped on
| your things... what you didn't realize is that this was
| largely because your parents made you pick them up.
| amin wrote:
| This makes me wonder: how different would the COVID pandemic
| death toll have been if governments hadn't changed anything?
| No travel bans, no lockdowns, etc.
|
| I suspect many people would still have voluntarily worn
| masks, self-isolated, protected their elderly, and taken
| other precautions.
| retrac wrote:
| Both Japan and Sweden were very hesitant to impose legally-
| compelled rules compared to most other developed countries.
| People behaved as you described. Though one can debate whether
| they would have behaved even more so with an order compelling
| them.
|
| In hindsight, I suspect the biggest factor was not whether it
| was compelled, but whether people could _afford it_. (Plenty of
| payments to stay home or keep workers home were still made in
| Japan and Sweden.) If your rent depends on providing black
| market haircuts, you'll still perform them despite the ban.
| And if you're allowed to do haircuts, but the government will
| instead pay you to stay home to avoid the epidemic disease
| going around, maybe you'll just stay home.
| SapporoChris wrote:
| If you could get accurate data on the different strategies
| that countries used, and their results, you could
| extrapolate - with huge error margins, but it would give you
| a general idea. However, many countries did not accurately
| report testing numbers, results, outcomes... well,
| practically everything!
| tuatoru wrote:
| Very different.
|
| It would have spread very rapidly, overwhelming health systems
| utterly. Do you remember the mask shortage early on in the
| pandemic? Do you remember the oxygen shortage recently? Have
| you heard the news about nurses and doctors quitting because of
| burnout? Imagine all of those dialed up to eleven, all at the
| same time. Along with shortages of cleaners, orderlies, and
| basic hospital supplies.
|
| Nearly all the people whose lives have been saved by treatment
| in intensive care units would be dead, and many more besides:
| accident victims, cancer patients, etc., etc.
|
| The sickness could have spread rapidly enough that essential
| services were entirely out of action for long periods of time.
| No water. No power. No air traffic control. No road repairs. No
| trains. No food transport. All of these at the same time, for
| weeks.
| clarge1120 wrote:
| "Y2K was a hoax", is an example of this bias.
| someweirdperson wrote:
| We'll be able to retry the same scenario but without
| preparation in 16 years.
| zoover2020 wrote:
| Why without preparation? We all know the epoch integer will
| overflow on 32-bit systems in 2038.
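|
| A minimal sketch of that overflow (in C; int32_t stands in
| for a 32-bit time_t, and the cast relies on the usual two's-
| complement wraparound):
|
|     #include <stdint.h>
|     #include <stdio.h>
|
|     int main(void) {
|         /* A signed 32-bit count of seconds since the Unix
|            epoch (1970-01-01 UTC) maxes out at 2^31 - 1,
|            i.e. 2038-01-19 03:14:07 UTC. */
|         uint32_t last = INT32_MAX;
|         int32_t next = (int32_t)(last + 1u);
|         printf("one second later: %d\n", next);
|         /* Prints -2147483648: the clock jumps back to
|            1901-12-13. */
|         return 0;
|     }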
| someweirdperson wrote:
| Because Y2K was, it seems, a hoax (see ggp), so why should
| we prepare next time?
|
| Plus, no one in charge of deciding will understand the
| significance of such a weird date.
| charlieyu1 wrote:
| The systems that would have suffered the most are the
| old COBOL systems that used to run the world. They were
| mostly fixed for Y2K.
|
| Taiwan and Japan have their own versions of the Y2K
| problem, in 2011 and 2025 respectively, due to era names.
| Nothing big happened for Taiwan, and I can't see big
| problems coming up in 2025 or 2038.
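|
| The classic two-digit-year failure is easy to reproduce. A
| minimal sketch (in C; the record and the windowing cutoff
| are hypothetical, for illustration):
|
|     #include <stdio.h>
|
|     int main(void) {
|         /* Years stored as two digits, as in much pre-Y2K
|            COBOL: 1965 -> 65, 2000 -> 00. */
|         int born_yy = 65;
|         int now_yy  = 0;
|
|         /* Arithmetic across the century boundary gives a
|            nonsense age: 0 - 65 = -65. */
|         printf("naive age: %d\n", now_yy - born_yy);
|
|         /* A common remediation: "window" the century, here
|            treating 00-29 as 20xx and 30-99 as 19xx. */
|         int born = born_yy < 30 ? 2000 + born_yy
|                                 : 1900 + born_yy;
|         printf("windowed age: %d\n", 2000 - born); /* 35 */
|         return 0;
|     }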
| _int3_ wrote:
| Then why didn't South Korea and Italy suffer Y2K problems?
| They invested little to nothing in Y2K remediation.
| cyberge99 wrote:
| Little reliance on Y2K-impacted platforms?
| _int3_ wrote:
| Like what platforms?
| MattPalmer1086 wrote:
| Y2K wasn't a hoax, but it was exploited and hyped.
|
| There was definitely the possibility of really bad impacts on
| critical infrastructure. If we had all behaved like Italy
| then I think it could have been quite bad.
|
| The majority of Y2K work I saw on the ground was companies
| using it as an excuse to upgrade all their kit. I did some
| assessments and was asked more than once to emphasise the
| risk a bit more.
| _int3_ wrote:
| And Russia, a country with a nuclear arsenal, also did
| nothing for Y2K.
| googlryas wrote:
| I invested little to nothing in tiger remediation, and lo and
| behold, no tigers!
| _int3_ wrote:
| Imagine if you did invest. You would be called a sucker.
| googlryas wrote:
| That really depends on whether you live in Iowa or Sumatra.
| anonporridge wrote:
| I'd love to see a source for that.
|
| This article from 1999 that I found reports that South Korea
| was worried about North Korea's preparedness for Y2K:
| https://www.deseret.com/1999/12/17/19480898/s-korea-
| worried-.... That seems to suggest that South Korea itself
| would have been making sure its own systems were secure.
|
| Is it simply possible that most of their systems were newer
| than those in other countries, so updates weren't necessary?
| b1n wrote:
| The best strategy for countering the preparedness paradox is to
| prepare, while simultaneously telling everyone else not to
| prepare/over-react. Then you get the benefit of preparedness AND
| the proof that it was required. Win-Win(-lose)!
___________________________________________________________________
(page generated 2022-08-15 23:00 UTC)