[HN Gopher] Wide-ranging SolarWinds probe sparks fear in Corpora...
___________________________________________________________________
Wide-ranging SolarWinds probe sparks fear in Corporate America
Author : kordlessagain
Score : 252 points
Date : 2021-09-10 14:20 UTC (8 hours ago)
(HTM) web link (www.reuters.com)
(TXT) w3m dump (www.reuters.com)
| datameta wrote:
| Zero accountability, complete lack of responsibility, and a total
| absence of sufficient security in far too many instances.
    | Something like this has been long, long overdue.
| SV_BubbleTime wrote:
| Yes. But -
|
    | I'm skeptical that a probe will have any more teeth than the
    | censorship-testimony fist-shaking we've seen at Zuckerberg and
    | Dorsey.
|
    | Even if there is, the solution needs to be punitive: if you
    | ship shitty software and didn't follow good practices, you'll
    | be investigated and fined. New frameworks for what
    | constitutes software negligence.
|
| The last thing I would want to see is software regulation,
| oversight of development, Government access. For about a dozen
| reasons each.
| datameta wrote:
| The implementation of anything in the last paragraph would be
| a shit snowball of the finest degree.
| ClumsyPilot wrote:
| Yeah, god forbid if the public has access to source code of
| critical systems that the country relies upon to run
| critical infrastructure like oil pipelines.
|
    | We might lose any respect for people in charge whatsoever.
| SV_BubbleTime wrote:
| Just like all the transparency from all the other gov
| agencies right?
|
| And I think the discussion is about private businesses
| not some forced open source ideal you seem to have
| conjured up.
| ClumsyPilot wrote:
    | Is it your private-business free-market ideal when half
    | the country comes to a standstill because the oil
    | pipeline runs on Windows XP or some shit?
| SV_BubbleTime wrote:
| Are you implying that Windows XP needs security reviews
| now? Or that regulated energy markets make Windows XP?
| You lost me with any sort of relevancy there. As I
    | understand it, the topic is about software mfgs. Did you
    | just want to throw in a free-market attack (while not
    | using a free-market example)?
| AniseAbyss wrote:
    | Interesting. I've always wondered if we could hold software
| developers to the same standards as we do medical
| practitioners.
|
    | But of course, with the extreme shortages, companies will
    | basically hire anyone fresh from college, and the level of
    | responsibility from people in the industry is low.
| SV_BubbleTime wrote:
| I like the idea. But isn't it a common complaint that at a
| company there are very few people there that really have a
    | holistic view? That it would be as hard to bring a
| regulator in to inspect as it would be a new hire?
|
| Compared to engineering where you often see the same things
| from job to job.
| fragmede wrote:
| While PCI is its own bag of worms, part of the
| certification process is to describe the architecture to
| an outside auditor. It's annoying and companies can (and
| will) complain all they want, but without meeting that
    | requirement, the company can't say they're PCI
    | compliant. Which they want to be. So they meet that
| requirement.
| amanaplanacanal wrote:
| This is why real engineers complain about developers
| calling themselves engineers.
| dukeofdoom wrote:
| Corporate America should pay more attention. Growing revolt among
| the proletariat over lockdown and arbitrary technocracy rules.
    | Vaccine mandates will push this even more. It's crazy to think the
| White House got stormed less than a year ago. Who knows how many
| dirty secrets The SolarWinds hack has soaked up, and what will be
| revealed down the line.
| lovich wrote:
| As a point of pedantry, the White House did not get stormed,
| the Capitol building did.
| dukeofdoom wrote:
    | You're right. My mistake. However, it was a little over a year ago
| (June) that BLM protestors breached the fence surrounding the
| White House, and Trump went into a bunker. If the media
| reports were correct.
| kordlessagain wrote:
| It's always possible to mention things that have little to
| do with the actual conversation and yet still have an
| amount of semantic context with things that are
| statistically insignificant.
| tisthetruth wrote:
| Some might find this interesting:
|
| https://vengrams.blogspot.com/2021/09/security-is-layered-pr...
| TechBro8615 wrote:
| Why the SEC? I would expect this to be more the domain of FBI or
| NSA or any of the more "cyber"-related three letter
| manifestations of the executive branch.
|
| Either way, I'm glad government agents are going door to door to
| check on privately owned servers. Maybe they should check
| everyone's vaccination status while they're at it.
| IncRnd wrote:
| "The U.S. Securities and Exchange Commission is a large
| independent agency of the United States federal government,
| created in the aftermath of the Wall Street Crash of 1929. The
| primary purpose of the SEC is to enforce the law against market
| manipulation."
|
| "In addition to the Securities Exchange Act of 1934, which
| created it, the SEC enforces the Securities Act of 1933, the
| Trust Indenture Act of 1939, the Investment Company Act of
| 1940, the Investment Advisers Act of 1940, the Sarbanes-Oxley
| Act of 2002, and other statutes. The SEC was created by Section
| 4 of the Securities Exchange Act of 1934 (now codified as 15
    | U.S.C. § 78d and commonly referred to as the Exchange Act or
| the 1934 Act)." [1]
|
| [1]
| https://en.wikipedia.org/wiki/U.S._Securities_and_Exchange_C...
| csours wrote:
| "Everything is Security Fraud" - Matt Levine (Money Stuff guy)
| dennisnedry wrote:
| Because the SEC regulates publicly traded companies.
| PeterisP wrote:
    | Because apparently the SEC had years ago issued an order that
| breaches must be disclosed if they may have a material impact
| on shareholders (e.g. a potential large lawsuit from customers
| some years afterwards when the full extent becomes known), and
| it's the job of SEC to ensure that company executives don't
| hide company problems from the shareholders.
|
| In essence, if you want to keep your dirty laundry private,
| then you're not allowed to take money from the public stock
| market, as investors (i.e. everyone if you want to be publicly
| traded) deserve to know about any major issues with your
    | private servers. The SEC doesn't care about how poor your
    | security is as long as the company is open about it, but it
    | absolutely cares if a company lies about its (lack of)
    | exposure to its owners.
| a-dub wrote:
| if a public company fails at infosec, and financials or other
| material nonpublic information is stolen and used to trade, then
| yea, it's securities fraud.
| _wldu wrote:
| Good, some of these companies are run by socially connected
| technical morons who hire a bunch of their college buddies as
| 'leaders'.
|
| These people need to be exposed.
|
| Years ago, a guy I know was asked by management to spec out an
| email system that had no limits on the size of file attachments.
| He asked why and was told that 'leadership will have no limits on
| their authority... none whatsoever'.
|
| When he produced the quote, leadership was in shock. The price
| was enormous. They told him they could not afford to spend that
| much money on a mail system, and he said, "Well, I guess there
| will have to be limits then."
| [deleted]
| unemphysbro wrote:
| ah, I didn't realize email attachment size was holding me back.
| throwdecro wrote:
| > 'leadership will have no limits on their authority... none
| whatsoever'
|
| For some reason I envy whoever got to hear that sentence in
| real life. It makes it perfectly clear that you're dealing with
| assclowns.
| Ms-J wrote:
| I have worked for a boss like that but it really wasn't bad.
    | He knew what he wanted and left us alone to do our jobs. No
    | micromanagement from middle managers, which was very nice.
| ronsor wrote:
| > leadership will have no limits on their authority... none
| whatsoever
|
| It will be unfortunate for them to hear that disk space places
| limits on their "authority."
| AlbertCory wrote:
    | I've read all the comments, and as usual, no one's asked "what
    | do _other_ industries do?"
|
| Money-handling, for example (banks, payment systems). If ever
| there was a Fraud Magnet, that's it. I've heard PayPal described
| as "a giant fraud-detection system, wrapped around a tiny money-
| transferring system."
|
| And yet, they don't seem to be in the news all the time like
| "data theft" stories are. Could it be that the legal and
| regulatory and insurance systems have made it a manageable
| problem? Someone steals your credit card, your losses are capped.
| Someone steals your Personally Identifying Information, sorry,
| pal; change your passwords.
|
| So maybe treating PII as the same thing, in every way, as money
| is the answer.
| johnchristopher wrote:
| But GDPR and cookie banners forced me to stop selling my
| startup services in Europe :(. /s
| alexott wrote:
    | I have doubts about PayPal's anti-fraud capabilities. They
    | allowed someone to open another account with the same name and
    | address as mine, but with a different phone and email, without
    | any confirmation. And that person bought something, and after
    | it wasn't paid, gave my information to collectors...
| 1vuio0pswjnm7 wrote:
    | What would be the cap for losses from losing PII?
|
| Credit cards generally have one use: payments. Usage is not
| difficult to quantify. The card is generally worth the same to
    | whoever is in possession of it.
|
| PII has a multitude of uses. The prices offered on the black
| market for PII do not reflect its value to those that it
| identifies or those from whom it was stolen.
| acdha wrote:
| One useful move would be changing laws around identity theft so
| companies are liable for any costs incurred from their failure
| to verify identity, or for reporting credit issues from
| unvalidated activity. Americans worry about things like SSNs
| getting breached because they don't want to get someone else's
| bill -- if companies were required to check photo ID against a
| real person (not an uploaded photo) that'd be a much harder
| crime to make financially viable.
| [deleted]
| PeterisP wrote:
| Indeed, it is ridiculous that "identity theft" places a
| burden on the person whose identity was used - if someone
| opens an account in my name and the only "evidence" is having
| provided something that other people (e.g. my mother or
| spouse) can know, then in any dispute it should be illegal
| for that fraud/debt to appear on my credit report.
|
    | That's how most of the world has mostly solved identity
    | theft. However, it's not that easy to implement in the
    | USA, because there's no system of universal secure IDs there
| (by design) - there's a multitude of ID forms, some of them
    | are not really secure (easy to forge, no way to verify that
    | they were really issued by the claimed institution, no easy
| process to quickly verify online if the provided credential
| has been lost/stolen/revoked, etc), and there's a
| sufficiently large minority of potential customers who don't
| have a valid ID.
|
| It would be helpful to have laws that clearly assign the
| credit fraud risk fully onto the defrauded companies instead
| of the people whose identities were used, as experience shows
| that this would rapidly result in improvements to fraud
| elimination (there's all kinds of measures that simply are
| not taken since they add friction), however, a proper
| solution does require a decent state-run identity system as
| the foundation of trust, and USA has made a political
| decision to not have one.
| ezoe wrote:
| >appear on my credit report
|
    | The root of the problem is sharing private information.
    | Why are your credit reports shared among completely
    | different entities? Nobody wants to give them consent to
    | share their private information.
|
| > system of universal secure IDs
|
    | Actually, it's the opposite. The US has a universal ID (not a
    | secure one, though). That's the problem. If there exists one
    | idiot who doesn't verify your identity, everything fails in a
    | chain reaction, because everybody else believes the idiot.
| gknoy wrote:
| > US has universal ID(not secure though)
|
| Do we? Our SSN is not a unique number, and not just
| because the keyspace is too small for our population.
| (It's worse: some of the prefixes are geographically
| related.)
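A back-of-the-envelope check on that keyspace point, assuming the standard SSN format rules (area 001-899 with 666 excluded, group 01-99, serial 0001-9999) and a rough 2021 population figure:

```python
# Count the assignable SSNs under the standard format rules:
# area 001-899 (666 excluded), group 01-99, serial 0001-9999.
areas = sum(1 for a in range(1, 900) if a != 666)  # 898 valid area numbers
groups = 99      # 01-99 (00 is invalid)
serials = 9999   # 0001-9999 (0000 is invalid)

keyspace = areas * groups * serials
print(keyspace)  # 888931098 possible SSNs

us_population = 333_000_000  # rough 2021 estimate (assumption)
print(round(keyspace / us_population, 1))  # about 2.7 numbers per living person
```

So the space is larger than the living population, but every number ever issued, including to the deceased, stays retired, which is what makes it feel tight.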
| acdha wrote:
| Oh, trust me, I know that this is a self-inflicted problem
| -- we have too many people who subscribe to conspiracy
| theories about things like the "mark of the beast". It's
| just somewhat impressive to see how effectively companies
| created a new category of crime to direct attention away
| from their negligence.
| californical wrote:
| See I do understand the distrust of the state with the
| ability to cut people off from society, by revoking an id
| for example. Especially if there are laws around the ID
| checks being mandatory (which I am generally against).
|
| But I think this is mitigated as long as it's optional
| for a company. The company is held liable for any fraud
| that they allow. The company has the _option_ to use the
| government ID to prevent fraud, but they can also assume
| more risk and take on a customer without the "official"
| gov ID, if they want to.
|
| I can see this resulting in something like creditors
| saying: "either you can use a govID to sign up for this
| credit card, like normal. OR you can send us a $10k
| deposit and forego the govID entirely, if you like."
|
| This solution makes it so that companies are held more
| responsible, but decreases the risk of having more
| government power by making it a decision for the
| company's "risk management team" to decide.
| acdha wrote:
| > See I do understand the distrust of the state with the
| ability to cut people off from society, by revoking an id
| for example. Especially if there are laws around the ID
| checks being mandatory (which I am generally against).
|
| How does that not already happen, just inefficiently?
| It's hard to function in the U.S. if you don't have a
| Social Security Number -- that's why people bother using
| someone else's -- and we already have a de facto ID
| system for most people but it's a patchwork at the state
| level which was somewhat federalized with RealID.
|
| It's hard to imagine an environment where people would
| unjustly be "cut off" where the state level system would
| prevent abuse which would otherwise happen -- it's not
| like, for example, California stopped politically-
| motivated DHS activity during the Trump era.
| ethbr0 wrote:
| > _theories about things like the "mark of the beast"_
|
| My hope is their anti-vax research eventually leads them
| to learning about DNA.
| TeMPOraL wrote:
| The problem isn't with a person having a UUID of some
| sort (of which their genome is one). The issue is that
    | the Book of Revelation talks about a Mark people will
| need to have stamped on their arm and/or forehead in
| order to be able to conduct business. I.e. it's a problem
| of allegiance, not authentication.
|
| So, in practice, anything that pattern-matches to "people
| will need to carry some sort of token given by a big
| organization (private or public) to pay or be paid for
| goods and services" will be viewed by some as the Mark,
| or a slippery slope towards the Mark.
| ethbr0 wrote:
| Last I checked, the anonymous web was effectively dead.
| Or do these people not use the internet either?
| hutzlibu wrote:
    | If it were true research, sure. But it is likely
    | looking for anything that seems to support the
    | theory and ignoring everything else.
| gedy wrote:
| To be fair, all the anti vaccination people I know are
| solidly liberal, non-religious types who believe in
| "natural" medicine, etc.
| gnufx wrote:
| I don't remember where, but Ross Anderson said something
| like "It's not 'identity theft', it's personation.".
| whyrelevant wrote:
| Identity theft or bank robbery?
| https://youtu.be/CS9ptA3Ya9E
| gnufx wrote:
| Yes, a fine explanation from Mitchell and Webb, worth
| keeping in mind.
| nitrogen wrote:
| _if companies were required to check photo ID against a real
| person_
|
| It'll be really hard to convince people to give up the
| convenience and higher returns of online-only banks.
|
| A better option would be using cryptographic digital
| signatures by an HSM (smart card) to verify ID for financial
| services.
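A minimal sketch of the challenge-response flow such a scheme implies. HMAC from Python's stdlib stands in here for the card's asymmetric signature; in a real deployment the private key would live inside the smart card's secure element and never leave it, and all names below are illustrative:

```python
import hashlib
import hmac
import secrets

# Illustrative stand-in: in reality this key sits inside the HSM/smart
# card and is never exported; the bank would hold only a public key.
CARD_KEY = secrets.token_bytes(32)

def card_sign(challenge: bytes) -> bytes:
    """The card signs a fresh challenge issued by the bank."""
    return hmac.new(CARD_KEY, challenge, hashlib.sha256).digest()

def bank_verify(challenge: bytes, signature: bytes) -> bool:
    """The bank checks the signature against the challenge it issued."""
    expected = hmac.new(CARD_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# A fresh random challenge defeats replay: an intercepted signature is
# useless for any later login attempt.
challenge = secrets.token_bytes(16)
signature = card_sign(challenge)
assert bank_verify(challenge, signature)
assert not bank_verify(secrets.token_bytes(16), signature)
```

With real asymmetric keys (e.g. an Ed25519 keypair on the card), the bank never holds a secret at all, which is the point of the HSM approach.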
| alexott wrote:
| Video ID verification works quite well in many cases.
| motohagiography wrote:
    | Can't help but ask: as a security pro, what would the
    | consequences be if we just let it burn?
| nimbius wrote:
| as of April 2021 Solarwinds still shows up in Gartner reports
| read by managers and 'thought leaders.' until they start losing
| prestige in the trade rags you can expect them to endure as a
| corporate standard, best practice, industry standard, and
| "enterprise grade" solution regardless of what common sense and
| competent system administrators at your company say.
| h2odragon wrote:
| At some point, "corporate america" decided that willful ignorance
| was better than making an effort, possibly failing, and possibly
| being held liable for that failure.
|
    | It's annoying that there are laws for _people_, then there are
    | laws that apply to _some_ corporations, but not always and not
    | all of them.
|
| "Maintaining an attractive nuisance" is what they tell people
    | with unfenced junkyards, right? Why couldn't that apply to some
    | of these folks aggregating data about _our kinks_ and
    | "unwittingly" displaying the results to the world?
| [deleted]
| ryanmarsh wrote:
| That's how most companies do most everything. If they get big
| then they've figured out a systematic way to win at one or more
| games in business. Everything else is just enough of a shit
| show to get by.
|
| As with all aspects of modern business operations "how to do it
| right" has been crowed about for decades by experts who care.
| It's just that nothing matters until it matters, such as waste
| disposal, workers rights, product safety, etc...
|
| If you show me the incentives I'll show you the behavior. The
| only way we will ever get data security to matter more than
| theater and "check the box" is for the obvious to happen (bad
| consequences).
|
| We don't have a Ralph Nader.
|
| This is why I'm against responsible disclosure, accepting below
| market payouts on bug bounties, and generally treating
    | companies with any modicum of trust. Not until it hurts so bad
    | that people are on the steps of the Capitol building baying for
    | the blood of CIOs will we see meaningful change.
| Cd00d wrote:
| oh, wow. I _already_ thought the job of CIO was
| overwhelmingly stressful! I think the job description is: try
| to create some guardrails but worry constantly about events
| way outside your control ruining everything.
| ryanmarsh wrote:
| No it's primarily vendor management (according to the CIOs
| I've interviewed).
|
| When you have a network security department unable to
| articulate its policies, which relies on vendors for
| _everything_ including expertise, you damn well should
| worry.
| ddoolin wrote:
    | There are really just laws for some people, and not always and
    | not all of them, too.
| SavantIdiot wrote:
| More like they wanted to avoid mob panic. If the corporation
| was hacked and kept it on the DL, but boosted security posture
| in response, is that a bad thing? If they were hacked and did
| nothing, well, screw them. Perhaps the SEC should couch the
| expectations with a bit of reassurance.
| IncRnd wrote:
| > If the corporation was hacked and kept it on the DL, but
| boosted security posture in response, is that a bad thing?
|
    | A public corporation has a legal and fiduciary duty to its
    | owners that precludes hiding what happened.
| toomuchtodo wrote:
| > If the corporation was hacked and kept it on the DL, but
| boosted security posture in response, is that a bad thing?
|
| If it's a public company, it's securities fraud. IMHO,
| securities law is the most effective tool at the moment in
| encouraging improved security engineering, best practices,
| and posture.
|
| https://www.sec.gov/news/press-release/2021-154
|
| ""As the order finds, Pearson opted not to disclose this
| breach to investors until it was contacted by the media, and
| even then Pearson understated the nature and scope of the
| incident, and overstated the company's data protections,"
| said Kristina Littman, Chief of the SEC Enforcement
| Division's Cyber Unit. "As public companies face the growing
| threat of cyber intrusions, they must provide accurate
| information to investors about material cyber incidents."
|
| The SEC's order found that Pearson violated Sections 17(a)(2)
| and 17(a)(3) of the Securities Act of 1933 and Section 13(a)
| of the Exchange Act of 1934 and Rules 12b-20, 13a-15(a), and
| 13a-16 thereunder. Without admitting or denying the SEC's
| findings, Pearson agreed to cease and desist from committing
| violations of these provisions and to pay a $1 million civil
| penalty."
| bink wrote:
| If what's being disclosed is of the nature of the Pearson
| hack (theft of student records) then great. But there are
| probably thousands of hacks that don't result in the
| disclosure of PII or other confidential information.
|
| I can understand companies being worried that a compromise
| of a test system with no access to sensitive data -- which
| they normally wouldn't be required to disclose -- could
| make them look bad. But at the same time they're all being
| required to disclose this info so at least there's safety
| in numbers.
| toomuchtodo wrote:
| I agree that a breach of a test system with no access to
| sensitive information or digital property (information,
| source code, binaries, etc) and no ability to pivot from
| said test system to other systems should not require
| reporting. To pick an example, that's not what's
| happening with S3 buckets and Mongo instances (where vast
| amounts of personal and or sensitive information is being
| leaked). That's not what happened with Equifax, T-Mobile,
| Solarwinds, Colonial Pipeline, Pearson, CNA Insurance,
| etc. You have to hold the feet of these businesses to the
| fire, and if they don't perform, dissolve them after
| repeated regulatory failures (just as Arthur Andersen had
| happen after Enron's failure, or FDIC would part out a
| bank after insolvency).
|
| https://www.reuters.com/technology/hackers-
| demand-70-million... (July 2021: Up to 1,500 businesses
| affected by ransomware attack, U.S. firm's CEO says)
|
| (disclosure: infosec practitioner)
| elliekelly wrote:
| I think _how_ and _why_ the breach occurred matters more
| than _what_ information was accessed. In asset
| management, for example, when you're dealing with an
| error you don't just look at the dollar amount. Maybe the
| error only cost a couple thousand dollars today (or maybe
| it even _made_ money!) but the exact same error on
    | another trading day could just as easily have been ten, or a
| hundred, or even a thousand times more costly. That the
| error happened at all is the material event. And that's
    | why there's no such thing as a de minimis trading error.
| Sometimes you just get lucky in the magnitude of the
| impact. Even if it didn't cost you anything you still
| need to address the weak point that allowed the error to
| happen in the first place.
|
| So even if a system with absolutely no information was
| breached if your other system(s) use(s) the same or
| similar security then it doesn't really matter that
    | nothing was taken. The breach could still be material (and
| require disclosure) because it's exposed a material
| security vulnerability.
| toomuchtodo wrote:
| Lots of nuance that can't fit into a single thread.
| ClumsyPilot wrote:
| "If it's a public company, it's securities fraud. IMHO,
| securities law is the most effective tool at the moment in
| encouraging improved security engineering, best practices,
| and posture."
|
| That's just a fucking sad state of affairs. Apparently they
| owe nothing to their customers.
| markus_zhang wrote:
| The Law embodies the will of the ruling class. This is what we
| were taught back in school.
| lifeisstillgood wrote:
| - Offer amnesty / limited liability / zero liability for losses
| following breaches
|
    | - Require full disclosure to national data registrars following
    | breaches from now on.
    |
    | - Make source of income as big a deal as KYC.
    |
    | - Make KYC a "walk into the branch with some photo ID". How many
    | of us really need to borrow thousands without going into a store
    | or bank?
| Ms-J wrote:
| I'm very confused about your last bullet point. I don't use
| physical banks for my finances and do all my finances online
| (like many people). I regularly move thousands of Euros and
| don't see why I would need a physical bank. That would severely
| impact a lot of people.
| rfd4sgmk8u wrote:
    | People are worried about the wrong stuff. SolarWinds was bad,
    | but it was likely an intel operation. They wanted access to
    | networks for intelligence purposes. They jacked it so they could
    | access assets behind corp firewalls. Spies will always try to
    | spy.
|
| IMHO the Kaseya hack was far worse, maybe worse than WannaCry but
| with better outcomes. This was a criminal operation, provided by
| criminal software suppliers that really was only resolved when
| the keys were leaked on a forum.
|
    | The rumor is that local intelligence forced the disclosure of
    | the keys (e.g., guns to heads), because this is pretty much the
    | destroy-the-world scenario that is unstoppable. It is easy for
    | attackers to cause billions of dollars of damage in a day.
|
    | It's not getting better. It can't. Our systems are designed for
    | a large scope of trust with massive surface areas. Security is a
    | game where the defenders cannot mess up even once. It's
    | hopelessly asymmetric and can never be better.
| ep103 wrote:
    | We can start by repealing laws that give corporate entities
| immunity when data is leaked. Make them liable for lawsuits
| with set minimum damage amounts for exposed data, and one would
| be able to watch the money flow into better tech security on a
| society-wide scale.
| AmericanChopper wrote:
| What laws do you imagine grant corporate entities immunity?
| The companies are victims of the crime in this case, along
| with their customers. There is no special law that grants
| corporations criminal immunity from falling victim to a
| crime, because that's not illegal.
|
| If you look at how this sort of thing is regulated, there's
| two general approaches. The first is creating a category of
| data that requires special protections, and defining a
| standard for protecting it. Either through legislation (like
| HIPAA), or self-regulation (like PCI). The other is to
| specify a requirement to protect all PII, but not define any
| specific standard for protecting it, only prescribing
    | penalties for failing to do so (which seems to be the EU's
    | regulatory approach).
|
| Both of these approaches are problematic.
|
| Is it self-evident that any breached data was not
| sufficiently protected? I don't think any experienced
| professional would agree. It is impossible to build a system
| that is completely protected from being potentially
| compromised, and it's possible for a largely unprotected
| system to last its entire lifespan without being compromised.
| So the simple fact that a system has been compromised doesn't
| necessarily reveal any information about how adequately
| protected it was.
|
| On the other hand, is there a single security standard that's
| widely regarded as being good? I don't think there is. The
| ones that are generally regarded as the best I would
    | personally consider not bad, but not great. One-size-fits-all
    | solutions also tend to run into a lot of use cases they are
    | not fit for.
|
| It's also not apparent to me at all that spending more money
| on security achieves better security outcomes. I've worked in
| numerous large enterprises that spend enormous sums of money
| on security budgets, and manage to achieve very little with
| it. So I don't think you're going to get much consensus on
| that being a suitable metric for how adequate a company's
| security systems are either.
|
| You could easily devise a system that punishes companies for
    | falling victim to these attacks. But that's the only outcome
| it's going to achieve. A punishment for being the victim of a
| crime.
| ClumsyPilot wrote:
| We can also add a prohibition on three-letter agencies
| installing purposeful backdoors which are later exploited by
    | criminals. Maybe it's time they actually helped regular
    | citizens protect themselves and their privacy, instead of
| playing chicken with their counterparts abroad.
| willcipriano wrote:
| Simpler and more effective solution: Do as JFK suggested
| and "splinter the CIA into a thousand pieces and scatter it
| into the winds".
| acdha wrote:
| I support that but ... how often has that happened? That
| Juniper incident didn't seem to be widespread and it
| certainly doesn't appear that a notable percentage of
    | breaches are due to that kind of thing.
| fragmede wrote:
    | Except we don't know about most of the hacks going on, so we
    | definitely don't know _how_ they happened. E.g., we'll
    | never know how many hacks were due to Debian's SSH fiasco,
    | but I bet it's far from zero.
| acdha wrote:
| We don't know everything but think about how many we do
| get details about showing nothing of the sort. It seems
| conspiratorial to assume that this happens often but is
| always hushed up.
| sherr wrote:
| Thanks for the Kaseya reminder - it had vanished from my
| memory. For a period, these attacks seemed to be coming thick
    | and fast. According to Wikipedia [1]:
|
| 9 July 2021 - phone call between Joe Biden and Vladimir Putin.
| ... Biden later added that the United States would take the
| group's servers down if Putin did not
|
| 13 July 2021 - REvil websites and other infrastructure vanished
| from the internet
|
| 23 July, Kaseya announced it had received a universal decryptor
| tool
|
| I'd love to read the real story behind that. Perhaps "guns to
| heads" did happen.
|
| [1] https://en.wikipedia.org/wiki/Kaseya_VSA_ransomware_attack
| kordlessagain wrote:
| This is what happens when you delete Hoover Beaver.
| csbartus wrote:
| It's time to fix software security. And it's gonna be hard.
|
| First, there is no unbreakable software. Second, software is
    | written by average people vs. above-the-average people who are
| hacking it. Mission impossible.
| only_as_i_fall wrote:
| Is there evidence that the average hacker is smarter than the
| average developer? I would expect the opposite to be true
| because legitimate work seems more profitable/stable, but also
    | I'd imagine the difference isn't that high either way.
| IncRnd wrote:
| The parent never used the word "smarter". By definition, the
| average developer is developing applications, but the non-
| average developer is doing something else, possibly hacking.
| Hacking is not the average activity (the way that word is
| used today).
|
| With regards to skill sets, I have repeatedly found that
| people who engage in hacking range from skill sets of
| "knowing how to use a hacking kit" to "uber developer with
| security knowledge". There is a wide range of skills and
| knowledge.
|
| However, it is practically an entry requirement for someone
| in the security space to view software differently than most
| programmers. That is defined as non-average.
| fragmede wrote:
| They didn't say smarter but they did say above-the-average
| which implies better (as opposed to worse), rather than it
| being a different skill set. That is to say, I know
| _exceptional_ "hackers" who can't code their way out of a
    | paper bag, or build any sort of GUI. Similarly, I know some
| really good programmers who don't intimately understand how
| computers work a tenth as well as hackers do. There are
| genuinely smart people in both camps, but they're different
| skill sets.
| csbartus wrote:
| I remember that after finishing our CS studies we were taken
| by the Army to take a day-long test. We were warned we'd
| better fail the test unless we were willing to enlist.
| However, this might be an isolated case.
|
| In turn, I guess security professionals are scarcer than
| average developers. The question is whether all security
| professionals are hired to strengthen systems, or whether
| some are hired to break them.
| fragmede wrote:
| Yes, large engagements frequently include a "red team"
| whose job it is to try to break into the system.
| csours wrote:
| I used to think this way, but it can be really dangerous to
| assume a level of intelligence from background information.
|
| More to the point, hackers can be very motivated to break
| things in a way that the average developer is not motivated
| to secure them.
| [deleted]
| dennisnedry wrote:
| Of course not, this is just the parent poster's opinion. The
| truth of the matter is that there are exceptional individuals
| who decide to get into software development and software
| security. The problem with software is that companies often
| don't invest in securing their software, and that has to
| become a priority. Perhaps having the SEC levy fines for not
| securing mission-critical software is the first step?
| datameta wrote:
| I think the incentives are lopsided. The developer does not
| personally bear the blow of their company's data breach
| (unless they're dedicated cybersec personnel) whereas the
| hacker reaps all the reward of getting access.
| datameta wrote:
| Maybe the cost equation becomes more evident to companies:
|
| dedicated above-average internal* cybersec staff < (SEC fines +
| outcry when breach goes public)
|
| * external seems like a different can of worms. perhaps someone
| in cybersec can refute/expand
| gitanovic wrote:
| Sorry, this is not true.
|
| The real issue is that software accumulates bugs from the sum
| of all contributions to it, and all it takes is finding one.
|
| What I mean is that it takes just one sloppy developer to
| introduce a bug, and that's all an attacker needs.
|
| Making unbreakable software is a much harder task than breaking
| it.
|
| It's not about who's smarter, it's about what's easier.
| csbartus wrote:
| I still believe it's about who does what. Code written by an
| average developer is breakable by a better-skilled developer.
| The reverse is not true.
| aledalgrande wrote:
| More than avg vs above avg, I would say it's building a house
| of cards vs making a house of cards fall. The latter is way
| easier.
| datameta wrote:
| And as a house-of-cards toppler, you only have to find the
| most unstable one in a group. Perhaps to many companies it
| seems like a revenue sink to implement proper security. It
| Probably Won't Happen To Us(tm) and so forth. Maybe it would
| seem like a more concrete return on spend if a company framed
| the goal as being at least as fit as the average of the herd.
| adrianmonk wrote:
| There are really two problems that could go under the name of
| "fixing software security":
|
| (1) How do you improve the state of the art, so that, if a
| company is serious about security, they can succeed?
|
| (2) How do you fix the way companies are run so that they
| actually even try to take security seriously?
|
| Both are big contributors to the overall problem.
|
| I do think there is room for improvement in #1, so it's
| something we should be looking at. But we could get a lot of
| mileage out of #2 even if there were no way to move the needle
| on #1.
| mikewarot wrote:
| The age old advice is "don't talk to the police"... I imagine
| that goes wayyyyy more for talking to the SEC. Of course they're
| fearful.
| ndespres wrote:
| I have noticed "don't talk to the police" being repeated often
| around here lately, with links to the YouTube video. While it
| is probably good advice for your general day-to-day encounters
| with police, I don't think it is great advice for executives
| of a corporation dealing with the SEC.
| maeln wrote:
| I think the whole sentence should be "don't talk to the
| police without a lawyer / let the lawyer speak for you". And
| in the case of an executive talking to the SEC, you should
| absolutely have a lawyer, or multiple lawyers, with you.
| Ms-J wrote:
| Actually, not talking to or interacting with the police
| sounds like the most rational advice for Americans right
| now.
| jdavis703 wrote:
| I was audited by the "tax police" at the IRS. We had a
| constructive conversation, I fixed the problem, paid some
| more taxes and was done. I don't think this advice applies to
| all government investigators.
| luckylion wrote:
| You can always be lucky. I was audited a few years ago,
| they didn't find any issues. So, just for fun, they added a
| special audit 3 months later. Needless to say they didn't
| find anything that time either.
|
| Don't talk to the police or the IRS. They are never aligned
| with your interests, and whether you get a reasonable
| person or someone who just loves to ruin your day is
| random.
| only_as_i_fall wrote:
| I've never been audited, but isn't it more a case of "you
| must talk to the IRS or they'll simply collect what they
| think you owe and leave you no recourse"?
| jdavis703 wrote:
| Let me be precise: I was audited because I forgot a 1099
| form. It was a stupid mistake on my side. If you're being
| audited just so they can root around and find problems,
| then yeah, be careful.
| toss1 wrote:
| Yes, multiple people that I've known have been audited by
| the IRS. All were small business owners who did their own
| taxes (vs having them done & submitted by a CPA). At the
| end of all the audits, the IRS ended up writing a check to
| the businesses. (If your overlooked deductions exceed the
| overlooked taxable items, they must make the adjustments in
| your favor.)
|
| One good friend got audited five years in a row; maybe the
| local bureau chief was just _sure_ he was up to something.
| The last time they were writing a check to him, it was
| going to be for less than $2, and the agent asked if they
| really wanted it -- "Of course I damn well want you to
| write that check!".
|
| I've had a career mostly in small businesses, and have always
| had a CPA do my taxes, with never an audit. I strongly
| suspect it not only gets me proper deductions I'd likely
| miss, but also counts for a lot in avoiding an audit, since
| the CPA is also putting their license on the line by signing
| the return. I'd recommend the practice; just find a good one
| who charges a flat rate (they do exist, it just takes some
| looking).
| Accujack wrote:
| You'd be surprised. At my employer, I'm told before every
| meeting with internal and external auditors to not offer
| unrelated information to them.
|
| Basically, give them only what they ask for and exactly what
| they ask for.
|
| If no one sees something, it doesn't exist, right?
| jszymborski wrote:
| I'm wondering if anyone here on HN actually knows if any of this
| has teeth and if corpos are legitimately worried or if it's more
| of a feint. I'm generally skeptical of SEC enforcement.
| jeffwask wrote:
| Good. Security will only be a priority when it's more expensive
| than profit.
| ChrisLomont wrote:
| And business cannot operate at a loss, so increased expenses
| will be passed on to customers. Yay.... right?
|
| If we can make security lapse expenses higher and higher we can
| all pay more and more until all products are completely secure
| but no products remain....
| ClumsyPilot wrote:
| Yeah, who needs those aircraft safety regulation, where is my
| Boeing Max-Max with 50% chance of taking a swim mid-flight?
| ChrisLomont wrote:
| The vast majority of software security issues don't kill
| people. Trying to price them higher than current levels
| will add cost to goods, no?
| balabaster wrote:
| Or when security actually becomes profitable in itself.
| cronix wrote:
| Security is profitable. Very profitable. It's likely one of
| the reasons a lot of companies avoid it... it's very
| expensive, and most don't see it as adding to the bottom
| line (because it's largely invisible) but as taking from it,
| until something major happens.
| A4ET8a8uTh0 wrote:
| I will only add:
|
| About god damn fucking time.
| [deleted]
| whitepaint wrote:
| That's why dapps on Ethereum and the likes are and will be way
| better than any alternative.
| 1970-01-01 wrote:
| That's part of the definition of good security engineering.
| Protect stuff up to its value, and never spend more money than
| what is needed to rebuild it from scratch.
| joe_the_user wrote:
| _Protect stuff up to its value, and never spend more money
| than what is needed to rebuild it from scratch._
|
| Oh, but the problem appears when you're holding other
| people's information. "_Your SSN ain't worth much to me;
| sorry, keeping that pipeline open only matters X much to our
| bottom line,_" etc.
| AlbertCory wrote:
| Thanks, yes. Like I said in my other comment: if you keep
| other people's _money_ there are laws and rules that apply
| to you. You may not be negligent with it. The phrase
| "fiduciary duty" comes to mind.
|
| Yet somehow, keeping their PII imposes almost no
| obligations on you at all.
| cortesoft wrote:
| Capping spending on security at the cost of rebuilding from
| scratch implies that total loss is the worst thing that can
| happen from a security breach. That isn't true. A security
| breach could be more costly than a total loss.
| wutbrodo wrote:
| I think that for-profit organizations actually mesh quite
| perfectly with the "security economics" perspective. Ie, they
| care about security to the extent that they see it affecting
| their own utility function. In ideal circumstances, negative
| externalities like the impact on the breached users flow back
| into the company's incentives via bad PR. The problem is that
| there's a shortcut: it's inherently easy to hide security
| breaches, given that the security domain already involves a
| baseline level of opacity (as opposed to, say, product or
| pricing decisions). As it often is, the approach here should
| be to reconnect this feedback loop, by regulating and
| vigorously enforcing penalties for failing to disclose
| breaches. Suddenly, the "value" of security from the
| perspective of the organization drops precipitously. To make
| matters worse, hiding security breaches causes collateral
| damage by making mitigation by its victims harder (if no one
| tells me my SSN was leaked, I won't (eg) freeze my credit
| report).
|
| The answer, as it often is, is for regulatory pressure and
| robust enforcement to connect the externality's consequences
| back to the agent. The easiest step is by requiring
| disclosure of breaches. As such, the news in this article
| seems like it should be unequivocally celebrated.
| daveslash wrote:
| Yep. This. Couldn't agree more. I went to a BSides talk years
| ago titled _" Does DoD Level Security Apply to the Real
| World?"_ ~ In summary, Yes.
|
| The premise of the talk, as I understood it, was that too
| many small operations or "mom and pop" shops think that they
| do not need "Department of Defense" level security, because
| they're a small general store, not Fort Knox. That's a
| misconception. "DoD Level Security" doesn't mean that you
| protect your place like the NOC list in Mission Impossible;
| it means that you are proactive in thinking about your threat
| model and assessing the value of your assets. If, after
| proactively _thinking it through_, you're _still_
| comfortable with just a cheap padlock and no alarm system,
| then you've applied "DoD Level Security" (or something like
| it).
| mupuff1234 wrote:
| The problem with the statement is that the value function
| might be quite different for the company vs the impacted
| user.
| leptoniscool wrote:
| For sensitive software like this, we should make it open
| source so that more eyeballs are looking at it.
| tisthetruth wrote:
| Does this qualify as something one can submit a tip about under
| the SEC.gov Whistleblower program?
___________________________________________________________________
(page generated 2021-09-10 23:00 UTC)