[HN Gopher] It's not just CrowdStrike - the cyber sector is vuln...
       ___________________________________________________________________
        
       It's not just CrowdStrike - the cyber sector is vulnerable
        
       Author : jmsflknr
       Score  : 95 points
       Date   : 2024-07-19 16:02 UTC (7 hours ago)
        
 (HTM) web link (www.ft.com)
 (TXT) w3m dump (www.ft.com)
        
       | udev4096 wrote:
       | https://archive.ph/sKcp3
        
       | cs702 wrote:
       | For most corporations, security and robustness are -- and for a
       | long time have been -- an afterthought.
       | 
        | Making systems _hard to hack_ and _robust to rare events_:
       | 
       | * is really hard,
       | 
       | * costs a lot of money, and
       | 
       | * reduces earnings in the short term.
       | 
       | Faced with these inconvenient facts, many executives who want to
       | see stock prices go up prioritize... other things.
        
         | hypeatei wrote:
         | To them, they _are_ thinking about it though. They installed
         | this ultra secure thing called CrowdStrike that checked a
         | regulatory box for cybersecurity
        
           | cs702 wrote:
           | CrowdStrike is a textbook example of a single point of
           | failure:
           | 
           | https://en.wikipedia.org/wiki/Single_point_of_failure
        
         | prasadjoglekar wrote:
         | To be fair, most corporations signed up for Crowdstrike as a
         | way to address some issues. I'm sure it wasn't cheap and CS was
         | probably better at security than an IT admin at a 50 person
         | shop.
        
           | cs702 wrote:
           | Yeah, it wasn't cheap.
           | 
           | It's still not enough.
        
           | Damogran6 wrote:
           | Much like the RSA attack, now we get to see how Crowdstrike
           | handles damage control.
        
           | croes wrote:
           | But what's worse, hundreds of maybe insecure companies or
           | creating a big single point of failure?
        
             | fire_lake wrote:
             | Globally or locally?
             | 
             | To each individual company, it's better to have the big
             | single point of failure. That's the problem.
        
         | lotsofpulp wrote:
         | I feel like I would be doing everything in my power to de-
         | Windows my operation.
        
       | monero-xmr wrote:
       | The real financial problem is that cybersecurity is mostly box
       | checking. It's an industry that is open to commoditization, as
       | startups in lower-cost global regions manage to check the box as
       | well as the next-most-expensive region, and cost conscious
       | companies keep migrating. But the power of the box checking is
       | strong.
       | 
       | I do not invest in cybersecurity companies, it is very risky IMO
        
         | mrkramer wrote:
         | The problem with cybersecurity is that there are hundreds of
         | attack vectors; you can get pwned by supply chain attack or by
         | some random zero-day exploit or by an insider....It is
          | literally impossible to 100% prevent a breach of your computer
          | network.
        
           | lucianbr wrote:
            | It _is_ impossible to write bug-free/exploit-free code.
           | 
           | But some companies are using this as an excuse to not care
           | about the chances of an exploit at all, and just write code
           | in a cheaper way.
           | 
           | We need a middle ground, where there is at least a reasonable
           | effort towards security.
        
             | StillBored wrote:
             | "It is impossible to write bug-free/exploit-free code."
             | 
             | Right, and this should be the single deciding factor for
             | most system programming and core infrastructure
             | development. One doesn't throw away 20+-year-old battle-
             | tested code simply because it's grown ugly bug fixes for
             | edge conditions no one wants to worry about. The idea that
             | it's possible to throw away, say 30-year-old font rendering
             | code and replace it without revisiting a lot of the
             | problems along the way is peak hubris.
             | 
             | And the same goes for choosing and building internal IT
             | systems, KISS should rule those choices because each layer
             | adds additional code, additional updating, etc. Monolithic
             | general-purpose software is not only a waste of resources
             | (having software that 9/10th is just taking up
             | disk/memory/cache space because only 10% of its features
             | are used), but it's a maintenance and security nightmare.
             | 
             | This is the problem with much of the open-source world,
             | too. Having 20 different Linux filesystem drivers or
             | whatever is just adding code that will contain bugs,
             | exploits, and a monthly kernel update containing 80 KLOC of
             | changes is just asking for problems. Faster processes,
             | updates, and development velocity in projects that were
             | "solved" decades ago are just a playground for bad actors.
             | 
             | So, to go back to Andrew Tanenbaum and many others, no one
             | in their right mind should be writing or using OSs and
             | software that aren't built from first principles with
             | clearly defined modularity and security boundaries. A disk
             | driver update should be 100% separate and compatible with
             | not just the latest OS kernel but ones from 10+ years ago.
             | A database update shouldn't require the latest version of
             | python "just because".
             | 
             | Most software is garbage quality written by a bunch of
             | people who are all convinced they are better than their
             | peers. And yet another code review, or CI loop, isn't going
             | to solve this, although it might stop a maintainer from
             | throwing poorly tested code over the fence instead of
             | subjecting it to the same levels of scrutiny they give 3rd
             | party contributors.
        
           | Ekaros wrote:
           | Also very often software quality is absolute trash... With so
          | many issues, developers seem to spend no time thinking about
          | the most basic things... Like applying access control on
          | reading/editing data, or which fields a request should be
          | allowed to update and which not...
           | 
           | And these parts are the simple ones. Not even talking about
           | operating systems, networking and so on... If even easy stuff
           | is wrong, what hope is there for complex...
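
        A minimal sketch of the kind of basic check the comment above says
        is often skipped (all names here are hypothetical): verify that the
        caller owns the record, and only allow an explicit set of fields to
        be updated, so a request can't flip something like "role" or "id".

          ALLOWED_UPDATE_FIELDS = {"display_name", "email", "bio"}

          def update_profile(db, current_user_id, profile_id, payload):
              profile = db.get_profile(profile_id)        # assumed data-access helper
              if profile is None:
                  raise LookupError("no such profile")
              if profile["owner_id"] != current_user_id:  # access control on editing
                  raise PermissionError("not your profile")
              unknown = set(payload) - ALLOWED_UPDATE_FIELDS
              if unknown:                                  # reject fields clients may not set
                  raise ValueError(f"fields not updatable: {sorted(unknown)}")
              profile.update(payload)
              db.save_profile(profile)
              return profile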
        
       | mrkramer wrote:
        | Don't do automatic updates... roll out updates manually. That
        | would be a nice thing to start with.
        
         | ygjb wrote:
         | It's easy to make comments like this against automatic updates,
         | but then you get popped because something that would have been
         | automatically updated misses a patch because it was too
         | critical to risk automatic updates.
         | 
         | In practice, failing closed (or crashing) is probably fine for
         | _most_ businesses, and lower cost than a breach, but the
         | correct solution is automated testing across a broad spectrum
         | of devices, staged and rolling updates to prevent entire fleets
         | going down at once, and ensuring that there is an effective,
         | tested rollback mechanism.
         | 
          | But that shit's expensive, so _shrug_ :/
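
        A hedged sketch of the staged-rollout idea above (all names are
        hypothetical): push an update to progressively larger rings of
        hosts, check fleet health after each ring, and roll back instead of
        widening when health degrades.

          RINGS = [0.01, 0.05, 0.25, 1.00]   # fraction of the fleet per stage

          def staged_rollout(fleet, update, deploy, healthy, rollback):
              done = []
              for fraction in RINGS:
                  target = fleet[: max(1, int(len(fleet) * fraction))]
                  for host in target:
                      if host not in done:
                          deploy(host, update)
                          done.append(host)
                  if not healthy(done):      # e.g. crash/reboot telemetry from this ring
                      for host in done:
                          rollback(host, update)
                      return False           # stop; the whole fleet never goes down at once
              return True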
        
           | dataflow wrote:
           | > but then you get popped because something that would have
           | been automatically updated misses a patch because it was too
           | critical to risk automatic updates.
           | 
           | This kind of logic only works if you ignore any kind of
           | possible nuances in the problem and just insist on throwing
           | the baby out with the bathwater. Just because someone let you
           | do automatic updates (or let's be real, you probably didn't
           | give them much of an option) that doesn't mean you should use
           | it for everything.
           | 
           | Automatic update of data (like virus definitions) !=
           | automatic update of code (like kernel driver)
           | 
           | And really, the only time you could justify doing automatic
           | updates on other people's machines is when have reason to
           | believe the risk of waiting for the user to get around to it
           | is larger than the damage you might do in the process...
           | which doesn't seem to have been the case here.
        
             | lucianbr wrote:
             | From what I read they automatically updated data. But the
             | pre-existing code had a bug, which crashed on reading the
             | updated data.
             | 
             | Even if this is not what happened, it is possible, and
             | shows the data/code update separation does not prevent
             | problems.
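
        A toy illustration of the failure mode described above (not
        CrowdStrike's actual format or code): a parser that blindly trusts
        a pushed data file versus one that validates each record and fails
        safe instead of crashing the host.

          def parse_rule_naive(record: bytes):
              fields = record.split(b",")
              return fields[4]          # IndexError on a short or corrupt record

          def parse_rule_defensive(record: bytes):
              fields = record.split(b",")
              if len(fields) < 5 or not fields[4]:
                  return None           # skip the bad record, keep the machine up
              return fields[4]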
        
               | dataflow wrote:
               | > shows the data/code update separation does not prevent
               | problems.
               | 
               | Sure they do? This is like saying seatbelts don't prevent
               | injuries because people still die even while wearing
               | them.
               | 
               | I never said that one weird trick would solve every
               | problem, or even this particular one for that matter.
               | What I was saying was that if you look for ways to add
               | nuance... you can find better solutions than if you throw
               | the baby out with the bathwater. I just gave two examples
               | of how you could do that in this problem space. That
               | doesn't mean those are the only two things you can do, or
               | that either would've single handedly solved this problem.
               | 
               | The problem in your scenario is that kernel mode behavior
               | is being auto updated globally (via data or code is
               | irrelevant), and that should require a damn high bar. You
               | don't do it just because you can. There's got to be a
               | lower bar for user mode updates than kernel, etc.
        
             | ygjb wrote:
             | > This kind of logic only works if you ignore any kind of
             | possible nuances in the problem and just insist on throwing
             | the baby out with the bathwater. Just because someone let
              | you do automatic updates (or let's be real, you probably
              | didn't give them much of an option) that doesn't mean you
              | should use it for everything.
             | 
             | Oh, I agree - automatic updates are nuanced in many cases.
             | Generally speaking, automatic updates are a good thing, but
             | they offer trade-offs; the main trade-off is rapidly
             | receiving security updates, at the risk of encountering new
             | features, which can include new bugs. This is kind of a big
             | reason why folks who buy systems should be requiring that
             | updates offer a distinction between Security/Long Term
             | Support, and Feature updates. It allows the person who buys
             | the product to make an effective decision about the level
             | of risk they want to assume from those updates.
             | 
             | > Automatic update of data (like virus definitions) !=
             | automatic update of code (like kernel driver)
             | 
             | Yep, absolutely, except for the case where the virus
             | definitions (or security checks) are written in a language
             | that gets interpreted in a kernel driver, presumably in
             | languages that don't necessarily have memory safety
             | guarantees. It really depends on how the security
              | technology implements its checks, and the facilities that
             | the operating system provides for instrumentation and
             | monitoring.
        
         | winternett wrote:
         | Testing in software dev is taken completely for granted by many
         | companies on mission-critical updates... Back in the day, our
         | deployments would get tested on configs new & old and with
         | several different variables, we always made sure deployments
          | went smoothly. Now it seems as if most of these companies hire
          | junior devs and skip testing to cut costs, then just put the
          | blame for failures all on them. Burnout levels are high in these
         | settings.
         | 
          | This whole incident would not have happened if even a basic
          | deployment test had been conducted. The failure is so
          | widespread that it would have been impossible to miss.
        
         | rs999gti wrote:
         | Can auto updates be turned off on the crowdstrike falcon
         | client?
        
         | Johnny555 wrote:
         | In any sizable organization, you can't get around automatic
         | updates.
         | 
         | But updates should be rolled out slowly and you need enough
         | telemetry to detect problems as it's rolled out. Reboots,
         | crashes, cpu/memory use, end user reports, etc should all be
         | used to detect issues and pause the rollout.
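
        A minimal sketch of the telemetry gate described above (field names
        and thresholds are made up): pause the rollout if hosts that
        received the update crash, reboot unexpectedly, or regress on CPU
        relative to the rest of the fleet.

          def should_pause(updated_hosts, baseline_hosts):
              def avg(hosts, key):
                  return sum(h[key] for h in hosts) / max(len(hosts), 1)

              crash_rate  = avg(updated_hosts, "crashes_last_hour")
              reboot_rate = avg(updated_hosts, "unexpected_reboots")
              cpu_delta   = avg(updated_hosts, "cpu_pct") - avg(baseline_hosts, "cpu_pct")
              return crash_rate > 0.01 or reboot_rate > 0.01 or cpu_delta > 20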
        
         | HdS84 wrote:
          | Then you end up like some of our customers with log4j. We are
          | consultants and noticed that a CVE for log4j had come out. We
          | informed our customers that we had detected an issue under
          | active exploit, had already updated to non-vulnerable versions,
          | and wanted to deploy. The customer waffled for days and got
          | exploited before he decided to upgrade. Threats are often only
          | minutes away. We are already way too slow, and manual updates
          | slow you down even more.
        
       | lukev wrote:
       | It's rapidly getting to the point where the cure is worse than
       | the disease, when it comes to this kind of product.
        
         | croes wrote:
         | It isn't a cure in the first place it just fights the symptoms.
        
       | Damogran6 wrote:
       | It's almost as if we're seeing the downsides to our cloud based
       | decisions. Uncontrolled costs, lack of visibility, placing
       | control of critical processes in the hands of other groups...that
       | also have control of critical processes globally.
       | 
       | Am I bitter at losing the business decisions that push ease of
       | management by sending control to service providers? Not really.
        | It's happened dozens of times, and I lose every time.
       | 
       | I can raise the concerns to make sure the decisions are educated
       | ones, and then let the decisions be made.
        
         | altdataseller wrote:
         | If you managed your own servers, you would still need some sort
         | of endpoint security solution too tho...
        
           | Damogran6 wrote:
            | Not saying outages wouldn't happen, saying they might not
            | happen on a _global_ scale... and it's not the only downside.
           | There are pros and cons to each solution.
        
       | Buttons840 wrote:
       | Cyber Security is a matter of national security, but currently we
       | sacrifice our national security for the convenience of companies.
       | 
       | The disconnect is that companies are both (1) the only entity in
       | control of their system and how it is tested and (2) not liable
       | if a security breach does happen.
       | 
       | I believe we need to enable red teams (security researchers) to
       | test the security of any system, with or without permission, so
       | long as they report responsibly and avoid obviously destructive
       | behavior such as sustained DDoS attacks.
       | 
       | A branch of the government, possibly of the military (the Space
       | Force?) could constantly be trying to hack the most important
       | systems in our nation (individuals and private companies too).
       | The bad guys are doing this anyway, but hopefully the good guys
       | could find the security holes first and report them responsibly.
       | 
       | Again, currently this doesn't happen because it would be
       | embarrassing and inconvenient for powerful companies. We threaten
       | researchers who do nothing more than press F12 (view HTML source)
       | with jail time and then have our best surprised Pikachu faces
        | ready for when half the nation's data is stolen every week or
       | major systems go down. Actually, we don't make faces at all, half
       | the nation's data is stolen every week--no, actually we don't
       | even take notice, we just accept it as the way things have to be.
       | Because, after all, we can't expect companies to be liable, but
       | we can trust companies to have exclusive control over the testing
       | of their security. How convenient for them.
        
         | frenchy wrote:
         | Isn't this what the NSA is for? Also, I think we have plenty of
         | reason to believe they regularly try to penetrate powerful
         | companies, they just don't necessarily tell us when they do.
        
           | Buttons840 wrote:
           | I've never heard anything about the NSA telling a company
           | they have a security vulnerability. Have you?
        
             | orr94 wrote:
             | Not the NSA, but I know of at least one time the FBI did:
             | https://arstechnica.com/security/2024/01/chinese-malware-
             | rem...
        
             | rho138 wrote:
             | https://www.cbsnews.com/news/nsa-microsoft-
             | vulnerabilities-m...
        
               | bb88 wrote:
               | That was probably because the NSA and other critical
               | government agencies use Microsoft Exchange and it was a
               | bug found in the wild.
               | 
               | But if it wasn't a bug found in the wild, can you imagine
               | the fights between the NSA red and blue teams on whether
               | to alert Microsoft about it?
        
           | zavec wrote:
            | They absolutely have bugs up their sleeve, but if they tell
            | the companies about them so they can be fixed, then they
            | can't use the bugs for spying (or at least, not as
            | effectively).
        
         | silverquiet wrote:
         | CISA offers services to public and private providers of
         | infrastructure deemed critical that include pen testing, but
         | they don't have the resources to offer it to all who want it.
        
       | freitzkriesler2 wrote:
        | Cyber was a 90s buzzword that died out, then came back into
        | vogue when cybersecurity became cool. I cringe every time I
        | hear it dropped.
        
         | jabroni_salad wrote:
         | government and military loves that word and will probably never
         | let it go.
        
           | whycome wrote:
           | Air Force. Space Force. ...Cyber Force? Inevitable.
        
             | jabroni_salad wrote:
             | We do actually have US CYBER COMMAND but it isn't a branch
             | of the military, it's just a unit inside the DoD.
        
           | freitzkriesler2 wrote:
           | I roasted some Booz Allen booth people when they asked me
            | about cyber with a "the 90s called and wants its buzzword
            | back."
           | 
           | The look they gave was priceless.
        
         | bluedino wrote:
         | I still think of how 'cyber' was used in AOL Chatrooms back in
         | the 90's...
        
           | whycome wrote:
           | I forgot about that until this mention. Definitely not
           | relegated to AOL chats.
        
         | jijijijij wrote:
         | "Cyber sector" is reeeeally pushing it. Full body cringe.
        
       | PreInternet01 wrote:
       | The "cyber sector" is... awful? Nah... irresponsible? Nah...
       | immature? Yeah, probably!
       | 
       | Right now, pretty much everyone is looking to outsource their
       | "security" to a single vendor, disregarding the fact that
       | security is not a product, but a process.
       | 
       | That... won't change! And incumbents will get less-awful about
       | their impact on "protected" systems.
       | 
       | And yet, there's an opportunity here! Do you _truly_ understand
       | Windows? And whatever happens on that platform? And how to
       | monitor that activity for adverse actions? Without taking down
       | your customers on a regular /observable basis?
       | 
       | Step right up! There are a _lot_ of incumbents facing imminent
       | replacement...
        
         | silverquiet wrote:
         | There can't be a person alive who "truly understands" Windows;
         | though made by humans (allegedly), any modern OS is going to be
         | beyond the understanding of any individual. This is the
         | fundamental problem of managing modern systems.
        
           | hypeatei wrote:
           | Yes, we're in a complexity crisis.
        
           | nick__m wrote:
           | I can name two: Raymond Chen and Mark Russinovich ! I don't
            | know if Mark is still up to date on the latest Windows
            | internals now that he is the CTO of Azure, but Mr Chen sure
            | is.
        
             | silverquiet wrote:
             | I was thinking Linus for Linux, but in spite of how
             | talented these people are, it's still hard to imagine them
             | having a detailed grasp of the entire codebase at this
             | point.
        
         | bb88 wrote:
         | > immature? Yeah, probably!
         | 
         | I find it funny that adult security researchers still get away
         | with identifying themselves with hacking monikers in public as
         | if they were teenagers probing the local telco back in the
         | 1980's.
        
       | hcfman wrote:
        | Probably get a lot more of this when the full force of the Cyber
        | Resilience Act kicks in.
        
       | ramesh31 wrote:
       | I remember being proud of the fact that I had an intimate
       | knowledge and understanding of every single process running on my
       | dev machine. Things felt sane. I could fully comprehend what was
       | happening on my system at all times. Then the button pusher
       | configurator class got called a new name, "DevOps" and started
       | pushing all this crap on us. I'm ready to just start doing work
       | on a private machine at this point.
        
       | blibble wrote:
       | they're correct, all the others are similarly shit
       | 
        | sentinelone, tanium, guardicore, defender for endpoint, delinea
       | 
       | all running as root (or worse), sucking up absurd amounts of
       | resources, often more than the software running on the machine
       | (but advertised as "LOW IMPACT")
       | 
       | they also cause reliable software to break due to bugs in e.g.
        | their eBPF
       | 
        | they also often serialise all network and disk I/O on the machine
        | through a single thread (so much for multi-queue NVMe/NICs)
       | 
       | the risk and compliance attitude that results in this corporate
       | mandated malware being required needs to go
       | 
       | this software creates more risk than it prevents
        
         | altdataseller wrote:
          | So what's the alternative? Have no endpoint protection? Have
         | nothing in place to warn you when malware ends up in your
         | system?
         | 
          | (Just playing devil's advocate. I hate Crowdstrike as much as
         | anyone here :)
        
           | iwwr wrote:
           | Or maybe switch to an operating system that isn't a security
           | dumpster fire?
        
             | altdataseller wrote:
             | And what if this bug happened to affect Linux somehow too?
             | What then?
        
             | esafak wrote:
             | How do you objectively assess an operating system's
             | security? I wanted to convince friends that Windows is
             | insecure but I couldn't find unassailable evidence. Got
             | some? There are confounding variables like the age of the
             | operating system and size of the userbase (distorting the
             | event volume), its attractiveness to attackers, and the
             | tendency of organizations of different levels of technical
             | ability to prefer different operating systems...
        
               | freedomben wrote:
               | I'm a pretty die hard linux guy, and I think Windows is a
               | bloated nightmare, but it's _not_ insecure IMHO (unless
               | you consider  "privacy" to be security, but most people
               | do not (even though I think they should)). There was a
               | time when that wasn't as true, though. If Windows were
               | rewritten from scratch today, I'm certain there would be
               | some different architectural/design decisions made, but
               | that's true for pretty much every piece of software ever
               | written.
        
               | Veserv wrote:
               | Here is the official Windows security certification page
               | [1]. They certify against this standard [2]. The maximum
                | level of security they certify is:
               | 
               | Page 53: "The evaluator will conduct penetration testing,
               | based on the identified potential vulnerabilities, to
               | determine that the OS is resistant to attacks performed
               | by an attacker possessing Basic attack potential."
               | 
               | That is the lowest level of security certification
               | outlined in the standard. The elementary school diploma
               | of security.
               | 
               | To see what that means, here is a sample of the
               | certification report [3].
               | 
               | Page 14: "The evaluator has performed a search of public
               | sources to discover known vulnerabilities of the TOE.
               | 
               | Using the obtained results, the evaluator has performed a
               | sampling approach to verify if exists applicable public
               | exploits for any of the identified public vulnerabilities
               | and verify whether the security updates published by the
               | vendor are effective. The evaluator has ensured that for
               | all the public vulnerabilities identified in
               | vulnerability assessment report belonging to the period
               | from June 8, 2021 to July 12, 2022, the vendor has
               | published the corresponding update fixing the
               | vulnerabilities."
               | 
               | The "hardcore" certification process they subject
               | themselves to is effectively doing a Google search for:
               | "Windows vulnerabilities" and checking all the public
                | ones have fixes. That is all the security they promise
                | you in their headline mandatory security certification,
                | which is the only general security certification listed
                | and advertised on their official security page.
               | 
               | When a company puts their elementary school diploma on
               | their resume for "highest education received", you should
               | listen.
               | 
               | That is not to say any of the names in general purpose
               | operating systems such as MacOS, Linux, Android, etc. are
               | meaningfully better. They are all inadequate for the task
               | of protecting against moderately skilled commercially
               | minded attackers. None of them have been able to achieve
               | levels of certification that provide confidence against
               | such attackers.
               | 
               | This is actually a good sign, because those systems are
               | objectively and experimentally incapable of reaching that
               | standard of security. That they have been unable to force
               | a false-positive certification that incorrectly states
               | they have reached that standard demonstrates the
               | certification at least has a low false-positive rate.
               | 
               | All of the standard stuff is inadequate in much the same
               | way that all known materials are inadequate for making a
               | space elevator. None of it works, so if you do want to
               | use it, you must assume they are deficient and work
               | around it. That or you could use the actual high quality
               | stuff.
               | 
               | [1] https://learn.microsoft.com/en-
               | us/windows/security/security-...
               | 
               | [2] https://www.commoncriteriaportal.org/files/ppfiles/PP
               | _OS_V4....
               | 
               | [3] https://download.microsoft.com/download/6/9/1/69101f3
               | 5-1373-...
        
             | buran77 wrote:
             | Unreasonably idealistic solutions are some of the worst
             | kind of solutions because they make you feel like you have
             | the answer but the benefits never materialize. The moment
             | you pick any other OS to be the "80% of the world" one,
             | reality will quickly deflate any sense of superiority.
             | 
             | And whether you can see it or not, they're all still some
             | form of dumpster fire, be it security, usability, price.
        
             | gruez wrote:
              | What makes you think Windows is "a security dumpster fire"?
              | The fact that most infections are on Windows machines
              | doesn't really count, because most machines are also
              | Windows machines.
        
           | pantalaimon wrote:
           | Does it actually work?
        
             | freedomben wrote:
             | Yes it works very well for the intended purpose (which
             | isn't actually security). The intended purpose is CYA. As
             | head of security, if you install CrowdStrike or some other
             | vendor, then a compromise becomes that vendor's problem,
             | not yours.
        
               | jordanb wrote:
               | When has Crowdstrike taken responsibility for a hack?
               | 
               | I think it's more like, security is heavily check mark
               | based. Crowdstrike and friends have managed to get
               | "endpoint security"[1] added as a "standard security best
               | practice" which every CSO knows they must follow or get
               | labeled incompetent. Therefore "endpoint security" must
               | be installed everywhere with no real proof that it makes
               | things more secure, an arguable case that it makes things
               | less secure, and an undeniable case that it makes things
               | less reliable.
               | 
               | [1] I also never understood how "endpoints" somehow are
               | defined as "any computer connected to any network." I
               | tried to fight security against installing this crap on
               | our database servers with the argument that they are not
               | endpoints. Did not work.
        
           | oldpersonintx wrote:
           | low permission systems
           | 
           | allow nothing and then gradually allow some activities that
           | are deemed safe
           | 
           | do not allow software to be installed from arbitrary
           | locations
           | 
           | app sandboxing and third-party vendors cannot break their
           | sandbox
           | 
           | basically, iOS, Android, ChromeOS
           | 
           | 50% of the people impacted today probably only need a browser
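
        A toy sketch of the deny-by-default model described above (the
        allowlist and digest are placeholders): execution is refused unless
        the binary's hash has been explicitly approved.

          import hashlib

          APPROVED_SHA256 = {
              # entries get added one at a time as activities are "deemed safe"
              "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
          }

          def may_execute(path: str) -> bool:
              with open(path, "rb") as f:
                  digest = hashlib.sha256(f.read()).hexdigest()
              return digest in APPROVED_SHA256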
        
           | pkphilip wrote:
           | One option may be to use locked read-only systems. Many of
           | these computers at airports etc do not need a writeable local
           | filesystem.
        
         | nimbius wrote:
         | yes but, did it help us meet the compliance targets for this
         | year?
         | 
         | keep'er running...
        
         | nightshift1 wrote:
         | > also often serialises all network and disk on the machine
         | through to one single thread
         | 
          | Do you have more info about this? I am very interested. Does
          | it impact SAN FC storage?
        
       | UweSchmidt wrote:
       | So how do you actually cybersecure a company in a compliant and
       | practical way?
        
         | rawgabbit wrote:
         | Today, we comply by ticking all the boxes in a checklist; it
         | takes care of the most obvious hacks. Is it good enough? Today,
         | we got our answer.
         | 
         | Practically speaking, that is all the end-user can do with
         | Windows machines. My point is Windows is fundamentally
          | insecure. It is a dike with thousands of holes, some of which
          | are not even visible to Microsoft themselves. The reason for
          | that is that security has been an afterthought. It is band-
          | aids/plasters put on top of other plasters.
        
           | UweSchmidt wrote:
            | I ask because security experts seem to have this slightly
            | dismissive attitude about companies' and individuals'
            | attempts to do security, while not usually having answers or
            | providing secure systems.
        
             | bostik wrote:
             | Sadly most so-called security experts are not hands-on
             | professionals but hands-off "cybersecurity persons". They
             | do not do any real work themselves, they only generate
             | useless busywork for others.
             | 
             | There _are_ people in that category who are not hands-on
             | themselves but still have sufficiently deep understanding
             | of technical details. But as one might guess, they are
             | about as common as four-leaf clovers.
        
         | jabroni_salad wrote:
         | I firmly believe that most routine security issues are really
         | just operations issues and vulns are just bugs and security
          | largely doesn't need to be its own category at all.
         | 
         | I know everybody hates the C-word but if I look at 27001
         | requirements or the CIS benchmarks, there is nothing in there
         | that I do not want for myself. If you can keep a list of the
         | products and services you are running, have actually put the
         | time into implementing it correctly, and have an ongoing
         | maintenance plan then you are probably in the top 1% of
         | networks.
        
       | ehPReth wrote:
       | I've been trialling application allowlisting, but wow is it ever
        | frustrating. So much stuff isn't signed, and when it is, the
        | accompanying DLLs aren't. Or the signature is invalid. Or some of
        | Windows' own executables/DLLs aren't signed (why?? you make
        | AppLocker??). Or the installer is signed, but none of the actual
        | resultant files are.
       | 
       | Is it just me?
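
        One rough way to audit for the unsigned-binary problem described
        above, assuming signtool.exe from the Windows SDK is on PATH
        ("signtool verify /pa" checks the Authenticode signature and exits
        nonzero when it is missing or invalid). The scanned path is just an
        example.

          import pathlib
          import subprocess

          def unsigned_binaries(root):
              for path in pathlib.Path(root).rglob("*"):
                  if path.suffix.lower() not in {".exe", ".dll"}:
                      continue
                  result = subprocess.run(["signtool", "verify", "/pa", str(path)],
                                          capture_output=True)
                  if result.returncode != 0:
                      yield path

          for p in unsigned_binaries(r"C:\Program Files\SomeVendor"):
              print("unsigned or invalid:", p)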
        
       | bluedino wrote:
       | Security: You must install Microsoft Defender on all Linux VM's
       | 
       | Devs: Ugh...why?
       | 
       | Security: For safety!
       | 
        | Devs: Fine, we won't argue. Deploy it if you must.
       | 
       | A few moments later...
       | 
       | Devs: All of our VM's are slow as crap! Defender is using 100% of
       | the CPU!
       | 
       | Security: Add another core to your VM's. _ticket closed_
       | 
       | Management: Why are our developers up 30% on their cloud spend!?
        
         | blibble wrote:
         | and all microsoft have to do to increase cloud revenue is make
         | defender chew up 5% more CPU every few months
         | 
         | directly incentivised to make shit software
        
           | whycome wrote:
           | Isn't this kinda the hardware model for Apple devices too? Eg
           | batterygate
        
             | Hardwired8976 wrote:
             | Batterygate was just making sure the phone doesn't shut
                | down suddenly as the battery deteriorates and is less
                | capable.
        
               | mrguyorama wrote:
               | People say this like it lets Apple off the hook. Let me
               | explain why it doesn't.
               | 
               | Apple had full control over the whole phone's software
               | stack, in a very good way, meaning they built a good
               | mobile OS that had good systems for power management and
               | an app lifecycle that could actually kill apps at will to
               | maintain efficiency, without disrupting the user.
               | 
               | With this, they decided to ship smaller batteries so they
               | could make slimmer phones.
               | 
               | Except, they used garbage batteries. They were so small
               | (1600mAh on the iPhone 6) that normal wear and tear of a
               | few years degraded them to the point that the battery
               | chemistry could not keep up with normal processor
               | frequency and power ramping.
               | 
               | Apple started getting a lot of complaints because people
               | were understandably upset that their 2-3 year old phone
               | couldn't run for more than an hour off the charger. Apple
               | didn't like increasing support load, even though they
               | weren't covering anyone's battery replacement. Instead of
               | putting out a press release that they had shipped sub-
               | standard batteries in their phones, and offering free
               | battery replacements with a new battery that wouldn't
               | have the same problem in another 2-3 years, they included
               | code in the new version of iOS to SIGNIFICANTLY slow down
                | your 3-year-old or newer phone.
               | 
               | Apple made a product that deteriorated way too quickly,
               | and then tried to hide it. That's batterygate. If LG sold
               | a fridge that would die after five years because of
               | compressor fatigue and then silently updated their
               | fridges to not operate colder than 45 degrees F to extend
               | the life of the compressor, I would hope you would be
               | pissed at that, right?
               | 
               | A reminder that the iPhone 6 was also "Bendgate", which
                | internal Apple memos showed they knew was a serious
               | problem before they sold it, and then claimed two years
               | after release they only had 9 complaints of phone bending
               | and that it wouldn't bend in normal use.
        
               | surgical_fire wrote:
               | > If LG sold a fridge that would die after five years
               | because of compressor fatigue and then silently updated
               | their fridges to not operate colder than 45 degrees F to
                | extend the life of the compressor, I would hope you would
                | be pissed at that, right?
               | 
               | Apple sycophants are willing to put up with any bullshit
               | from Apple. It is very tiresome to argue against blind
               | faith.
               | 
               | Any other company would face incredible scrutiny if that
               | happened. Imagine if MS did that to their surface
               | devices. And this level of scrutiny from consumers is
               | healthy.
        
         | thdxr wrote:
         | there was a production incident at a customer that was this
         | exact scenario
        
         | jordanb wrote:
         | We had cylance take out all of our kubernetes clusters a few
         | years ago.
         | 
          | The whole cybersecurity concept of installing third-party
          | mystery meat in the kernel, controllable over the internet by a
          | different company, seems contrary to good security practices,
          | software quality assurance, immutable production architecture,
          | and repeatable builds.
        
           | pennomi wrote:
           | Third party mystery meat is mostly intended for scapegoating
           | if a problem does occur.
        
         | pixl97 wrote:
         | I work supporting software that processes millions of small
         | files a day, with a lot of these scripting languages. The speed
          | difference in total IOPS where AV is installed vs not installed
         | is huge. 30-50% loss is no joke.
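
        A rough sketch of how one might measure that overhead: time a loop
        of small-file create/read/delete operations, then compare the
        throughput with the AV agent enabled versus disabled (or with the
        test directory excluded).

          import os, tempfile, time

          def small_file_ops_per_sec(n=10_000, size=1024):
              payload = os.urandom(size)
              with tempfile.TemporaryDirectory() as d:
                  start = time.perf_counter()
                  for i in range(n):
                      path = os.path.join(d, f"f{i}.dat")
                      with open(path, "wb") as f:
                          f.write(payload)
                      with open(path, "rb") as f:
                          f.read()
                      os.remove(path)
                  return n / (time.perf_counter() - start)

          print(f"{small_file_ops_per_sec():.0f} create+read+delete cycles/sec")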
        
         | bb88 wrote:
         | That's a management problem. IT security didn't communicate
         | that to the finance folks. Microsoft didn't communicate that to
         | IT security. And if they're on Azure, it's more money for
         | Microsoft.
        
       | guru4consulting wrote:
       | For a business that relies on SaaS applications over cloud and
       | uses dumb machines (windows, iPad, whatever) as client terminals,
        | can someone please explain what actual threat vectors these EDR
        | tools like Crowdstrike Falcon address? And if SaaS applications
        | can restrict access, detect anomalies in user behavior, have MFA
        | for auth, etc., will that mitigate these risks? I guess common
        | issues like keyloggers, malware, and virus attacks have much
        | simpler solutions than a complex EDR which
       | seems to need root access!! Someone, please educate.
        
       | mikewarot wrote:
       | We build our "cyber fortress" out of the Turing Complete analog
       | of Crates of C4... and wonder why things go wrong all the time.
       | 
       | As I say every time this happens (and it will keep happening for
       | the next decade or so)... Ambient Authority systems can't be
       | secured, we need to switch to Operating Systems designed around
       | Capability Based Security.
       | 
       | We need at least 2 of them, from competing projects.
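
        A toy Python illustration of the ambient-authority versus
        capability distinction (Python cannot actually enforce this; real
        capability systems enforce it in the OS): code receives explicit
        handles for the things it may touch instead of being free to open
        any path it can name.

          class WriteCapability:
              """A token granting write access to exactly one open file."""
              def __init__(self, handle):
                  self._handle = handle
              def write(self, data: bytes):
                  self._handle.write(data)

          def run_plugin(plugin, log_cap):
              # The plugin gets one capability; it has no ambient right to
              # call open("/etc/passwd") -- in a capability OS that call wouldn't exist.
              plugin(log_cap)

          with open("plugin.log", "ab") as f:
              run_plugin(lambda cap: cap.write(b"hello\n"), WriteCapability(f))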
        
       | 1vuio0pswjnm7 wrote:
       | Works where archive.ph is blocked:
       | 
       | https://webcache.googleusercontent.com/search?q=cache:https:...
        
       ___________________________________________________________________
       (page generated 2024-07-19 23:15 UTC)