[HN Gopher] EU Cyber Resilience Act: What does it mean for open ...
       ___________________________________________________________________
        
       EU Cyber Resilience Act: What does it mean for open source?
        
       Author : ahubert
       Score  : 102 points
       Date   : 2023-12-30 20:23 UTC (2 hours ago)
        
 (HTM) web link (berthub.eu)
 (TXT) w3m dump (berthub.eu)
        
       | raverbashing wrote:
       | Very good explanation and very encouraging
       | 
       | > The Debian statement appears to be based on an earlier version
       | of the CRA.
       | 
       | > It for example says "Knowing whether software is commercial or
       | not isn't feasible, neither in Debian nor in most free software
       | projects". Under the CRA there is no need to figure that out for
       | Debian.
       | 
       | > "Having to get legal advice before giving a gift to society
       | will discourage many developers" - the final version of the CRA
       | is clear that if you are giving a gift, the CRA does not apply to
       | you anyhow. There is now a very clear statement on that (see
       | above).
        
       | transpute wrote:
       | It appears that targeted exceptions have been added for specific
        | situations lobbied for by current FOSS and commercial stakeholders.
       | Hopefully there will be an ongoing process to address the need
       | for new exclusions, as the vast scope of the CRA becomes clear to
       | societies eaten by software.
       | 
       | New OSS governance and runtime binary attestation (aka DRM)
       | layers are being defined by the CRA, e.g. only specific attested
       | binaries from open-source trees that follow specific development
       | practices would be allowed to run in critical systems:
        |       Open-source software stewards shall put in place and
        |       document in a verifiable manner a cybersecurity policy to
        |       foster the development of a secure product with digital
        |       elements as well as an effective handling of
        |       vulnerabilities by the developers of that product.
        | 
        |       ... Open-source software stewards shall cooperate with the
        |       market surveillance authorities, at their request, with a
        |       view to mitigating the cybersecurity risks posed by a
        |       product with digital elements qualifying as free and
        |       open-source software.
        | 
        |       ... security attestation programmes should be conceived in
        |       such a way that ... third-parties, such as manufacturers
        |       that integrate such products into their own products,
        |       users, or European and national public administrations
        |       [can initiate or finance an attestation].
       | 
       | Legal liability and certification for commercial sale of binaries
       | built from FOSS software will alter business models and
       | incentives for FOSS development.
       | 
       | Related:
       | 
       | Dec 2023, _" What comes after open source? Bruce Perens is
       | working on it"_ (174 comments),
       | https://news.ycombinator.com/item?id=38783500
        
         | EMIRELADERO wrote:
         | > New OSS governance and runtime binary attestation (aka DRM)
         | layers are being defined by the CRA, e.g. only specific
         | attested binaries from open-source trees that follow specific
         | development practices would be allowed to run in critical
         | systems
         | 
         | That doesn't seem like what the CRA stipulates. I think it's
         | more about manual attestation in its most traditional meaning,
          | i.e., an organization _attests_ that X software is secure.
        
           | transpute wrote:
           | _> That doesn 't seem like what the CRA stipulates. I think
           | it's more about manual attestation in its most traditional
            | meaning, i.e., an organization attests that X software is
           | secure._
           | 
            | The CRA can require an EU-wide recall of "products with
            | digital elements" which are found to be non-compliant by
            | national market surveillance authorities. While we may
            | analogize this requirement
           | to the recall of slow-moving physical products with rare
           | market withdrawal, software developers and attackers iterate
           | more quickly.
           | 
           | Centralized software distribution like mobile app stores
           | would have the ability to implement a kill switch (recall) on
           | non-compliant products. Products which depend on centralized
           | cloud services could have binaries verified before they are
           | allowed to connect to an API. This would give regulators the
           | tools to rapidly implement software "recalls".
            |       (58) ... significant cybersecurity risk or pose a risk
            |       to the health or safety of persons ... market
            |       surveillance authorities should take measures to
            |       require the economic operator to ensure that the
            |       product no longer presents that risk, to recall it or
            |       to withdraw it ...
            | 
            |       (60) ... market surveillance authorities should be able
            |       to carry out joint activities with other authorities,
            |       with a view to verifying compliance and identifying
            |       cybersecurity risks of products with digital elements.
            | 
            |       (61) Simultaneous coordinated control actions ('sweeps')
            |       are specific enforcement actions by market surveillance
            |       authorities that can further enhance product security.
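            | 
            | To make that concrete: here is a rough, purely hypothetical
            | sketch (the CRA text does not prescribe any implementation;
            | the function names and the hash value below are made up) of
            | how a distribution service or API gateway could check a
            | binary fingerprint against a recall list before serving it:
            | 
            |       import hashlib
            | 
            |       # Hypothetical recall list of SHA-256 fingerprints
            |       # flagged by a market surveillance decision
            |       # (illustrative value only).
            |       RECALLED_SHA256 = {
            |           "ba7816bf8f01cfea414140de5dae2223"
            |           "b00361a396177a9cb410ff61f20015ad",
            |       }
            | 
            |       def binary_fingerprint(path: str) -> str:
            |           """Return the SHA-256 hex digest of a binary."""
            |           h = hashlib.sha256()
            |           with open(path, "rb") as f:
            |               for chunk in iter(lambda: f.read(1 << 20), b""):
            |                   h.update(chunk)
            |           return h.hexdigest()
            | 
            |       def may_serve(fingerprint: str) -> bool:
            |           """Refuse distribution or API access to recalled
            |           binaries."""
            |           return fingerprint not in RECALLED_SHA256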
        
             | EMIRELADERO wrote:
             | So what would you propose for recalling physical products
             | that have insecure software that can cause physical
             | trouble? What framework would have sufficed?
        
               | transpute wrote:
               | Kill switches based on attested binary identity exist and
               | can be deployed at scale. So they can and likely will be
               | used to comply with regulatory decisions. What remains to
               | be seen is how those regulatory decisions will be made
               | for complex software supply chains.
               | 
               | In part, open-source software arose in response to opaque
               | software.
               | 
               | Can opaque regulation equally govern open and opaque
               | software?
               | 
               | Should open software have open (i.e. continuously
               | evolving in public, not point-in-time negotiated)
               | regulation that can keep up with open development and
               | security research? Much will depend on the operational
               | practices and transparency of national institutions
               | tasked to implement EU CRA.
        
       | jahav wrote:
       | https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONS...
       | 
       | Important bits (10c and around):
       | 
       | * Libraries/non-end products are fine, unless monetized.
       | 
       | * Employee contributions seem to be fine.
       | 
       | * Foundations seem to be fine.
       | 
        | * Non-core developers are fine.
        | 
        | Seems like a significantly better version.
        
         | smallnix wrote:
         | What about non-monetized open source end products?
        
           | PaulDavisThe1st wrote:
           | I know it's not cool on HN to say "did you even read TFA?"
           | but seriously, read TFA!
        
       | jbk wrote:
        | The new version of the CRA is quite an improvement, and most of
        | the discussions around the open source communities were about
        | older versions that were quite concerning. There were a lot of
        | scary discussions on the foundations mailing list and on the
        | boards of various open source non-profits.
       | 
       | This article is a good step to explain what has changed.
       | 
       |  _(I was quite concerned as President of VideoLAN and involved in
       | VLC and FFmpeg, since both projects would have been threatened by
       | previous drafts)_
        
         | ahubert wrote:
          | So you might be able to turn this to your advantage. The
         | zillion people embedding your great work will be on the hook if
         | it turns out they haven't performed sufficient due diligence on
         | ffmpeg. And who knows, you might get them to sponsor you to get
         | security audits or documentation done. Could be their "ticket
         | out of jail" one day!
        
       | greatgib wrote:
        | This regulation is so shitty. I'm quite sure that it is supported
        | by big actors in the end, because the end goal is to create a
        | regulatory barrier that will prevent small actors from being able
        | to thrive in the software field.
       | 
        | Also, to prevent "dangerous" not-yet-professional amateurs from
        | having a chance against big vendors.
        
         | EtienneK wrote:
         | This was the first question on my mind as well. How will this
         | affect the one-man webshop owner or software developer? Seems
         | only big established firms will be able to conform to this?
        
           | EMIRELADERO wrote:
           | This question was asked a lot when GDPR came around, and it's
           | essentially an implication that the regulator will act in bad
           | faith.
           | 
            | Courts and regulators, particularly European ones,
           | understand when there's a "will" to follow the law. It's one
           | of the differences between "rules-based" and "principles-
           | based" regulations.
           | 
           | https://news.ycombinator.com/item?id=17100541
        
             | EtienneK wrote:
             | I don't understand? So you should only in principle audit
             | your Wordpress blog?
        
               | EMIRELADERO wrote:
               | Read the comment I linked. It's about the regulation
               | being enforced with its principles in mind, not
               | robotically through its strict interpretation.
        
               | troupo wrote:
               | Questions:
               | 
               | - If you run a commercial kitchen on your own (or, let's
               | say, with a staff of 2-3 people), can you ignore the food
               | safety regulations? The fire regulations?
               | 
               | - If you run a one-man plumbing company, can you ignore
               | safety regulations? Water regulations? Sewage
               | regulations?
               | 
               | etc.
               | 
                | Why is it that when it comes to "commercial software" it
                | is inevitably "oh my god these laws are so hard, why
                | should I as a one-man company be forced to comply with
                | them"? Because that is literally your job.
        
               | Vespasian wrote:
                | It means you will get treated differently depending on
                | whether you operate a one-man show or a global
                | conglomerate.
                | 
                | Following best practices and demonstrating that you care
                | goes a long way (that has been shown time and time again
                | in courts throughout the union).
               | 
               | Also it differentiates between what kind of product you
               | are building (see the annexes).
               | 
               | Most of the requirements (look them up) are best software
               | dev practices unless you are in one of the specific
               | "critical" categories of products.
               | 
                | And to be honest, I don't really care that you are a
                | one-person (commercial) shop when my car gets steered off
                | the road because of a preventable security hole.
        
         | Lariscus wrote:
          | Unless you sell critical products as described in Annex III
          | [1], the requirements to fulfill the CRA are quite harmless.
          | It's mostly stuff you should be doing anyway, like a risk
          | assessment and documentation. An additional requirement is to
          | provide a conformity assessment, which you can do yourself for
          | non-critical software, and you must report vulnerabilities
          | within 24 hours.
         | 
         | Not too bad really.
         | 
         | [1] https://eur-
         | lex.europa.eu/resource.html?uri=cellar:864f472b-...
        
         | kossTKR wrote:
          | Yeah, this could be terrible unless there are very specific
          | exceptions for sub-1M-revenue players.
         | 
         | Lots of big players are already providing shit software to
         | millions of customers especially through government contracts
         | because they've hired armies of legal and sales teams,
         | squashing the little guy in the process.
         | 
         | If just providing some small web service built on top of open
         | source now requires hiring a huge legal team, well goodbye to
         | any entrepreneurship.
         | 
          | I know this because I've seen big players win contracts over
          | actually talented people 9 times out of 10 because they can
          | play this regulation game, and I've seen small companies burn
          | hundreds of thousands in consultancy fees over GDPR that made
          | zero difference for a Wordpress setup a talented coder could
          | have fixed in 10 hours.
          | 
          | That said, the intentions are good, but for some reason the EU
          | thinks small players should face the same extreme measures as
          | Facebook and Google, i.e. the actual reasons this regulation
          | was made in the first place. Bizarre.
        
       | amadeuspagel wrote:
       | > The state of computing security is dire, and governments around
       | the world have rightly decided things can't go on like this.
       | 
        | What is this? Software is more secure than ever.
        
         | bshipp wrote:
         | Some people obviously prefer the old days when security
         | problems were hidden within proprietary code so the only people
         | who knew about them were the ones who found the exploit.
        
           | dekken_ wrote:
           | Like that won't still be possible?
        
       | nickpp wrote:
        | I wonder if there was ever any instance where regulating
        | something brought about more of that thing. Does anybody have an
        | example?
        
         | recursive wrote:
         | Driving cars is regulated now. We have more than ever.
        
         | pjmlp wrote:
          | Street markets, also known as bazaars.
         | 
         | Restaurants, food trucks, consumer electronics, medical
          | devices, clothing, product delivery chains, ...
        
           | nickpp wrote:
           | > bazaars
           | 
            | Here in Eastern Europe we have had fewer and fewer of those
            | over the last 30 years. My favorite cheese maker closed her
            | small shop and started selling directly from home after
            | local authorities started demanding tests and workshop
            | inspections (bribes, really). She's planning to switch to
            | selling the milk directly to one of those big-name
            | supermarket dairy processors soon. Less money but fewer
            | headaches.
        
         | Msurrow wrote:
         | Financial markets/stock markets are pretty regulated. From what
          | I hear there is quite a lot of stock being traded. Trading
          | even seems to have increased over the years.
        
           | nickpp wrote:
           | > Financial markets
           | 
           | There is only one of those in my whole country. Not much
            | competition there if I ever wanted to do an IPO.
        
       | gavinhoward wrote:
       | Boy, I hope the new version is better.
       | 
       | If we don't want poor regulation, we had better regulate
       | ourselves first.
       | 
       | Bonus: regulating ourselves might fund Open Source. [1]
       | 
       | [1]: https://gavinhoward.com/2023/11/how-to-fund-foss-save-it-
       | fro...
        
       | sylware wrote:
        | This is Big Tech only: only they will have the amount of
        | resources to meet the requirements of such an act.
       | 
       | The only way for small actors is to move to... super small and
       | simple tech... and they better be sure small tech<->big tech
       | interop is hardcore regulated too or they will be zapped.
       | 
        | Yep, forget about those grotesquely and absurdly massive and
       | complex web engines...
       | 
       | And now I am thinking about the hardware... they better come
       | extra clean.
        
       | jokethrowaway wrote:
        | Good that they backtracked on a lot of the CRAp (which would have
        | meant the end of OSS in Europe; talk about destroying the world
        | with one wrong swift movement of a pen!) BUT I'm still angry:
       | 
        | 1. This adds barriers to selling OSS software, which helps solidify
       | existing markets and prevents new competitors from stepping up
       | 
        | 2. This won't change anything except forcing projects to waste
        | money on legal BS, when the responsibility should be solely on
        | the commercial entities USING and providing a service (and
        | therefore making money) with the OSS software
       | 
       | 3. This is only the first step, I'm sure they'll keep adding
       | rules
       | 
        | 4. I'm thinking they may have been heavy-handed in the first
        | draft just so that people would think at the end "oh, phew! the
        | regulators didn't kill ALL OSS software in Europe, great!"
        | without asking why we need this regulation or how it improves
        | ANYTHING
       | 
       | Will it actually improve security? I don't think so.
       | 
        | If someone is paying for commercial support, they likely already
        | have security updates and, once vulnerabilities are known by the
        | maintainers, the news spreads.
       | 
       | The security problem with OSS is not that things are not
       | communicated promptly, but that it's hard to make money with OSS
       | so there is no staff working on security.
       | 
        | This would not have saved us from e.g. OpenSSL vulnerabilities,
        | and it will be even harder for $NextOSSOrg to start charging for
        | their product and improving their security.
        
         | EMIRELADERO wrote:
          | > This adds barriers to selling OSS software, which helps solidify
         | existing markets and prevents new competitors from stepping up
         | 
          | All commercial software is included; I don't see how
         | (commercial) OSS is somehow special. Did you read the article?
        
         | wolvesechoes wrote:
         | "which helps solidify existing markets and prevents new
         | competitors from stepping up"
         | 
         | So exactly like any other regulation.
        
         | troupo wrote:
         | > it's hard to make money with OSS so there is no staff working
         | on security.
         | 
         | Sooo.... Because of that you should be exempt even though
         | you're expecting to sell that software?
         | 
         | How does this make sense?
        
         | mqus wrote:
         | > 2. This won't change anything except forcing projects to
          | waste money on legal BS, when the responsibility should be
          | solely on the commercial entities USING and providing a
         | service (and therefore making money) with the OSS software
         | 
          | First of all, most software companies do SaaS, meaning they
          | also provide the service. And even if they don't, the users
          | will just hand the paperwork down to the companies developing
          | the software, because those are the ones who know what went in
          | (security and components) and the users want to have this in
          | legal writing.
         | 
         | Secondly, imagine your average IoT seller. They should not be
         | liable for their bad product because they don't run it
         | themselves? "The user" is liable? In most cases the "user"
         | can't even do anything about their insecure device.
         | 
         | I think developers are rightly responsible here. It's pretty
         | comparable to other industries where the products have to be
          | safe when sold; think pharma, food, toys, cars, etc.
         | 
         | > Will it actually improve security? I don't think so.
         | 
         | Think B2C. It will improve things there, and massively so.
         | Software in B2B was already somewhat regulated via audits and
         | certifications.
        
       | donkeyd wrote:
       | I was unaware of this act before reading this, but I kinda like
       | it. My current employer wants to do the absolute minimum in
        | securing the software they develop. However, it's used in
       | organizations working on national energy and communications
       | infrastructure, so it's somewhat important for it to be secure.
       | 
       | Meanwhile, we're way behind on updating much of our
       | infrastructure and hardly ever check whether any of the open
       | source libraries we use are up-to-date, nor whether they're
       | reliable. I really hope this legislation pushes companies like
       | mine to improve their software development practices, because I'm
       | scared of the future.
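        | 
        | Even a minimal automated check would be a start. As a rough
        | sketch (assuming a Python stack purely for illustration; other
        | ecosystems have equivalent tooling), a CI step could fail
        | whenever installed dependencies fall behind:
        | 
        |       import json
        |       import subprocess
        |       import sys
        | 
        |       # Minimal sketch: list outdated packages via pip's JSON
        |       # output and fail the build if any are found. Real setups
        |       # would more likely use a dedicated dependency or
        |       # vulnerability audit tool.
        |       result = subprocess.run(
        |           [sys.executable, "-m", "pip", "list", "--outdated",
        |            "--format=json"],
        |           capture_output=True, text=True, check=True,
        |       )
        |       outdated = json.loads(result.stdout)
        |       for pkg in outdated:
        |           print(f"{pkg['name']}: {pkg['version']} -> "
        |                 f"{pkg['latest_version']}")
        |       sys.exit(1 if outdated else 0)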
        
       | martinald wrote:
        | Am I right in thinking that if you are a small indie OSS
       | developer that offers commercial support or similar "services",
       | all these regulations will now apply to you?
       | 
        | While I get that the new draft has changed it so that it doesn't
        | apply if you are a non-profit or accepting donations (I think?),
        | the biggest problem is that that isn't a great model for OSS
        | anyway.
       | 
       | A much better model imo is charging for a "pro" version with
       | support included and maybe some extra features.
       | 
       | This regulation is likely to totally kill the viability of that
       | model if you need to do expensive security audits.
        
         | oneplane wrote:
         | It applies to you in the sense that your services are covered
         | by the CRA. Your projects themselves probably don't unless you
         | have an open-core model where you have a commercialised
         | 'supported' version, in which case you're not responsible for
         | all users, but you are on the hook for the commercial users.
         | 
          | In a way, I don't think it's that much at odds. If someone
          | comes up with a great open source project not to 'give away'
          | as a present or in a classic FOSS style, but instead as some
          | sort of funnel to get paying customers (which includes pure
          | support), then you're already doing it commercially, and even
          | without the CRA you'd probably be on the hook for doing it
          | right anyway.
        
       | chacham15 wrote:
       | There is a lot of talk about who this regulation is supposed to
       | cover, but not a lot about what it actually requires if it covers
       | you. The best I could find after a couple quick searches was that
       | you're supposed to provide information about the security
       | mechanisms used and regular security updates over the lifetime of
        | the product. Is there anything else? This doesn't sound terribly
       | hard to comply with at first glance.
        
         | transpute wrote:
          | One example from the BSA (Business Software Alliance) statement
          | on an earlier draft of the CRA, https://www.bsa.org/files/policy-
          | filings/11012022eucra.pdf :
          | 
          |       The CRA requires manufacturers to ensure vulnerabilities
          |       are handled effectively for the expected product lifetime
          |       or 5 years, whichever is shorter.
        
       | dang wrote:
       | Related. I thought there were others, can anyone find them?
       | 
       |  _Open source liability is coming_ -
       | https://news.ycombinator.com/item?id=38808163 - Dec 2023 (218
       | comments)
       | 
       |  _Debian Statement on the Cyber Resilience Act_ -
       | https://news.ycombinator.com/item?id=38787005 - Dec 2023 (144
       | comments)
       | 
       |  _Can open source be saved from the EU 's Cyber Resilience Act?_
       | - https://news.ycombinator.com/item?id=37880476 - Oct 2023 (12
       | comments)
       | 
       |  _European Cyber Resilience Act [Discussion]_ -
       | https://news.ycombinator.com/item?id=37580247 - Sept 2023 (4
       | comments)
        
       | troupo wrote:
        | I've got to say, I like how people have started paying attention
        | to these laws, actually reading them, and then writing measured
        | takes based in reality, not on the hallucinations of an industry
        | that is usually very much opposed to any kind of regulation.
       | 
        | Other good takes on recent regulations:
       | 
       | - Unraveling the EU Digital Markets Act
       | https://ia.net/topics/unraveling-the-digital-markets-act
       | 
       | - The truth about the EU AI Act and foundation models, or why you
       | should not rely on ChatGPT summaries for important texts
       | https://softwarecrisis.dev/letters/the-truth-about-the-eu-ac...
        
       | Vespasian wrote:
       | I'm glad that the concerns of the open source community were
       | clearly heard and incorporated into the CRA. Experts were
        | listened to, and their involvement in this regulation helped
        | make it better.
       | 
        | As the author states, "regulations are never fun", but this is
        | as good as it gets.
       | 
       | I'm an optimist and hope that this will somewhat dampen the
       | voices on the internet and (unfortunately) on HN that claim the
       | EU is only filled with near evil idiots acting to destroy
       | European industry.
       | 
       | I guess we will see how it goes next time (admittedly my hope is
       | small).
        
       | Aachen wrote:
       | > under article 10(4a), integrators are obliged to share any
       | vulnerabilities they have found in a component with the (open
       | source) manufacturer, including any patches they might have
       | developed
       | 
       | That's good to know about as a security consultancy.
       | 
        | Whenever we find an issue in software made by a third-party
        | vendor, we already recommend reporting it and offer to do it for
        | them (unpaid time on our part, but it gets both the finder and
        | our company publicity, and if we leave it up to the customer it
        | might not happen, which is also bad for everyone else), but now
        | we can say it's required and not just a recommendation.
       | And if there is patching on the customer's part, we get to check
       | the fix if they give it to us for reporting, which in turn makes
       | them more secure.
       | 
       | For us, the situation doesn't really change, but for the tech
       | industry as a whole I see only upsides (at least of this part) :)
        
       ___________________________________________________________________
       (page generated 2023-12-30 23:00 UTC)