[HN Gopher] AI Code Generation and Cybersecurity
___________________________________________________________________
AI Code Generation and Cybersecurity
Author : tptacek
Score : 42 points
Date : 2021-11-10 17:42 UTC (1 day ago)
(HTM) web link (www.cfr.org)
(TXT) w3m dump (www.cfr.org)
| tptacek wrote:
| Chris Rohlf is one of the smartest software security people I
| know, so while this wasn't on my radar 2 hours ago, I guess it is
| now.
| ethbr0 wrote:
| The writing seems aimed higher up the management chain, but the
| points seem reasonable.
|
| > _First, AI-assisted software development could ensure that
| code is more robust and better tested. [...] AI systems can be
| utilized to produce these kinds of automated testing
| integrations for every function they generate at little to no
| effort. [...] presents an opportunity to significantly scale up
| testing efforts earlier in the development cycle._
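|
| As a rough sketch of what that first point could look like in
| practice (Python/pytest here; the function and its test are
| invented for illustration, not taken from the article): a system
| that emits a function could emit a parametrized test for it in
| the same pass.
|
|     import pytest
|
|     # hypothetical generated function
|     def clamp(value: int, low: int, high: int) -> int:
|         """Return value limited to the inclusive range [low, high]."""
|         return max(low, min(value, high))
|
|     # hypothetical test scaffolding generated alongside it
|     @pytest.mark.parametrize("value,low,high,expected", [
|         (5, 0, 10, 5),    # inside the range
|         (-3, 0, 10, 0),   # clamped up to the lower bound
|         (42, 0, 10, 10),  # clamped down to the upper bound
|     ])
|     def test_clamp(value, low, high, expected):
|         assert clamp(value, low, high) == expected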
|
| > _[Second,] these systems may be able to reduce the attack
| surface through code deduplication. By recognizing when a well-
| tested library or framework can be substituted for human-generated
| code and produce the same effect, the system can reduce the
| overall complexity of the final product._
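|
| A hypothetical before/after for the deduplication point (Python
| again; the names are invented for illustration): a hand-rolled
| secret comparison that a reviewer would flag, and the well-tested
| standard-library call a dedup-aware system could substitute for
| it.
|
|     import hmac
|
|     # hand-rolled: returns early at the first mismatch, leaking
|     # timing information about how much of the token was correct
|     def tokens_match_naive(supplied: str, expected: str) -> bool:
|         if len(supplied) != len(expected):
|             return False
|         for a, b in zip(supplied, expected):
|             if a != b:
|                 return False
|         return True
|
|     # substituted: constant-time comparison from the standard library
|     def tokens_match(supplied: str, expected: str) -> bool:
|         return hmac.compare_digest(supplied.encode(), expected.encode())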
|
| > _[However, automated code generation] may result in difficult
| challenges to tackle related to Software Bill of Materials
| (SBOM) standards being set forth today. This issue occurs
| because by design the AI models obscure the provenance of the
| code they have generated, and code provenance is a key property of
| SBOM._
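|
| To make the SBOM concern concrete (the field names below are
| simplified and made up, not a real SPDX or CycloneDX schema): a
| conventional dependency has an upstream to record, while a block
| of model-generated code has nothing comparable to point at.
|
|     # simplified, illustrative SBOM-style records
|     vendored_component = {
|         "name": "examplelib",
|         "version": "2.4.1",
|         "supplier": "examplelib maintainers",
|         "download_location": "https://example.org/examplelib-2.4.1.tar.gz",
|     }
|
|     generated_snippet = {
|         "name": "pad_left (model suggestion)",
|         "version": None,
|         "supplier": None,           # no upstream project to name
|         "download_location": None,  # provenance obscured by the model
|     }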
| eganist wrote:
| Yeah, this is definitely batting higher than your average
| engineer(ing manager); cfr.org is the initial giveaway.
|
| Let's see if I can replay what I read concisely: AI code
| generation is here and has longer-term potential to improve
| software security by automating integration of security tests and
| by reducing attack surface through deduplicating code, but in its
| current state, its security success hinges largely on the models
| used to train it (copilot being the main example). Future
| developments/target state will probably include generation of
| code based on abstract requirements and app architecture/design,
| but we're a ways off.
|
| ---
|
| ...yeah, that seems on point. Though I'd point out that copilot
| is far, far too nascent and risky to use in any even moderately
| risk-averse environment (due entirely to licensing). Even the
| metaphorical smell of copilot, let alone something more brazen
| such as thankfully non-existent "customer success stories," might
| get an enterprising and litigious contributor to go crawling
| along a firm's published code and attack surface looking for
| indications their code was reused.
|
| On the one hand, I'm glad copilot is a Microsoft thing; money
| will probably be dumped into it by the truckload to
| advance it, which is probably necessary because on the other
| hand, I can't think of very many companies for which copilot
| wouldn't be radioactive for the licensing reason noted above.
| mistrial9 wrote:
| coding really well pays no money by itself; taking other
| people's really useful code, making a sausage out of it,
| selling it to 'secure' contracts through 'secure' banks to
| 'secure' executives who don't code - PROFIT
___________________________________________________________________
(page generated 2021-11-11 23:02 UTC)