[HN Gopher] Ask HN: How much of OpenAI code is written by AI?
       ___________________________________________________________________
        
       Ask HN: How much of OpenAI code is written by AI?
        
       Amid the growing concern that AI will replace software
       engineers, a useful proxy might be how much code at OpenAI is
       written by the company's own models.  If AI is a threat to
       software engineering, I wouldn't expect many software engineers
       to actively accelerate that trend. I personally don't view it as
       a threat, but some people (non-engineers?) obviously do.  I'd be
       curious if any OpenAI engineers can share a rough estimate of
       their day-to-day split of human-written vs. AI-generated code.
        
       Author : growbell_social
       Score  : 15 points
       Date   : 2025-07-13 20:22 UTC (2 hours ago)
        
       | crop_rotation wrote:
       | > If AI is a threat to software engineering, I wouldn't expect
       | many software engineers to actively accelerate that trend.
       | 
       | This is a naive take. Throughout history, work has been
       | automated with the help of the very professionals who were
       | being automated away.
        
         | growbell_social wrote:
         | I don't disagree it's a naive take, and I would love to read
         | about some examples where this happened.
         | 
         | I haven't seen too many industries be automated away
         | firsthand, though I'm sure there are historic examples. I
         | wouldn't expect the lamplighters to have championed the rise
         | of electric lamps. Maybe they did, though, because it meant
         | they could work fewer hours.
        
         | iknowSFR wrote:
         | Yeah, this needs to be considered down to the individual
         | level. If your employer not only incentivizes you to go
         | against your best interests but also threatens the stability
         | of your role, then your choices are either to do the job or
         | to accept that you might not be reliably employed. This is
         | the culmination of decades of power shifting from the
         | employee to the employer.
        
       | notfried wrote:
       | Not OpenAI, but Anthropic CPO Mike Krieger, when asked how
       | much of Claude Code is written by Claude Code, said: "At this
       | point, I would be shocked if it wasn't 95% plus. I'd have to
       | ask Boris and the other tech leads on there."
       | 
       | [0] https://www.lennysnewsletter.com/p/anthropics-cpo-heres-
       | what...
        
         | PostOnce wrote:
         | TFA says "How Anthropic uses AI to write 90-95% of code for
         | some products and the surprising new bottlenecks this creates".
         | 
         | for _some_ products.
         | 
         | If it were 95% of anything useful, Anthropic would not still
         | have >1000 employees, and the rest of the economy would be
         | collapsing, and governments would be taking some kind of
         | action.
         | 
         | Yet none of that appears to be happening. Why?
        
           | ebiester wrote:
           | I don't doubt it, especially when you have an organization
           | that is focused on building the most effective tooling
           | possible. I'd imagine that they use AI even when it isn't
           | optimal, because they are trying to build experiences that
           | will let everyone else do the same.
           | 
           | So let's take it at face value and say 95% is written by
           | AI. When you clear one bottleneck you expose the next. You
           | still need developers to review the code to make sure it's
           | doing the right thing. You still need developers who can
           | translate the business context into instructions that
           | produce the right product. You have to engage with the
           | product. You still need to architect the system: limited
           | context windows mean the tasks can't just be handed off to
           | AI.
           | 
           | So the role of the programmer changes. You still need
           | technical competence, but in service of the judgment call
           | of "what is right for the product?" Perhaps there's a
           | world where developers and product management merge, but I
           | think we will still need the people.
        
         | dude250711 wrote:
         | They are likely lying:
         | 
         | https://www.anthropic.com/candidate-ai-guidance
         | 
         |  _> During take-home assessments: Complete these without
         | Claude unless we indicate otherwise. We'd like to assess
         | your unique skills and strengths. We'll be clear when AI is
         | allowed (example: "You may use Claude for this coding
         | challenge").
         | 
         | > During live interviews: This is all you, no AI assistance
         | unless we indicate otherwise. We're curious to see how you
         | think through problems in real time. If you require any
         | accommodations for your interviews, please let your
         | recruiter know early in the process._
         | 
         | He'd have to ask, yet did not ask? The CPO of an AI company?
        
         | another_twist wrote:
         | Sure, but what did the CTO say? Also, was he shocked?
         | There's no definitive answer; this is an evasive one.
        
       | ivraatiems wrote:
       | I absolutely believe that a large proportion of new code
       | written is at least in part AI-generated, but that doesn't
       | mean a large proportion of new code is 100%
       | soup-to-nuts/pull-request-to-merge the result of decisions
       | made by an agent and not a human. I doubt that _very_ much.
       | 
       | I think the difference between situations where AI-driven
       | development works and doesn't is going to be largely down to the
       | quality of the engineers who are supervising and prompting to
       | generate that code, and the degree to which they manually
       | evaluate it before moving it forward. I think you'll find that
       | good engineers who understand what they're telling an agent to do
       | are still extremely valuable, and are unlikely to go anywhere in
       | the short to mid term. AI tools are not yet at the point where
       | they are reliable on their own, even for systems they helped
       | build, and it's unclear whether they will be any time soon purely
       | through model scaling (though it's possible).
       | 
       | I think you can see the realities of AI tooling in the fact that
       | the major AI companies are hiring lots and lots of engineers, not
       | just for AI-related positions, but for all sorts of general
       | engineering positions. For example, here's a post for a backend
       | engineer at OpenAI: https://openai.com/careers/backend-software-
       | engineer-leverag... - and one from Anthropic: https://job-
       | boards.greenhouse.io/anthropic/jobs/4561280008.
       | 
       | Note that neither of these require direct experience with using
       | AI coding agents, just an interest in the topic! Contrast that
       | with many companies who now demand engineers explain how they are
       | using AI-driven workflows. When they are being serious about
       | getting people to do the work that will make them money, rather
       | than engaging in marketing hype, AI companies are honest: AI
       | agents are tools, just like IDEs, version control systems, etc.
       | It's up to the wise engineer to use them in a valuable way.
       | 
       | Is it possible they're just hiring these folks to try and make
       | their models better to later replace those people? It's possible.
       | But I'm not sure when in time, if ever, they'll reach the point
       | where that was viable.
        
         | osigurdson wrote:
         | >> new code is 100% soup-to-nuts/pull-request-to-merge the
         | result of decisions made by an agent
         | 
         | I am beginning to have more success with this in simpler
         | parts of the code, particularly if you already have a good
         | example of how to do something and you need something very
         | similar. I usually have to make a few tweaks, but it's
         | generally quite useful.
        
       | appreciatorBus wrote:
       | > If AI is a threat to software engineering, I wouldn't expect
       | many software engineers to actively accelerate that trend. I
       | personally don't view it as a threat, but some people (non
       | engineers?) obviously do.
       | 
       | Software engineers have been automating away jobs for _other_
       | people for nearly a century. It would be quite rich if the
        | profession suddenly felt qualms about the process! (To be
        | clear, I think automation is great and should always be
        | pursued. Of course there are real human concerns when change
        | happens quickly, but I am skeptical that smashing the looms
        | is the best response.)
        
         | another_twist wrote:
         | Software engineering has also been automating its own jobs for
         | ages. The first thing we engineers do when asked to do a
         | repetitive thing is find ways to automate it. I think the
         | industry has had qualms about losing jobs. But honestly,
         | what are the examples of people losing their jobs to
         | software? Everybody says this has happened many times, yet
         | examples are hard to come by.
        
       ___________________________________________________________________
       (page generated 2025-07-13 23:01 UTC)