[HN Gopher] An Overwhelmingly Negative and Demoralizing Force
       ___________________________________________________________________
        
       An Overwhelmingly Negative and Demoralizing Force
        
       Author : Doches
       Score  : 224 points
       Date   : 2025-04-08 09:22 UTC (13 hours ago)
        
 (HTM) web link (aftermath.site)
 (TXT) w3m dump (aftermath.site)
        
       | terminalbraid wrote:
       | This story just makes me sad for the developers. I think
       | especially for games you need a level of creativity that AI won't
       | give you, especially once you get past the "basic engine
       | boilerplate". That's not to say it can't help you, but this "all
       | in" method just looks forced and painful. Some of the best games
       | I've played were far more "this is the game I wanted to play"
       | with a lot of vision, execution, polish, and careful
       | craftspersonship.
       | 
       | I can only hope endeavors (experiments?) like this extreme fail
       | fast and we learn from it.
        
         | tjpnz wrote:
         | Asset flips (half arsed rubbish made with store bought assets)
         | were a big problem in the games industry not so long ago.
         | They're less prevalent now because gamers instinctively avoid
         | such titles. I'm sure they'll wise up to generative slop too,
         | I've personally seen enough examples to get a general feel for
         | it. Not fun, derivative, soulless, buggy as hell.
        
           | hnthrow90348765 wrote:
           | But make some shallow games with generic, cell-shaded anime
           | waifus accessed by gambling and they eat that shit up
        
             | ang_cire wrote:
             | If someone bothered to make deep, innovative games with
             | cell-shaded anime waifus without gambling, they'd likely
             | switch. This is more likely a market problem of US game
             | companies not supplying sufficient CSAWs (acronym feels
             | unfortunate but somehow appropriate).
        
             | Analemma_ wrote:
             | Your dismissive characterization is not really accurate.
             | Even in the cell-shaded anime waifu genre, there is a
             | spectrum of gameplay quality and gamers do gravitate toward
             | and reward the better games. The big reason MiHoYo games
             | (Genshin Impact, Star Rail) have such a big presence and
             | staying power is that even though they are waifu games at
             | the core, the gameplay is surprisingly good (they're a
             | night-and-day difference compared to slop like Blue
             | Archive), and they're still fun even if you resolve to
             | never pay any microtransactions.
        
       | dkobia wrote:
       | I've been wrestling with this tension between embracing AI tools
       | and preserving human expertise in my work. On one hand, I have
        | experienced genuine productivity gains with LLMs - they help
       | me code, organize thoughts and offer useful perspectives I hadn't
       | even considered. On the other, I realize managers often don't
       | understand the nature of creative work which is trivialized by
       | all the content generation tools.
       | 
       | Creativity emerges through a messy exploration and human
       | experience -- but it seems no one has time for that these days.
       | Managers have found a shiny new tool to do more with less. Also,
       | AI companies are deliberately targeting executives with promises
       | of cost-cutting and efficiency. Someone has to pay for all the
       | R&D.
        
         | 3D30497420 wrote:
         | I had very similar thoughts while reading through the article.
         | I also have found some real value in LLMs, and when used well,
         | I think can and will be quite beneficial.
         | 
          | Notably, a good number of the examples were just straight-up bad
         | management, irrespective of the tools being used. I also think
         | some of these reactions are people realizing that they work for
         | managers or in businesses that ultimately don't really care
         | about the quality of their work, just that it delivers monetary
         | value at the end.
        
       | throwawayfgyb wrote:
       | I really like AI. It allows me to complete my $JOB tasks faster,
       | so I have more time for my passion projects, that I craft
       | lovingly and without crappy AI.
        
         | bluefirebrand wrote:
         | I have never had a job where completing tasks faster wound up
         | with me having more personal free time. It always just means
         | you move on to the next task more quickly
        
           | esafak wrote:
           | Perhaps the OP completes the assigned task ahead of schedule
           | and keeps the saved time.
        
             | htek wrote:
             | Shhh! Do you want to kill AI? All the C-suite and middle
             | management need to hear is that "My QoL has never been
             | better since I could use AI at work! Now I can 'quiet quit'
             | half the day away! I can see my family after hours! Or even
             | have a second job!"
        
           | floriannn wrote:
           | This is a fair bit easier as a remote worker, but even in-
           | office you would just sandbag your time rather than
           | publishing the finished work immediately. In-office it's more
           | likely that you would waste time on the internet rather than
           | working on a personal project though.
        
           | dominicrose wrote:
            | That's not the worst thing. Having more work means you're
            | less bored. You probably won't be paid more though. But
            | being too productive can cause you to have no next task,
            | which isn't the same thing as having free time.
            | 
            | I think that's part of the reason why devs like working from
            | home and not being spied on.
        
         | adrian_b wrote:
          | "AI" is just a trick to circumvent the copyright laws that
          | are the main brake on writing programs quickly.
         | 
         | The "AI" generated code is just code extracted from various
         | sources used for training, which could not be used by a human
         | programmer because most likely they would have copyrights
         | incompatible with the product for which "AI" is used.
         | 
          | All my life I could have written any commercial software
          | much faster if I had been free to just copy and paste random
          | lines of code from open-source libraries and applications,
          | from proprietary programs written for former employers, or
          | from side projects written in my own time with my own
          | resources - projects whose copyrights I am not willing to
          | donate to my current employer, because then I could no
          | longer use my own programs in the future.
         | 
         | I could search and find suitable source code for any current
         | task as fast and with much greater reliability than by
         | prompting an AI application. I am just not permitted to do that
         | by the existing laws, unlike the AI companies.
         | 
          | Already many decades ago, it was claimed that the key to
          | enhancing programmer productivity is more "code reuse".
          | However, "code reuse" has never happened at the scale
          | imagined back then - not for technical reasons, but because
          | of the copyright laws, whose purpose is precisely to prevent
          | code reuse.
         | 
          | Now "AI" appears to be the magical solution that can provide
          | "code reuse" at the scale dreamed of half a century ago, by
          | escaping the copyright constraints.
         | 
         | When writing a program for my personal use, I would never use
         | an AI assistant, because it cannot accelerate my work in any
          | way. For boilerplate code, I use various templates and very
          | smart editor auto-completion; there is no need for any "AI"
          | there.
         | 
         | On the other hand, when writing a proprietary program,
         | especially for some employer that has stupid copyright rules,
         | e.g. not allowing the use of libraries with different
         | copyrights, even when those copyrights are compatible with the
         | requirements of the product, then I would not hesitate to
          | prompt an AI assistant in order to get code stripped of
          | copyright, thus saving time over rewriting equivalent code
          | just so that it can be copyrighted by the employer.
        
           | popularonion wrote:
           | Not sure why this is downvoted. People forget or weren't
           | around for the early 2000s when companies were absolutely
           | preoccupied with code copyright and terrified of lawsuits.
           | That loosened up only slightly during the
           | GitHub/StackOverflow era.
           | 
           | If you proposed something like GitHub Copilot to any company
           | in 2020, the legal department would've nuked you from orbit.
           | Now it's ok because "everyone is doing it and we can't be
           | left behind".
           | 
           | Edit: I just realized this was a driver for why whiteboard
           | puzzles became so big - the ideal employee for MSFT/FB/Google
           | etc was someone who could spit out library quality,
           | copyright-unencumbered, "clean room" code without access to
           | an internet connection. That is what companies had to
           | optimize for.
        
             | int_19h wrote:
             | It's downvoted because it's plainly incorrect.
        
           | bflesch wrote:
            | This is an extremely important point, and the first time
            | I've seen it mentioned with regard to software copyright.
            | Remember the days when companies got sued for including
            | GPL'd code in their proprietary products?
        
         | voidUpdate wrote:
         | I wish I had a job where if I completed all my work quickly, I
         | was allowed to do whatever
        
           | ang_cire wrote:
           | How do they know if you're done, if you haven't "turned it
           | in" yet? They're probably not watching your screen
           | constantly.
           | 
           | My last boss told me essentially (paraphrasing), "I budget
           | time for your tasks. If you finish late, I look like I
           | underestimate time required, or you're not up to it. If you
           | finish early, I look like I overestimate. If I give you a
           | week to do something, I don't care if you finish in 5
           | minutes, don't give it to me until the week is up unless you
           | want something else to do."
        
       | justonceokay wrote:
       | I've always been the kind of developer that aims to have more red
       | lines than green ones in my diffs. I like writing libraries so we
       | can create hundreds of integration tests declaratively. I'm the
       | kind of developer that disappears for two days and comes back
       | with a 10x speedup because I found two loop variables that should
       | be switched.
       | 
        | There is no place for me in this environment. It's not that I
        | couldn't use the tools to churn out code; it's that AI use
        | makes speed-to-production the metric for success. The solution to
       | bad code is more code. AI will never produce a deletion. Publish
       | or perish has come for us and it's sad. It makes me feel old just
       | like my Python programming made the mainframe people feel old. I
       | wonder what will make the AI developers feel old...
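The thread doesn't say what the two swapped loop variables were; a classic case is loop interchange, where switching the loop order restores cache-friendly access to a 2D array. A minimal sketch of that kind of fix (names and sizes here are illustrative, not from the thread):

```python
# Hypothetical illustration of a loop-interchange fix: iterating a 2D
# array in the order it is laid out in memory (row by row) instead of
# column by column can yield large speedups from cache locality, even
# though both loops compute the same result.
N = 500
grid = [[1] * N for _ in range(N)]

def sum_column_major(g):
    # Slow order: the inner loop jumps between rows on every step.
    total = 0
    for col in range(len(g[0])):
        for row in range(len(g)):
            total += g[row][col]
    return total

def sum_row_major(g):
    # Fast order: the inner loop walks each row contiguously.
    total = 0
    for row in range(len(g)):
        for col in range(len(g[0])):
            total += g[row][col]
    return total

assert sum_column_major(grid) == sum_row_major(grid) == N * N
```

The diff for such a fix is tiny (two loop headers swapped), which is exactly why it shows up as "more red lines than green ones" rather than as a pile of new code.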
        
         | pja wrote:
         | > _Unseen were all the sleepless nights we experienced from
         | untested sql queries and regexes and misconfigurations he had
         | pushed in his effort to look good. It always came back to a
         | lack of testing edge cases and an eagerness to ship._
         | 
         | If you do this you are creating a rod for your own back: You
         | need management to see the failures & the time it takes to fix
         | them, otherwise they will assume everything is fine & wonderful
         | with their new toy & proceed with their plan to inflict it on
         | everyone, oblivious to the true costs + benefits.
        
           | lovich wrote:
           | >If you do this you are creating a rod for your own back: You
           | need management to see the failures & the time it takes to
           | fix them, otherwise they will assume everything is fine &
           | wonderful with their new toy & proceed with their plan to
           | inflict it on everyone, oblivious to the true costs +
           | benefits.
           | 
            | If at every company I work for, my managers average 7-8
            | months in their role as _my_ manager, and I am switching jobs
           | every 2-3 years because companies would rather rehire their
           | entire staff than give out raises that are even a portion of
           | the market growth, why would I care?
           | 
           | Not that the market is currently in that state, but that's
           | how a large portion of tech companies were operating for the
           | past decade. Long term consequences don't matter because
           | there are no longer term relationships.
        
         | NortySpock wrote:
         | I think there will still be room for "debugging AI slop-code"
          | and "performance-tuning AI slop-code" and "cranking up the
         | strictness of the linter (or type-checker for dynamically-typed
         | languages) to chase out silly bugs" , not to mention the need
         | for better languages / runtime that give better guarantees
         | about correctness.
         | 
         | It's the front-end of the hype cycle. The tech-debt problems
         | will come home to roost in a year or two.
        
           | AlexandrB wrote:
           | > I think there will still be room for "debugging AI slop-
            | code" and "performance-tuning AI slop-code"
           | 
           | Ah yes, maintenance, the most fun and satisfying part of the
           | job. /s
        
             | WesolyKubeczek wrote:
             | Congrats, you've been promoted to be the cost center. And
             | sloppers will get to the top by cranking out features you
             | will need to maintain.
        
               | popularonion wrote:
               | > slopper
               | 
               | new 2025 slang just dropped
        
             | unraveller wrote:
             | You work in the slop mines now.
        
         | philistine wrote:
         | > AI will never produce a deletion.
         | 
         | That, right here, is a world-shaking statement. Bravo.
        
           | QuadrupleA wrote:
           | Not quite true though - I've occasionally passed a codebase
           | to DeepSeek to have it simplify, and it does a decent job.
           | Can even "code golf" if you ask it.
           | 
            | But the sentiment is true: by default, current LLMs produce
            | verbose, overcomplicated code.
        
           | esafak wrote:
           | Today's assistants can refactor, which includes deletions.
        
             | furyofantares wrote:
             | They can do something that looks a lot like refactoring but
             | they suck extremely hard at it, if it's of any considerable
             | size at all.
        
           | Eliezer wrote:
           | And if it isn't already false it will be false in 6 months,
           | or 1.5 years on the outside. AI is a moving target, and the
           | oldest people among you might remember a time in the 1750s
           | when it didn't talk to you about code at all.
        
           | Taterr wrote:
           | It can absolutely be used to refactor and reduce code, simply
           | asking "Can this be simplified" in reference to a file or
           | system often results in a nice refactor.
           | 
           | However I wouldn't say refactoring is as hands free as
           | letting AI produce the code in the first place, you need to
           | cherry pick its best ideas and guide it a little bit more.
        
         | rqtwteye wrote:
         | You have to go lower down the stack. Don't use AI but write the
         | AI. For the foreseeable future there is a lot of opportunity to
         | make the AI faster.
         | 
         | I am sure assembly programmers were horrified at the code the
         | first C compilers produced. And I personally am horrified by
         | the inefficiency of python compared to the C++ code I used to
          | write. We have always traded runtime efficiency for faster
          | development.
        
           | kmeisthax wrote:
            | C solved the horrible machine code problem by afflicting
            | programmers with the concept of undefined behavior, where
           | blunt instruments called optimizers take a machete to your
           | code. There's a very expensive document locked up somewhere
           | in the ISO vault that tells you what you can and can't write
           | in C, and if you break any of those rules the compiler is
           | free to write whatever it wants.
           | 
           | This created a league of incredibly elitist[0] programmers
           | who, having mastered what they thought was the rules of C,
           | insisted to everyone else that the real problem was you not
           | understanding C, not the fact that C had made itself a
           | nightmare to program in. C is bad soil to plant a project in
           | even if you know where the poison is and how to avoid it.
           | 
           | The inefficiency of Python[1] is downstream of a trauma
           | response to C and all the many, many ways to shoot yourself
           | in the foot with it. Garbage collection and bytecode are
           | tithes paid to absolve oneself of the sins of C. It's not a
           | matter of Python being "faster to write, harder to execute"
           | as much as Python being used as a defense mechanism.
           | 
           | In contrast, the trade-off from AI is unclear, aside from the
           | fact that you didn't spend time writing it, and thus aren't
           | learning anything from it. It's one thing to sacrifice
           | performance for stability; versus sacrificing efficiency and
           | understanding for faster code churn. I don't think the latter
           | is a good tradeoff! That's how we got under-baked and
           | developer-hostile ecosystems like C to begin with!
           | 
           | [0] The opposite of a "DEI hire" is an "APE hire", where APE
           | stands for "Assimilation, Poverty & Exclusion"
           | 
           | [1] I'm using Python as a stand-in for any memory-safe
           | programming language that makes use of a bytecode interpreter
           | that manipulates runtime-managed memory objects.
        
             | pfdietz wrote:
             | Why was bytecode needed to absolve ourselves of the sins of
             | C?
        
           | EVa5I7bHFq9mnYK wrote:
           | C was specifically designed to map 1:1 onto PDP-11 assembly.
           | For example, the '++' operator was created solely to
           | represent auto-increment instructions like TST (R0)+.
        
           | 01HNNWZ0MV43FF wrote:
           | The AI companies probably use Python because all the
           | computation happens on the GPU and changing Python control
           | plane code is faster than changing C/C++ control plane code
        
         | AnimalMuppet wrote:
         | If the company values that 10x speedup, there is absolutely
         | still a place for you in this environment. Only now it's going
         | to take five days instead of two, because it's going to be
         | harder to track that down in the less-well-structured stuff
         | that AI produces.
        
           | Leynos wrote:
           | Why are you letting the AI construct poorly structured code?
           | You should be discussing an architectural plan with it first
           | and only signing off on the code design when you are
           | comfortable with it.
        
         | candiddevmike wrote:
          | I wonder what impact LLM codegen will have on open source
          | projects like Kubernetes and Linux.
        
           | bluefirebrand wrote:
           | I haven't really seen what Linus thinks of LLMs but I'm
           | curious
           | 
           | I suspect he is pretty unimpressed by the code that LLMs
           | produce given his history with code he thinks is subpar, but
           | what do I know
        
             | fifilura wrote:
             | Not really.
             | 
             | https://blog.mathieuacher.com/LinusTorvaldsLLM/
        
         | ajjenkins wrote:
         | AI can definitely produce a deletion. In fact, I commonly use
         | AI to do this. Copy some code and prompt the AI to make the
         | code simpler or more concise. The output will usually be fewer
         | lines of code.
         | 
         | Unless you meant that AI won't remove entire features from the
         | code. But AI can do that too if you prompt it to. I think the
         | bigger issue is that companies don't put enough value on
         | removing things and only focus on adding new features. That's
         | not a problem with AI though.
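A minimal sketch of the simplification workflow described above, with hypothetical before/after functions (not from the thread): the verbose version is the kind of code one might paste into an assistant, and the concise version is the kind of net deletion it typically suggests.

```python
# Hypothetical before/after of prompting an assistant to "make this
# simpler": an accumulator loop collapsed into a comprehension.

def evens_squared_verbose(nums):
    # The "before": correct, but more lines than needed.
    result = []
    for n in nums:
        if n % 2 == 0:
            square = n * n
            result.append(square)
    return result

def evens_squared_concise(nums):
    # The "after": same behavior, fewer lines - a net deletion.
    return [n * n for n in nums if n % 2 == 0]

assert evens_squared_verbose([1, 2, 3, 4]) == [4, 16]
assert evens_squared_concise([1, 2, 3, 4]) == [4, 16]
```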
        
           | ryandrake wrote:
           | I messed around with Copilot for a while and this is one of
           | the things that actually really impressed me. It was very
           | good at taking a messy block of code, and simplifying it by
           | removing unnecessary stuff, sometimes reducing it to a one
           | line lambda. Very helpful!
        
             | buggy6257 wrote:
             | > sometimes reducing it to a one line lambda.
             | 
             | Please don't do this :) Readable code is better than clever
             | code!
        
               | bluefirebrand wrote:
               | Especially "clever" code that is AI generated!
               | 
               | At least with human-written clever code you can trust
               | that somebody understood it at one point but the idea of
               | trusting AI generated code that is "clever" makes my skin
               | crawl
        
               | throwaway889900 wrote:
                | Sometimes a lambda is more readable. "lambda x: x if x
                | else 1" is pretty understandable and doesn't need to be
                | its own separately defined function.
                | 
                | I should also note that development style also depends on
                | tools, so if your IDE makes inline functions more
                | readable in its display, it's fine to use concisely
                | defined lambdas.
                | 
                | Readability is a personal preference thing at some point,
                | after all.
        
               | gopher_space wrote:
               | My cleverest one-liners will block me when I come back to
               | them unless I write a few paragraphs of explanation as
               | well.
        
           | Freedom2 wrote:
           | I'm no big fan of LLM generated code, but the fact that GP
           | bluntly states "AI will never produce a deletion" despite
           | this being categorically false makes it hard to take the rest
           | of their spiel in good faith.
           | 
           | As a side note, I've had coworkers disappear for N days too
           | and in that time the requirements changed (as is our
           | business) and their lack of communication meant that their
           | work was incompatible with the new requirements. So just
           | because someone achieves a 10x speedup in a vacuum also isn't
           | necessarily always a good thing.
        
             | fifilura wrote:
              | I'd also be wary of the risk of being an architecture-
             | astronaut.
             | 
             | A declarative framework for testing may make sense in some
             | cases, but in many cases it will just be a complicated way
             | of scripting something you use once or twice. And when you
             | use it you need to call up the maintainer anyway when you
             | get lost in the yaml.
             | 
             | Which of course feels good for the maintainer, to feel
             | needed.
        
           | specialist wrote:
           | This is probably just me projecting...
           | 
            | u/justonceokay wrote:
           | 
           | > _The solution to bad code is more code._
           | 
           | This has always been true, in all domains.
           | 
           | Gen-AI's contribution is further automating the production of
           | "slop". Bots arguing with other bots, perpetuating the
           | vicious cycle of bullshit jobs (David Graeber) and
            | enshittification (Cory Doctorow).
           | 
            | u/justonceokay wrote:
           | 
           | > _AI will never produce a deletion._
           | 
           | I acknowledge your example of tidying up some code. What Bill
           | Joy may have characterized as "working in the small".
           | 
            | But what of novelty, craft, innovation? Can Gen-AI moot the
           | need for code? Like the oft-cited example of -2,000 LOC?
           | https://www.folklore.org/Negative_2000_Lines_Of_Code.html
           | 
           | Can Gen-AI do the (traditional, pre 2000s) role of quality
           | assurance? Identify unnecessary or unneeded work? Tie
           | functionality back to requirements? Verify the goal has been
           | satisfied?
           | 
           | Not yet, for sure. But I guess it's conceivable, provided
           | sufficient training data. Is there sufficient training data?
           | 
           | You wrote:
           | 
           | > _only focus on adding new features_
           | 
           | Yup.
           | 
           | Further, somewhere in the transition from shipping CDs to
           | publishing services, I went from developing products to just
           | doing IT & data processing.
           | 
           | The code I write today (in anger) has a shorter shelf-life,
           | creates much less value, is barely even worth the bother of
           | creation much less validation.
           | 
           | Gen-AI can absolutely do all this @!#!$hit IT and data
           | processing monkey motion.
        
             | gopher_space wrote:
             | > Can Gen-AI, moot the need for code?
             | 
             | During interviews one of my go-to examples of problem
             | solving is a project I was able to kill during discovery,
             | cancelling a client contract and sending everyone back to
             | the drawing board.
             | 
             | Half of the people I've talked to do not understand why
             | that might be a positive situation for everyone involved. I
             | need to explain the benefit of having clients think you
             | walk on water. They're still upset my example isn't heavy
             | on any of the math they've memorized.
             | 
             | It feels like we're wondering how _wise_ an AI can be in an
             | era where wisdom and long-term thinking aren 't really
             | valued.
        
         | stuckinhell wrote:
         | AI can do deletions and refactors, and 10x speedups. You just
         | need to push the latest models constantly.
        
         | DeathArrow wrote:
         | >AI use makes the metric for success speed-to-production
         | 
         | Wasn't it like that always for most companies? Get to market
         | fast, add features fast, sell them, add more features?
        
         | 762236 wrote:
         | AI writes my unit tests. I clean them up a bit to ensure I've
         | gone over every line of code. But it is nice to speed through
         | the boring parts, and without bringing declarative constructs
         | into play (imperative coding is how most of us think).
        
       | nilkn wrote:
       | I don't have much sympathy for this. This country has long
       | expected millions and millions of blue collar workers to accept
       | and embrace change or lose their careers and retirements. When
       | those people resisted, they were left to rot. Now I'm reading a
       | sob story about someone throwing a fit because they refuse to
       | learn to use ChatGPT and Claude and the CEO had to sit them down
       | and hold their hand in a way. Out of all the skillset transitions
       | that history has required or imposed, this is one of the easiest
       | ever.
       | 
       | They weren't fired; they weren't laid off; they weren't
       | reassigned or demoted; they got attention and assistance from the
       | CEO and guidance on what they needed to do to change and adapt
       | while keeping their job and paycheck at the same time, with
       | otherwise no disruption to their life at all for now.
       | 
       | Prosperity and wealth do not come for free. You are not owed
       | anything. The world is not going to give you special treatment or
       | handle you with care because you view yourself as an artisan.
       | Those are rewards for people who keep up, not for those who
       | resist change. It's always been that way. Just because you've so
       | far been on the receiving end of prosperity doesn't mean you're
       | owed that kind of easy life forever. Nobody else gets that kind
       | of guarantee -- why should you?
       | 
       | The bottom line is the people in this article will be learning
       | new skills one way or another. The only question is whether those
       | are skills that adapt their existing career for an evolving world
       | or whether those are skills that enable them to transition
       | completely out of development and into a different sector
       | entirely.
        
         | petesergeant wrote:
          | > Those are rewards for people who keep up, not for those who
         | resist change.
         | 
         | lol. I work with LLM outputs all day -- like it's my job to
         | make the LLM do things -- and I probably speak to some LLM to
         | answer a question for me between 10 and 100 times a day.
         | They're kinda helpful for some programming tasks, but pretty
         | bad at others. Any company that tried to mandate me to use an
         | LLM would get kicked to the curb. That's not because I'm "not
         | keeping up", it's because they're simply not good enough to put
         | more work through.
        
           | ewzimm wrote:
           | Wouldn't this depend a lot on how management responds to your
           | use? For example, if you just kept a log of prompts and
           | outputs with notes about why the output wasn't acceptable,
           | that could be considered productive use in this early stage
           | of LLMs, especially if management's goal was to have you
           | learning how to use LLMs. Learning how not to use something
           | is just as important in the process of adapting any new tool.
           | 
           | If management is convinced of the benefits of LLMs and the
           | workers are all just refusing to use them, the main problem
           | seems to be a dysfunctional working environment. It's
           | ultimately management's responsibility to work that out, but
           | if the management isn't completely incompetent, people tasked
           | with using them could do a lot to help the situation by
            | testing and providing constructive feedback, rather than
            | taking a stand by refusing to try and offering grand
            | narratives about damaging the artistic integrity of something
            | that has been commoditized from inception, like video game
            | art. I'm not saying that video game art can't be art, but it
            | has existed in a commercial crunch culture since the 1970s.
        
         | kmeisthax wrote:
         | If you're not doing the work, you're not learning from the
         | result.
         | 
         | The CEOs in question bought what they believed to be a power
         | tool, but got what is more like a smarter copy machine. To be
         | clear, copy machines are not useless, but they also aren't
         | going to drive the 200% increases in productivity that people
         | think they will.
         | 
         | But because management demands the 200% increase in
         | productivity they were promised by the AI tools, all the
         | artists and programmers on the team hear "stop doing anything
         | interesting or novel, just copy what already exists". To be
         | blunt, that's not the shit they signed up for, and it's going
         | to result in a far worse product. Nobody wants slop.
        
       | recursivedoubts wrote:
       | I teach compilers, systems, etc. at a university. Innumerable
       | times I have seen AI lead a poor student down a completely
       | incorrect but plausible path that will still compile.
       | 
        | I'm adding `.noai` files to all the projects going forward:
       | 
       | https://www.jetbrains.com/help/idea/disable-ai-assistant.htm...
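Per the linked JetBrains page, the opt-out is just a marker file named `.noai` in the project root. A minimal Python sketch for stamping one into each project directory; the course layout here is hypothetical:

```python
from pathlib import Path

# Drop an empty .noai marker into each project directory so the
# JetBrains AI Assistant stays disabled for that project.
course_root = Path("compilers-projects")  # hypothetical layout
for project in course_root.glob("project-*"):
    (project / ".noai").touch()
```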
       | 
       | AI may be somewhat useful for experienced devs but it is a
       | catastrophe for inexperienced developers.
       | 
       | "That's OK, we only hire experienced developers."
       | 
       | Yes, and where do you suppose experienced developers come from?
       | 
        | Again and again in this AI arc I'm reminded of the sorcerer's
        | apprentice scene from Fantasia.
        
         | robinhoode wrote:
         | > Yes, and where do you suppose experienced developers come
         | from?
         | 
         | Almost every time I hear this argument, I realize that people
         | are not actually complaining about AI, but about how modern
         | capitalism is going to use AI.
         | 
         | Don't get me wrong, it will take huge social upheaval to
         | replace the current economic system.
         | 
         | But at least it's an honest assessment -- criticizing the
         | humans that are using AI to replace workers, instead of
         | criticizing AI itself -- even if you fear biting the hands that
         | feed you.
        
           | recursivedoubts wrote:
           | i don't think it's an either/or situation
        
           | bayindirh wrote:
            | Actually, there are two main problems with AI:
            | 
            | 1. How it's gonna be used and how it'll be a detriment to
            | quality and knowledge.
            | 
            | 2. How AI models are trained with a great disregard for
            | consent, ethics, and licenses.
           | 
            | The technology itself, the idea, what it can do, is not the
            | problem, but how it's made and how it's _gonna_ be used will
            | be a great problem going forward, and _none of the suppliers_
            | say that it should be used in moderation or that it will be
            | harmful in the long run. Plus the same producers are ready to
            | crush/distort anything to get their way.
           | 
            | ... smells very similar to the tobacco and soda industries.
            | Both created faux-research institutes to further their causes.
        
             | EFreethought wrote:
             | I would say the huge environmental cost is a third problem.
        
               | bayindirh wrote:
               | Yeah, that's true.
        
             | clown_strike wrote:
              | > How AI models are trained with a great disregard for
              | consent, ethics, and licenses.
             | 
             | You must be joking. Consumer models' primary source of
             | training data seems to be the legal preambles from BDSM
             | manuals.
        
           | ToucanLoucan wrote:
           | > Almost every time I hear this argument, I realize that
           | people are not actually complaining about AI, but about how
           | modern capitalism is going to use AI.
           | 
            | This was pretty consistently my and many others' viewpoint
           | since 2023. We were assured many times over that this time it
           | would be different. I found this unconvincing.
        
           | rchaud wrote:
           | > I realize that people are not actually complaining about
           | AI, but about how modern capitalism is going to use AI.
           | 
           | Something very similar can be said about the issue of guns in
           | America. We live in a profoundly sick society where the
           | airwaves fill our ears with fear, envy and hatred. The easy
           | availability of guns might not have been a problem if it
           | didn't intersect with a zero-sum economy.
           | 
           | Couple that with the unavailability of community and social
            | supports and you have a recipe for disaster.
        
           | lcnPylGDnU4H9OF wrote:
           | > criticizing the humans that are using AI to replace
           | workers, instead of criticizing AI itself
           | 
           | I think you misunderstand OP's point. An employer saying "we
           | only hire experienced developers [therefore worries about
           | inexperienced developers being misled by AI are unlikely to
           | manifest]" doesn't seem to realize that the AI is what makes
           | inexperienced developers. In particular, using the AI to
           | learn the craft will not allow prospective developers to
           | learn the fundamentals that will help them understand when
           | the AI is being unhelpful.
           | 
           | It's not so much to do with roles currently being performed
           | by humans instead being performed by AI. It's that the
           | experienced humans (engineers, doctors, lawyers, researchers,
           | etc.) who can benefit the most from AI will eventually retire
           | and the inexperienced humans who don't benefit much from AI
           | will be shit outta luck because the adults in the room didn't
           | think they'd need an actual education.
        
         | ffsm8 wrote:
         | > Yes, and where do you suppose experienced developers come
         | from?
         | 
         | Strictly speaking, you don't even need university courses to
         | get experienced devs.
         | 
         | There will always be individuals that enjoy coding and do so
         | without any formal teaching. People like that will always be
         | more effective at their job once employed, simply because
         | they'll have just that much more experience from trying various
         | stuff.
         | 
          | Not to discredit university degrees of course - the best devs
         | will have gotten formal teaching and code in their free time.
        
           | erikerikson wrote:
           | GP didn't mention university degrees.
           | 
           | You get experienced devs from inexperienced devs that get
           | experience.
           | 
           | [edit: added "degrees" as intended. University _was_
           | mentioned as the context of their observation]
        
             | ffsm8 wrote:
             | The first sentence contextualized the comment to university
             | degrees as far as I'm concerned. I'm not sure how you could
             | interpret it any other way, but maybe you can enlighten me.
        
               | erikerikson wrote:
               | I read it as this is the context from which I make the
               | following observation. It's not excluding degrees but
               | certainly not requiring them.
        
           | bluefirebrand wrote:
           | > People like that will always be more effective at their job
           | once employed
           | 
           | This is honestly not my experience with self taught
           | programmers. They can produce excellent code in a vacuum but
           | they often lack a ton of foundational stuff
           | 
           | In a past job, I had to untangle a massive nested loop
           | structure written by a self taught dev, which did work but
           | ran extremely slowly
           | 
           | He was very confused and asked me to explain why my code ran
           | fast, his ran slow, because "it was the same number of loops"
           | 
              | I tried to explain Big O, linear versus quadratic
              | complexity, etc, but he really didn't get it
           | 
           | But the company was very impressed by him and considered him
           | our "rockstar" because he produced high volumes of code very
           | quickly
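The pattern behind this anecdote is common enough to sketch. A hypothetical Python version (not the actual code from the story): both functions return the same rows, but the nested loop does n x m comparisons while the dict-based join makes one pass over each list, which is exactly the distinction Big O captures.

```python
def join_slow(users, orders):
    # Nested loops: for every order, scan the whole user list -> O(n * m)
    result = []
    for order in orders:
        for user in users:
            if user["id"] == order["user_id"]:
                result.append((user["name"], order["total"]))
    return result

def join_fast(users, orders):
    # Index users by id once, then a single pass over orders -> O(n + m)
    by_id = {u["id"]: u for u in users}
    return [(by_id[o["user_id"]]["name"], o["total"]) for o in orders]

users = [{"id": i, "name": f"user{i}"} for i in range(1000)]
orders = [{"user_id": i % 1000, "total": i} for i in range(2000)]
assert join_slow(users, orders) == join_fast(users, orders)
```

Both versions look like "the same number of loops" when reading casually; only the work per input element differs.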
        
             | taosx wrote:
             | I was self taught before I studied, most of the
             | "foundational" knowledge is very easy to acquire. I've
             | mentored some self-taught juniors and they surprised me at
             | how fast they picked up concepts like big O just by looking
             | at a few examples.
        
               | bluefirebrand wrote:
                | Big O was just an anecdotal example
               | 
               | My point is you don't know what you don't know. There is
               | really only so far you can get by just noodling around on
               | your own, at some point we have to learn from more
               | experienced people to get to the next level
               | 
               | School is a much more consistent path to gain that
               | knowledge than just diving in
               | 
               | It's not the only path, but it turns out that people like
               | consistency
        
               | abbadadda wrote:
               | I would like a book recommendation for the things I don't
               | know please (Sarcasm but seriously)...
               | 
                | A senior dev mentioned a "class invariant" the other day
                | and I just had no idea what that was because I've never
                | been exposed to it... So I suppose the question I have is:
                | what should I be exposed to in order to know that? What
                | else is there that I need to learn about software
                | engineering that I don't know, that is similarly going to
                | be embarrassing on the job if I don't know it? I've got
                | books like Cracking the Coding Interview and Software
                | Engineering at Google... But I am missing a huge gap
                | because I was unable to finish my master's in computer
                | science :-(
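On the "class invariant" question specifically: it's a condition that must hold for every instance of a class between public method calls; the constructor establishes it and every mutator must preserve it. A toy Python sketch with a hypothetical `BankAccount`:

```python
class BankAccount:
    """Invariant: balance is never negative."""

    def __init__(self, balance=0):
        self.balance = balance
        self._check()

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        self._check()  # re-establish the invariant after every mutation

    def _check(self):
        assert self.balance >= 0, "class invariant violated"

acct = BankAccount(100)
acct.withdraw(30)
assert acct.balance == 70
```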
        
             | ffsm8 wrote:
             | I literally said as much?
             | 
             | > _Not to discredit University degrees of course - the best
             | devs will have gotten formal teaching and code in their
             | free time._
        
               | Arainach wrote:
               | The disagreement is over the highlighted line:
               | 
               | >People like that will always be more effective at their
               | job once employed
               | 
               | My experience is that "self taught" people are passionate
               | about solving the parts they consider fun but do not have
               | the breadth to be as effective as most people who have
               | formal training but less passion. The previous poster
               | also called out real issues with this kind of developer
               | (not understanding time complexity or how to fix things)
               | that I have repeatedly seen in practice.
        
               | ffsm8 wrote:
                | But the sentence is about people coding in their free
                | time vs not doing so... If you take issue with that,
                | you're arguing that self-taught people who don't code in
                | their free time are better at coding than the people who
                | do - or that people with formal training who don't code
                | in their free time are better at it than people who have
                | formal training and do...
                | 
                | I just pointed out that removing classes entirely would
                | still get you experienced people, even if they'd likely
                | be better if they code and get formal training. I stated
                | that _very_ plainly
        
               | bluefirebrand wrote:
               | > I stated that very plainly
               | 
               | You actually didn't state it very plainly at all. Your
               | initial post is contradictory, look at these two
               | statements side by side
               | 
               | > There will always be individuals that enjoy coding and
               | do so without any formal teaching. People like that will
               | always be more effective at their job once employed
               | 
               | > the best devs will have gotten formal teaching and code
               | in their free time
               | 
               | People who enjoy coding without formal training -> more
               | effective
               | 
               | People who enjoy coding and have formal training -> best
               | devs
               | 
               | Anyways I get what you were trying to say, now. You just
               | did not do a very good job of saying it imo. Sorry for
               | the misunderstanding
        
               | Izkata wrote:
               | I read this one:
               | 
               | > There will always be individuals that enjoy coding and
               | do so without any formal teaching. People like that will
               | always be more effective at their job once employed
               | 
               | As "people who enjoy coding and didn't need formal
               | training to get started". It includes both people who
               | have and don't have formal training.
               | 
               | Both statements together are (enthusiasm + formal) >
               | (enthusiasm without formal) > (formal without
               | enthusiasm).
        
               | bluefirebrand wrote:
               | Sure that's a valid interpretation but it wasn't how I
               | read it
               | 
               | > Both statements together are (enthusiasm + formal) >
               | (enthusiasm without formal) > (formal without
               | enthusiasm).
               | 
               | I don't think the last category (formal education without
               | enthusiasm) really exists, I think it is a bit of a
               | strawman being held up by people who are *~passionate~*
               | 
               | I suspect that without any enthusiasm, people will not
               | make it through any kind of formal education program, in
               | reality
        
               | ffsm8 wrote:
               | Uh, almost nobody I've worked with to date codes in their
               | free time with any kind of regularity.
               | 
               | If you've never encountered the average 9-5 dev that just
               | does the least amount of effort they can get away with,
                | then I have to applaud the HR departments of the
               | companies you've worked for. Whatever they're doing,
               | they're doing splendid work.
               | 
               | And almost all of my coworkers are university grads that
               | do literally the same you've used as an example for non
               | formally taught people: they write abysmally performing
               | code because they often have an unreasonable fixation on
               | practices like inversion of control (as a random
               | example).
               | 
                | As a particularly hilarious example, I've had to explain
               | to such a developer that an includes check on a large
               | list in a dynamic language such as JS performs abysmally
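The `includes`-on-a-large-list problem has the same shape in any dynamic language. A quick Python sketch of it (hypothetical sizes): membership tests on a list scan linearly, while a set does a hash lookup.

```python
import time

items = list(range(200_000))
as_set = set(items)

def lookup_time(container, probes):
    start = time.perf_counter()
    for p in probes:
        _ = p in container  # O(n) scan for a list, O(1) hash lookup for a set
    return time.perf_counter() - start

probes = [199_999] * 200           # worst case for the list: last element
slow = lookup_time(items, probes)  # scans ~200k elements per probe
fast = lookup_time(as_set, probes)
assert fast < slow                 # the set wins by orders of magnitude
```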
        
           | philistine wrote:
           | > There will always be individuals that enjoy coding and do
           | so without any formal teaching.
           | 
           | We're talking about the industry responsible for ALL the
           | growth of the largest economy in the history of the world.
           | It's not the 1970s anymore. You can't just count on weirdos
           | in basements to build an industry.
        
             | dingnuts wrote:
             | I'm so glad I learned to program so I could either be
             | called a basement dweller or a tech bro
        
               | philistine wrote:
               | I mean, a garage dweller works just as well.
        
       | voidhorse wrote:
       | I think the software industry will look just like the material
       | goods space post-industrialization after the dust settles:
       | 
       | Large corporations will use AI to deliver low-quality software at
       | high speed and high scale.
       | 
       | "Artisan" developers will continue to exist, but in much smaller
       | numbers and they will mostly make a living by producing refined,
       | high-quality custom software at a premium or on creative
       | marketplaces. Think Etsy for software.
       | 
       | That's the world we are heading for, unless/until companies
        | decide LLMs are ultimately not cost-beneficial or overzealous use
        | of them leads to a real hallucination-induced catastrophe.
        
         | GarnetFloride wrote:
         | Sounds like fast fashion. The thinnest, cheapest fabric,
         | slapped together as fast as possible with the least amount of
         | stitching. Shipped fast and obsolete fast.
        
           | tmpz22 wrote:
           | Fast fashion - also ruinous to the environment.
        
       | mattgreenrocks wrote:
       | Management: "devs aren't paid to play with shiny new tech, they
       | should be shipping features!"
       | 
       | Also management: "I need you to play with AI and try to find a
       | use for it"
        
       | rchaud wrote:
       | > "I have no idea how he ended up as an art director when he
       | can't visualise what he wants in his head unless can see some end
       | results", Bradley says. Rather than beginning with sketches and
       | ideas, then iterating on those to produce a more finalised image
       | or vision, Bradley says his boss will just keep prompting an AI
       | for images until he finds one he likes, and then the art team
       | will have to backwards engineer the whole thing to make it work.
       | 
       | Sounds like an "idea guy" rather than an art director or
       | designer. I would do this exact same thing, but on royalty-free
       | image websites, trying to get the right background or explanatory
       | graphic for my finance powerpoints. Unsurprisingly, Microsoft now
       | has AI "generating" such images for you, but it's much slower
       | than what I could do flipping through those image sites.
        
       | lanfeust6 wrote:
       | It would be an understatement to call this a skewed perspective.
       | In most of the anecdotes they seem to try really hard to
       | trivialize the productive benefits of AI, which is difficult to
       | take seriously. The case that LLMs create flawed outputs or are
       | limited in what they can do is not controversial at all, but by
        | and large, experienced developers report that it has improved
        | their productivity and is now part of their workflow. Whether
        | businesses and higher-ups try to use it in absurd ways is
        | neither here nor there. That, and culture issues, were problems
        | before AI.
       | 
       | Obviously some workers have a strong incentive to oppose
       | adoption, because it may jeopardize their careers. Even if the
        | capabilities are overstated, it can be a self-fulfilling prophecy
        | via the choices higher-ups make. Union shops will try to stall it,
       | but it's here to stay. You're in a globally competitive market.
        
         | oneeyedpigeon wrote:
         | I have very little objection to AI, providing we get UBI to
         | mitigate the fallout.
        
           | lanfeust6 wrote:
           | Right, well even without AGI (no two people agree on whether
           | it's coming within 5 years, 30, or 100), finely-tuned LLMs
           | can disrupt the economy fast if the bottlenecks get taken
           | care of. The big one is the robot-economy. This is popularly
           | placed further off in timescales, but it does not require AGI
           | at all. We already have humanoid robots on the market for the
           | price of a small car, they're just dumb. Once we scale up
           | solar and battery production, and then manufacturing, it's
           | coming for menial labor jobs. They already have all the
           | pieces, it's a foregone conclusion. What we don't know how to
           | do is to create a real "intelligence", and here the
           | evangelists will wax about the algorithms and the nature of
           | intelligence, but at the end of the day it takes more than
            | scaling up an LLM to constitute an AGI. The bet is that
            | AI-assisted research will lead to a breakthrough in a
            | trivial amount of time.
           | 
           | With white-collar jobs the threat of AI feels more abstract
           | and localized, and you still get talk about "creating new
           | jobs", but when robots start coming off the assembly line
           | people will demand UBI so fast it will make your head spin.
           | Either that or they'll try to set fire to them or block them
           | with unions, etc. Hard to say when because another effort
           | like the CHIPS act could expedite things.
        
             | dingnuts wrote:
             | Humanoid robots on the market for the price of a small car?
             | That's complete science fiction. There have been demos of
             | such robots but only demos.
        
               | lanfeust6 wrote:
               | > Humanoid robots on the market for the price of a small
               | car? That's complete science fiction.
               | 
               | Goldman Sachs doesn't think so.
               | 
               | https://www.fortunebusinessinsights.com/humanoid-robots-
               | mark...
               | 
               | https://finance.yahoo.com/news/humanoid-robot-market-
               | researc...
               | 
               | https://www.mordorintelligence.com/industry-
               | reports/robotics...
               | 
               | They don't even need to be humanoid is the thing.
        
               | stuckinhell wrote:
               | 16k humanoid robots https://www.unitree.com/g1/
        
               | Izkata wrote:
               | And 1X has apparently been testing theirs in home
               | environments for months, though it looks like they're not
               | for sale yet:
               | https://techcrunch.com/2025/02/21/norways-1x-is-building-
               | a-h...
        
             | stuckinhell wrote:
             | https://www.unitree.com/g1/ 16k humanoid robot
        
               | namaria wrote:
               | Do they do anything or are they an expensive toy in the
               | shape of a humanoid robot?
        
           | parpfish wrote:
           | I was thinking about this and realized that if we want an AI
           | boom to lead to UBI, AI needs to start replacing the cushy
           | white collar jobs first.
           | 
           | If you start by replacing menial labor, there will be more
           | unemployment but you're not going to build the political will
           | to do anything because those jobs were seen as "less than"
           | and the political class will talk about how good and
           | efficient it is that these jobs are gone.
           | 
           | You need to start by automating away "good jobs" that
           | directly affect middle/upper class people. Jobs where people
           | have extensive training and/or a "calling" to the field. Once
           | lawyers, software engineers, doctors, executives, etc get
           | smacked with widespread unemployment, the political class
           | will take UBI much more seriously.
        
             | lanfeust6 wrote:
             | Incidentally it seems to be happening in that order, but
             | laborers won't have a long respite (if you can call it
             | that)
        
               | parpfish wrote:
               | i think that the factor determining which jobs get
               | usurped by AI first isn't going to be based on the
               | cognitive difficulty as much as it is about robotic
               | difficulty and interaction with the physical world.
               | 
                | if your job consists of reading from a computer ->
               | thinking -> entering things back into a computer, you're
               | on the top of the list because you don't need to set up a
               | bunch of new sensors and actuators. In other words... the
               | easier it is to do your job remotely, the more likely it
               | is you'll get automated away
        
             | waveringana wrote:
              | needing a lawyer and needing a doctor are very common causes
              | of bankruptcy in the US. both feel very primed to be
             | replaced by models
        
             | stuckinhell wrote:
             | I suspect elites will build a two-tiered AI system where
             | only a select few get access to the cutting-edge stuff,
             | while the rest of us get stuck with the leftovers.
             | 
             | They'll use their clout--money, lobbying, and media
             | influence--to lock in their advantage and keep decision-
             | making within their circle.
             | 
             | In the end, this setup would just widen the gap, cementing
             | power imbalances as AI continues to reshape everything. UBI
             | will become the bare minimum to keep the masses sedated.
        
           | hello_computer wrote:
           | But why will that happen? If they have robots and AI and all
           | the money, what's stopping the powers that be from disposing
           | of the excess biomass?
        
             | lanfeust6 wrote:
             | What's there to gain? What do they care about biomass?
             | They're still in the business of selling products, until
             | the economy explodes. I find this to be circular because
             | you could say the same thing about right now, "why don't
             | they dispose of the welfare class" etc.
             | 
              | There's also the fact that "they" aren't all one and the
              | same person with the exact same worldview and interests.
        
               | hello_computer wrote:
               | The Davos class was highly concerned about ecology before
               | Davos was even a thing. In America, their minions (the
               | "coastie" class) are coming to see the liquidation of the
               | kulaks as perhaps not such a bad thing. If it devolves
               | into a "let them eat cake" scenario, one has to wonder
               | how things will play out in "proles vs robot pinkertons".
               | Watch what the sonic crowd control trucks did in Serbia
               | last week.
               | 
               | Of course there is always the issue of "demand"--of
               | keeping the factories humming, but when you are worth
               | billions, your immediate subordinates are worth hundreds
               | of millions, and all of their subordinates are worth a
               | few million, maybe you come to a point where "lebensraum"
               | becomes more valuable to you than another zero at the end
               | of your balance?
               | 
               | When AI replaces the nerds (in progress), they become
               | excess biomass. Not talking about a retarded hollywood-
               | style apocalypse. Economic uncertainty is more than
               | enough to suppress breeding in many populations. " _not
               | with a bang, but a whimper_ "
               | 
               | If you know any of "them", you will know that "they" went
               | to the same elite prep schools, live in the same cities,
               | intermarry, etc. The "equality" nonsense is just a lie to
               | numb the proles. In 2025 we have a full-blown hereditary
               | nobility.
               | 
                | edit: answer to lanfeust6:
               | 
               | The West is not The World. There are over a billion
               | Chinese, Indians, Africans...
               | 
               | Words mean things. Who said tree hugger? If you are an
               | apex predator living in an increasingly cloudy tank,
               | there is an obvious solution to the cloudyness.
        
               | lanfeust6 wrote:
               | So your take is that the wealthiest class will purge
               | people because they're tree-huggers. Not the worst
               | galaxy-brained thing I've heard before, but still
               | laughable.
               | 
               | Don't forget fertility rate is basically stagnant in the
               | West and falling globally, so this seems like a waste of
               | time considering most people just won't breed at all.
        
               | lanfeust6 wrote:
               | also: emissions will continue to drop
        
               | hello_computer wrote:
               | there has been far more degradation to the natural
               | environment than mere air pollution. general sherman
               | decimated the plains indians with a memorandum. do you
               | think that you are sufficiently better and sufficiently
               | more indispensable than a plains indian?
        
               | hello_computer wrote:
               | _repeated for thread continuity_ :
               | 
               | The West is not The World. There are over a billion
               | Chinese, Indians, Africans...
               | 
               | Words mean things. Who said tree hugger? If you are an
               | apex predator living in an increasingly cloudy tank,
               | there is an obvious solution to the cloudiness.
        
         | hello_computer wrote:
         | It's karma. The creatives weren't terribly concerned when the
         | factory guys lost their jobs. " _Lern to code!_ " Now it's our
         | turn to " _Learn to OnlyFans_ " or " _Learn to Homeless_ "
        
           | Fraterkes wrote:
           | "learn to code" was thrown around by programmers, not
           | creatives. Everyone else (including writers and artists) has
           | long hated that phrase, and condemned it as stupid and
           | shortsighted.
        
             | hello_computer wrote:
             | "learn to code" was from the media. whether they deserve to
             | be classified as "creatives" i will leave to the
             | philosophers.
        
           | Terr_ wrote:
           | > The creatives [...] "Lern to code!"
           | 
           | No, the underlying format of "$LABOR_ISSUE can be solved by
           | $CHANGE_JOB" comes from a place of _politics_ , where a
           | politician is trying to suggest they _have a plan_ to somehow
           | tackle a painful problem among their constituents, and that
           | therefore they should be (re-)elected.
           | 
           | Then the politicians piled onto "coal-miners can learn to
           | code" etc. because it was uniquely attractive, since:
           | 
           | 1. No big capital expenditures, so they don't need to
           | promise/explain how a new factory will get built.
           | 
           | 2. The potential for remote work means constituents wouldn't
           | need to sell their homes or move.
           | 
           | 3. Participants wouldn't require multiple years of expensive
           | formal schooling.
           | 
           | 4. It had some "more money than you make now" appeal.
        
             | hello_computer wrote:
             | Stating it in patronizing fact-checker tone does not make
             | it true. The tech nerds started it ( _they love cheap labor
             | pools_ ). Then the politicians joined their masters'
             | bandwagon. It was a PR blitz. Who has the money for those?
             | Dorseys, Grahams, & Zuckerbergs, or petty-millionaire
             | mayors & congressmen? Politicians are just the house slaves
             | -- servants of money.
             | 
             | https://en.wikipedia.org/wiki/Learn_to_Code#Codecademy_and_
             | C...
        
         | Fraterkes wrote:
         | If ai exacerbates culture issues and management incompetence
         | then that is an inherent downside of ai.
         | 
         | There is a bunch of programmers who like ai, but as the article
         | shows, programmers are not the only people subjected to ai in
         | the workplace. If you're an artist, you've taken a job that has
         | crap pay and stability for the amount of training you put in,
         | and the only reason you do it is because you like the actual
         | content of the job (physically making art). There is obviously
         | no upside to ai for those people, and this focus on the
         | managers' or developers' perspective is myopic.
        
           | lanfeust6 wrote:
           | It's an interesting point that passion-jobs that creatives
           | take on (including game dev) tend to get paid less, and where
           | the thrilling component is disrupted there could be less
           | incentive to bother entering the field.
           | 
           | I think for the most part creatives will still line up for
           | these gigs, because they care about contributing to the end
           | products, not the amount of time they spend using Blender.
        
             | Fraterkes wrote:
             | You are again just thinking from the perspective of a
             | manager: Yes, if these ai jobs need to be filled, artists
             | will be the people filling them. But from the artists
             | perspective there are fewer jobs, and the jobs that do
             | remain are less fulfilling. So: from the perspective of a
             | large part of the workforce it is completely true and
             | rational to say that ai at their job has mostly downsides.
        
               | lanfeust6 wrote:
               | > from the artists perspective there are fewer jobs, and
               | the jobs that do remain are less fulfilling.
               | 
               | Re-read what I wrote. You repeated what I said.
               | 
               | > So: from the perspective of a large part of the
               | workforce it is completely true and rational to say that
               | ai at their job has mostly downsides.
               | 
               | For them, maybe.
        
               | Fraterkes wrote:
               | Alright, so doesn't that validate a lot of the feelings
               | and opinions laid out in the OP? Have I broadened your
               | worldview?
        
           | andybak wrote:
           | It might seem hard to believe but there are a bunch of
           | artists who also like AI. People whose artistic practice
           | predates AI. The definition of "artist" is a quagmire which I
           | won't get into but I am not stretching the definition here in
           | any way.
        
             | Fraterkes wrote:
             | I'm sure there are a bunch! I'm an artist, I talk to a
             | bunch of artists physically and online. It's not the
             | prevailing opinion in my experience.
        
               | andybak wrote:
               | Agreed. But it's important to counter the impression that
               | many have that it's nearly unanimous.
        
       | hbsbsbsndk wrote:
       | Software developers are so aware of "enshittification" and yet
       | also bullish about this generation of AI, it's baffling.
       | 
       | It's very clear the "value" of the LLM generation is to churn out
       | low-cost, low-quality garbage. We already outsourced stuff to
       | Fiverr but now we can cut people out altogether. Producing
       | "content" nobody wants.
        
       | kstrauser wrote:
       | There are many, many reasons to be skeptical of AI. There are
       | also excellent tasks it can efficiently help with.
       | 
       | I wrote a project where I'd initially hardcoded a menu hierarchy
       | into its Rust. I wanted to pull that out into a config file so it
       | could be altered, localized, etc. without users having to
       | recompile the source. I opened a "menu.yaml" file, typed the name
       | of the top-level menu, paused for a moment to sip coffee, and Zed
       | popped up a suggested completion of the file which was
       | syntactically correct and perfect for use as-is.
       | 
       | I honestly expected I'd spend an hour mechanically translating
       | Rust to YAML and debugging the mistakes. It actually took about
       | 10 seconds.
       | 
       | It's also been freaking brilliant for writing docstrings
       | explaining what the code I just manually wrote does.
       | 
       | I don't want to use AI to write my code, any more than I'd want
       | it to solve my crossword. I sure like having it help with the
       | repetitive gruntwork and boilerplate.
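
       (An illustrative aside: a menu hierarchy pulled into a config file
       might look something like the sketch below. The structure and field
       names here are hypothetical, not the actual project's schema; on
       the Rust side such a file would typically be deserialized with a
       crate like serde_yaml into the structs the menu code already uses.)

```yaml
# menu.yaml -- hypothetical menu hierarchy, editable and
# localizable without recompiling the binary
main_menu:
  title: "Main Menu"
  items:
    - label: "New Game"
      action: new_game
    - label: "Settings"
      submenu:
        title: "Settings"
        items:
          - label: "Audio"
            action: open_audio
          - label: "Video"
            action: open_video
    - label: "Quit"
      action: quit
```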
        
       | Jyaif wrote:
       | "He doesn't know that the important thing isn't just the end
       | result, it's the journey and the questions you answer along the
       | way"
       | 
       | This is satire right?
        
       | 000ooo000 wrote:
       | Can't wait to hear the inevitable slurs people will create to
       | refer to heavy AI users and staunch AI avoiders.
        
         | esafak wrote:
         | Prompt puncher and Luddite come to mind.
        
         | Schiendelman wrote:
         | "Sloppers" appeared in another thread in this post. I've seen
         | it before, I think it'll stick.
        
       | esafak wrote:
       | Companies need to be aware of the long-term effects of relying on
       | AI. It causes atrophy and, when it introduces a bug, it takes
       | more time to understand and fix than if you had written it
       | yourself.
       | 
       | I just spent a week fixing a concurrency bug in generated code.
       | Yes, there were tests; I uncovered the bug when I realized the
       | test was incorrect...
       | 
       | My strong advice is to digest every line of generated code;
       | don't let it run ahead of you.
        
         | dkobia wrote:
         | It is absolutely terrifying to watch tools like Cursor generate
         | so much code. Maybe not a great analogy, but it feels like
         | driving with Tesla FSD in New Delhi in the middle of rush hour.
         | If you let it run ahead of you, the amount of code to review
         | will be overwhelming. I've also encountered situations where it
         | is unable to pass tests for code it wrote.
        
           | tmpz22 wrote:
           | Like TikTok, AI coding breaks human psychology. It is
           | ingrained in us that if we have a tool that looks right
           | enough and seems highly productive, we will over-apply it to
           | our work. Even diligent programmers will be lured into
           | accepting giant commits without diligent review, and they
           | will pay for it.
           | 
           | Of course yeeting bad code into production with a poor review
           | process is already a thing. But this will scale that bad code
           | as now you have developers who will have grown up on it.
        
         | Analemma_ wrote:
         | When have companies ever cared about the long-term effects of
         | _anything_ , and why would they suddenly start now?
        
       | DeathArrow wrote:
       | If a manager thinks paying $20 monthly for an AI tool will make a
       | developer or artist 5x more productive, he's delusional.
       | 
       | On the other hand, AI can be useful and can accelerate some
       | work _a bit_.
        
       | AlienRobot wrote:
       | A very bad programmer can program some cool stuff with the help
       | of libraries, toolkits, frameworks and engines that they barely
       | understand. I think that's pretty cool and makes things otherwise
       | impossible possible, but it doesn't make the very bad programmer
       | better than they really are.
       | 
       | I believe AI is a variation of this, except a library at least
       | has a _license_.
        
         | matt3210 wrote:
         | The AI code has thousands of licenses but the legal system
         | hasn't caught up
        
       | woah wrote:
       | Why so much hand-wringing? If you are an anti-AI developer and
       | you are able to develop better code faster than someone using AI,
       | good for you. If AI-using developers will end up ruining their
       | codebase in months like many here are saying, then things will
       | take care of themselves.
        
         | svantana wrote:
         | I see two main problems with this approach:
         | 
         | 1. productivity and quality are hard to measure
         | 
         | 2. the codebase they are ruining is the same one I am working
         | on.
        
         | bluefirebrand wrote:
         | Faster is not a smart metric to judge a programmer by.
         | 
         | "more code faster" is not a good thing, it has never been a
         | good thing
         | 
         | I'm not worried about pro AI workers ruining _their_ codebases
         | at their jobs
         | 
         | I'm worried about pro AI coworkers ruining _my_ job by shitting
         | up the codebases I have to work in
        
           | woah wrote:
           | I said "better code faster". Delivering features to users is
           | always a good thing, and in fact is the entire point of what
           | we do.
        
             | bluefirebrand wrote:
             | > in fact is the entire point of what we do
             | 
             | Pump the brakes there
             | 
             | You may have bought into some PMs idea of what we do, but
             | I'm not buying it
             | 
             | As professional, employed software developers, the entire
             | point of what we do is to provide value to our employers.
             | 
             | That isn't always by delivering features to users, it's
             | certainly not always by delivering features _faster_
        
             | joe_the_user wrote:
             | Even if you say "better faster" ten times fast, the
             | quality of being produced fast and being broadly good are
             | very different. Speed of development can be measured
             | immediately. Quality is holistic. It's a product of not
             | just formatting clear structures but of relating to the
             | rest of a given system.
        
             | AlexandrB wrote:
             | A lot of modern software dev is focused on delivering
             | features to shareholders, not users. Doing that faster is
             | going to make my life, as a user, worse.
        
         | nathan_compton wrote:
         | I've posted recently about a dichotomy which I have had in my
         | head for years as a technical person: there are two kinds of
         | tools; the first lets you do the right thing more easily and
         | the second lets you do the wrong thing more quickly and for
         | longer before you have to pay for it. AI/LLMs can definitely be
         | the latter kind of tool, especially in a context where short
         | term incentives swamp long term ones.
        
         | int_19h wrote:
         | I'm actually pro-AI and I use AI assistants for coding, but I'm
         | also very concerned that the way those things will be deployed
         | at scale in practice is likely to lead to severe degradation of
         | software quality across the board.
         | 
         | Why the hand-wringing? Well, for one thing, as a developer I
         | still have to work on that code, fix the bugs in it, maintain
         | it etc. You could say that this is a positive since AI slop
         | would provide for endless job security for people who know how
         | to clean up after it - and it's true, it does, but it's a very
         | tedious and boring job.
         | 
         | But I'm not just a developer, either - I'm also a user, and
         | thinking about how low the average software quality already is
         | today, the prospect of it getting even worse across the board
         | is very unpleasant.
         | 
         | And as for things taking care of themselves, I don't think they
         | will. So long as companies can still ship _something_ , it's
         | "good enough", and cost-cutting will justify everything else.
         | That's just how our economy works these days.
        
         | ang_cire wrote:
         | This assumes a level of both rationality and omniscience that
         | don't exist in the real world.
         | 
         | If a company fails to compete in the market and dies, there is
         | no "autopsy" that goes in and realizes that it failed because
         | of a chain-reaction of factors stemming from bad AI-slop code.
         | And execs are so far removed from the code level, they don't
         | know either, and their next company will do the same thing.
         | 
         | What you're likely to end up with is project managers and
         | developers who _do_ know the AI code sucks, and they'll be
         | heeded by execs just as much as they are now, which is to say not
         | at all.
         | 
         | And when the bad AI-code-using devs apply to the next business
         | whose execs are pro-AI because they're clueless, guess who
         | they'll hire?
        
       | matt3210 wrote:
       | One thing jumps out about the person who noticed the AI was wrong
       | on things they were familiar with. It's like when Elon Musk talks
       | about rockets: I don't know about rockets, so I take his word for
       | it. When Elon Musk talked about software, it was obvious he had
       | no idea what he was doing. So when the AI generates something I
       | know nothing about, it looks productive, but when it's generating
       | things with which I'm familiar, I know it's full of shit.
        
         | bluefirebrand wrote:
         | > So when the AI generates something I know nothing about, it
         | looks productive, but when it's generating things with which
         | I'm familiar, I know it's full of shit.
         | 
         | This is why when you hear people talk about how great it is at
         | producing X, our takeaway should be "this person is not an
         | expert at X, and their opinions can be disregarded"
         | 
         | They are telling on themselves that they are not experts at the
         | thing they think the AI is doing a great job at
        
           | andybak wrote:
           | "This is why when you hear people talk about how terrible it
           | is at producing X, our takeaway should be "this person either
           | hasn't tried to use it in good faith, and their opinions can
           | be disregarded"
           | 
           | I'm playing devil's advocate somewhat here, but it often seems
           | like there's a bunch of people on both sides using hella
           | motivated reasoning because they have very strong _feelings_
           | that developed early on in their exposure to AI.
           | 
           | AI is both terrible and wonderful. It's useless at some
           | things and impressive at others. It will ruin whole sectors
           | of the economy and upturn lives. It will get better and it is
           | getting better, so any limitations you currently observe are
           | probably temporary. The net benefit for humanity may turn
           | out to be positive or negative - it's too early to tell.
        
             | bluefirebrand wrote:
             | > AI is both terrible and wonderful. It's useless at some
             | things and impressive at others
             | 
             | That's kind of my problem. I am saying that it mostly only
             | _appears_ impressive to people who don't know better
             | 
             | When people do know better it comes up short consistently
             | 
             | Most of the pro AI people I see are bullish about it on
             | things they have no idea about, like non-technical CEOs
             | insisting that it can create good code
        
               | andybak wrote:
               | > When people do know better it comes up short
               | consistently
               | 
               | I disagree with that part and I don't think this opinion
               | can be sustained by anyone using it with any regularity
               | _in good faith_
               | 
               | People can argue whether it's 70/30 or 30/70 or what
               | domains it's more useful in than others but you are
               | overstating the negative.
        
               | int_19h wrote:
               | Have you considered that it's actually impressive in some
               | areas that are outside of your interest or concern?
        
               | bluefirebrand wrote:
               | Could be, but why would I trust that when it's clearly so
               | bad at the things I am good at?
        
             | ang_cire wrote:
             | > The net benefit for humanity may turn out to be positive
             | or negative - it's too early to tell.
             | 
             | It's just a tool, but it is unfortunately a tool that is
             | currently dominated by large corporations, to serve
             | Capitalism. So it's definitely going to be a net-negative.
             | 
             | Contrast that to something like 3D printing, which has most
             | visibly benefited small companies and individual users.
        
           | pfdietz wrote:
           | Sounds like the Gell-Mann amnesia effect.
           | 
           | https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect
        
       | some-guy wrote:
       | I always assumed game development would be one of the most
       | impacted by AI hype, for better or worse. With game development
       | there's a much higher threshold for subjectivity and
       | "incorrectness".
       | 
       | I'm in a Fortune 500 software company and we are also having AI
       | pushed down our throats, even though so far it has only been
       | useful for small development tasks. However our tolerance for
       | incorrectness is much, much lower--and many skip levels are
       | already realizing this.
        
         | nathan_compton wrote:
         | I'm an indie game developer and it's a domain where I find AI
         | to be most useless - too much of what makes a game is
         | interactive, spatial, and about game-feel. The AI just can't
         | do it. Even GPT's latest models really struggled to write
         | reasonable 3d transformations, which is unsurprising, since
         | they live in text world, not 3d world.
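
         (To make "reasonable 3d transformations" concrete, here is a
         minimal plain-Python sketch of the kind of routine being
         described; the function names are illustrative, not from any
         particular engine. Even a simple rotate-then-translate helper
         hides an ordering pitfall -- rotating after translating gives a
         different point -- which is exactly the sort of thing that is
         easy to miss in generated code.)

```python
import math

def rot_z(p, theta):
    # Rotate point p = (x, y, z) about the Z axis by theta radians.
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y, z)

def transform(p, theta, offset):
    # Rotate first, then translate. Swapping the order silently
    # produces a different point -- a classic 3d bug.
    x, y, z = rot_z(p, theta)
    ox, oy, oz = offset
    return (x + ox, y + oy, z + oz)

# Rotating (1, 0, 0) a quarter turn about Z lands on the Y axis.
p = transform((1.0, 0.0, 0.0), math.pi / 2, (0.0, 0.0, 0.0))
```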
        
       | bufferoverflow wrote:
       | I wish our company forced AI on us. Our security is so tight,
       | it's pretty much impossible to use any good LLMs.
        
         | ang_cire wrote:
         | It really doesn't take that beefy of a machine to run a good
         | LLM locally instead of paying some SaaS company to do it for
         | you.
         | 
         | I've got a refurb homelab server off PCSP with 512GB of RAM
         | for <$1k, and I run decently good LLM models (Deepseek-r1:70b,
         | llama3.3:70b). Given your username, you might even try pitching
         | a GPU server to them as dual-purpose; LLM + hashcat. :)
        
       | gwbas1c wrote:
       | > "He doesn't know that the important thing isn't just the end
       | result, it's the journey and the questions you answer along the
       | way". Bradley says that the studio's management have become so
       | enamoured with the technology that without a reliance on AI-
       | generated imagery for presentations and pitches they would not be
       | at the stage they are now, which is dealing with publishers and
       | investors.
       | 
       | Take out the word AI and replace it with any other tool that's
       | over-hyped or over-used, and the above statement will apply to
       | any organization.
        
       | nathan_compton wrote:
       | When LLMs came out I suppressed my inner curmudgeon and dove in,
       | since the technology was interesting to me and seemed much more
       | likely than crypto to be useful beyond crime. Thus, I have used
       | LLMs extensively for many years now and I have found that despite
       | the hype and amazing progress, they still basically only excel at
       | first drafts and simple refactorings (where they are, I have to
       | say, incredibly useful for eliminating busy work). But I have yet
       | to use a model, reasoning or otherwise, that could solve a
       | problem that required genuine thought, usually in the form of
       | constructing the right abstraction, bottom up style. LLMs write
       | code like super-human dummies, with a tendency to put too much
       | code in a given function and with very little ability to invent a
       | domain in which the solution is simple and clearly expressed,
       | probably because they don't care about that kind of readability
       | and it's not much in their data set.
       | 
       | I'm deeply influenced by languages like Forth and Lisp, where
       | that kind of bottom up code is the cultural standard and I
       | prefer it, probably because I don't have the kind of linear
       | intelligence and huge memory of an LLM.
       | 
       | For me the hardest part of using LLMs is knowing when to stop and
       | think about the problem in earnest, before the AI generated code
       | gets out of my human brain's capacity to encompass. If you think
       | a bit about how AI still is limited to text as its whiteboard
       | and local memory, text which it generates linearly from top to
       | bottom, even reasoning, it sort of becomes clear why it would
       | struggle with _genuine_ abstraction over problems. I'm no longer
       | so naive as to say it won't happen one day, even soon, but so far
       | it's not there.
        
       | caseyy wrote:
       | There is a small, hopeful flipside to this. While people using AI
       | to produce art (such as concept art) have flooded the market,
       | real skills now command a higher price than before.
       | 
       | To pull this out of the games industry for just a moment, imagine
       | this: you are a business and need a logo produced. Would you hire
       | someone at the market price who uses AI to generate something...
       | sort of on-brand they _most definitely_ cannot provide indemnity
       | cover for (considering how many of these dubiously owned works
       | they produce), or would you pay above the market price to have an
       | artist make a logo for you that is guaranteed to be their own
       | work? The answer is clear - you'd cough up the premium. This is
       | now happening on platforms like UpWork and Fiverr. The prices for
       | real human work have not decreased; they have shot up
       | significantly.
       | 
       | It's also happening slowly in games. The concept artists who are
       | skilled command a higher salary than those who rely on AI. If you
       | depend on image-generating AI to do your work, I don't think many
       | game industry companies would hire you. Only the start-ups that
       | lack experience in game production, perhaps. But that part of the
       | industry has always existed - the one made of dreamy projects
       | with no prospect of being produced. It's not worth paying much
       | attention to, except if you're an investor. In which case,
       | obviously it's a bad investment.
       | 
       | Besides, just as machine-translated game localization isn't
       | accepted by any serious publisher (because it is awful and can
       | cause real reputational damage), I doubt any evident AI art would
       | be allowed into the final game. Every single piece of that will
       | need to be produced by humans for the foreseeable future.
       | 
       | If AI truly can produce games or many of their components, these
       | games will form the baseline quality of cheap game groups on the
       | marketplaces, just like in the logo example above. The buyer will
       | pay a premium for a quality, human product. Well, at least until
       | AI can meaningfully surpass humans in creativity - the models we
       | have now can only mimic and there isn't a clear way to make them
       | surpass.
        
         | gdulli wrote:
         | > There is a small, hopeful flipside to this. While people
         | using AI to produce art (such as concept art) have flooded the
         | market, real skills now command a higher price than before.
         | 
         | It's "hopeful" that the future of all culture will resemble
         | food, where the majority have access to McDonalds type slop
         | while the rich enjoy artisan culture?
        
           | caseyy wrote:
           | It's hopeful because AI has not devalued creative human labor
           | but increased its worth. Similar to how a skilled chef didn't
           | start working for McDonald's when it came along, but for a
           | restaurant that pays significantly above McDonald's.
           | 
           | Most people's purchasing power being reduced is a separate
           | matter, more related to the eroding middle class and
           | greedflation. Many things can be said about it, but they are
           | less related to the trend I highlighted. Though if the erosion
           | of the middle class continues, the scenario you suggest may
           | very well play out.
        
         | JohnMakin wrote:
         | > real skills now command a higher price than before.
         | 
         | Only if companies value/recognize those real skills over that
         | of the alternative, and even if they do, companies are pretty
         | notorious for choosing whatever is cheapest/easiest (or
         | perceived to be).
        
       | indoordin0saur wrote:
       | This article is an example of why the gender-neutral use of
       | pronouns makes things a pain to read. If you're already changing
       | the interviewees' names then IDK why you couldn't just pick an
       | arbitrary he/she pronoun to stick to for one character.
       | 
       | > Francis says their understanding of the AI-pusher's outlook is
       | that they see the entire game-making process as a problem, one
       | that AI tech companies alone think they can solve. This is a
       | sentiment they do not agree with.
        
         | add-sub-mul-div wrote:
         | There's nothing painful about this to anyone who hasn't been
         | conscripted into the culture wars.
        
           | indoordin0saur wrote:
           | But it was the culture war that resulted in this change to
           | the language. Prior to the war, singular 'they' was to be
           | avoided due to the ambiguity it introduces.
        
             | spacecadet wrote:
             | What ambiguity? We know it's a human, and the human has a
             | name. We do not know their gender or sex, and neither is
             | relevant. "They" works perfectly.
             | 
             | This seems like a you problem...
        
         | gwbas1c wrote:
         | "they" was a gender-neutral pronoun when I was in school in the
         | 1990s.
        
           | ryoshoe wrote:
           | Singular they was used by respected authors even as far back
           | as the 19th century.
        
           | indoordin0saur wrote:
           | It has been considered normal in some colloquial uses for a
           | long time. But until the late 2010s/early 2020s all style
           | guides considered it to be poor form due to the ambiguity and
           | muddy sentence structure it creates. Recommendations were
           | changed recently for political reasons.
        
             | spacecadet wrote:
             | Shit changes. You can either let it roll off you or over
             | you. A lot less painful rolling off.
        
       | caseyy wrote:
       | AI is the latest "overwhelmingly negative" games industry fad,
       | affecting game developers. It's one of many. Most are because
       | nine out of ten companies make games for the wrong reason. They
       | don't make them as interactive art, as something the developers
       | would like to play, or to perfect the craft. They make them to
       | make publishers and businessmen rich.
       | 
       | That business model hasn't been going so well in recent years[0],
       | and it's already been proclaimed dead in some corners of the
       | industry[1]. Many industry legends have started their own studios
       | (H. Kojima, J. Solomon, R. Colantonio, ...), producing games for
       | the right reasons. When these games are inevitably mainstream
       | hits, that will be the inflection point where the old industry
       | will significantly decline. Or that's what I think, anyway.
       | 
       | [0] https://www.matthewball.co/all/stateofvideogaming2025
       | 
       | [1] https://www.youtube.com/watch?v=5tJdLsQzfWg
        
       | more_corn wrote:
       | Everyone I know uses it to some degree. Simply having a smart
       | debugger does wonders. You don't have to give up control, it can
       | help you stay in flow state. Or it can constantly irritate you if
       | you fight it.
        
       | gwbas1c wrote:
       | I would think that, if AI-generated content is inferior, these
       | games will fail in the marketplace.
       | 
       | So, where are the games with AI-generated content? Where are the
       | reviews that praise or pan them?
       | 
       | (Remember, AI is a tool. Tools take time to learn, and sometimes,
       | the tool isn't worth using.)
        
       | akomtu wrote:
       | Corporations don't need human workers, they need machines, the
       | proverbial cogs that lack their own will and implement the will
       | of the corporation instead. AI will make it happen: human workers
       | will be managed by AI with sub-second precision and kill whatever
       | little creativity and humanity the workers still had.
        
       | specialist wrote:
       | Each example's endeavor is the production of culture. The least
       | interesting use case for "AI".
       | 
       | Real wealth creation will come from other domains. These new
       | tools (big data, ML, LLMs, etc) unlock the ability to tackle
       | entirely new problems.
       | 
       | But as a fad, "AI" is pretty good for separating investors from
       | their money.
       | 
       | It's also great for further beating down wages.
        
       | Animats wrote:
       | AI-generated art just keeps getting better. This looks like a
       | losing battle.
        
       | aucisson_masque wrote:
       | AI is a blatant case of Darwinism.
       | 
       | There are those who adapt, those who will keep moaning about it
       | and finally those who believe it can do everything.
       | 
       | First one will succeed, second one will be replaced, third one is
       | going to get hurt.
       | 
       | I believe this article and the people it mentions are mostly
       | from the second category. Yet no one in their right mind can
       | deny that AI makes writing code faster (not necessarily better,
       | but faster), and games, in the end, are mostly code.
       | 
       | Of course AI is going to get pushed hard by your CEO; he knows
       | that if he doesn't, another competitor who uses it will be able
       | to produce more games, faster and cheaper.
        
         | ohgr wrote:
         | So on that basis you think the market is happy with shit things
         | made very fast?
         | 
         | I can assure you it's not. People are starting to realise
         | that there is a lot of shit, and that LLMs generate it.
        
         | ang_cire wrote:
         | > another competitor who uses it will be able to produce more
         | games, faster and cheaper
         | 
         | And yet this is no guarantee they will succeed. In fact, the
         | largest franchises and games tend to be the ones that take
         | their time and build for quality. There are a thousand GTA
         | knock-offs on Steam, but it's R* that rakes in the money.
        
       | DadBase wrote:
       | I've been doing "vibe coding" since Borland C++. We used to
       | align the mood of the program with ambient ANSI art in the
       | comments. If the compiler crashed, that meant the tone was off.
        
       | gukov wrote:
       | Shopify CEO: "AI usage is now a baseline expectation"
       | 
       | https://news.ycombinator.com/item?id=43613079
        
       | crvdgc wrote:
       | A perspective from a friend, who recently gave up trying to get
       | into concept art:
       | 
       | Before AI, there was out-sourcing. With mass-produced cheap
       | works, foreign studios eliminated most junior positions.
       | 
       | Now AI is just taking this trend to its logical extreme: out-
       | sourcing to machines, the ultimate form of out-sourcing. The
       | cost approaches 0 and the quantity approaches infinity.
        
       | jongjong wrote:
       | > In terms of software quality, I would say the code created by
       | the AI was worse than code written by a human-though not
       | drastically so-and was difficult to work with since most of it
       | hadn't been written by the people whose job it was to oversee it.
       | 
       | This is a key insight. The other insight is that devs spend most
       | of their time reading and debugging code, not writing it. AI
       | speeds up the writing of code but slows down debugging... AI was
       | trained with buggy code because most code out there is buggy.
       | 
       | TBH, I don't think there exists enough non-buggy code out there
       | to train an AI to write good code which doesn't need to be
       | debugged so much.
       | 
       | When AI is trained on normal language, averaging out all the
       | patterns produces good results. This is because most humans are
       | good at writing with that level of precision. Code is much more
       | precise and the average human is not good at it. So AI was
       | trained on low-quality data there.
       | 
       | The good news for skilled developers is that there probably isn't
       | enough high quality code in the public domain to solve that
       | problem... And there is no incentive for skilled developers to
       | open source their code.
        
       ___________________________________________________________________
       (page generated 2025-04-08 23:00 UTC)