[HN Gopher] AI programming tools should be added to the Joel Test
___________________________________________________________________
AI programming tools should be added to the Joel Test
Author : jtwaleson
Score : 13 points
Date : 2024-04-22 08:55 UTC (14 hours ago)
(HTM) web link (blog.waleson.com)
(TXT) w3m dump (blog.waleson.com)
| eternityforest wrote:
| Trying Codeium was like the difference between notepad and an
| IDE.
|
| Unfortunately, it overheats my laptop so I can't actually use it,
| and since I primarily do support, I don't program enough at my
| job to justify paying for Copilot.
|
| If the CPU use was lower, I don't see why I'd ever go without.
| weikju wrote:
| I haven't used Codeium in a while, but I remember at some point
| in VSCodium it would also spike CPU usage until I turned off
| some features (was it Chat, or Search? something like that).
| xiwenc wrote:
| > Note that this is often not the same as "#9 - Do you use the
| best tools money can buy?" as blocking AI tools is about data
| security, not money.
|
| I do think AI fits #9. The fact that current AI tools are not
| meeting data security requirements is due to market demands and
| maturity:
|
| - Prices need to be low to attract adopters.
|
| - Low price? Then these service providers will hoard data.
|
| - Data needs to be collected for training.
|
| So I think that long term, there will be more premium AI tools
| that "promise" not to collect your data. Perhaps self-hosted?
| Self-hosting AI is not attractive, though, at least not for
| consumers or small businesses.
| jtwaleson wrote:
| Self-hosted is becoming more and more possible. For the first
| time in forever I think it makes sense to buy a beefy personal
| computer.
| simonw wrote:
| OpenAI have paid plans that promise not to collect your data
| already. I think Anthropic do as well.
|
| People seem not to trust companies which make these promises,
| which is unfortunate for the industry.
| anothernewdude wrote:
| I agree, but they should count negatively.
| jasonpeacock wrote:
| > There's plenty of things wrong with these tools: they are often
| wrong, are slow and the GPT4 ones are really expensive.
|
| So...why should they be included?
|
| I really worry about this "often wrong" part - you only know they
| are wrong if you already know what you're doing. Otherwise you
| end up trying to use hallucinated APIs & libraries, or producing
| code no better than copy & pasting StackOverflow answers (which
| is what the AI was trained on anyway).
| simonw wrote:
| If an LLM hallucinates a method name you'll find out the moment
| you try and run the code.
|
| Code has a built-in form of easy fact checking, which makes it
| one of the most appropriate applications for LLMs. It's much
| harder to spot a hallucinated fact in a paragraph of prose than
| it is to spot a hallucinated API method.
|
| The skills you most need to develop in order to take advantage
| of LLM assisted programming are code reading, code review,
| manual and automated testing and being really good at thinking
| of edge-cases that might not be covered.
|
| It turns out these are important skills for being a great
| developer already - LLMs just force the issue on them a little
| more.
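simonw's point that code has built-in fact checking can be seen in a minimal Python sketch. The method name `capitalize_words` is hypothetical, invented here to stand in for a hallucinated API:

```python
# A plausible-sounding but nonexistent method fails the moment it runs.
# "capitalize_words" is hypothetical: str has no such method, so the
# hallucination surfaces immediately as an AttributeError.
text = "hello world"
try:
    result = text.capitalize_words()
except AttributeError as err:
    result = f"caught: {err}"

print(result)
```

The error names the missing method directly, which is what makes a hallucinated API call far easier to catch than a hallucinated fact in prose.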
| lambda wrote:
| If the LLM hallucinates a Python package, someone else might
| notice that and sneak in a malicious version of that
| hallucinated package.
|
| Or if it hallucinates a method name, it might be in a code
| path that goes untested. How often are people using these
| tools to also write comprehensive test suites?
| ascar wrote:
| Due to the nature of my current work I haven't really used
| GPT for coding yet, but isn't it easier to write code than to
| read and truly understand it? So how much development time is
| really saved if I still care about off-by-one errors, correct
| identity checks in HashMaps, and all those edge cases I
| probably should care about? Those are all things much harder
| to spot when reading code than when writing it.
| YetAnotherNick wrote:
| > blocking AI tools is about data security, not money.
|
| Why is AI treated so differently than, say, the cloud? Most
| companies don't have a problem putting all their data in GitHub,
| AWS, or Office 365, but a lot of them freak out if any AI can
| access the data. I don't think the OpenAI/Copilot enterprise
| plan T&Cs/privacy policy are very different from GitHub's or
| AWS's.
| mellosouls wrote:
| Yeah I've thought this myself for a while; GPT-4 class assistance
| is now a standard tool for me.
|
| If a prospective client or company bans it, it's a hard no from
| me.
|
| I understand that might be too extreme a red line for some, but
| for me, life's too short to wait for laggards to catch up with
| the inevitable.
___________________________________________________________________
(page generated 2024-04-22 23:00 UTC)