Subj : Re: ChatGPT Writing
To   : Nightfox
From : Bob Worm
Date : Thu Dec 04 2025 22:35:32

Re: Re: ChatGPT Writing
By: Nightfox to jimmylogan on Thu Dec 04 2025 11:04:46

Hi, Nightfox.

 > ji> If that's the definition, then okay - a 'mistake' is technically a
 > ji> hallucination. Again, that won't prevent me from using it as the tool
 > ji> it
 >
 > It's not a "technically" thing. "Hallucination" is simply the term used for
 > AI producing false output.

"Hallucination" sounds much less dramatic than "answering incorrectly" or
"bullshitting", though.

Ironically, there was an article on The Register last year saying that savvy
blackhat types had caught on to the fact that AI kept hallucinating
non-existent libraries in the code it generated, so they created some of
them - with a sprinkle of malware added in, naturally:

https://www.theregister.com/2024/03/28/ai_bots_hallucinate_software_packages/

Scary stat from that article: "With GPT-4, 24.2 percent of question
responses produced hallucinated packages". Wow.

BobW

---
þ Synchronet þ >>> Magnum BBS <<< - magnumbbs.net