Subj : Re: ChatGPT Writing
To   : jimmylogan
From : Nightfox
Date : Wed Dec 03 2025 08:54:25

  Re: Re: ChatGPT Writing
  By: jimmylogan to Nightfox on Wed Dec 03 2025 07:57 am

 ji> This is the first time I've seen the objective definition. I was told,
 ji> paraphrasing here, 'AI will hallucinate - if it doesn't know the answer it
 ji> will make something up and profess that it is true.'

 ji> If you ask me a question and I give you an incorrect answer, but I believe
 ji> that it is true, am I hallucinating? Or am I mistaken? Or is my
 ji> information outdated?

 ji> You see what I mean? Lots of words, but hard to nail it down. :-)

It sounds like you might be thinking too hard about it. For AI, the
definition of "hallucinating" is simply what you said: making something
up and professing that it's true. That's the definition of hallucinating
we have for our current AI systems; it's not about us. :)

Nightfox

---
■ Synchronet ■ Digital Distortion: digitaldistortionbbs.com