Subj : Re: ChatGPT Writing
To   : jimmylogan
From : Nightfox
Date : Thu Dec 04 2025 11:04:46

Re: Re: ChatGPT Writing
By: jimmylogan to Nightfox on Wed Dec 03 2025 08:58 pm

 ji>> But again, is it 'making something up' if it is just mistaken?

 Ni>> In the case of AI, yes.

 ji> Gonna disagree with you there... If wikipedia has some info that is
 ji> wrong, and I quote it, I'm not making it up. If 'it' pulls from the same
 ji> source, it's not making it up either.

For AI, "hallucination" is the term used for AI providing false information
and sometimes making things up - as in the link I provided earlier about
this. It's not really up for debate. :)

 Ni>> I've heard of people who are looking for work who are using AI tools to
 Ni>> help update their resume, as well as tailor their resume to specific
 Ni>> jobs. I've heard of cases where the AI tools will say the person has
 Ni>> certain skills when they don't.. So you really need to be careful to
 Ni>> review the output of AI tools so you can correct things. Sometimes
 Ni>> people might share AI-generated content without being careful to check
 Ni>> and correct things.

 ji> I'd like to see some data on that... Anecdotal 'evidence' is not always
 ji> scientific proof. :-)

That seems like a strange thing to say.. I've heard about this from job
seekers using AI tools, so of course it's anecdotal. I'm not sure what
scientific proof you'd need to see that AI produces incorrect resumes for
job seekers; we know that from job seekers who've said so. And you've said
yourself that you've seen AI tools produce incorrect output.

The job search thing isn't really scientific.. I'm currently looking for
work, and I go to a weekly job search networking group meeting, where AI
tools have come up recently. Someone there was talking about his use of AI
tools to help customize his resume for different jobs, and he mentioned
needing to check the results of what the AI produces, because sometimes AI
tools will put skills and other things on your resume that you don't have,
so you have to make edits.

 ji> If that's the definition, then okay - a 'mistake' is technically a
 ji> hallucination. Again, that won't prevent me from using it as the tool it

It's not a "technically" thing. "Hallucination" is simply the term used for
AI producing false output.

Nightfox

---
þ Synchronet þ Digital Distortion: digitaldistortionbbs.com