Post AXHurivVerFlfjjZOS by Dave3307@mountains.social
 (DIR) Post #AXHpD5EzI3J3Ebe9U8 by grammargirl@zirk.us
       2023-07-02T14:25:50Z
       
       0 likes, 0 repeats
       
ChatGPT is so weird! I just got a response I can only summarize like this: I apologize profusely for giving you incorrect information. I was completely correct. I apologize profusely again if I caused confusion. #ChatGPT #AI
       
 (DIR) Post #AXHpKnbWkJM4qvrtyq by StaceyCornelius@mstdn.ca
       2023-07-02T14:27:15Z
       
       0 likes, 0 repeats
       
       @grammargirl One wonders at the definition of "intelligence" in this context. (Ahem)
       
 (DIR) Post #AXHpaSpxj2qx4t5dmy by ATellurian@toot.community
       2023-07-02T14:30:00Z
       
       0 likes, 0 repeats
       
       @grammargirl ChatGPT is like an old uncle who reads but understands little.
       
 (DIR) Post #AXHppvtJz8MRX12MfQ by JamesBoxer10@zirk.us
       2023-07-02T14:31:22Z
       
       0 likes, 0 repeats
       
       @grammargirl This is a bit much for me this morning.
       
 (DIR) Post #AXHprkkiMzcRZ2saRM by jaydax@mastodonapp.uk
       2023-07-02T14:33:09Z
       
       0 likes, 0 repeats
       
@grammargirl I asked it three times to find the HR representative's name for a software firm and each time got a different response. You might also ask it whether it considers itself to be sentient. Some AIs crash when asked this.
       
 (DIR) Post #AXHqC1DAswg4P1r3B2 by jeber@mastodon.social
       2023-07-02T14:36:44Z
       
       0 likes, 0 repeats
       
       @grammargirl Sounds like a financial advisor.
       
 (DIR) Post #AXHqaHX1rID3y8i8vo by maxleibman@mastodon.social
       2023-07-02T14:41:09Z
       
       0 likes, 0 repeats
       
       @grammargirl I’ve noticed a pattern among techie types who seem to feel that if they can explain why something went wrong, it didn’t actually go wrong in the first place (“Well OF COURSE Google gave you results for ‘Lex Fridman’ when you searched for ‘Lex Friedman,’ because more people search for Fridman!”). When I probe ChatGPT and similar services about their errors, I often feel like they’ve internalized this pattern. “Sorry, I was wrong, but [reasons], so actually I was right!”
       
 (DIR) Post #AXHupiYA5StlvPrAmm by maineblaine@writing.exchange
       2023-07-02T15:28:46Z
       
       0 likes, 0 repeats
       
@grammargirl I just got a similar response, but only after I asked it to explain how it arrived at the result I asked for. I'm writing a rhyming children's book, so I thought it might be nice to have a table listing the page number, the rhyming words for each page, and the number of syllables in each stanza. For one page that I knew was a problem, it kept giving me the wrong number of syllables even when I told it it was wrong, until I had it explain how it arrived at its answer.
       
 (DIR) Post #AXHurivVerFlfjjZOS by Dave3307@mountains.social
       2023-07-02T15:29:15Z
       
       0 likes, 0 repeats
       
@grammargirl I only use it once a week or so, and it seems like there are more adjectives/adverbs/filler in its output every time.
       
 (DIR) Post #AXI10VzjhtgltcHoES by ElizabethLeeCo@vmst.io
       2023-07-02T16:37:53Z
       
       0 likes, 0 repeats
       
       @grammargirl ChatGPT is gaslighting you, obvs.
       
 (DIR) Post #AXI90jUIL99pNQC7cG by RunningFaster@social.linux.pizza
       2023-07-02T18:07:34Z
       
       0 likes, 0 repeats
       
@grammargirl Bard is definitely better in my experience. ChatGPT has given me made-up nonsense answers or outright refused my requests for the most innocuous things.
       
 (DIR) Post #AXIBROGRYXKvHSYSNk by Keev@mastodonczech.cz
       2023-07-02T18:34:56Z
       
       0 likes, 0 repeats
       
@grammargirl It will apologize for every answer it gave you (even the correct ones) if you complain about it.
       
 (DIR) Post #AXIDKHsz4XCpNbK45g by grammargirl@zirk.us
       2023-07-02T18:56:02Z
       
       0 likes, 0 repeats
       
       @Keev That makes sense because that's definitely what it seemed to be doing.