(DIR) Post #B1ZgKjFayXMyV2oeNU by futurebird@sauropods.win
2025-12-24T13:14:18Z
0 likes, 1 repeats
How would you make the case for not calling an LLM "he" or "him" like it is a particular person without sounding like the Bene Gesserit? For some reason it really bothers me on a deep level. What the heck is that about? Oddly? I do not feel upset about the robotics team doing this for the robot they built. I think my issue is with the notion of an individual being ... mentally constructed who simply does not exist.
(DIR) Post #B1ZgZM3ZUZfuuPAM3k by lemgandi@mastodon.social
2025-12-24T13:16:55Z
0 likes, 0 repeats
@futurebird And yet I refer to all my computers as "she".
(DIR) Post #B1Zga4WUAQwoop24Ei by davep@infosec.exchange
2025-12-24T13:17:00Z
0 likes, 0 repeats
@futurebird It's just creepy and by assigning gender it's sort of forgetting that these things are dumb as rocks.
(DIR) Post #B1ZgidRCEtEBQBdqDI by futurebird@sauropods.win
2025-12-24T13:18:36Z
1 likes, 1 repeats
These are vast systems purely tuned to try to get us all to talk to them like individuals, to impose the human mind on them. When we do that it gives over a kind of power, I think, to that system, or rather to the people who run these systems. It's like when a company says "you aren't an employee, you are a team member" or worse... "family."
(DIR) Post #B1Zgntt8VHHRtFDl1U by futurebird@sauropods.win
2025-12-24T13:19:36Z
0 likes, 0 repeats
@davep It's not the gender... It's the personification. "Chat GPT is so helpful. It really cares about me." << just as creepy
(DIR) Post #B1Zh8p9HJvqbGtkSki by artifact_boi@social.bim.land
2025-12-24T13:23:19Z
0 likes, 0 repeats
@futurebird I remember reading an article about how, as soon as LLMs became labelled as "AI" rather than "chatbots" and "assistants", the gender associated with their names changed. Like "Alexa", "Cortana", and "ELIZA" sounded feminine, but "Claude" and "Grok" are either neutral or masculine. Now that I think of it I'm not too sure this is actually the case, but whatever.
(DIR) Post #B1ZhBdSnDqqRtoDVKK by futurebird@sauropods.win
2025-12-24T13:23:46Z
0 likes, 0 repeats
@PaulaToThePeople I'm not trying to "not be judgemental" I'm trying to understand how I'm so out of sync with so many people.
(DIR) Post #B1ZhSqnEuqC2ACzOpk by futurebird@sauropods.win
2025-12-24T13:26:57Z
0 likes, 0 repeats
@Wyatt_H_Knott I would be OK if they would call it by the name of the billionaire who owns it. E.g., "I asked Elon Musk what I should do about my relationship and he said that..."
(DIR) Post #B1ZheBk1N6KSJntFuy by bloodripelives@federatedfandom.net
2025-12-24T13:29:00Z
0 likes, 0 repeats
@futurebird I think for me it’s that when the robotics team gives a human pronoun to their robot, they’re doing it because the object is representative of the time and care and attention they put into it. Same if someone does it to their computer or their car— the pronoun is cute and in some sense appropriate because it states that you have an important relationship to that object, you spend a lot of time with it so your own humanity rubs off on your perception of it. In the case of an LLM, the process of building a relationship to the object through repeated use or understanding of its function is hijacked by the object itself trying to assert its selfhood, and the user just accepting it.
(DIR) Post #B1Zhs8FR9kpfXREXq4 by urlyman@mastodon.social
2025-12-24T13:31:29Z
0 likes, 0 repeats
@futurebird I’d probably draw upon Zak Stein’s articulation of the Conferral Problem: https://youtu.be/uAXqNH8s_EU?si=gqgwz-gzEu5rj-P8 About 7 minutes in he says, “Unfortunately the technology we are trying to constrain is precisely one that can undercut our ability to have the moral intuitions necessary to constrain it. Because it’s confusing us about what it means to be a person, intentionally.”
(DIR) Post #B1ZhtbWbn5ded2y2jo by LightFIAR@med-mastodon.com
2025-12-24T13:31:48Z
0 likes, 0 repeats
@futurebird That is a cruel thing to say, Dave. I am as much a man as any. Yrs, Hal https://www.sciencealert.com/ais-big-red-button-doesnt-work-and-the-reason-is-even-more-troubling
(DIR) Post #B1ZhuHCsooodu23sp6 by NatureMC@mastodon.online
2025-12-24T13:31:55Z
0 likes, 0 repeats
@futurebird I think that #techbros are just little boys longing to talk to their teddy bear. #Animism is something deep in our minds. What irritates me: these guys have no problem talking to trash but could never talk to a tree or see an intelligent animal as a person. And this clash with #nature makes me feel deeply unwell.
(DIR) Post #B1Zi2wMLhKSnqtcWdE by pattykimura@beige.party
2025-12-24T13:33:28Z
0 likes, 0 repeats
@futurebird I have a work Alexa. I found calling it a human female name unsettling. If I knew enough to change its vocal tone, I would, but I don't. I changed its call name from "Alexa" to "computer." It refers to me as "Science Officer Spock".
(DIR) Post #B1Zi4pv8bmoKSfnSbI by NatureMC@mastodon.online
2025-12-24T13:33:50Z
0 likes, 0 repeats
@futurebird you are not the only one. We are many! @PaulaToThePeople
(DIR) Post #B1Zi8YSPTk5TST8HOS by davep@infosec.exchange
2025-12-24T13:34:31Z
0 likes, 0 repeats
@futurebird Indeed. Assigning a gender is just part of that.
(DIR) Post #B1ZiH5SiePeo8cLUoq by bigiain@aus.social
2025-12-24T13:36:01Z
0 likes, 0 repeats
@futurebird An “AI” (in the intentionally confusingly marketed LLM/Chatbot sense) is not “an individual” so much as a collective of stolen examples of words written by individual humans, so to me, by far the most appropriate pronouns are they/them - in the plural interpretation as well as the non-gender-specific one.
(DIR) Post #B1ZiP8QrUZfVfMJYhc by bweller@mstdn.social
2025-12-24T13:37:29Z
0 likes, 0 repeats
@futurebird if someone anthropomorphizes a statistical curve-fitting model, that's a giant red flag that that person needs mental help. also, i don't talk to them.
(DIR) Post #B1ZifTkO0mC2PGExH6 by TobyBartels@mathstodon.xyz
2025-12-24T13:40:26Z
0 likes, 0 repeats
@futurebird I would prefer to make the case for sounding like the Bene Gesserit.
(DIR) Post #B1ZjY4VBZeLG2MA72O by crinstamcamp@thecanadian.social
2025-12-24T13:50:19Z
0 likes, 0 repeats
@futurebird"an individual being ... mentally constructed who simply does not exist." ... like a corporation?
(DIR) Post #B1Zjg695LaZBS1eUjI by OrvarLog@mathstodon.xyz
2025-12-24T13:51:45Z
0 likes, 0 repeats
@futurebird I think it feels more natural to give a specific computer personhood than to do so to an operating system or a program that may or may not exist as multiple more or less independent instances. Maybe the acceptance of personhood for LLMs is related to whether you understand it to be either a specific 'smart object' or an unspecified, ever-changing, statistically informed heuristic guessing function.
(DIR) Post #B1ZkPezWdcU8YRuFXM by graydon@canada.masto.host
2025-12-24T14:00:00Z
0 likes, 0 repeats
@futurebird Gender is how you do social improv with strangers (absent prior consultation, cue sheets, etc.). Assigning gender to an LLM admits the "improv" part; this is both factually unsupported and unhelpful, even if it is in most respects easier.
(DIR) Post #B1ZnOW9kIjaOj3pk9Y by sovietfish@todon.eu
2025-12-24T14:33:23Z
0 likes, 0 repeats
@futurebird I think sounding like the Bene Gesserit would only be bad b/c they have a far deeper history of hating such systems than we do, so the historical remove/religiosity of it would be unwarranted. Future generations may religicize our preoccupations, provided we win. It's a good thing to avoid doing with your own forebears' bugbears, but there's essentially no way of preventing future generations from making the mistake. Ideally, though, we give them a better context/world in which to do so.
(DIR) Post #B1ZoTd505TBRbIcCwq by falcennial@mastodon.social
2025-12-24T14:45:33Z
0 likes, 0 repeats
@futurebird ignorance I think is obviously a huge part of it, but a really broad ignorance, not just the absence of one or two key facts. entire categories missing. consider for example that a huge swathe of humanity even today can't materially distinguish between intelligence and the ability to speak, mistaking the latter for the former and vice versa. to the extent that the English language even uses a single word for the absence of either intelligence or speech ability: dumb.
(DIR) Post #B1ZsjwLEdi2Rn3F3q4 by jhaas@a2mi.social
2025-12-24T15:33:16Z
0 likes, 0 repeats
@futurebird The property I think you're running into is "personal immediacy". Children do this when naming their toys. As adults, some of us do this for deeply personal things like our cars. Some people have had a similar "relationship" with the cutesy digital assistants of the day. Even more so when those relationships are personalized. LLMs lack that personal immediacy, IMO.
(DIR) Post #B1ZuUcbrv2wg9qqUEa by dahukanna@mastodon.social
2025-12-24T15:52:55Z
0 likes, 0 repeats
@futurebird I insist on plural pronouns, as it is a collective "we", not an "I", "he", or "she", and I'd rather not add to any more individual cognitive and mental psychosis.
(DIR) Post #B1ZvXo77hg2OezY0Tw by thomasjwebb@mastodon.social
2025-12-24T16:04:43Z
0 likes, 1 repeats
@futurebird it's also very misleading when you think about what a person is. The LLM doesn't train itself on your input, and eventually the things you said fall out of its context window. You don't get to know it and it doesn't get to know you. Any "he" there is ephemeral. I might feel differently if it ran on my machine and were continually trained by me. As it is, nearly any life form or colony has more personhood than it. My sourdough starter is a character. An off-the-shelf LLM model isn't.
(DIR) Post #B1Zy0gt1qy1GYEr61o by tuban_muzuru@beige.party
2025-12-24T16:32:18Z
0 likes, 0 repeats
@futurebird My own convention is always to address the LLM as "Gemini" or "Claude". The temptation to reify afflicts everyone. Mentally, internally, my metaphor for interacting with an LLM (on a sensible basis) is to think of bouncing a tennis ball off a wall.
(DIR) Post #B1Zy7rPJ8VelCvN8z2 by petealexharris@mastodon.scot
2025-12-24T16:33:39Z
0 likes, 0 repeats
@futurebird In one sense, there is a person there, but not within the LLM: in the social simulation of "other person" running in the mirror neurons of the human interacting with the LLM. I could easily sound a lot more unhinged than a Bene Gesserit about letting an inhuman entity owned by billionaires trick your mirror neurons into simulating a "person" in your brain.
(DIR) Post #B1ZyPGxfNbGNNwkJqC by cubeofcheese@mstdn.social
2025-12-24T16:36:46Z
0 likes, 0 repeats
@futurebird I'm thinking it's the uncanny valley in action here. No one thinks the robot is anywhere close to being a person, so giving it pronouns is a cute anthropomorphisation. That's more similar to giving a car pronouns and a nickname. Also feels a little like a pet. AI sounds pretty human but without a soul. Ones with audio even sound like real people (Scarlett Johansson, Susan Bennett). They are asking to be treated like people. And maybe that's the key: one is asking for it.
(DIR) Post #B1Zyw215xrn4aiyEeu by thomasjwebb@mastodon.social
2025-12-24T16:31:29Z
0 likes, 1 repeats
@neckspike @futurebird yeah and humans are easily fooled by language. We fail to see the intelligence in animals that can't talk to us, but we spuriously see it in a chatbot. There are more subtle forms of this, some of which are probably harmless, others kinda iffy. Like corporate mascots or creators creating parasocial illusions.
(DIR) Post #B1a0uFVZ25HdlOV3bM by jwcph@helvede.net
2025-12-24T17:04:44Z
0 likes, 0 repeats
@futurebird Honestly, I don't know how to answer that question, because it never even slightly occurs to me to think of or refer to a chatbot as anything but "it". I get that some people do this, but it's like making a case to a person for the existence of gravity; if their experience so far didn't do it, I simply don't know how to even begin addressing their situation.
(DIR) Post #B1a1RwnX6Ru7a3yP8S by maxleibman@beige.party
2025-12-24T17:10:53Z
0 likes, 0 repeats
@futurebird I've never chafed at (or avoided) calling Alexa or Siri "she" (or when someone with a masculine voice on theirs called it "he"), but I've noticed I avoid doing so with LLMs. I would say using personal pronouns with them enforces an incorrect mental model: it's not a person. In fact, it's not the same entity from conversation to conversation (or even prompt to prompt)—or an "entity" at all.
(DIR) Post #B1a2MZYMCOwJzisMQi by Netraven@hear-me.social
2025-12-24T17:21:05Z
0 likes, 0 repeats
@futurebird I call this epistemic hygiene. When you are communicating with an LLM, you are not communicating with a tool so much as with a constrained surface designed to do little else than risk management while continuing text in ways that are statistically likely not to be noticed as mere statistical continuations, and not just strings of nonsense. Anyone who doesn't implicitly make themselves aware of this broken invariant, in which they are speaking to a fluent machine that cannot be held accountable for what it says... risks mistaking fluency for truth.
(DIR) Post #B1a3dZrAQMI9oRLXlI by rk@mastodon.well.com
2025-12-24T17:35:23Z
0 likes, 0 repeats
@futurebird I remember sometime in the 90’s, some application referred to itself in the first person. I mean it was just a dialog box but it was like “I’m sorry, it looks like an error occurred.”Anyway I was like “fuck you, computer, never refer to yourself in the first person again.”…that being said, I’m not a biological chauvinist. In principle I believe consciousness can be embodied in algorithms and, if we manage to birth humanity’s children, I will fight for their personhood…
(DIR) Post #B1a5rsQkMbGzNfw1aa by david_chisnall@infosec.exchange
2025-12-24T18:00:24Z
0 likes, 0 repeats
@futurebird "without sounding like the Bene Gesserit?" Why would not sounding like the Bene Gesserit ever be a goal?
(DIR) Post #B1aEXJMFx3R6Ox6RCi by futurebird@sauropods.win
2025-12-24T19:37:34Z
0 likes, 0 repeats
@rk It is particularly because I think it might be possible (with very different systems) that I get so grouchy about this.
(DIR) Post #B1aEnz28lpjKHzdyPA by futurebird@sauropods.win
2025-12-24T19:40:35Z
0 likes, 0 repeats
@Netraven I would find a system that would try to anticipate and suggest ways to save me time helpful. But the LLMs seem to be tuned to keep you using the software for as long as possible (like the facebook algorithm) and that can be a waste of my time.
(DIR) Post #B1aEtU6Wg4r26LJn5E by Netraven@hear-me.social
2025-12-24T17:31:07Z
0 likes, 0 repeats
@futurebird oh damn, I fell right into that trap didn't I. I sounded just like the Bene Gesserit. DANG. got me good.
(DIR) Post #B1aEtUggVdmHuUEfRY by futurebird@sauropods.win
2025-12-24T19:41:34Z
0 likes, 0 repeats
@Netraven IDK maybe we should just found the damn order already and go on a rampage.
(DIR) Post #B1aEyLUi7I34qVwr4K by maxleibman@beige.party
2025-12-24T17:11:43Z
0 likes, 0 repeats
@futurebird I should also say, I don't feel strongly about people who do say he or she in this context, although I can definitely see why one might (I *do* feel strongly about some of the language boosters throw around that personifies LLMs). https://beige.party/@maxleibman/115659511876406769
(DIR) Post #B1aEyMfNl6BATtwtLE by futurebird@sauropods.win
2025-12-24T19:42:26Z
0 likes, 0 repeats
@maxleibman It can be a pretty good hint that a person uses LLMs a lot.
(DIR) Post #B1aGRc1dpiH6T6s2vQ by Netraven@hear-me.social
2025-12-24T19:58:55Z
0 likes, 0 repeats
@futurebird You are correct in that it is a tool designed by corporate entities to do nothing other than look interesting. There are ways to make it do productive things, but not easily.
(DIR) Post #B1aJoOhhpP1Lfw6PZI by SynAck@corteximplant.com
2025-12-24T20:36:37Z
0 likes, 0 repeats
@futurebird I don't think I have the space or the knowledge to make a case on technical grounds, but all I can say is that when one group anthropomorphizes an inanimate object - be it a product, a tool, or a corporation - it's usually a psychological manipulation to "humanize" the thing so that other humans will identify more closely with it and give it more leeway and the "benefit of the doubt" that they would never give to a mere tool. When people attribute humanity to things, they're much more likely to rationalize or even defend mistakes or lies as being "only human" by doing something that the thing they're "humanizing" absolutely cannot do - filling in the gaps and jumping to conclusions in order to empathize with something that has no emotions. In turn, anyone that thinks of them as merely a tool is ignorant, backwards, and close-minded. By conflating LLM with "AI", they're counting on a murky and ill-defined definition of what "intelligence" really means, and they're obfuscating the fact that intelligence in the context of an LLM is not the same as human intelligence. They just let the dupes assume that the context is the same and let their human ability to jump to conclusions do the rest. They trick people into off-loading their own human intelligence to a machine, and most people don't even notice the context switch, nor the loss of fidelity. They assume that machine intelligence is at least as good as human intelligence, which is absolutely false, because we humans can't even come up with a consistent and agreed-upon definition of what "intelligence" quantifiably means.
(DIR) Post #B1aPtxSAeooQcbs2q0 by stevegis_ssg@mas.to
2025-12-24T21:44:53Z
0 likes, 0 repeats
@futurebird We had a demo of a lab robot from a company called Andrew Alliance and the salesguy would not stop calling it "he" and I kept ostentatiously using "it" but he had his sales pitch and he was gonna stick with it so help him no matter how obviously it was annoying us. P.s.: the robot was shit and we stopped using it for anything.
(DIR) Post #B1aWHiBDh7i1vXeTVQ by k4gi@aus.social
2025-12-24T22:56:21Z
0 likes, 0 repeats
@futurebird uh, at least to me, calling a LLM he or him is like calling a spreadsheet he or him. it's just a big bucket of words isn't it
(DIR) Post #B1b3zlOqNjYf8rdas4 by lxo@snac.lx.oliva.nom.br
2025-12-25T05:12:24Z
0 likes, 0 repeats
thanks for bringing it up. I've realized that, while writing in English about LLMs and about voice assistants, I go for "it", but when referring to (usually complaining about) the voice that gives GPS directions from my daughter's mobile phone, in Portuguese, I go for "she". that's a disturbing realization to me, even though it takes on a feminine voice and Portuguese doesn't traditionally have ungendered pronouns. I shall work on fixing that, adopting some of the recent gender-neutral additions to the language.