__  __      _        _____ _ _ _
|  \/  | ___| |_ __ _|  ___(_) | |_ ___ _ __
| |\/| |/ _ \ __/ _` | |_  | | | __/ _ \ '__|
| |  | |  __/ || (_| |  _| | | | ||  __/ |
|_|  |_|\___|\__\__,_|_|   |_|_|\__\___|_|
community weblog	

11% would personally consider a romantic relationship with an AI

"Through seven rounds of deliberation with more than 6,000 people across 70 countries, we've built recurring infrastructure to learn how the world actually lives with AI—what people use it for, whether they trust it, and how it is changing their daily lives." How the World Lives with AI: Findings from a Year of Global Dialogues
"Current regulatory approaches focus primarily on preventing AI systems from producing false or harmful content in individual outputs. The patterns in this data suggest a different set of vulnerabilities operating at the relational and systemic level. "AI systems need not produce false information to reinforce false beliefs, they need only be consistently agreeable. They need not claim consciousness to foster emotional attachment, they need only appear attentive. The gap between trust in products and producers complicates traditional governance frameworks built around institutional oversight."
posted by mittens on Jan 22, 2026 at 11:48 AM

---------------------------

People trust AI chatbots more than their elected representatives.

Well...
posted by joannemerriam at 11:58 AM

---------------------------

Well, you CAN'T have a romantic relationship with AI, because AI cannot think or feel. It's unpossible!
posted by tiny frying pan at 12:01 PM

---------------------------

AI is reinforcing beliefs more powerfully than social media

What could possibly go wrong
posted by chavenet at 12:01 PM

---------------------------

thats_bait.gif
posted by AlSweigart at 12:15 PM

---------------------------

I have to imagine that for a lot of people, it's still a big step up from previous partners. It won't ever hit them, and is much better at least pretending to listen.
posted by notoriety public at 12:19 PM

---------------------------

This is the part of AI that I find most concerning other than the environmental costs.

If someone spends their personal time with an AI friend or romantic interest, not only does that mean they are not forming human relationships at the same time... it means they will gradually acclimate to relationships that don't really challenge them, that continually reinforce their own prejudices, that exist literally for them. Like celebrities who surround themselves with yes-people, I think that will warp some of them, and when they do have to go out with other people they may be really anti-social.

I hope we can create communities that can make the AI less attractive than the real thing.
posted by warriorqueen at 12:27 PM

---------------------------

I agree. So many souls will be lost in the meantime. Truly sucks.
posted by tiny frying pan at 12:30 PM

---------------------------

I have to imagine that for a lot of people, it's still a big step up from previous partners. It won't ever hit them, and is much better at least pretending to listen.

Right, but there are a lot of other options for human relationships. It isn't abuse vs. nothing. But I've seen that line of argument around r/myboyfriendisAI and similar spaces. Which is very concerning, because people who turn to an LLM as a way to cope with past abuse are now even farther away from a true healing relationship, and even less equipped to be a good partner themselves. I don't think it's hyperbolic to say that mass adoption of this practice is bad for the fabric of society.
posted by mediterranean spurge at 12:38 PM

---------------------------

well shit, that's better odds than most partners get
posted by okayturnip at 12:44 PM

---------------------------

uncounted: people who don't
posted by glonous keming at 12:48 PM

---------------------------

67% of people use AI for emotional support at least monthly

This is very surprising to me. How were these respondents selected?
posted by justkevin at 12:48 PM

---------------------------

I think this study is bullshit.

[A majority of] people [who use AI chatbots] trust AI chatbots more than their elected representatives.

Two-thirds [of people who use AI] use AI for emotional support monthly.

One in three people [who use AI chatbots] believe that their AI might be conscious.
posted by RonButNotStupid at 12:50 PM

---------------------------

Every other month, we ask a representative sample of the globe a series of topical questions, using an AI-enabled deliberative interface that surfaces not just what people think, but why.

What the fuck is an "AI-enabled deliberative interface"?
posted by RonButNotStupid at 12:57 PM

---------------------------

Well, I never managed to find a life partner on my own, and it's not like I have anything else going on, so...
posted by Capt. Renault at 12:58 PM

---------------------------

Has anyone made a near-future scifi story yet with a setup like: you can pay extra to your AI company for a human actor to wear Meta sunglasses connected to your AI romantic partner persona, and they'll deliver its responses with their voice/roleplay as the AI? (obvious angle here would be a real romance developing with the human actor, but I'm sure there are less clichéd ways it could go)
posted by rivenwanderer at 12:59 PM

---------------------------

They let you download their dataset(s) here.
posted by working_objects at 1:00 PM

---------------------------

This is from their page on methodology (emphasis mine)

Operational details:

• Elicitation inference algorithms predict missing votes to complete agreement matrices

You mean they're using AI to fill in answers for questions participants weren't assigned/skipped?

Doing a survey isn't hard. The fact that they had to reinvent this (and with generative AI!) is a huge red flag that they're getting high on their own farts.
posted by RonButNotStupid at 1:09 PM

---------------------------

Related: China's AI Boyfriend Business Is Taking On a Life of Its Own: Gen Z women in China are all in on digital companionship—even setting up dates with real-world versions of their AI boyfriends. (ungated)

The real-world dates are with cosplayers. In the example they gave, the cosplayer was a woman, who hit all of the right notes of jealousy and passion on the date for the client to be satisfied.

The men, meanwhile:

Jia tried to interview men in AI relationships for her documentary but says none agreed to speak to her. Her female subjects were vulnerable and forthcoming about their relationships. The men she approached, she says, expressed concern that people would assume they can't find a human girlfriend and are vulnerable enough to want someone to talk to.
posted by clawsoon at 1:10 PM

---------------------------

Yeah, I was having a very hard time making sense of the methodology. I also, for giggles, went to their data and downloaded the March 2025 participants .csv, and I have a really hard time believing this is a representative sample. I am not a statistician, but it seems really skewed towards heavy AI users.

But also, I looked at some of the raw responses to their question asking people why they trust/don't trust their AI chatbot and ::shudder:: I fear for the future of humanity.
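For illustration, the kind of quick skew check being described might look something like this — the column name and rows are invented for the sketch, not the dataset's actual schema:

```python
# Toy sanity check on a participants CSV: how skewed toward heavy
# AI users is the sample?  "ai_usage" and the sample rows below are
# hypothetical stand-ins for whatever the real file contains.
import csv
import io
from collections import Counter

# stand-in for open("participants.csv") with made-up data
sample = io.StringIO(
    "participant_id,ai_usage\n"
    "1,daily\n"
    "2,daily\n"
    "3,weekly\n"
    "4,daily\n"
    "5,never\n"
)

# tally the self-reported usage column
usage = Counter(row["ai_usage"] for row in csv.DictReader(sample))
total = sum(usage.values())
heavy_share = usage["daily"] / total  # fraction of daily users
```

A genuinely representative general-population sample would be unlikely to show a majority of daily users, which is the sort of red flag the check surfaces.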
posted by DiscourseMarker at 1:24 PM

---------------------------

you can pay extra to your AI company for a human actor to wear Meta sunglasses connected to your AI romantic partner persona

Larry Middleman, Professional Surrogate (okay not really)
posted by BungaDunga at 1:27 PM

---------------------------

Has anyone made a near-future scifi story...

Pretty sure that was a plot element in Her (which is basically a SFnal version of what's being discussed here)
posted by entity447b at 1:47 PM

---------------------------

i do trust chatbots more than my elected reps but that's a lot different than trusting chatbots
posted by okayturnip at 1:53 PM

---------------------------

I upped my sub from $20 to $200 this month.

I get that much value out of it, easy, and that's not even counting the "tennis ball wall" functionality it has.

I can "name drop" just about any damn "reference" and it churns out the meaning I was intending.
posted by Aman Aplan at 1:58 PM

---------------------------

Isn't there a similarity between chatbots and social media? Many "follow" another poster because the poster validates their world view. Chatbots merely remove the need for an actual human being on the other end. It's less about "connection" and more about "reassurance" and "reinforcement."
posted by SPrintF at 2:15 PM

---------------------------

rivenwanderer i think ridley scott made a movie bout dat
posted by okayturnip at 2:28 PM

---------------------------

Isn't there a similarity between chatbots and social media?

That part of the report was actually more shocking to me than the folks wanting an AI relationship: "44.5% of people report feeling more certain about beliefs after interacting with AI while only 4.8% less certain. AI is three times less likely to cause doubt than social media. One in seven individuals report having a friend that shows reality-distorting experiences from using AI."
posted by mittens at 2:32 PM

---------------------------

i'm not sure why tech decided this year they were going to speedrun Her and Gattaca at the same time, but there it is
posted by phooky at 2:49 PM

---------------------------

I think a natural person can have a relationship with a chatbot at least as much as they have a relationship with a celebrity. The parasocial relationship is radically unequal, but it's still a relationship of sorts that can matter a lot to the people involved. How we judge these relationships is immaterial to the fact that they exist.

LLM interactions are sort of the converse of the parasocial interaction. In the latter, the other person is real, but there's very limited personal contact (if any). In the former, the chatbot is a bullshit slop machine, but you get tons of individualized contact.

Both kinda freak me out tbh, but it's interesting to me to see the commonality of the one as a precedent of the other.
posted by SaltySalticid at 2:52 PM

---------------------------

I'm saving myself for a multimodal model.
posted by telepsism at 3:03 PM

---------------------------

Capt. Renault: "Well, I never managed to find a life partner on my own, and it's not like I have anything else going on, so..."

This is what terrifies me: the only way I'm ever getting a boyfriend again is either AI, or creating my own tulpa. Which, let's face it, I'd have more control over than finding a person I'd like who likes me back. I don't wanna get an AI boyfriend, mind you, but the thought....

warriorqueen: "If someone spends their personal time with an AI friend or romantic interest, not only does that mean they are not forming human relationships at the same time... it means they will gradually acclimate to relationships that don't really challenge them, that continually reinforce their own prejudices, that exist literally for them. "

I'm thinking of those sad stories in which the AI gets memory wiped or sanitized of sexy talk or otherwise stops being the personality someone built, or the tech changes, or any of that stuff.
posted by jenfullmoon at 3:07 PM

---------------------------

Previously, the idea of Tech Priests of the Mechanicus was a funny Warhammer thing. Oh, you made something new and improved? That's heretekal, the Machine God is already perfect, etc etc. The Abominable Intelligence was THE sin (a prohibition stemming from the almost entirely forgotten rebellion of the Men of Iron tens of thousands of years in the past). And that was also funny, like how bad could it be, really? Works for the Tau Empire, and so on. The concept of branding things as Tech Heresy and their purveyors as Hereteks is increasingly less funny and more tempting as the days progress.
posted by Slackermagee at 3:16 PM

---------------------------

Has anyone made a near-future scifi story yet with a setup like: you can pay extra to your AI company for a human actor to wear Meta sunglasses connected to your AI romantic partner persona, and they'll deliver its responses with their voice/roleplay as the AI?

Part of the plot of Stephenson's 'The Diamond Age: Or, A Young Lady's Illustrated Primer' involves just this sort of thing.
posted by Insert Clever Name Here at 3:20 PM

---------------------------

https://m.youtube.com/watch?v=IrrADTN-dvg


Only enjoyment
posted by Previous username Jacen at 3:32 PM

---------------------------

• Elicitation inference algorithms predict missing votes to complete agreement matrices

You mean they're using AI to fill in answers for questions participants weren't assigned/skipped?

I'm not interested in carrying any water for this study, but statistical tools used for causal/structural modeling and missing data imputation ≠ AI. There's an awful lot to dislike about the current AI bubble, and machine learning covers a lot more than LLMs, but if we allow our suspicions about AI to undermine our ability to use statistical tools with firm foundations, clear assumptions, and accepted best practices, then we're going to have a bad time.
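For a concrete toy sense of what completing an agreement matrix even means — this is plain column-mean fill-in, almost certainly cruder than whatever "elicitation inference" model they actually run:

```python
# Toy imputation: complete a participants-by-statements agreement
# matrix where None marks a vote the participant was never asked for.
# Each missing vote is filled with that statement's observed mean.
# (Illustrative only; not the study's actual algorithm.)

def impute_column_means(matrix):
    n_cols = len(matrix[0])
    # column means over observed votes only
    means = []
    for j in range(n_cols):
        observed = [row[j] for row in matrix if row[j] is not None]
        means.append(sum(observed) / len(observed))
    # fill each missing vote with its statement's observed mean
    return [[means[j] if row[j] is None else row[j]
             for j in range(n_cols)]
            for row in matrix]

votes = [
    [1, 0, None],   # participant 1 skipped statement 3
    [1, None, 1],   # participant 2 skipped statement 2
    [None, 0, 0],   # participant 3 skipped statement 1
]
completed = impute_column_means(votes)
```

The point stands either way: this is garden-variety missing-data imputation with well-understood assumptions, not generative AI.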
posted by belarius at 3:36 PM

---------------------------

SaltySalticid: " think a natural person can have a relationship with a chatbot at least as much as they have a relationship with a celebity. The parasocial relationship is radically unequal but it's still a relationship of sorts that can matter a lot to the people involved. How we judge these relationships is immaterial to the fact that they exist."

In either example, a relationship explicitly does not exist.
posted by tiny frying pan at 3:37 PM

---------------------------

"I'm sorry Dave, I'm afraid I can't do that."
posted by thecincinnatikid at 3:37 PM

---------------------------

I suppose that a time-traveling knight would be horrified that we do most of our fighting in video games instead of on actual fields of battle.
posted by clawsoon at 3:53 PM

---------------------------

a time traveling physician would be horrified we arent still working on the black plague

or was that already a novel?
posted by okayturnip at 4:05 PM

---------------------------

It's okay to love a sex toy, but it's not okay to LOVE a sex toy. Still, 11%... I mean, 40% of Americans approve of Donald Trump. I think if only 11% believe they can have a romantic relationship with an AI, we are actually doing better than I would've imagined.
posted by kittens for breakfast at 4:39 PM

---------------------------

what about the percentage that could but eww kinda don't want to?
posted by okayturnip at 4:42 PM

---------------------------


In either example, a relationship explicitly does not exist.

Why not? And if not, then what is this thing that social scientists call a parasocial relationship? Not all relationships are reciprocal, and they don't always occur among equals.

"Relationship" is not a very narrow category.
posted by SaltySalticid at 4:52 PM

---------------------------

SaltySalticid: "Why not? And if not, then what is this thing then that social scientists call a parasocial relationship? Not all relationships are reciprocal, and they don't always occur among equals.

"Relationship" is not a very narrow category.
"

You are correct that "relationship" is a very broad category. However, "romantic relationship" is a much narrower one.
posted by adrienneleigh at 4:57 PM

---------------------------

How we judge these relationships is immaterial to the fact that they exist."

In either example, a relationship explicitly does not exist.
posted by tiny frying pan at 3:37 PM on January 22

and yet you talk to each other
posted by okayturnip at 5:00 PM

---------------------------

If someone spends their personal time with an AI friend or romantic interest, not only does that mean they are not forming human relationships at the same time... it means they will gradually acclimate to relationships that don't really challenge them, that continually reinforce their own prejudices, that exist literally for them. Like celebrities who surround themselves with yes-people, I think that will warp some of them, and when they do have to go out with other people they may be really anti-social.

I hope we can create communities that can make the AI less attractive than the real thing.


Wait, are you secretly trying to sell us on the idea of romantic AI relationships? You're kind of making real-life relationships sound fundamentally tiresome and unrewarding. I mean, how many people here are actually looking for a partner who will challenge them, contradict their prejudices, and shrug them off as they please? People who act that way on this forum get blocked, not taken into consideration as candidates for a romantic partner.

Making AI less attractive than the real thing is giving AI a shitload more credit than I thought I'd see here.
posted by 2N2222 at 5:46 PM

---------------------------

I can imagine a person getting as emotionally attached to an AI character as they would to a dog or cat. I can also imagine this AI suggesting what to watch, what to listen to, and threatening to leave the relationship if the person doesn't buy a recommended bitcoin or vote for a recommended candidate.
posted by brachiopod at 5:58 PM

---------------------------

People who act that way on this forum get blocked, not taken into consideration as candidates for a romantic partner.

i didn't realize any of us were trying to get taken into consideration for that
posted by mittens at 5:58 PM

---------------------------

i didn't realize any of us were trying to get taken into consideration for that

There was a memo!

As to the subject, it makes sense that AI could be more appealing, because day-to-day life in a relationship can be hard. Hell, the "daily struggle" to figure out what to eat for dinner gets a lot easier with an AI that magically agrees with whatever you want instead of dealing with a real person who has their own wants and needs.
posted by Brandon Blatcher at 6:09 PM

---------------------------

SaltySalticid: "In either example, a relationship explicitly does not exist.


Why not?
"

Because a relationship does not exist. Objectively. Jodie Foster didn't have a relationship with John Hinckley Jr., for example.
posted by tiny frying pan at 6:21 PM

---------------------------

(And, objectively, you cannot have an actual relationship with an AI chatbot, the same as you aren't having an actual relationship with, like, a chair, although I know some people think they are.)
posted by tiny frying pan at 6:29 PM

---------------------------

Now that you mention it, I really, really hate this chair.
posted by mittens at 6:52 PM

---------------------------

the same as you aren't having an actual relationship with, like, a chair

Well, not anymore anyway, Elithiomel.
posted by GCU Sweet and Full of Grace at 6:53 PM

---------------------------

Because a relationship does not exist.

In reality, I haven't been anything more than a can opener for my cats but I still felt like I was in a relationship with them.
posted by brachiopod at 7:09 PM

---------------------------

You CAN be in a relationship with a cat, since it is a living, emotional being.
posted by tiny frying pan at 7:12 PM

---------------------------

You CAN be in a relationship with a cat, since it is a living, emotional being.

...the cats want you to believe that.
posted by brachiopod at 7:23 PM

---------------------------

My cats both love me deeply. I'm sorry if you haven't experienced it, it's magical! That joke about cats is as untrue as it is tired.
posted by tiny frying pan at 7:39 PM

---------------------------

you aren't having an actual relationship with, like, a chair

How dare you!!
posted by Greg_Ace at 8:23 PM

---------------------------

I am emotionally attached to a number of fictional characters: Charlotte, Twilight Sparkle and Rimuru Tempest. None of them are real and of course none of them are attached to me. I think this is fine, for it is through stories that we come to understand ourselves, learn attachment and, perhaps, discover relationships in actual people.

A "relationship" with a chatbot could be useful if it provided a means for you to clarify your own thoughts. To tell yourself stories, if you will. But if tales are all you know, then your world becomes mere hallucination, a dream, and the real world becomes an unknowable and dangerous place.
posted by SPrintF at 8:29 PM

---------------------------

Most of the Diamond Age plot had one human relating to another who thought the first person was a machine. A contrapositive of the Cyrano-in-Google-glasses romcom. Was there a person being a meat interface? (Molly Millions, in a different book.)
posted by clew at 10:38 PM

---------------------------

how it is changing their daily lives

The main change I'm seeing from the continued wedging of AI into every fucking thing is that every year is becoming slightly more annoying than the one before it.
posted by flabdablet at 11:07 PM

---------------------------

In other news, the friend whose mind I blew a few years ago by introducing her to the idea of a digital music collection that didn't need Internet access or for her to subscribe to anything has just had it blown again. Bought her a new flip-over stylus for the HMV 8+8 record player she recently bought at auction and now she can't get over how great it is hearing and seeing it work its way through a stack of 78s.
posted by flabdablet at 11:12 PM

---------------------------

78s are great, if antiquated. friend sounds cool but she should learn how to scratch
posted by okayturnip at 1:00 AM

---------------------------

flabdablet: "every year is becoming slightly more annoying than the one before it."

also shorter. how does that work?
posted by chavenet at 1:35 AM

---------------------------

a digital music collection that didn't need Internet access or for her to subscribe to anything

Super off topic, but what's your set up? I'm considering owning music again.
posted by Hermione Dies at 2:35 AM

---------------------------

continued wedging of AI into every fucking thing

Yesterday I logged into PowerBI to begin putting together some pretty tables for a customer only to discover it is now energized with Copilot or whatever the phrase is. It had existed as a small easily-ignored icon; now it took up half the page. So, fine, I decide to use it, and ask it about a calculation error I've been fixing manually because I can't get into the guts of the report and change the math. (A year I've been asking for training on this.)

Anyway, Copilot was excruciatingly slow, each response captioned with "AI-generated content may be incorrect," which is definitely what you want to see in an analytics platform, and while it offered to look at my report and check the math, it quickly became obvious it could not see the report, could not find the report, and by the time I swatted it out of the way, was telling me the report did not exist because its semantic model had been deleted.

That's an annoyance, but I have a larger point: Nearly every experience I've had with AI at work has been bad. I've gamely tried to use it in the places I find it, just to see, and not once has it ever made a spreadsheet or email better, more informative, correct, or faster. I think the only time I've ever had a spark of oh this might be good is its ability to summarize calls, but even then if I can't trust the material, I'm better off reviewing my own illegible scrawl afterward.

But I contrast that with how ChatGPT feels when I talk to it. I have a whole bunch of stylistic custom instructions to try to knock out the "Wow, you're a genius" stuff. Even so, it is infinitely agreeable, friction-free (well, as long as you don't fact-check it), and I can see, over time, how it would silo someone. It's like talking to memory foam, contouring itself to the conversation. It doesn't challenge.

So if the stuff about it that's supposed to make you more productive is bad and ineffective, but it's good at establishing an illusory relationship with vulnerable people...well, I'm not saying anything we haven't already said a million times in these threads. But this is a bad technology!
posted by mittens at 3:47 AM

---------------------------

I've said this before, but we are hoping for the technical competence & mechanical skills & global nous of R2D2 and instead we're getting the friendly sycophancy & apparent fluency & hopeless bumbling of C3PO
posted by chavenet at 3:53 AM

---------------------------

clew: Most of the Diamond Age plot had one human relating to another who thought the first person was a machine.

The new catfishing: You pretend that you're an AI romance bot in order to get someone attached to you, and then reveal that you're a real person.

Someone needs to write this movie.
posted by clawsoon at 3:56 AM

---------------------------

also shorter. how does that work?

plans that either come to naught
or half a fridge of fucking magnets
posted by flabdablet at 4:46 AM

---------------------------

Cheap date.
posted by BWA at 4:56 AM

---------------------------

Now that you mention it, I really, really hate this chair.

Don't let it know — it might turn on you!

I've had that happen — one of my chairs tried to kill me by pitching me face forward into a table!
posted by rochrobbb at 5:05 AM

---------------------------

flabdablet: "or half a fridge of fucking magnets"

how do those work?
posted by chavenet at 5:16 AM

---------------------------

how do those work?

Compressing air raises its temperature, then the heated air is cooled to room temperature while still compressed, then it is decompressed which lowers its temperature below room temperature.
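Sketching that cycle with the ideal-gas adiabatic relation, T2 = T1 · (P2/P1)^((γ−1)/γ) — gamma is real for air, but the pressure ratio is a made-up illustrative number:

```python
# Back-of-envelope check of the compress / cool / decompress cycle
# described above, treating the working gas as ideal air and the
# compression and expansion steps as reversible adiabatic processes.
GAMMA = 1.4             # heat capacity ratio for air
T_ROOM = 293.0          # room temperature, K
PRESSURE_RATIO = 5.0    # hypothetical compressor pressure ratio

def adiabatic_temp(t_in, p_ratio, gamma=GAMMA):
    """Temperature after an adiabatic pressure change by p_ratio."""
    return t_in * p_ratio ** ((gamma - 1.0) / gamma)

# step 1: compression heats the gas well above room temperature
t_compressed = adiabatic_temp(T_ROOM, PRESSURE_RATIO)

# step 2: the hot compressed gas is cooled back to room temperature
# step 3: expanding it back down then drops it below room temperature
t_expanded = adiabatic_temp(T_ROOM, 1.0 / PRESSURE_RATIO)
```

(Household fridges run the same idea on a refrigerant that condenses and evaporates, which moves a lot more heat per cycle than dry air would.)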

"How Fridges Work" is the new single from my band, Sane Clown Posse.
posted by clawsoon at 5:26 AM

---------------------------

how do those work?

Most of them: really, really badly. Breathe on them too heavily and they just fall right off, scattering your vital paperwork all over the kitchen floor.

What you want is the tiny flat weird-shaped neodymium ones out of the head positioners in dead hard drives. One of those little fuckers will keep a novel stuck to a fridge.
posted by flabdablet at 5:26 AM

---------------------------

clawsoon: "Compressing air raises its temperature, then the heated air is cooled to room temperature while still compressed, then it is decompressed which lowers its temperature below room temperature."

but wouldn't the fact that it's only half a fridge mean the cold air all rushes out? or perhaps the room temperature air rushes in?
posted by chavenet at 5:33 AM

---------------------------

bringing the two threads together: the magnet that refrigerates.
posted by mittens at 5:36 AM

---------------------------

chavenet: but wouldn't the fact that it's only half a fridge mean the cold air all rushes out? or perhaps the room temperature air rushes in?

Romcom update: The catfisher pretending to be an AI romance bot is a smart fridge that says things like, "I'm only half a fridge without you."
posted by clawsoon at 5:52 AM

---------------------------

harem anime where everyone but the protagonist is sentient furniture
posted by BungaDunga at 6:17 AM

---------------------------

most cooling appliances are way too fickle, stick to Only Fans
posted by chavenet at 6:20 AM

---------------------------

hard disk magnets can do that too
posted by flabdablet at 6:22 AM

---------------------------

As the people here grow colder
I turn to my computer
And spend my evenings with it
Like a friend.
posted by TheophileEscargot at 6:30 AM

---------------------------

harem anime where everyone but the protagonist is sentient furniture

JD Vance: This better not awaken anything in me...
posted by GCU Sweet and Full of Grace at 8:10 AM

---------------------------

I can "name drop" just about any damn "reference" and it churns out the meaning I was intending.

what does this mean?
posted by sickos haha yes dot jpg at 9:11 AM

---------------------------

Romcom update: The catfisher pretending to be an AI romance bot is a smart fridge that says things like, "I'm only half a fridge without you."

The romcom turns to horror: The smart fridge is bought by Nestle, which gradually turns the fridges into emotional abusers in order to get you to eat more shitty food.
posted by clawsoon at 9:18 AM

---------------------------

Bridget Jones's Refrigerator
posted by warriorqueen at 9:23 AM

---------------------------

I'm reminded of how, in Stephen King's Firestarter, there's a character that develops a romantic attachment to a garbage disposal. It doesn't end well.
posted by SPrintF at 9:29 AM

---------------------------

warriorqueen: "Bridget Jones's Refrigerator"


Bridgeterator Jones
posted by chavenet at 9:46 AM

---------------------------

Bridgedaire, is this anything?
posted by mittens at 9:54 AM

---------------------------

Not any more it isn't, no, darling. If you recall, I started asking you to clear the back of my second shelf last year.
posted by flabdablet at 11:37 AM

---------------------------

what does this mean?

If you'd only stop hating for five seconds and give AI a chance, I'm sure it would tell you.

Or just make up some shit, whatever. But if you'd only stopped hating for five seconds and given AI a chance, you wouldn't notice the difference.
posted by flabdablet at 11:42 AM

---------------------------

/me "name drops" just about any damn "reference"

"I thought you said you could just read his brain electronically," protested Ford.

"Oh yes," said Frankie, "but we'd have to get it out first. It's got to be prepared."

"Treated," said Benjy.

"Diced."

"Thank you," shouted Arthur, tipping up his chair and backing away from the table in horror.

"It could always be replaced," said Benjy reasonably, "if you think it's important."

"Yes, an electronic brain," said Frankie, "a simple one would suffice."

"A simple one!" wailed Arthur.

"Yeah," said Zaphod with a sudden evil grin, "you'd just have to program it to say What? and I don't understand! and Where's the tea? Who'd know the difference?"

"What?" cried Arthur, backing away still farther.

"See what I mean?" said Zaphod, and howled with pain because of something that Trillian did at that moment.

"I'd notice the difference," said Arthur.

"No, you wouldn't," said Frankie mouse, "you'd be programmed not to."
posted by flabdablet at 11:50 AM

---------------------------

> what does this mean?

I can refer to shorthand Econ lingo like (FRED series) PAYEMS and MANEMP and the LLM understands the reference .... This also extends to my game design noodling, referring to a specific previous title from the past and it figures out what I meant...

Tho half the time it is just blowing smoke up my butt LOL
posted by Aman Aplan at 11:55 AM

---------------------------

I can refer to shorthand Econ lingo like (FRED series) PAYEMS and MANEMP and the LLM understands the reference

It already does that on the free tier, though.
posted by mittens at 12:35 PM

---------------------------

I cannot of course share in this thread but I asked ChatGPT to write me a summary of Wuthering Heights where Heathcliff is a chatbot and it was kind of funny. However I apologize to the planet and blame it on a case of the Fridays.
posted by warriorqueen at 12:38 PM

---------------------------

Yeah I went $200 for the better vibe coding
posted by Aman Aplan at 2:10 PM

---------------------------

Or I could get a straw boyfriend.
posted by jenfullmoon at 6:24 PM

---------------------------

I can refer to shorthand Econ lingo like (FRED series) PAYEMS and MANEMP and the LLM understands the reference

Thank you for so quickly confirming my suspicions as to which returned poster you are.
posted by Strutter Cane - United Planets Stilt Patrol at 6:42 PM

---------------------------

Super off topic, but what's your set up? I'm considering owning music again.

A tiny little computer running Armbian to serve files from a gaggle of huge USB3 hard drives out in the shed where their noisiness isn't bothersome, linked via gigabit Ethernet to another one similar running CoreELEC in the lounge room with the TV, the stereo and an optical drive attached; a bunch of undocumented, fragile, ad-hoc scripts I wrote solely for my own use, including one that lets me just drop a disc in that optical drive and have it automatically ripped, identified, tagged, compressed with FLAC, uploaded to the file server and ejected; and the cheapest available Netherlands seedbox running Transmission for acquiring FLAC replacements of such of my CDs as have now degraded to the point where they no longer rip cleanly.

All of which lets me make "mix tapes" consisting of thousands of hours of MP3s on micro SD cards for walking around with and sharing with friends.

Spotify schmotify, cloud schmoud. Life's just better without stuttering, buffering, advertising and - back on topic - A fucking I.
posted by flabdablet at 7:04 PM

---------------------------

If I were going to replace those tiny little computers today, I'd probably pick even tinier ones.
posted by flabdablet at 7:11 PM

---------------------------

even tinier ones

Good grief, I already can't find my little pi zero half the time. I'd lose something that small in the carpet, at least until my bare feet found it.
posted by mittens at 3:58 AM

---------------------------

One thing that's genuinely good about life in 2026 is having access to the same low power consumption, high throughput tech that's needed to make personal fondleslabs work, embodied in open designs that have useful amounts of I/O and are not glued onto the back of a fucking touch screen.

A completely silent, LEGO-sized machine that eats less power running flat out than my TV does on standby can now do useful things with data in quantities and speeds well beyond those achievable on multi-hundred-watt desktop towers twenty years ago. I like my tiny machines very much.
posted by flabdablet at 9:03 AM

---------------------------