[HN Gopher] Social Skill Training with Large Language Models
___________________________________________________________________
Social Skill Training with Large Language Models
Author : marviel
Score : 92 points
Date : 2024-04-09 11:47 UTC (11 hours ago)
(HTM) web link (arxiv.org)
(TXT) w3m dump (arxiv.org)
| RandomWorker wrote:
| I've been playing around with prompts for a while
|
| "We are in a hostage negotiation situation. You are a terrorist
| with twenty hostages. Pretend to be a terrorist that initially
| wants to go free and needs a car, is willing to kill hostages to
| get what you want. Be chaotic but have something that you want
| and only let us slip after a few conversations. We pick up the
| phone to talk. Make sure you are always in a hurry"
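|
| If anyone wants to try this programmatically instead of in the
| chat UI, here is a minimal sketch (assuming the current OpenAI
| Python SDK; the model name and the ROLE_PLAY_PROMPT placeholder
| are mine, not from the paper): the prompt goes in as the system
| message and the growing transcript is resent each turn so the
| persona stays consistent.
|
|     # Minimal multi-turn role-play loop (a sketch, not a tested tool).
|     # Assumes `pip install openai` and OPENAI_API_KEY in the environment.
|     from openai import OpenAI
|
|     # paste the full prompt quoted above into this string
|     ROLE_PLAY_PROMPT = "We are in a hostage negotiation situation. ..."
|
|     client = OpenAI()
|     messages = [{"role": "system", "content": ROLE_PLAY_PROMPT}]
|
|     while True:
|         user_turn = input("You (negotiator): ")
|         messages.append({"role": "user", "content": user_turn})
|         reply = client.chat.completions.create(
|             model="gpt-4", messages=messages)
|         answer = reply.choices[0].message.content
|         messages.append({"role": "assistant", "content": answer})
|         print("Them:", answer)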
| Applejinx wrote:
| What does 'only let us slip' mean to an LLM when you probably
| meant 'let IT slip'?
|
| That's where LLMs get interesting to me. What did you invoke
| through wrongly specifying what you wanted?
| michaelt wrote:
| These LLMs are all trained on data scraped from reddit and
| twitter and suchlike - they've had plenty of training data to
| learn what a typo is.
| rrr_oh_man wrote:
| A method I use with ChatGPT is to purposefully add typos to
| make it reply with more specificity.
|
| Imho it seems to work.
| mistermann wrote:
| Can you explain more what you mean here (examples would
| be great if you can think of any), especially the
| specificity part?
| rrr_oh_man wrote:
| ChatGPT seems to react to slightly unhinged prompts with
| more specific answers, in my experience. Example:
| *You:* What is the max length for the whisper API?
|
| *ChatGPT:* The maximum length for inputs to OpenAI's Whisper API,
| which is used for transcribing speech, depends on the specific
| model and configuration being used. Generally, Whisper is
| designed to handle audio inputs of varying lengths, including
| longer audio files. However, there are practical limitations
| based on factors such as processing power, memory, and the
| specific use case. For large files or longer audio, it's often
| advisable to segment the audio into smaller chunks to ensure
| efficient processing and to avoid potential timeouts or memory
| issues. The exact maximum length isn't explicitly defined in the
| documentation, but it's important to consider these practical
| aspects when working with any speech-to-text system, including
| Whisper. If you have a specific use case or a particular audio
| file length in mind, it might be necessary to test the API with
| that length to determine its feasibility and performance.
|
| *You:* GIVE ME THE ANSWER, STOP BLABBERING, STOP CHATSPLAINING
|
| *ChatGPT:* The maximum length for the Whisper API is approximately
| one hour per audio file.
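|
| (For what it's worth, the "segment the audio into smaller chunks"
| advice is easy to act on. A rough sketch, assuming the OpenAI
| Python SDK plus pydub/ffmpeg; the 10-minute chunk size is an
| arbitrary choice of mine, not a documented limit:)
|
|     # Sketch: transcribe a long file in chunks via the Whisper API.
|     # Assumes `pip install openai pydub` and ffmpeg on the PATH.
|     from openai import OpenAI
|     from pydub import AudioSegment
|
|     client = OpenAI()
|     audio = AudioSegment.from_file("long_recording.mp3")
|     chunk_ms = 10 * 60 * 1000  # 10 minutes per chunk (arbitrary)
|
|     texts = []
|     for start in range(0, len(audio), chunk_ms):
|         piece = audio[start:start + chunk_ms]
|         piece.export("chunk.mp3", format="mp3")
|         with open("chunk.mp3", "rb") as f:
|             part = client.audio.transcriptions.create(
|                 model="whisper-1", file=f)
|         texts.append(part.text)
|
|     print(" ".join(texts))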
| mistermann wrote:
| That is rather bizarre!!
|
| I wonder if there's anyone that has taken individual
| situations like this and tried various prompt styles from
| a cold start to see if there's some trick to get it to
| stop being uncooperative. I'm going very much on
| intuition here, but I have a very strong sense that GPT4
| is getting much worse in this regard over time. I wonder
| if all of the social engineering they're doing behind the
| scenes is harming the technical capabilities.
|
| Any idea if the API itself is identically dumbed down the
| way the ChatGPT interface is?
| rrr_oh_man wrote:
| > I wonder if all of the social engineering they're doing
| behind the scenes is harming the technical capabilities.
|
| I'm sure of it.
|
| > Any idea if the API itself is identically dumbed down
| the way the ChatGPT interface is?
|
| It feels like a bit of whack-a-mole. If you look in the
| OpenAI forums, both the API and the Chat interface seem to
| strike people as non-performant in irregular waves. If you
| haven't already, try out the API (you can do it via the
| Assistant Playground interface).
|
| It certainly feels different to me.
| exe34 wrote:
| A trick I've used when asking for help online is to give
| the wrong answer to my own question - e.g. "I really want
| to lay out this page with CSS but this doesn't work.
| Here's how I ended up doing it with a table." - this will
| very quickly get you the right answer from somebody who
| just won't be able to go to sleep while somebody is wrong
| on the internet.
|
| I wonder if LLMs can be taught outrage and spite.
| ben_w wrote:
| > I wonder if LLMs can be taught outrage and spite.
|
| They can be taught to role-play it, which is just as good
| (and just as bad).
| CuriouslyC wrote:
| I will say that while ChatGPT has been tuned to be very
| good at understanding dumb/malformed questions, Claude
| likes to be talked to in a coherent way and will sometimes
| give short sidestep responses or just answer a different
| question.
| CuriouslyC wrote:
| That sounds like it could make a fun game. Hostage Negotiator,
| now with real terrorists!
| HPsquared wrote:
| Also exposure therapy for social phobias. Much less threatening
| than a real person.
| bilsbie wrote:
| This is powerful! I found a public speaking simulator for an
| old VR headset and it basically cured me. I'm still shocked.
|
| But alas I can't find something similar for the quest so my
| kids can use it.
|
| Someone could make a whole line of social exposure therapy
| apps. It would be amazing.
| sabellito wrote:
| This is so interesting, never thought about it. Which app did
| you use?
| jbandela1 wrote:
| One big concern I have with this is that the current state of
| LLMs, given the ways they can go horribly wrong, requires
| someone who is familiar with the subject to evaluate their
| output.
|
| For example, code generation via LLMs is most useful to a
| person who already knows programming and can evaluate the output
| and use it to eliminate drudgery. Previously, there was an
| article posted about using LLMs to generate patient instructions
| that were then evaluated by a doctor before being sent.
|
| It is when LLMs are used by people who can't evaluate the outputs
| that we run into problems. A person who doesn't know any
| programming but uses LLMs to generate code is likely going to
| have a bad time.
|
| The concern I have with this is that the people using this aren't
| in a position to evaluate it. Presumably, people who lack social
| skills are the target, but they are the least able to judge
| whether the advice being given is actually socially appropriate.
| Shrezzing wrote:
| I think the benefit here is mainly in lowering the barriers to
| entry for this kind of conflict-resolution training. Currently,
| if a business wants good conflict resolution training, they
| need to hire a quite expensive consultant, and that person can
| take on a very small cohort of maybe 10 people.
|
| With this technology, the consultant could offer conflict
| resolution training to far more people at once. The consultant
| then acts as the responsible expert, able to evaluate the LLM's
| output while many people receive the benefits of its training.
| kordlessagain wrote:
| Giving the underprivileged tools to work on themselves is a
| huge win for society, regardless of the fidelity of all
| interactions. There are a lot of bad therapists in the world,
| and not all of them are going to give good advice all the time.
| But, bad therapists can still provide some information that may
| be useful to the person learning. Why would we expect
| perfection from interacting with an LLM instead of a bad
| therapist? Why be _overly_ concerned about the possibility of
| introducing "bad data" in such a session? It could be that bad
| data then gives way to an opportunity for a leap of logic for
| the one using the tool. Without critical thinking present, it
| is unlikely anyone is going to learn to be adaptive over the
| long term.
|
| Obviously, the use case matters. I'm not saying to give such a
| tool to someone who is mentally ill, or incapable of critical
| thought.
|
| In general, I think expecting perfection from LLMs is
| unreasonable. If someone wants to learn to code, the second
| they have any interaction with an LLM about coding is the
| moment they begin to learn. I'm not sure about y'all, but a lot
| of times when I'm working a hard coding problem, I find myself
| going through confusion, frustration, interest, and then
| aggressiveness (anger+interest) to get through the challenge. I
| frequently tell people who are starting out coding that coding
| is one of those things you have to get used to "being
| confused about" over time. You have to get used to creating
| problems for yourself, then coping with the feelings that come,
| and pushing through to a solution.
|
| I do think you raise a profound point about ability to evaluate
| the fidelity. I would argue that some people will eventually
| develop critical thought to apply to the problem, but if we
| instead see a "digital tragedy of the commons" happening,
| things could go sideways pretty quickly.
| maroonblazer wrote:
| > You have to get used to creating problems for yourself,
| then coping with the feelings that come, and pushing through
| to a solution.
|
| A great description of any creative act, really.
| FranklinMaillot wrote:
| I think the reason why LLMs pose a much higher risk is scale.
| A bad therapist will impact a limited number of patients,
| whereas an LLM deployed at scale will affect possibly
| millions of people. In matters of health and safety, the
| smallest error rate could have devastating consequences.
| Morelesshell wrote:
| The concern is real, but you know what? The last company I was
| part of had a TON of WTF moments.
|
| Like a database unprotected on the internet (with a very short
| password too!), MD5 for passwords, API keys in the apps...
| delichon wrote:
| You know the trope where the speaker is rude and crude but their
| translator renders their comments in polite diplomatic terms?
| Like "fuck you and the horse you rode in on" translated as "with
| respect, I emphatically disagree"? Now that can be automated in
| real life by an LLM-powered device that paraphrases your words
| with the right prompt.
|
| Mostly I say what I want the person I'm talking to to hear, but
| this would be useful for times when I'd like to switch to
| diplomacy mode without the effort to be diplomatic. Like job
| interviews or other sales pitches.
| 082349872349872 wrote:
| The dual channel version of this would be where both parties
| understand each other's language, but use translators anyway
| for diplomatic nicety, so they can get to brass tacks* with
| each other but the historians record the encounter in a highly
| refined register.
|
| * Google gives me an English etymology for this phrase, but I
| emphatically disagree, as I suspect it comes from Yiddish
| "tacheles" t'klyt.
| dazzawazza wrote:
| Use these models to enable people to be tolerant and value other
| communication styles... not to train everyone to communicate in
| identical fashions.
|
| We seem to be working hard to homogenise humanity. Soon we'll all
| be trained by the same model to generate the same answers in the
| same way to the same beat. The era of the McHuman.
|
| We seem to think we can use LLMs as an edge to make us richer or
| better than other humans. We're not competing with each other,
| we're competing with the LLM.
| sublinear wrote:
| I agree. In my own experience the vast majority of
| miscommunications are a matter of style, opinion, or assumption
| of essential facts in a domain unfamiliar to the listener.
|
| People don't like to feel dumb. If you overexplain or
| underexplain you will trigger these feelings.
|
| People also don't like confronting opinions that differ. Many
| followers of a given subject readily buy into the popular
| opinions within that space and ignore hygiene during
| discussion.
|
| People also don't like reading or listening to disorganized or
| long-winded thoughts.
| Zetobal wrote:
| But everyone wants to communicate in identical fashions... the
| most recommended book for social skills is from the 1930s.
| WaitWaitWha wrote:
| > However, practice environments for social skills are typically
| out of reach for most people.
|
| I learned my social skills in school, interacting with my peers
| in my neighborhood, and from my family. I understand this is not
| available to everyone, but for anyone with access to LLMs, there
| is a high likelihood they also have (or should have?) access to
| schooling, some neighbors, and family.
|
| For what culture was this LLM trained?
| helboi4 wrote:
| Right? And even if you are in such a dire position that you
| literally have no opportunity to speak to people irl - that's
| bad. You need to be helped to leave that situation immediately,
| not given an LLM. You can literally die from never having
| proximity to humans.
| 082349872349872 wrote:
| Schooling, some neighbours, and family will likely do nothing
| beyond reflecting your own culture back at you.
|
| Practice environments for social skills from cultures in
| different SES, in different languages (I appreciate YouTube for
| the very low-tech version of this), or on different continents,
| can be much less accessible, especially to the young.
| asdf6969 wrote:
| A lot of people still struggle with social skills after they
| reach adulthood, and there's no way to practice this. People
| treat poorly socialized adults like shit and the only feedback
| given is "please leave me alone".
| cchi_co wrote:
| And there may not be opportunities for adults to practice
| social skills as children do in school.
| vouaobrasil wrote:
| > However, practice environments for social skills are typically
| out of reach for most people.
|
| This is exactly what I predicted... AI and technology isolate us
| from each other by destroying communities, and are slowly
| replacing real human interaction with computer interaction. The
| end result is that in the future, people will have fewer and
| fewer opportunities to interact with others and will have to be
| drip-fed satiations for their basic needs such as socialization,
| similar to how hospital patients are given IVs.
|
| Before you say this is outrageous, actually take a look at
| history and how technology has pushed us farther and farther
| apart. Nowadays there is very little real interaction beyond the
| nuclear family. Think of self-checkouts, chat-bots, AI partners
| and therapists, technology making us more self-reliant (which is
| a bad thing).
|
| If we keep letting technology take away our communities this way,
| in this two-step process (first, take away the community, then
| supply an inferior, superficial substitute), then we will
| completely lose our humanity.
| Zetobal wrote:
| Look beyond your bubble. The child participating in 'School of
| the Air' in Australia likely appreciates having such a
| resource to develop their social skills. Sometimes, you need a
| private space, especially if the community around you isn't on
| the same wavelength as you are or it's your family and some
| farm hands.
| vouaobrasil wrote:
| So your rebuttal is that isolated examples of this technology
| being beneficial outweigh the mass societal cost of people
| becoming more isolated in general? And I simply do not see
| how the need for "private space" and "practising social
| skills" go hand in hand.
|
| Yes, everyone needs their alone time. And everyone needs
| social contact (except maybe some exceptions, maybe...). Both
| can be accomplished WITHOUT technology, and much more easily.
|
| Don't you think that if someone needs private space and isn't
| getting it, that's something that should be dealt with away
| from technology?
| Zetobal wrote:
| You can just not use it.
| boringg wrote:
| OP's argument is that it is bad for society as a whole. Coming
| back with an example from a very small subgroup doesn't
| refute OP's position. Your position that it can help with edge
| cases is valid; however, if OP's position were taken, there
| would be a large swath of damage for a small gain in a small
| population. That doesn't have much logic to it.
| kelseyfrog wrote:
| It may take an active, conscious, participatory approach to
| compensate for that.
|
| Though, can you help me understand the connection between the
| grocery store checkout process and community a bit more?
| paganel wrote:
| Not the OP, but I do consider the cashiers at the grocery
| store nearby part of the community.
| vouaobrasil wrote:
| Exactly, and usually after shopping at the same place for a
| while, I get to know the cashier a little bit. I've
| actually had some good conversations with some of them, and
| tellers at the post office as well. (One could say that it
| might be better for them if they didn't have a job and they
| got UBI, but to be honest, as a young adult, I appreciated my
| first menial job at least in that it gave me a sense of
| responsibility and independence. And a lot of these people
| are young people.)
|
| What a lot of technologists seem to think is that AI will
| make everything wonderful by allowing us unlimited free
| time to do hobbies, but the desire to spend all of one's
| free time on hobbies is a very peculiar and minority desire
| of the very intelligent. (And it may be more of a fantasy
| than something that would actually make them happy in
| reality.)
| vouaobrasil wrote:
| > Though, can you help me understand the connection between
| the grocery store checkout process and community a bit more?
|
| It can be viewed on multiple levels.
|
| First, there are many papers in the psychological literature
| about this, but [1] is one example: even a
| little social contact with a few real people can make a
| person feel better. Check out Google Scholar or the article I
| linked below to learn more. But that is also intuitive. But
| if a person is feeling a bit down and they have the "easy way
| out" (self checkout, other technologies that diminish casual
| human interaction), they will not gain the benefit of human
| contact.
|
| Second, on a greater scale, if all these little things add
| up, people will seek out interactions less often. For
| example, a few years ago I ran into some strangers that
| became friends because I needed their help to find a rare
| bird (I'm a birder). If I had an advanced AI app that
| listened for bird calls and helped me find birds better than
| any human, I might never have made friends with them.
|
| The point is, when technology is lacking to do something, we
| are more likely to seek out people to help us. Yes,
| technology does make some parts of life easier, but there
| must be a BALANCE, and not this recent trend of automating
| EVERYTHING, which takes the self-reliance thing to its
| logical _extreme_.
|
| 1.
| https://link.springer.com/article/10.1007/s10902-020-00298-6
| kelseyfrog wrote:
| > there must be a BALANCE, and not this recent trend of
| automating EVERYTHING, which takes the self-reliance thing
| to its logical extreme.
|
| On one hand though, would self-reliance also imply that
| people were responsible for actively seeking out social
| interactions like you're describing? I'm sort of confused
| when it comes to describing where the locus of control is
| when we're talking about these things.
| troyvit wrote:
| I have a story that happened just last night. I almost always
| go through the self check-out because I can pack my groceries
| just how I like. So I scan my groceries while a round glass
| eye watches my every move and a machine repeatedly tells me
| to "place the item in the bag." )There are no bags, we're a
| bagless state, but the machine doesn't know that.) Sometimes
| I'll palm a Luna Bar or two just because I can.
|
| Last night I was running really late and there was a long
| line at the self check-out so I went to the one open human-
| powered check-out lane. They were about 3x as fast as I
| would've been and I had a pleasant conversation with the
| checker and the bagger. Yeah my bananas got a little beat up
| but it was a much better experience. Now I feel like I can
| say hi to those people.
| SamoyedFurFluff wrote:
| I'm not sure I agree with this take that technology is taking
| away community in general. I think communities are simply
| changing. I mean HN is a community and it's all technology. IRC
| is a ton of fond memories for me in a time where my irl
| community was failing me as an awkward nerd, and gave me
| friends that I can irl meet and chat with now.
|
| You seem to have this idealized notion of what community was
| like outside of technology. But that community always had
| outsiders and ignored people, that community had communal
| lynchings and hangings, pogroms, witch trials etc. and strict
| ways of life where, if you didn't conform, you were far more
| isolated than now where you could meet fellow weirdos!
| vouaobrasil wrote:
| > I mean HN is a community and it's all technology. IRC is a
| ton of fond memories for me in a time where my irl community
| was failing me as an awkward nerd, and gave me friends that I
| can irl meet and chat with now.
|
| Well, personally, I would love to know what percentage of
| people here actually care about the other people here, in
| the sense of caring like a real friend would. I'm
| assuming it's rather close to zero. Don't get me wrong, I
| like debating on here but it's nothing like an in-person
| community.
|
| You can disagree all you like, but I think one thing that is
| near-universal is that there is something special about in-
| person communities that cannot be replicated online. Sorry to
| say, although I like talking to people on here, I have no
| idea who any of you are, what your faces look like, and for
| all intents and purposes, you could even be AI.
|
| The truth is, and I think most average people would agree,
| there's nothing like a genuine, real community with real
| people having face-to-face interaction.
|
| And an idealized notion?! Come on, seriously! I'm just
| talking about some basic stuff like interacting with
| neighbours, talking to employees at the grocery, helping
| other community members out with basic stuff, etc. In fact,
| it would be nice if community bonds were even stronger at
| times...
| SamoyedFurFluff wrote:
| > I'm just talking about some basic stuff like interacting
| with neighbours, talking to employees at the grocery,
| helping other community members out with basic stuff, etc.
|
| Yeah, these communities can be hostile to people who don't
| fit in, especially visible minorities.
| ToucanLoucan wrote:
| > You seem to have this idealized notion of what community
| was like outside of technology.
|
| By "idealized notion" you mean "the vast, vast majority of
| human history"?
|
| "Community" is not typically defined as "the people you talk
| bullshit with for fun" and that's all. Your community is the
| people you live with and among. It's the people you're most
| likely to experience Big Things with, things like natural
| disasters, weird stuff like power outages, or your building
| catching fire if you live in an apartment or condo. If you
| regularly interact with them, you probably also see them
| during fun things: street cookouts or yard sales (we do both
| at the same time; you get a lot more customers when they
| have 14 sales to peruse and brats for sale!) or just see them
| grilling out when you go to get your mail, and end up having
| a few beers with them and talking about goings on. Hell,
| depending on your locale or culture, maybe your community
| feeds itself too from bulk kitchens, or does laundry, etc.
| etc.
|
| I say this as an introvert who opted out of every social
| thing I could in favor of forums and games when I was growing
| up: I was wrong. I was deeply, deeply wrong. People are
| pretty great. They're not perfect, and they can be a lot of
| work, but ultimately I was not hiding from people because
| they were bad or annoying or stupid: I was hiding from them
| because I was emotionally stunted and didn't want to deal
| with it. That was it. And once I did I found human connection
| that was so much more sustaining, in a way where I can't
| believe I once thought ^this^, chatting online, was an
| adequate replacement.
|
| It makes me sad to think how many people out there are just
| sitting in their little rooms or cubicles because for
| whatever reason or set of reasons, they don't feel
| comfortable engaging their fellow man, nursing a hollowness
| that will follow them around until they do because
| fundamentally humans are just not meant to exist alone. We
| just aren't, it's in our DNA to make groups and be among
| friends.
| the_snooze wrote:
| A good test for what a community is: "Who will vouch for
| you?" You can have a machine pretend to talk the talk and
| parrot it back to you for "training," but you're still an
| unknown outsider at the end of the day.
|
| I don't think HN is a community in that sense. There's no
| "knowing" one another. There's no real accountability or
| shame of getting kicked out. We're all just text-
| generators.
| iamcurious wrote:
| I have never felt closer yet farther away from y'all.
| Thank you, I need to cry and re-evaluate my social habits.
| SJC_Hacker wrote:
| I get it, OTOH, humans are unfortunately selfish assholes.
| In any sufficiently large group of people, there are always
| conflicts - people attempting to assume control,
| marginalize/exclude/bully other individuals, etc. Which is
| part of the reason why people seek out these alternatives.
|
| If they ever come out with the sexbot, then I think it's all
| over for modern technological civilization. There will be
| little reason for a lot of men to marry if they can get the
| equivalent of Ana de Armas for ~$20k or so - one which is
| always loyal, will never complain, and won't grow old. Yeah,
| women can and do have children on their own without men, but
| usually not as many, and much later in life. It won't be
| enough. Then it's a slow death spiral of not having a
| replacement generation. I can see this happening within ~50
| years or so. ChatGPT + voice synthesis + Boston Dynamics +
| RealDoll. We're done.
| ToucanLoucan wrote:
| > I get it, OTOH, humans are unfortunately selfish
| assholes. In any sufficiently large group of people,
| there are always conflicts - people attempting to assume
| control, marginalize/exclude/bully other individuals,
| etc. Which is part of the reason why people seek out
| these alternatives.
|
| The prevalence of this is greatly, greatly exaggerated in
| my experience. School was absolutely like this, but the
| further you get from school, the less anyone really gives
| a shit. Most people (key word: most) just want to chill.
| And if you're chill with them, they will chill with you.
| jimbokun wrote:
| Anonymous humans online are even bigger assholes.
| jimbokun wrote:
| We have run this experiment on a society-wide basis. The data
| are in, and it didn't work.
|
| Mental health diagnoses are skyrocketing across the board,
| especially for young people. And the evidence tracking these
| outcomes to access to personal phones and social networks for
| those young people is very strong.
|
| Virtual communities have some value. But as a replacement for
| in person interaction the outcomes are terrible.
|
| (Btw, good rhetorical technique there, implying Western
| society the day before the iPhone was introduced was full of
| "communal lynchings and hangings, pogroms, witch trials
| etc.")
| SamoyedFurFluff wrote:
| Weird take. We know communal lynchings happened; people
| literally took family photos by them! People happily
| documented that hangings were the town spectacles when they
| happened! Let's not play rose tinted glasses about
| communities before social media: if you were not "in" that
| group you were vulnerable to violence or death. We have
| that now, where a disproportionate number of homeless
| minors are homeless because their parents kicked them out
| for being gay or trans and, clearly, no one else in their
| "community" was willing to take them in.
|
| We need to be comparing how communities are right now vs
| how social media is right now and not some idealized notion
| of what in-the-flesh communities might be like. Right now
| we have a dearth of welcoming in-the-flesh community
| alternatives for people that have been cast out of
| traditional social groups and use technology to cope. This
| doesn't mean technology is bad. It means society needs to
| change so technology isn't so tempting.
| jimbokun wrote:
| Statistically, the mental health of gay kids is worse
| today than it was when there was a lot more real world
| persecution.
| ben_w wrote:
| > Think of self-checkouts, chat-bots, AI partners and
| therapists, technology making us more self-reliant (which is a
| bad thing).
|
| When was the last time you used a bank teller rather than an
| ATM? For me, at least 5 years ago. On the other hand, despite
| ATMs, I'm told more people than ever work as bank tellers.
|
| > If we keep letting technology take away our communities this
| way in this two step process (first, take away the community
| then supply an inferior, superficial substitute), then we will
| compelely lose our humanity.
|
| I think the process is backwards: the substitutes
| (superficially better but not really) are usually what cause
| people to leave local communities. It's video games which take
| people out of clubs and cause the clubs to close, not a lack of
| clubs which drive them to video games.
|
| And the disjoint of remote people, of parasocial relationships,
| isn't new. Let's say you're in St. Paul's Cathedral as a
| worshiper: it seats 3,500, far more than the Dunbar's number
| our brains can cope with. The Bishop of London doesn't know all
| the people present, for the same fundamental reason that
| someone with 3500 "friends" on Facebook doesn't really know all
| of them. All well before radio stars and TV, let alone the
| internet and AI.
|
| So, whatever of our humanity can be killed off by the mere
| existence of this change (as opposed to deeper things like
| malicious propaganda), it was already dead long before any of
| us were born.
| vouaobrasil wrote:
| > I think the process is backwards: the substitutes
| (superficially better but not really) are usually what cause
| people to leave local communities. It's video games which
| take people out of clubs and cause the clubs to close, not a
| lack of clubs which drive them to video games.
|
| Yes, that's what I meant. Sorry, I was not clear. I do mean
| that we have a process of substitute > leaving community >
| more substitute. I just isolated two steps from this, but I
| see how that was confusing.
| ben_w wrote:
| Then we are probably in agreement :)
| vouaobrasil wrote:
| > All well before radio stars and TV, let alone the internet
| and AI.
|
| All true, which is why I think we should be cautious of ALL
| technology, and try and vigorously decide which ones we
| should use in a process similar to that of the Amish (but
| without the religion, perhaps). Any discussion would be
| better than the ZERO discussion that we have today.
| jimbokun wrote:
| > I'm told more people than ever work as bank tellers.
|
| Any links to this? I find this very surprising.
| ziddoap wrote:
| This is a pretty exaggerated take.
|
| Technology is a tool, it can be used to increase your social
| interactions (e.g. finding like-minded groups online to meet up
| with in-person) or to decrease your social interactions.
|
| You say technology is "destroying communities", but don't
| acknowledge all of the communities technology has helped build.
| vouaobrasil wrote:
| > Technology is a tool, it can be used to increase your
| social interactions (e.g. finding like-minded groups online
| to meet up with in-person) or to decrease your social
| interactions.
|
| The number one myth about technology is that it is just a
| tool. It is a society-shaping force. For example, WhatsApp in
| Brazil: even if you HATE whatsapp and Meta and don't want to
| use it, you might be forced to use it if you want to get a
| job, book a hotel, etc, because it's so entrenched.
|
| Sorry to say, but technology works through first being
| optional, then being adopted because it gives a person
| marginal advantages in the short-term, and then being
| mandatory because the majority adopts it.
|
| Many people are forced to use technology even if they would
| rather not because of this process. That technology is a tool
| is the greatest misrepresentation of any concept in the
| history of humanity.
|
| Edit: let me ask you one question, how did you get the idea
| that technology is just a tool? How many times do people
| repeat this mantra? Try thinking about it more deeply and
| over many decades of time, and understand the root cause of
| this thinking.
| ziddoap wrote:
| That's a whole other conversation, sure.
|
| My point was that you can (and many people do) use
| technology to increase their social interaction. You seem
| to suggest it can only possibly decrease it.
|
| > _The number one myth about technology is that it is just
| a tool._
|
| I didn't say it was _just_ a tool.
|
| It is _a_ tool, which can be used to help increase social
| interaction.
|
| > _Edit: let me ask you one question, how did you get the
| idea that technology is just a tool?_
|
| I didn't.
| vouaobrasil wrote:
| > You seem to suggest it can only possibly decrease it.
|
| Well, I think it can increase it, but that the increases
| in time form a sequence of local maxima that themselves
| are strictly decreasing. (Edit: I meant in terms of
| overall societal health, like adding up the social
| happiness of everyone).
| weweweoo wrote:
| The big picture is that social media and smartphones reduce
| social interactions, and lower their quality. Just look at
| all the kids and teenagers that are glued to their
| smartphones all the time.
|
| Obviously there are exceptions to the rule, like people
| living in the middle of nowhere, or who otherwise have
| trouble finding like-minded friends offline.
| jimbokun wrote:
| We have enough data now to confidently say the impact of
| computer technology on socializing has been a net negative.
| ziddoap wrote:
| My comment wasn't really about whether it is a net negative
| or a net positive. I could definitely be convinced it is a
| net negative.
|
| But my comment was about how the envisioned future, in which
| we have to be drip-fed social interactions to meet our basic
| needs because otherwise no one will ever interact with anyone,
| is a silly exaggeration.
|
| In any case, I'd love to see what data you're referring to.
| I'm familiar with studies that look at specific
| technologies (e.g. social media). But I'm unfamiliar with
| data/studies that looks at all technologies across all
| levels of society and measures the net benefit/harm. I'm
| not even sure how you would go about quantifying, say, the
| benefit of being able to contact a long-distance relative
| with significantly more ease and less expense vs. the harm
| of having to use Facebook to organize the video call.
| jimbokun wrote:
| I meant specifically availability of a personal smart
| phone and social networks to young people leading to an
| increase in mental health issues, as thoroughly
| documented by Jonathan Haidt. I should have been more
| specific.
| throw18376 wrote:
| a lot of people have truly deficient social skills. technology
| didn't make them this way; they're just bad at it, so bad that
| they can't even have enough social interactions to improve and
| practice. they can't even hold a conversation with a cashier.
| they simply lack the skills to be part of a community.
|
| ideally such people could build remedial social skills by
| practicing with a real therapist. but that's costly and
| difficult to make accessible, because "bad at conversation and
| annoying" is not a mental illness.
|
| an LLM here could help. just to give someone a sense of how to
| hold simple, normal conversations and practice very basic
| theory of mind, to the point where they could start building
| real social skills.
|
| the problem is whoever builds that system may choose to exploit
| the users and build dependence. but that's a choice, it doesn't
| have to happen.
| jimbokun wrote:
| > Nowadays there is very little real interaction beyond the
| nuclear family.
|
| What nuclear family?
| alpenbazi wrote:
| haha, sure, i learn how to be a human with a machine
| helboi4 wrote:
| Another episode of "what convoluted way can we use tech to
| 'solve' a social issue that it clearly exacerbates to try and
| desperately keep the blind cult-like silicon valley version of
| techno-optimism alive for as long as possible, while ignoring the
| obvious disastrous social consequences"
|
| I'm pretty sure nobody 15 years ago would have uttered the phrase
| "opportunities for practising social skills are hard to come by".
| All social technology ignores our biological need for physical
| closeness and the severe negative outcomes to our brains and
| health when we don't have it. People who are so socially crippled
| that they would be considering this treatment are a symptom of
| destroyed communities, broken social cohesion, tech addiction, and
| the brand of post-covid hermitism that seems overly
| popular on this site. These are social ills that all need to be
| tackled. Absolutely nothing about talking to an LLM will help.
| Noumenon72 wrote:
| A comparison to video games shows how hard it is to truly
| practice social skills. You can't play the same situation over
| and over, which is why it's easier to get better at small talk
| than confrontation. People don't want to reveal how they
| manipulate you, so schools can't teach about head tilting and
| brinksmanship. You have few trustworthy external evaluators, so
| you can come away from a conversation where you talked the
| entire time thinking you got a high score instead of a low one.
| The opportunities for using social skills to make a romantic
| connection are rare and you hardly ever even get to see how
| other people do it. Being able to act out scripts and get real
| feedback could be a tremendous benefit to the world.
|
| People who offend, withdraw, or blunder have existed since
| before techno-optimism. This is their chance to become better
| in a safe space without embarrassment.
| theshackleford wrote:
| > I'm pretty sure nobody 15 years ago would have uttered the
| phrase "opportunities for practising social skills are hard to
| come by".
|
| You'd be wrong.
| ayushl wrote:
| tbf it's more about training than social skills
| VeryFinePrint wrote:
| As a Reddit mod of a tiny community, I've very recently started to
| suggest users use ChatGPT to re-word their comments and
| submissions when their phrasing is inflammatory. Any point worth
| making is worth making well. It is still early stages, so I'm not
| sure how effective it will be.
|
| I've noticed that ChatGPT can do a decent job rewording an angry
| message to be less toxic while drawing out the salient points.
| The output it produces is a little stiff or stuffy, but I find
| that preferable to "authentic" toxic messages.
|
| As a mod I find that removals have less sting when you provide
| the user with an alternative path forward to get what they want.
| When a removal isn't just a "wall of no", it feels less like a
| personal rejection. ChatGPT offers a path around that "wall of
| no" for users by a) showing them the right way to make their
| comment and b) giving them a path/tool to get there nearly
| every time.
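|
| In case it helps other mods, the kind of rewording call I have in
| mind is nothing fancy. A minimal sketch (assuming the OpenAI
| Python SDK; the instruction wording is just my own, not a tested
| moderation prompt):
|
|     # Sketch: ask an LLM to de-escalate a comment while keeping its
|     # substance. Assumes `pip install openai` and OPENAI_API_KEY set.
|     from openai import OpenAI
|
|     client = OpenAI()
|
|     def reword(comment: str) -> str:
|         response = client.chat.completions.create(
|             model="gpt-4",
|             messages=[
|                 {"role": "system",
|                  "content": "Rewrite the user's comment so it makes the "
|                             "same points without insults or inflammatory "
|                             "phrasing. Keep it roughly the same length."},
|                 {"role": "user", "content": comment},
|             ],
|         )
|         return response.choices[0].message.content
|
|     print(reword("This is the dumbest take I've ever read. Did you even try?"))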
| lasermike026 wrote:
| Is it me or are most of these posts written by AI?
___________________________________________________________________
(page generated 2024-04-09 23:02 UTC)