[HN Gopher] Cheerful chatbots don't necessarily improve customer...
___________________________________________________________________
Cheerful chatbots don't necessarily improve customer service
Author : giuliomagnifico
Score : 39 points
Date : 2022-12-28 16:50 UTC (2 days ago)
(HTM) web link (research.gatech.edu)
(TXT) w3m dump (research.gatech.edu)
| karaterobot wrote:
| > The results across the studies show that using positive emotion
| in chatbots is challenging because businesses don't know a
| customer's biases and expectations going into the interaction. A
| happy chatbot could lead to an unhappy customer.
|
| I don't think chatbots count as customer service, and the idea of
| a "happy chatbot" is nonsense, since the bot isn't happy or sad,
| it's a script. It's an interface to a set of FAQs. That does not
| count as customer service any more than a searchable FAQ or
| documentation site is customer service. It's useful, but it's not
| customer service, it's a way to avoid paying people who can
| provide customer service.
| daneel_w wrote:
| I believe we learned this lesson already with Clippy.
| notyourwork wrote:
| Chat bots are never helpful.
| rickreynoldssf wrote:
| What infuriates me about support chat
|
| Me: My order wasn't delivered
|
| Support: Thank you for contacting support. I am sorry you are not
| having a delightful experience with our service. How can I help?
|
| Me: My order wasn't delivered
|
| Support: Thank you for that. I understand your order wasn't
| delivered. Is that correct?
|
| Me: Yes
|
| Support: Thank you for that. Please hold while I look it up.
|
| ...minutes pass...
|
| Support: What is the order number?
|
| Me: abc123
|
| Support: Thank you for that. I understand your order number is
| abc123 and you have not received the delivery. Is that correct?
|
| Me: yes
|
| Support: Please wait while I look this up.
|
| ...minutes pass...
|
| Support: Thank you for waiting. I'm happy to tell you that your
| order was delivered on 12/29/22. Is there anything else I can
| help with?
|
| Me: I did not receive it.
|
| Support: Thank you for that. I understand that you did not
| receive your order. Is that correct?
|
| Me: Yes
|
| Support: Your order was delivered by UPS on 12/29/22 and was left
| on your porch. Did you look there?
|
| Me: Yes
|
| Support: Thank you for that. I understand you looked on your
| porch and did not see it. Is that correct?
|
| Me: Yes
|
| ...minutes pass...
|
| Support: Thank you for that. Can you tell me your order number?
| [deleted]
| [deleted]
| ilyt wrote:
| They were never made for that.
|
| Their purpose is to let companies hire fewer tech support
| people. They don't care about wasting your time, and you
| already bought the product; better to pay people to sell the
| product to more people than to deal with the tiny % who have
| problems with it.
| ericd wrote:
| If your company has a reputation of not standing behind its
| products, selling is going to become harder, and more
| expensive, and your margins will go down. This is hard to
| measure accurately (people try with NPS, but that's fraught
| with bias), so it might be hard to argue for spending more on
| customer satisfaction and a good reputation, if your org is
| trying to be metrics driven rather than intuition/principles
| driven.
|
| So basically, blame the MBAs :-)
| ilyt wrote:
| That's for the next manager or CEO to worry about; you get
| your bonus for saving the corporation money, then move on.
| Andrew_nenakhov wrote:
| Wherever I encounter a chatbot, my user experience usually
| suffers a lot, especially if they give me a slow and hard to
| navigate tree menu instead of a well structured help index, and
| no way to contact actual support.
| sideshowb wrote:
| s/necessarily//
| agilob wrote:
| Cheerful chatbots don't necessarily improve customer service, but
| at least they are annoying and time wasting, so there's that.
| chucksmash wrote:
| Even when actual human beings give scripted customer service
| responses, it's already hard enough to come across well. Makes
| sense that taking the human being out of the loop but leaving in
| obsequious corporate niceness rubs people the wrong way. The
| customers were responding positively to a person, not to the
| script in front of the person (imo).
| charcircuit wrote:
| I feel the results of this study would be culture specific.
| MonkeyMalarky wrote:
| If I'm talking to support, it's because I already have some sort
| of problem. If I'm shunted into talking to a chatbot, I've now
| got two problems and I'm not happy about it. Adding a chipper
| disposition to it is like rubbing salt in the wound and a hollow
| attempt to paper over the fact that the business is trying to cut
| support costs as much as possible.
| colejohnson66 wrote:
| > If I'm talking to support, it's because I already have some
| sort of problem.
|
| You and I may wait until then, but plenty of customers don't
| bother, and will open the chatbot because they can't be
| bothered to read the navigation. Companies then realize their
| customers are idiots wasting employees' time and throw them
| behind a chatbot to attempt to filter them out.
|
| It's frustrating.
| [deleted]
| lazide wrote:
| As someone who tends to have unconventional needs of support
| (because I already figure out anything that is a more normal
| request), I despise chatbots with a passion.
|
| They never have any idea about what is going on, and add
| excessive delay getting to someone who can actually look at and
| override or fix whatever broken thing in their system is causing
| the problem.
|
| Hard to say if they top voice call mazes though.
| herbst wrote:
| PayPal's bot literally took me through the same canned question
| & response loop twice in the same session before it suggested
| that I could talk to a human, who was available seconds after
| that suggestion.
|
| I had tried the same loop several times before, always asking
| for a human. But apparently I didn't meet the character
| threshold or whatever it takes for it to be willing to forward
| me.
|
| Worst possible customer experience by far
| mmanfrin wrote:
| The chat bots always just seem to be frontends for their
| knowledge bases, because a lot of people don't even try to
| self-help. But that's so frustrating for those of us who do;
| YES, I _have_ tried resetting it, let me get past you, chatbot.
| Spooky23 wrote:
| The chatbot is there to encourage abandonment.
| ilyt wrote:
| Shibboleet [1]
|
| * [1] https://xkcd.com/806/
| ericd wrote:
| Not the point of that comic, but I love how the first guy
| doesn't even have a computer on his desk, so he couldn't
| help directly with anything even if he wanted to.
| netsharc wrote:
| I wonder if any telephone menu engineers have actually
| implemented this after reading the xkcd.
|
| I'm trying to think of other real life copying fictional
| concepts, but can't come up with anything right now...
| ceejayoz wrote:
| It's a pity you can't opt for a short quiz to prove you've
| tried the usual approaches.
| Nextgrid wrote:
| Or just put down a deposit that gets refunded only if the
| issue ends up being legitimate. A "I bet $100 that I'm not
| a dumbass" button.
| echelon wrote:
| Sounds like a very quick way to build a negative brand
| image.
| Nextgrid wrote:
| You can frame it in a more "brandable" way, such as
| calling it a support contract.
| echelon wrote:
| That would take engineering time and resources. Given the
| multiplied bell curve distributions of technical aptitude,
| domain experience, and whatever else might compel a person
| to solve their own problems, this audience is vanishingly
| long tail.
|
| If you put the direct contact link on the website, people
| who forgot to plug the product in will find it.
|
| Another solution is to hire more customer support staff.
| But that's costly - headcount, training, etc. And there's a
| lot of churn from downright abusive customers.
|
| A business has a million other things that need attention.
| lazide wrote:
| Is it better or worse when you know exactly why they're
| fucking you that way?
| jareklupinski wrote:
| would be nice of those chatbot teams to put in a backdoor we
| can use to bypass the first-line support scripts, like mashing
| # 0 * on a phone pad
|
| "Please escalate" seems to work sometimes, but I'd be fine with
| "ENGAGE_LUCKY_MODE_777" or something
| varjag wrote:
| I always go with "bring me to your leader". Works quite often
| as it seems to confuse elizas enough.
| joelrunyon wrote:
| This.
|
| When you have already troubleshot and gone through the help
| docs and are actually pointing out a flaw in their system -
| chatbots are completely useless.
| oidar wrote:
| It's become a huge selling point for me that I can walk into a
| store and talk to a human about a problem I am having with a
| product. Lately, this has meant that the Walmart online store
| gets more of my business than Amazon for items that aren't
| available at brick-and-mortar stores in my area. Because Walmart
| will take back items bought at the store and resolve problems
| right then and there.
|
| Amazon has started selling items that aren't returnable and I
| wish I could opt out of it. I got a laptop/phone bracket that was
| made with some crappy, brittle plastic, but Amazon didn't accept
| returns on that item. If Walmart sold ebooks, I don't think I'd
| use Amazon anymore. After dealing with expired food and medicine
| through Subscribe and Save, Amazon delivery just throwing things
| on the ground and damaging them, and unreturnable items like the
| phone accessory, I am done with them for physical items.
|
| Getting through to Amazon customer service is a huge pain in the
| butt, especially now that the customer service flow is self-help
| -> text chat -> telephone chatbot -> foreign call center. With
| Walmart, I just drive a half block away, go to customer service,
| wait in line for five minutes, and boom, it's done. It's a breath
| of fresh air honestly.
| schappim wrote:
| > started selling items that aren't returnable and I wish I
| could opt out of it
|
| I'm not sure if this is legal in Australia or EU. You might
| also have some state-level laws that would prevent this. Have
| you tried pushing back against them?
| armchairhacker wrote:
| > Amazon has started selling items that aren't returnable and I
| wish I could opt out of it
|
| Is that legal?
|
| I would not buy something where the seller can literally ship
| junk and legally you have no recourse.
| unity1001 wrote:
| > Is that legal?
|
| Not in Europe.
| SoftTalker wrote:
| Caveat Emptor.
|
| AFAIK it's legal. Might be courteous to disclose "no returns
| accepted" on such listings, but I don't think it's mandatory,
| at least not everywhere.
|
| Buy with a credit card that offers warranty coverage on
| purchased items.
| archontes wrote:
| It's reasonable they could do a chargeback on the basis of
| violation of implied warranty of fitness.
|
| Edit: it's probably a violation of implied warranty of
| merchantability.
| ghaff wrote:
| Even brick and mortar stores definitely will sell special
| items (discontinued items, returned items, etc.) on an all
| sales final basis sometimes.
| wahnfrieden wrote:
| in the US yeah. it takes coordinated pressure from the
| working class to get reasonable-seeming laws like that
| passed. when we leave our rights to corporate charity, this
| is what we're left with
| drstewart wrote:
| You mean Japan, not the US.
|
| https://www.reddit.com/r/japanlife/comments/zt9ra1/why_is_it...
| [deleted]
| johnea wrote:
| Micro Center RULES!!!
| delusional wrote:
| It's so bad that I've started to disregard any customer support
| that happens as a chat. I'd rather drop an email or a support
| ticket than talk to some godforsaken chatbot or wait in line
| for 2 hours to talk to a human. It seems to me that the "chat
| support" craze has been used as an excuse to gut customer
| support teams.
| johnea wrote:
| This really shouldn't be a surprise. Even the outsourced
| Indian human support lines, where they are required to recite
| the corporate litany, and monitored to make sure they do, are
| really just human shields that protect the corporation from
| accountability...
| ilyt wrote:
| > Amazon has started selling items that aren't returnable and I
| wish I could opt out of it. I got a laptop/phone bracket that
| was made with some crappy, brittle plastic, but Amazon didn't
| accept returns on that item.
|
| Every time I read shit like that I feel like the EU's
| intrusiveness and overreach might in the end be worth it if it
| stops corporations from shit like that [1]
|
| [1]
| https://europa.eu/youreurope/citizens/consumers/shopping/gua...
| strbean wrote:
| Maybe the intrusiveness and overreaching was just fundamental
| consumer protections all along.
| ilyt wrote:
| Oh, no, there is definitely some questionable overreach in
| the regulations; the best-known example from the outside is
| probably how shoddily the "cookie law" was written: good
| intentions, but implemented in a way that made those
| intentions irrelevant, as it just trained users to click the
| stupid button and get on with their day.
|
| And stuff like pushing for EVs before infrastructure is
| there, all while planes are fine and dandy to run on
| fucking leaded fuel...
| tgv wrote:
| The cookie law wasn't half as shitty as the lazy ways
| companies adhere to it. But even that I'll take any day
| over the time before the cookie law (which, BTW, is much
| more extensive than just regulating cookies).
| robotnikman wrote:
| Having worked customer support as one of my first jobs, I have a
| feeling a lot of people use it just to have an emotional punching
| bag to take out their frustration on others, or as a buffer for
| unpopular corporate policies which customers don't like.
|
| Corporation adds a policy that's unpopular with customers? No
| worries, customer support is there to take the anger and
| complaints.
| LarryMullins wrote:
| When talking to human customer support, I'm happy to be
| superficially pleasant even if I'm having a bad day with their
| company, and I like when they treat me the same. Just because
| one of us is having a bad day, doesn't mean we should go and
| ruin the day of another person. Even if I hate their employer,
| the discussion I'm having is person-to-person, not person-to-
| corporation. Being superficially pleasant is a genuine courtesy
| to the other person.
|
| But when the corporate representative is a bot, not a person, I
| know the superficial pleasantness cannot be explained by one
| person earnestly trying to be pleasant to another. No, in this
| case the pleasantness is _wholly_ artificial and I don't
| appreciate it in the least.
| cafard wrote:
| Yes, I've been on the phones, and talked to one or two such
| customers.
|
| However, chatbots are just another unpopular (result of) policy
| on top of whatever else the customers are unhappy with.
| joelrunyon wrote:
| How soon until someone lets you
|
| 1) create a series of help docs
|
| 2) train an AI on those help docs
|
| 3) solve customer headaches using a ChatGPT-style interface
| with that training
|
| 4) drop you into a human who can help as soon as you ask for
| it.
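A minimal sketch of that four-step flow (the help-doc topics and escalation phrases below are made-up examples, and simple keyword overlap stands in for a trained model):

```python
# Sketch: answer from a small help-doc corpus, but hand off to a human
# the moment the user asks for one or when nothing matches.

HELP_DOCS = {
    "reset password": "Go to Settings > Security and click 'Reset password'.",
    "cancel subscription": "Open Billing and choose 'Cancel plan'.",
    "export data": "Use Settings > Privacy > 'Download my data'.",
}

# Step 4: phrases that should immediately route to a person.
ESCALATION_PHRASES = ("human", "agent", "representative", "escalate")

def answer(query: str) -> str:
    q = query.lower()
    if any(phrase in q for phrase in ESCALATION_PHRASES):
        return "HANDOFF_TO_HUMAN"
    # Steps 1-3: naive retrieval over the help docs by word overlap.
    best, best_score = None, 0
    for topic, doc in HELP_DOCS.items():
        score = len(set(topic.split()) & set(q.split()))
        if score > best_score:
            best, best_score = doc, score
    # Unknown issue: don't loop on canned questions, hand off instead.
    return best or "HANDOFF_TO_HUMAN"
```

The key design point is the fall-through: anything the docs can't answer escalates instead of re-asking the user to rephrase.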
|
| My primary issues with "live chat" features are
|
| 1) They ask you for stuff (email/name/etc.) that is already in
| the system, even when you're logged in. This somehow makes
| things worse.
|
| 2) The sheer resistance to handing me over to a human when I've
| asked for it. If you don't have humans on live chat - that's
| fine, but at least send it to a help desk.
|
| 3) The fact that they ask me to categorize the issue, rather
| than figuring it out semantically from the request, is extra
| annoying, as they will often ask you this AGAIN if they can't
| initially place you in one of their pre-selected categories.
| m348e912 wrote:
| This is what I was going to say. Chatbots suck today but with
| integration of a trained ChatGPT instance they could get
| substantially better. In some cases it could even be better
| than first-line human support (I am looking at you, AT&T).
| smegger001 wrote:
| Maybe, but seeing as ChatGPT already has a habit of making up
| plausible-looking but wrong answers when it doesn't know
| something, I would be worried that it would start sending bad
| responses, making situations worse, with no obvious way to
| escalate to a human once it decides it has 'solved' the
| problem.
| SoftTalker wrote:
| Issue #1 is likely because the "live chat" is a separate
| product that they bought and linked on the main website, but
| which is not integrated into the user database or
| authentication system at all. Therefore the bot needs to ask
| you for all this identifying information even though you are
| already logged in.
| joelrunyon wrote:
| I understand that - but they should be able to sync the
| databases or pass an auth token of some sort.
|
| The fact you have to re-enter it all makes the experience
| stupid and customers frustrated.
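One common pattern for passing identity to a bolted-on chat widget (some chat vendors call this identity verification; the secret and function names here are hypothetical) is to HMAC-sign the logged-in user's ID server-side, so the widget can trust it without re-asking for name and email:

```python
import hashlib
import hmac

# Hypothetical shared secret provisioned by the chat vendor.
CHAT_WIDGET_SECRET = b"example-secret"

def identity_token(user_id: str) -> str:
    """Your server: sign the logged-in user's ID and embed the
    signature in the page alongside the chat widget."""
    return hmac.new(CHAT_WIDGET_SECRET, user_id.encode(),
                    hashlib.sha256).hexdigest()

def verify_identity(user_id: str, token: str) -> bool:
    """Chat-vendor side: recompute the signature and compare in
    constant time before trusting the claimed identity."""
    return hmac.compare_digest(identity_token(user_id), token)
```

Because only the server and the vendor hold the secret, a user can't forge someone else's identity in the widget, and the bot never needs to ask "what's your email?" again.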
___________________________________________________________________
(page generated 2022-12-30 23:00 UTC)