[HN Gopher] Insurers launch cover for losses caused by AI chatbo...
___________________________________________________________________
Insurers launch cover for losses caused by AI chatbot errors
Author : jmacd
Score : 96 points
Date : 2025-05-11 10:07 UTC (2 days ago)
(HTM) web link (www.ft.com)
(TXT) w3m dump (www.ft.com)
| Neywiny wrote:
| No mercy. Had to deal with one when looking for apartments and it
| made up whatever answer it thought I wanted to hear. Good thing
| they still had humans around in person when I went for a tour.
| tomrod wrote:
| https://archive.is/BrLso
| conartist6 wrote:
| Man I wish I could get insurance like that. "Accountability
| insurance"
|
| You were responsible for something, say, child care, and you
| just decided to go for a beer and leave the child with an AI.
| The house burns down, but because you had insurance you are not
| responsible. You just head along to your next child care job and
| don't worry too much about it.
| alexriddle wrote:
| Lots of insurance covers these types of situation which are the
| result of careless acts...
|
| Don't take the right safety precautions and burn down a
| customer's house - liability insurance
|
| Click on a link in a phishing email and open up your network to
| a ransomware attack - cyber insurance
|
| Forget to lock your door and get burgled - property insurance
|
| Write buggy software which leads to a hospital having to
| suspend operations - PI (or E&O) insurance
|
| Fail to adequately adhere to regulatory obligations and get
| sued - D&O insurance
|
| Obviously there will be various conditions etc. which apply,
| but I've been in insurance a long time, and cover for
| carelessness and stupidity is one of the things that keeps the
| industry going. I've dealt directly with (paid) claims for all
| of the above situations.
|
| It doesn't absolve responsibility though, it just protects
| against the financial loss. I suspect if you leave a child
| alone with an AI and the house burns down that's going to be
| the least of your problems.
| jpc0 wrote:
| > Forget to lock your door and get burgled - property
| insurance
|
| I'm pretty sure this will be the same for the other insurance
| you mentioned, but for property insurance, if you left your
| front door open you will have a hard time getting the insurer
| to actually pay out your claim. At least here they require a
| burglar alarm, and they require it to be armed when nobody is
| on site, or they will absolutely decline the claim.
|
| Insurance insures against risk, but there's a threshold to
| that, and if you prove to be above it they will decline your
| claim or void your policy entirely.
| alexriddle wrote:
| In the UK where I am, most standard (not budget) property
| policies would cover theft from an unlocked entry point.
|
| Two main exceptions:
|
| 1 - if you are letting the property to someone else, e.g. a
| lodger, or have paying guests staying with you, then this is
| typically excluded.
|
| 2 - if you have had previous theft claims, live in a
| high-crime area, or have a particularly high risk (e.g.
| lots of valuables), the insurer will add an endorsement
| that you need a minimum standard of locks and have them
| engaged when the property is unoccupied.
|
| Outside of those, if you accidentally leave a door
| unlocked, your claim will likely be paid. The situation
| obviously may be different in other countries. I worked for
| a property insurer and saw hundreds of these claims (entry
| via an unlocked entry point) paid during my time there - I
| also saw many declined because of the above.
|
| I suspect that over time the number of policies in the
| 'budget' category will continue to increase, as price
| continues to trump everything else for most people.
|
| edit: it is the same for the other lines I mentioned as
| well - e.g. a cyber policy I saw recently has no conditions
| relating to use of MFA. It will have been factored in when
| writing the risk (they will have said they use it), and if
| it turned out that was a lie then there would be an issue
| with cover, but if it was just a case of an admin forgetting
| to include an OU in the MFA group policy, the claim would
| almost certainly be covered. Policies aimed at the SME space
| are much more likely to have specific conditions though.
| FireBeyond wrote:
| > At least here they require a burglar alarm
|
| Is that commercial or residential?
|
| I've never seen a residential policy that requires an
| alarm system, let alone a monitored system, though many
| carriers will offer a discount for having one.
| dfxm12 wrote:
| This sounds like a racket for residential properties.
| Alarms do nothing to prevent burglary. Where this is a
| requirement, I'm sure the insurance company gets kickbacks
| from companies that make or install them. Or it's an easy
| out, designed to make it as hard as possible for people to
| get any value from their insurance...
| nickff wrote:
| Alarms usually don't prevent burglaries, but they often
| reduce the amount of theft, as the burglars take what
| they can in one trip and leave, rather than
| comprehensively emptying the building/unit.
| luma wrote:
| I have no idea who is underwriting your policies but this
| is absolutely not true with any carrier in the US that I've
| ever seen. Insurance pretty regularly covers being a
| dumbass.
| duk3luk3 wrote:
| There is no insurance that will insure you against your own
| gross negligence.
|
| Insurance will only pay out if you can show that you have
| done everything a reasonable person would be expected to do
| to avoid the loss/damage.
|
| > Don't take the right safety precautions and burn down a
| customers house - liability insurance
|
| You mean someone burnt a customer's house down /because of
| something like an electrical or equipment malfunction that
| they could not have reasonably foreseen or prevented/, right?
|
| > Forget to lock your door and get burgled - property
| insurance
|
| That seems unlikely. Compare this:
| https://moneysmart.gov.au/home-insurance/contents-insurance
|
| > It's worth checking what isn't included. For example,
| damage caused by floods, intentional or criminal damage, or
| theft if you leave windows or doors unlocked.
|
| Happy to be shown that I'm wrong but please do not give
| people the impression that liability insurance or property
| insurance will absolve them of losses no questions asked.
| kube-system wrote:
| Insurance can't go to jail for you but it can and often does
| pay your legal fees and/or civil liabilities regardless of
| fault.
| tedivm wrote:
| Yup, I have an umbrella policy to cover a variety of legal
| situations. It costs me $900 a year for a $3m (per incident)
| policy.
| WrongAssumption wrote:
| Being covered does not mean you are not responsible.
| conartist6 wrote:
| That was basically my whole point.
|
| Would you want to insure people who think they have no
| responsibility because they've delegated it to an AI? They
| might as well have delegated the responsibility to a child or
| a dog. To sell them insurance, you as the insurer are making
| a financial bet on the ability of the dog to take care of
| anything that does go wrong.
|
| And still, as the insured, using an AI imbued with your
| responsibility risks horrible outcomes that could ruin your
| life. The AI has no life to ruin. It was never really
| responsible.
| wat10000 wrote:
| It's just a numbers game. Set your premiums such that you
| take in more than you pay out. If losses due to dumb use of
| AI are common then the premiums will be high, but there's
| no reason to refuse to issue such policies altogether.
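
A minimal sketch of the numbers game described above, as a short
Python simulation. Every figure in it (policy count, premium, claim
probability, average claim) is a made-up assumption for illustration,
not a number from the article or the thread.

    import random

    random.seed(0)

    POLICIES = 10_000    # assumed number of AI-error policies sold
    PREMIUM = 1_200      # assumed annual premium per policy
    CLAIM_PROB = 0.02    # assumed chance an AI error triggers a covered loss
    AVG_CLAIM = 40_000   # assumed average payout when a claim happens

    collected = POLICIES * PREMIUM
    paid_out = sum(AVG_CLAIM for _ in range(POLICIES)
                   if random.random() < CLAIM_PROB)

    # The book is sound as long as premiums collected exceed claims paid.
    print(f"premiums collected:  {collected:,}")
    print(f"claims paid out:     {paid_out:,}")
    print(f"underwriting result: {collected - paid_out:,}")

If the assumed claim rate turned out to be higher, the same arithmetic
would push the premium up rather than make the product impossible to
offer, which is the point being made above.
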
| Justin_K wrote:
| It's called errors and omissions and it's as basic an insurance
| as it gets.
| caulkboots wrote:
| Not sure insurance will take the rap for criminal negligence.
| thallium205 wrote:
| Crime Insurance (Criminal Acts) is exactly what this is for -
| when an employee does something criminal while on the clock and
| the company is facing liability as a result of their actions.
| loeber wrote:
| Insurance tech guy here. This is not the revolutionary new type
| of insurance that it might look like at first glance. It's an
| adaptation of already-commonplace insurance products that are
| limited in their market size. If you're curious about this topic,
| I've written about it at length:
| https://loeber.substack.com/p/24-insurance-for-ai-easier-sai...
| omoikane wrote:
| Was it also commonplace to have insurance covering human
| errors? For example:
|
| > A tribunal last year ordered Air Canada to honour a discount
| that its customer service chatbot had made up.
|
| If a human sales representative had made that mistake instead
| of a chatbot, I wonder if companies would try to recover that
| cost through insurance. Or perhaps AI insurance won't cover the
| chatbot for that either?
| loeber wrote:
| Yes, this is called Professional Liability or Errors &
| Omissions insurance. It's an important insurance category,
| but limited in market size. It's uncommon to have e.g. human
| sales representatives covered for this, but your doctor,
| lawyer, accountant, architect, etc. will all carry this kind
| of insurance.
| kayodelycaon wrote:
| I worked in this market for a few years. It was
| fascinating. I still have some ACORD documentation from
| that. I learned very quickly that standards aren't. :)
| kjs3 wrote:
| I carried E&O for years as an independent consultant. I
| fortunately never had to use it, but I have peers whose
| financial future was probably saved by having it.
| SoftTalker wrote:
| How is it priced? I was always under the impression that
| it was prohibitively expensive for one-person operations.
| notahacker wrote:
| The key bit is _why_ those niches have it: typically either
| regulators require it or clients require it (sometimes
| specifying it to a given value in their contract). And that's
| because the consequences of mistakes some professions
| make can be _very_ expensive relative to the size of their
| business. Also helps that a lot of the errors they cover
| are very rare so pooling the risk as insurance makes more
| sense...
|
| cf. an airline chatbot agreeing to an inappropriate refund
| or giving wrong advice that leaves the airline deciding to
| apologise and pay the customer's holiday-related expenses.
| Those are costs it makes more sense for the airline to eat
| than to get their insurers to price up (unlike other
| aviation insurance, which can be for eye-wateringly large
| sums), even if it happens several times a month (which, if
| your chatbot is an LLM supposed to handle a wide variety of
| questions, it probably does). Same goes for human sales
| representatives, who may work with higher-stakes
| relationships than chatbots, but the consequence of their
| error is usually not much bigger than _issue refund_ or
| _lose the client relationship_.
|
| I guess chatbots/LLMs will end up as a special case for
| professional indemnity insurance in a lot of those
| regulated firms as lawyers/accountants start to use them in
| certain contexts.
| willyt wrote:
| Yes. I would say it probably makes more sense that
| whoever designed the chatbot system for the airline would
| need indemnity insurance. Then the airline has somewhere
| to go if it starts giving out free plane tickets
| willy-nilly.
| dghlsakjg wrote:
| The Air Canada case is interesting since it predates LLMs. If
| you read the details, it was basically that the chatbot had
| been programmed to point at a policy that for some reason
| differed from what Air Canada claimed was its actual policy.
| Nothing was made up; Air Canada simply had two contradictory
| policies depending on where you were on the site.
|
| A customer trusted the policy that the chatbot provided to
| make a decision, and the tribunal said that it was reasonable
| for the customer to make a decision based on that policy, and
| that the airline had to honor that policy.
| em-bee wrote:
| while i am not a fan of the AI craze, and regardless of what i
| think of the practices of certain insurers, my first thought
| was that the current state of AI naturally lends itself to
| insurance. there is a chance that AI gives you a wrong answer,
| and a lesser chance that a wrong answer will lead to damages.
| but risk-averse users will want to protect themselves.
| so as long as the income insurers make is higher than the
| payouts, it's a sound business model.
| bpodgursky wrote:
| It's also easier in many ways than insuring against employees
| because the insurance company can evaluate a precise model
| and insure against it, as opposed to employees where the
| hiring bar can vary.
| Retric wrote:
| Doing that kind of analysis is expensive for the insurance
| company.
|
| Insurance generally offsets low precision with higher
| premiums and a wide range of clients. 1 employee has a lot
| of variability but 100,000 become reasonably predictable.
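
A rough illustration of that pooling effect, with invented numbers
(the 2% claim rate and 40,000 average claim below are assumptions for
the example, not figures from the thread): the per-insured cost swings
wildly for a single policyholder but becomes quite predictable across
100,000 of them.

    import random
    import statistics

    random.seed(1)

    def cost_per_insured(pool_size, claim_prob=0.02, avg_claim=40_000):
        """Simulate one year of claims for a pool; return cost per insured."""
        total = sum(avg_claim for _ in range(pool_size)
                    if random.random() < claim_prob)
        return total / pool_size

    for pool in (1, 1_000, 100_000):
        # Repeat many simulated years to see how much the result varies.
        years = [cost_per_insured(pool) for _ in range(100)]
        print(f"pool={pool:>7}  mean~{statistics.mean(years):7.0f}"
              f"  stdev~{statistics.stdev(years):7.0f}")
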
| imoverclocked wrote:
| At best, this screams, "you're doing it wrong."
|
| We know this stuff isn't ready, is easily hacked, is unwanted
| by consumers... and will fail. Somehow, it's still more efficient
| to cover losses and degrade service than to approach the problem
| differently.
| nickff wrote:
| Customer service personnel are expensive to train properly, and
| often quit very quickly because they are treated very poorly by
| customers. The alternative to AI customer service is often no
| customer service (like Google).
| DonHopkins wrote:
| Can consumers get AI insurance that covers eating a pizza with
| glue on it, or eating a rock?
|
| https://www.forbes.com/sites/jackkelly/2024/05/31/google-ai-...
|
| How about MAGA insurance that covers injecting disinfectant, or
| eating horse dewormer pills, or voting for tariffs?
| 85392_school wrote:
| Reading the actual article, this seems odd. It only covers cases
| where the models degrade, but there hasn't been evidence of an
| LLM pinned to a checkpoint degrading yet.
| yieldcrv wrote:
| AI that is accurate often enough despite hallucinating should
| just carry Errors and Omissions insurance, like human
| contractors do
___________________________________________________________________
(page generated 2025-05-13 23:01 UTC)