[HN Gopher] Tell HN: Tesla rear-ended me on autopilot, no one wi...
       ___________________________________________________________________
        
       Tell HN: Tesla rear-ended me on autopilot, no one will investigate
        
       Yesterday I was stopped at a red light on a highway and a Tesla
       Model S rear-ended me at high speed, twice! There were two impacts
       somehow. It was a 4-car pileup and my car was destroyed.  The
       driver said she was using Autopilot, that an OTA update had had a
       problem earlier in the day, and that she had actually had the same
       kind of collision on Autopilot previously!  I talked with the NTSB
       who said that they won't investigate because no one was killed.  I
       talked with the NHTSA who said that they can't take a complaint:
       only the driver can file a complaint and they will only investigate
       if they "see a trend".  This seems like a serious issue that
       someone should be looking at. Does anyone have any idea about how
       to get this actually investigated to see if it was another
       Autopilot failure?
        
       Author : raylad
       Score  : 274 points
       Date   : 2021-11-01 20:14 UTC (2 hours ago)
        
       | elil17 wrote:
       | You could ask the other driver to file a complaint - seems like
       | she's not too happy with it either. You could talk to the
       | inspectors general responsible for NHTSA and NTSB - they can
       | investigate why those two aren't investigating.
       | 
       | Only other option would be to sue Tesla. You'll have a claim if
       | your damages aren't paid for by the other driver/her insurance
       | (e.g. the damages exceed the insurance limit, so you get to sue
        | either her or Tesla). However, you'd be better off financially
        | suing her. If she's uninsured, she could sue Tesla to cover the
        | damages.
       | 
       | Perhaps some public interest law firm would want to take up the
       | case - that could make suing practical.
        
       | oblib wrote:
       | Really sorry to hear this.
       | 
       | I really cannot see how this bullshit is allowed on our roads.
       | We're all guinea pigs for this insane experiment when the truth
       | is we know it cannot work well enough to not kill and maim
       | people.
       | 
       | Our government, as you've proved, is completely asleep at the
       | wheel here.
        
       | Dotnaught wrote:
       | Feel free to email me: tclaburn at theregister dot com
        
         | slownews45 wrote:
          | You can see here how "journalism" runs to stuff that's not
          | based on any data. 5 million+ car accidents, and a probably
          | totally bogus retelling from someone probably not even sure
          | what they were doing (i.e., did she read the manual?) will
          | make it into a news article.
        
           | striking wrote:
           | You're assuming this will instantly be spun into a story
           | based on this one anecdote. It might be used as one anecdote
           | of many, it may be represented as one point of view among
           | many, it may never end up becoming a story at all.
           | 
           | "You can see here how" a journalist is choosing to
           | investigate a possibly promising lead while their profession
           | continues to be misunderstood.
        
             | bryan_w wrote:
             | We'll see how much "quality" we get out of tclaburn but I
             | don't think it's gonna be an article full of numbers and
             | facts, but rather outrage and feelings, judging by the
             | "quality" that newspaper tends to produce.
        
               | RichardHesketh wrote:
                | Have you actually read any content at theregister.com?
               | It's often tongue in cheek, but openly so, and rarely -
               | if ever - (in my experience) dishonest. Don't assume all
               | media outlets and journalists are the same.
        
           | ProAm wrote:
           | How else do you expect a journalist to start an
           | investigation?
        
             | HWR_14 wrote:
             | Apparently all news reports must be based solely on trends
             | from statistically valid publications. Ignore the fact that
             | most news is of the "man bites dog" variety and not the
             | "dog bites man trend decreases 3% year-over-year" variety.
        
           | laurowyn wrote:
            | So what should they do? Copy OP and publish as-is, with no
            | follow-up, investigation or additional information? Or
            | should they do their job and investigate a claim that could
            | be of interest to the public?
            | 
            | It's not like they could get information from OP, then make
            | a FOIA request of the two agencies OP mentioned to identify
            | similar reports, and write the facts up in an article for
            | us to read, understand and decide for ourselves whether
            | it's an issue. That's good journalism: give the facts and
            | let the public form opinions, unlike the current system,
            | which gives the opinions and lets the public make up the
            | facts. Just like this post, deciding it's a widespread
            | issue, with the comments agreeing it is without any
            | evidence.
        
           | trollied wrote:
           | The Register are pretty good, to be honest. They don't run
           | just any old crap, and are actually technical.
        
           | lvs wrote:
           | I don't think you understand what real journalists do. Bad
           | ones might perhaps just "forward" an unchallenged claim as
           | your uncle does on Facebook. Journalists and editors trained
           | at journalism schools have a little more integrity and
           | skepticism than your uncle.
        
             | hasdf wrote:
             | Ah yes the "no true journalist" argument
        
       | ensignavenger wrote:
       | Did the local police not come out and investigate?
        
         | raylad wrote:
         | Yes, they were there and so was a fire engine. I don't think
         | it's typical for police to request autopilot logs from Tesla
         | though.
        
           | BitwiseFool wrote:
            | I'm not defending the police per se, but I assume the local
           | PD probably doesn't know how to handle such an investigation
           | yet.
        
             | [deleted]
        
       | atdrummond wrote:
       | Reach out to Jalopnik - this is right up their alley.
        
       | heavyset_go wrote:
       | > _The driver said she was using Autopilot, that an OTA update
       | had had a problem earlier in the day, and that she had actually
       | had the same kind of collision on Autopilot previously!_
       | 
       | So she continued to use it? That's insane.
       | 
       | It's worrying that there are tons of videos on YouTube reviewing
       | Tesla's Autopilot and FSD where the cars do something incredibly
       | scary and dangerous, and the drivers just laugh it off, go "well,
       | it's a beta!" and continue using it instead of turning off the
       | feature that almost got them into a car accident. I don't want to
       | share the road with drivers who think dangerous driving is a
       | funny game.
        
         | gordon_freeman wrote:
          | I think the main problem here is that the driver put too much
          | trust in Autopilot. What she needs to do instead is treat it
          | as just another driver-assistance feature and stay in control
          | of her vehicle at all times.
        
         | rvz wrote:
         | > I talked with the NTSB who said that they won't investigate
         | because no one was killed.
         | 
         | So they must wait until lots of people get killed by autopilot
         | / FSD in order for your incident (and many others) to be
         | investigated?
         | 
          | The scary thing is that it is beta-quality software on
          | safety-critical systems, and not only do the drivers know it,
          | they use it as an excuse for not paying attention to the
          | road. Was driver monitoring even switched on at the time?
         | 
         | As I have said before, this software without proper driver
         | monitoring and autopilot / FSD turned on puts the driver and
         | many others on the road at risk.
         | 
          | At this point, this is more like Teslas on LSD rather than
          | FSD, judging from the bugs in the many YouTube videos and
          | social media posts about all of this.
         | 
         | Oh dear.
        
           | HWR_14 wrote:
            | > So they must wait until lots of people get killed by
           | autopilot / FSD in order for your incident (and many others)
           | to be investigated?
           | 
           | The entire NTSB (which covers trains, planes and automobiles)
           | has 400 employees. They only get pulled into fatal accidents
           | because they only have that much manpower.
        
             | raylad wrote:
             | That low level of funding for such a critical function
             | seems a clear threat to public safety.
        
               | sokoloff wrote:
               | The NHTSA has primary responsibility for highway safety
               | at around $1B/yr in spend and 650-ish people.
        
         | hasdf wrote:
         | > "So she continued to use it? That's insane."
         | 
          | Well, it's proving to be a great excuse when she runs into
          | people with her car. Sounds like a win for her.
        
         | postmeta wrote:
         | Was it TACC or Autopilot or FSD or driver trying to blame
         | someone else?
        
         | penjelly wrote:
          | This is how I feel too. I want FSD to succeed and I don't want
          | it to destroy itself during the beta. But the videos I've seen
          | are how you put it: people say "whoa, that was close" and then
          | re-engage Autopilot and have 3-4 more incidents in the same
          | session.
        
           | FireBeyond wrote:
           | And then laugh nervously, and demonstrate a complete
           | misunderstanding of ML. "We need to keep having near misses
           | to train the car of what a near miss might be!".
           | 
           | And God forbid if they post a dashcam video to YouTube. Then
           | they'll get hit with comments like this with a sincere
           | straight face:
           | 
           | > "FSD didn't try to kill you. More like you're trying to get
           | FSD killed by publishing this video."
        
         | ashtonkem wrote:
         | "It's a beta" and the unsaid "and I don't care about other
         | peoples' lives anyways".
         | 
         | Still pretty frustrated nobody has regulated "self driving"
         | cars off the market yet. If you're testing it with a driver and
         | a copilot that's fine, but putting it in the hands of customers
         | and using "it's a beta" to hide behind should not be legal.
        
           | heavyset_go wrote:
           | > _"It's a beta" and the unsaid "and I don't care about other
           | peoples' lives anyways"._
           | 
           | Yes, it's a bit distressing having myself and the people I
            | care about be treated as NPCs by some Tesla owners as they
           | beta test their new toys on public roadways.
        
             | [deleted]
        
             | shadowgovt wrote:
             | I don't disagree, but I can't help but wonder if people
             | feel the same about Ford's pre-collision assist with
             | automatic emergency braking or, back in the day, anti-lock
             | brakes.
             | 
             | Sooner or later, these technologies are either public-ready
             | or they aren't. If Tesla autopilot isn't public-ready, I
             | agree it shouldn't be on the road, but I suddenly realize I
             | have no idea by what criteria the system _was_ cleared for
             | public-access roads.
        
               | ashtonkem wrote:
               | I would absolutely not expect a feature attached to my
               | car to be beta grade, no. And if early ABS systems had
               | issues, I'd expect them to be recalled and fixed at the
                | manufacturer's cost.
        
               | syshum wrote:
                | I have long said that society has become so risk-averse
                | that if someone invented a technology like the car
                | today, it would never be allowed in society.
               | 
                | I said this pre-COVID, and COVID has now cemented this
                | idea: we will not accept any level of risk at all...
                | None.
                | 
                | If it is not perfectly safe, everything should be
                | banned, locked down, or otherwise prohibited by law.
        
         | gameswithgo wrote:
          | Well, she says she did. Further confusing the matter, there is
          | the FSD beta, which very few people have access to for now.
          | Then there is adaptive cruise (works really well!), then there
          | is adaptive cruise with autosteer (not so good!). So who even
          | knows which one she was talking about.
         | 
         | I fault Tesla for having a mess of badly named options, and for
         | putting beta stuff out on the roads for sure.
        
           | shmoe wrote:
            | Yeah, the FSD beta issues last weekend (or the weekend
            | before?) involved false forward collision warnings. I
            | imagine she did not have the FSD beta, though.
        
         | slownews45 wrote:
         | Then don't drive with other drivers - NON-autopilot drivers
         | eating, texting (constantly!), seeming to look up directions
         | and much more (falling asleep).
         | 
         | https://www.youtube.com/watch?v=cs0iwz3NEC0&t=2s
         | 
          | The fact that you promote this absolute recklessness and
          | demand users turn off safety features that may, net net,
          | save lives is depressing.
         | 
          | Finally, folks get very confused between things like Autopilot
          | and other features (autosteer beta with FSD visualization),
          | etc. I think some Teslas now even show a message warning that
          | cruise control will not brake in some cases? Anyone have that?
         | 
          | I've been following the unintended acceleration claims around
          | Teslas as well. Most seem pretty bogus.
         | 
          | For what it's worth, here is the data we currently have from
          | Tesla:
         | 
         | "In the 2nd quarter, we recorded one crash for every 4.41
         | million miles driven in which drivers were using Autopilot
         | technology (Autosteer and active safety features). For drivers
         | who were not using Autopilot technology (no Autosteer and
         | active safety features), we recorded one crash for every 1.2
         | million miles driven. By comparison, NHTSA's most recent data
         | shows that in the United States there is an automobile crash
         | every 484,000 miles."
         | 
          | So you want to go to a nearly 10x increase in crashes (1 per
          | 484k miles vs 1 per 4.41M). I realize this is not one to one,
          | but even outside of this we are seeing maybe a 2x improvement
          | (if you normalize for road conditions etc.).
          | 
          | The Nissan Maxima had something like 68 deaths per million
          | registered vehicle years. It will be interesting to see how
          | this comes out for Teslas.
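          | 
          | Back-of-envelope on those figures, in Python (illustrative
          | only: the three denominators cover very different mixes of
          | roads and drivers, as noted above):
          | 
          |   # miles per crash, from the quoted figures
          |   ap_miles    = 4.41e6  # Tesla, Autopilot engaged (Q2)
          |   no_ap_miles = 1.20e6  # Tesla, without Autopilot
          |   us_miles    = 4.84e5  # US fleet average (NHTSA)
          | 
          |   print(ap_miles / us_miles)     # ~9.1x the US average
          |   print(ap_miles / no_ap_miles)  # ~3.7x Teslas without AP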
        
           | keefe8 wrote:
           | > I think some teslas now even show a message warning that
           | cruise control will not brake in some cases?
           | 
           | In my experience, that warning about not braking occurs when
           | you are in TACC mode but are pressing on the accelerator to
           | go temporarily faster than the set speed.
        
             | keefe8 wrote:
             | Unfortunately some people may have gotten into the habit of
             | hovering their foot over the accelerator to be ready to
             | override phantom braking, so maybe some even rest their
             | foot on the pedal without realizing.
             | 
             | Fortunately, removal of radar will reduce the frequency of
             | phantom braking, so hopefully this habit will fade.
        
           | FireBeyond wrote:
           | > NON-autopilot drivers eating, texting (constantly!),
           | seeming to look up directions and much more (falling asleep).
           | 
            | FSD is something people are _encouraged_ to use, not
            | _discouraged_ from using.
           | 
           | This is an entirely disingenuous take.
           | 
           | And Tesla's stats have been, _repeatedly_, shown to be
           | entirely disingenuous.
           | 
           | Human drivers don't have a "disengage" mode, other than
           | pulling off the road, when conditions are "suboptimal". AP
           | disengages. FSD disengages. And suboptimal conditions are,
           | surprise, surprise, the conditions in which more incidents
           | occur. So the whole "AP is safer because when it's safe to
            | use AP it's safer" circuitous reasoning, and every stat
           | Tesla publishes, while withholding as much information from
           | the public as it can (while touting their "corporate
           | transparency is a core value" spiel), should safely be thrown
           | in the trash.
        
             | [deleted]
        
           | gameswithgo wrote:
            | That data is not comparable: the Autopilot in question there
            | can only be activated in simple situations, like a straight
            | highway where you aren't changing lanes, while the human
            | data covers all scenarios.
           | 
           | Tesla is not being transparent enough with their data for us
           | to know anything.
        
             | slownews45 wrote:
              | No one has shown any credible data that the fatality rate
              | driving a Tesla is higher than anything. They go out of
              | their way to ignore cars that pretty clearly get into lots
              | of accidents.
              | 
              | My expectation would be that Subaru WRX / Infiniti Q50 /
              | Elantra GT type drivers are crashing more than folks using
              | Autopilot? Maybe ban them first?
              | 
              | Anyway, we should get some NHTSA data on this in a few
              | years, though there is some speculation that, given their
              | approach towards Tesla, they may delay it and stop
              | updating vehicle model statistics if it's favorable
              | towards Tesla.
        
               | semi-extrinsic wrote:
               | The burden of evidence is on Tesla to show their car is
               | significantly safer in an apples-to-apples comparison.
               | They haven't done that so far.
               | 
               | And this is actually quite intriguing: even if Tesla was
               | 10x safer, it would be extremely difficult for them to
               | prove it. Because they'd have to do a multi-year feature
               | freeze on Autopilot etc. while collecting data.
               | 
                | In total, Tesla has sold around 2 million cars.
                | Comparable cars have a fatal accident rate of around 10
                | per 1 million vehicle years. So the target for Tesla,
                | with the current number of cars, is to have fewer than 2
                | fatalities per year on average. To show with statistical
                | significance that you have achieved that, you would need
                | data for around 5 years without making any changes.
                | Maybe only 3 or 4 years given that the number of Teslas
                | keeps increasing, but still.
               | 
               | Really, it's only when you're doing 10 million cars per
               | year, like Volkswagen or Toyota, that you can make
               | statistically meaningful statements about safety while
               | also updating your car's software frequently.
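                | 
                | A minimal sketch of that argument in Python
                | (my assumptions: fatal accidents are Poisson,
                | the fleet is 2M cars, and the baseline is the
                | ~10 per 1M vehicle-years above, i.e. ~20
                | expected deaths/year; scipy gives the exact
                | Poisson interval):
                | 
                |   from scipy.stats import chi2
                | 
                |   rate = 2.0  # deaths/yr if truly 10x safer
                |   for years in range(1, 6):
                |       k = round(rate * years)  # expected count
                |       # exact (Garwood) 95% upper bound on rate
                |       hi = chi2.ppf(0.975, 2*(k + 1)) / 2 / years
                |       print(f"{years} yr: <= {hi:.1f} deaths/yr")
                | 
                | Beating the ~20/year baseline shows up almost
                | immediately, but pinning the rate down anywhere
                | near 2/year takes several years of data on
                | unchanged software.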
        
               | FireBeyond wrote:
               | That's one subset.
               | 
               | What people can credibly claim, because Tesla's stats are
               | entirely loaded and biased, is that any claim AP is safer
               | than humans is just that, a claim. Because AP will
               | (usually) disengage in situations where it won't do well,
               | something humans can't.
               | 
               | The closest you could come (maybe, who knows, because
                | Tesla hoards the raw data like Gollum, despite their
               | "commitment to corporate transparency") is:
               | 
               | "In optimal and safe driving conditions, AP may be safer
               | than human drivers, and in less optimal conditions, uh,
               | we don't know, because it turns off. And we won't count
               | those miles against AP."
               | 
               | Tesla should actually, if they have the courage of their
               | convictions, show how many miles are driven where AP/FSD
               | would refuse to engage.
        
               | Diederich wrote:
               | > Tesla's stats are entirely loaded and biased
               | 
               | Do you _know_ that, or do you _suspect_ that?
               | 
               | > how many miles are driven where AP/FSD would refuse to
               | engage
               | 
               | I've used FSD for the vast majority of the nearly 40k
               | miles on my 2017 Tesla model S. Highways, boulevards,
               | side streets, even dirt roads on occasion.
               | 
                | It's a powerful but subtle tool. Used correctly, I have
                | absolutely no doubt that it has made my driving
                | experience safer and less stressful.
               | 
               | It definitely requires an engaged, alert driver.
               | 
               | Where I suspect you and I agree is the impact of Musk's
               | language and claims around the technology.
               | 
               | If he would fully shut the hell up about it, I think it's
               | quite likely that there would be way less ill will toward
               | the product.
        
               | FireBeyond wrote:
                | It's known. The logical fallacy around "miles driven"
                | versus "total possible miles" is real. Tesla's
               | statisticians cannot possibly be unaware of this if they
               | have anything beyond a middle school education, yet they
               | repeatedly continue to tout this stat without so much as
               | a footnote.
               | 
               | I realize that may still, to some, be "suspect", not
               | "known", so yes, if you're asking, "Is there public or
               | leaked internal documentation saying 'we realize these
               | stats are misleading and don't care'", then no, there's
               | not.
        
               | czzr wrote:
               | Here's some credible data that your assumptions are
               | probably wrong:
               | 
               | https://medium.com/@MidwesternHedgi/teslas-driver-
               | fatality-r...
        
               | ricardobeat wrote:
               | That page is not credible at all.
               | 
               | IIHS reports are publicly available [1] and the numbers
               | there don't match what's in the article at all.
               | Large/very large luxury vehicles have an overall fatality
               | rate of 16-21, ranging from 0 to 60 depending on model.
               | The overall average for all vehicles is 36. The Audi A6
               | for example, is reported as having 0 fatalities in the
               | article, while in the report the actual number is 16.
               | 
               | The other source used, tesladeaths.com, lists a ton of
               | accidents where AP was not deemed responsible. It
               | actually states the real number if you pay attention - 10
               | confirmed deaths in the lifetime of Tesla as of 2021 -
               | yet somehow the article claims 11 deaths up to 2016.
               | 
               | [1] https://www.iihs.org/api/datastoredocument/status-
               | report/pdf...
        
               | gameswithgo wrote:
                | The comparison there is as bad as Elon's, in the other
                | direction. They are digging into every missing bit of
                | data and mistake for Tesla, in a different data set than
                | the one used for other brands. I wonder if you could
                | redo it now with official data.
        
         | dkdk8283 wrote:
         | Still need transportation...
        
           | gpvos wrote:
           | She's supposed to be able to drive herself, without
           | autopilot.
        
       | mrshadowgoose wrote:
       | "The driver said she was using Autopilot"
       | 
       | "she had actually had the same kind of collision on Autopilot
       | previously!"
       | 
       | I've got a hypothesis...and it has nothing to do with autopilot
       | being faulty.
        
         | taxcoder wrote:
          | Likely an error in the seat-to-steering-wheel connector - used
         | to see it all the time when I worked on cars.
        
       | iamleppert wrote:
       | Tesla does the bare minimum when it comes to software QA on their
       | autopilot. Looking at the codebase can tell you a few things:
       | 
        | - There are few, if any, unit tests.
        | - No regression tests, no integration tests.
        | - CI is "something to aspire to have one day".
        | - Uses thousands of open source dependencies, many of which are
        |   updated randomly based on if a developer needs or wants to.
        | - No code review process.
        | - Lots of commented-out code, dead code.
        | - No code linters or pre-commit process.
        | - About 5 different architectures, due in part to constant
        |   churn on the team.
       | 
        | Anyone who enables "Tesla Autopilot" would think twice if they
        | actually knew the quality of the code they are betting their
        | lives on. It's a dumpster fire.
        
       | trhway wrote:
        | An idea for a startup: a flat screen mounted on the back of your
        | car, plus a rear-looking AI. Once it recognizes, say, a Tesla
        | behind you, it displays on the screen an image known to be
        | successfully recognized by Teslas, so the car won't rear-end
        | you. Of course it isn't limited to Teslas. There is also a very
        | good business case for a subscription: as Tesla and the others
        | update their autonomous AI, users would need updates to the
        | anti-rear-ending software.
        
       | MaxBareiss wrote:
       | Correct me if this is what you meant by "talked with the NHTSA",
       | but another place you can submit information is to the NHTSA
       | Office of Defects Investigation (ODI):
       | https://www.nhtsa.gov/report-a-safety-problem#index , which is
       | the group that forces automakers to do recalls.
       | 
       | In my opinion (PhD student in vehicle safety) it doesn't sound
       | severe or novel enough for the NTSB to investigate. NTSB has done
       | good reports on a couple of similar Tesla Autopilot crashes
       | (https://data.ntsb.gov/Docket/?NTSBNumber=HWY18FH011
       | https://data.ntsb.gov/Docket/?NTSBNumber=HWY16FH018).
       | 
       |  _minor note: NHTSA is spoken as "nit-sah" so "NHTSA" is better
       | than "the NHTSA". NTSB is spoken as "N-T-S-B" and is fine._
        
       | modeless wrote:
       | Detecting stationary objects is a known issue with _every_ driver
        | assist system. No automaker's system guarantees that it will
       | stop for stationary objects. In fact they all explicitly state
       | that they may not. It's a known and disclosed and accepted risk
       | of these systems.
       | 
       | Ford: "may not detect stationary or slow moving vehicles"
       | https://www.fordservicecontent.com/Ford_Content/Catalog/owne...
       | 
       | Tesla: "may not recognize or detect oncoming vehicles, stationary
       | objects, and special-use lanes such as those used exclusively for
       | bikes, carpools, emergency vehicles, etc"
       | https://www.tesla.com/ownersmanual/model3/en_us/GUID-0535381...
       | 
       | GM: "may not detect and react to stopped or slow-moving vehicles
       | ahead of you"
       | https://my.chevrolet.com/content/dam/gmownercenter/gmna/dyna...
       | 
       | etc. You can find it in every brand's manual. NTSB and NHTSA knew
       | that this risk existed when they approved these systems. There is
       | nothing to investigate.
        
         | raylad wrote:
         | Yes, but there is usually an additional system like Toyota's
         | Safety Stop which WILL detect a stationary object and kick in
         | at the last minute to reduce speed by N MPH before collision.
         | 
         | AFAIK Tesla used to have that via its RADAR but because the new
         | production has no RADAR, new cars can't have it, and they
         | disabled it on the older cars too, in the interest of using a
         | single software version.
         | 
         | At least this is what I've gathered from reading here and
         | elsewhere.
        
           | modeless wrote:
           | Radar is useless for detecting stationary objects because of
           | far too many false positives. Radar is the reason why these
           | warnings exist. It is not the solution.
           | 
           | Teslas continue to have forward collision warning and
           | automatic emergency braking as standard features whether or
           | not they have radar. But again, they do not detect 100% of
           | all stationary objects and neither does any other automaker's
           | system.
        
             | raylad wrote:
             | These systems do work and have worked for years. They are
             | also regularly tested by IIHS.
             | 
             | See, for example:
             | https://www.youtube.com/watch?v=TJgUiZgX5rE
             | 
             | And Teslas at least used to do this too:
             | 
             | https://www.youtube.com/watch?v=aJJfn2tO5fo
             | 
             | If they no longer stop reliably to avoid hitting stationary
             | objects, this is a serious problem that needs to be
             | corrected.
        
               | modeless wrote:
               | They absolutely _work_ but they are not _reliable_. Not
               | on _any_ car. They are a safety system of last resort.
        
         | conductr wrote:
         | > It's a known and disclosed and accepted risk of these
         | systems.
         | 
         | Accepted by who? As another driver getting rear-ended by these
         | vehicles, I didn't agree to that. It's just a way for them to
         | shift liability to the driver for what they know is going to
         | happen.
         | 
          | This technology shouldn't be allowed on the road unless it can
          | detect and avoid impact with parked cars. That should be the
          | price of admission for the technology.
        
           | modeless wrote:
           | Accepted by NTSB and NHTSA.
        
       | mwint wrote:
       | Have your insurance company harass Tesla. Chances are they'll
       | come back with a report saying "driver depressing accelerator
       | 13.42deg at time of incident overriding AEB and ignoring FCW for
       | 5 seconds". That's usually how these kinds of stories end.
        
         | xtracto wrote:
          | That was my thought as well. Over here, if a car rear-ends
          | you, both drivers stop, call their respective insurance
          | companies, wait until a representative arrives and let them
          | deal with the consequences.
          | 
          | As individuals we don't have bargaining power against stupid
          | automotive choices. But insurance companies have every
          | interest in making sure somebody else pays.
        
         | fmakunbound wrote:
          | I wonder if we'll see a difference in liability insurance
          | rates for Tesla owners vs. the others. If they're truly
          | road-bound, OTA-distracted, undirected electric missiles, I'm
          | sure we'll see a difference there before we hear back from
          | NTSB.
        
         | pdonis wrote:
         | The OP's insurance company should be asking the other driver's
         | insurance company to cover the loss. Then the other driver's
         | insurance company should be harassing Tesla if it believes
         | Tesla was responsible.
        
           | soheil wrote:
            | It's probably safe to assume that if she were going to do
            | that, she would have done it the first time she got into
            | this type of high-speed accident.
        
           | mwint wrote:
           | OP's insurance company can harass Tesla directly if they
           | believe Tesla holds evidence of the true cause of the crash,
           | which they probably do.
        
             | dragonwriter wrote:
             | But neither is going to go after Tesla, because insurance
             | companies aren't about ferocious prosecution of potentially
             | valid legal claims, or manifesting customer outrage, but
             | getting things resolved in a final manner as quickly as
             | possible with as little out of their pocket as possible.
             | 
             | To go after Tesla for defective FSD over any individual
             | accident takes a litigant who is more concerned with making
             | a point and/or harming Tesla than cost effectiveness.
        
         | raylad wrote:
         | I doubt they will do anything.
         | 
         | They have a process: claims adjuster looks at the police
         | report, looks at the wreckage, proposes a payment, closes the
         | file. Anything else would be extra work on their part and not
         | required or probably even encouraged.
        
         | FireBeyond wrote:
         | Oh, you assume Tesla won't fight to the death to avoid
         | releasing data recorder information. They will.
         | 
         | Unless it "absolves" Tesla.
         | 
         | Remember, this is the company that when someone died, put out
         | actual PRESS RELEASES to say "Not the car's fault. In fact it
         | warned him before the collision that he was inattentive."
         | 
         | They neglected to mention it triggered ONE steering wheel
         | warning... FOURTEEN MINUTES before the collision.
         | 
         | Even your example is problematic. "FSD/AP isn't at fault/didn't
         | cause the collision/near miss, because it wasn't engaged..."
         | 
         | ... because the driver had to take emergency action to avert
         | FSD/AP's behavior.
         | 
         | They got taken to task for that, when investigatory boards
         | started asking "just how long before that collision was FSD/AP
         | disengaged, and was it disengaged by hard manual braking,
         | etc.?"
        
           | filoleg wrote:
           | > Oh, you assume Tesla won't fight to the death to avoid
           | releasing data recorder information. They will.
           | 
           | Last time this happened in a similar situation, the driver
           | went to a third-party technician that was able to extract the
           | log data, and it proved the same thing that Tesla was
           | claiming. The driver was full-on pressing accelerator instead
           | of the braking pedal.[0]
           | 
           | Raw log data is present in that link, so it isn't just
           | another "he said one thing, they said another thing"
            | situation. But the driver in question was indeed fighting to
            | the death claiming their innocence, even though the initial
            | accident was already raising the eyebrows of most people
            | familiar with Autopilot.
           | 
           | 0. https://insideevs.com/news/496305/valet-crashes-tesla-
           | data-r...
        
         | paxys wrote:
         | Insurance companies aren't going to bother investigating
         | autopilot until this issue gets a lot more widespread. Rear-end
         | accidents happen thousands of times a day.
        
       | AlexandrB wrote:
       | I think the comments section here neatly summarizes how liability
       | for any self-driving bugs will be foisted onto individual
       | drivers.
       | 
       | This was once a big question mark for me - who would be liable in
       | a self-driving crash, the driver or the manufacturer? Apparently
        | individuals _want_ to be liable. It's not clear whether this is
       | the result of marketing or ideology.
        
         | krisoft wrote:
          | It is not a self-driving car unless you can sleep in it while
          | it takes you where you need to go, in my opinion. :)
          | 
          | Take a real self-driving car, for example: a Waymo taxi you
          | are riding in rear-ends another car. Do you think you will be
          | responsible for it?
        
         | lvs wrote:
          | Consumer protection is weak in the US. In numerous cases of
          | faulty products and negligent manufacturers, the failure of
          | regulatory agencies to intervene eventually escalates into
          | class actions by victims, owners, or state AGs against
          | manufacturers -- with variable outcomes. We've seen it with
          | opioids, cigarettes, guns, diet pills, etc.
        
       | sidibe wrote:
        | I have a feeling a lot of stuff is being swept under the rug at
        | this stage. Despite how many near misses are on YouTube from a
        | handful of beta testers, the only accidents I've seen are rims
        | being destroyed by curbing. From the amount of dicey things
        | it's doing, I find it hard to believe there are no accidents,
        | especially before last week's rollback of 10.3, when a lot of
        | people reported very dangerous behavior as soon as it rolled
        | out.
        
         | mtoner23 wrote:
          | Wait till you see the number of car accidents from
          | non-autopilot vehicles! You'd want to ban cars altogether if
          | a YouTube video were posted for every accident.
        
           | sidibe wrote:
           | Humans are pretty good drivers compared to FSD beta. If you
           | think that's not true after trying it or watching some videos
           | I would appreciate it if you tore up your license and used
           | Uber
        
             | mbushey wrote:
              | > Humans are pretty good drivers compared to FSD beta.
              | 
              | Lol! You haven't driven in Montana, have you? What I see
              | in the latest YouTube videos is an order of magnitude
              | better than the average driver here.
        
           | soheil wrote:
            | So sad that the average non-Tesla driver is not enthusiastic
            | enough about the latest technology to broadcast their
            | driving journeys on YouTube for the rest of the world to
            | criticize, and to make fun of the shady dumpster fire of a
            | manufacturer that made their car, when they get into a close
            | call.
        
           | ashtonkem wrote:
           | Come on, that's blatant whataboutism. Surely we're good
           | enough here to realize that we can hold both Tesla _and_
           | other drivers to a higher standard?
           | 
           | Yes, people are often bad drivers. But it doesn't follow that
           | we must ignore the faults of any self driving technology just
           | because regular drivers crash too. If Tesla is pushing out
           | updates that make their own super fans comment on the cars
           | doing unsafe and erratic things, that's something we should
           | look into.
        
             | jjk166 wrote:
             | No, we can't. Accidents are going to happen no matter what
             | because driving is a complex task with an infinite number
             | of potential failure modes, and no threshold where the risk
             | of serious injury or death can be considered zero. If
             | something is safer than average, then it is objectively an
             | improvement, and that's all it needs to be.
        
               | ashtonkem wrote:
               | > If something is safer than average, then it is
               | objectively an improvement, and that's all it needs to
               | be.
               | 
               | GP is asserting that FSD is significantly less safe than
               | previously known, and only driver intervention is
               | preventing FSD from killing more people. Personally I've
               | heard way too many "FSD regularly tries to drive into
                | that bridge pillar" stories to believe an assertion that
               | FSD is safer than drivers without significant evidence.
               | 
               | Second, we're talking about a case where even Tesla
               | owners acknowledge that an update made their cars
               | noticeably less safe. Even in cases where FSD is better
               | than human drivers, which I don't think is the case yet,
               | we should be quite concerned about the possibility of
               | software updates making vehicles less safe.
        
               | jjk166 wrote:
               | My comment was made in response to GP's claim that it is
               | whataboutism to compare the safety record of FSD to human
               | drivers. It is entirely possible that FSD actually is
               | less safe than drivers, and if so let the evidence show
               | it. But the fact remains that you should very much be
               | comparing the safety of FSD to human driving.
        
             | NickM wrote:
             | I don't think that's an example of whataboutism, given that
             | one of the stated goals of autopilot is literally to
             | provide a safer alternative to the status quo of humans
             | driving manually; pointing out the riskiness of driving in
             | general is not some unrelated thing that's being raised as
             | a distraction from the real issue.
        
       | [deleted]
        
       | soheil wrote:
       | I'm not sure what you think Autopilot is. It's basically adaptive
        | cruise control, which almost all new cars have. As shocking as
        | this may sound, the onus is on the driver to avoid a high-speed
        | collision when approaching a stopped car at a red light. The
        | driver should push the brake pedal.
       | 
       | Just because it was a Tesla somehow this is a problem that NTSB
       | and NHTSA need to investigate? But if it was a Honda Civic it's
       | just an accident?
        
       | huslage wrote:
       | She is still the driver. Blaming things on Autopilot is not going
       | to get anyone very far in the legal sense.
        
         | JshWright wrote:
         | Tesla seems to want it both ways... Sell their cars by
         | marketing how magical the autopilot is, then blaming drivers
          | when they believe the marketing hype.
        
           | FireBeyond wrote:
           | Yup. Tesla will happily put out press releases saying
           | "autopilot is not at fault - the vehicle warned the driver
           | prior to the collision to put hands on the steering wheel"...
           | 
           | One warning. Fourteen minutes prior to the incident. That
           | part wasn't in the press release.
           | 
           | The Summon feature is the same.
           | 
           | Marketing copy: "Use Summon in a car park to bring your
           | vehicle to you while dealing with a fussy child" (literal
           | quote).
           | 
           | Disclaimer: "Do not use Summon while distracted."
           | 
           | There's apparently a giant Chinese wall between Marketing and
           | Legal at Tesla, because it's far from the only example.
           | Another, that's still present in some Tesla videos, and has
           | been there for years:
           | 
           | > The driver is only in the seat for legal reasons. The car
           | is driving itself.
        
           | buffington wrote:
           | I only have my own experience to draw upon, so I may be an
           | outlier, but I didn't buy a Tesla because of its Autopilot
           | feature. If they pulled the feature today, I'd be totally
           | fine with that.
           | 
           | As a user of Autopilot, it's absolutely insane to me that
           | anyone would blame Autopilot for a wreck. It's like a "smart"
           | cruise control, except unlike cruise control, it gives you
           | all sorts of warnings about your role as the driver and will
            | shut itself off if it thinks you're not paying attention.
            | Anyone blaming Autopilot for a Tesla wreck is either trying
            | to sensationalize, or is just completely inept or lying.
        
             | yupper32 wrote:
             | > I didn't buy a Tesla because of its Autopilot feature. If
             | they pulled the feature today, I'd be totally fine with
             | that.
             | 
              | Many, many people paid $1000+ for the promise of Full Self-
             | Driving that doesn't exist. People definitely care about
             | the Autopilot feature a lot more than you.
             | 
             | > It's like a "smart" cruise control
             | 
             | Except they call it Autopilot! You can't call something
             | Autopilot and then blame people for expecting the car to
             | drive itself.
             | 
             | Call it lane assist or cruise control plus or something.
        
               | [deleted]
        
       | lnanek2 wrote:
       | Maybe it is being investigated and you don't know. For both
       | drivers, your responsibility begins and ends with getting a
        | police report done and informing the insurance companies; then
       | they'll fight it out. If they think they can go after Tesla,
       | instead of one of them paying, they will. You aren't going to
       | hear about it regardless.
        
       | veltas wrote:
       | Recently I was driving behind a Tesla and it suddenly started
       | braking for no reason at all, I almost rear-ended it. I think it
       | had something to do with the sun reflecting on the road surface
       | brightly because I noticed this reflection just as I was driving
       | past where the car suddenly started braking.
        
       | kf6nux wrote:
       | You might be able to seek a criminal complaint (not that I
       | recommend it). You can call your local DA's office to see what
       | they think. It's unlikely they'd want to take on Tesla, but a CA
       | DA might go after the driver for criminal negligence.
        
         | tacobelllover99 wrote:
         | Take on Tesla? The driver is responsible at all times. If AP
          | was failing to stop, the driver should have been paying
          | attention and taken over.
        
       | dalys wrote:
       | My VW Passat was on ACC and LKAS just a week ago when it suddenly
       | stopped detecting the car in front and was about to drive into
       | it. This was also at a red light. As I was actually paying
          | attention, I just hit the brakes manually. Technology isn't
       | perfect. This happens to all cars.
        
         | a-dub wrote:
         | i recently rented a late model rav4 and played around with the
         | lane keeping and radar cruise control.
         | 
         | it kinda drove like a drunk lead footed teenager. it was cool
         | conceptually, but it didn't seem to work very well and it was a
         | rough ride of heavy acceleration and deceleration with
         | occasional odd steering choices. it also had these lights on
         | the mirrors that would light up when someone was in the blind
         | spots, which ultimately were kind of annoying/distracting.
         | 
         | it felt like maybe the ui for these features, which seemed to
         | push you towards setting adaptive cruise and lane keeping, and
         | then letting the car do the rest, wasn't really in line with
         | the actual capabilities of the system. although, maybe i just
         | wasn't used to it. it would certainly do scary things like
         | accelerate harshly upon cars ahead when the road was curved or
         | had small hills that temporarily obscured direct line of sight
         | to them.
        
       | floatingatoll wrote:
        | If you're in California, you could file an OL-316, since a
        | citizen could reasonably think that this is an example of a
        | "traffic collision involving an autonomous vehicle", and Tesla is
       | a licensed member of that program. At worst, they'll refuse the
       | report for some reason or another.
        
       | op00to wrote:
       | You should delete this post, stop talking to people, and get a
       | really good lawyer because you are gonna get a little bit of
       | Elon's billions!
        
       | szundi wrote:
       | Just imagine that you're a person who constantly lies about
       | anything just to feel better. You have a Tesla. You made a
        | mistake and hit this car. You get out; what do you say?
       | 
       | That's what I thought.
       | 
       | Obviously this is just one possible scenario, one of the others
       | is OP's interpretation that the lady was telling the truth.
        
         | raylad wrote:
         | My interpretation is that it may or may not be what happened,
         | and should be investigated.
        
       | tacobelllover99 wrote:
       | Yeah it seems like a serious problem that the driver wasn't
       | paying attention for sure.
       | 
        | My money is on OP being a short-seller.
        
       | AnimalMuppet wrote:
       | If this keeps happening, it seems like a logical consequence
       | would be for Teslas to become more expensive to insure.
        
         | zarkov99 wrote:
          | Nah, if AP on were indeed more dangerous than AP off, then
          | Tesla would just shut it down over the air.
        
           | klyrs wrote:
           | I'd agree if you say "less profitable" instead of "more
           | dangerous."
        
       | kevin_thibedeau wrote:
       | Sue Tesla. It's their responsibility for this garbage. You will
       | get all the incriminating data during discovery.
        
         | slownews45 wrote:
          | That's not how it works. You sue the driver; if they have
          | insurance, the insurance company steps in, and after paying
          | you out the insurance company can go after Tesla.
        
           | dragonwriter wrote:
            | > That's not how it works. You sue the driver; if they have
            | insurance, the insurance company steps in, and after paying
            | you out the insurance company can go after Tesla.
           | 
           | That's...not how it works.
           | 
           | Anyone injured as a result of a defective product has a claim
           | against the manufacturer and anyone else in the chain of
           | commerce. You don't have to sue someone else and let them sue
           | the manufacturer.
           | 
            | (You _can_, but you don't have to, and it's a complicated
            | tactical consideration as to whether it is optimal in any
            | particular case, further complicated by the fact that you can
            | also sue _everyone who might be liable at once_, instead of
           | picking and choosing who is best to sue.)
        
           | kevin_thibedeau wrote:
           | If that's not how it works we'd still have Pintos and
           | Corvairs on the streets. Companies can be held responsible
            | for defective products no matter how much they want to shift
           | the blame to someone else.
        
           | HWR_14 wrote:
           | That's best if your goal is to get your car fixed and
           | (hopefully none) medical bills paid. That's not likely to
           | result in Tesla paying anything and certainly not in Tesla
           | changing behavior.
           | 
           | If you want to go for the giant settlement from Tesla, that's
           | a different lawsuit, much higher-risk, much more expensive
           | and much higher reward (financially, media attention and
           | changing behavior).
        
         | trutannus wrote:
         | Sue a billion dollar company? I'm not sure that's a solution
         | for the vast majority of people. Fighting discovery is
         | generally step one in the process of dealing with this sort of
         | lawsuit. Something like this could/will take years, and
         | hundreds of thousands of dollars to deal with.
         | 
          | It would likely be cheaper to buy a whole new car every year
         | for five years than to take Tesla to court.
        
           | bolasanibk wrote:
            | Nit: $1 [t]rillion as of today.
        
         | gameswithgo wrote:
         | Or you may find out the driver of the Tesla was lying.
        
         | [deleted]
        
       | soheil wrote:
       | > The driver said she was using Autopilot, that an OTA update had
       | had a problem earlier in the day, and that she had actually had
       | the same kind of collision on Autopilot previously!
       | 
        | She had the same type of high-speed collision in the past and
        | kept relying on Autopilot so heavily that it caused yet another
        | collision? I mean, is the average number of high-speed
        | collisions for Tesla drivers close to 2? Or is it just this
        | driver?
        
       | INTPenis wrote:
        | It's not actually that much different from a negligent driver.
        | The driver was negligent with the OTA update, and the driver was
        | negligent enough to trust Autopilot in traffic.
        | 
        | Tesla should of course improve its Autopilot technology, but I
        | don't see how they're responsible for the crash.
        
         | i_am_proteus wrote:
          | A common reason people (me included) fault Tesla is that their
          | marketing is designed to instill more trust in the Autopilot
          | feature than the user ought to have given that feature's
          | maturity, while cleverly dancing around the truth to avoid
          | legal liability, which can instead be placed on "negligent
          | drivers."
        
           | INTPenis wrote:
            | My perspective is skewed because I'm in Sweden, but in
            | real-world situations I've seen more good from adaptive
            | cruise control than from Tesla Autopilot. Tesla Autopilot
            | is mostly something I see hyped on HN or Reddit.
           | 
           | But in real life, adaptive cruise control is being used all
           | the time, and no good driver that I know trusts it with their
           | life or their insurance.
        
           | buffington wrote:
           | I own a Tesla, and I'm not a Tesla apologist. There's tons of
           | stuff that is just wrong about that car (the UI for example,
           | is a dumpster fire).
           | 
           | That said, I recall hearing some stuff in the early Tesla
           | days about how the cars could drive themselves. Summoning,
           | auto parking, and some hints at what we now know as Full Self
           | Driving.
           | 
           | Aside from that, I don't recall much marketing hype around
           | Autopilot. It has a bullet point on the Model 3 website, and
           | some details on the naming of related features and what they
           | do, but that's about it. None of it seems like "hype".
           | 
           | Here's the full description of Autopilot:
           | 
           | > Autopilot enables your car to steer, accelerate and brake
           | automatically within its lane.
           | 
           | That's it.
           | 
           | In the car's user manual it makes it very clear what the
           | feature does and what your responsibilities are as a driver.
           | They don't make it out as some sort of magical feature.
        
             | shmoe wrote:
             | Someone needs to tell this to the idiot at my mother's
             | office that praises autopilot because he can answer all his
             | emails during his commutes.
        
             | i_am_proteus wrote:
             | Things like Elon Musk (he is the Tesla CEO) making
             | statements about the self-driving capabilities of the
             | vehicle come to mind: https://www.wired.com/2016/10/elon-
             | musk-says-every-new-tesla...
             | 
             | You're correct about what's in the fine print. But this
             | Musk fellow is effectively Tesla's salesman in chief, and
             | has quite a following on the Internet.
        
             | heavyset_go wrote:
             | For reference, Tesla's marketing[1] is clear about what
             | they mean when they say "Autopilot" and "Full Self-
             | Driving":
             | 
             | > _The person in the driver's seat is only there for legal
             | reasons. He is not doing anything. The car is driving
             | itself._
             | 
             | [1] https://www.tesla.com/videos/autopilot-self-driving-
             | hardware...
        
       | [deleted]
        
       | revscat wrote:
       | Why do you think she was telling the truth? People's instincts in
       | such situations are for self-preservation, and shifting blame
       | elsewhere is common.
        
       | buffington wrote:
       | I drive a Tesla. If my car rear ends another car, that's my
       | fault. Doesn't matter if I had Autopilot on or not.
       | 
        | Imagine if you were rear-ended by a Toyota and the driver said
        | "it was on cruise control, _shrug_." Would you talk to the
        | NTSB or NHTSA about that? Probably not.
       | 
       | The only scenario I can imagine that'd warrant investigation is
       | if the driver were completely unable to override Autopilot.
        | Similar investigations were started by the NTSB when Toyota
        | had issues with its cruise control systems several years ago.
        
         | cinntaile wrote:
         | The big black box makes a mistake and you voluntarily accept
         | the blame? I really don't get this train of thought. I know
          | that this is legally correct, but at some point along the
          | advancement of self-driving technology we'll have to stop
          | blaming ourselves and legally shift the blame to whoever
          | created the self-driving tech instead.
         | 
         | Edit: Updated wording to make it less confusing. From system to
         | whoever created the self driving tech.
        
           | hasdf wrote:
            | When you turn on Autopilot it tells you to keep your hands
            | on the wheel, and if it senses you are distracted it will
            | turn itself off unless you wiggle the wheel a little to
            | show you are paying attention. So at least as far as
            | Autopilot is concerned, I would say the driver is at fault
            | if they run into another car.
        
           | RichardHesketh wrote:
           | Yes, and that point is most likely SAE level 5, as shown here
           | (https://www.nhtsa.gov/technology-innovation/automated-
           | vehicl...). We are some way away from that - just how far
           | depends on whose hype you listen to.
           | 
           | I drive a Tesla Model 3, but I have sufficient grey matter to
           | understand that AutoPilot, in spite of its name, is SAE level
           | 2 (see above reference). If the car is involved in an impact
           | either something hit me, or I hit something - no blame
           | attaches to the computer, because I'm still in charge. Given
           | the current state of the art, I'm perfectly happy with that
           | and bloody livid at anybody who is dumb enough to buy/lease
           | and drive one of these amazing machines without being
           | constantly and totally aware of their limitations.
        
             | lamontcg wrote:
             | > Yes, and that point is most likely SAE level 5
             | 
             | Hard disagree.
             | 
             | This is the same as identity theft, where it becomes your
             | responsibility instead of a failure on the part of the bank
             | to protect your account, and the burden gets shifted
             | entirely onto the consumer.
             | 
              | Relevant Mitchell and Webb:
             | https://www.youtube.com/watch?v=CS9ptA3Ya9E
             | 
             | If Tesla sells a SAE Level 2 lane following / accident
             | avoidance feature which should be able to stop and avoid
             | rear end collisions, yet it causes rear end collisions,
             | they must be liable for that failure. They can't just write
             | words and shift all liability onto the consumer for the
             | software failing to work properly. Nice try though.
             | 
             | And I don't care if the consumer technically agreed to
             | that. If someone smacks into me with Autopilot enabled, I
             | never consented to that technology and didn't give up my
             | rights to sue Tesla.
        
             | ryantgtg wrote:
              | I saw an old dude apparently asleep behind the wheel of a
              | Tesla while on the freeway. I immediately feared for my
              | family members' lives, and I had to speed up to gain
              | distance from this fellow.
        
           | movedx wrote:
            | Because as the driver you're responsible for the tool
            | you're using to drive yourself to your destination. That
            | means you're responsible for whether or not you use
            | Autopilot, do 200 km/h, take your hands off the steering
            | wheel, turn your lights on at night, and everything else
            | associated with the operation of that vehicle.
           | 
           | > ... at some point in time along the technological
           | advancement of self driving technology we'll have to stop
           | blaming ourselves.
           | 
           | No we don't. We built it, so we're responsible for it and the
           | consequences associated with its operation, regardless of how
           | well it's engineered or how smart we think we are.
        
             | cinntaile wrote:
             | > We built it, so we're responsible for it and the
             | consequences associated with its operation, regardless of
             | how well it's engineered or how smart we think we are.
             | 
             | That's my point. The company that built the self driving
             | technology should be held responsible. If you're not
             | driving you're not responsible, that's how it works when
             | somebody else is driving your car as well so why would it
             | be any different if that somebody else is code? It seems
             | like a good way to align the incentives to create a safe
              | system. You could claim that at this point in time you
              | still have to pay attention and take control when
              | necessary; the issue I have with that argument is that
              | research already shows the context switch results in a
              | delay too long for taking over to be useful in practice.
        
             | polishdude20 wrote:
              | Yeah, and honestly the chain of responsibility is driver
              | first, then Tesla. What's important is that the victim
              | gets their payout. It's the driver's responsibility to
              | provide that payout / get dinged on insurance. If the
              | driver then wants to complain to Tesla for damages,
              | that's totally fair. But the victim's compensation should
              | come first from the driver. The driver can't say "but muh
              | autopilot" and shed all blame.
        
             | ipaddr wrote:
              | The question is: is the driver responsible? Should it be
              | the owner? Should it be the company that made the failing
              | product?
              | 
              | If your toaster blows up and kills someone, there's a
              | solid case against the maker of the toaster.
        
             | Someone wrote:
             | Yes, and no. With other dangerous tools, society decided
             | manufacturers have responsibilities, too.
             | 
             | If you buy a chainsaw, you can trust it has certain safety
             | features. If you buy a children's toy, you can trust it
             | doesn't use lead paint, etc.
             | 
             | Those aren't features you, as a consumer, have to
             | explicitly test for before using them.
             | 
             | Similarly, I think society will demand that cars behave as
             | cars, and not as projectiles. If a manufacturer claims a
             | car has auto-pilot, but that auto-pilot has a tendency to
             | rear end other cars, the manufacturer should carry at least
             | some blame.
             | 
             | I think that should be true even if they explicitly mention
             | that problem in the instruction manual. Certainly in the
             | EU, whatever you buy, you can assume it is "fit for its
             | purpose".
        
               | syshum wrote:
                | The problem here is Tesla does not claim that the car
                | is autonomous; in fact they clearly state otherwise.
                | Nor is it legal for any driver to operate the car as if
                | it were autonomous.
                | 
                | To continue your analogy, chainsaw manufacturers
                | include clear instructions and directions to use safety
                | PPE. If an operator of the chainsaw fails to follow the
                | instructions, or doesn't wear the PPE, and chops their
                | leg off, the manufacturer is not liable. Hell, even if
                | they did follow the instructions and chopped their leg
                | off, it would be unlikely the manufacturer would be
                | liable.
        
               | cinntaile wrote:
                | Let's take the analogy one step further. Imagine we
                | have an Autopilot chainsaw: you just have to touch it
                | every once in a while to tell it to keep sawing. Then
                | suddenly it starts sawing in a completely unexpected
                | way and causes an accident. Are you at fault because
                | you have to, in theory, stay in control at all times?
                | Even though in practice you relinquish control, and
                | humans don't have the ability to context-switch without
                | a delay? The issue would not have occurred if the
                | chainsaw hadn't started behaving in an unexpected way,
                | but it also would not have occurred if you hadn't used
                | the Autopilot function.
        
             | mattnewton wrote:
             | I agree, but if you bought a tool called "autopilot" and it
             | piloted you into another car, there is something wrong, no?
             | Maybe not the NHTSA, but it seems like someone should be
             | tallying that.
        
               | ryantgtg wrote:
               | Has Tesla ever said anything like: the "auto" in this
               | case means "automobile" and not "automatic"?
               | 
               | But yeah, they shouldn't call it that.
        
             | Brian_K_White wrote:
             | Incorrect. You are responsible for using the tool
             | responsibly and according to any applicable directions and
             | best practices.
        
             | dafoex wrote:
             | I didn't build it, Tesla did, and I personally think the
              | company that signed off on "touchscreen-controlled
              | windscreen wipers" is the one with the most liability in
              | autopilot
             | failures.
        
               | [deleted]
        
               | dafoex wrote:
               | Context: https://www.bbc.co.uk/news/technology-53666222
        
         | amelius wrote:
         | > I drive a Tesla. If my car rear ends another car, that's my
         | fault. Doesn't matter if I had Autopilot on or not.
         | 
         | You can have a similar argument for: if I'm drunk while driving
          | that's _my_ choice; if I don't cause any accidents, it should
         | be none of anybody's business that I'm drunk.
         | 
         | You see, it doesn't work like that.
         | 
         | If Autopilot poses a significant risk to other road users, then
         | an investigation is warranted.
        
           | moate wrote:
           | No, you're right, none of what you're talking about works
           | like that.
           | 
           | There was a traffic accident. There are a variety of
           | "automated" driving features in modern cars: Auto-park,
           | cruise control, auto-pilot. Any one of these features could,
           | under "less than ideal" circumstances, cause an accident that
           | doesn't warrant contacting national authorities. Well before
           | any regulatory body is going to do anything, private
           | insurance would. They're going to investigate in an effort to
            | determine fault and facts. Was autopilot to blame, or did
            | the user spill a cup of coffee in the passenger's seat,
            | causing them to take their eyes off the road, etc.?
           | 
           | The idea that a national regulatory body is going to start
           | looking into a single car crash seems great until you start
           | thinking about the expense that would create for the
           | taxpayers. Bureaucracy just isn't built to be agile, for
           | better or worse.
           | 
           | Similarly, if you drive drunk and don't cause an accident,
           | nobody will know. This doesn't make it legal, and nobody is
           | trying to argue a point even tangentially similar to that.
           | This is a straw man, and a rather obvious one. There is no
           | national group that would ever investigate a drunk driving
           | crash (assuming that was the only relevant information in the
           | case). That's a local law enforcement issue.
           | 
           | TL;dr- The feds don't care about a sample size of 1 fender
           | bender with no fatalities.
        
             | Brian_K_White wrote:
             | The class of problem called driving while incapacitated was
             | studied and special regulatory attention was applied to it.
             | 
              | An automated driving system that can offer to keep
              | functioning while impaired, without the operator even
              | being informed that there is any problem, is a different
              | problem from an operator who neglected to press the brake
              | pedal.
             | The brake pedal itself and the rest of the braking system
             | is absolutely required to meet a whole bunch of fitness
             | standards.
        
           | zamadatix wrote:
           | An NTSB investigation is a separate thing from an accident
           | investigation for fault or insurance purposes. The NTSB does
           | not investigate a drunk driver, that does not mean the drunk
           | driver would be free of fault or charges.
        
             | MerelyMortal wrote:
             | A more apt comparison would be if OP was hit by a drunk
             | driver, and there were no laws against drunk driving. It
             | would be appropriate to ask the government to investigate
             | whether or not drunk driving is safe, because it could
             | happen to many more people.
        
               | situationista wrote:
               | Except that both Tesla and the law make it clear that in
               | a Level 2 system the driver is responsible at all times,
               | regardless of whether Autopilot is engaged or not.
        
               | riffraff wrote:
                | To stretch the metaphor further, I don't think it's
                | fair to say Tesla makes it clear. Or rather, it's not
                | the whole story.
                | 
                | People would be less upset with the drunk driver and
                | more with the beer manufacturer if they had been
                | imbibing "alcohol-free beer" which actually was not
                | alcohol-free.
        
           | asdfsd234234444 wrote:
           | lol, what
        
           | wyldfire wrote:
            | > If Autopilot poses a significant risk to other road
            | users, then an investigation is warranted.
           | 
           | You could fault their methods but NTSB is doing just that:
           | waiting for a trend to emerge that would warrant such an
           | investigation.
           | 
           | I'm not suggesting that Tesla's design is not the cause but
           | if the driver were lying in order to shift blame, then NTSB
           | would end up wasting their resources investigating it.
        
           | ajross wrote:
           | > If Autopilot poses a significant risk to other road users,
           | then an investigation is warranted.
           | 
           | Except there's no evidence that it does. I drive a Tesla.
           | I've seen AP brake hard in circumstances where a distracted
           | driver would have failed to notice. The system works. It does
            | not work _perfectly_. But all it has to do to make that
            | "if" clause false is work better than drivers do, and
            | that's a very low bar.
           | 
           | Tesla themselves publish data on this, and cars get in fewer
            | accidents with AP engaged. (So then the unskewers jump in
            | and argue about how that's not an apples-to-apples
            | comparison, and we all end
           | up in circles of nuance and no one changes their opinion. But
           | it doesn't matter what opinions are because facts are all
           | that matter here, and what facts exist say AP is safer than
           | driving.)
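            | 
            | For concreteness, the confound being argued about looks
            | like this in toy numbers (all invented, purely
            | illustrative -- not Tesla's actual data). If AP miles skew
            | toward highways, AP can look safer overall even if it is
            | no better than humans on any given road type:
            | 
            |     # Assumed crashes per million miles, identical for
            |     # AP and humans on each road type (invented numbers).
            |     highway_rate, city_rate = 1.0, 4.0
            | 
            |     # Assume AP miles are 90% highway; human miles 50%.
            |     ap = 0.9 * highway_rate + 0.1 * city_rate      # 1.3
            |     human = 0.5 * highway_rate + 0.5 * city_rate   # 2.5
            |     print(ap, human)  # AP looks ~2x safer overall,
            |                       # purely from the road mix.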
        
         | pempem wrote:
         | This is so tremendously _NOT_ a case of personal
          | responsibility. A large portion of our economy relies on the
          | belief that "you said it would do x, it must do x".
         | 
         | You bought a car, it promised functionality, it did not deliver
         | and endangered another human/their property.
         | 
         | This is the fault of the manufacturer.
         | 
         | Here are some examples of non-autonomous driving cases where
         | the manufacturer made a false promise: Volvo lawsuit:
         | https://www.motorbiscuit.com/volvo-owners-seeking-class-acti...
         | 
         | Toyota recall after lawsuit https://www.industryweek.com/the-
         | economy/article/21959153/to...
         | 
         | Chevy lawsuit: https://topclassactions.com/lawsuit-
         | settlements/consumer-pro...
         | 
         | It is my sincere hope that we can enjoy Elon's persona without
         | just letting Tesla off the hook of being a car company. Or
         | really, a company.
        
         | jdavis703 wrote:
         | Based on videos and observed driving behavior, it seems the
         | marketing and documentation for Autopilot is ineffective at
         | communicating to drivers that Autopilot is basically advanced
         | cruise control. If this is correct, it represents a systemic
         | issue that should be investigated by the NTSB or NHTSA.
        
           | soheil wrote:
           | The moment you use Autopilot it's evident that it's basically
            | a fancy cruise control. You're assuming some people would
            | never intervene with Autopilot in any scenario. To think
            | people with so little common sense exist is odd. It shows a
            | massive disregard for those people's ability to perform
            | basic functions.
        
           | cma wrote:
           | https://tesla.com/autopilot
           | 
           | In the video there it says:
           | 
           | "THE PERSON IN THE SEAT IS ONLY THERE FOR LEGAL REASONS.
           | 
           | HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF."
           | 
            | Total lie, unless they are psychopaths and would be willing
            | to run it with no one in the seat if not for legal reasons.
            | In other words, unless they were willing to murder if
            | murder were legal. The video is from 2018 or maybe even
            | earlier, when we know they were nowhere near ready for
            | driverless operation.
        
         | EastOfTruth wrote:
         | > I drive a Tesla. If my car rear ends another car, that's my
         | fault. Doesn't matter if I had Autopilot on or not.
         | 
          | It is supposed to be super-self-driving... that has to count
          | for something.
        
         | jsight wrote:
          | I think there is a good argument that now is the time for a
          | liability shift for certain types of accidents with TACC
          | (traffic-aware cruise control). TACC should never be
          | rear-ending people.
        
           | sokoloff wrote:
           | It's a driver aid, not a driver replacement. I'm responsible
           | for where my airplane autopilot takes the airplane (in large
           | part because I'll be the first one to arrive at the accident
           | scene).
           | 
           | Why shouldn't the driver be responsible for accidents in the
           | car they're driving at the current level of driver assist
           | technology?
        
           | mistrial9 wrote:
           | a bad actor can easily create an accident of this
           | description, no?
        
         | soheil wrote:
         | It's so odd that in stories like this people don't even ask
         | this simple question: did you try to brake?
         | 
         | It's usually conveniently omitted from the story.
        
       | kjhughes wrote:
       | > _There were two impacts somehow. It was a 4-car pileup and my
       | car was destroyed._
       | 
       | Likely explanation for two impacts during a rear-end pileup:
       | 
       | 1. Tesla rear-ends your car.
       | 
        | 2. Another car rear-ends the Tesla, propelling it forward so
        | it hits your car a second time.
        
         | raylad wrote:
         | Tesla was rear car. Nothing else hit it. It hit me twice.
        
           | cbo100 wrote:
           | Doesn't this indicate that she was likely "panic braking" on
           | the accelerator?
           | 
           | This would override AEB / TACC on any vehicle I own, radar or
           | not.
        
             | raylad wrote:
             | Yes, I thought that could be the case too, but no way to
             | know without seeing the logs.
        
             | hasdf wrote:
              | Yeah, this seems to be the most likely scenario. It
              | sounds like the original poster's car was in the middle
              | and the Tesla was at the end. OP's car suddenly brakes
              | (or hits someone); the Tesla either does not react fast
              | enough or starts reacting, and then the Tesla driver hits
              | the gas pedal (meaning to brake). Still not sure how they
              | would be able to hit it twice if they hadn't backed up,
              | though.
        
               | raylad wrote:
               | I slowed down gradually and was already stopped at the
               | red light with 2 cars in front of me when the Tesla
               | plowed into my car.
        
           | [deleted]
        
           | oblib wrote:
           | Sounds to me like the Tesla tried to keep going after it hit
            | you. Those cars really should not be allowed to use that
            | tech, and we're well past having enough proof to admit
            | that.
        
       | throwaway803453 wrote:
        | On a motorcycle, if I am at a stop with no one behind me, I am
        | periodically scanning my rear-view mirror and considering
        | escape routes. If someone appears to approach too fast I start
        | flashing my brake light until it is clear they are
        | decelerating at a comfortable rate.
        | 
        | I am going to start taking a similar precaution when driving.
        | I'll just flash my hazard lights once and hopefully that
        | triggers something in the driver or Autopilot.
        
         | HWR_14 wrote:
          | Since Teslas have been hitting stationary emergency vehicles,
          | you might put yourself in more danger doing that in front of
          | a Tesla on Autopilot.
        
       | dmitrygr wrote:
       | Tell your insurance company. Nobody is as motivated as them to
       | make the other driver/car manufacturer/etc
       | pay/hurt/apologize/etc. Then talk to reporters. Sadly in today's
       | world, if it isn't trending on Twitter, it didn't happen. Sorry
       | 
       | Hope you are physically ok and unhurt.
        
         | HWR_14 wrote:
         | > Nobody is as motivated as them to make the other driver/car
         | manufacturer/etc pay/hurt/apologize/etc.
         | 
         | I don't know why you think that. Insurance companies working
         | together is an iterated game with no clear end point. It makes
         | all the insurance companies more money if they just settle all
         | claims as quickly as possible, as long as the total money
          | changing hands among all claims is correct to within the
          | margin of how much they save from not investigating.
        
       | [deleted]
        
       | sidibe wrote:
        | Last week Tesla rolled out an update to all of its FSD beta
        | users that caused rogue FCWs (forward collision warnings),
        | leading to dangerous sudden braking. Somehow they didn't catch
        | this internally, but most of their customers seemed to notice
        | it the first day. It's clear that Tesla doesn't test their
        | software enough.
        
       | belval wrote:
       | I'll go against the general opinion in this thread and say that
       | this is not something that I'd expect to be blamed on Tesla.
       | Especially considering this:
       | 
       | > The driver said she was using Autopilot, that an OTA update had
       | had a problem earlier in the day, and that she had actually had
       | the same kind of collision on Autopilot previously!
       | 
       | Autopilot might be an absolute dumpster fire, but what you are
       | describing is similar to an adaptive cruise control failure and
       | the liability is still with the driver. She rear ended you while
       | you were at a red light, make sure that her insurance pays and
       | that's it. If she wishes to sue Tesla she can obviously do so.
        
         | zahma wrote:
         | Responsibility and who pays are certainly important, but isn't
         | it equally concerning that there isn't an investigation
         | initiated directly to determine if there's malfunctioning
         | technology?
        
           | belval wrote:
            | At scale, not really: how many cars get rear-ended daily
            | in the US? Insurance companies will forward the info to a
            | government body, which will launch an investigation if
            | there is a significant deviation from the norm.
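            | 
            | As a toy illustration of what a "significant deviation
            | from the norm" trigger could look like statistically (all
            | numbers invented, purely illustrative -- not how any
            | agency actually works):
            | 
            |     import math
            | 
            |     baseline = 2.0e-6   # assumed rear-ends per mile, fleet
            |     ap_miles = 5.0e8    # assumed miles driven on Autopilot
            |     observed = 1300     # assumed rear-ends on Autopilot
            | 
            |     # Normal approximation to a Poisson count.
            |     expected = baseline * ap_miles              # 1000
            |     z = (observed - expected) / math.sqrt(expected)
            |     print(round(z, 1))  # ~9.5 sigma: a trend worth a look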
        
         | evv wrote:
         | I have adaptive cruise control, and it behaves the exact same
         | today as it did on the day I test drove the car. It doesn't
         | change behavior unpredictably with OTA updates!
         | 
         | How was the driver supposed to know that the previous issue was
         | not some rare fluke?
         | 
         | Tesla is recklessly rolling out software, and amazingly has
         | pushed off the liability to the drivers willing to try it.
            | Sadly we are all part of the beta, like it or not, if there
            | are Teslas driving around on public streets.
         | 
         | I'm mostly shocked that insurance rates have not skyrocketed
         | for FSD beta testers.
        
           | belval wrote:
           | To clarify, I was responding to the OP with regard to the
           | "investigation" part. Authorities won't launch an
           | investigation on someone rear-ending someone else on the
           | basis that "autopilot did it". As a wronged individual he
           | should make sure that he is properly compensated as non-
           | responsible for the crash. The driver herself can (and maybe
           | should!) at least contact Tesla to inquire about the whole
           | thing.
           | 
           | If truly Teslas with FSD are causing more accidents than
           | another car then yes at some point some government body will
           | investigate but they won't care about a single incident with
           | only material damages.
        
           | jsight wrote:
           | > I have adaptive cruise control, and it behaves the exact
           | same today as it did on the day I test drove the car. It
           | doesn't change behavior unpredictably with OTA updates!
           | 
           | I do too on a non-Tesla. I've seen differing behaviors not
           | just due to updates but due to the time of day!
        
           | hasdf wrote:
            | I have a Tesla - I could be wrong, but I don't think that
            | Autopilot has seen any updates in quite some time. The
            | exception to this is people who have the FSD Beta - a very
            | small percentage of people who were able to get an
            | impossibly high driving score. Getting into an accident
            | would have made it impossible for this lady to get the FSD
            | Beta.
        
       | MBCook wrote:
        | What about some sort of civil suit for negligence? If she knew
        | it was prone to this from her previous incident and let it
        | happen again, could that be grounds?
       | 
       | I bet some sort of auto accident lawyer would salivate at the
       | idea of suing a trillion dollar company.
        
         | pdonis wrote:
         | The negligence would be on the part of the other driver, not
         | Tesla.
        
       | deegles wrote:
        | I ask this every time there's an article about Autopilot
        | behaving badly and still get no good answer... What is a
        | (hypothetical)
       | situation where Autopilot would definitely be responsible for an
       | accident? Not brushed off as "the driver is responsible and
       | should have had hands on the wheel at all times."
        
         | conductr wrote:
         | I think that's the problem, they've been allowed to legalese
         | away all liability. There would have to be an OTA update so bug
         | ridden that it caused multiple accidents in short order and
         | with major carnage. In that instance, someone might notice the
         | "trend" or public outcry would become loud enough.
        
         | paxys wrote:
         | Not until level 5 is rolled out - so potentially never.
        
       | darthvoldemort wrote:
       | NTSB and NHTSA can't investigate every single complaint. That's
       | unfortunately not scalable. They would need to talk to the
       | driver, talk to Tesla, get the data, etc. All for something
       | that's handled through civil lawsuits. You can sue the driver
       | because ultimately she is at fault, and if she wants to sue
       | Tesla, that's her prerogative.
       | 
       | This makes sense to me.
        
         | Drunk_Engineer wrote:
         | The problem is that NTSB has a bizarro prioritization scheme.
         | They have spent huge efforts investigating things like hot air
          | balloon crashes, and even Harrison Ford's crash-landing of a
          | vintage airplane on a golf course. The crazier the incident,
          | the more likely they are to investigate -- which is the
          | complete opposite of how they should be prioritizing. The
          | result is that common, everyday rear-end car crashes are
          | just written off.
        
           | tick_tock_tick wrote:
            | It's not hard to figure out: the companies themselves and
            | other agencies work well enough at solving the common,
            | reproducible errors, so the NTSB focuses on the edge cases
            | that would otherwise go unanswered.
        
             | Drunk_Engineer wrote:
             | The US has one of the worst road safety records of any
             | modern industrialized country. The reason is that other
             | agencies have actually not solved the "common reproducible"
             | errors at all.
        
       | spullara wrote:
       | She was probably lying.
        
       | sleibrock wrote:
        | Do you have any pictures of the accident site? Just for
        | curiosity's sake.
        
         | pdonis wrote:
         | If he does he shouldn't be posting them here. He should be
         | giving them to his insurance company so they can go after the
         | other driver's insurance company for compensation.
        
           | buffington wrote:
           | I'm confused. Digital images can be posted here AND sent to
           | an insurance company. What am I missing?
        
             | xeromal wrote:
             | You could potentially post evidence against yourself if you
             | posted online. It's a little shady but I don't think many
             | people are ok with shooting themselves in the foot.
        
             | pdonis wrote:
             | If the images contain information that is relevant to an
             | insurance claim, they probably should not be posted in a
             | public forum that everyone on the Internet can see.
        
       | dragonwriter wrote:
       | > Does anyone have any idea about how to get this actually
       | investigated to see if it was another Autopilot failure?
       | 
        | If you want to get autopilot _in general_ investigated and you
        | think that's not happening enough based on your experience,
        | because the thresholds used by agencies skip investigation when
        | it is warranted, you should direct your complaints to Congress,
        | especially your members and those on the relevant committees.
        | 
        | This will _probably_ not, even in the case where it is part of
        | triggering action, get _your_ accident investigated
        | differently, but if your concern is general safety that's
        | probably not important.
       | 
       | If you _really_ want your accident investigated more thoroughly,
       | you'll probably have to do it yourself by gathering enough
       | initial evidence to find an attorney willing to pursue it as a
       | defective product liability case against Tesla. This may be
       | complicated somewhat by dealing with the existing insurance
       | companies involved in handling claims out of the accident, but
        | that's probably a small part of the involved challenge. On the
       | other hand, your personal potential financial upside is higher in
       | this case, though the likelihood of recovery may not be good.
        
       | lucb1e wrote:
       | Does it need to be investigated for you to get insurance payout?
       | If not, I don't really understand your reason for asking. Non-
       | software-driven rear-endings don't usually lead to investigating
       | the mental stability of the driver, and so long as there isn't an
       | unusually high case of accidents with this software (as in,
       | higher-than-human rates)... This is n=1 for all we know, or am I
       | missing any information?
        
       | xedeon wrote:
        | Here's an idea. Maybe she said she was on Autopilot as a
        | cop-out? She could also have had her foot on the accelerator,
        | which disables automatic stopping (a warning is clearly shown
        | to the driver) when Autopilot or traffic-aware cruise control
        | (TACC) is engaged.
        | 
        | If this were truly a substantiated issue, hundreds of Tesla
        | owners would have created threads on the forums below. That
        | would indicate a trend, which would prompt the NHTSA to take
        | action:
       | 
       | https://teslamotorsclub.com/tmc/forums/-/list
       | 
       | https://www.reddit.com/r/teslamotors/
       | 
        | But alas, a cursory search yielded nothing.
       | 
        | There was another media frenzy a while back falsely claiming
        | sudden unintended acceleration (SUA) in Tesla vehicles. The
       | result? "NHTSA determines sudden acceleration complaints in Tesla
       | vehicles were due to driver error"
       | 
       | "There is no evidence of any fault in the accelerator pedal
       | assemblies, motor control systems, or brake systems that has
       | contributed to any of the cited incidents"
       | 
       | https://www.youtube.com/watch?v=TqTXhKVtQbU&t=203s
       | 
       | https://techcrunch.com/2021/01/08/nhtsa-tesla-sudden-uninten...
        
         | trollied wrote:
         | This needs more visibility
        
         | misiti3780 wrote:
         | You need to be careful pointing out anything like this on HN.
          | If you haven't noticed, it's really cool to hate anything
         | TSLA these days.
        
           | xedeon wrote:
           | I know anything TSLA gets some people too excited. But it
           | should never be at the cost of being objective.
        
       | h2odragon wrote:
        | Another 10 years and a few thousand injured and dead, and you
        | may be able to collect enough statistics to interest someone
        | in starting a lawsuit.
       | 
       | Look at what it took to get acknowledgement that Pintos had a
       | problem.
        
       | ProjectArcturis wrote:
       | Talk to a good lawyer, see if you have a case to sue Tesla.
       | They'll only fix this if it costs them money.
        
       | tobyjsullivan wrote:
       | Possible scenario A: Teslas have a fundamental flaw that causes a
       | single driver to have several accidents. Multiplied by the number
       | of Tesla drivers who use Autopilot, there must be hundreds of
       | thousands of unreported and uninvestigated auto-pilot accidents.
       | Most certainly a conspiratorial cover-up by Tesla, the NTSB, and
       | the NHTSA.
       | 
       | Possible scenario B: The driver of the Tesla has misrepresented
       | or misunderstood the facts of the situation.
       | 
       | Possible scenario C: The driver of the Tesla doesn't know that
       | she should take her foot off the accelerator pedal when using
       | Autopilot.
       | 
       | I suppose any of these (or several other) scenarios are at least
       | possible. I'd probably apply Hanlon's razor here before anything
       | else.
        
         | practice9 wrote:
          | I would assign a large probability that Autopilot wasn't
          | even engaged and it's the driver's fault.
          | 
          | Accidents with Teslas (many of those that make the press)
          | are often caused by drivers not being accustomed to the
          | car's acceleration.
          | 
          | There is a Chill mode that decreases max acceleration, but I
          | wouldn't expect a driver who manages to get into 2 accidents
          | with the same car to have read the manual.
        
           | filoleg wrote:
           | Yeah, the first thing that came to mind was that one valet
           | who claimed that it was not his fault, but the autopilot that
           | made him accelerate heavily and crash inside a parking
           | garage.[0]
           | 
           | People called out bs on it instantly, as it would make no
           | sense for autopilot to suddenly accelerate inside a garage
           | until the crash, but the driver defended himself to death.
           | 
           | Only for the formal investigation to happen and confirm that
            | the driver did it himself by fully pressing the
            | accelerator pedal instead of the brake pedal without any
           | autopilot engagement, and was just trying to shift blame. And
           | no, it wasn't just Tesla's legal team claiming that. The guy
           | went to a third-party service that could decode the blackbox
           | data from the car (again, that was his own technician, not
           | requested by Tesla or anything), and the evidence clearly
           | showed that the driver pressed the accelerator full-on. The
            | raw log relevant to the situation is included in the link
            | as well, in case someone wants to see it with their own
            | eyes.
           | 
           | Almost every single article I've seen that features bizarre
           | accidents like this ended up being bs, and I am seeing the
           | "accidentally pushed the accelerator instead of the braking
           | pedal" vibes here as well. Will be eagerly awaiting for the
           | results of the investigation before I make my own judgement
           | though. Note: I am not claiming that autopilot is infallible,
           | and there were indeed some legitimate incidents a while ago.
           | But this specific one in the OP? Yeah, I call bs.
           | 
           | 0. https://insideevs.com/news/496305/valet-crashes-tesla-
           | data-r...
        
         | throwaway0a5e wrote:
         | Or D) Tesla stops instantly when it hits OP once by itself and
         | again when the person behind it hits it.
         | 
            | (Spare me the low-effort comment about how everyone should
            | be prepared for the car in front of them to stop instantly
            | because it hit something; statistically nobody drives like
            | that.)
         | 
            | Edit: I misinterpreted the OP about it being a four-car
            | pileup.
        
           | raylad wrote:
           | No other car hit the Tesla from behind. It hit me with two
           | impacts all by itself.
        
           | Zanni wrote:
            | Dude. It's not a "low effort comment" to point that out,
            | it's literally _the law_. If you can't stop when the
            | person in front of you stops, you're too close. Increase
            | your following distance. Don't let others' bad habits
            | justify your own.
        
             | HWR_14 wrote:
             | > If you can't stop when the person in front of you stops,
             | you're too close
             | 
              | If the other driver decides to randomly brake in the
              | middle of the road (as some Teslas have been known to
              | do), it's not necessarily the fault of the person behind.
        
             | throwaway0a5e wrote:
             | Yes, yes it is a low effort comment. And it is exactly what
             | I was trying to preempt. It adds exactly zero to the
             | conversation to say "but the law" or "but drivers ed" or
             | "but some proverbial rule of thumb".
             | 
              | For better or worse, neither your fantasy of how people
              | ought to act nor the letter of the law reflects how the
              | overwhelming majority of the human population operates
              | motor vehicles or expects others to. Is it ideal?
              | Probably not. But it's a decent balance between being
              | somewhat prepared for the expected traffic oddities,
              | leaving margin for some subset of the unexpected, and
              | efficient use of road space.
             | 
             | I'm sure this will be an unpopular comment because there is
             | no shortage of people here who think humans adhere to laws
             | the way a network switch adheres to its configuration but
             | the reality is that there is not perfect alignment between
             | how reasonable traffic participants behave and the letter
             | of the law.
        
               | Zanni wrote:
               | You could have pre-empted it by not making such a claim
                | in the first place, and I abhor your attempt to
                | normalize this dangerous behavior. It is _not_ a
                | "decent balance." Increase your following distance.
                | Driving closer
               | to the car in front of you gains you nothing but
               | sacrifices valuable reaction time in the event of an
               | emergency.
        
               | throwaway0a5e wrote:
               | Take your high horse and turn it into glue. I'm not
               | attempting to normalize anything. Look outside, it's
               | already normalized. It is the current status quo. I'm not
                | endorsing it. I'm simply asking you not to pretend
                | otherwise so you can act outraged at someone who
                | failed to avoid an accident because they were driving
                | the way typical people drive.
        
               | trav4225 wrote:
               | I think that some of us would argue that these are not
               | "reasonable traffic participants". People who do not
               | maintain sufficient stopping distance are one of the most
               | frustrating parts of the American driving experience and
               | are (IMO) extremely disruptive to safe travel, especially
               | on highways.
        
               | throwaway0a5e wrote:
               | >I think that some of us would argue that these are not
               | "reasonable traffic participants"
               | 
               | You're basically arguing that (almost) everyone else is
               | unreasonable. That's going to be a very uphill argument.
               | 
               | >People who do not maintain sufficient stopping distance
               | 
               | Who defines "sufficient" because the consensus based on
               | the observed behavior of typical traffic seems to be that
               | "sufficient" is a few seconds where possible but always
               | less than whatever the comments section on the internet
               | thinks it should be
               | 
               | >American driving experience
               | 
                | The American driving experience is not particularly
                | remarkable (except maybe in its low cost) compared to
                | other developed nations, all of which are pretty tame
                | compared to developing nations.
               | 
               | I'm not asking you to like the way people drive. I'm just
               | asking you to not assess traffic incidents based on the
               | farcical assumption that most participants are following
               | or can be expected to be following whatever rules are on
               | paper.
        
         | raylad wrote:
         | One concern I would like investigated is that it appears
         | Tesla's Autopilot can't detect (or can no longer detect?)
         | stationary objects.
         | 
         | Until fairly recently, I believe that the RADAR, which used to
         | be in all their cars, would have detected the stationary object
         | (me) and applied braking force to at least reduce the impact.
         | 
         | Now, though, Tesla has stopped installing RADARs in their cars
         | and apparently also disabled existing RADAR in cars that have
         | it (because they all are using the same camera-only software).
         | 
         | If this has also removed the ability to detect stationary
         | objects and the corresponding emergency braking functionality,
          | this is a really serious problem and needs to be addressed,
          | perhaps with a recall to install RADAR in the new cars and
          | re-enable it in the cars that already have it.
        
           | krisoft wrote:
            | Radar is not good for detecting stationary objects. Of
            | course you get a nice reflection back from a stationary
            | car, but you get a similarly nice reflection from an
            | overhead traffic light or a manhole cover or a dropped
            | nail. Because of this, every automotive radar ever fielded
            | gates out stationary objects. If it didn't, you would get
            | a crazy amount of false positives.
            | 
            | They can do this because the radar measures the relative
            | speed of objects via Doppler shift, and you know the speed
            | of your own vehicle. Anything that matches your own speed
            | but in the opposite direction is most likely stationary.
            | (Or moving perpendicular to you. The velocity difference is
            | vectorial, but Doppler can only observe the component along
            | the observation vector, and civilian radars have terrible
            | angular resolution.)
            | 
            | In short: nobody ever used radar to stop cars from hitting
            | stationary objects. This is not a Tesla-specific thing.
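            | 
            | To make the gating concrete, here is a minimal sketch
            | (illustrative Python, not any vendor's actual code; the
            | 0.5 m/s tolerance is an assumed value):
            | 
            |     from collections import namedtuple
            | 
            |     # One radar return with its Doppler-derived radial
            |     # velocity (negative = closing on the sensor).
            |     Return = namedtuple("Return", "radial_velocity")
            | 
            |     def gate_stationary(returns, ego_speed, tol=0.5):
            |         # A stationary object closes at exactly our own
            |         # speed, so its radial velocity is ~ -ego_speed.
            |         # Keep only targets that deviate from that.
            |         return [r for r in returns
            |                 if abs(r.radial_velocity + ego_speed) > tol]
            | 
            |     # At 30 m/s, a parked car returns -30 m/s and is gated
            |     # out; a car ahead doing 20 m/s returns -10 and stays.
            |     print(gate_stationary([Return(-30.0), Return(-10.0)],
            |                           ego_speed=30.0))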
        
           | eightysixfour wrote:
           | > Until fairly recently, I believe that the RADAR, which used
           | to be in all their cars, would have detected the stationary
           | object (me) and applied braking force to at least reduce the
           | impact.
           | 
            | It is the other way around. They stopped using RADAR
            | because RADAR has too many false reflections at speed, and
            | it was causing learning issues in the ML dataset. The
            | vision-based position detection is better (one could argue
            | about weather and visibility here, but it is irrelevant to
            | this convo). Reliance on RADAR was causing the car to hit
            | stopped objects, because the radar data was too jittery
            | and had to be ignored.
           | 
           | Regardless of the RADAR issue, the driver is responsible for
           | the vehicle at all times. If someone hits you using cruise
           | control they remain at fault.
        
             | raylad wrote:
             | Nearly every major manufacturer has a separate emergency
             | stop system that continually looks for an obstacle in the
             | vehicle's path and applies braking force, overriding the
             | driver and any cruise control or self-driving that is in
             | use.
             | 
              | These often use RADAR, have worked for years, and are
              | routinely tested by organizations such as the IIHS.
             | 
             | See, for example:
             | https://www.youtube.com/watch?v=TJgUiZgX5rE
             | 
             | Teslas at least used to do this too:
             | 
             | https://www.iihs.org/news/detail/performance-of-
             | pedestrian-c...
             | 
             | https://www.youtube.com/watch?v=aJJfn2tO5fo
             | 
              | If Teslas no longer have this functionality, that is a
              | serious problem that needs to be corrected. It could mean
              | reprogramming existing cameras or adding a sensor if the
              | system really did rely on the now-removed RADAR.
        
             | yupper32 wrote:
             | > If someone hits you using cruise control they remain at
             | fault.
             | 
             | That'd be relevant if they didn't call it autopilot. If
             | they called it lane assist or cruise control plus or
             | something then I'd agree.
             | 
             | Tesla is at major fault for implying that the Teslas can
             | drive themselves, even if the driver is also at fault.
        
               | mwint wrote:
               | Tesla could call it Egre-MuzzFlorp and it wouldn't
               | matter. The surveys showing people think "autopilot"
               | means "can drive itself" were fatally flawed, in that
               | they were done with _non-Tesla owners_.
        
               | yupper32 wrote:
               | And yet, people drive their Teslas as if it means "can
               | drive itself".
               | 
               | Words matter. You'd need real evidence to convince me
               | that Tesla calling the system "Autopilot" hasn't
               | significantly contributed to crashes.
        
               | mwint wrote:
               | On the contrary, we need real evidence suggesting it
                | does. Certainly the population-level crash data doesn't
               | support it.
        
               | sokoloff wrote:
               | If an at-fault Tesla hits me, my legal beef is with the
               | driver of that Tesla. They may in turn rely on insurance
               | or on suing Tesla to recover damages from their own
               | misunderstanding of the car's abilities or improper
               | marketing, but my beef remains with the driver of the at-
               | fault vehicle.
        
           | awfml wrote:
            | I think it's probably more likely the driver made a user
            | error -- tens of thousands of people use Autopilot every
            | day in stop and go traffic.
        
             | raylad wrote:
             | Stop and go traffic is not the same.
             | 
             | In stop and go traffic, the car in front of you is visible
             | to the camera at all times.
             | 
             | In this case, my car may not have been acquired by the
              | camera until it was already stationary, in which case it
             | might have been ignored, causing the accident.
        
               | xeromal wrote:
               | We won't know until the lady provides you with evidence
               | that her AP was actually on, but my guess is that she
               | fucked up and she's just using AP to offset blame.
        
               | raylad wrote:
               | Exactly: An investigation would find out what happened.
        
               | practice9 wrote:
               | Please post an update when the investigation concludes
        
         | jsight wrote:
         | To be fair to scenario A, I've seen one other video of a driver
         | reporting AP hitting the car in front of them during a rapid
          | slowdown in traffic. It's hard to say how widespread it is
          | if the NHTSA isn't willing to investigate.
        
           | [deleted]
        
       | jdavis703 wrote:
       | If Tesla, NTSB and NHTSA refuse to investigate, you could hire a
       | lawyer who's willing to go after Tesla specifically. I've
       | similarly been stonewalled by Lyft. Long story short, the driver
       | was fiddling with the official Lyft app that sends notifications
        | during revenue service when they caused a 3-car pile-up.
       | 
        | Be careful: a lot of car crash lawyers just want an easy payday by
       | hounding the insurance companies. Make sure your lawyer is on the
       | same page as you.
        
       | esalman wrote:
       | I'm fine sharing the road with human drivers who might crash
       | into me: as long as I can prove I was not at fault, I can
       | expect fair compensation. But it seems that if a robot
       | crashes into me, I cannot expect fair compensation, because
       | whoever programmed the robot can't be held accountable!
        
       | jmpman wrote:
       | I have a Model 3. The autopilot was acting weird while driving on
       | the freeway, sometimes disengaging, but mostly not tracking the
       | lane smoothly. For an unrelated reason, I opted to view my camera
       | footage, and was shocked to find that my camera was completely
       | blurry. I'm driving down the freeway at 75mph, being guided by a
       | camera that couldn't pass a standard driver's vision test.
       | 
       | Tesla came out to replace the camera, and it appeared that a film
       | had been deposited on the windshield, obscuring the camera.
       | 
       | Living in Phoenix, my car is parked outside, facing south west.
       | During Covid, I'd go many days at a time without driving, and in
       | the summer, the interior temperature can easily rise over
       | 150F. Within the camera enclosure, there's a
       | reflection-absorbing material, which likely creates a perfect
       | miniature greenhouse.
       | 
       | I believe the adhesive in the camera housing outgassed at
       | these elevated temperatures and condensed on the windshield.
       | 
       | Concerned that others would experience this problem, I opened a
       | case with the NHTSA. Crickets.
       | 
       | There could be many people driving around on autopilot, with
       | obstructed vision, due to this same failure mode. It's something
       | Tesla could easily check, and something NHTSA should be
       | investigating.
       | 
       | For something as safety-critical as a forward-facing camera,
       | you'd expect both Tesla and NHTSA to be investigating. I have
       | no indication that anything has happened as a result of my
       | filing. Possibly because nobody else has reported the issue -
       | maybe because nobody else is actively viewing their camera
       | footage? There's no other way for a Tesla owner to be aware of
       | the issue. Frustrating.
        
         | raylad wrote:
         | This does sound like something that should be the subject of a
         | recall investigation.
        
           | MomoXenosaga wrote:
           | I wonder if the US government is afraid of killing its
           | golden goose, as happened with Boeing.
           | 
           | EVs are the future, and China is moving fast.
        
             | stfp wrote:
             | EV != Tesla, and Tesla != Autopilot. Also, maybe
             | hydrogen is the future?
        
               | outworlder wrote:
               | > Also maybe hydrogen is the future
               | 
               | It definitely isn't if we are going to move away from
               | fossil fuels.
        
               | gumby wrote:
               | You can get H2 by cracking water. You "just" need a
               | source of cheap energy.
               | 
               | Storage is tough -- those molecules are really tiny
               | and it's hard to keep them from escaping.
        
               | outworlder wrote:
               | > You can get H2 by cracking water. You "just" need a
               | source of cheap energy.
               | 
               | Well aware. But we don't have a source of cheap energy.
               | And guess what, whatever source of energy we have, it is
               | very wasteful to produce hydrogen.
               | 
               | First, you need electrolysis. Around 80% efficient.
               | However, in order to get any usable amount of energy per
               | volume out of it, you need to either compress hydrogen
               | (using more power) or liquefy it (way more power, boiloff
               | problems).
               | 
               | Now you need to transport it, physically. This uses more
               | power, usually with some big trucks driving around, just
               | as we have today with gasoline and diesel.
               | 
               | This will get stored at some gas station equivalent,
               | all the while losing mass, as those molecules are
               | indeed hard to contain.
               | 
               | Now you can drive your vehicle and refill. Some
               | inefficiencies here too but we can ignore them unless you
               | need to travel long distances to refuel.
               | 
               | This hydrogen will generally be used in a fuel cell,
               | whose output is... electrical energy (+ water).
               | 
               | You could skip all that and use electricity from the
               | power grid to charge batteries directly. No hydrogen
               | production plants needed, no container challenges, no
               | diffusing through containers and embrittlement to worry
               | about. No trucks full of hydrogen going around.
               | Electricity is found almost everywhere, even in places
               | where there's little gas infrastructure.
               | 
               | Mind you, hydrogen has another disadvantage, other than
               | thermodynamics: it ensures control is kept with the
               | existing energy companies. We would still need to rely on
               | their infrastructure to refill our cars (much like gas
               | stations). They would like to keep it this way.
               | Ultimately it doesn't really matter what fuel we are
               | using, as long as we keep driving to gas stations.
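               | 
               | Back-of-the-envelope, keeping only the 80%
               | electrolysis figure from above and treating the other
               | stage efficiencies as rough assumptions, the chained
               | losses look like this:
               | 
               |   from math import prod
               |   
               |   # Only the electrolysis number is from
               |   # above; the rest are assumptions.
               |   h2_path = {
               |       "electrolysis": 0.80,
               |       "compression": 0.90,   # assumed
               |       "distribution": 0.95,  # assumed
               |       "fuel cell": 0.60,     # assumed
               |   }
               |   battery_path = {
               |       "charging": 0.95,      # assumed
               |       "round trip": 0.90,    # assumed
               |   }
               |   
               |   print(f"{prod(h2_path.values()):.0%}")
               |   # ~41%
               |   print(f"{prod(battery_path.values()):.0%}")
               |   # ~86%
               | 
               | Under those assumptions, roughly half the generated
               | energy reaches the wheels via hydrogen versus most of
               | it via batteries, which is the thermodynamic point.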
        
               | nameisname wrote:
               | Can't renewables produce hydrogen?
        
               | outworlder wrote:
               | They can. But overwhelmingly, they do not. It's much
               | cheaper to extract hydrogen from fossil fuels and that's
               | not likely to change any time soon.
        
             | ipsum2 wrote:
             | All major car companies manufacture EVs. Off the top of
             | my head, I've seen BMW, Toyota, and Honda electric cars
             | on the road recently.
        
           | tshaddox wrote:
           | Genuine question: what temperatures are cars expected to
           | be stored at safely? I'm not talking about engine
           | temperature, just literally the ambient temperature of
           | the garage. I don't want to excuse or accuse anyone here,
           | but are cars supposed to be manufactured such that they
           | can sit in 150F garages for days at a time without any
           | maintenance?
        
             | xwdv wrote:
             | Your garage is not 150F, and certainly not for _days_ at a
             | time.
        
               | tshaddox wrote:
               | Indeed mine certainly is not, but apparently the parent
               | commenter's routinely reaches 150F.
        
         | tyingq wrote:
         | That's odd, since Tesla seems proud of their AI expertise.
         | Isn't detecting "image is blurred" a fairly pedestrian
         | (heh) exercise? Especially given the related signals
         | available: vehicle speed, whether you're parked, etc.
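         | 
         | For instance, the textbook heuristic is the variance of the
         | Laplacian; a minimal sketch, assuming OpenCV and a made-up
         | threshold that would need per-camera tuning:
         | 
         |   import cv2
         |   
         |   # Sharp frames have strong edges, so the
         |   # Laplacian response has high variance;
         |   # blurred frames don't.
         |   BLUR_THRESHOLD = 100.0  # illustrative
         |   
         |   def frame_is_blurry(path):
         |       gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
         |       score = cv2.Laplacian(gray, cv2.CV_64F).var()
         |       return score < BLUR_THRESHOLD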
        
           | slg wrote:
           | They do report if the camera can't get a good image. Glare
           | from the sun is the most common reason. It is certainly
           | possible that there is a window between when vision is
           | degraded enough to be dangerous and when it alerts the
           | driver, but a single occurrence isn't proof of a widespread
           | problem. That is the type of thing that should be
           | investigated in more depth, but it is hard to get anyone to
           | care when it is only a theory that isn't even yet connected
           | to a single accident.
        
         | coryrc wrote:
         | > was shocked to find that my camera was completely blurry
         | 
         | Maybe they don't need to figure out how it got blurry.
         | 
         | But their testing regime is sad if the system keeps
         | operating when it can't see!
        
         | soheil wrote:
       | There are several cameras for redundancy. The car also
       | doesn't need crystal-clear vision to drive. What seems like a
       | blurry image to the human eye can become much more usable
       | after simple filtering. It also doesn't need to see details;
       | it needs to detect whether there are objects in the way and
       | where the lane markings are.
        
         | screye wrote:
         | Tesla and carelessness: name a better duo.
         | 
         | From removing radar from their cars [1] to having their
         | $200k supercar be criminally underbraked [2].
         | 
         | Tesla has a low regard for human life and exemplifies the
         | worst of Silicon Valley's progress-at-any-cost approach.
         | Elon is undeniably intelligent, but his ethics are
         | dangerously sidelined by his need to ship products that
         | might not be ready yet.
         | 
         | [1] https://www.reuters.com/business/autos-
         | transportation/tesla-...
         | 
         | [2] https://www.youtube.com/watch?v=Hn9QWjxFPKM
        
           | vasco wrote:
           | I don't know much about this, but wasn't Tesla getting
           | safety award after safety award in testing, and didn't
           | that translate into lower insurance rates?
        
             | toast0 wrote:
             | Doing well in crash tests means the car does well in those
             | specific scenarios, but may not mean much about other
             | safety issues.
        
             | 420official wrote:
             | My understanding is that they are safe in the event of a
             | crash, not that they have the best crash avoidance.
        
             | adventured wrote:
             | The first several model years of the Model S were generally
             | considered to be among the safest vehicles in the history
             | of automobiles.
             | 
             | https://www.businessinsider.com/tesla-tesla-model-s-
             | achieves...
        
               | moffkalast wrote:
               | And they probably still are, along with the rest of the
               | models.
               | 
               | Assuming you're actually the one driving them, that
               | is, and not a broken-ass camera.
        
           | misiti3780 wrote:
           | Tesla doesn't make a $200K car.
        
         | bronzeage wrote:
         | It's also a problem that the camera's vision could degrade
         | to the point of being inadequate for self-driving, yet the
         | car reports no problem at all and keeps driving as if
         | everything were fine.
         | 
         | I'm pretty sure your situation isn't the only way the
         | vision can be obstructed: liquids or even bird poop could
         | also degrade it.
        
           | bscphil wrote:
           | This is what I would expect as well: some kind of visual
           | self-test, and if/when it fails, autopilot does not
           | engage and the driver gets a clearly visible warning so
           | they can have Tesla investigate or repair the
           | obstruction.
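           | 
           | Something like this sketch, with made-up names (nothing
           | here is Tesla's actual API), gating engagement on a blur
           | score like the one upthread:
           | 
           |   BLUR_THRESHOLD = 100.0  # illustrative
           |   
           |   def warn_driver(msg):
           |       # Stand-in for a cluster alert.
           |       print(f"WARNING: {msg}")
           |   
           |   def allow_autopilot(blur_score):
           |       if blur_score < BLUR_THRESHOLD:
           |           warn_driver("Camera degraded")
           |           return False  # refuse to engage
           |       return True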
        
             | servercobra wrote:
             | This does happen, but maybe not in all cases. I've had it
             | happen where lane change is no longer supported (and you
             | get a notification) because the sun is blinding one of the
             | pillar cameras.
        
         | jader201 wrote:
         | Meanwhile, TSLA is up 52% over the past 30 days, and 200% over
         | the last year.
        
           | bsagdiyev wrote:
           | Does that magically make it safe? I don't get the connection.
        
             | gatlin wrote:
             | My charitable read of the GP comment is that they're
             | pointing out how ludicrous the stock increase is. Side
             | note: how much attention does your username usually
             | garner you?
        
               | bsagdiyev wrote:
               | None, typically. I don't think anyone has made the
               | connection, publicly anyway, since I started using
               | it.
        
             | gumby wrote:
             | I think the point is that Tesla investors and customers
             | don't care about safety, either for the passengers or as an
             | externality.
        
             | moffkalast wrote:
             | I think he's referring to the naive idea that stock prices
             | in general are somehow influenced by what a company is
             | doing and are not pure speculation.
        
       | a-dub wrote:
       | i have questions.
       | 
       | 1) how high of a speed, exactly?
       | 
       | 2) it rear-ended you twice? as in, it smacked into you,
       | decided that wasn't enough, and either waited for you to pull
       | forward or for the cars to separate, then accelerated into
       | you again? if it actually did this, it's downright comical.
       | i'd wager the driver was involved in the second collision,
       | though. (that should still count as a failure of the system:
       | driver + autopilot operate as a single system, and if its
       | state leaves a bewildered driver manually accelerating the
       | car into a collision, that is still a degenerate state!)
        
       | juanani wrote:
       | This is what regulatory capture looks like: barricades in the
       | way of getting the proper authorities to regulate. But omg
       | buy T$LA.
        
       ___________________________________________________________________
       (page generated 2021-11-01 23:02 UTC)