[HN Gopher] US probes Tesla's Full Self-Driving software after f...
       ___________________________________________________________________
        
       US probes Tesla's Full Self-Driving software after fatal crash
        
       Author : jjulius
       Score  : 166 points
       Date   : 2024-10-18 16:01 UTC (1 day ago)
        
 (HTM) web link (www.reuters.com)
 (TXT) w3m dump (www.reuters.com)
        
       | jqpabc123 wrote:
        | By now, most people have probably heard that Tesla's attempt at
        | "Full Self Driving" is really anything but --- after a decade of
        | promises. The vehicle owner's manual spells this out.
       | 
        | As I understand it, the contentious issue is that, unlike most
        | others, their approach relies mostly on visual feedback.
       | 
       | In low visibility situations, their FSD has limited feedback and
       | is essentially driving blind.
       | 
       | It appears that Musk may be seeking a political solution to this
       | technical problem.
        
         | whamlastxmas wrote:
          | It's really weird how much you comment about FSD being fake. My
          | Tesla drives me 10+ miles daily and the only time I touch any
          | controls is pulling in and out of my garage. Literally daily. I
          | maybe disengage once every couple of days just to be on the
          | safe side in uncertain situations, but I'm sure it'd likely do
          | fine there too.
         | 
          | FSD works. It drives itself fine 99.99% of the time. It is
          | better than most human drivers. I don't know how you keep
          | claiming it doesn't work or doesn't exist.
        
           | jqpabc123 wrote:
           | So you agree with Musk, the main problem with FSD is
           | political?
           | 
           |  _Tesla says on its website its "Full Self-Driving" software
           | in on-road vehicles requires active driver supervision and
           | does not make vehicles autonomous._
           | 
           | https://www.reuters.com/business/autos-
           | transportation/nhtsa-...
        
           | sottol wrote:
            | The claim was about _full_ self-driving being anything but,
            | i.e., not _fully_ self-driving --- not about it being
            | completely fake. Disengaging every 10-110 miles is just not
            | "full", it's partial.
            | 
            | And then the GP went into detail about the specific
            | situations in which FSD is especially problematic.
        
           | peutetre wrote:
           | The problem is Tesla and Musk have been lying about full
           | self-driving for years. They have made specific claims of
           | full autonomy with specific timelines and it's been a lie
           | every time: https://motherfrunker.ca/fsd/
           | 
           | In 2016 a video purporting to show full self-driving with the
           | driver there purely "for legal reasons" was staged and faked:
           | https://www.reuters.com/technology/tesla-video-promoting-
           | sel...
           | 
           | In 2016 Tesla said that "as of today, all Tesla vehicles
           | produced in our factory - including Model 3 - will have the
           | hardware needed for full self-driving capability at a safety
           | level substantially greater than that of a human driver."
           | That was a lie: https://electrek.co/2024/08/24/tesla-deletes-
           | its-blog-post-s...
           | 
           | Musk claimed there would be 1 million Tesla robotaxis on the
           | road in 2020. That was a lie:
           | https://www.thedrive.com/news/38129/elon-musk-
           | promised-1-mil...
           | 
           | Tesla claimed Hardware 3 would be capable of full self-
           | driving. When asked about Hardware 3 at Tesla's recent
           | robotaxi event, Musk didn't want to "get nuanced". That's
           | starting to look like fraud:
           | https://electrek.co/2024/10/15/tesla-needs-to-come-clean-
           | abo...
           | 
            | Had Tesla simply called it "driver assistance", that wouldn't
            | be a lie. But they didn't do that. They doubled, tripled,
            | quadrupled down on the claim that it is "full self-driving",
            | making the car "an appreciating asset" that it would be
            | "financially insane" not to buy:
           | 
           | https://www.cnbc.com/2019/04/23/elon-musk-any-other-car-
           | than...
           | 
           | https://edition.cnn.com/2024/03/03/cars/musk-tesla-cars-
           | valu...
           | 
           | It's not even bullshit artistry. It's just bullshit.
           | 
           | Lying is part of the company culture at Tesla. Musk keeps
           | lying because the lies keep working.
        
             | whamlastxmas wrote:
              | Most of this is extreme hyperbole and it's really hard to
              | believe this is a genuine good-faith attempt at
              | conversation instead of weird astroturfing, because these
              | tired, inaccurate talking points come up in literally every
              | single thread even remotely associated with Elon. It's like
              | there's a dossier of talking points everyone is sharing.
              | 
              | The car drives itself. This is literally undeniable. You
              | can test it today for free. Yeah, it doesn't have the last
              | 0.01% done yet, and yeah, that's probably a lot of work.
              | But commenting like the GP is exhausting and just not
              | reflective of reality.
        
               | jqpabc123 wrote:
               | _... not reflective of reality_
               | 
               | Kinda like repeated claims of "Full Self Driving" for
               | over a decade.
        
               | peutetre wrote:
               | > _bc these tired inaccurate talking points are what come
               | up in literally every single even remotely associated to
               | Elon_
               | 
                | You understand that the false claims, the inaccuracies,
                | and the lies come _from_ Elon, right? They're associated
                | with him because he is the source of them.
               | 
               | They're only tired because he's been telling the same lie
               | year after year.
        
         | enslavedrobot wrote:
          | Here's a video of FSD driving the same route as a Waymo 42%
          | faster with zero interventions: 23 min vs. 33. This is my
          | everyday experience. Enjoy.
         | 
         | https://youtu.be/Kswp1DwUAAI?si=rX4L5FhMrPXpGx4V
        
           | ck2 wrote:
            | There are also endless videos of Teslas driving into
            | pedestrians, plowing full speed into emergency vehicles
            | parked with flashing lights, veering wildly from strange
            | markings on the road, etc. etc.
            | 
            | "Works for me" is a very strange response for someone on
            | Hacker News if you have any coding background - you should
            | realize you are unwittingly a beta tester, if not a full-
            | blown alpha tester in some cases.
           | 
            | All it will take is a non-standard event happening on your
            | daily drive. I'm most certainly not wishing it on you, quite
            | the opposite: I'm trying to get you to accept that a perfect
            | drive 99 times out of 100 is not enough.
        
             | enslavedrobot wrote:
              | Those are Autopilot videos; this discussion is about FSD.
              | FSD has driven ~2 billion miles at this point and had
              | potentially 2 fatal accidents.
             | 
             | The US average is 1.33 deaths/100 million miles. Tesla on
             | FSD is easily 10x safer.
             | 
             | Every day it gets safer.
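              | 
              | A quick check of that ratio, taking the quoted mileage and
              | fatality counts at face value (they are claims, not
              | verified figures):
              | 
              |     us_rate = 1.33 / 100e6     # US deaths per mile
              |     fsd_rate = 2 / 2e9         # claimed FSD deaths per mile
              |     print(us_rate / fsd_rate)  # ~13.3, hence "easily 10x"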
        
               | diggernet wrote:
               | How many miles does it have on the latest software?
               | Because any miles driven on previous software are no
               | longer relevant. Especially with that big change in v12.
        
               | enslavedrobot wrote:
                | The miles driven are rising exponentially as the versions
                | improve, according to company filings. If the miles
                | driven on previous versions are no longer relevant, how
                | can the NHTSA investigation of previous versions impact
                | FSD regulation today?
                | 
                | Given that the performance has improved dramatically over
                | the last 6 months, it is very reasonable to assume that
                | the miles-driven-to-fatality ratio is also improving.
               | 
               | Using the value of 1.33 deaths per 100 million miles
               | driven vs 2 deaths in 2 billion miles driven, FSD has
               | saved approximately 24 lives so far.
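                | 
                | Spelling that estimate out (same caveats as above: the
                | miles and fatality counts are the claimed values, not
                | verified figures):
                | 
                |     expected = (1.33 / 100e6) * 2e9  # ~26.6 deaths at the US rate
                |     print(expected - 2)              # ~24.6, the "24 lives saved"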
        
               | hilux wrote:
               | Considering HN is mostly technologists, the extent of
               | Tesla-hate in here surprises me. My best guess is that it
               | is sublimated Elon-hate. (Not a fan of my former neighbor
               | myself, but let's separate the man from his creations.)
               | 
               | People seem to be comparing Tesla FSD to perfection, when
               | the more fair and relevant comparison is to real-world
               | American drivers. Who are, on average, pretty bad.
               | 
               | Sure, I wouldn't trust data coming from Tesla. But we
               | have government data.
        
           | jqpabc123 wrote:
           | Can it drive the same route without a human behind the wheel?
           | 
            | Not legally, and not according to Tesla either --- because
            | Tesla's FSD is not actually "Full Self-Driving" --- unlike
            | Waymo.
        
       | knob wrote:
        | Didn't Uber have something similar happen? Ran over a woman in
        | Tempe, Arizona?
        
         | BugsJustFindMe wrote:
          | Yes. And Uber immediately shut down the program in the entire
          | state of Arizona, halted all road testing for months, and not
          | long after eliminated their self-driving unit entirely.
        
       | daghamm wrote:
        | While they're at it, please also investigate why it is sometimes
        | impossible to leave a damaged vehicle. This has resulted in
        | people dying more than once:
       | 
       | https://apnews.com/article/car-crash-tesla-france-fire-be8ec...
        
         | MadnessASAP wrote:
         | The why is pretty well understood, no investigation needed. I
         | don't like the design but it's because the doors are electronic
         | and people don't know where the manual release is.
         | 
          | In a panic, people go on muscle memory, which is: push the
          | useless button. They don't remember to pull the unmarked,
          | unobtrusive handle that they may not even know exists.
         | 
          | If it were up to me, sure, have your electronic release, but
          | make the manual release a big handle that looks like the
          | ejection handle on a jet (yellow with black stripes, can't miss
          | it).
         | 
          | * Or even better: have the standard door handle mechanically
          | connected to the latch through a spring-loaded solenoid that
          | disengages the mechanism. Under normal conditions the handle
          | does its thing electronically, but the moment power fails it
          | connects to the manual release.
        
           | daghamm wrote:
            | There are situations where the manual release has not worked:
           | 
           | https://www.businessinsider.com/how-to-manually-open-
           | tesla-d...
        
             | willy_k wrote:
             | The article you provided does not say that. The only
             | failure related to the manual release it mentions is that
             | using it breaks the window.
             | 
             | > Exton said he followed the instructions for the manual
             | release to open the door, but that this "somehow broke the
             | driver's window."
        
           | Clamchop wrote:
           | Or just use normal handles, inside and outside, like other
           | cars. What they've done is made things worse by any objective
           | metric in exchange for a "huh, nifty" that wears off after a
           | few weeks.
        
             | nomel wrote:
              | I think this is the way. Light pull does the electronic
              | thing; hard pull does the mechanical thing. They could have
              | done this with the mechanical handle that's already there
              | (which I have pulled almost every time I've used a Tesla,
              | earning anger and a weather-stripping inspection from the
              | owner).
        
           | carimura wrote:
            | It's worse than that: at least in ours, the backseat latches
            | are under some mat, literally hidden. I had no idea they were
            | there for the first 6 months.
        
           | Zigurd wrote:
            | The inside trunk release on most cars has a glow-in-the-dark
            | fluorescent-colored handle.
        
           | amluto wrote:
           | I've seen an innovative car with a single door release. As
           | you pull it, it first triggers the electronic mechanism
           | (which lowers the window a bit, which is useful in a door
           | with no frame above the window) and then, as you pull it
           | farther, it mechanically unlatches the door.
           | 
           | Tesla should build their doors like this. Oh, wait, the car
           | I'm talking about is an older Tesla. Maybe Tesla should
           | _remember_ how to build doors like this.
        
             | crooked-v wrote:
             | It's not very 'innovative' these days. My 2012 Mini Cooper
             | has it.
        
       | aanet wrote:
        | About damn time NHTSA opened this full-scale investigation.
        | Tesla's "autonowashing" has gone on for far too long.
       | 
       | Per Reuters [1] "The probe covers 2016-2024 Model S and X
       | vehicles with the optional system as well as 2017-2024 Model 3,
       | 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles. The
       | preliminary evaluation is the first step before the agency could
       | seek to demand a recall of the vehicles if it believes they pose
       | an unreasonable risk to safety."
       | 
        | Roughly 2.4 million Teslas are in question, all with "Full Self
        | Driving" software, after 4 reported collisions and one fatality.
       | 
       | NHTSA is reviewing the ability of FSD's engineering controls to
       | "detect and respond appropriately to reduced roadway visibility
       | conditions."
       | 
        | Tesla has, of course, rather two-facedly classified its FSD as
        | SAE Level 2 for regulatory purposes, while selling it as "full
        | self driving" but also requiring supervision. ¯\_(ツ)_/¯
       | 
       | No other company has been so irresponsible to its users, and
       | without a care for any negative externalities imposed on non-
       | consenting road users.
       | 
        | I treat every Tesla driver as a drunk driver, steering clear
        | whenever I see them on highways.
       | 
       | [FWIW, yes, I work in automated driving and know a thing or two
       | about automotive safety.]
       | 
       | [1]
       | https://archive.is/20241018151106/https://www.reuters.com/bu...
        
         | ivewonyoung wrote:
         | > Roughly 2.4 million Teslas in question, with "Full Self
         | Driving" software after 4 reported collisions and one fatality.
         | 
          | 45,000 people die yearly just in the US in auto accidents.
          | Those numbers and that timeline you quoted seem insignificant
          | at first glance, magnified by people with an axe to grind, like
          | that guy running anti-Tesla Super Bowl ads, who makes self-
          | driving software like you do.
        
       | dietsche wrote:
        | I would like more details. There are definitely situations where
        | neither a car nor a human could respond quickly enough to a
        | situation on the road.
        | 
        | For example, I recently hit a deer. The dashcam shows that I had
        | less than 100 feet between the deer becoming visible (due to
        | terrain) and impact, while driving at 60 mph. Keep in mind that
        | stopping a car in 100 feet from 60 mph is impossible; most
        | vehicles need roughly triple that distance once human reaction
        | time is accounted for.
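        | 
        | A rough sanity check on the physics (a sketch; the friction
        | coefficient and reaction time are assumed typical values, not
        | measured ones):
        | 
        |     v = 88.0                       # 60 mph in ft/s
        |     g = 32.2                       # gravity, ft/s^2
        |     mu = 0.7                       # assumed tire-road friction
        |     braking = v**2 / (2 * mu * g)  # ~172 ft on brakes alone
        |     reaction = 1.5 * v             # ~132 ft during a 1.5 s reaction
        |     print(braking, braking + reaction)  # ~172 ft, ~304 ft total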
        
         | ra7 wrote:
          | Unfortunately, Tesla asks NHTSA to redact almost all useful
          | information from their crash reports, so it's impossible to get
          | more details.
         | 
         | Here is the public database of all ADAS crashes:
         | https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...
        
         | Log_out_ wrote:
          | Just have a drone fly ahead and put the lidar point cloud on
          | the HUD. These are very bio-logic excuses :)
        
         | nomel wrote:
          | I've had a person, high on drugs, walk out from between bushes
          | that were along the road. I screeched to a halt in front of
          | them, but one second later and physics would have made it
          | impossible, regardless of reaction time (at any non-negligible
          | speed).
        
         | arcanemachiner wrote:
         | This is called "overdriving your vision", and it's so common
         | that it boggles my mind. (This opinion might have something to
         | do with the deer I hit when I first started driving...)
         | 
         | Drive according to the conditions, folks.
        
           | Kirby64 wrote:
            | On many roads, if a deer jumps across the road at the wrong
            | time, there's literally nothing you can do. You can't always
            | drive at 30 mph on back-country roads just because a deer
            | might hop out at you.
        
             | seadan83 wrote:
              | World of difference between 30, 40, 50, and 60. Feels like
              | something I have noticed between west and east coast
              | drivers. The latter really send it on country turns and
              | just trust the road. West coast, particularly Montana:
              | when vision is reduced, speed slows down. There are just
              | too many animals and road obstacles (e.g., rocks, planks
              | of wood) to blindly trust the road.
        
               | dragonwriter wrote:
                | > West coast, particularly Montana
                | 
                | Montana is not "West coast".
        
               | seadan83 wrote:
                | Yeah, I was a bit glib. My impression is more
                | specifically of the greater Northwest vs. the rest.
                | Perhaps just "the west" vs. "the east".
                | 
                | Indiana drivers, for example, really do send it (in my
                | experience). Which is not east coast, of course.
                | 
                | There is a good bit of nuance... I would perhaps say more
                | simply east of the Mississippi vs. west, but Texas varies
                | by region and SoCal drivers vary a lot as well,
                | particularly compared to NorCal and central/eastern
                | California. (I don't have an impression of Nevada and New
                | Mexico drivers - I don't have any experience on country
                | roads in those states.)
        
               | Kirby64 wrote:
                | Road obstacles are static and can be seen by not
                | "outdriving your headlights". Animals flinging themselves
                | into the road cannot, in many instances.
        
               | amenhotep wrote:
               | You are responding in a thread about a person saying they
               | were driving at 60 when the deer only became visible "due
               | to terrain" at 100 feet away, and therefore hitting it is
               | no reflection on their skill or choices as a driver.
               | 
               | I suppose we're meant to interpret charitably here, but
               | it really seems to me like there is a big difference
               | between the scenario described and the one you're talking
               | about, where the deer really does fling itself out in
               | front of you.
        
               | dietsche wrote:
                | OP here. You nailed it on the head. Also, the car started
                | braking before I could!
                | 
                | Incidentally, I've also had the Tesla dodge a deer
                | successfully!
                | 
                | Autopilot has improved in BIG ways over the past 2 years.
                | I went 700 miles in one day on Autopilot through the
                | mountains. No issues at all.
                | 
                | That said, expecting perfection from a machine or a human
                | is a fool's errand.
        
           | Zigurd wrote:
            | We will inevitably see "AVs are too cautious! Let me go
            | faster!" complaints as AVs drive in more places. But really,
            | humans just suck at risk assessment. And at driving. Driving
            | like a human is comforting in some contexts, but that should
            | not be a goal when it trades away too much safety.
        
           | thebruce87m wrote:
            | There is a difference between driving too fast around a
            | corner to be able to stop for something stationary on the
            | road, and driving through countryside where something might
            | jump out.
            | 
            | I live in a country with deer, but the number of incidents of
            | them interacting with road users is so low that it does not
            | factor into my risk tolerance.
        
             | Zigurd wrote:
             | The risks vary with speed. At 30mph a deer will be injured
             | and damage your car, and you might have to call animal
             | control to find the deer if it was able to get away. At
             | 45mph there is a good chance the deer will impact your
             | windshield. If it breaks through, that's how people die in
             | animal collisions. They get kicked to death by a frantic,
             | panicked, injured animal.
        
         | freejazz wrote:
          | The article explains the investigation is based upon visibility
          | issues... what is your point? I don't think any reasonable
          | person doubts there are circumstances where nothing could
          | respond adequately to prevent a crash. It seems rather odd to
          | assume that these crashes fall into one of those scenarios,
          | especially when the report on its face explains that they do
          | not.
        
       | ivewonyoung wrote:
       | > NHTSA said it was opening the inquiry after four reports of
       | crashes where FSD was engaged during reduced roadway visibility
       | like sun glare, fog, or airborne dust. A pedestrian was killed in
       | Rimrock, Arizona, in November 2023 after being struck by a 2021
       | Tesla Model Y, NHTSA said. Another crash under investigation
       | involved a reported injury
       | 
       | > The probe covers 2016-2024 Model S and X vehicles with the
       | optional system as well as 2017-2024 Model 3, 2020-2024 Model Y,
       | and 2023-2024 Cybertruck vehicles.
       | 
        | This is good, but also for context: 45 thousand people are killed
        | in auto accidents in just the US every year, making 4 reported
        | crashes and 1 reported fatality for 2.4 million vehicles over 8
        | years look minuscule by comparison, or even better than many
        | human drivers.
        
         | whiplash451 wrote:
          | Did you scale your numbers in proportion to miles driven
          | autonomously vs. manually?
        
           | josephg wrote:
           | Yeah, that'd be the interesting figure: How many deaths per
           | million miles driven? How does Tesla's full self driving
           | stack up against human drivers?
        
             | gostsamo wrote:
              | Even that is not good enough, because the "autopilot"
              | usually is not engaged in challenging conditions, making
              | any direct comparison unreliable. You need similar roads in
              | similar weather and at similar times of day to approximate
              | a good comparison.
        
               | ivewonyoung wrote:
                | How many of the 45,000 deaths on US roads (and an order
                | of magnitude more injuries) occur due to 'challenging
                | conditions'?
        
         | dekhn wrote:
          | Those numbers aren't all the fatalities associated with Tesla
          | cars; i.e., you can't compare the 45K/year (roughly 1 per 100M
          | miles driven) to the limited number of reports.
         | 
         | What they are looking for is whether there are systematic
         | issues with the design and implementation that make it unsafe.
        
           | moduspol wrote:
           | Unsafe relative to what?
           | 
           | Certainly not to normal human drivers in normal cars. Those
           | are killing people left and right.
        
             | dekhn wrote:
              | I don't think the intent is to compare it to normal human
              | drivers, although having some estimate of
              | accident/injury/death rates (for the driver, passengers,
              | and people outside the car) with FSD enabled vs. disabled
              | would be very interesting.
        
               | moduspol wrote:
               | > I don't think the intent is to compare it to normal
               | human drivers
               | 
               | I think our intent should be focused on where the
               | fatalities are happening. To keep things comparable, we
               | could maybe do 40,000 studies on distracted driving in
               | normal cars for every one or two caused by Autopilot /
               | FSD.
               | 
               | Alas, that's not where our priorities are.
        
             | llamaimperative wrote:
              | Those are good questions. We should investigate to find
              | out. (It'd be a different investigation from this one, but
              | it raises a good question: what _is_ FSD safe compared
              | to?)
        
             | AlexandrB wrote:
             | No they're not. And if you do look at human drivers you're
             | likely to see a Pareto distribution where 20% of drivers
             | cause most of the accidents. This is completely unlike
             | something like FSD where accidents would be more evenly
             | distributed. It's entirely possible that FSD would make 20%
             | of the drivers safer and ~80% less safe even if the overall
             | accident rate was lower.
        
             | Veserv wrote:
              | What? Humans are excellent drivers. Humans go ~70 years
              | between injury-causing accidents and ~5,000 years between
              | fatal accidents, even if we count the drunk drivers. If you
              | had started driving when the Pyramids were still new, you
              | would still have half a millennium to go before reaching
              | the expected interval between fatalities.
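              | 
              | Where those figures come from, roughly (a back-of-the-
              | envelope sketch; the 15,000 miles/year and the one-injury-
              | crash-per-million-miles rate are assumed round numbers,
              | not figures from this thread):
              | 
              |     miles_per_fatality = 100e6 / 1.33   # ~75M miles
              |     print(miles_per_fatality / 15_000)  # ~5,000 years
              |     miles_per_injury = 1e6              # assumed rate
              |     print(miles_per_injury / 15_000)    # ~70 years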
             | 
             | The only people pumping the line that human drivers are bad
             | are the people trying to sell a dream that they can make a
             | self-driving car in a weekend, or "next year", if you just
             | give them a pile of money and ignore all the red flags and
             | warning signs that they are clueless. The problem is
             | shockingly hard and underestimating it is the first step to
             | failure. Reckless development will not get you there safely
             | with known technology.
        
         | throwup238 wrote:
         | _> The agency is asking if other similar FSD crashes have
         | occurred in reduced roadway visibility conditions, and if Tesla
         | has updated or modified the FSD system in a way that may affect
         | it in such conditions._
         | 
         | Those four crashes are just the ones that sparked the
         | investigation.
        
         | tapoxi wrote:
          | I don't agree with this comparison. The drivers are licensed;
          | they have met a specific set of criteria to drive on public
          | roads. The software is not.
          | 
          | We don't know when FSD is engaged across all of these miles
          | driven, or whether FSD is making mistakes a licensed human
          | driver would not. I would at the very least expect radical
          | transparency.
        
           | fallingknife wrote:
            | I too care more about bureaucratic compliance than about the
            | actual chances of something killing me. When I am in that
            | ambulance I will be thinking, "at least that guy met the
            | specific set of criteria to be licensed to drive on public
            | roads."
        
             | tapoxi wrote:
              | Are we really relegating driver's licenses to "bureaucratic
              | compliance"?
              | 
              | If FSD is being used on a public road, it impacts everyone
              | on that road, not just the person who opted in to using
              | FSD. I absolutely want an independent agency to ensure it's
              | safe, armed with the data that proves it.
        
               | fallingknife wrote:
               | What else are they? You jump through hoops to get a piece
               | of plastic from the government that declares you "safe."
               | And then holders of those licenses go out and kill 40,000
               | people every year just in the US.
        
         | insane_dreamer wrote:
          | > making 4 reported crashes and 1 reported fatality for 2.4
          | million vehicles over 8 years look minuscule by comparison,
          | 
          | that's the wrong comparison
          | 
          | the correct comparison is the number of reported crashes and
          | fatalities for __unsupervised FSD__ miles driven (not counting
          | Tesla pilot tests, but actual customers)
        
           | jandrese wrote:
           | That seems like a bit of a chicken and egg problem where the
           | software is not allowed to go unsupervised until it racks up
           | a few million miles of successful unsupervised driving.
        
             | AlotOfReading wrote:
             | There's a number of state programs to solve this problem
             | with testing permits. The manufacturer puts up a bond and
             | does testing in a limited area, sending reports on any
             | incidents to the state regulator. The largest of these,
             | California's, has several dozen companies with testing
             | permits.
             | 
             | Tesla currently does not participate in any of these
             | programs.
        
             | insane_dreamer wrote:
             | Similar to a Phase 3 clinical trial (and for similar
             | reasons).
        
         | enragedcacti wrote:
          | > making 4 reported crashes and 1 reported fatality for 2.4
          | million vehicles over 8 years look minuscule by comparison, or
          | even better than many human drivers.
          | 
          | This is exactly what people were saying about the NHTSA
          | Autopilot investigation when it started back in 2021 with 11
          | reported incidents. When that investigation wrapped up earlier
          | this year, it had identified 956 Autopilot-related crashes
          | between early 2018 and August 2023, 467 of which were confirmed
          | to be the fault of Autopilot combined with an inattentive
          | driver.
        
           | fallingknife wrote:
            | So what? How many miles were driven, and what is the record
            | vs. human drivers? Also, Autopilot is a standard feature that
            | is much less sophisticated than FSD and has nothing to do
            | with it.
        
       | graeme wrote:
       | Will the review assess overall mortality of the vehicles compared
       | to similar cars, and overall mortality while FSD is in use?
        
         | dekhn wrote:
          | No, that is not part of the review. They may use some reference
          | aggregated industry data, but it's out of scope to answer the
          | question I think you're trying to imply.
        
         | infamouscow wrote:
         | Lawyers are not known for their prowess in mathematics, let
         | alone statistics.
         | 
         | Making these arguments from the standpoint of an engineer is
         | counterproductive.
        
           | fallingknife wrote:
           | Which is why they are the wrong people to run the country
        
             | paulryanrogers wrote:
              | Who? Because math is important, and so is law, among a
              | variety of other things.
              | 
              | /s Thankfully the US presidential choices are at least
              | rational, of sound mind, and well-rounded people. Certainly
              | no spoiled man-children among them. /s
        
         | johnthebaptist wrote:
         | Yes, if tesla complies and provides that data
        
         | bbor wrote:
          | I get where you're coming from and would also be interested to
          | see, but based on the clips I've seen, that wouldn't be enough
          | in this case. Of course the bias is inherent in what people
          | choose to post (not normal _and_ not terrible/litigable), but
          | I think there's enough at this point to perceive a stable
          | pattern.
          | 
          | Long story short, my argument is this: it doesn't matter if you
          | reduce serious crashes from 100 PPM to 50 PPM if 25 PPM of
          | those are _new_ crash sources, speaking from a psychological
          | and sociological perspective. Everyone knows that driving
          | drunk, driving distracted, driving in bad weather, and driving
          | in rural areas at dawn or dusk is dangerous, and takes
          | appropriate precautions. But what do you do if your car might
          | crash because someone ahead flashed their high beams, or
          | because the sun was reflecting off another car in an unusual
          | way? Could you really load up your kids and take your hands
          | off the wheel knowing that at any moment you might hit an
          | unexpected edge condition?
          | 
          | Self-driving cars are (presumably!) hard enough to trust
          | already, since you're giving away so much control. There's a
          | reason planes have to be way more than "better, statistically
          | speaking" -- we expect them to be nearly flawless, safety-
          | wise.
        
           | dragonwriter wrote:
           | > But what do you do if your car might crash because someone
           | ahead flashed their high beams, or because the sun was
           | reflecting off another car in an unusual way?
           | 
            | These are -- like drunk driving, driving distracted, and
            | driving in bad weather -- things that actually do cause
            | accidents with human drivers.
        
             | hunter-gatherer wrote:
              | The point is the precaution-taking part that you left out
              | of the quote. The other day I was taking my kid to school,
              | and when we turned east the sun was in my eyes and I
              | couldn't see anything, so I pulled over as fast as I could
              | and changed my route. Had I chosen to press forward and
              | been in an accident, it would have been explainable (albeit
              | still unfortunate and often unnecessary!). However, if I'm
              | under the impression that my robot car can handle such
              | circumstances because it does most of the time, and then it
              | glitches, that is harder to explain.
        
             | dfxm12 wrote:
             | This language is a bit of a sticking point for me. If
             | you're drunk driving or driving distracted, there's no
             | "accident". You're _intentionally_ doing something wrong
             | and _committing a crime_.
        
             | paulryanrogers wrote:
              | Indeed, yet humans can anticipate such things and rely on
              | their experience to reason about what's happening and how
              | to react. Like slowing down, shifting lanes, or just moving
              | one's head for a different perspective. A Tesla with only
              | two cameras ("because that's all humans need") is unlikely
              | to provably match that performance for a long time.
              | 
              | Tesla could also change its software without telling the
              | driver at any point.
        
         | FireBeyond wrote:
         | If you're trying to hint at Tesla's own stats, then at this
         | point those are hopelessly, and knowingly, misleading.
         | 
          | All they compare is "the subset of driving on only the roads
          | where FSD is available, active, and has not or did not turn
          | itself off because of weather, road, traffic, or any other
          | conditions" versus "all drivers, all vehicles, all roads, all
          | weather, all traffic, all conditions".
         | 
         | There's a reason Tesla doesn't release the raw data.
        
           | rblatz wrote:
            | I have to disengage FSD multiple times a day, and I'm only
            | driving 16 miles round trip. And I routinely have to stop it
            | from doing dumb things like stopping at green traffic lights,
            | attempting a U-turn from the wrong turn lane, or switching to
            | the wrong lane right before a turn.
        
             | rad_gruchalski wrote:
             | Why would you even turn it on at this point...
        
         | akira2501 wrote:
        | Fatalities per passenger-mile driven is the only statistic that
        | would matter. I actually doubt this figure differs much, either
        | way, from the overall fleet of vehicles.
        | 
        | This is because inattentive driving is _rarely_ the cause of
        | fatalities on the road. The winner there is, and probably always
        | will be, alcohol.
        
           | dylan604 wrote:
           | > The winner there is, and probably always will be, Alcohol.
           | 
           | I'd imagine mobile device use will overtake alcohol soon
           | enough
        
             | akira2501 wrote:
             | Mobile devices have been here for 40 years. The volume of
             | alcohol sold every year suggests this overtake point will
             | never occur.
        
           | porphyra wrote:
            | Distracted driving cost 3,308 lives in 2022 [1].
            | 
            | Alcohol was at 13,384 in 2021 [2].
            | 
            | Although you're right that alcohol does claim more lives,
            | distracted driving is still highly dangerous and isn't all
            | that rare.
           | 
           | [1] https://www.nhtsa.gov/risky-driving/distracted-driving
           | 
           | [2] https://www.nhtsa.gov/book/countermeasures-that-
           | work/alcohol...
        
             | akira2501 wrote:
              | They do a disservice by not further breaking down
              | distracted driving by age. Once you see it that way, it's
              | hard to accept that distracted driving on its own is the
              | appropriate target.
              | 
              | Anyway, NHTSA publishes FARS. This is the definitive source
              | if you want to understand the demographics of road
              | fatalities in the USA.
             | 
             | https://www.nhtsa.gov/research-data/fatality-analysis-
             | report...
        
       | xvector wrote:
       | My Tesla routinely tries to kill me on absolutely normal
       | California roads in normal sunny conditions, especially when
       | there are cars parked on the side of the road (it often brakes
       | thinking I'm about to crash into them, or even swerves into them
       | thinking that's the "real" lane).
       | 
       | Elon's Unsupervised FSD dreams are a good bit off. I do hope they
       | happen though.
        
         | delichon wrote:
         | Why do you drive a car that routinely tries to kill you? That
         | would put me right off. Can't you just turn off the autopilot?
        
           | ddingus wrote:
           | My guess is the driver tests it regularly.
           | 
           | How does it do X, Y, ooh Z works, etc...
        
           | xvector wrote:
           | It's a pretty nice car when it's not trying to kill me
        
         | jrflowers wrote:
         | > My Tesla routinely tries to kill me
         | 
         | > Elon's Unsupervised FSD dreams are a good bit off. I do hope
         | they happen though.
         | 
         | It is very generous that you would selflessly sacrifice your
         | own life so that others might one day enjoy Elon's dream of
         | robot taxis without steering wheels
        
           | massysett wrote:
           | Even more generous to selflessly sacrifice the lives and
           | property of others that the vehicle "self-drives" itself
           | into.
        
           | judge2020 wrote:
           | If the data sharing checkboxes are clicked, OP can still help
           | send in training data while driving on his own.
        
         | Renaud wrote:
          | And what if the car swerves and you aren't able to correct in
          | time, and you end up killing someone?
          | 
          | Is that your fault or the car's?
          | 
          | I would bet that since it's your car, and you're knowingly
          | using unproven technology, it would be your fault.
        
           | ra7 wrote:
           | The driver's fault. Tesla never accepts liability.
        
             | LunicLynx wrote:
             | And they have been very clear about that
        
         | bogantech wrote:
         | > My Tesla routinely tries to kill me
         | 
         | Why on earth would you continue to use it? If it does succeed
         | someday that's on you
        
           | newdee wrote:
           | > that's on you
           | 
           | They'd be dead, doubt it's a concern at that point.
        
         | left-struck wrote:
          | That's hilariously ironic, because I have a pretty standard
          | newish Japanese petrol car (I'm not mentioning the brand
          | because my point isn't that brand X is better than brand Y),
          | and it has no AI self-driving functions, just pretty basic
          | radar adaptive cruise control and emergency brake assist that
          | stops if a car brakes hard in front of you... and it does a
          | remarkable job of rejecting cars that are slowing down or
          | stopped in other lanes, even when you're going around a corner
          | and the car is pointed straight at the other cars but not
          | actually heading toward them since it's turning. I assume it
          | uses the steering input to help reject other vehicles and
          | Doppler effects to detect differences in speed, but it's
          | remarkable how accurately it matches the speed of the car in
          | front of you, and only the car in front of you, even when that
          | car is over 15 seconds ahead. If Teslas can't beat that, it's
          | sad.
        
         | gitaarik wrote:
          | I wonder, how are you "driving"? Are you sitting behind the
          | wheel doing nothing except watching closely everything the car
          | does, so you can take over when needed? Isn't that a stressful
          | experience? Wouldn't it be more comfortable to just do
          | everything yourself, so you know nothing weird can happen?
          | 
          | Also, if the car does something crazy, how much time do you
          | have to react? I can imagine that in some situations you might
          | have too little time to prevent the accident the car is
          | creating.
        
       | botanical wrote:
       | Only the US government can allow corporations to beta test
       | unproven technology on the public.
       | 
        | Governments should carry out comprehensive tests of a self-
        | driving car's claimed capabilities. This is the same way that
        | cars without proven passenger safety (Euro NCAP) aren't allowed
        | to be on roads carrying passengers.
        
         | krasin wrote:
         | > Only the US government can allow corporations to beta test
         | unproven technology on the public.
         | 
         | China and Russia do it too. It's not an excuse, but definitely
         | not just the US.
        
         | CTDOCodebases wrote:
          | Meh. This happens all around the world. Even if the product
          | works, there is no guarantee that it will be safe.
          | 
          | Asbestos products are a good example of this. A more recent one
          | is Teflon made with PFOAs, or engineered stone like
          | Caesarstone.
        
         | dzhiurgis wrote:
          | If it takes 3 months to get approval for where a steel rocket
          | can fall, you might as well give up on iterating something as
          | complex as FSD.
        
           | bckr wrote:
           | Drive it in larger and larger closed courses. Expand to
           | neighboring areas with consent of the communities involved.
           | Agree on limited conditions until enough data has been
           | gathered to expand those conditions.
        
             | romon wrote:
             | While controlled conditions promote safety, they do not
             | yield effective training data.
        
               | AlotOfReading wrote:
               | That's how all autonomous testing programs currently work
               | around the world. That is, every driverless vehicle
               | system on roads today was developed this way. You're
               | going to have to be more specific when you say that it
               | doesn't work.
        
           | AlotOfReading wrote:
            | There _are_ industry standards for this stuff: ISO 21448,
            | UL 4600, and UNECE R157, for example, and even commercial
            | certification programs like the one run by TUV Sud for
            | European homologation. It's a deliberate series of decisions
            | on Tesla's part to make their regulatory life as difficult as
            | possible.
        
         | akira2501 wrote:
          | > Only the US government
          | 
          | Any legislative body can do so. There's no reason to limit this
          | strictly to the federal government. States and municipalities
          | should have a say in this as well. The _citizens_ are the only
          | entity that _decides_ if beta technology can be used or not.
          | 
          | > comprehensive tests on a self-driving car's claimed
          | capabilities.
          | 
          | This presupposes the government is naturally capable of
          | performing an adequate job at this task, or that the automakers
          | won't sue the government to interfere with the testing regime
          | and the efficacy of its standards.
          | 
          | > aren't allowed to be on roads carrying passengers.
          | 
          | According to Wikipedia, Euro NCAP is a _voluntary_ organization
          | and describes the situation thusly: "legislation sets a minimum
          | compulsory standard whilst Euro NCAP is concerned with best
          | possible current practice." Which effectively highlights the
          | above problems perfectly.
        
       | dzhiurgis wrote:
        | What is FSD's uptake rate? I bet it's less than 1%, since in most
        | countries it's not even available...
        
       | massysett wrote:
       | "Tesla says on its website its FSD software in on-road vehicles
       | requires active driver supervision and does not make vehicles
       | autonomous."
       | 
       | Despite it being called "Full Self-Driving."
       | 
       | Tesla should be sued out of existence.
        
         | bagels wrote:
          | It didn't always say that. It used to be more misleading,
          | claiming that the cars have "Full Self Driving Hardware",
          | leaving it as an exercise for the reader to deduce that they
          | didn't come with "Full Self Driving Software" too.
        
           | peutetre wrote:
           | And Musk doesn't want to "get nuanced" about the hardware:
           | 
           | https://electrek.co/2024/10/15/tesla-needs-to-come-clean-
           | abo...
        
         | fhdsgbbcaA wrote:
         | "Sixty percent of the time, it works every time"
        
         | systemvoltage wrote:
         | That seems extreme. They're the biggest force to combat Climate
         | Change. Tesla existing is good for the world.
        
           | mbernstein wrote:
           | Nuclear power adoption is the largest force to combat climate
           | change.
        
             | ivewonyoung wrote:
             | Are you proposing that cars should have nuclear reactors in
             | them?
             | 
             | Teslas run great on nuclear power, unlike fossil fuel ICE
             | cars.
        
               | mbernstein wrote:
               | Of course not.
        
               | dylan604 wrote:
               | Why not? We just need to use Mr Fusion in everything
               | 
               | https://backtothefuture.fandom.com/wiki/Mr._Fusion
        
               | ivewonyoung wrote:
                | A world where nuclear power helped with climate change
                | would also be a world where Teslas eliminate a good chunk
                | of harmful pollution by allowing cars to be moved by
                | nuclear, so I'm not sure what point you were trying to
                | make.
                | 
                | Even at this minute, Teslas are moving around powered by
                | nuclear power.
        
             | Retric wrote:
              | Historically, hydro has prevented far more CO2 than nuclear
              | by a wide margin.
              | https://ourworldindata.org/grapher/electricity-prod-
              | source-s...
              | 
              | Looking forward, nuclear isn't moving the needle. Solar
              | grew more in 2023 alone than nuclear has grown since 1995.
              | Worse, nuclear can't ramp up significantly in the next
              | decade simply due to construction bottlenecks. 40 years ago
              | nuclear could have played a larger role, but we wasted that
              | opportunity.
              | 
              | It's been helpful, but suggesting it's going to play a
              | larger role anytime soon is seriously wishful thinking at
              | this point.
        
               | UltraSane wrote:
                | That just goes to show how incredibly short-sighted
                | humanity is. We knew about the risk of massive CO2
                | emissions from burning fossil fuels but just ignored it
                | while irrationally demonizing nuclear energy because it
                | is scawy. If humans were sane and able to plan, Earth
                | would be getting 100% of its electricity from super-
                | efficient 7th-generation nuclear reactors.
        
               | mbernstein wrote:
               | When talking to my parents, I hear a lot about Jane Fonda
               | and the China Syndrome as far as the fears of nuclear
               | power.
               | 
               | She's made the same baseless argument for a long time:
               | "Nuclear power is slow, expensive -- and wildly
               | dangerous"
               | 
               | https://ourworldindata.org/nuclear-
               | energy#:~:text=The%20key%....
               | 
               | CO2 issues aside, it's just outright safer than all forms
               | of coal and gas and about as safe as solar and wind, all
               | three of which are a bit safer than hydro (still very
               | safe).
        
               | Retric wrote:
                | I agree costs could have dropped significantly, but I
                | doubt 100% nuclear was ever going to happen.
                | 
                | Large-scale dams will exist to store water, and tacking
                | hydroelectric generation on top of them is incredibly
                | cost effective. Safety-wise, dams are seriously
                | dangerous, but they also save a shocking number of lives
                | by reducing flooding.
        
               | valval wrote:
               | There was adequate evidence that nuclear is capable of
               | killing millions of people and causing large scale
               | environmental issues.
               | 
               | It's still not clear today what effect CO2 or fossil fuel
               | usage has on us.
        
               | UltraSane wrote:
                | Nuclear reactors are not nuclear bombs. Nuclear reactors
                | are very safe on a joules-per-death basis.
        
               | mbernstein wrote:
                | History is a great reference, but it doesn't solve our
                | problems now. Just because hydro has prevented more CO2
                | until now doesn't mean that hydro plus solar is the
                | combination that delivers abundant, clean energy. There
                | are power storage challenges, and storage mechanisms
                | aren't carbon neutral. Even if we assume that nuclear,
                | wind, and solar (without storage) all have the same
                | carbon footprint - I believe nuclear is less than solar
                | and pretty much equivalent to wind - you have to add the
                | storage mechanisms for scenarios where there's no wind,
                | sun, or water.
                | 
                | All of the above are significantly better than burning
                | gas or coal, but nuclear is the clear winner from a CO2
                | and general-availability perspective.
        
               | Retric wrote:
                | Seriously scaling nuclear would involve batteries.
                | Nuclear has issues being cost effective at 80+% capacity
                | factors. When you start talking sub-40% capacity factors,
                | the cost per kWh spirals.
                | 
                | The full cost of operating a nuclear reactor for just 5
                | hours per day is more than that of a power plant running
                | at an 80% capacity factor and charging batteries.
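                | 
                | A stylized illustration of why low utilization spirals
                | (all numbers here are made-up round figures, purely to
                | show the shape of the effect):
                | 
                |     fixed = 100.0                  # hypothetical fixed cost, $/kW-year
                |     for cf in (0.8, 0.4, 5 / 24):  # capacity factors; 5/24 ~ 5 h/day
                |         print(cf, 100 * fixed / (8760 * cf))  # cents/kWh share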
        
               | mbernstein wrote:
               | > Seriously scaling nuclear would involve batteries.
               | Nuclear has issues being cost effective at 80+% capacity
               | factors.
               | 
               | I assume you mean that sub 80% capacity nuclear has
               | issues being cost effective (which I agree is true).
               | 
               | You could pair the baseload nuclear with renewables
               | during peak times and reduce battery dependency for
               | scaling and maintaining higher utilization.
        
               | Retric wrote:
                | I meant that even if you're operating nuclear as baseload
                | power, looking forward, the market rate for electricity
                | looks rough without significant subsidies.
                | 
                | Daytime, you're facing solar head-to-head, which is
                | already dropping wholesale rates. Off-peak is mostly
                | users seeking cheap electricity, so demand at 2 AM is
                | going to fall if power ends up cheaper at noon. Which
                | means nuclear needs to make most of its money from the
                | duck-curve price peaks. But batteries are driving down
                | peak prices.
                | 
                | Actually cheap nuclear would make this far easier, but
                | there's no obvious silver bullet.
        
               | dylan604 wrote:
               | > Historically, hydro has
               | 
               | done harm to the ecosystems where they are installed.
               | This is quite often overlooked and brushed aside.
               | 
               | There is no single method of generating electricity
               | without downsides.
        
               | Retric wrote:
                | We made dams long before we knew about electricity.
                | Tacking hydropower onto a dam that would exist either
                | way has basically zero additional environmental impact.
               | 
               | Pure hydropower dams definitely do have significant
               | environmental impact.
        
               | dylan604 wrote:
               | I just don't get the premise of your argument. Are you
               | honestly saying that stopping the normal flow of water
               | has no negative impact on the ecosystem? What about the
               | area behind the dam that is now flooded? What about the
               | area in front of the dam where there is now no way to
               | traverse back up stream?
               | 
                | Maybe you're just okay with and willing to accept that
                | kind of change. That's fine, just as some people are
                | okay with the risk of nuclear, or the use of land for
                | solar/wind. But to flat out deny that it has impact is
                | dishonest discourse at best.
        
               | Retric wrote:
               | It's the same premise as rooftop solar. You're building a
               | home anyway so adding solar panels to the roof isn't
               | destroying pristine habitat.
               | 
               | People build dams for many reasons not just electricity.
               | 
               | Having a reserve of rainwater is a big deal in
               | California, Texas, etc. Letting millions of cubic meters
               | more water flow into the ocean would make the water
               | problems much worse in much of the world. Flood control
               | is similarly a serious concern. Blaming 100% of the
                | issues from dams on hydropower is silly if outlawing
               | hydropower isn't going to remove those dams.
        
             | porphyra wrote:
             | I think solar is a lot cheaper than nuclear, even if you
             | factor in battery storage.
        
           | gamblor956 wrote:
           | Every year Musk personally flies enough in his private jet to
           | undo the emissions savings of over 100,000 EVs...
           | 
           | Remember that every time you get in your Tesla that you're
           | just a carbon offset for a spoiled billionaire.
        
             | enslavedrobot wrote:
              | Hmmmm, the average car uses 489 gallons a year. A large
              | private jet uses 500 gallons an hour. There are 8,760
              | hours in a year.
              | 
              | So even if Elon lived in a jet that flies 24/7, you'd
              | only be very wrong. Since that's obviously not the case,
              | you're colossally and completely wrong.
             | 
             | Remember that the next time you try to make an argument
             | that Tesla is not an incredible force for decarbonization.
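              | 
              | A quick sanity check of that arithmetic, taking the
              | figures above at face value (they are the comment's
              | assumptions, not verified data):
              | 
              |     CAR_GALLONS_PER_YEAR = 489   # average car (assumed)
              |     JET_GALLONS_PER_HOUR = 500   # private jet (assumed)
              |     HOURS_PER_YEAR = 8760
              | 
              |     # Upper bound: the jet flying literally nonstop
              |     jet_gallons = JET_GALLONS_PER_HOUR * HOURS_PER_YEAR
              |     print(jet_gallons / CAR_GALLONS_PER_YEAR)
              |     # ~8,957 cars' worth of fuel -- far below 100,000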
        
               | briansm wrote:
               | I think you missed the 'EV' part of the post.
        
             | valval wrote:
             | As opposed to all the other execs whose companies aren't a
             | force to combat climate change and still fly their private
             | jets.
             | 
              | But don't get me wrong, anyone and everyone can fly
              | their private jets if they can afford such things. At
              | that point they'll already have paid enough in taxes to
              | offset thousands or millions of Prius drivers.
        
               | gamblor956 wrote:
               | _As opposed to all the other execs_
               | 
               | Yes, actually.
               | 
                | Other execs fly _as needed_ because they recognize
                | that in this wondrous age of the internet,
                | teleconferencing can replace most in-person meetings.
                | Somehow, only a supposed technology genius like Elon
                | Musk thinks that in-person meetings are required for
                | everything.
               | 
               | Other execs also don't claim to be trying to save the
               | planet while doing everything in their power to exploit
               | its resources or destroy natural habitats.
        
           | gitaarik wrote:
            | As I understand it, electric cars are more polluting than
            | non-electric ones: the manufacturing and resource
            | footprint is larger, and because they are heavier (due to
            | the batteries), their tires wear down much faster and need
            | more frequent replacement, significantly enough that
            | their emission-free driving doesn't compensate for it.
            | 
            | Besides, electric vehicles still seem very impractical
            | compared to normal cars, because they can't drive very far
            | without needing a lengthy recharge.
           | 
           | So I think the eco-friendliness of electric vehicles is maybe
           | like the full self-driving system: nice promises but no
           | delivery.
        
             | theyinwhy wrote:
             | That has been falsified by more studies than I can keep
             | track of. And yes, if you charge your electric with
             | electricity produced by oil, the climate effect will be
             | non-optimal.
        
             | djaychela wrote:
             | Pretty much everything you've said here isn't true. You are
             | just repeating tropes that are fossil fuel industry FUD.
        
         | hedora wrote:
         | Our non-Tesla has steering assist. In my 500 miles of driving
         | before I found the buried setting that let me completely
         | disable it, the active safety systems never made it more than
         | 10-20 miles without attempting to actively steer the car left-
         | of-center or into another vehicle, even when it was "turned
         | off" via the steering wheel controls.
         | 
          | When it was turned on according to the dashboard UI, things
          | were even worse. It'd disengage at least once every ten
          | miles.
         | However, there wasn't an alarm when it disengaged, just a tiny
         | gray blinking icon on the dash. A second or so after the
         | blinking, it'd beep once and then pull crap like attempt a
         | sharp left on an exit ramp that curved to the right.
         | 
         | I can't imagine this model kills fewer people per mile than
         | Tesla FSD.
         | 
         | I think there should be a recall, but it should hit pretty much
         | all manufacturers shipping stuff in this space.
        
           | shepherdjerred wrote:
           | My Hyundai has a similar feature and it's excellent. I don't
           | think you should be painting with such a broad brush.
        
           | noapologies wrote:
           | I'm not sure how any of this is related to the article. Does
           | this non-Tesla manufacturer claim that their steering assist
           | is "full self driving"?
           | 
           | If you believe their steering assist kills more people than
           | Tesla FSD then you're welcome, encouraged even, to file a
           | report with the NHTSA here [1].
           | 
           | [1] https://www.nhtsa.gov/report-a-safety-problem
        
           | gamblor956 wrote:
           | If what you say is true, name the car model and file a report
           | with the NHTSA.
        
           | HeadsUpHigh wrote:
            | I've had a similar experience with a Hyundai with steering
            | assist. It would get confused by messed-up road markings
            | all the time. Meanwhile it had no problem climbing an
            | unmarked road curb. And it would constantly try to nudge
            | the steering wheel, meaning I had to put force into
            | holding it in place all the time, which was extra fatigue.
           | 
           | Oh and it was on by default, meaning I had to disable it
           | every time I turned the car on.
        
             | shepherdjerred wrote:
             | What model year? I'm guessing it's an older one?
             | 
             | My Hyundai is a 2021 and I have to turn on the steering
             | assist every time which I find annoying. My guess is that
             | you had an earlier model where the steering assist was more
             | liability than asset.
             | 
             | It's understandable that earlier versions of this kind of
             | thing wouldn't function as well, but it is very strange
             | that they would have it on by default.
        
         | m463 wrote:
         | I believe it's called "Full Self Driving (Supervised)"
        
           | maeil wrote:
           | The part in parentheses has only recently been added.
        
             | rsynnott wrote:
             | And is, well, entirely contradictory. An absolute
             | absurdity; what happens when the irresistible force of the
             | legal department meets the immovable object of marketing.
        
             | tharant wrote:
             | Prior to that, FSD was labeled 'Full Self Driving (Beta)'
             | and enabling it triggered a modal that required two
             | confirmations explaining that the human driver must always
             | pay attention and is ultimately responsible for the
             | vehicle. The feature also had/has active driver monitoring
             | (via both vision and steering-torque sensors) that would
             | disengage FSD if the driver ignored the loud audible alarm
             | to "Pay attention". Since changing the label to
             | '(Supervised)', the audible nag is significantly reduced.
        
       | Aeolun wrote:
       | I love how the image in the article has a caption that says it
       | tells you to pay attention to the road, but I had to zoom in all
       | the way to figure out where that message actually was.
       | 
       | I'd expect something big and red with a warning triangle or
       | something, but it's a tiny white message in the center of the
       | screen.
        
         | valine wrote:
         | It gets progressively bigger and louder the longer you ignore
         | it. After 30ish seconds it sounds an alarm and kicks you out.
        
           | FireBeyond wrote:
           | > After 30ish seconds it sounds an alarm and kicks you out.
           | 
           | That's much better. When AP functionality was introduced, the
           | alarm was _fifteen MINUTES_.
        
         | taspeotis wrote:
         | Ah yes, red with a warning like "WARNING: ERROR: THE SITUATION
         | IS NORMAL!"
         | 
          | Some cars have cruise control but an analog gauge cluster
          | that can't display WARNING ERRORs; they hide stuff like "you
          | still have to drive the car" in a manual you have to read,
          | yet nobody cares about that.
         | 
         | Honestly driving a car should require some sort of license for
         | a bare minimum of competence.
        
       | 23B1 wrote:
       | "Move fast and kill people"
       | 
       | Look, I don't know who needs to hear this, but just stop
       | supporting this asshole's companies. You don't need internet when
       | you're camping, you don't need a robot to do your laundry, you
       | don't need twitter, you can find more profitable and reliable
       | places to invest.
        
         | hnburnsy wrote:
         | Move slow and kill people...
         | 
         | GM ignition switch deaths 124
         | 
         | https://money.cnn.com/2015/12/10/news/companies/gm-recall-ig...
        
           | 23B1 wrote:
           | And?
        
       | wg0 wrote:
        | Amid all the AI hype, the foundational problem is that even
        | computer vision is not a solved problem at human-level
        | accuracy, and that's at the heart of the issues with both
        | Tesla and the Amazon checkout.
       | 
        | Otherwise, as a thought experiment, imagine just a tiny one-
        | inch-tall person glued to the grocery trolley and another
        | sitting on each shelf - just these two alone are all you need
        | for "automated checkout".
        
         | vineyardmike wrote:
          | > Otherwise, as a thought experiment, imagine just a tiny
          | one-inch-tall person glued to the grocery trolley and
          | another sitting on each shelf - just these two alone are
          | all you need for "automated checkout".
         | 
         | I don't think this would actually work, as silly a thought
         | experiment as it is.
         | 
         | The problem isn't the vision, it's state management and cost.
         | It was very easy (but expensive) to see and classify via CV if
         | a person picked something up, it just requires hundreds of
         | concurrent high resolution streams and a way to stitch the
         | global state from all the videos.
         | 
          | A little 1-inch person on each shelf needs a good way to
          | communicate to every other tiny person what they see, and
          | come to consensus. If 5 people/cameras detect person A
          | picking something up, you need to differentiate between 5
          | discrete actions and 1 action seen 5 times.
         | 
         | In case you didn't know, Amazon actually hired hundreds of
         | people in India to review the footage and correct mistakes (for
         | training the models). They literally had a human on each shelf.
         | And they still had issues with the state management. With
         | people.
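          | 
          | A toy sketch of that stitching problem (it assumes shopper
          | re-identification across cameras is already solved, which
          | is the genuinely hard part): merge per-camera "pickup"
          | detections so one pickup seen by 5 cameras isn't counted as
          | 5 pickups.
          | 
          |     from dataclasses import dataclass
          | 
          |     @dataclass
          |     class Detection:
          |         camera_id: str
          |         shopper_id: str  # assumes cross-camera re-ID
          |         item: str
          |         t: float         # seconds
          | 
          |     def merge_events(detections, window=2.0):
          |         events = []
          |         for d in sorted(detections, key=lambda d: d.t):
          |             for e in events:
          |                 if (e["shopper"] == d.shopper_id
          |                         and e["item"] == d.item
          |                         and d.t - e["t"] <= window):
          |                     e["cams"].add(d.camera_id)  # same event
          |                     break
          |             else:
          |                 events.append({"shopper": d.shopper_id,
          |                                "item": d.item, "t": d.t,
          |                                "cams": {d.camera_id}})
          |         return events  # len(events) = distinct pickups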
        
           | wg0 wrote:
            | Yeah - that's exactly my point: humans were required to do
            | the recognition, and computer vision is NOT a solved
            | problem, regardless of tech bros' misleading techno-
            | optimism.
            | 
            | Distributed communication and state management, on the
            | other hand, is already a mostly solved problem with known
            | parameters. How else do you think thousands and thousands
            | of Kubernetes clusters work in the wild?
        
       | gnuser wrote:
        | I worked at an 18-wheeler automation unicorn.
        | 
        | I never once rode in one, for a reason.
        
         | akira2501 wrote:
         | Automate the transfer yards, shipping docks, and trucking
         | terminals. Make movement of cargo across these limited use
         | areas entirely automated and as smooth as butter. Queue drivers
         | up and have their loads automatically placed up front so they
         | can drop and hook in a few minutes and get back on the road.
         | 
         | I honestly think that's the _easier_ problem to solve by at
         | least two orders of magnitude.
        
           | dylan604 wrote:
           | Did you miss the news about the recent strike by the very
           | people you are suggesting to eliminate? This automation was
           | one of the points of contention.
           | 
            | Solving the problem might not be as easy as you suggest as
            | long as there are powerful unions involved.
        
             | akira2501 wrote:
             | This automation is inevitable. The ports are a choke point
             | created by unnatural monopoly and a labor union is the
             | incorrect solution. Particularly because their labor
             | actions have massive collateral damage to other labor
             | interests.
             | 
              | I believe that if trucking were properly unionized, the
              | port unions would be crushed. They're not that powerful;
              | they've just outlived this particular modernization the
              | longest out of their former contemporaries.
        
               | dylan604 wrote:
               | So a union is okay for the trucking industry, but not for
               | the dock workers?
               | 
               | And what exactly will the truckers be trucking if the
               | ports are crushed?
        
           | porphyra wrote:
           | There are a bunch of companies working on that. So far off
           | the top of my head I know of:
           | 
           | * Outrider: https://www.outrider.ai/
           | 
           | * Cyngn: https://www.cyngn.com/
           | 
           | * Fernride: https://www.fernride.com/
           | 
           | Any ideas what other ones are out there?
        
             | akira2501 wrote:
             | Promising. I'm actually more familiar with the actual
             | transportation and logistics side of the operation and
             | strictly within the USA. I haven't seen anything new put
             | into serious operation out here yet but I'll definitely be
             | watching for them.
        
       | xqcgrek2 wrote:
        | Lawfare can work both ways, and this administration is
        | basically a lame duck.
        
       | whiplash451 wrote:
       | Asking genuinely: is FSD enabled/accessible in EU?
        
         | AlotOfReading wrote:
         | FSD is currently neither legal nor enabled in the EU. That may
         | change in the future.
        
       | UltraSane wrote:
       | I'm astonished at how long Musk has been able to keep his
       | autonomous driving con going. He has been lying about it to
       | inflate Tesla shares for 10 years now.
        
         | ryandrake wrote:
         | Without consequences, there is no reason to stop.
        
           | UltraSane wrote:
           | When is the market going to realize Tesla is NEVER going to
           | have real level 4 autonomy where Tesla takes legal liability
           | for crashes the way Waymo has?
        
             | tstrimple wrote:
             | Market cares far more about money than lives. Until the
             | lives lost cost more than their profit, they give less than
             | zero fucks. Capitalism. Yay!
        
         | jjmarr wrote:
         | There's no con. It works. You can buy a Tesla that can drive
         | itself places and it's been technically capable of doing so for
         | several years. The current restrictions on autonomous cars are
         | legal, not technical.
        
           | UltraSane wrote:
           | It "works" if you mean often does incredibly stupid and
           | dangerous things and requires a person to be ready to take
           | over for it at any moment to prevent a crash. So far no Tesla
           | car has ever legally driven even a single mile without a
           | person in the driver's seat.
        
             | jjmarr wrote:
             | And? How does that make Elon Musk a con artist?
             | 
             | It's possible to physically get in a Tesla and have it
             | drive you from point A to point B. That's a self-driving
             | car. You're saying it's unreliable, makes mistakes, and can
             | be used illegally. That doesn't mean the car can't drive
             | itself, just that it doesn't do a very good job at "self-
             | driving"
        
               | UltraSane wrote:
               | "How does that make Elon Musk a con artist?"
               | 
               | Because since 2014 he has made wildly unrealistic claims
               | that he is smart enough to know were BS.
               | 
               | December 2015: "We're going to end up with complete
               | autonomy, and I think we will have complete autonomy in
               | approximately two years."
               | 
               | January 2016 In ~2 years, summon should work anywhere
               | connected by land & not blocked by borders, eg you're in
               | LA and the car is in NY
               | 
               | June 2016: "I really consider autonomous driving a solved
               | problem, I think we are less than two years away from
               | complete autonomy, safer than humans."
               | 
               | October 2016 By the end of next year, said Musk, Tesla
               | would demonstrate a fully autonomous drive from, say, a
               | home in L.A., to Times Square ... without the need for a
               | single touch, including the charging.
               | 
               | "A 2016 video that Tesla used to promote its self-driving
               | technology was staged to show capabilities like stopping
               | at a red light and accelerating at a green light that the
               | system did not have, according to testimony by a senior
               | engineer."
               | 
               | January 2017 The sensor hardware and compute power
               | required for at least level 4 to level 5 autonomy has
               | been in every Tesla produced since October of last year.
               | 
               | March 2017: "I think that [you will be able to fall
               | asleep in a Tesla] in about two years."
               | 
               | May 2017 Update on the coast to coast autopilot demo? -
               | Still on for end of year. Just software limited. Any
               | Tesla car with HW2 (all cars built since Oct last year)
               | will be able to do this.
               | 
               | March 2018 I think probably by end of next year [end of
               | 2019] self-driving will encompass essentially all modes
               | of driving and be at least 100% to 200% safer than a
               | person.
               | 
               | February 2019 We will be feature complete full self
               | driving this year. The car will be able to find you in a
               | parking lot, pick you up, take you all the way to your
               | destination without an intervention this year. I'm
               | certain of that. That is not a question mark. It will be
               | essentially safe to fall asleep and wake up at their
               | destination towards the end of next year
               | 
               | April 2019 We expect to be feature complete in self
               | driving this year, and we expect to be confident enough
               | from our standpoint to say that we think people do not
               | need to touch the wheel and can look out the window
               | sometime probably around the second quarter of next year.
               | 
               | May 2019 We could have gamed an LA/NY Autopilot journey
               | last year, but when we do it this year, everyone with
               | Tesla Full Self-Driving will be able to do it too
               | 
               | December 2020 I'm extremely confident that Tesla will
               | have level five next year, extremely confident, 100%
               | 
               | January 2021 FSD will be capable of Level 5 autonomy by
               | the end of 2021
        
           | two_handfuls wrote:
           | I tried it. It drives worse than a teenager.
           | 
           | There is absolutely no way this can safely drive a car
           | without supervision.
        
             | valval wrote:
             | It's been safer than a human driver for years. It's also
             | not meant to be unsupervised.
        
               | gitaarik wrote:
                | Something about these two statements seems to be in
                | conflict with each other, but maybe that is just kinda
                | Tesla PR talk.
        
               | valval wrote:
               | It's quite easy to be safer than a human driver, since
               | humans are just human. Supervision is required because
               | the system can face edge cases.
        
               | gitaarik wrote:
               | Ah ok so if humans would be supervised for their edge
               | cases then humans would actually be safer!
        
               | UltraSane wrote:
               | Edge cases like intersections?
        
               | UltraSane wrote:
               | It is cultish doublespeak.
        
               | lawn wrote:
               | Safer than a human driver...
               | 
               | According to Tesla.
        
           | peutetre wrote:
           | Musk believes Tesla without FSD is "worth basically zero":
           | https://www.businessinsider.com/elon-musk-tesla-worth-
           | basica...
           | 
           | Musk is now moving the FSD work to xAI, taking what
           | supposedly makes the public company Tesla valuable and
           | placing it into his private ownership:
           | https://www.wsj.com/tech/tesla-xai-partnership-elon-
           | musk-30e...
           | 
           | Seems like a good way to privatize shareholder capital.
        
         | porphyra wrote:
         | Just because it has taken 10 years longer than promised doesn't
         | mean that it will never happen. FSD has made huge improvements
         | this year and is on track to keep up the current pace so it
         | actually does seem closer than ever.
        
           | UltraSane wrote:
           | The current vision-only system is a clear technological dead-
           | end that can't go much more than 10 miles between
           | "disengagements". To be clear, "disengagements" would be
           | crashes if a human wasn't ready to take over. And not needing
           | a human driver is THE ENTIRE POINT! I will admit Musk isn't a
           | liar when Tesla has FSD at least as good as Waymo's system
           | and Tesla accepts legal liability for any crashes.
        
             | valval wrote:
             | You're wrong. Nothing about this is clear, and you'd be
             | silly to claim otherwise.
             | 
             | You should explore your bias and where it's coming from.
        
               | UltraSane wrote:
               | No Tesla vehicle has legally driven even a single mile
               | with no driver in the driver's seat. They aren't even
               | trying to play Waymo's game. The latest FSD software's
               | failure rate is at least 100 times higher than it needs
               | to be.
        
               | fallingknife wrote:
               | That's a stupid point. I've been in a Tesla that's driven
               | a mile by itself. It makes no difference if a person is
               | in the seat.
        
               | UltraSane wrote:
               | "It makes no difference if a person is in the seat." It
               | does when Musk is claiming that Tesla is going to sell a
               | car with no steering wheel!
               | 
               | The current Tesla FSD fails so often that a human HAS to
               | be in the driver seat ready to take over at any moment.
               | 
               | You really don't understand the enormous difference
               | between the current crappy level 2 Tesla FSD and Waymo's
               | level 4 system?
        
               | valval wrote:
               | The difference is that Tesla has a general algorithm,
               | while Waymo is hard coding scenarios.
               | 
               | I never really got why people bring Waymo up every time
               | Tesla's FSD is mentioned. Waymo isn't competing with
               | Tesla's vision.
        
               | porphyra wrote:
               | Waymo uses a learned planner and is far from "hardcoded".
               | In any case, imo both of these can be true:
               | 
               | * Tesla FSD works surprisingly well and improving
               | capabilities to hands free actual autonomy isn't as far
               | fetched as one might think.
               | 
               | * Waymo beat them to robotaxi deployment and scaling up
               | to multiple cities may not be as hard as people say.
               | 
               | It seems that self driving car fans are way too tribal
               | and seem to be convinced that the "other side" sucks and
               | is guaranteed to fail. In reality, it is very unclear as
               | both strategies have their merits and only time will tell
               | in the long run.
        
               | UltraSane wrote:
               | " Tesla FSD works surprisingly well and improving
               | capabilities to hands free actual autonomy isn't as far
               | fetched as one might think"
               | 
               | Except FSD doesn't work surprisingly well and there is no
               | way it will get as good as Waymo using vision-only.
               | 
               | "It seems that self driving car fans are way too tribal
               | and seem to be convinced that the "other side" sucks and
               | is guaranteed to fail."
               | 
               | I'm not being tribal, I'm being realistic based on the
               | very public performance of both systems.
               | 
               | If Musk was serious about his Robotaxi claims then Tesla
               | would be operating very differently. Instead it is pretty
               | obvious it all a con to inflate Tesla shares beyond all
               | reason.
        
               | UltraSane wrote:
               | The difference is that Waymo has a very well engineered
               | system using vision, LIDAR, and millimeter wave RADAR
               | that works well enough in limited areas to provide tens
               | of thousands of actual driver-less rides. Tesla has a
               | vision only system that sucks so bad a human has to be
               | ready to take over for it at any time like a parent
               | monitoring a toddler near stairs.
        
           | gitaarik wrote:
           | Just like AGI and the year of the Linux desktop ;P
        
             | porphyra wrote:
             | Honestly LLMs were a big step towards AGI, and gaming on
             | Linux is practically flawless now. Just played through
             | Black Myth Wukong with no issues out of the box.
        
               | UltraSane wrote:
               | LLMs are to AGI
               | 
               | as
               | 
               | A ladder is to getting to orbit.
               | 
                | I can see LLMs serving as a kind of memory for an AGI,
                | but something fundamentally different will be needed
                | for true reasoning and continuous self-improvement.
        
       | DoesntMatter22 wrote:
        | Each version has improved. FSD is realistically the hardest
        | thing humanity has ever tried to do. It involves an enormous
        | amount of manpower, compute power and human discoveries, and
        | has to work right in billions of scenarios.
       | 
       | Building a self flying plane is comically easy by comparison.
       | Building Starship is easier by comparison.
        
         | gitaarik wrote:
          | Ah ok, first it was possible within 2 years, and now it's
          | humanity's hardest problem? If it's really that hard, I
          | think we'd better put our resources into something more
          | useful, like new energy solutions; it seems we have an
          | energy crisis.
        
       | Animats wrote:
       | If Trump is elected, this probe will be stopped.
        
       | gitaarik wrote:
        | It concerns me that these Teslas can suddenly start acting
        | differently after a software update. Seems like a great target
        | for a cyber attack. Or just a failure by the company: a little
        | bug accidentally spread to millions of cars all over the
        | world.
       | 
       | And how is this regulated? Say the software gets to a point that
       | we deem it safe for full self driving, then it gets approved on
       | the road, and then Tesla adds a new fancy feature to their
       | software and rolls out an update. How are we to be confident that
       | it's safe?
        
         | rightbyte wrote:
         | Imagine all Teslas doing a full left right now. And full right
         | in left steer countries.
         | 
          | OTA updates, and auto-updates in general, are just things
          | that should not be in vehicles. The ECUs should have to be
          | air-gapped from the internet to be considered roadworthy.
        
       | soerxpso wrote:
       | For whatever it's worth, Teslas with Autopilot enabled crash
       | about once every 4.5M miles driven, whereas the overall rate in
       | the US is roughly one crash every 70K miles driven. Of course,
       | the selection effects around that stat can be debated (people
       | probably enable autopilot in situations that are safer than
       | average, the average tesla owner might be driving more carefully
       | or in safer areas than the average driver, etc), but it is a
       | pretty significant difference. (Those numbers are what I could
       | find at a glance; DYOR if you'd like more rigor).
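        | 
        | For scale, a quick check of the ratio those numbers imply,
        | taking them at face value:
        | 
        |     autopilot_miles_per_crash = 4_500_000
        |     us_average_miles_per_crash = 70_000
        |     print(autopilot_miles_per_crash / us_average_miles_per_crash)
        |     # ~64x fewer crashes per mile, before selection effects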
       | 
       | We have a lot of traffic fatalities in the US (in some states, an
       | entire order of magnitude worse than in some EU countries), but
       | it's generally not considered an issue. Nobody asks, "These
       | agents are crashing a lot; are they really competent to drive?"
       | when the agent is human, but when the agent is digital it becomes
       | a popular question even with a much lower crash rate.
        
         | deely3 wrote:
         | > Gaps in Tesla's telematic data create uncertainty regarding
         | the actual rate at which vehicles operating with Autopilot
         | engaged are involved in crashes. Tesla is not aware of every
         | crash involving Autopilot even for severe crashes because of
         | gaps in telematic reporting. Tesla receives telematic data from
         | its vehicles, when appropriate cellular connectivity exists and
         | the antenna is not damaged during a crash, that support both
         | crash notification and aggregation of fleet vehicle mileage.
         | Tesla largely receives data for crashes only with pyrotechnic
          | deployment, which are a minority of police-reported crashes. A
         | review of NHTSA's 2021 FARS and Crash Report Sampling System
         | (CRSS) finds that only 18 percent of police-reported crashes
         | include airbag deployments.
        
       | alexjplant wrote:
       | > The collision happened because the sun was in the Tesla
       | driver's eyes, so the Tesla driver was not charged, said Raul
       | Garcia, public information officer for the department.
       | 
       | Am I missing something or is this the gross miscarriage of
       | justice that it sounds like? The driver could afford a $40k
       | vehicle but not $20 polarized shades from Amazon? Negligence is
       | negligence.
        
         | theossuary wrote:
         | You know what they say, if you want to kill someone in the US,
         | do it in a car.
        
           | littlestymaar wrote:
            | _Crash Course: If You Want to Get Away With Murder, Buy a
            | Car_ by Woodrow Phoenix
        
           | immibis wrote:
           | In the US it seems you'd do it with a gun, but in Germany
           | it's cars.
           | 
           | There was this elderly driver who mowed down a family in a
           | bike lane waiting to cross the road in Berlin, driving over
           | the barriers between the bike lane and the car lane because
           | the cars in the car lane were too slow. Released without
           | conviction - it was an unforeseeable accident.
        
         | macintux wrote:
         | I have no idea what the conditions were like for this incident,
         | but I've blown through a 4-way stop sign when the sun was
         | setting. There's only so much sunglasses can do.
        
           | eptcyka wrote:
           | If environmental factors incapacitate you, should you not
           | slow down or stop?
        
           | vortegne wrote:
           | You shouldn't be on the road then? If you can't see, you
           | should slow down. If you can't handle driving in given
           | conditions safely for everyone involved, you should slow down
           | or stop. If everybody would drive like you, there'd be a
           | whole lot more death on the roads.
        
           | alexjplant wrote:
            | ¯\_(ツ)_/¯ If I can't see because of rain, hail, intense
            | sun reflections, frost re-forming on my windshield, etc.,
            | then I pull over and put my flashers on until the problem
            | subsides. Should I have kept the 4700 lb vehicle in fifth
            | gear at 55 mph without the ability to see in front of me
            | in each of these instances? I submit that I should not
            | have and that I did the right thing.
        
           | ablation wrote:
           | Yet so much more YOU could have done, don't you think?
        
           | Doctor_Fegg wrote:
           | Yes, officer, this one right here.
        
           | singleshot_ wrote:
           | > There's only so much sunglasses can do.
           | 
           | For everything else, you have brakes.
        
         | smdyc1 wrote:
         | Not to mention that when you can't see, you slow down? Does the
         | self-driving system do that sufficiently in low visibility?
         | Clearly not if it hit a pedestrian with enough force to kill
         | them.
         | 
          | The article mentions that Teslas only use cameras in their
          | system and Musk believes they are enough, because humans
          | only use their eyes. Well firstly, don't you want self-
          | driving systems to be _better_ than humans? Secondly,
          | humans don't just respond to visual cues as a computer
          | would. We also hear and respond to feelings, like the
          | sudden surge of anxiety or fear as our visibility is
          | suddenly reduced at high speed.
        
           | pmorici wrote:
            | The Tesla knows when its cameras are blinded by sun and
            | acts accordingly or tells the human to take over.
        
             | kelnos wrote:
              | Except when it doesn't actually do that, I guess? Like when
             | this pedestrian was killed?
        
             | eptcyka wrote:
             | If we were able to know when a neural net is failing to
             | categorize something, wouldn't we get AGI for free?
        
           | plorg wrote:
           | I would think one relevant factor is that human vision is
           | different than and in some ways significantly better than
           | cameras.
        
           | hshshshshsh wrote:
            | I think one of the reasons they focus only on vision is
            | that basically the entire transportation infrastructure
            | is designed with human eyes as the primary channel for
            | information.
            | 
            | Useful information for driving is communicated through
            | images in the form of road signs, traffic signals, etc.
        
             | nkrisc wrote:
             | I dunno, knowing the exact relative velocity of the car in
             | front of you seems like it could be useful and is something
             | humans can't do very well.
             | 
             | I've always wanted a car that shows my speed and the
             | relative speed (+/-) of the car in front of me. My car's
             | cruise control can maintain a set distance so obviously
             | it's capable of it but it doesn't show it.
        
             | SahAssar wrote:
             | We are "designed" (via evolution) to perceive and
             | understand the environment around us. The signage is
             | designed to be easily readable for us.
             | 
             | The models that drive these cars clearly either have some
             | more evolution to do or for us to design the world more to
             | their liking.
        
               | hshshshshsh wrote:
                | Yes. I was talking about why Tesla chose to use
                | vision, since they can't control the design of the
                | transport infra to their liking, at least for now.
        
           | jsight wrote:
           | Unfortunately there is also an AI training problem embedded
           | in this. As Mobileye says, there are a lot of driver
           | decisions that are common, but wrong. The famous example is
           | rolling stops, but also failing to slow down for conditions
           | is really common.
           | 
           | It wouldn't shock me if they don't have nearly enough
           | training samples of people slowing appropriately for
           | visibility with eyes, much less slowing for the somewhat
           | different limitations of cameras.
        
         | jabroni_salad wrote:
         | Negligence is negligence but people tend to view vehicle
         | collisions as "accidents", as in random occurrences dealt by
         | the hand of fate completely outside of anyone's control. As
         | such, there is a chronic failure to charge motorists with
         | negligence, even when they have killed someone.
         | 
          | If you end up in court, just ask for a jury and you'll be
          | okay. I'm pretty sure this guy didn't even go to court;
          | sounds like it got prosecutor's discretion.
        
       | rKarpinski wrote:
       | 'Pedestrian' in this context seems pretty misleading
       | 
       | "Two vehicles collided on the freeway, blocking the left lane. A
       | Toyota 4Runner stopped, and two people got out to help with
       | traffic control. A red Tesla Model Y then hit the 4Runner and one
       | of the people who exited from it. "
       | 
       | edit: Parent article was changed... I was referring to the title
       | of the NPR article.
        
         | Retric wrote:
         | More clarity may change people's opinion of the accident, but
         | IMO pedestrian meaningfully represents someone who is limited
         | to human locomotion and lacks any sort of protection in a
         | collision.
         | 
         | Which seems like a reasonable description of the type of
         | failure involved in the final few seconds before impact.
        
           | rKarpinski wrote:
           | Omitting that the pedestrian was on a freeway meaningfully
           | mis-represents the situation.
        
             | Retric wrote:
             | People walking on freeways may be rare from the perspective
             | of an individual driver but not a self driving system
             | operating on millions of vehicles.
        
               | rKarpinski wrote:
               | What does that have to do with the original article's
               | misleading title?
        
               | Retric wrote:
                | I don't think it's misleading. It's a title, not some
                | hundred-word description of what exactly happened.
                | 
                | Calling them motorists would definitely be misleading
                | by comparison. Using the simple "fatal crash" of the
                | linked title implies the other people might be
                | responsible, which is misleading.
               | 
               | Using accident but saying Tesla was at fault could open
               | them up to liability and therefore isn't an option.
        
               | rKarpinski wrote:
               | > I don't think it's misleading. It's a tile not some
               | hundred word description of what exactly happened.
               | 
               | "Pedestrian killed on freeway" instead of "pedestrian
               | killed" doesn't take 100 words and doesn't give the
               | impression Tesla's are mowing people down on crosswalks
               | (although that's a feature to get clicks, not a bug).
        
               | Retric wrote:
               | Without context that implies the pedestrians shouldn't
               | have been on the freeway.
               | 
               | It's not an issue for Tesla, but it does imply bad things
               | about the victims.
        
               | rKarpinski wrote:
               | A title of "U.S. to probe Tesla's 'Full Self-Driving'
               | system after pedestrian killed on freeway" would in no
               | way imply bad things about the pedestrian who was killed.
        
           | potato3732842 wrote:
           | This sort of framing you're engaging in is exactly what the
           | person you're replying to is complaining about.
           | 
            | Yeah, the person who got hit was technically a pedestrian,
            | but just using that word with no other context doesn't
            | convey that it was a pedestrian on a limited-access
            | highway vs somewhere pedestrians are allowed and expected.
            | Without additional explanation, people assume normalcy and
            | think the pedestrian was crossing a city street or doing
            | something pedestrians do all the time and are expected to
            | do, when that is very much not what happened here.
        
             | Retric wrote:
             | Dealing with people on freeways is the kind of edge case
             | humans aren't good at but self driving cars have zero
             | excuses. It's a common enough situation that someone will
             | exit a vehicle after a collision to make it a very
             | predictable edge case.
             | 
             | Remember all of the bad press Uber got when a pedestrian
             | was struck and killed walking their bike across the middle
             | of a street at night? People are going to be on limited
             | access freeways and these systems need to be able to deal
             | with it. https://www.bbc.com/news/technology-54175359
        
               | potato3732842 wrote:
               | I'd make the argument that people are very good at
               | dealing with random things that shouldn't be on freeways
               | as long as they don't coincide with blinding sun or other
               | visual impairment.
               | 
               | Tesla had a long standing issue detecting partial lane
               | obstructions. I wonder if the logic around that has
               | anything to do with this.
        
         | neom wrote:
         | That is the correct use of pedestrian as a noun.
        
           | echoangle wrote:
           | Sometimes using a word correctly is still confusing because
           | it's used in a different context 90% of the time.
        
           | szundi wrote:
            | I think the parent commenter emphasized the context.
            | 
            | Leaving out context that would otherwise change the
            | interpretation for most (or targeted) people is the main
            | way to mislead people without technically lying.
        
             | neom wrote:
             | I mean it's the literal language they use in the report[1].
             | Personally, would much prefer a publication to be
             | technically correct, a person on foot on a motorway is
             | referred to as a pedestrian, that is the name for that.
             | 
             | [1]https://static.nhtsa.gov/odi/inv/2024/INOA-
             | PE24031-23232.pdf
        
           | varenc wrote:
           | By a stricter definition, a pedestrian is one who _travels_
           | by foot. Of course, they are walking, but they're traveling
           | via their car, so by some interpretations you wouldn't call
           | them a pedestrian. You could call them a "motorist" or a
           | "stranded vehicle occupant".
           | 
           | For understanding the accident it does seem meaningful that
           | they were motorists that got out of their car on a highway
           | and not pedestrians at a street crossing. (Still inexcusable
           | of course, but changes the context)
        
             | bastawhiz wrote:
             | Cars and drivers ideally shouldn't hit people who exited
             | their vehicles after an accident on a highway. Identifying
             | and avoiding hazards is part of driving.
        
             | neom wrote:
             | As far as I am aware, pes doesn't carry an inherent meaning
             | of travel. Pedestrian just means foot on, they don't need
             | to be moving, they're just not in carriage. As an aside,
             | distinguishing a person's mode of presence is precisely
             | what reports aim to capture.
             | 
             | (I also do tend to avoid this level of pedantry, the points
             | here are all well taken to be clear. I do think the
             | original poster was fine in their comment, I was just
             | sayin' - but this isn't a cross I would die on :))
        
           | sebzim4500 wrote:
            | That's why he said misleading rather than an outright
            | lie. He is not disputing that it is technically correct
            | to refer to the deceased as a pedestrian, but this
            | scenario (someone out of their car on a freeway) is not
            | what is going to spring to the mind of someone just
            | reading the headline.
        
         | danans wrote:
         | > Pedestrian' in this context seems pretty misleading
         | 
         | What's misleading? The full quote:
         | 
         | "A red Tesla Model Y then _hit the 4Runner and one of the
         | people who exited from it_. A 71-year-old woman from Mesa,
         | Arizona, was pronounced dead at the scene. "
         | 
         | If you exit a vehicle, and are on foot, you are a pedestrian.
         | 
         | I wouldn't expect FSD's object recognition system to treat a
         | human who has just exited a car differently than a human
         | walking across a crosswalk. A human on foot is a human on foot.
         | 
         | However, from the sound of it, the object recognition system
         | didn't even see the 4Runner, much less a person, so perhaps
         | there's a more fundamental problem with it?
         | 
         | Perhaps this is something that lidar or radar, if the car had
         | them, would have helped the OR system to see.
        
           | jfoster wrote:
           | The description has me wondering if this was definitely a
           | case where FSD was being used. There have been other cases in
           | the past where drivers had an accident and claimed they were
           | using autopilot when they actually were not.
           | 
           | I don't know for sure, but I would think that the car could
           | detect a collision. I also don't know for sure, but I would
           | think that FSD would stop once a collision has been detected.
        
             | bastawhiz wrote:
             | Did the article say the Tesla didn't stop after the
             | collision?
        
               | jfoster wrote:
               | If it hit the vehicle and then hit one of the people who
               | had exited the vehicle with enough force for it to result
               | in a fatality, it sounds like it might not have applied
               | any braking.
               | 
               | Of course, that depends on the speed it was traveling at
               | to begin with.
        
             | FireBeyond wrote:
             | > FSD would stop once a collision has been detected.
             | 
             | Fun fact, at least until very recently, if not even to this
             | moment, AEB (emergency braking) is not a part of FSD.
        
             | pell wrote:
             | > There have been other cases in the past where drivers had
             | an accident and claimed they were using autopilot when they
             | actually were not.
             | 
              | Wouldn't this be logged by the event data recorder?
        
             | danans wrote:
             | > There have been other cases in the past where drivers had
             | an accident and claimed they were using autopilot when they
             | actually were not.
             | 
             | If that were the case here, there wouldn't be a government
             | probe, right? It would be a normal "multi car pileup with a
             | fatality" and added to statistics.
             | 
              | With the strong incentive on the part of both the
              | driver and Tesla to lie about this, there should be
              | strong regulations around event data recorders [1] for
              | self-driving systems, and huge penalties for violating
              | them. A search across that site doesn't return a hit
              | for the word "retention", but it's gotta be expressed
              | in some way there.
             | 
             | 1. https://www.ecfr.gov/current/title-49/subtitle-B/chapter
             | -V/p...
        
           | potato3732842 wrote:
           | Tesla's were famously poor at detecting partial lane
           | obstructions for a long time. I wonder if that's what
           | happened here.
        
       | AlchemistCamp wrote:
       | The interesting question is how good self-driving has to be
       | before people tolerate it.
       | 
       | It's clear that having half the casualty rate per distance
       | traveled of the median human driver isn't acceptable. How about a
       | quarter? Or a tenth? Accidents caused by human drivers are one of
       | the largest causes of injury and death, but they're not
       | newsworthy the way an accident involving automated driving is.
       | It's all too easy to see a potential future where many people die
       | needlessly because technology that could save lives is regulated
       | into a greatly reduced role.
        
         | iovrthoughtthis wrote:
         | at least 10x better than a human
        
           | becquerel wrote:
           | I believe Waymo has already beaten this metric.
        
             | szundi wrote:
                | Waymo is limited to cities that their engineers have
                | to map, and that map has to be maintained.
                | 
                | You cannot put a Waymo in a new city before that.
                | With Tesla, what you get is universal.
        
               | RivieraKid wrote:
               | Waymo is robust to removing the map / lidars / radars /
               | cameras or adding inaccuracies to any of these 4 inputs.
               | 
               | (Not sure if this is true for the production system or
               | the one they're still working on.)
        
         | triyambakam wrote:
         | Hesitation around self-driving technology is not just about the
         | raw accident rate, but the nature of the accidents. Self-
         | driving failures often involve highly visible, preventable
         | mistakes that seem avoidable by a human (e.g., failing to stop
         | for an obvious obstacle). Humans find such incidents harder to
         | tolerate because they can seem fundamentally different from
         | human error.
        
           | crazygringo wrote:
           | Exactly -- it's not just the overall accident rate, but the
           | rate _per accident type_.
           | 
           | Imagine if self-driving is 10x safer on freeways, but on the
           | other hand is 3x more likely to run over your dog in the
           | driveway.
           | 
           | Or it's 5x safer on city streets overall, but actually 2x
           | _worse_ in rain and ice.
           | 
           | We're fundamentally wired for loss aversion. So I'd say it's
           | less about what the total improvement rate is, and more about
           | whether it has categorizable scenarios where it's still worse
           | than a human.
        
         | becquerel wrote:
         | My dream is of a future where humans are banned from driving
         | without special licenses.
        
           | gambiting wrote:
           | So.........like right now you mean? You need a special
           | licence to drive on a public road right now.
        
             | seizethecheese wrote:
             | Geez, clearly they mean like a CDL
        
             | nkrisc wrote:
             | The problem is it's obviously too easy to get one and keep
             | one, based on some of the drivers I see on the road.
        
               | gambiting wrote:
               | That sounds like a legislative problem where you live.
               | Sure, it can be fixed by overbearing technology, but we
               | already have all the tools we need to fix it; we are just
               | choosing not to for some reason.
        
           | FireBeyond wrote:
           | And yet Tesla's FSD never passed a driving test.
        
         | Arainach wrote:
         | This is about lying to the public and stoking false
         | expectations for years.
         | 
         | If it's "fully self driving" Tesla should be liable for when
         | its vehicles kill people. If it's not fully self driving and
         | Tesla keeps using that name in all its marketing, regardless of
         | any fine print, then Tesla should be liable for people acting
         | as though their cars could FULLY self drive and be sued
         | accordingly.
         | 
         | You don't get to lie just because you're allegedly safer than a
         | human.
        
           | jeremyjh wrote:
           | I think this is the answer: the company takes on full
           | liability. If a Tesla is Fully Self Driving then Tesla is
           | driving it. The insurance market will ensure that dodgy
           | software/hardware developers exit the industry.
        
             | blagie wrote:
             | This is very much what I would like to see.
             | 
             | The price of insurance is baked into the price of a car. If
             | the car is as safe as I am, I pay the same price in the
             | end. If it's safer, I pay less.
             | 
             | From my perspective:
             | 
             | 1) I would *much* rather have Honda kill someone than
             | myself. If I killed someone, the psychological impact on
             | myself would be horrible. In the city I live in, I dread
             | ageing; as my reflexes get slower, I'm more and more likely
             | to kill someone.
             | 
             | 2) As a pedestrian, most of the risk seems to come from
             | outliers -- people who drive hyper-aggressively. Replacing
             | all cars with a median driver would make me much safer (and
             | traffic, much more predictable).
             | 
             | If we want safer cars, we can simply raise insurance
             | payouts, and vice-versa. The market works everything else
             | out.
             | 
             | Either way, my stress levels go way down, whether in a
             | car, on a bike, or on foot.
        
               | gambiting wrote:
               | >> I would _much_ rather have Honda kill someone than
               | myself. If I killed someone, the psychological impact on
               | myself would be horrible.
               | 
               | Except that we know that it doesn't work like that. Train
               | drivers are ridden with extreme guilt every time "their"
               | train runs over someone, even though they know that
               | logically there was absolutely nothing they could have
               | done to prevent it. Don't see why it would be any
               | different here.
               | 
               | >>If we want safer cars, we can simply raise insurance
               | payouts, and vice-versa
               | 
               | In what way? In the EU the minimum covered amount for any
               | car insurance is 5 million euro, it has had no impact on
               | the safety of cars. And of course the recent increase in
               | payouts (due to the general increase in labour and parts
               | cost) has led to a dramatic increase in insurance
               | premiums, which in turn has led to a drastic increase in
               | the number of people driving without insurance. So now
               | that needs increased policing and enforcement, which we
               | pay for through taxes. So no, market doesn't "work
               | everything out".
        
               | blagie wrote:
               | > Except that we know that it doesn't work like that.
               | Train drivers are ridden with extreme guilt every time
               | "their" train runs over someone, even though they know
               | that logically there was absolutely nothing they could
               | have done to prevent it. Don't see why it would be any
               | different here.
               | 
               | It's not binary. Someone dying -- even with no
               | involvement -- can be traumatic. I've been in a position
               | where I could have taken actions to prevent someone from
               | being harmed. Rationally not my fault, but in retrospect,
               | I can describe the exact set of steps needed to prevent
               | it. I feel guilty about it, even though I know rationally
               | it's not my fault (there's no way I could have known
               | ahead of time).
               | 
               | However, it's a manageable guilt. I don't think it would
               | be if I knew rationally that it was my fault.
               | 
               | > So no, market doesn't "work everything out".
               | 
               | Whether or not a market works things out depends on
               | issues like transparency and information. Parties will
               | offload costs wherever possible. In the model you gave,
               | there is no direct cost to a car maker making less safe
               | cars or vice-versa. It assumes the car buyer will even
               | look at insurance premiums, and a whole chain of events
               | beyond that.
               | 
               | That's different if it's the same party making cars,
               | paying money, and doing so at scale.
               | 
               | If Tesla pays for everyone damaged in any accident a
               | Tesla car has, then Tesla has a very, very strong
               | incentive to make safe cars to whatever optimum is set by
               | the damages. Scales are big enough -- millions of cars
               | and billions of dollars -- where Tesla can afford to hire
               | actuaries and a team of analysts to make sure they're at
               | the optimum.
               | 
               | As an individual car buyer, I have no chance of doing
               | that.
               | 
               | Ergo, in one case, the market will work it out. In the
               | other, it won't.
        
             | tensor wrote:
             | I'm for this as long as the company also takes on liability
             | for human errors they could prevent. I'd want to see cars
             | enforcing speed limits and similar things. Humans are too
             | dangerous to drive.
        
             | stormfather wrote:
             | That would be good because it would incentivize all FSD
             | cars communicating with each other. Imagine how safe
             | driving would be if they are all broadcasting their speed
             | and position to each other. And each vehicle
             | sending/receiving gets cheaper insurance.
        
               | Terr_ wrote:
               | It goes kinda dystopic if access to the network becomes a
               | monopolistic barrier.
        
             | KoolKat23 wrote:
             | That's just reducing the value of a life to a number. It
             | can be gamed to a situation where it's just more profitable
             | to mow down people.
             | 
             | Deciding an acceptable number/financial cost is also just
             | an indirect, approximate way of implementing a more
             | direct/scientific regulation. Not everything needs to be
             | reduced to money.
        
               | jeremyjh wrote:
               | There is no way to game it successfully; if your
               | insurance costs are much higher than your competitors you
               | will lose in the long run. That doesn't mean there can't
               | be other penalties when there is gross negligence.
        
           | SoftTalker wrote:
           | It's your car, so ultimately the liability is yours. That's
           | why you have insurance. If Tesla retains ownership, and just
           | lets you drive it, then they have (more) liability.
        
           | mrpippy wrote:
           | Tesla officially renamed it to "Full Self Driving
           | (supervised)" a few months ago, previously it was "Full Self
           | Driving (beta)"
           | 
           | Both names are ridiculous, for different reasons. Nothing
           | called a "beta" should be tested on public roads without a
           | trained employee supervising it (i.e. being paid to pay
           | attention). And of course it was not "full", it always
           | required supervision.
           | 
           | And "Full Self Driving (supervised)" is an absurd oxymoron.
           | Given the deaths and crashes that we've already seen, I'm
           | skeptical of the entire concept of a system that works 98% of
           | the time, but also needs to be closely supervised for the 2%
           | of the time when it tries to kill you or others (with no
           | alerts).
           | 
           | It's an abdication of duty that NHTSA has let this continue
           | for so long. They've picked up the pace recently, and I
           | wouldn't be surprised if they come down hard on Tesla (unless
           | Trump wins, in which case Elon will be put in charge of
           | NHTSA, the SEC, and FAA)
        
         | gambiting wrote:
         | >>. How about a quarter? Or a tenth?
         | 
         | The answer is zero. An airplane autopilot has increased the
         | overall safety of airplanes by several orders of magnitude
         | compared to human pilots, but literally no errors in its
         | operation are tolerated, whether they are deadly or not. The
         | exact same standard has to apply to cars or any automated
         | machine for that matter. If there is any issue discovered in
         | any car with this tech then it should be disabled worldwide
         | until the root cause is found and eliminated.
         | 
         | >> It's all too easy to see a potential future where many
         | people die needlessly because technology that could save lives
         | is regulated into a greatly reduced role.
         | 
         | I really don't like this argument, because we could already
         | prevent literally all automotive deaths tomorrow through
         | existing technology and legislation and yet we are choosing not
         | to do this for economic and social reasons.
        
           | esaym wrote:
           | You can't equate airplane safety with automotive safety. I
           | worked at an aircraft repair facility doing government
           | contracts for a number of years. In one instance, somebody
           | lost the toilet paper holder for one of the aircraft. This
           | holder was simply a piece of 10 gauge wire that was bent in a
           | way to hold it and supported by wire clamps screwed to the
           | wall. Making a new one was easy but since it was a new part
           | going on the aircraft we had to send it to a lab to be
           | certified to hold a roll of toilet paper to 9 g's. In case
           | the airplane crashed you wouldn't want a roll of toilet paper
           | flying around I guess. And that cost $1,200.
        
             | gambiting wrote:
             | No, I'm pretty sure I can in this regard - any automotive
             | "autopilot" has to be held to the same standard. It's
             | either zero accidents or nothing.
        
           | travem wrote:
           | > The answer is zero
           | 
           | If autopilot is 10x safer then preventing its use would lead
           | to more preventable deaths and injuries than allowing it.
           | 
           | I agree that it should be regulated and incidents thoroughly
           | investigated, however letting perfect be the enemy of good
           | leads to stagnation and lack of practical improvement and
           | greater injury to the population as a whole.
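           | 
           | Taking the hypothetical at face value, the scale is easy to
           | sketch (back-of-envelope only; the human rate is a rough US
           | ballpark and the 10x factor is just the premise, not
           | measured FSD data):
           | 
           |     MILES = 100e9               # hypothetical AV miles
           |     human_rate = 1.3 / 100e6    # ~1.3 deaths per 100M miles
           |     av_rate = human_rate / 10   # the "10x safer" premise
           |     saved = MILES * (human_rate - av_rate)
           |     print(saved)                # ~1170 fewer deaths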
        
             | gambiting wrote:
             | >>If autopilot is 10x safer then preventing its use would
             | lead to more preventable deaths and injuries than allowing
             | it.
             | 
             | And yet whenever there is a problem with any plane
             | autopilot it's preemptively disabled fleet wide and pilots
             | have to fly manually even though we absolutely beyond a
             | shadow of a doubt know that it's less safe.
             | 
             | If an automated system makes a wrong decision and it
             | contributes to harm/death then it cannot be allowed on
             | public roads full stop, no matter how many lives it saves
             | otherwise.
        
               | exe34 wrote:
               | > And yet whenever there is a problem with any plane
               | autopilot it's preemptively disabled fleet wide and
               | pilots have to fly manually even though we absolutely
               | beyond a shadow of a doubt know that it's less safe.
               | 
               | just because we do something dumb in one scenario isn't a
               | very persuasive reason to do the same in another.
               | 
               | > then it cannot be allowed on public roads full stop, no
               | matter how many lives it saves otherwise.
               | 
               | ambulances sometimes get into accidents - we should ban
               | all ambulances, no matter how many lives they save
               | otherwise.
        
               | CrimsonRain wrote:
               | So your only concern is, when something goes wrong, need
               | someone to blame. Who cares about lives saved. Vaccines
               | can cause adverse effects. Let's ban all of them.
               | 
               | If people like you were in charge of anything, we'd still
               | be hitting rocks for fire in caves.
        
             | penjelly wrote:
             | I'd challenge the legitimacy of the claim that it's 10x
             | safer, or even safer at all. The safety data provided isn't
             | compelling to me; it can be gamed or misrepresented in
             | various ways, as pointed out by others.
        
               | yCombLinks wrote:
               | That claim wasn't made. It was a hypothetical: what if it
               | were 10x safer? Would people tolerate it then?
        
           | V99 wrote:
           | Airplane autopilots follow a lateral & sometimes vertical
           | path through the sky prescribed by the pilot(s). They are
           | good at doing that. This does increase safety, because it
           | frees up the pilot(s) from having to carefully maintain a
           | straight 3d line through the sky for hours at a time.
           | 
           | But they do not listen to ATC. They do not know where other
           | planes are. They do not keep themselves away from other
           | planes. Or the ground. Or a flock of birds. They do not
           | handle emergencies. They make only the most basic control-
           | loop decisions about the control surface and power (if even
           | autothrottle equipped, otherwise that's still the meatbag's
           | job) changes needed to follow the magenta line drawn by the
           | pilot given a very small set of input data (position,
           | airspeed, current control positions, etc).
           | 
           | The next nearest airplane is typically at least 3 miles
           | laterally and/or 500' vertically away, because the errors
           | allowed with all these components are measured in hundreds of
           | feet.
           | 
           | None of this is even remotely comparable to a car using a
           | dozen cameras (or lidar) to make real-time decisions to drive
           | itself around imperfect public streets full of erratic
           | drivers and other pedestrians a few feet away.
           | 
           | What it is a lot like is what Tesla actually sells (despite
           | the marketing name). Yes it's "flying" the plane, but you're
           | still responsible for making sure it's doing the right thing,
           | the right way, and not going to hit anything or kill
           | anybody.
        
           | Aloisius wrote:
           | Autopilots aren't held to a zero error standard let alone a
           | zero accident standard.
        
           | peterdsharpe wrote:
           | > literally no errors in its operation are tolerated
           | 
           | Aircraft designer here, this is not true. We typically
           | certify to <1 catastrophic failure per 1e9 flight hours. Not
           | zero.
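           | 
           | For a sense of scale (toy numbers; the fleet-hours figure is
           | invented for illustration):
           | 
           |     rate = 1e-9          # certified failures per flight hour
           |     fleet_hours = 100e6  # hypothetical annual fleet hours
           |     print(rate * fleet_hours)  # 0.1 -> ~1 event per decade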
        
         | croes wrote:
         | > It's clear that having half the casualty rate per distance
         | traveled of the median human driver isn't acceptable.
         | 
         | Were the Teslas driving under all weather conditions at any
         | location like humans do, or is it just cherry-picked from the
         | easy travelling conditions?
        
         | jakelazaroff wrote:
         | I think we should not be satisfied with merely "better than a
         | human". Flying is so safe precisely because we treat any
         | casualty as unacceptable. We should aspire to make automobiles
         | _at least_ that safe.
        
           | aantix wrote:
           | Before FSD is allowed on public roads?
           | 
           | It's a net positive, saving lives right now.
        
         | akira2501 wrote:
         | > traveled of the median human driver isn't acceptable.
         | 
         | It's completely acceptable. In fact the numbers are lower than
         | they have ever been since we started driving.
         | 
         | > Accidents caused by human drivers
         | 
         | Are there any other types of drivers?
         | 
         | > are one of the largest causes of injury and death
         | 
         | More than half the fatalities on the road are actually caused
         | by the use of drugs and alcohol. The statistics are very clear
         | on this. Impaired people cannot drive well. Non impaired people
         | drive orders of magnitude better.
         | 
         | > technology that could save lives
         | 
         | There is absolutely zero evidence this is true. Everyone is
         | basing this off of a total misunderstanding of the source of
         | fatalities and a willful misapprehension of the technology.
        
           | blargey wrote:
           | > Non impaired people drive orders of magnitude better.
           | 
           | That raises the question - how many _impaired_ driver-miles
           | are being baked into the collision statistics for  "median
           | human" driver-miles? Shouldn't we demand non-impaired driving
           | as the standard for automation, rather than "averaged with
           | drunk / phone-fiddling / senile" driving? We don't give people
           | N-mile allowances for drunk driving based on the size of the
           | drunk driver population, after all.
        
         | aithrowawaycomm wrote:
         | Many people don't (and shouldn't) take the "half the casualty
         | rate" at face value. My biggest concern is that Waymo and Tesla
         | are juking the stats to make self-driving cars seem safer than
         | they really are. I believe this is largely an unintentional
         | consequence of bad actuarial science coming from bad qualitative
         | statistics; the worst kind of lying with numbers is lying to
         | yourself.
         | 
         | The biggest gap in these studies: I have yet to see a
         | comparison with human drivers that filters out DUIs, reckless
         | speeding, or mechanical failures. Without doing this it is
         | simply not a fair comparison, because:
         | 
         | 1) Self-driving cars won't end drunk driving unless it's made
         | mandatory by outlawing manual driving or ignition is tied to a
         | breathalyzer. Many people will continue to make the dumb
         | decision to drive themselves home because they are drunk and
         | driving is fun. This needs regulation, not technology. And DUIs
         | need to be filtered from the crash statistics when comparing
         | with Waymo.
         | 
         | 2) A self-driving car which speeds and runs red lights might
         | well be more dangerous than a similar human, but the data says
         | nothing about this since Waymo is currently on their best
         | behavior. Yet Tesla's own behavior and customers prove that
         | there is demand for reckless self-driving cars, and
         | manufacturers will meet the demand unless the law steps in.
         | Imagine a Waymo competitor that promises Uber-level ETAs for
         | people in a hurry. Technology could in theory solve this but in
         | practice the market could make things worse for several decades
         | until the next research breakthrough. Human accidents coming
         | from distraction are a fair comparison to Waymo, but speeding
         | or aggressiveness should be filtered out. The difficulty of
         | doing so is one of the many reasons I am so skeptical of these
         | stats.
         | 
         | 3) Mechanical failures are a hornets' nest of ML edge cases
         | that might work in the lab but fail miserably on the road.
         | Currently it's not a big deal because the cars are shiny and
         | new. Eventually we'll have self-driving clunkers owned by
         | drivers who don't want to pay for the maintenance.
         | 
         | And that's not even mentioning that Waymos are not self-
         | driving, they rely on close remote oversight to guide AI
         | through the many billions of common-sense problems that
         | computers will not be able to solve for at least the next decade,
         | probably much longer. True self-driving cars will continue to
         | make inexplicably stupid decisions: these machines are still
         | much dumber than lizards. Stories like "the Tesla slammed into
         | an overturned tractor trailer because the AI wasn't trained on
         | overturned trucks" are a huge problem and society will not let
         | Tesla try to launder it away with statistics.
         | 
         | Self-driving cars might end up saving lives. But would they
         | save more lives than adding mandatory breathalyzers and GPS-
         | based speed limits? And if market competition overtakes
         | business ethics, would they cost more lives than they save? The
         | stats say very little about this.
        
         | smitty1110 wrote:
         | There are two things going on here with the average person that
         | you need to overcome: that when Tesla dodges responsibility all
         | anyone sees is a liar, and that people amalgamate all the FSD
         | crashes and treat the system like a dangerous local driver that
         | nobody can get off the road.
         | 
         | Tesla markets FSD like it's a silver bullet, and the name is
         | truly misleading. The fine print says you need attention and
         | all that. But again, people read "Full Self Driving" and all
         | the marketing copy and think the system is assuming
         | responsibility for the outcomes. Then a crash happens, Tesla
         | throws the driver under the bus, and everyone gets a bit more
         | skeptical of the system. Plus, doing that to a person rubs
         | people the wrong way, and is in some respects a barrier to
         | sales.
         | 
         | Which leads to the other point: People are tallying up all the
         | accidents and treating the system like a person, and wondering
         | why this dangerous driver is still on the road. Most accidents
         | with dead pedestrians start with someone doing something stupid,
         | which is when they assume all responsibility, legally speaking.
         | Drunk, speeding, etc. Normal drivers in poor conditions slow
         | down and drive carefully. People see this accident, and treat
         | FSD like a serial drunk driver. It's to the point that I know
         | people that openly say they treat Teslas on roads like they're
         | erratic drivers just for existing.
         | 
         | Until Elon figures out how to fix his perception problem, the
         | calls for investigations and to keep his robotaxis off the
         | road will only grow.
        
         | danans wrote:
         | > The interesting question is how good self-driving has to be
         | before people tolerate it.
         | 
         | It's pretty simple: as good as it can be given available
         | technologies and techniques, without sacrificing safety for
         | cost or style.
         | 
         | With AVs, function and safety should obviate concerns of style,
         | cost, and marketing. If that doesn't work with your business
         | model, well tough luck.
         | 
         | Airplanes are far safer than cars yet we subject their
         | manufacturers to rigorous standards, or seemingly did until
         | recently, as the 737 max saga has revealed. Even still the
         | rigor is very high compared to road vehicles.
         | 
         | And AVs do have to be way better than people at driving because
         | they are machines that have no sense of human judgement, though
         | they operate in a human physical context.
         | 
         | Machines run by corporations are less accountable than human
         | drivers, not least because of the wealth and legal
         | armies of those corporations who may have interests other than
         | making the safest possible AV.
        
           | mavhc wrote:
           | Surely the number of cars that can do it, and the price, also
           | matter, unless you're going to ban private cars
        
             | danans wrote:
             | > Surely the number of cars that can do it, and the price,
             | also matter, unless you're going to ban private cars
             | 
             | Indeed, like this: the more cars sold that claim fully
             | autonomous capability, and the more affordable they get,
             | the higher the standards should be compared to their _AV_
             | predecessors, even if they have long eclipsed the human
             | driver's safety record.
             | 
             | If this is unpalatable, then let's assign 100% liability
             | with steep monetary penalties to the AV manufacturer for
             | any crash that happens under autonomous driving mode.
        
         | Terr_ wrote:
         | > It's clear that having half the casualty rate per distance
         | traveled of the median human driver isn't acceptable.
         | 
         | Even if we optimistically assume no "gotchas" in the statistics
         | [0], distilling performance down to a casualty/injury/accident-
         | rate can still be dangerously reductive, when they have a
         | different _distribution_ of failure-modes which do/don't mesh
         | with our other systems and defenses.
         | 
         | A quick thought experiment to prove the point: Imagine a system
         | which compared to human drivers had only half the rate of
         | accidents... But many of those are because it unpredictably
         | decides to jump the sidewalk curb and kill a targeted
         | pedestrian.
         | 
         | The raw numbers are encouraging, but it represents a risk
         | profile that clashes horribly with our other systems of road
         | design, car design, and what incidents humans are expecting and
         | capable of preventing or recovering from.
         | 
         | [0] Ex: Automation is only being used on certain _subsets_ of
         | all travel which are the  "easier" miles or circumstances than
         | the whole gamut a human would handle.
        
         | __loam wrote:
         | The problem is that Tesla is way behind the industry standards
         | here and it's misrepresenting how good their tech is.
        
         | alkonaut wrote:
         | > How about a quarter? Or a tenth?
         | 
         | Probably closer to the latter. The "skin in the game"
         | (physically) argument makes me more willing to accept drunk
         | drivers than greedy manufacturers when it comes to making
         | mistakes or being negligent.
        
         | sebzim4500 wrote:
         | >It's clear that having half the casualty rate per distance
         | traveled of the median human driver isn't acceptable.
         | 
         | Are you sure? Right now FSD is active with no one actually
         | knowing its casualty rate, and for the most part the only
         | people upset about it are terminally online people on twitter
         | or luddites on HN.
        
       | frabjoused wrote:
       | I don't understand why this debate/probing is not just data
       | driven. Driving is all big data.
       | 
       | https://www.tesla.com/VehicleSafetyReport
       | 
       | This report does not include fatalities, which seems to be the
       | key point in question. Unless the above report has some bias or
       | is false, Teslas in autopilot appear 10 times safer than the US
       | average.
       | 
       | Is there public data on deaths reported by Tesla?
       | 
       | And otherwise, if the stats say it is safer, why is there any
       | debate at all?
        
         | bastawhiz wrote:
         | Autopilot is not FSD.
        
           | frabjoused wrote:
           | That's a good point. Are there no published numbers on FSD?
        
         | JTatters wrote:
         | Those statistics are incredibly misleading.
         | 
         | - It is safe to assume that the vast majority of autopilot
         | miles are on highways (although Tesla don't release this
         | information).
         | 
         | - By far the safest roads per mile driven are highways.
         | 
         | - Autopilot will engage least during the most dangerous
         | conditions (heavy rain, snow, fog, nighttime).
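         | 
         | A toy illustration of the mix effect (all numbers invented):
         | a system can be worse in every setting and still post a better
         | aggregate rate, purely because its miles are concentrated on
         | the easy roads.
         | 
         |     # (crashes per million miles, % of miles) -- invented
         |     human = {"highway": (0.5, 30), "city": (3.0, 70)}
         |     av    = {"highway": (0.6, 90), "city": (3.5, 10)}
         | 
         |     def aggregate(mix):
         |         return sum(r * s for r, s in mix.values()) / 100
         | 
         |     print(aggregate(human))  # 2.25
         |     print(aggregate(av))     # 0.89 -- "safer" in aggregate,
         |                              # yet worse in both settings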
        
         | notshift wrote:
         | Without opening the link, the problem with every piece of data
         | I've seen from Tesla is they're comparing apples to oranges.
         | FSD won't activate in adverse driving conditions, aka when
         | accidents are much more likely to occur. And/or drivers are
         | choosing not to use it in those conditions.
        
         | FireBeyond wrote:
         | > Unless the above report has some bias or is false
         | 
         | Welcome to Tesla.
         | 
         | The report measures accidents in FSD mode. Qualifiers to FSD
         | mode: the conditions, weather, road, location, traffic all have
         | to meet a certain quality threshold before the system will be
         | enabled (or not disable itself). Compare Sunnyvale on a clear
         | spring day to Pittsburgh December nights.
         | 
         | There's no qualifier to the "comparison": all drivers, all
         | conditions, all weather, all roads, all locations, all traffic.
         | 
         | It's not remotely comparable, and Tesla's data people are not
         | that stupid, so it's willfully misleading.
         | 
         | > This report does not include fatalities
         | 
         | It also doesn't consider any incident where there was not
         | airbag deployment to be an accident. Sounds potentially
         | reasonable until you consider:
         | 
         | - first gen airbag systems were primitive: collision exceeds
         | threshold, deploy. Currently, vehicle safety systems consider
         | duration of impact, speeds, G-forces, amount of intrusion,
         | angle of collision, and a multitude of other factors before
         | deciding what, if any, systems to fire (seatbelt tensioners,
         | airbags, etc.) So hit something at 30mph with the right
         | variables? Tesla: "this is not an accident".
         | 
         | - Tesla also does not consider "incident was so catastrophic
         | that airbags COULD NOT deploy" to be an accident, because
         | "airbags didn't deploy". This umbrella also covers the
         | egregious case of "systems failed to deploy for any reason,
         | up to and including poor assembly line quality control" --
         | likewise not an accident, likewise "not counted".
         | 
         | > Is there public data on deaths reported by Tesla?
         | 
         | No, Tesla does not publish that.
         | 
         | They also refuse to give the public much of any data beyond
         | these carefully curated numbers. Hell, NHTSA/NTSB also mostly
         | have to drag heavily redacted data kicking and screaming out of
         | Tesla's hands.
        
         | jsight wrote:
         | The report from Tesla is very biased. It doesn't normalize for
         | the difficulty of the conditions involved, and is basically for
         | marketing purposes.
         | 
         | IMO, the challenge for NHTSA is that they can get tremendous
         | detail from Tesla but not from other makes. This will make it
         | very difficult for them to get a solid baseline for collisions
         | due to glare in non-FSD equipped vehicles.
        
       | testfrequency wrote:
       | I was in a Model 3 Uber yesterday and my driver had to swerve
       | onto and up a curb to avoid an idiot who was trying to turn into
       | traffic going in the other direction.
       | 
       | The Model 3 had every opportunity in the world to brake and it
       | didn't, we were probably only going 25mph. I know this is about
       | FSD here, but that moment 100% made me realize Tesla has awful
       | obstacle avoidance.
       | 
       | I just happened to be looking forward, and it was a very plain and
       | clear T-Bone avoidance, and at no point did the car handle or
       | trigger anything.
       | 
       | Thankfully everyone was ok, but the front lip got pretty beat up
       | from driving up the curb. Of course the driver at fault that
       | caused the whole incident drove off.
        
       | bastawhiz wrote:
       | Lots of people are asking how good the self driving has to be
       | before we tolerate it. I got a one month free trial of FSD and
       | turned it off after two weeks. Quite simply: it's dangerous.
       | 
       | - It failed with a cryptic system error while driving
       | 
       | - It started making a left turn far too early that would have
       | scraped the left side of the car on a sign. I had to manually
       | intervene.
       | 
       | - In my opinion, the default setting accelerates way too
       | aggressively. I'd call myself a fairly aggressive driver and it
       | is too aggressive for my taste.
       | 
       | - It tried to make way too many right turns on red when it wasn't
       | safe to. It would creep into the road, almost into the path of
       | oncoming vehicles.
       | 
       | - It didn't merge left to make room for vehicles merging onto the
       | highway. The vehicles then tried to cut in. The system should
       | have avoided an unsafe situation like this in the first place.
       | 
       | - It would switch lanes to go faster on the highway, but then
       | missed an exit on at least one occasion because it couldn't make
       | it back into the right lane in time. Stupid.
       | 
       | After the system error, I lost all trust in FSD from Tesla. Until
       | I ride in one and _feel_ safe, I can't have any faith that this
       | is a reasonable system. Hell, even autopilot does dumb shit on a
       | regular basis. I'm grateful to be getting a car from another
       | manufacturer this year.
        
         | frabjoused wrote:
         | The thing that doesn't make sense is the numbers. If it is
         | dangerous in your anecdotes, why don't the reported numbers
         | show more accidents when FSD is on?
         | 
         | When I did the trial on my Tesla, I also noted these kinds of
         | things and felt like I had to take control.
         | 
         | But at the end of the day, only the numbers matter.
        
           | akira2501 wrote:
           | You can measure risks without having to witness disaster.
        
           | ForHackernews wrote:
           | Maybe other human drivers are reacting quickly and avoiding
           | potential accidents from dangerous computer driving? That
           | would be ironic, but I'm sure it's possible in some
           | situations.
        
           | jsight wrote:
           | Because it is bad enough that people really do supervise it.
           | I see people who say that wouldn't happen because the drivers
           | become complacent.
           | 
           | Maybe that could be a problem with future versions, but I
           | don't see it happening with 12.3.x. I've also heard that
           | driver attention monitoring is pretty good in the later
           | versions, but I have no first hand experience yet.
        
             | valval wrote:
             | Very good point. The product that requires supervision and
             | tells the user to keep their hands on the wheel every 10
             | seconds is not good enough to be used unsupervised.
             | 
             | I wonder how things are inside your head. Are you ignorant
             | or affected by some strong bias?
        
           | bastawhiz wrote:
           | Is Tesla required to report system failures or the vehicle
           | damaging itself? How do we know they're not optimizing for
           | the benchmark (what they're legally required to report)?
        
             | rvnx wrote:
             | If the question is: "was FSD activated at the time of the
             | accident: yes/no", they can legally claim no, for example
             | if luckily the FSD disconnects half a second before a
             | dangerous situation (eg: glare obstructing cameras), which
             | may coincide exactly with the times of some accidents.
        
             | Uzza wrote:
             | All manufacturers have for some time been required by
             | regulators to report any accident where an autonomous or
             | partially autonomous system was active within 30 seconds of
             | the crash.
        
           | timabdulla wrote:
           | > If it is dangerous in your anecdotes, why don't the
           | reported numbers show more accidents when FSD is on?
           | 
           | Even if it is true that the data show that with FSD (not
           | Autopilot) enabled, drivers are in fewer crashes, I would be
           | worried about other confounding factors.
           | 
           | For instance, I would assume that drivers are more likely to
           | engage FSD in situations of lower complexity (less traffic,
           | little construction or other impediments, overall lesser
           | traffic flow control complexity, etc.) I also believe that at
           | least initially, Tesla only released FSD to drivers with high
           | safety scores relative to their total driver base, another
           | obvious confounding factor.
           | 
           | Happy to be proven wrong though if you have a link to a
           | recent study that goes through all of this.
        
             | valval wrote:
             | Either the system causes less loss of life than a human
             | driver or it doesn't. The confounding factors don't matter,
             | as Tesla hasn't presented a study on the subject. That's in
             | the future, and all stats that are being gathered right now
             | are just that.
        
               | unbrice wrote:
               | > Either the system causes less loss of life than a human
               | driver or it doesn't. The confounding factors don't
               | matter.
               | 
               | Confounding factors are what allow one to tell apart
               | "the system causes less loss of life" from "the system
               | causes more loss of life, yet it is only enabled in
               | situations where fewer lives are lost".
        
           | nkrisc wrote:
           | What numbers? Who's measuring? What are they measuring?
        
           | rvnx wrote:
           | There is an easy way to know what is really behind the
           | numbers: look who is paying in case of accident.
           | 
           | You have a Mercedes, Mercedes takes responsibility.
           | 
           | You have a Tesla, you take the responsibility.
           | 
           | Says a lot.
        
             | tensor wrote:
             | You have a Mercedes, and you have a system that works
             | virtually nowhere.
        
               | therouwboat wrote:
               | Better that way than "Oh it tried to run red light, but
               | otherwise it's great."
        
               | tensor wrote:
               | "Oh we tried to build it but no one bought it! So we gave
               | up." - Mercedes before Tesla.
               | 
               | Perhaps FSD isn't ready for city streets yet, but it's
               | great on the highways and I'd 1000x prefer we make
               | progress rather than settle for the status quo garbage
               | that the legacy makers put out. Also, human drivers are
               | the most dangerous by far; we need to make progress to
               | eventually phase them out.
        
             | sebzim4500 wrote:
             | Mercedes had the insight that if no one is able to actually
             | use the system then it can't cause any crashes.
             | 
             | Technically, that is the easiest way to get a perfect
             | safety record and journalists will seemingly just go along
             | with the charade.
        
           | lawn wrote:
           | > The thing that doesn't make sense is the numbers.
           | 
           | Oh? Who are presenting the numbers?
           | 
           | Is a crash that fails to trigger the airbags still not
           | counted as a crash?
           | 
           | What about the car turning off FSD right before a crash?
           | 
           | How about adjusting for factors such as age of driver and the
           | type of miles driven?
           | 
           | The numbers don't make sense because they're not good
           | comparisons and are made to make Tesla look good.
        
           | gamblor956 wrote:
           | The numbers collected by the NHTSA and insurance companies do
           | show that FSD is dangerous...that's why the NHTSA started
         | investigating, and it's why most insurance companies won't
         | insure Tesla vehicles or will charge significantly higher rates.
           | 
           | Also, Tesla is known to disable self-driving features right
           | before collisions to give the appearance of driver fault.
           | 
           | And the coup de grace: if Tesla's own data showed that FSD
           | was actually safer, they'd be shouting it from the moon,
           | using that data to get self-driving permits in CA, and
           | offering to assume liability if FSD actually caused an
           | accident (like Mercedes does with its self driving system).
        
           | throwaway562if1 wrote:
           | AIUI the numbers are for accidents where FSD is in control.
           | Which means if it does a turn into oncoming traffic and the
           | driver yanks the wheel or slams the brakes 500ms before
           | collision, it's not considered a crash during FSD.
        
             | Uzza wrote:
             | That is not correct. Tesla counts any accident within 5
             | seconds of Autopilot/FSD turning off as the system being
             | involved. Regulators extend that period to 30 seconds, and
             | Tesla must comply with that when reporting to them.
        
             | concordDance wrote:
             | Several people in this thread have been saying this or
             | similar. It's incorrect, from Tesla:
             | 
             | "To ensure our statistics are conservative, we count any
             | crash in which Autopilot was deactivated within 5 seconds
             | before impact"
             | 
             | https://www.tesla.com/en_gb/VehicleSafetyReport
             | 
             | Situations which inevitably cause a crash more than 5
             | seconds later seem like they would be extremely rare.
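             | 
             | A minimal sketch of how such a counting rule behaves and
             | why the window length matters (timestamps invented,
             | function hypothetical):
             | 
             |     def attributed(deactivated_at, impact_at, window_s):
             |         # Count the crash if the system was active within
             |         # window_s seconds of impact.
             |         if deactivated_at is None:  # still engaged
             |             return True
             |         return impact_at - deactivated_at <= window_s
             | 
             |     # System handed back control 8 s before impact:
             |     print(attributed(100, 108, window_s=5))   # False
             |     print(attributed(100, 108, window_s=30))  # True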
        
           | johnneville wrote:
           | Are there even transparent reported numbers available?
           | 
           | For whatever does exist, it is also easy to imagine how it
           | could be misleading. For instance, I've disengaged FSD when I
           | noticed I was about to be in an accident. If I couldn't
           | recover in time, the accident would not be when FSD is on
           | and, depending on the metric, would not be reported as an
           | FSD-induced accident.
        
         | thomastjeffery wrote:
         | It's not just about relative safety compared to all human
         | driving.
         | 
         | We all know that some humans are sometimes terrible drivers!
         | 
         | We also know what that looks like: Driving too fast or slow
         | relative to surroundings. Quickly turning every once in a while
         | to stay in their lane. Aggressively weaving through traffic.
         | Going through an intersection without spending the time to
         | actually look for pedestrians. The list goes on...
         | 
         | Bad human driving can be seen. Bad automated driving _is
         | invisible_. Do you think the people who were about to be hit by
         | a Tesla even realized that was the case? I sincerely doubt it.
        
           | bastawhiz wrote:
           | > Bad automated driving is invisible.
           | 
           | I'm literally saying that it is visible, to me, the
           | passenger. And for reasons that aren't just bad vibes. If I'm
           | in an Uber and I feel unsafe, I'll report the driver. Why
           | would I pay for my car to do that to me?
        
             | wizzwizz4 wrote:
             | GP means that the signs aren't obvious to other drivers. We
             | generally underestimate how important psychological
             | modelling is for communication, because it's transparent to
             | most of us under most circumstances, but AI systems have
             | _very_ different psychology to humans. It is easier to
             | interpret the body language of a fox than a self-driving
             | car.
        
         | dekhn wrote:
         | I don't think you're supposed to merge left when people are
         | merging on the highway into your lane -- you have right of way. I
         | find even with the right of way many people merging aren't
         | paying attention, but I deal with that by slightly speeding up
         | (so they can see me in front of them).
        
           | sangnoir wrote:
           | You don't have a right of way over a slow moving vehicle that
           | merged _ahead_ of you. Most ramps are not long enough to
           | allow merging traffic to accelerate to highway speeds before
           | merging, so many drivers free up the right-most lane for this
           | purpose (by merging left)
        
             | potato3732842 wrote:
             | Most ramps are more than long enough to accelerate close
             | enough to traffic speed if one wants to, especially in most
             | modern vehicles.
        
               | wizzwizz4 wrote:
               | Unless the driver in front of you didn't.
        
             | SoftTalker wrote:
             | If you can safely move left to make room for merging
             | traffic, you should. It's considerate and reduces the
             | chances of an accident.
        
             | dekhn wrote:
             | Since a number of people are giving pushback, can you point
             | to any (California-oriented) driving instructions
             | consistent with this? I'm not seeing any. I see people
             | saying "it's curteous", but when I'm driving I'm managing
             | hundreds of variables and changing lanes is often risky,
             | given motorcycles lanesplitting at high speed (quite
             | common).
        
               | davidcalloway wrote:
               | Definitely not California but literally the first part of
               | traffic law in Germany says that caution and
               | consideration are required from all partaking in traffic.
               | 
               | Germans are not known for poor driving.
        
               | dekhn wrote:
               | Right -- but the "consideration" here is the person merging
               | onto the highway actually paying attention and adjusting,
               | rather than pointedly not even looking (this is a very
               | common merging behavior where I live). Changing lanes
               | isn't without risk even on a clear day with good
               | visibility. Seems like my suggestion of slowing down or
               | speeding up makes perfect sense because it's less risky
               | overall, and is still being considerate.
               | 
               | Note that I personally do change lanes at times when it's
               | safe, convenient, I am experienced with the intersection,
               | and the merging driver is being especially unaware.
        
           | bastawhiz wrote:
           | Just because you have the right of way doesn't mean the
           | correct thing to do is to remain in the lane. If remaining in
           | your lane is likely to make _someone else_ do something
           | reckless, you should have been proactive. Not legally, for
           | the sake of being a good driver.
        
             | dekhn wrote:
             | Can you point to some online documentation that recommends
             | changing lanes in preference to speeding up when a person
             | is merging at too slow a speed? What I'm doing is following
             | CHP guidance in this post:
             | https://www.facebook.com/chpmarin/posts/lets-talk-about-
             | merg... """Finally, if you are the vehicle already
             | traveling in the slow lane, show some common courtesy and
             | do what you can to create a space for the person by slowing
             | down a bit or speeding up if it is safer. """
             | 
             | (you probably misinterpreted what I said. I do sometimes
             | change lanes, even well in advance of a merge I know is
             | prone to problems, if that's the safest and most
             | convenient. What I am saying is the guidance I have read
             | indicates that staying in the same lane is generally safer
             | than changing lanes, and speeding up into an empty space is
             | better for everybody than slowing down, especially because
             | many people who are merging will keep slowing down more and
             | more when the highway driver slows for them)
        
         | modeless wrote:
         | Tesla jumped the gun on the FSD free trial earlier this year.
         | It was nowhere near good enough at the time. Most people who
         | tried it for the first time probably share your opinion.
         | 
         | That said, there is a night and day difference between FSD 12.3
         | that you experienced earlier this year and the latest version
         | 12.6. It will still make mistakes from time to time but the
         | improvement is massive and obvious. More importantly, the rate
         | of improvement in the past two months has been much faster than
         | before.
         | 
         | Yesterday I spent an hour in the car over three drives and did
         | not have to turn the steering wheel at all except for parking.
         | That _never_ happened on 12.3. And I don't even have 12.6 yet,
         | this is still 12.5; others report that 12.6 is a noticeable
         | improvement over 12.5. And version 13 is scheduled for release
         | in the next two weeks, and the FSD team has actually hit their
         | last few release milestones.
         | 
         | People are right that it is still not ready yet, but if they
         | think it will stay that way forever they are about to be very
         | surprised. At the current rate of improvement it will be quite
         | good within a year and in two or three I could see it actually
         | reaching the point where it could operate unsupervised.
        
           | seizethecheese wrote:
           | _If_ this is the case, the calls for heavy regulation in this
           | thread will lead to many more deaths than otherwise.
        
           | jvanderbot wrote:
           | I have yet to see a difference. I let it highway drive for an
           | hour and it cut off a semi, coming within 9 to 12 inches of
           | the bumper for no reason. I heard about that one, believe me.
           | 
           | It got stuck in a side street trying to get to a target
           | parking lot, shaking the wheel back and forth.
           | 
           | It's no better so far and this is the first day.
        
             | modeless wrote:
             | You have 12.6?
             | 
             | As I said, it still makes mistakes and it is not ready yet.
             | But 12.3 was much worse. It's the rate of improvement I am
             | impressed with.
             | 
             | I will also note that the predicted epidemic of crashes
             | from people abusing FSD never happened. It's been on the
             | road for a long time now. The idea that it is
             | "irresponsible" to deploy it in its current state seems
             | conclusively disproven. You can argue about exactly what
             | the rate of crashes is but it seems clear that it has been
             | at the very least no worse than normal driving.
        
               | jvanderbot wrote:
               | Hm. I thought that was the latest release, but it looks
               | like it isn't. There seems to be no improvement since the
               | last trial, though, so maybe 12.6 is magically better.
        
               | modeless wrote:
               | A lot of people have been getting the free trial with
               | 12.3 still on their cars today. Tesla has really screwed
               | up on the free trial for sure. Nobody should be getting
               | it unless they have 12.6 at least.
        
               | jvanderbot wrote:
                | I have 12.5. Maybe 12.6 is better, but I've heard that
                | before.
               | 
                | Don't get me wrong: without a concerted data team
                | building maps a priori, this is pretty incredible. But
                | from a pure performance standpoint it's a shaky
                | product.
        
               | KaoruAoiShiho wrote:
                | The latest version is 12.5.6; I think he got confused
                | by the .6 at the end. If you think that's bad, then
                | there isn't a better version available. However, it is
                | a dramatic improvement over 12.3; I don't know how
                | much you tested on it.
        
               | modeless wrote:
                | You're right, thanks. One of the biggest updates in
                | 12.5.6 is transitioning the highway Autopilot to FSD.
                | If he has 12.5.4, then it may still be using the old
                | non-FSD Autopilot on highways, which would explain why
                | he hasn't noticed improvement there; there wasn't any
                | until 12.5.6.
        
             | hilux wrote:
              | > ... coming within 9 to 12 inches of the bumper for no
              | reason. I heard about that one, believe me.
             | 
             | Oh dear.
             | 
             | Glad you're okay!
        
             | eric_cc wrote:
             | Is it possible you have a lemon? Genuine question. I've had
             | nothing but positive experiences with FSD for the last
             | several months and many thousands of miles.
        
               | ben_w wrote:
                | I've had nothing but positive experiences with
                | ChatGPT-4o, but that doesn't make people wrong to
                | criticise either system for modelling its training
                | data too closely and generalising too little when it's
                | used for something where the inference domain is too
                | far outside the training domain.
        
           | snypher wrote:
           | So just a few more years of death and injury until they reach
           | a finished product?
        
           | misiti3780 wrote:
            | I have the same experience; 12.5 is insanely good. HN is
            | full of people that don't want self-driving to succeed,
            | for some reason. Fortunately, it's clear as day to some of
            | us that Tesla's approach will work.
        
             | ethbr1 wrote:
              | Curiosity about why they're against it, and articulating
              | why you think it will work, would be more helpful.
        
               | misiti3780 wrote:
               | It's evident to Tesla drivers using Full Self-Driving
               | (FSD) that the technology is rapidly improving and will
               | likely succeed. The key reason for this anticipated
               | success is data: any reasonably intelligent observer
               | recognizes that training exceptional deep neural networks
               | requires vast amounts of data, and Tesla has accumulated
               | more relevant data than any of its competitors. Tesla
               | recently held a robotaxi event, explicitly informing
               | investors of their plans to launch an autonomous
               | competitor to Uber. While Elon Musk's timeline
               | predictions and politics may be controversial, his
               | ability to achieve results and attract top engineering
               | and management talent is undeniable.
        
             | eric_cc wrote:
             | Completely agree. It's very strange. But honestly it's
             | their loss. FSD is fantastic.
        
           | bastawhiz wrote:
           | > At the current rate of improvement it will be quite good
           | within a year
           | 
           | I'll believe it when I see it. I'm not sure "quite good" is
           | the next step after "feels dangerous".
        
           | delusional wrote:
           | > That said, there is a night and day difference between FSD
           | 12.3 that you experienced earlier this year and the latest
           | version 12.6
           | 
           | >And I don't even have 12.6 yet, this is still 12.5;
           | 
            | How am I supposed to take anything you say seriously when
            | your only claim is a personal anecdote that doesn't even
            | apply to your own argument? Please think about what you're
            | writing, and please stop repeating information you heard
            | on YouTube as if it's fact.
            | 
            | This is one of the reasons (among many) that I can't take
            | Tesla boosters seriously. I have absolutely zero faith in
            | your anecdote that you didn't touch the steering wheel. I
            | bet it's a lie.
        
             | modeless wrote:
             | The version I have is already a night and day difference
             | from 12.3 and the current version is better still. Nothing
             | I said is contradictory in the slightest. Apply some basic
             | reasoning, please.
             | 
             | I didn't say I didn't touch the steering wheel. I had my
             | hands lightly touching it most of the time, as one should
             | for safety. I occasionally used the controls on the wheel
             | as well as the accelerator pedal to adjust the set speed,
             | and I used the turn signal to suggest lane changes from
             | time to time, though most lane choices were made
             | automatically. But I did not _turn_ the wheel. All turning
              | was performed by the system. (If you turn the wheel
              | manually, the system disengages.) The exception was
              | parking, as I mentioned, though FSD did handle some
              | navigation into and within parking lots.
        
             | eric_cc wrote:
             | I can second this experience. I rarely touch the wheel
             | anymore. I'd say I'm 98% FSD. I take over in school zones,
             | parking lots, and complex construction.
        
           | wstrange wrote:
            | I have a 2024 Model 3, and it's a great car. That being
            | said, I'm under no illusion that the car will _ever_ be
            | self-driving (unsupervised).
           | 
            | 12.5.6 still fails to read very obvious signs for 30 km/h
            | playground zones.
           | 
           | The current vehicles lack sufficient sensors, and likely do
           | not have enough compute power and memory to cover all edge
           | cases.
           | 
           | I think it's a matter of time before Tesla faces a lawsuit
           | over continual FSD claims.
           | 
           | My hope is that the board will grow a spine and bring in a
           | more focused CEO.
           | 
           | Hats off to Elon for getting Tesla to this point, but right
           | now they need a mature (and boring) CEO.
        
           | jeffbee wrote:
           | If I had a dime for every hackernews who commented that FSD
            | version X was like a revelation compared to FSD version X-ε,
            | I'd have like thirty bucks. I will grant you that every
           | release has surprisingly different behaviors.
           | 
           | Here's an unintentionally hilarious meta-post on the subject
           | https://news.ycombinator.com/item?id=29531915
        
             | modeless wrote:
             | Sure, plenty of people have been saying it's great for a
             | long time, when it clearly was not (looking at you, Whole
             | Mars Catalog). _I_ was not saying it was super great back
              | then. I have consistently been critical of Elon for
              | promising human-level self-driving "next year" for like
              | 10 years in a row and being wrong every time. He said it
              | again this year, and I still think he's wrong.
             | 
             | But the rate of progress I see right now has me thinking
             | that it may not be more than two or three years before that
             | threshold is finally reached.
        
               | ben_w wrote:
                | The most important lesson from my incorrectly
                | predicting in 2009 that we'd have cars without
                | steering wheels by 2018, and from thinking that the
                | progress I saw each year up to then was consistent
                | with that prediction, is that it's really hard to
                | guess how long it takes to walk the fractal path that
                | is software R&D.
               | 
               | How far are we now, 6 years later than I expected?
               | 
               | Dunno.
               | 
                | I suspect it's gonna need an invention on the same
                | level as Diffusion or Transformer models to be able to
                | handle all the edge cases, and that might mean we only
                | get it with human-level AGI.
               | 
               | But I don't know that, it might be we've already got all
               | we need architecture-wise and it's just a matter of
               | scale.
               | 
                | The only thing I can be really sure of is we're making
                | progress "quite fast", in a non-objective use of the
                | words -- it's not going to need a re-run of 6 million
                | years of mammalian evolution or anything like that,
                | but even 20 years of wall-clock time would be a
                | disappointment.
        
               | modeless wrote:
                | Waymo went driverless in 2020, so maybe you weren't
                | that far off. Predicting that in 2009 would have been
                | pretty good. They could and should have had vehicles
                | without steering wheels anytime since then; it's just
                | a matter of hardware development. Their
                | steering-wheel-free car program was derailed when they
                | hired traditional car company executives.
        
               | ben_w wrote:
                | Waymo, for sure, but I also meant without any geolock,
                | etc., so I can't claim credit for my prediction.
                | 
                | They may well beat Tesla to this, though.
        
             | Laaas wrote:
              | Doesn't this just mean it's improving rapidly, which is
              | a good thing?
        
         | potato3732842 wrote:
          | If you were a poorer driver who did these things yourself,
          | you wouldn't find these faults so damning, because it'd
          | only be, say, 10% dumber than you rather than 40% or
          | whatever (just making up those numbers).
        
           | bastawhiz wrote:
           | That just implies FSD is as good as a bad driver, which isn't
           | really an endorsement.
        
         | dreamcompiler wrote:
         | > It didn't merge left to make room for vehicles merging onto
         | the highway. The vehicles then tried to cut in. The system
         | should have avoided an unsafe situation like this in the first
         | place.
         | 
         | This is what bugs me about ordinary autopilot. Autopilot
         | doesn't switch lanes, but I like to slow down or speed up as
         | needed to allow merging cars to enter my lane. Autopilot never
         | does that, and I've had some close calls with irate mergers who
         | expected me to work with them. And I don't think they're wrong.
         | 
          | It just means that when I'm cruising in the right lane with
          | Autopilot, I have to take over if a car tries to merge.
        
         | paulcole wrote:
         | > Until I ride in one and feel safe, I can't have any faith
         | that this is a reasonable system
         | 
         | This is probably the worst way to evaluate self-driving for
         | society though, right?
        
         | TheCleric wrote:
         | > Lots of people are asking how good the self driving has to be
         | before we tolerate it.
         | 
          | There's a simple answer to this: as soon as it's good
          | enough for Tesla to accept liability for accidents. Until
          | then, if Tesla doesn't trust it, why should I?
        
           | genocidicbunny wrote:
           | I think this is probably both the most concise and most
           | reasonable take. It doesn't require anyone to define some
            | level of autonomy or argue about specific edge cases of
            | how the self-driving system behaves. And it's easy to
            | apply this principle not only to Tesla, but to all
            | companies making self-driving cars and similar features.
        
           | concordDance wrote:
            | What's the current total liability cost for all Tesla drivers?
           | 
            | The average for all US cars seems to be around $2000/year,
            | so even if FSD were half as dangerous, Tesla would still
            | be paying the equivalent of $1000/year per car (not sure
            | how big insurance margins are; assuming they're nominal).
            | 
            | Now, if the driver could legally avoid paying insurance
            | for the few times they want/need to drive themselves
            | (e.g. snow? Dunno what FSD supports atm), then it might
            | make sense economically, but otherwise I don't think it
            | would work out.
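            | 
            | As a back-of-envelope sketch of that arithmetic (a minimal
            | Python sketch; the numbers and variable names are my
            | assumptions, not real figures):
            | 
            |     # hypothetical break-even estimate; all inputs assumed
            |     avg_premium = 2000.0   # assumed avg US premium, $/year
            |     relative_risk = 0.5    # assume FSD is half as dangerous
            |     per_car = avg_premium * relative_risk
            |     print(per_car)  # -> 1000.0 ($/year-equivalent per car)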
        
             | Retric wrote:
             | Liability alone isn't nearly that high.
             | 
              | Car insurance payments include people stealing your car,
              | uninsured motorists, rental cars, and other issues that
              | aren't the driver's fault. Premiums also include profits
              | for the insurance company, advertising, billing, and
              | other overhead from running a business.
              | 
              | Also, if Tesla were taking on these risks, you'd expect
              | your insurance costs to drop.
        
               | TheCleric wrote:
                | Yeah, any automaker doing this would just negotiate a
                | flat rate per car in the US, and the insurer would
                | average the risk to set a rate. This would be much
                | cheaper than the average individual's cost for
                | liability on their insurance.
        
               | concordDance wrote:
               | Good points, thanks.
        
           | bdcravens wrote:
           | The liability for killing someone can include prison time.
        
             | TheCleric wrote:
              | Good. If you write software that people rely on with
              | their lives, and it fails, you should be held criminally
              | liable for that.
        
               | beej71 wrote:
               | And such coders should carry malpractice insurance.
        
         | mike_d wrote:
         | > Lots of people are asking how good the self driving has to be
         | before we tolerate it.
         | 
         | When I feel as safe as I do sitting in the back of a Waymo.
        
         | eric_cc wrote:
         | That sucks that you had that negative experience. I've driven
         | thousands of miles in FSD and love it. Could not imagine going
         | back. I rarely need to intervene and when I do it's not because
         | the car did something dangerous. There are just times I'd
         | rather take over due to cyclists, road construction, etc.
        
           | windexh8er wrote:
            | I don't believe this at all. I don't own one but know
            | about a half dozen people who got suckered into paying for
            | FSD. None of them use it, and 3 of them have stated it's
            | put them in dangerous situations.
           | 
            | I've ridden in an X, S, and Y with it on. Talk about
            | vomit-inducing when letting it drive during "city"
            | driving. I don't
           | doubt it's OK on highway driving, but Ford Blue Cruise and
           | GM's Super Cruise are better there.
        
           | itsoktocry wrote:
           | These "works for me!" comments are exhausting. Nobody
           | believes you "rarely intervene", otherwise Tesla themselves
           | would be promoting the heck out of the technology.
           | 
           | Bring on the videos of you in the passenger seat on FSD for
           | any amount of time.
        
         | concordDance wrote:
          | This would be more helpful with a date. Was this in 2020 or
          | 2024? I've been told FSD has been completely rearchitected.
        
         | dchichkov wrote:
         | > I'm grateful to be getting a car from another manufacturer
         | this year.
         | 
         | I'm curious, what is the alternative that you are considering?
          | I've been delaying an upgrade to electric for some time.
          | And now, a car manufacturer that is contributing to the
          | making of another Jan 6th is not an option, in my opinion.
        
         | pbasista wrote:
         | > I'm grateful to be getting a car from another manufacturer
         | this year.
         | 
         | I have no illusions about Tesla's ability to deliver an
         | unsupervised self-driving car any time soon. However, as far as
         | I understand, their autosteer system, in spite of all its
         | flaws, is still the best out there.
         | 
         | Do you have any reason to believe that there actually is
         | something better?
        
           | throwaway314155 wrote:
           | I believe they're fine with losing auto steering
           | capabilities, based on the tone of their comment.
        
         | geoka9 wrote:
         | > It didn't merge left to make room for vehicles merging onto
         | the highway. The vehicles then tried to cut in. The system
         | should have avoided an unsafe situation like this in the first
         | place.
         | 
         | I've been on the receiving end of this with the offender being
         | a Tesla so many times that I figured it must be FSD.
        
       | yieldcrv wrote:
        | Come on, US: regulate interstate commerce and tell them to
        | delete these cameras.
        | 
        | Lidar is goated, and if Tesla didn't want that, they could
        | pursue a different perception solution, allowing for
        | innovation.
        | 
        | But visual cameras alone, aiming to replicate us? Ban that.
        
       | xqcgrek2 wrote:
       | Lawfare by the Biden administration
        
         | jsight wrote:
         | In this case, I do not think so. NHTSA generally does an
         | excellent job of looking at the big picture without bias.
         | 
          | Although I must admit that their last investigation felt
          | like an exception: the changes they enforced seemed fairly
          | dubious.
        
       | bastloing wrote:
       | It was way safer to ride a horse and buggy
        
       | jgalt212 wrote:
       | The SEC is clearly afraid of Musk. I wonder what the intimidation
       | factor is at NHTSA.
        
       | Rebuff5007 wrote:
        | Tesla testing and developing FSD with normal consumer drivers
        | frankly seems criminal. Test drivers for AV companies get
        | advanced driver training, need to file detailed reports about
        | the car's response to various driving scenarios, and are
        | generally paid to be as attentive as possible. The fact that
        | any old tech-bro or unassuming old lady can buy this thing
        | and be on their phone when the car could potentially turn
        | into oncoming traffic is mind-boggling.
        
       | quitit wrote:
       | "Full Self-Driving" but it's not "full" self-driving, as it
       | requires active supervision.
       | 
        | So it's marketed with a nod and a wink, as if the supervision
        | requirement is just a peel-away disclaimer to satisfy old,
        | stuffy laws that are out of step with the latest technology,
        | when in reality it really does need active supervision.
       | 
        | But the nature of the technology is that this approach
        | invites the driver to distraction, because what's the use of
        | "full self-driving" if one needs to have their hands on the
        | wheel and feet near the pedals, ready to take control at a
        | moment's notice? Worsening this problem, Teslas have shown
        | themselves to drive erratically at unexpected times, such as
        | phantom braking or misidentifying natural phenomena as
        | traffic lights.
       | 
        | One day people will look back on letting FSD exist in the
        | market and roll their eyes in disbelief at the recklessness.
        
       | JumpinJack_Cash wrote:
        | Unpopular take: even with perfect FSD that is much better
        | than the average human driver (say, the robotic equivalent of
        | a Lewis Hamilton in every car), the productivity and health
        | gains won't be as great as people anticipate.
       | 
        | Sure, way fewer traffic deaths, but the spike in depression,
        | especially among males, would be something very big. Life
        | events are largely outside of our control; having a 5000 lb
        | thing that can get to 150 mph if needed and responds exactly
        | to accelerator, brake, and steering wheel input... well, that
        | makes people feel in control and very powerful while behind
        | the aforementioned steering wheel.
       | 
        | Also, productivity... I don't know... people think a whole
        | lot and do a whole lot of self-reflection while they are
        | driving, and when they arrive at their destination they just
        | implement the thoughts they had while driving. The ability to
        | talk on the phone has been there for quite some time now too,
        | so thinking and communicating can already be done while
        | driving; what would FSD add?
        
         | HaZeust wrote:
         | As a sports car owner, I see where you're coming from -- but
          | MANY do not. We are the 10%; the other 90% see their vehicle
          | as an A-to-B tool, and you can clearly see that displayed in
          | the average, utilitarian car models that the vast majority
          | of the public buys. There will be no "spike" in depression;
          | simply put, there aren't enough people who care about their
          | car, how it gets from point A to point B, or what
          | contribution they make, if any, to that.
        
           | JumpinJack_Cash wrote:
            | Maybe they don't care about their car being a sports car,
            | but they surely get some pleasure out of the control of
            | being at the helm of something powerful like a car (even
            | though it's not a sports car).
            | 
            | Also, even people in small cars already think a lot while
            | driving, and they also communicate; how much more
            | productive could they be with FSD?
        
             | HaZeust wrote:
              | I really don't think you're right about the average
              | person, or even a notable share of people, believing in
              | the idea of their car being their "frontier of freedom,"
              | as was popular in the media of the '70s and '80s. I
              | don't think that many people _care_ about driving
              | nowadays.
        
       | diebeforei485 wrote:
       | This feels like more lawfare from the Biden administration.
       | 
       | They're doing this based on four collisions? Really?
        
       | drodio wrote:
       | I drive a 2024 Tesla Model Y and another person in my family
       | drives a 2021 Model Y. Both cars are substantially similar (the
       | 2021 actually has _more_ sensors than the 2024, which is strictly
       | cameras-only).
       | 
       | Both cars are running 12.5 -- and I agree that it's dramatically
       | improved over 12.3.
       | 
        | I really enjoy driving. I've got a #vanlife Sprinter that I'll
        | do 14-hour road trips in with my kids. For me, the Tesla's
        | self-driving capability is a "nice to have" -- it sometimes
        | drives like a 16-year-old who just got their license
        | (especially around braking. Somehow it's really hard to nail
        | the "soft brake at a stop sign," which seems like it should
        | be easy. I find that passengers in the car are most
        | uncomfortable when the car brakes like this -- and I'm the
        | most embarrassed, because they all look at me like I
        | completely forgot how to do a smooth stop at a stop sign).
       | 
       | Other times, the Tesla's self-driving is magical and nearly
       | flawless -- especially on long highway road trips, like up to
       | Tahoe. Even someone like me who loves doing road trips really
       | appreciates the ability to relax and not have to be driving.
       | 
       | But here's one observation I've had that I don't see quite
       | sufficiently represented in the comments:
       | 
       | The other person in my family with the 2021 Model Y does not like
       | to drive like I do, and they really appreciate that the Tesla is
       | a better driver than they feel themselves to be. And as a
       | passenger in their car, I also really appreciate that when the
       | Tesla is driving, I generally feel much more comfortable in the
       | car. Not always, but often.
       | 
       | There's so much variance in us as humans around driving skills
       | and enjoyment. It's easy to lump us together and say "the car
       | isn't as good as the human." And I know there's conflicting data
        | from Tesla and NHTSA about whether, in aggregate, Teslas are safer
       | than human drivers or not.
       | 
       | But what I definitely know from my experience is that the Tesla
       | is already a better driver than _many_ humans are -- especially
        | those that don't enjoy driving. And as @modeless points out, the
       | rate of improvement is now vastly accelerating.
        
       | fortran77 wrote:
       | I have FSD in my Plaid. I don't use it. Too scary.
        
       ___________________________________________________________________
       (page generated 2024-10-19 23:01 UTC)