[HN Gopher] U.S. opens probe into Tesla's Autopilot over emergen...
       ___________________________________________________________________
        
       U.S. opens probe into Tesla's Autopilot over emergency vehicle
       crashes
        
       Author : etimberg
       Score  : 322 points
       Date   : 2021-08-16 12:43 UTC (10 hours ago)
        
 (HTM) web link (www.reuters.com)
 (TXT) w3m dump (www.reuters.com)
        
       | hetspookjee wrote:
        | I wonder how many accidents happened because people were
        | fiddling with their screen to turn on the windscreen wipers
        | and took their eyes off the road for too long in poor
        | visibility. I can't believe the law allows the wiper-speed
        | controls to be put on a capacitive screen.
        
       | okareaman wrote:
       | When I was learning to drive, my grandmother drilled into me to
       | never swerve for an animal that jumps out in front of the car.
        | This saved me when I was driving by the Grand Canyon and
        | jackrabbits kept jumping in front of my car, seemingly out of
        | nowhere. I drilled "never swerve" into my son and it saved him on a
       | mountain road when he hit a deer. He didn't go into the trees.
       | When I drove in Alaska I asked why the forest was cut back from
       | the road. They said that moose like to step out in front of cars.
       | 
       | I have no idea how self-driving fits into this. I don't have a
       | feel how self-driving responds to emergencies. I'd have to
       | experience an emergency in one. For that reason, I don't see
       | myself ever trusting self-driving.
        
         | kwhitefoot wrote:
         | When my wife learned to drive in Norway she was instructed
         | never to swerve to avoid a collision regardless of whether it
         | was an elk, a dog or a human being in front of the car, just to
         | stamp hard on the brake.
         | 
          | The rationale is that swerving most likely puts more people
          | at risk more of the time. That's especially true here, where
          | leaving the road often means either colliding with a granite
          | cliff wall or ending up in a fjord or lake.
        
         | lawn wrote:
          | It's always contextual. If you run into a moose head-on,
          | you're in for a very bad time.
        
           | rad_gruchalski wrote:
            | That's where the other three rules apply:
            | 
            | 1. Always pay full attention to where you are, because
            | there might be a truck or a family of five coming from the
            | opposite direction.
            | 
            | 2. Never lift.
            | 
            | 3. Always look in the direction you want to travel, not
            | the direction you are currently traveling.
        
       | leroman wrote:
        | Seeing all the near-misses and life-saving interventions on
        | YouTube involving Tesla vehicles, one death doesn't seem like
        | such a bad result.
        | 
        | The question should be: how many lives were saved by this
        | system vs. how many people would have died driving "normally"?
        
         | thereisnospork wrote:
         | >The question should be - how many lives were saved by this
         | system vs how many would die if driven "normally"?
         | 
          | It is also necessary to project this into the future, i.e.
          | to look at the integral of expected lives lost 'rushing'
          | self-driving cars vs. 'waiting-and-seeing' (Americans die on
          | the roads at a rate of 40,000 per annum).
         | 
         | If twice as many people die for a fixed number of years to
         | create a self driving system that results in half the fatality
         | rate of the status quo, that becomes worth it very, very
         | quickly.
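          | 
          | A minimal sketch of that integral with invented numbers
          | (the rates and horizon are illustrative assumptions, not
          | projections):
          | 
          |     # Cumulative road deaths over a 30-year horizon:
          |     # status quo vs. "rushing" (deaths double for 5 years,
          |     # then the system halves the baseline rate).
          |     baseline = 40_000              # US deaths per year
          |     rush_years, horizon = 5, 30
          |     status_quo = baseline * horizon
          |     rush = (2 * baseline * rush_years
          |             + 0.5 * baseline * (horizon - rush_years))
          |     print(status_quo, rush)        # 1200000 vs 900000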
        
         | young_unixer wrote:
         | I don't think that should be the question.
         | 
          | For example: dying while drunk driving at 150 km/h is not
          | morally equivalent to dying as a pedestrian because someone
          | ran you over.
          | 
          | I would rather have 10 drunk drivers die than one innocent
          | person.
        
           | leroman wrote:
           | There's a place for a discussion about the moral
           | repercussions, I doubt it's a 10 drunk driver vs 1 soccer mom
           | situation ;)
        
           | nickik wrote:
           | The only person who died fell asleep in the car and would
           | have died in any car as far as I remember.
        
       | rvz wrote:
        | Did I not just say this before? [0] [1] [2] It seems the
        | safety probe formally agrees with my (and many others')
        | concerns over the deceptive advertising of the FSD package and
        | its safety risks towards other drivers.
       | 
       | Perhaps this is for the best.
       | 
       | [0] https://news.ycombinator.com/item?id=27996321
       | 
       | [1] https://news.ycombinator.com/item?id=27863941
       | 
       | [2] https://news.ycombinator.com/item?id=28053883
        
         | nickik wrote:
         | That is literally not what this is about.
        
       | Animats wrote:
       | Right. As I've pointed out previously, Tesla seems to be unable
       | to detect sizable stationary obstacles that are partly blocking a
       | lane, especially if they don't look like the rear end of a car.
       | In addition to emergency vehicles, Teslas on autopilot have
       | plowed into freeway barriers and a street sweeper. That's the
       | usual situation for first responders, who usually try to block as
       | little of the road as possible but often don't have enough
       | shoulder space.
       | 
       | It's clear what Tesla really has - a good lane follower and
       | cruise control that slows down for cars ahead. That's a level 2
       | system. That's useful, but, despite all the hype about "full self
       | driving", it seems that's all they've got.
       | 
       | "Full self driving" just adds some lane-changing assistance and
       | hints from the nav system.
        
         | asdff wrote:
          | I don't understand how it's even possible for these cars to
          | be crashing. My car beeps like a missile is locked on when I
          | am coming too close to an object I might hit, using just a
          | simple sensor in the front. If my car can beep and give me
          | enough time to slam on the brakes, why can't Teslas do the
          | same?
        
           | quartesixte wrote:
           | Well for starters, they're taking out the radars that other
           | cars rely on to accomplish this.
        
             | shrimpx wrote:
              | In other parts of this thread, people are suggesting that
              | these crashes are the _radar's fault_ and deploying their
              | vision-only system will fix the problem.
        
           | HALtheWise wrote:
           | You're probably referring to parking sensors, which are
           | typically ultrasonic sensors mounted to the bumpers.
           | Unfortunately, ultrasonics have both a limited range of ~20ft
           | for practical uses, and more damningly, a relatively wide
           | field of view with no ability to distinguish where in the
           | field of view an object is. While 20ft range is more than
           | enough to give you time to slam on the brakes in your garage,
           | it's basically useless for high speed autonomy, except for
           | some very limited blindspot-awareness type tasks.
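            | 
            | To put numbers on that (a quick sketch; the speeds are
            | illustrative):
            | 
            |     # Warning time a ~20 ft (6.1 m) ultrasonic sensor
            |     # gives at various closing speeds.
            |     RANGE_M = 6.1
            |     for mph in (5, 30, 70):
            |         mps = mph * 0.447            # mph -> m/s
            |         print(f"{mph} mph: {RANGE_M / mps:.2f} s")
            |     # 5 mph: 2.73 s; 30 mph: 0.45 s; 70 mph: 0.19 s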
        
         | bit_logic wrote:
         | I think we need to add a new level 2 assisted driving skills
          | section to driving tests. Level 2 can be safer, but it
          | really requires understanding how level 2 works and its
          | limitations. For example, when I use level 2 (mine is a
          | Honda, but this applies to other level 2 systems as well,
          | since they mostly share the same vendors), these are the
          | rules I follow:
         | 
         | - Car switching in/out of my lane, I manually take over
         | 
         | - Tight curve in the freeway, manually take over
         | 
          | - Very frequently check the dashboard indicator that shows
          | whether the sensors "see" the car in front
         | 
         | - Anything unusual like construction, cones, car on shoulder,
         | manually take over
         | 
         | - Anything that looks difficult like weird merging lanes,
         | manually take over
         | 
         | - Any bad weather or condition like sun directly in front,
         | manual drive
         | 
          | - Frequently adjust the max speed setting on ACC. It's safer
          | not to set it too far above the prevailing speeds; otherwise,
          | if ACC suddenly goes blind, it can accelerate dangerously as
          | it tries to reach the max set speed (a sketch of this logic
          | follows the list).
         | 
          | - I don't trust lane keep much; it's mostly a backup for my
          | own steering, and it makes my arms less tired turning the
          | wheel
         | 
         | The key thing is to recognize just how dumb this technology is.
         | It's not smart, it's not AI. It's just a bit above the old
         | cruise control. With that mindset it can be used safely.
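          | 
          | The ACC point is worth spelling out; here is a simplified
          | sketch of the control logic (illustrative only, not any
          | vendor's actual code):
          | 
          |     def acc_target_speed(set_speed, lead_speed):
          |         # Follow the lead car while one is tracked...
          |         if lead_speed is not None:
          |             return min(set_speed, lead_speed)
          |         # ...but if the sensor loses it ("goes blind"),
          |         # ACC accelerates back toward the set speed.
          |         return set_speed
          | 
          |     acc_target_speed(85, 60)     # tracking: holds 60
          |     acc_target_speed(85, None)   # blind: climbs to 85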
        
           | sunshineforever wrote:
            | I think it's early to be adding stuff like that to
            | government-mandated driving tests when these cars are only
            | theoretically available to the ever-dwindling middle class
            | and above. Unless my circumstances change, there's no
            | chance I'll be in one for at least 10-15 years.
        
           | jumpkick wrote:
           | If level 2 requires such handholding, what's the point? Seems
           | to me like it just leads to a false sense of security, giving
           | drivers the feeling that they can trust the self-driving
           | system a lot more than they safely can.
        
           | tayo42 wrote:
            | Seems like it's easier to just drive normally. This sounds
            | very distracting.
        
           | mmcconnell1618 wrote:
            | I wonder if some sort of standard display showing what the
            | vehicle "sees" and predicts it will do will be required by
            | regulation. For example, if the display shows the vehicle
            | doesn't understand a firetruck parked halfway in the lane
            | or the tight curve on the freeway, at least the driver can
            | check the display and have some time to react.
        
             | sunshineforever wrote:
             | I would strongly prefer a car with such features.
        
           | ajross wrote:
           | > Level 2 can be safer but [...]
           | 
            | I think that requires a more numerate analysis than you're
            | giving it, though. The data from the story is a sample
            | size of 11 crashes over three years (I think). If that's
            | really the size of the effect, then your "but [...]"
            | clause seems very suspect.
           | 
           | There are almost two million of these cars on the roads now.
           | It seems extremely likely that the number of accidents
           | _prevented_ by AP dwarfs this effect, so arguing against it
           | even by implication as you do here seems likely to be doing
           | more harm than good.
           | 
           | That doesn't mean it's not worth investigating what seems
           | like an identifiable edge case in the AP obstacle detection.
           | But that's a bug fix, not an argument about "Level 2
           | Autonomy" in general.
        
         | qweqwweqwe-90i wrote:
         | Yeah, let's ignore all the good and force everyone to go back
         | to human drivers that are 10x worse.
        
           | jacquesm wrote:
           | Can you please stop repeating this tripe, it's been debunked
           | over-and-over again, and it is really getting tiring.
        
           | evanextreme wrote:
            | No one is saying this should be the case, just that the
            | feature is not what the company advertises (the ability of
            | the car to fully drive itself) and that said feature is
            | further away from completion than many might lead you to
            | believe. As someone who drives a Tesla with Autopilot, I
            | agree with this. Autopilot is the best lane assistance
            | system I've used, but that's all it is.
        
           | paxys wrote:
            | Fewer than 100 first responders are hit annually in the
            | USA. The fact that Tesla alone has managed to hit 11 since
            | 2018 makes it pretty clear that human drivers are not "10x
            | worse" than Tesla's tech, but quite the opposite.
        
             | qweqwweqwe-90i wrote:
              | You are comparing two different things. 11 responder-
              | vehicle crashes are not comparable to the number of
              | first responders (people) who are hit.
        
           | breakfastduck wrote:
            | Go back? Human drivers are _the norm_, not the exception.
            | This argument is tiresome beyond belief.
           | 
           | But no, just keep disregarding the clearly _significant_
           | issues and mis-marketing, because progress, right?
        
           | thebruce87m wrote:
           | There is no independent data to support your 10X claim. Tesla
           | marketing or an Elon tweet doesn't count.
        
           | clifdweller wrote:
            | The point isn't to ignore it, but to make sure everyone
            | using it is 100% aware they need to be paying attention.
            | This is a common edge case that the system can't handle,
            | so drivers are still responsible for taking over.
        
           | kube-system wrote:
           | Tesla does not offer a vehicle that does not require human
           | drivers (not even momentarily). Tesla's autopilot and FSD
           | systems are both SAE Level 2, which means the human is still
            | in operation of the vehicle _at all times_. _All_ of Tesla's
           | driving assistance technologies require a human to monitor
           | operation of the vehicle and intervene if necessary. The fact
           | that they have given anyone an impression otherwise is
           | problematic.
           | 
           | https://www.sae.org/binaries/content/gallery/cm/articles/pre.
           | ..
        
             | ajross wrote:
             | > The fact that they have given anyone an impression
             | otherwise is problematic.
             | 
             | Good grief. This meme will not die. The car literally tells
             | you to keep your hands on the wheel every time you engage
             | autopilot, yells at you if you don't, will lock you out of
             | the system as punishment if you don't comply, and if you
             | really seem disabled will bring the car to a stop and turn
              | the hazards on. It _simply will not operate_ without an
              | attentive driver, or at the very least one spending
              | considerable energy on defeating the attention nags.
             | 
             | There are exactly zero Tesla drivers in the world who don't
             | know these rules. Just stop with the nonsense. Please.
        
               | vkou wrote:
               | > There are exactly zero Tesla drivers in the world who
               | don't know these rules. Just stop with the nonsense.
               | Please.
               | 
               | Tesla's marketing also knows that there are exactly zero
               | drivers in the world who follow those rules, but that
               | doesn't stop them from overselling the capabilities of
               | what they ship.
        
               | ajross wrote:
                | Stop it. Please. Again, there are no Tesla drivers who
                | have been misled about the capabilities of the system.
                | The people who have been misled are folks like you who
                | read arguments on the internet and don't drive these
                | cars. Go try one and see how the system works. It
                | doesn't permit the kind of confusion that everyone
                | constantly assumes. It just doesn't.
        
         | shrimpx wrote:
         | Elon keeps warning the world about the impending
         | hyperintelligent AI that will retool human economies, politics,
         | and religions, yet year after year his cobbled-together AI
         | fails at basic object detection.
        
           | president wrote:
           | It's all marketing and image projection.
        
         | postmeta wrote:
         | As this reddit post pointed out, this appears to be a common
         | problem with radar TACC.
         | https://www.reddit.com/r/teslamotors/comments/p5ekci/us_agen...
         | 
         | """ These events occur typically when a vehicle is partially in
         | a lane and radar has to ignore a stationary object. This is
         | pretty standard and inherent with TACC + radar.
         | 
         | The faster Tesla pushes the vision only stack to all cars after
         | they've validated the data, the faster this topic becomes moot.
         | Andrej Karpathy talks and shows examples of what that would do
         | here. Minutes 23:00-28:00 https://youtu.be/a510m7s_SVI
         | 
         | Older examples from manuals of other TACC systems which use
         | radar:
         | 
         | Volvo's Pilot Assist regarding AEB/TACC.
         | 
         | According to Wired, Volvo's Pilot Assist system is much the
         | same. The vehicles' manual explains that not only will the car
         | fail to brake for a sudden stationary object, it may actually
         | race toward it to regain its set speed:
         | 
         | "Pilot Assist will ignore the stationary vehicle and instead
         | accelerate to the stored speed. The driver must then intervene
         | and apply the brakes."
         | 
         | Cadillac Super Cruise - Page 252
         | 
         | Stationary or Very Slow-Moving Objects
         | 
         | ACC may not detect and react to stopped or slow-moving vehicles
         | ahead of you. For example, the system may not brake for a
         | vehicle it has never detected moving. This can occur in stop-
         | and-go traffic or when a vehicle suddenly appears due to a
         | vehicle ahead changing lanes. Your vehicle may not stop and
         | could cause a crash. Use caution when using ACC. Your complete
         | attention is always required while driving and you should be
         | ready to take action and apply the brakes.
         | 
         | BMW Driving Assistant Plus - Page 124
         | 
         | A warning may not be issued when approaching a stationary or
         | very slow-moving obstacle. You must react yourself; otherwise,
         | there is the danger of an accident occurring.
         | 
         | If a vehicle ahead of you unexpectedly moves into another lane
         | from behind a stopped vehicle, you yourself must react, as the
         | system does not react to stopped vehicles. """
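          | 
          | In code terms, the behavior those manuals describe boils
          | down to something like this (a simplified sketch; the
          | threshold value is an illustrative assumption):
          | 
          |     # Radar measures closing speed via Doppler. A return
          |     # whose ground speed is ~0 looks identical to a bridge,
          |     # sign, or manhole cover, so classic radar TACC drops it.
          |     def is_trackable(ego_mps, closing_mps, min_ground_mps=2.0):
          |         ground_speed = ego_mps - closing_mps
          |         return abs(ground_speed) > min_ground_mps
          | 
          |     is_trackable(30.0, 5.0)    # moving lead car: True
          |     is_trackable(30.0, 30.0)   # stopped fire truck: False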
        
           | ajross wrote:
           | FWIW: Tesla AP is primarily vision based now. Newer cars in
           | the US aren't even being fitted out with the radar units
           | anymore (mine doesn't have it, for instance). So while this
           | may in some sense be an unavoidable edge case for radar, it
            | really _shouldn't_ be for Tesla Autopilot.
           | 
           | It's worth checking out for sure. Not worth the headline
           | bandwidth and flamage budget being spent on it.
        
           | pkulak wrote:
            | I'll believe it when I see it. From what I can tell, Tesla
            | has made no progress at all in three years. I just drove my
            | buddy's 3, and it still dives to the right when a lane
            | merges and the line disappears. This drove me nuts when I
            | test drove one years ago. Other cars do lane keeping so
            | much better than Tesla at this point.
        
           | sjg007 wrote:
            | What about Subaru EyeSight? I thought it did.
        
             | Loughla wrote:
              | Subaru's EyeSight absolutely will stop you when you're
              | about to hit something, regardless of whether or not
              | that something was moving previously.
             | 
             | It's actually really annoying if you live in a rural area
             | without clearly defined lanes, and large, stationary
             | objects (tractors and whatnot) close to the road.
        
               | nzrf wrote:
               | As I think I previously posted about this. It will also
               | see exhaust coming up on cold winter day as an obstacle
               | and brake unexpectedly at light. It literally is worst
               | and wish it could be disabled by default.
               | 
               | Additionally, the back up sensor is a tad over zealous
               | also.
        
         | icelandicmoss wrote:
         | I feel like part of the problem with the kind of autopilot
         | crashes you describe here is how _inexplicable_ they are to
         | humans. Whilst humans can be dangerous drivers, the incidents
         | they cause generally have a narrative sequence of events that
         | are comprehensible to us -- for instance, driver was
         | distracted, or visibility was poor.
         | 
         | But when a supposedly 'all-seeing always watching' autopilot
         | drives straight into a large stationary object in clear
         | daylight, we have no understanding of how the situation
         | occurred.
         | 
         | This I think has a couple of effects:
         | 
         | 1) The apparent randomness makes the idea of these crashes a
         | lot more scary -- psychologically we seem to have a greater
         | aversion to danger we can't predict, and we can't tell
         | ourselves the 'ah but that wouldn't happen to me' story.
         | 
         | 2) Predictability of road incidents actually is a relevant
         | piece of information. As a road user (including pedestrian),
         | most of my actions are taken on the basis of what I am
         | _expecting_ to happen next, and my model for this is how humans
         | drive (and walk). Automated drivers have different
         | characteristics and failure modes, and that makes them an
         | interaction problem for me.
        
           | oaw-bct-ar-bamf wrote:
            | In my opinion, the underlying assumption autopilots are
            | built with is wrong. It is assumed that the road is free
            | to drive on.
            | 
            | Only when the vehicle computer detects a known object on
            | the road that it knows should not be there does it apply
            | the brakes or try to steer around it.
            | 
            | I would feel safer if the algorithm assumed the negative
            | case by default and only gave the "green light" once it
            | determined that the road is free to drive on. In the case
            | of unknown (not yet learned) road obstructions, the worst
            | needs to be assumed.
            | 
            | That's where the 'unexplainable' crashes are coming from.
            | Something the size of an actual truck is obstructing the
            | road, but it couldn't quite be classified because the
            | truck has tipped over and is lying on the road sideways.
            | Not yet learned by the algorithm. Can't be that bad, green
            | light, no need to avoid or brake.
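            | 
            | The two defaults, side by side (a sketch with hypothetical
            | names, purely to illustrate the policy difference):
            | 
            |     # Today's assumption: proceed unless a recognized
            |     # hazard shows up among the detections.
            |     def should_brake_permissive(detections):
            |         return any(d["known_hazard"] for d in detections)
            | 
            |     # Proposed assumption: brake unless the lane ahead is
            |     # positively confirmed free.
            |     def should_brake_conservative(lane_confirmed_free):
            |         return not lane_confirmed_free
            | 
            |     # A tipped-over truck the classifier can't label:
            |     should_brake_permissive([{"known_hazard": False}])  # False
            |     should_brake_conservative(False)                    # True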
        
             | [deleted]
        
             | quartesixte wrote:
             | > It is assumed that the road is free to drive on.
             | 
             | Trying to remember if the opposite of this is how human
             | drivers are taught, or if this is implicit in how we move
             | about the world. My initial gut reaction says yes and this
             | is a great phrasing of something that was always bothering
             | me about automated driving.
             | 
             | Perhaps we should model our autopilots after horses:
             | refusal to move against anything unfamiliar, and biased
             | towards going back home on familiar routes.
        
             | willcipriano wrote:
              | I agree. In the Northeast, at least, pothole avoidance
              | is a critically important skill. Any "autopilot" without
              | it would be fairly useless around me, as I'd have to
              | take over every 30 seconds to not end up with a flat
              | tire. I have adaptive cruise control, and that's about
              | as far as I'll trust a computer to drive given the
              | current tech.
        
       | nikkinana wrote:
       | Shake 'em down, shut that shit down. Non union fuckers.
        
       | jedberg wrote:
        | A lot of people in here are saying it is not possible to
        | drive safely with partial self-driving. I wonder how many of
        | those people have actually driven a car with autopilot.
       | 
       | I have autopilot on my car, and it definitely makes me a better
       | and safer driver. It maintains my distance from the car in front
       | and my speed while keeping me in my lane, so my brain no longer
       | has to worry about those mundane things. Instead I can spend all
       | my brainpower focused on looking for potential emergencies,
       | instead of splitting time between lane keeping/following and
       | looking for emergencies.
       | 
       | I no longer have to look at my speedometer or the lane markers, I
       | can take a much broader view of the traffic and conditions around
       | me.
       | 
       | Before you say it's impossible to be safe driving with an
       | assistive product, I suggest trying one out.
        
         | yumraj wrote:
          | It is possible that you've learned to drive like that and
          | that it works for you.
          | 
          | But I feel this depends on the type of driver and their
          | personality. I, for one, have never felt comfortable with
          | cruise control, even the adaptive kind, let alone partial
          | self-driving. I have always been more comfortable driving
          | myself than trying to make sure the adaptive cruise control
          | can come to a complete stop in an emergency. Perhaps I'm
          | just a little untrusting and paranoid :).
        
         | new_realist wrote:
         | As someone who has used Autopilot extensively, I can tell you:
         | you only have the illusion of enhanced safety. In reality,
         | parts of your brain have shut down to save energy, and you've
         | lost some situational awareness, but you can't tell that's
         | happened.
        
         | e40 wrote:
         | For you, definitely. For the people I see reading while the car
         | drives them, not at all.
        
         | asdff wrote:
          | I think that's pretty reckless, honestly; you put a lot of
          | faith in the system being able to detect lane markers. Other
          | than that, I can see how adaptive cruise control is nice,
          | but it's also not hard to engage regular cruise control and
          | fine-tune your speed to the conditions by tapping up or down
          | on the buttons on the wheel.
        
         | MisterTea wrote:
         | > "* It maintains my distance from the car in front and my
         | speed while keeping me in my lane, so my brain no longer has to
         | worry about those mundane things.*"
         | 
         | Ive been driving for 25 years, cars, trucks, trailers, standard
         | and auto transmissions, and I have never once thought to myself
         | "I'd be such a better diver if I didn't have to pay attention
         | to my speed, lane keeping or following distance" Why? Because
         | those mundane things are already on autopilot in my brain.
         | 
         | Posts like yours are so absurd to met that I cant help but
         | think shill.
        
           | jedberg wrote:
           | I've been driving for 29 years, and I never thought those
           | things either until I got autopilot (and I don't have a Tesla
           | BTW, I have a different autopilot system). While those things
           | were autopilot in my brain, they still took brain power. It's
           | so much more relaxing not worrying about those things.
           | 
           | It's like people who do math by hand and then get a
           | calculator.
        
           | spywaregorilla wrote:
           | Maybe you're a great driver then. Have you ever shared the
           | road with someone who was a terrible driver?
        
             | studentrob wrote:
             | Yes. Those are people who think they can go hands free, use
             | their phone or watch a movie while on autopilot.
        
         | sorokod wrote:
         | Survivorship bias?
        
           | spywaregorilla wrote:
            | That would only be relevant if a substantial portion of
            | people who felt poorly about Tesla Autopilot had literally
            | perished from it.
        
             | sorokod wrote:
             | It was said in jest, but to your comment, no need to perish
             | - just not be vocal about the negative feelings.
        
         | bob1029 wrote:
         | > my brain no longer has to worry about those mundane things
         | 
         | I would be terrified to share the road with someone of this
         | mindset. Your vehicle is a lethal weapon when you are driving
         | it around (assisted or otherwise). At no point can someone
         | claim that a tesla vehicle circa today is able to completely
         | assume the duty of driving. You are still 100% responsible for
         | everything that happens in and around that car. You had better
         | have a plan for what happens if autopilot decides to shit the
         | bed while a semi jackknifes in front of you.
         | 
         | The exceptions are what will kill you - and others - every
         | single time. It's not the boring daily drives where 50 car
         | pileups and random battery explosions occur. Maybe your risk
         | tolerance is higher. Mine is not. Please consider this next
         | time you are on the road.
        
           | jedberg wrote:
           | Do you get concerned about mathematical errors because the
           | computer is doing the calculation instead of someone doing it
           | by hand?
           | 
           | It's the same thing here. The computer is _assisting_ me so
           | that I can take care of the novel situations, the exceptions
           | if you will. I can pay closer attention to the road and see
            | that jackknifed trailer sooner because I wasn't looking at
            | my speedometer to check my speed.
           | 
           | And I don't have a Tesla, I use a different autopilot system.
        
           | nexuist wrote:
           | This...is entirely the point OP is making. You get more brain
           | power to watch out for the semi jackknifing into you, the car
           | switching lanes without signaling, the truck about to lose a
           | bucket or chair from its bed. This is stuff you may not catch
           | when you're spending your brain power focusing on staying
           | between the lanes and keeping your distance between the car
           | in front.
           | 
           | When you automate away the mundane, exceptions are much
           | easier to catch.
        
         | throwaway09223 wrote:
         | Agreed. I have a rudimentary radar enhanced cruise control in
         | my minivan and I've found it's really helpful for maintaining a
         | safe stopping distance while driving.
        
         | tomdell wrote:
         | I would argue that partial self driving is an irresponsible
         | product not because it's impossible to drive safely with it,
         | but because so many people will use it as an excuse to pay
         | little to no attention to the road. If you personally are a
         | responsible driver and even a better driver with it, that's
         | great - but most people probably aren't going to use it the
         | same way, especially those without much of an understanding of
         | the technology - and especially given the way that Tesla
         | markets it.
        
           | joshuanapoli wrote:
           | > so many people will use it as an excuse to pay little to no
           | attention to the road
           | 
            | I guess we have to look at the results here to judge
            | whether too many people are not paying attention.
            | Hopefully the investigation will reveal whether autopilot
            | collisions with emergency vehicles are significantly more
            | or less frequent than with vehicles driven the traditional
            | way.
        
           | samstave wrote:
            | One would think that with "autopilot" there would be a
            | limit to speed and an increased "caution distance" the
            | vehicle maintains with everything.
            | 
            | I also think there should be dedicated lanes for self-
            | driving cars.
            | 
            | A very good friend of mine was a sensor engineer at Google
            | working on virtual sensors that interacted with hand
            | gestures in the air, and is now a pre-eminent sensor
            | engineer for a large Japanese company everyone has heard
            | of.
            | 
            | We drove from the Bay to northern California in his Tesla,
            | and it was terrifying how much trust he put into that car.
            | I got carsick and ended up throwing up out the window.
            | 
            | Knowing what I know of machines, having worked in tech
            | since 1995, I wouldn't trust self-driving for SHIT just
            | yet.
        
           | birken wrote:
           | Absolutely not true as a blanket statement. Maybe if the
           | driver monitoring is so lax that you could conceivably trick
           | the car into poorly driving itself, but the system I use,
           | Comma [1], has incredibly strict driver monitoring.
           | 
           | There is absolutely no doubt I'm a safer driver with Comma
           | than without it. I'm still in control, but Comma not only
           | allows me to expend less effort driving (which allows me to
           | stay alert over longer periods of time), but also be much
           | less emotional when driving. I'm pretty convinced that a
           | large percentage of accidents are caused by frustrated or
           | bored drivers doing crazy things that you just don't feel the
           | urge to do with the assistance of self-driving.
           | 
           | 1: https://comma.ai/
        
             | jedberg wrote:
             | I use the same system as you do, and I've noticed that if
             | you mention that system's name, you tend to get downvotes.
              | I haven't yet figured out why; I'm not sure if there is
              | a bot or just a lot of Tesla fans who downvote any
              | mention of our system.
             | 
             | Edit: After one minute I got a downvote.
        
               | SECProto wrote:
               | It sounds like you're advertising it. "The future can be
               | yours, today. For the introductory monthly price of
               | 79.99. Sign up here[1]"
        
               | jedberg wrote:
               | This doesn't even make sense. Simply mentioning the name
               | of a product I use is not advertising. Otherwise, is
               | every person here who mentions Tesla advertising too?
        
           | woah wrote:
           | I once talked to a guy who bragged about having Autopilot
           | drive him home when he's drunk
        
           | jskrn wrote:
            | Well said, that last bit especially. The regulations on
            | medical devices are based on how the manufacturer markets
            | them. It should be the same for driving technology.
        
           | Drunk_Engineer wrote:
           | The technical term for this is Risk Compensation:
           | 
           | "Risk compensation is a theory which suggests that people
           | typically adjust their behavior in response to perceived
           | levels of risk, becoming more careful where they sense
           | greater risk and less careful if they feel more protected."
        
             | xahrepap wrote:
             | Reminds me of this kind of thing:
             | 
             | https://usa.streetsblog.org/2017/09/13/wide-residential-
             | stre...
             | 
             | I was first introduced to "wide streets in neighborhoods
             | are more dangerous than narrow" on HN years ago. (I don't
             | think it was the linked article, but that was the first one
             | that came up just now after a search :P )
             | 
             | Since having read that, I've actually noticed how true this
             | is, at least to me anecdotally. When I'm driving in a
             | neighborhood with crowded streets, I can't bring myself to
             | go over 15MPH, much less over the speed limit (typically 25
             | in neighborhoods in the US).
             | 
             | Wide streets give a sense of security. So I feel like
             | people are less likely to pay attention going around bends,
             | parked cars, etc, than if they didn't have that sense of
             | security.
        
             | MichaelZuo wrote:
             | Also moral hazard, kinda.
        
           | wilg wrote:
           | This is a question that is answerable with the right data -
           | we can just see if it's safer or not.
        
             | malwarebytess wrote:
             | Doesn't the data show that cars with assistive technologies
             | are in fewer non-fatal and fatal accidents?
        
               | new_realist wrote:
               | Tesla marketed Autopilot != responsibly implemented
               | assistive safety systems.
        
             | tomdell wrote:
             | It looks like the federal government is beginning to
             | collect and analyze relevant data, which will be
             | interesting.
             | 
             | https://www.latimes.com/business/story/2021-06-29/nhtsa-
             | adas...
             | 
             | Tesla released data in the past, but that's quite suspect
             | as they have an obvious agenda and aren't known for open
             | and honest communication.
             | 
             | https://www.latimes.com/business/autos/la-fi-hy-tesla-
             | safety...
        
           | kemiller wrote:
           | Yes there have been stories about irresponsible people. Do
           | you have any evidence that this is the common case? The
           | aggregate evidence seems to suggest reduced accidents and
           | reduced fatalities.
        
             | gusgus01 wrote:
              | There was a study showing that adaptive cruise control
              | and lane assist lead to more people speeding:
              | https://www.iihs.org/news/detail/adaptive-cruise-control-
              | spu...
              | 
              | They then used a "common formula" to show that this
              | leads to more fatal accidents, but didn't actually study
              | real crash data.
        
         | rubicon33 wrote:
         | Something tells me the majority of people with partial self
         | driving aren't using it as a means of staying more focussed on
         | the road. There's a pesky little device buzzing around in
         | everyone's pocket that is more likely the recipient of this
         | newfound attention.
        
         | hellbannedguy wrote:
          | Isn't there a small part of your psyche that tells you
          | it's OK to drive while tired, or all the way home from
          | that Las Vegas trip, because the technology is so good?
        
           | jedberg wrote:
           | The thing is, I get a lot less tired when I'm driving now,
           | because I get to focus only on the novel stimulus (possible
           | emergencies) and not the mundane.
           | 
           | But no, I don't trust it to drive itself. If I'm tired I
           | won't drive, regardless of autopilot.
        
         | breakfastduck wrote:
          | That's not what they bloody sell it as, though. That's the
          | key.
        
         | sgustard wrote:
         | I agree 100% with jedberg as to my own driving experience with
         | autopilot. Works great, and I still pay complete attention
         | because I don't want to die in a fiery crash. If you're not
         | going to pay attention, driving a dumber car doesn't make it
         | safer.
        
         | jeffrallen wrote:
          | I tried it and found I was spending brainpower fighting the
          | system, sending noise inputs (w.r.t. real-world conditions)
          | to trick the system into not disengaging because it decided
          | I was not "driving" it enough. The hacker in me loved it;
          | the rational person in me said, turn that off before it
          | makes you crash!
        
         | andyxor wrote:
          | I rented a Model X with the latest FSD a few weeks ago, and
          | even simple things like lane detection are very inconsistent
          | and unpredictable.
          | 
          | I don't know if this "AI" has any sort of quality control,
          | but how difficult is it to test whether it detects a solid
          | white line on the side of the road in at least 6 out of 10
          | tries?
          | 
          | It also tends to suddenly disengage and pass control to the
          | driver at the most dangerous parts of the trip, e.g. when
          | passing another car in a narrow lane.
          | 
          | This "driver assistant" is a series of disasters in the
          | making.
        
         | int_19h wrote:
          | I have a car that does those things as well, and I use it a
          | lot... but it's not a Tesla, and its manufacturer doesn't
          | refer to it as "autopilot" or "self driving", but rather
          | "advanced cruise control".
        
         | aguasfrias wrote:
          | You might be giving too much credit to your ability to pay
          | attention to your surroundings. It is possible that looking
          | around as a passenger might actually increase risk. There's
          | no way to tell other than looking at the data.
          | 
          | Personally, I tend to turn off things like lane keeping
          | because I end up having to babysit it more than I would
          | like. It doesn't always read the lanes correctly, though I
          | have not tried Tesla's technology yet.
        
         | CommieBobDole wrote:
         | I drove a friend's Model 3, and within five minutes of driving
         | on autopilot it got confused at an intersection and tried to
         | make a pretty sudden 'lane change' to the wrong side of a
         | divided road.
         | 
         | Obviously that's a single anecdote, and I don't know if it
         | would have gone through with it because I immediately
         | corrected, but that was my experience.
        
           | ec109685 wrote:
           | I bet it would have made that mistake.
           | 
           | The question is whether a system that absolutely requires
           | that you pay attention going through intersections (which you
           | should obviously do) is safer in aggregate than not having
           | those features enabled at all in those situations.
           | 
           | E.g. are weird lane changes that people don't catch happening
           | more frequently than people zooming through red lights
           | because they weren't paying attention. Only the data can show
           | that, and Tesla _should_ share it.
        
       | t0rt01se wrote:
       | About time some adults got involved.
        
       | myko wrote:
       | I don't know how Tesla ever presumes to achieve FSD when they
       | cannot detect stopped objects in the road, especially emergency
       | vehicles. This is incredibly disappointing.
       | 
       | Does anyone know if the FSD Beta has this ability?
        
         | _ph_ wrote:
         | All these crashes happened with the radar-controlled auto
         | pilot. A radar system basically cannot detect static
         | obstactles, as it doesn't have the spacial resolution to
         | distinguish an obstacle in your lane from something right
         | beside or above it (bridges). They can only use the radar to
         | follow other cars, because these are not stationary objects.
         | 
         | Recently, Tesla switched from radar-based to pure optical
         | obstacle recognition. This should vastly improve this kind of
         | behavior. Ironically that the investigation starts at a moment
         | when they basically got rid of the old system.
         | 
         | Look on youtube for videos of the FSD beta. It is amazingly
         | good at recognizing the surroundings of a car, including parked
         | vehicles at the road side.
        
         | kwhitefoot wrote:
         | > cannot detect stopped objects in the road,
         | 
         | Neither can Volvo or VW.
         | 
          | Actually, my 2015 AP1.5 Model S does detect stopped
          | vehicles, just not reliably.
        
       | guerby wrote:
        | There are about 36,000 road deaths per year in the USA across
        | 280 million vehicles; that's 128.5 deaths/million
        | vehicles/year.
        | 
        | If we assume the number of Tesla autopilot deaths doubles
        | this year to 8 (from 4 at the time of the probe launch),
        | across about 900 thousand Teslas on the road in the USA,
        | that's 8.9 autopilot deaths/million Teslas/year.
        | 
        | The ratio between the two numbers is 14.4.
        | 
        | Tesla's reporting says that in Q1 2021 there was one crash on
        | autopilot per 4.19 million miles vs. one crash per 484
        | thousand miles for all vehicles.
        | 
        | The ratio between those numbers is 8.7.
        | 
        | All these numbers are full of biases and the ratios probably
        | aren't that meaningful, but they end up in the same order of
        | magnitude.
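        | 
        | Reproducing that arithmetic (same figures as above):
        | 
        |     us_rate = 36_000 / 280        # ~128.6 deaths/M vehicles/yr
        |     tesla_rate = 8 / 0.9          # ~8.9 deaths/M Teslas/yr
        |     print(us_rate / tesla_rate)   # 14.46 (~14.4 after rounding)
        | 
        |     # Tesla's reported crash rates, on a per-mile basis:
        |     print(4_190_000 / 484_000)    # ~8.7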
       | 
       | Interesting data there "Fatality Facts 2019 Urban/rural
       | comparison":
       | 
       | https://www.iihs.org/topics/fatality-statistics/detail/urban...
       | 
       | "Although 19 percent of people in the U.S. live in rural areas
       | and 30 percent of the vehicle miles traveled occur in rural
       | areas, almost half of crash deaths occur there. "
       | 
        | I was shocked that in the USA in 2019 about 40-46% of all
        | road deaths were people who were unbelted, while 90% of
        | front-seat occupants wear seat belts according to
        | observational studies.
       | 
        | Incidentally, Tesla cars will beep to no end if weight is
        | detected on a seat and the seat belt isn't clicked: I have to
        | click the seat belt when I put my (not so heavy) bag on the
        | passenger seat, since there's no software option to disable
        | the beeping.
        
         | jazzyjackson wrote:
          | My issue with comparing these statistics is that on the
          | highway I see no shortage of reckless driving: speeding 20
          | over, weaving through traffic, etc. Subtract this population
          | (maybe by taking away their licenses and building public
          | transit for them) and what do the numbers look like? Your
          | seat belt stat supports this; a lot of drivers aren't even
          | trying not to die.
          | 
          | Of course, highway driving is the least dangerous (per
          | passenger-mile) since everything is so predictable. I don't
          | know how many deaths are caused by T-bones at intersections,
          | but those at least should disappear now that automatic radar
          | braking is a thing... (Tesla thinks it's too good for radar,
          | of course, even though it can't recognize that a goddamn
          | fire truck is in the way.)
        
         | jeffbee wrote:
         | You can't really make the comparison between the entire US
         | fleet and Tesla alone. All Teslas are newer than the median car
         | in the fleet, and Tesla owners are self-selected among wealthy
         | people, because the cars are pretty expensive. The IIHS says
         | that deaths per million vehicle-years among midsize luxury cars
         | is 20. There are many cars where no driver died in a given
          | year, for example the Mercedes C-class "4matic" sedan.
        
         | akira2501 wrote:
         | > If we assume
         | 
         | You shouldn't. 16% of accidents are pedestrians. 8% are
         | motorcyclists. 40% of accidents involve excessive speed or
         | drugs and alcohol.
         | 
         | Accidents aren't a fungible item you can do this with.
         | 
         | > bag on the passenger seat since there's no software option to
         | disable the beeping.
         | 
         | There is in the US for the rear seats. Additionally, you can
         | just leave the belt always clicked in and just sit on top of
         | them. There aren't many great technological solutions to human
         | behavior.
        
         | btbuildem wrote:
         | > I was shocked that in the USA in 2019 about 40-46% of all
         | road death people were unbelted, while 90% of front seat people
         | wear seat belts according to observation studies.
         | 
         | Doesn't that just speak to the effectiveness of seatbelts? Most
         | people wear them, and two-fifths of those who died in a crash
         | did not wear a seatbelt.
         | 
         | If we had the same proportion of deaths as we have seatbelt
         | wearers, that would indicate the belts are ineffective.
        
           | guerby wrote:
           | Yes we agree.
           | 
            | It's just that in France "only" 20-25% of fatalities are
            | people not wearing a seatbelt.
            | 
            | Observational statistics show about 98-99% of front-seat
            | users wearing seat belts in France.
            | 
            | Front seat belts have been mandatory since 1 July 1973,
            | and back seats since 1 October 1990.
            | 
            | So seat belts, not Tesla Autopilot or whatever, would save
            | around 8,000 lives per year in the USA.
            | 
            | Do Tesla cars in the USA nag about seat belts with no
            | software off switch, like in France?
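            | 
            | One way to get a figure in that ballpark (shares are the
            | rounded numbers above; a rough sketch, not a rigorous
            | estimate):
            | 
            |     us_deaths = 36_000
            |     unbelted_us, unbelted_fr = 0.43, 0.22
            |     print((unbelted_us - unbelted_fr) * us_deaths)
            |     # ~7,600/year if the US unbelted-fatality share
            |     # matched France's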
        
           | zzt123 wrote:
           | Assuming that belted and unbelted people get into accidents
           | at the same rate, and given another commenter mentioning that
           | Autopilot users have 8.7x lower accident rate than baseline,
           | that makes Autopilot a greater safety add than seat belts,
           | no?
        
             | foepys wrote:
             | Can Autopilot work in heavy rain or fog? If not, those
             | comparisons are useless. Those are the conditions where
             | most accidents occur, not in sunny Californian weather.
        
               | guerby wrote:
                | Yes, up to a point where it gives up and asks the
                | driver to take back the wheel.
                | 
                | But at that point you really see nothing and you'll
                | limit your speed to 10-40 km/h by yourself.
                | 
                | I used it in those situations on my Tesla Model 3 so I
                | could focus as much as possible on what little
                | visibility was left, with both hands firmly on the
                | wheel and a foot over the brake; low visibility is
                | really dangerous and scary on the road.
                | 
                | Part of the issue is that you don't know what speed
                | the car arriving behind you will have, so where's your
                | optimum speed? Too slow and you're rear-ended by bad
                | drivers; too fast and it won't go well.
                | 
                | It's fresh in my mind since I had such driving
                | conditions two weeks ago on the highway. Trucks stayed
                | at a suicidal 90 km/h...
        
         | darkwizard42 wrote:
         | Two notes:
         | 
          | 1. Generally speaking, the right way to think about
          | accidents/fatalities/hard-braking events is per mile driven,
          | given that risk scales with time spent on the road (and
          | miles driven is the best proxy we have at the moment;
          | insurance companies use this stat).
         | 
         | 2. If wearing a seat belt prevents a ton of fatalities as
         | advertised and generally proven, it would make sense that of
         | the road fatalities that do happen, many are due to not wearing
         | a seat belt.
         | 
         | 10% of people not wearing seat belts is still hundreds of
         | millions of miles driven without seat belts.
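          | 
          | A toy example of why the per-mile basis matters (all
          | numbers invented):
          | 
          |     fleet_a = {"deaths": 100, "miles": 1e9}
          |     fleet_b = {"deaths": 100, "miles": 1e10}
          |     for name, f in [("A", fleet_a), ("B", fleet_b)]:
          |         rate = f["deaths"] / f["miles"] * 1e8
          |         print(name, rate, "deaths per 100M miles")
          |     # Equal raw counts, but fleet A is 10x riskier per mile.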
        
       | supperburg wrote:
        | The Tesla FSD beta is what we all dreamed of in the 90s. It's
        | mind-blowing. It's so crazy that it has finally arrived,
        | though not able to fully and reliably drive itself in every
        | circumstance, and nobody seems to care. People only seem to be
        | foaming at the mouth about the way the product is labeled. If
        | HN found the holy grail but it was labeled "un-holy grail",
        | HN would apparently chuck it over its shoulder.
        
         | [deleted]
        
       | antattack wrote:
        | NHTSA is lumping TACC in with Autopilot to increase the number
        | of incidents and make the case sound more serious:
       | 
       | "The involved subject vehicles were all confirmed to have been
       | engaged in either Autopilot or Traffic Aware Cruise Control
       | during the approach to the crashes," NHTSA said in a document
       | opening the investigation.
       | 
       | TACC is very different from Autopilot.
        
         | jdavis703 wrote:
         | NHTSA reports are usually very neutral. I'd be very surprised
         | if they were out to get Tesla, or really any other corporation
         | or individual.
        
           | antattack wrote:
            | Every car with adaptive cruise control has a similar
            | disclaimer, pointing out how unreliable the system is at
            | recognizing parked vehicles [1]:
           | 
           | Safety Consideration When Using Adaptive Cruise Control
           | 
           | * The system can only brake so much. Your complete attention
           | is always required while driving.
           | 
           | * Adaptive Cruise Control does not steer your vehicle. You
           | must always be in control of vehicle steering.
           | 
           | * The system may not react to parked, stopped or slow-moving
           | vehicles. You should always be ready to take action and apply
           | the brakes.
           | 
           | [1]https://my.gmc.com/how-to-support/driving-
           | performance/drivin...
        
       | literallyaduck wrote:
       | I believe we should have safety probes. Lots of people who have
       | taken money from the auto industry want this specifically for
       | Tesla. There is a strong possibility this is political punishment
       | for wrongthink.
        
       | tmountain wrote:
       | > In one of the cases, a doctor was watching a movie on a phone
       | when his vehicle rammed into a state trooper in North Carolina.
       | 
       | Doesn't autopilot require you to put your hands on the wheel
       | fairly regularly? Are these incidents just a matter of people
       | using this feature outside of its intended use case?
        
         | [deleted]
        
         | Ajedi32 wrote:
         | One hand on his phone one hand on the wheel, I assume.
         | 
         | Newer versions of Autopilot watch to make sure you keep your
         | eyes on the road, probably to prevent this very scenario[1].
         | 
         | [1]: https://www.theverge.com/2021/5/27/22457430/tesla-in-car-
         | cam...
        
       | [deleted]
        
       | gamblor956 wrote:
       | Craziest statistic: of the 31 serious crashes involving driver-
       | assist systems in the U.S. since June 2016, _25 of them_ involved
       | Tesla Autopilot.
        
         | blueplanet200 wrote:
         | https://en.wikipedia.org/wiki/Base_rate_fallacy
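         | 
         | To make the fallacy concrete with a made-up fleet share: if
         | Teslas accounted for, say, 80% of all driver-assist miles,
         | you'd expect roughly 25 of the 31 crashes to involve them even
         | if Autopilot were exactly as risky per mile as every
         | competitor:
         | 
         |     tesla_share_of_miles = 0.80  # hypothetical, not a real figure
         |     total_crashes = 31
         |     print(tesla_share_of_miles * total_crashes)  # 24.8
         | 
         | The raw count is meaningless without the exposure base rate.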
        
           | gamblor956 wrote:
           | It does not matter how you dice up the statistics. Of the
           | millions of cars with driver assist, Teslas make up the
           | majority of serious accidents, and the majority of accidents
           | involving vehicles crashing into emergency vehicles on the
           | road. On an absolute basis, relative basis, per capita
           | basis, etc., Tesla Autopilot has more serious crashes than
           | all other
           | driver assist systems. (And note that this does not include
           | any FSD-related accidents.)
           | 
           | This is an issue because _Tesla_ markets its cars as being
           | "safer" than other company's vehicles, and the data shows
           | that their driver assist system is objectively not.
        
       | gundmc wrote:
       | This is why I believe the approach of incremental improvement
       | towards full self driving is fundamentally flawed. These advanced
       | driver assist tools are good enough to lull users into a false
       | sense of security. No amount of "but our terms and conditions say
       | you need to always pay attention!" will overcome human nature
       | building that trust and dependence.
        
         | antattack wrote:
         | All Level 2 systems need to better integrate with the driver.
         | Upon engagement, driver and driver assist are a team, and
         | communication and predictability are crucial.
        
         | dalbasal wrote:
         | I think the fundamental flaw is indisputable. Everyone is aware
         | of in-between stage problems. I don't think it's an
         | insurmountable flaw.
         | 
         | These things are on the road already. They have issues, but so
         | do human only cars. Tweaks probably get made, like some special
         | handling of emergency vehicle scenarios. But, it's not enough
         | to stop it.
         | 
         | Meanwhile, it's not a permanent state. Self driving technology
         | is advancing, becoming more common on roads. Procedures, as
         | well as infrastructure, are growing around the existence of
         | self driven cars. Human supervisor or not, the way these things
         | use the road affects the design of roads. If your emergency
         | speed sign isn't being heeded by self driven cars, your
         | emergency speed sign has a bug.
        
         | slg wrote:
         | One problem that is often ignored in these debates is that
         | people already don't always pay attention while driving. Spend
         | some time looking at other drivers next time you are a
         | passenger in slow traffic. The number of drivers on their
         | phones, eating, doing makeup, shaving, or even reading a book
         | is scary.
         | 
         | It therefore isn't a clean swap of a human paying attention to
         | a human who isn't. It becomes a complicated equation that we
         | can't just dismiss with "people won't pay attention". It is
         | possible that a 90%/10% split of drivers paying attention to
         | not paying attention is more dangerous when they are all
         | driving manually than a 70%/30% split if those drivers are all
         | using self-driving tech to cover for them. Wouldn't you feel
         | safer if the driver behind you who is answering texts was using
         | this incremental self-driving tech rather than driving
         | manually?
         | 
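         | As a back-of-the-envelope version of that equation (all rates
         | invented for illustration):
         | 
         |     # hypothetical crashes per million miles
         |     man_att, man_dis = 1.0, 8.0   # manual, attentive/distracted
         |     ast_att, ast_dis = 0.8, 2.0   # assisted, attentive/distracted
         | 
         |     manual_fleet = 0.90 * man_att + 0.10 * man_dis
         |     assist_fleet = 0.70 * ast_att + 0.30 * ast_dis
         |     print(round(manual_fleet, 2))  # 1.7
         |     print(round(assist_fleet, 2))  # 1.16
         | 
         | Under these numbers the assisted fleet comes out ahead despite
         | tripling the share of distracted drivers, because the tech caps
         | how bad distraction gets.
         | 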
         | No one has enough data on the performance of these systems or
         | how the population of drivers use them to say definitively that
         | they are either safer or more dangerous on the whole. But it is
         | definitely something that needs to be investigated and
         | researched.
        
         | MR4D wrote:
         | A car that beeps when I drift out of lane, or beeps when I go
         | too fast before a curve, or _beeps like hell_ if I cross over
         | the center median would be hugely useful, because a record of
         | every warning would be there, whether correct or not.
         | 
         | Conversely, if it didn't warn me right before an accident, then
         | the absence of that warning would be useful too.
         | 
         | All of that information should be put back into the model based
         | on crash reporting. Everything else can be ignored.
         | 
         | I would argue that the information should be available to all
         | automakers (perhaps using the NHTSA as a conduit), so that each
         | of them has the same _safety information_, but can still
         | develop their own models. The FAA actually does this already
         | with the FAA Accident and Incident Data Systems [0] and it has
         | worked pretty darn well.
         | 
         | [0] - https://www.asias.faa.gov/apex/f?p=100:2:::NO:::
        
           | oceanghost wrote:
           | The new Toyota RAV4's have this feature-- if you go out of
           | bounds in your lane they beep and the steering wheel gives a
           | bit of resistance.
           | 
           | It also reads the speed limit signs and places a reminder in
           | the display. I think it can brake if it detects something in
           | front of it, but I'm not certain.
        
             | MR4D wrote:
             | Many other cars do as well.
             | 
             | My main point (perhaps buried more than it should have
             | been) is that centralizing accident data along with
             | whether an alert went off (or not), and sharing that with
             | all automobile manufacturers, can help this process
             | proceed better.
             | 
             | Right now the data is highly fragmented and there is not
             | really a common objective metric by which to make decisions
             | to improve models.
        
         | api wrote:
         | Full self driving is one of those things where getting 80% of
         | the way there will take 20% of the effort and getting the
         | remaining 20% of the way there will take 80% of the effort.
         | 
         | Tesla auto-drive seems like it's about 80% of the way there.
        
         | Robotbeat wrote:
         | I actually disagree. (And before you respond, please read my
         | post because it's not a trivial point.)
         | 
         | The fact that a huge formal investigation happened with just a
         | single casualty is proof that it may actually be superior for
         | safety in the long-term (when combined with feedback from
         | regulators and government investigators). One death in
         | conventional vehicles is irrelevant. But because of the high
         | profile of Tesla's technology, it garners a bunch of attention
         | from the public and therefore regulators. This is PRECISELY the
         | dynamic that led to the ridiculously safe airline record. The
         | safer it is, the more that rare deaths will be investigated and
         | the causes sussed out and fixed by industry and regulators
         | together.
         | 
         | Perhaps industry/Tesla/whoever hates the regulators and
         | investigations. But I think they are precisely what will cause
         | self driving to become ever safer, and eventually become as
         | safe as industry/Tesla claims, safer than human drivers while
         | also being cheap and ubiquitous. Just like airline travel
         | today. A remarkable combination of safety and affordability.
         | 
         | This might be the only way to ever do it. I don't think the
         | airline industry could've ever gotten to current levels of
         | safety by testing everything on closed airfields and over empty
         | land for hundreds of millions of flight hours before they had
         | sufficient statistics to be equal to today.
         | 
         | It can't happen without regulators and enforcement, either.
        
           | dboreham wrote:
           | This line of thinking is flawed because it assumes a smooth
           | surface over the safety space, where if you make incremental
           | improvements you will head towards some maximum of safety.
           | E.g.: the wing fell off; investigate; find that you can't
           | use brittle aluminum; tell the aircraft manufacturer to use
           | a more ductile alloy. Self driving technology isn't like
           | that -- you
           | can't just file a bug "don't mistake a human for a plastic
           | bag", fix that bug and move on to the next one. No number of
           | incremental fixes will make self driving that works as any
           | reasonable human would expect it to work.
        
           | 3pt14159 wrote:
           | I largely agree with you, but I just wish regulators would
           | start by only allowing these assist programs for people that
           | are already known to be poor drivers. The elderly and
           | convicted drunk drivers, for example. That way we could have
           | the best of both worlds.
        
             | unionpivo wrote:
             | I disagree. I would not put people who showed poor
             | judgment in a situation where they can further hurt others
             | or themselves. People like that are more likely not to pay
             | attention and to do other irresponsible things.
             | 
             | Go with safest drivers first.
        
             | Seanambers wrote:
             | Tesla's Autopilot system is almost 10X safer than the
             | average human driver already based on the latest 2021 Q1
             | numbers.
             | 
             | https://www.tesla.com/en_CA/VehicleSafetyReport
        
               | [deleted]
        
               | throwaway0a5e wrote:
               | That impressive claim narrows to approximately the noise
               | floor if you compare to comparable drivers in comparable
               | cars.
        
               | [deleted]
        
               | ben_w wrote:
               | > In the 1st quarter, we registered one accident for
               | every 4.19 million miles driven in which drivers had
               | Autopilot engaged. For those driving without Autopilot
               | but with our active safety features, we registered one
               | accident for every 2.05 million miles driven. For those
               | driving without Autopilot and without our active safety
               | features, we registered one accident for every 978
               | thousand miles driven. By comparison, NHTSA's most recent
               | data shows that in the United States there is an
               | automobile crash every 484,000 miles.
               | 
               | I think the comparison should be Tesla with/without AI,
               | not Tesla/not-Tesla; so roughly either x2 or x4 depending
               | on what the other active safety features do.
               | 
               | It's not nothing, but it's much less than the current
               | sales pitch -- and the current sales pitch is itself the
               | problem here, for many legislators.
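               | 
               | Dividing the quoted miles-per-accident figures through
               | by NHTSA's 484k baseline makes the tiers explicit:
               | 
               |     baseline = 484_000  # NHTSA, all US driving
               |     tiers = {
               |         "Autopilot engaged":  4_190_000,
               |         "active safety only": 2_050_000,
               |         "no active features":   978_000,
               |     }
               |     for name, miles in tiers.items():
               |         print(name, round(miles / baseline, 1))
               |     # Autopilot engaged 8.7
               |     # active safety only 4.2
               |     # no active features 2.0
               | 
               | So the Tesla-vs-Tesla comparison is the ~2x (4.19M vs
               | 2.05M) or ~4x (4.19M vs 978k) figure, not the ~9x
               | headline against the national average.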
        
               | Ajedi32 wrote:
               | > we registered one accident for every 4.19 million miles
               | driven in which drivers had Autopilot engaged [...] for
               | those driving without Autopilot but with our active
               | safety features, we registered one accident for every
               | 2.05 million miles driven
               | 
               | This still isn't the correct comparison. Major selection
               | bias with comparing miles with autopilot engaged to miles
               | without it engaged, since autopilot cannot be engaged in
               | all situations.
               | 
               | A better test would be to compare accidents in Tesla
               | vehicles with the autopilot feature enabled (engaged or
               | not) to accidents in Tesla vehicles with the autopilot
               | feature disabled.
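               | 
               | A sketch of that comparison over a hypothetical
               | per-vehicle log (field names invented):
               | 
               |     # (autopilot_purchased, miles_driven, accidents)
               |     fleet = [
               |         (True,  12_000, 0), (True,  30_000, 1),
               |         (False,  9_000, 0), (False, 25_000, 1),
               |     ]
               |     for flag in (True, False):
               |         miles = sum(m for f, m, a in fleet if f == flag)
               |         crashes = sum(a for f, m, a in fleet if f == flag)
               |         print(flag, round(crashes / miles * 1e6, 1))
               |     # True 23.8   (per million miles)
               |     # False 29.4
               | 
               | Grouping by whether the feature is on the car at all,
               | rather than whether it was engaged, sidesteps the "easy
               | highway miles" selection effect.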
        
               | HPsquared wrote:
               | Even then, there's selection: people who do a lot of
               | highway driving are more likely to opt for Autopilot than
               | those who mostly drive in the city.
        
               | bluGill wrote:
               | As was stated elsewhere, most accidents happen in city
               | driving where autopilot cannot be activated so the
               | with/without AI is meaningless. We need to figure out
               | when the AI could have been activated but wasn't; if you
               | do that, then you are correct.
        
               | freshpots wrote:
               | "..most accidents happen in city driving where autopilot
               | cannot be activated so the with."
               | 
               | Yes it can. The only time it can't be activated is if
               | there is no clearly marked center line.
        
               | Robotbeat wrote:
               | On the contrary to your overall point: The fatal crash
               | rate per miles driven is almost 2 times higher in rural
               | areas than urban areas. Urban areas may have more
               | accidents, but the speeds are likely lower (fender
               | benders).
               | 
               | https://www.iihs.org/topics/fatality-
               | statistics/detail/urban...
        
               | Ajedi32 wrote:
               | Tesla's _vehicles_ are almost 10X safer than the average
               | _vehicle_. Whether their autopilot system is contributing
               | positively or negatively to that safety record is
               | unclear.
               | 
               | The real test of this would be: of all Tesla vehicles,
               | are the ones with autopilot enabled statistically safer
               | or less safe than the ones without autopilot enabled?
        
               | input_sh wrote:
               | ...according to Tesla, based on the data nobody else can
               | see?
        
               | czzr wrote:
               | No, it's not. I guess we're doomed to see this at-best-
               | misleading-but-really-just-straight-up-lying analysis
               | every time there's an article about this.
               | 
               | Makes me laugh, especially with the "geeks are immune to
               | marketing" trope that floats around here equally as
               | regularly.
        
               | ra7 wrote:
               | Tesla's safety report lacks data and is extremely
               | misleading.
               | 
               | 1. Autopilot only works on (or is intended to work on)
               | highways. But they are comparing their highway record to
               | all accident records including city driving, where the
               | accident rate is far higher than on highways.
               | 
               | 2. They're also comparing with every vehicle in the
               | United States including millions of older vehicles.
               | Modern vehicles are built for higher safety and have a
               | ton of active safety features (emergency braking,
               | collision prevention etc). Older vehicles are much more
               | prone to accidents and that skews the numbers.
               | 
               | The reality is Teslas are no safer than any other
               | vehicles in its class ($40k+). Their safety report is
               | purely marketing spin.
        
               | deegles wrote:
               | They also include miles driven by previous versions of
               | their software in the "safe miles driven" tally. There's
               | no guarantee any improvement would not have resulted in
               | more accidents. They should reset the counter on every
               | release.
        
               | wilg wrote:
               | > The reality is Teslas are no safer than any other
               | vehicles in its class ($40k+).
               | 
               | Would another way of saying this be that they are as safe
               | as other vehicles in that class? And that therefore
               | Autopilot is not more unsafe than driving those other
               | cars?
        
               | ra7 wrote:
               | I would probably agree, but I also think it's a case of
               | "need more data".
               | 
               | We should really compare Autopilot with its competitors
               | like GM's Super Cruise or Ford's Blue Cruise, both of
               | which offer more capabilities than Autopilot. That will
               | show if Tesla's driver assist system is more or less safe
               | than their competitors' products.
        
               | ggreer wrote:
               | What capabilities does GM or Ford have that Tesla
               | doesn't? Neither GM nor Ford have rolled out automatic
               | lane changing. Teslas have been doing that since 2019.
               | 
               | The reason GM's Super Cruise got a higher rating by
               | Consumer Reports was because CR didn't even test the
               | capabilities that only Tesla had (such as automatic lane
               | change and taking offramps/onramps). Also, the majority
               | of the evaluation criteria weren't about capabilities.
               | eg: "unresponsive driver", "clear when safe to use", and
               | "keeping the driver engaged".[1]
               | 
               | 1. https://www.consumerreports.org/car-safety/cadillac-
               | super-cr...
        
             | lastofthemojito wrote:
             | Incentivizing drunk driving seems dangerous.
        
               | joshgrib wrote:
               | Requiring people to buy a special car/system to be able
               | to drive doesn't seem like an incentive - it seems
               | similar to the interlock system we currently require
               | drunk drivers to purchase to be able to drive.
               | 
               | If anything a driver monitoring system seems even better
               | than the interlock system, for example you couldn't have
               | your kids/friends blow for you to bypass it.
        
           | treeman79 wrote:
           | Ignoring other issues.
           | 
           | Asking someone to pay attention when they are not doing
           | anything is unrealistic. I would be constantly bored /
           | distracted. My wife would instantly fall asleep. Etc etc.
        
           | vntok wrote:
           | This argument is flawed, because when regulators investigate
           | a Tesla crash, Waymo doesn't care in the slightest. The
           | technologies (emphasis on having skeuomorphic cameras vs a
           | lidar), approaches (emphasis on generating as many situations
           | as possible in simulated worlds and carefully transitioning
           | to the business case vs testing as early as possible in the
           | real world with background data capture) and results are so
           | different
           | between the actors in this specific industry that one's flaws
           | being fixed or improved won't necessarily translate into
           | others benefitting from it.
           | 
           | Conversely, when Waymo iterates and improves their own safety
           | ratios by a significant amount, that evidently does not
           | result in Tesla's improving in return.
        
             | unionpivo wrote:
             | Well, when regulators investigate Boeing, Airbus probably
             | doesn't care either.
             | 
             | Until it leads to something systemic, which the regulator
             | then mandates for all vehicles.
        
               | vntok wrote:
               | Boeing and Airbus operate largely in the same direction
               | with similar technical solutions to similar problems.
               | 
               | Not the case at all between lidars and cameras.
        
           | sandworm101 wrote:
           | Then why not flip the scheme? Instead of having the human as
           | backup to the machine, make the machine back up the human.
           | Let the human do all the driving and have the robot jump in
           | whenever the human makes a mistake. Telemetry can then record
           | all the situations where the human and the machine disagreed.
           | That should provide all the necessary data, with the benefit
           | of the robot perhaps preventing many accidents.
           | 
           | Of course this is impossible in the real world. Nobody is
           | going to buy a car that will randomly make its own decisions,
           | that will pull the wheel from your hands every time it thinks
           | you are making an illegal lane change. Want safety? How about
           | a Tesla that is electronically incapable of speeding. Good
           | luck selling that one.
        
             | tablespoon wrote:
             | > Of course this is impossible in the real world. Nobody is
             | going to buy a car that will randomly make its own
             | decisions, that will pull the wheel from your hands every
             | time it thinks you are making an illegal lane change.
             | 
             | Yeah, add to that the unreliability of Tesla's system means
             | that it _cannot_ pull the wheel from the driver, because
             | it's not unusual for it to want to do something dangerous
             | and need to be stopped. You don't want it to "fix" a
             | mistake by driving someone into the median divider.
        
             | CrazyStat wrote:
             | > Want safety? How about a Tesla that is electronically
             | incapable of speeding.
             | 
             | That would be unsafe in many situations. If the flow of
             | traffic is substantially above the speed limit--which it
             | often is--being unable to match it increases the risk of
             | accident. This is known as the Solomon curve [1].
             | 
             | [1] https://en.wikipedia.org/wiki/Solomon_curve
        
               | [deleted]
        
               | MichaelGroves wrote:
               | A self driving car obviously needs to be aware of other
               | cars on the road. I don't see any reason why the car
               | couldn't observe other cars, see what speed they are
               | going at, and refuse to go faster than the rest. A car
               | that refuses to do 120mph when all the other cars are
               | doing 60mph in a 50mph zone should be trivial.
               | 
               | (Trivial _if_ the self driving tech works at all....)
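               | 
               | A minimal sketch of that rule, assuming the perception
               | stack already reports neighbors' speeds:
               | 
               |     from statistics import median
               | 
               |     def traffic_cap_mph(neighbor_speeds, hard_max=90):
               |         # refuse to outrun surrounding traffic by >10%
               |         if not neighbor_speeds:
               |             return hard_max
               |         return min(hard_max, 1.1 * median(neighbor_speeds))
               | 
               |     print(traffic_cap_mph([60, 58, 62]))  # ~66, not 120
               | 
               | The open question, of course, is which vehicles count as
               | "the rest".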
        
               | TheCapn wrote:
               | You're getting downvoted for this comment apparently, but
               | I'm still of the firm belief that we will never see full
               | autonomous driving without some sort of P2P network among
               | cars/infrastructure.
               | 
               | There's just too much shit that can't be "seen" with a
               | camera/sensor in congested traffic. Having a swarm of
               | vehicles all gathering/sharing data is one of the only
               | true ways forward IMO.
        
               | sandworm101 wrote:
               | Ok. Electronically incapable of driving more than 10%
               | faster than other traffic.
        
               | rad_gruchalski wrote:
               | How do you know which traffic is "the other traffic"? So
               | basically - there's no top limit.
        
               | treesknees wrote:
               | > Subsequent research suggests significant biases in the
               | Solomon study, which may cast doubt on its findings
               | 
               | With the logic presented in the theoretical foundation
               | section, it seems that the safer move would actually be
               | slow down and match the speed of all the trucks and other
               | large vehicles... which won't happen.
               | 
               | Matching speed sounds great, except there are always
               | people willing to go faster and faster. In my state they
               | raised the speed limit from 70 to 75, it just means more
               | people are going 85-90. How is that safer?
        
               | filoleg wrote:
               | To address your last paragraph, everyone going 85-90 is
               | less safe than everyone going 70-75, you are correct.
               | 
               | However, you individually going 70-75 when everyone else
               | is going 85-90 is less safe than you going 85-90 like
               | everyone else in the exact same situation.
               | 
               | >there are always people willing to go faster and faster
               | 
               | That's why no one says "go as fast as the fastest vehicle
               | you see", it is "go with the general speed of traffic".
               | That's an exercise for human judgement to figure that one
               | out, which is why imo it isn't a smart idea to have the
               | car automatically lock you out of overriding the speed
               | limit.
        
               | kiba wrote:
               | People are going faster because they feel it's safer, not
               | because of the speed limit. You can design roads that
               | cause humans to slow down and be more careful.
        
               | CamperBob2 wrote:
               | Or you can just set the speed limit appropriately, in
               | accordance with sound engineering principles. A radical
               | notion, I guess.
        
             | velcii wrote:
             | >Let the human do all the driving and have the robot jump
             | in whenever the human makes a mistake.
             | 
             | I really don't think that would give many data points,
             | because all of the instances would be when a human fell
             | asleep or wasn't paying attention.
        
               | [deleted]
        
             | wizzwizz4 wrote:
             | > _Let the human do all the driving and have the robot
             | jump in whenever the human makes a mistake._
             | 
             | Because when the human disagrees with the machine, the
             | machine is usually the one making a mistake. It might
             | prevent accidents, but it would also cause them, and you
             | lose predictability in the process (you have to model the
             | human _and_ the machine).
        
               | foobiekr wrote:
               | I don't know how anyone can look at the types of
               | accidents Tesla is having and conclude that it should
               | override the human driver.
        
             | cactus2093 wrote:
             | > How about a Tesla that is electronically incapable of
             | speeding. Good luck selling that one.
             | 
             | Instead they did the exact opposite with the plaid mode
             | model S, lol. It kind of works against their claims that
             | they prioritize safety when their hottest new car - fully
             | intended for public roads - has as its main selling point
             | the ability to accelerate from 60-120 mph faster than any
             | other car.
        
             | jeofken wrote:
             | On most or all roads below 100km/h autopilot won't allow
             | speeding, and therefore I drive at the limit, which I know
             | I would not have done if I controlled it. It also stays in
             | the lane better than I do, keeps distance better, and more.
             | Sometimes it's wonky when the street lines are unclear.
             | It's not perfect but a better driver than I am in 80% of
             | cases.
             | 
             | My insurance company gives a lower rate if you buy the full
             | autopilot option, and that to me indicates they agree it
             | drives better than I, or other humans, do.
        
               | Johnny555 wrote:
               |  _On most or all roads below 100km/h autopilot won't
               | allow speeding, and therefore I drive at the limit, which
               | I know I would not have done if I controlled it_
               | 
               | If following the speed limit makes cars safer, another
               | way to achieve that without autopilot is to just have all
               | cars limit their speed to the speed limit.
               | 
               |  _Sometimes it's wonky when the street lines are unclear.
               | It's not perfect but a better driver than I am in 80% of
               | cases_
               | 
               | The problem is in those 20% of cases where you're lulled
               | into boredom by autopilot as you concentrate on designing
               | your next project in your head, then suddenly autopilot
               | says "I lost track of where the road is, here you do it!"
               | and you have to quickly gain context and figure out what
               | the right thing to do is.
               | 
               | Some autopilot systems use eye tracking to make sure that
               | the driver is at least looking at the road, but that
               | doesn't guarantee that he's paying attention. But at
               | least that's harder to defeat than Tesla's "nudge the
               | steering wheel once in a while" method.
        
               | beambot wrote:
               | > just have all cars limit their speed to the speed
               | limit.
               | 
               | The devil is in the details... GPS may not provide
               | sufficient resolution. Construction zones. School zones
               | with variable hours. Tunnels. Adverse road conditions.
               | Changes to the underlying roads. Different classes of
               | vehicles. Etc.
               | 
               | By the time you account for all the mapping and/or
               | perception, you could've just improved the autonomous
               | driving and eliminated the biggest source of error in
               | driving: the human.
        
               | asdff wrote:
               | GPS is extremely accurate, honestly. My Garmin adjusts
               | itself the very instant I cross over a speed limit sign
               | to a new speed, somehow. Maybe they have good metadata,
               | but it's all public anyway under some department of
               | transportation domain and probably not hard to mine with
               | the price of compute these days. Even just setting a top
               | speed in residential areas of like 35mph would be good
               | and save a lot of lives that are lost when pedestrians
               | meet cars traveling at 50mph. A freeway presents a good
               | opportunity to add sensors to the limited on and off
               | ramps for the car to detect that it's on a freeway. Many
               | freeways already have some sort of sensor based system
               | for charging fees.
               | 
               | What would be even easier than all of that, though, is
               | just installing speeding cameras and mailing tickets.
        
               | watt wrote:
               | Just add all those to the map system. It could be made
               | incredibly accurate, if construction companies are able
               | to actually submit their work zones and "geofence" them
               | off on the map.
        
               | Johnny555 wrote:
               | But you can still impose a max speed limit based on
               | available data to cover most normal driving conditions,
               | while it's still on the driver to drive slower if
               | appropriate. And that could be implemented today, not a
               | decade from now when autonomous driving is trustable.
               | 
               | The parent post said that autopilot won't let him go over
               | the speed limit and implies that makes him safer. My
               | point is that you don't need full autopilot for that.
               | 
               | So this is not a technical problem at all, but a
               | political one. As the past year has shown, people won't
               | put up with any inconvenience or restriction, even if it
               | could save lives (not even if it could save thousands of
               | lives).
        
               | freeone3000 wrote:
               | The single system you're describing, with all of its
               | complexity, is a subset of what is required for
               | autonomous vehicles. We will continue to have road
               | construction, tunnels, and weather long past the last
               | human driver. Improving the system here simply improves
               | the system here -- you cannot forsake this work by saying
               | "oh the autonomous system will solve it" -- this is part
               | of the autonomous system.
        
               | jeofken wrote:
               | During the 3 years I've owned it there are 2 places where
               | lines are wonky and I know to take over.
               | 
               | I have not yet struggled to stay alert when it drives me,
               | and it has driven better than I would have - so it
               | certainly is an improvement over me driving 100% of the
               | time. It does not have road rage and it does not enjoy
               | the feeling of speeding, like I do when I drive, nor does
               | it feel like driving is a competition, like I must admit
               | I do when I am hungry, stressed, or tired.
               | 
               | > just have all cars limit their speed to the speed limit
               | 
               | No way I'd buy a car that does not accelerate when I hit
               | the pedal. Would you buy a machine that is not your
               | servant?
        
             | FridayoLeary wrote:
             | >Let the human do all the driving and have the robot jump
             | in whenever the human makes a mistake.
             | 
             | Nothing more annoying than a car that thinks I don't know
             | how to drive (warning beeps etc.).
        
             | alistairSH wrote:
             | _Nobody is going to buy a car that will randomly make its
             | own decisions, that will pull the wheel from your hands
             | ever time it thinks you are making an illegal lane change._
             | 
             | That's almost exactly what my Honda does. Illegal (no
             | signal) lane change results in a steering wheel shaker (and
             | optional audio alert). And the car, when sensing an abrupt
             | swerve which is interpreted as the vehicle leaving the
             | roadway, attempts to correct that via steering and brake
             | inputs.
             | 
             | But, I agree with your more general point - the human still
             | needs to be primary. My Honda doesn't allow me to remove my
             | hands from the steering wheel for more than a second or
             | two. Tesla should be doing the same, as no current
             | "autopilot" system is truly automatic.
        
               | nthj wrote:
               | Just to add, I have a 2021 Honda, and disabling this
               | functionality is a 1-button-press toggle on the dash to
               | the left of the steering wheel. Not mandatory.
        
               | Robotbeat wrote:
               | Tesla's system also requires drivers to have their
               | hands on the steering wheel and occasionally provide
               | torque input.
        
               | alistairSH wrote:
               | Interesting, I assumed it didn't, given the prevalence of
               | stories about drivers watching movies on their phones. I
               | guess they just leave one hand lightly on the wheel, but
               | are still able to be ~100% disengaged from driving the
               | car.
        
               | peeters wrote:
               | > Illegal (no signal) lane change results in a steering
               | wheel shaker (and optional audio alert).
               | 
               | To be clear, tying the warning to the signal isn't about
               | preventing unsignaled lane changes, it's gauging driver
               | intent (i.e. is he asleep and drifting or just trying to
               | change lanes). It's just gravy that it will train bad
               | drivers to use their signals properly.
        
               | alistairSH wrote:
               | Correct. It's not (primarily) a training thing, but used
               | to ensure the driver is driving and not sleeping/watching
               | movies/whatever.
        
               | sandworm101 wrote:
               | Is a lane change without signal always illegal? I know
               | that it almost certainly makes you liable for any
               | resulting accident, but I'm not sure that it is
               | universally illegal.
        
               | hermitdev wrote:
               | Yes, failure to signal is a traffic violation. At least
               | everywhere I've lived/traveled in the US. It's also a
               | rather convenient excuse for police to "randomly" pull
               | you over (I've been pulled over by Chicago PD for not
               | signaling for a lane change, despite actually having done
               | so).
        
               | peeters wrote:
               | This is technically true in Ontario (TIL).
               | 
               | > 142 (1) The driver or operator of a vehicle upon a
               | highway before turning (...) from one lane for traffic to
               | another lane for traffic (...) shall first see that the
               | movement can be made in safety, and _if the operation of
               | any other vehicle may be affected by the movement_ shall
               | give a signal plainly visible to the driver or operator
               | of the other vehicle of the intention to make the
               | movement. R.S.O. 1990, c. H.8, s. 142 (1).
               | 
               | That said there's zero cost to doing so regardless of
               | whether other drivers are affected.
               | 
               | https://www.ontario.ca/laws/statute/90h08#BK243
        
               | sandworm101 wrote:
               | That's the sort of law I remember. It is considered a
               | failure to communicate your intention rather than a
               | violation per se in every circumstance.
        
               | dragonwriter wrote:
               | > Is a lane change without signal always illegal? I know
               | that it almost certainly make you liable for any
               | resulting accident
               | 
               | Usually, it makes you liable _because_ it is illegal. CA
               | law for instance requires signalling 100ft before a lane
               | change or turn.
        
               | alistairSH wrote:
               | I have no idea, but the point wasn't so much that the
               | lane change is illegal, but that lack of signal is used
               | to indicate lack of driver attention. I shouldn't have
               | used "illegal" in my original post.
        
               | ohazi wrote:
               | > That's almost exactly what my Honda does. Illegal (no
               | signal) lane change results in a steering wheel shaker
               | (and optional audio alert). And the car, when sensing an
               | abrupt swerve which is interpreted as the vehicle leaving
               | the roadway, attempts to correct that via steering and
               | brake inputs.
               | 
               | By the way, this is fucking terrifying when you first
               | encounter it in a rental car on a dark road with poor
               | lane markings while just trying to get to your hotel
               | after a five hour flight.
               | 
               | I didn't encounter an obvious wheel shaker, but this
               | psychotic car was just yanking the wheel in different
               | directions as I was trying to merge onto a highway.
               | 
               | Must be what a malfunctioning MCAS felt like in a 737
               | MAX, but thankfully without the hundreds of pounds of
               | hydraulic force.
        
             | gambiting wrote:
             | I keep saying the same thing actually whenever people say
             | that manual driving will be outlawed. Like, no, it won't be
             | - because the computers will still save you in most cases
             | either way, autopilot enabled or not.
             | 
             | >>How about a Tesla that is electronically incapable of
             | speeding. Good luck selling that one.
             | 
             | From 2022 all cars sold in the EU have to have an
             | electronic limiter that keeps you to the posted speed
             | limit(by cutting power if you are already going faster) -
             | the regulation does allow the system to be temporarily
             | disabled however.
        
               | ggreer wrote:
               | Your summary is incorrect. The ETSC recommends that
               | Intelligent Speed Assistance should be able to be
               | overridden.[1] It's supposed to not accelerate as much if
               | you're exceeding the speed limit, and if you override by
               | pressing the accelerator harder, it should show some
               | warning messages and make an annoying sound. It's stupid,
               | but it doesn't actually limit the speed of your car.
               | 
               | I think it's a silly law and I'm very glad I don't live
               | in a place that requires such annoyances, but it's not as
               | bad as you're claiming.
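               | 
               | Roughly, the behavior described amounts to something
               | like this (a simplification of the ETSC description,
               | not an actual spec):
               | 
               |     def isa_throttle(requested, speed, limit, kickdown):
               |         # attenuate acceleration over the limit; a hard
               |         # press on the pedal (kickdown) overrides fully
               |         if speed <= limit or kickdown:
               |             return requested       # normal driving
               |         print("warning: over posted limit")
               |         return 0.5 * requested     # reduced, not cut
               | 
               | Power gets attenuated with warnings rather than cut,
               | which is why it's assistance rather than a hard limiter.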
               | 
               | 1. https://etsc.eu/briefing-intelligent-speed-assistance-
               | isa/
        
               | Symbiote wrote:
               | I hired a new car with Intelligent Speed Assistance this
               | summer, though it was set (and I left it) just to "ping"
               | rather than do any limiting. I drove it to a fairly
               | unusual place, though still in Europe and with standard
               | European signs. It did not have a GPS map of the area.
               | 
               | It could reliably recognize the speed limit signs (red
               | circle), but it never recognized the similar grey-slash
               | end-of-limit signs. It also didn't recognize the start-
               | of-town or end-of-town signs, so it didn't do anything
               | about the limits they implied.
               | 
               | I would certainly have had to disable it, had it been
               | reducing the acceleration in the way that document
               | describes.
        
             | Someone wrote:
             | I think that's the approach many car manufacturers have
             | been on for decades.
             | 
             | As a simple example, ABS
             | (https://en.wikipedia.org/wiki/Anti-lock_braking_system)
             | only interferes with what the driver does when an error
             | occurs.
             | 
             | More related to self-driving, there's various variants of
             | https://en.wikipedia.org/wiki/Lane_departure_warning_system
             | that do take control of the car.
             | 
             | And it is far from "incapable of speeding", but BMW, Audi
             | and Mercedes-Benz "voluntarily" and sort-of limit the speed
             | of their cars to 250km/hour
             | (https://www.autoevolution.com/news/gentlemens-agreement-
             | not-...)
        
         | judge2020 wrote:
         | Your reasoning doesn't apply to the incremental improvements to
         | self-driving, but rather to Tesla's decision to allow all cars
         | to use TACC/auto-steer. They haven't even given people "the
         | button" to
         | enroll in FSD beta, likely because they know it would be
         | extremely bad PR when a bunch of people use it without paying
         | attention.
        
         | nathias wrote:
         | that's not human nature, that's user stupidity
        
           | weird-eye-issue wrote:
           | What's the... Difference?
        
             | nathias wrote:
             | One is a permanent property of our nature, the other a
             | choice.
        
           | [deleted]
        
           | formerly_proven wrote:
           | The design is fine, it's all the users who are idiots.
           | 
           | P.S. /s. Obviously, Mr. Poe.
        
             | salawat wrote:
             | T. Every maligned designer when someone points out a flaw
             | 
             | It's okay. I do it too. Really need to work on seeing
             | yourself making that argument as a starting point and not
             | an endpoint.
        
             | rvz wrote:
             | Well, they did not _'pay'_ attention. They _'paid'_ for
             | the "Fools Self Driving" package.
             | 
             | This is why 'attention' and 'driver monitoring' were not
             | included.
        
         | mnmmn123456 wrote:
         | Today, at least one of the most advanced neural networks
         | enters each car: a human being. If we could just implement
         | the AI to add to this person and not replace them...
        
           | ben_w wrote:
           | What would such an AI even look like? If it spots every
           | _real_ danger but also hallucinates even a few dangers that
           | aren't really there, it gets ignored or switched off for
           | needlessly slowing the traveler down (false positives,
           | apparently an issue with early Google examples [0]); if it
           | only spots real dangers but misses most of them, it is not
           | helping (false negatives, even worse if a human is blindly
           | assuming the machine knows best and what happened with e.g.
           | Uber [1]); if it's about the same as humans overall but makes
           | different _types_ of mistake, people rely on it right up
           | until it crashes then go apoplectic because it didn't see
           | something any human would consider obvious (e.g. Tesla, which
           | gets _slightly_ safer when the AI is active, but people keep
           | showing the AI getting confused about things that they
           | consider obvious [2]).
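           | 
           | The false-alarm arithmetic is brutal because real hazards
           | are rare. With invented numbers:
           | 
           |     hazard_rate = 1e-4   # share of moments with real danger
           |     sensitivity = 0.99   # real dangers actually caught
           |     false_alarm = 1e-3   # alarm rate on safe moments
           | 
           |     p_alarm = (hazard_rate * sensitivity
           |                + (1 - hazard_rate) * false_alarm)
           |     print(hazard_rate * sensitivity / p_alarm)  # ~0.09
           | 
           | Even a detector that good produces alarms that are ~91%
           | noise, which is exactly why drivers tune such systems out.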
           | 
           | [0] https://theoatmeal.com/blog/google_self_driving_car
           | 
           | [1] https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg
           | 
           | [2] https://youtube.com/watch?v=Wz6Ins1D9ak
        
           | salawat wrote:
           | This is the bit nobody likes to realize. FSD at its
           | best...is still about as fallible as a human driver. Minus
           | free will (it is hoped).
           | 
           | I will be amused, intrigued, and possibly a bit horrified
           | if, by the time FSD hits level 5, they have stuck with the
           | Neural Net of Neural Nets architecture and a rash of
           | system-induced variance in behavior shows up as emergent
           | phenomena take shape.
           | 
           | Imagined news: All Teslas on I-95 engaged in creating
           | patterns whereby all non-Tesla traffic was bordered by a
           | Tesla on each side. Almost like a game of Go, says expert.
           | Researchers stumped.
           | 
           | Then again, that implies you had an NN capable of retraining
           | itself on the fly to some limited degree, which I assume no
           | one sane would put into service... Hopefully this comment
           | doesn't suffer the fate of not aging well.
        
         | fzzzy wrote:
         | Do you hate regular cruise control? How is that not partial
         | self driving?
        
           | ghaff wrote:
           | To tell you the truth, I generally do and haven't used it for
           | ages. Where I drive, the roads have some amount of traffic. I
           | find (traditional) cruise control encourages driving at a
           | constant speed to a degree that I wouldn't as a driver with a
           | foot on the gas. So I don't "hate" regular cruise control but
           | I basically never use it.
        
             | hcurtiss wrote:
             | I think you are in a distinct minority.
        
               | ghaff wrote:
               | Maybe a fairly small sample size but I don't know the
               | last time I've been in a car where the driver has turned
               | on cruise control. But it probably varies by area of the
               | country. In the Northeast, there's just enough traffic in
               | general that it's not worth it for me.
        
               | minhazm wrote:
               | In traffic is where traffic aware cruise control is most
               | useful. A lot of people I knew who bought Tesla's in the
               | bay area specifically bought it so their commutes would
               | be less stressful with the bumper to bumper traffic. I
               | drove 3000+ miles across the country last year with > 90%
               | of it on AP and I was way less tired with AP on vs off
               | and it allowed me to just stay focused on the road and
               | look out for any issues.
        
               | ghaff wrote:
               | Yes. I was (explicitly) talking about traditional "dumb"
               | cruise control. I haven't used adaptive cruise control
               | but I agree it sounds more useful than traditional cruise
               | control once you get above minimal traffic.
        
               | int_19h wrote:
               | One thing worth noting about Subaru's approach to this
               | that is specifically relevant to bumper-to-bumper
               | traffic, is that it will stop by itself, but it won't
               | _start moving_ by itself - the driver needs to tap the
               | accelerator for that. It will warn you when the car in
               | front starts moving, though.
        
         | helsinkiandrew wrote:
         | > you need to always pay attention
         | 
         | That is the fatal flaw in anything but a perfect system - any
         | kind of system that is taking the decisions about steering from
         | the driver is going to result in the driver at best thinking
         | about other things and worse getting into the back seat to
         | change. If you had to develop a system to make sure someone was
         | paying attention, you wouldn't make them sit in a warm comfy
         | seat looking at a screen - you would make them actively engage
         | with what they were looking at - like steering.
         | 
         | And ultimately it doesn't matter how many hundreds of thousands
         | of hours of driving you teach your system with, it may
         | eventually be able to learn about parked cars, kerbs and road
         | signs, but there won't be enough examples of different
         | accidents and how emergency vehicles behave to ever make it
         | behave safely. Humans can cope with driving emergencies fairly
         | well (not perfectly, admittedly) no matter how many they've
         | been involved in, using logic and higher level reasoning.
        
         | dheera wrote:
         | Is it? Tesla is still alive because they're selling cars.
         | 
         | It's just that the companies that are NOT doing incremental
         | approaches are largely at the mercy of some investors who don't
         | know a thing about self-driving, and they may die at any time.
         | 
         | I agree with you that it is _technically_ flawed, but it may
         | still be viable in the end. At least their existence is not
         | dependent on the mercy of some fools who don't get it; they
         | just sell cars to stay alive.
         | 
         | That's one of the major problems of today's version of
         | capitalism -- it encourages technically flawed ways to achieve
         | scientific advancement.
        
           | yawaworht1978 wrote:
           | Would be interesting to know how many buy based on the FSD
           | hype (including the ones who don't pay for the package) and
           | how many buy because of the "green" factor. However many
           | there are who buy because of the FSD promise, all that
           | revenue is coming from vaporware (beta-ware at best) and is
           | possible due to lack of regulatory enforcement. History shows
           | that the longer the self-regulatory entities take the piss,
           | the harder the regulatory hammer comes down eventually.
        
         | cmpb wrote:
         | I disagree. One feature my car has is to pull me back into the
         | lane when I veer out of it (Subaru's lane keep assist). That is
         | still incremental improvement towards "full self driving". I
         | agree, however, that Tesla's Autopilot is not functional
         | enough, and any tool designed to allow humans to remove their
         | hands from the wheel should not require their immediate
         | attention in any way.
        
           | Robotbeat wrote:
           | Tesla's autopilot does not allow you to remove your hands
           | from the wheel. You must keep them on, and apply torque
           | occasionally, to keep it engaged.
        
             | Karunamon wrote:
             | In reality, it'll let you get away with going handsfree for
             | upwards of 30 seconds. That's more than long enough to lose
             | your attention.
        
           | tapoxi wrote:
           | I think people just assume Tesla's Autopilot is more capable
           | than it really is.
           | 
           | My car has adaptive cruise control and lane keep assist, but
           | I'm not relying on either for anything more complex than
           | sipping a drink while on the highway.
        
             | rmckayfleming wrote:
             | Yep, if anything they're just a way to make long drives or
             | stop and go highway traffic more tolerable. When I got my
             | first car with those features it seemed like a gimmick, but
             | they really help to reduce fatigue.
        
         | mrfusion wrote:
         | I'd be curious if there are studies out there on how to do
         | automated assists in machines that require vigilance that don't
         | have this problem.
        
           | jdavis703 wrote:
           | Both airplanes and trains have automated "assist." At least
           | in the case of WMATA, they gave up on automatic train control
           | after a fatal crash.
        
             | userbinator wrote:
             | In a plane you also have far more time to react once the
             | autopilot disconnects for whatever reason, than the
             | fraction of a second that a car gives you.
        
               | jdavis703 wrote:
               | Then the automation needs to be more conservative in its
               | ability and request intervention sooner.
        
             | yawaworht1978 wrote:
             | The difference is, they have traffic controllers and the
             | train have their own dedicated rails, almost no
             | obstructions and a train into train crash danger situation
             | rarely arises. The planes have a lot of maneuvering space
             | to all sides.
             | 
             | Car traffic and streets are more dense and often have
             | humans crossing them without regards to laws, bicycles,
             | motorbikes, road construction and bad weather.
             | 
             | Not saying one auto pilot system is better than the other,
             | however, they operate in different environments.
        
             | merrywhether wrote:
             | This is a misrepresentation of the dumpster fire that was
             | the WMATA train situation. Yes, the fatal crash was the
             | last straw, but the root problem was not the automation
             | system but rather the complete lack of maintenance that led
             | to its inability to work properly. Congress refusing to
             | fund maintenance and then falling behind 10-15 years on it
             | led to all kinds of systems failing. The fatal fire in the
             | blue line tunnel under the river occurred with a human at
             | the controls, but we're similarly not blaming that incident
             | on the perils of human operation.
        
               | jdavis703 wrote:
               | I don't blame the operator for the crash. The other train
               | was behind a blind curve and she hit the emergency brake
               | within a reasonable amount of time given what she could
               | see. However the speeds of the system were set too high
               | for the operator to safely stop because they assumed the
               | ATC would work perfectly.
        
         | Waterluvian wrote:
         | I have a Subaru Forester base model with lane keeping and
         | adaptive cruise control.
         | 
         | I need to be touching the wheel and applying some force to it
         | or it begins yelling at me and eventually brings me slowly to a
         | stop.
         | 
         | I've had it for a year now and I cannot conceive of a way,
         | without physically altering the system (like hanging a weight
         | from the wheel maybe?) that would allow me to stop being an
         | active participant.
         | 
         | I think the opposite is true: Tesla's move fast and kill people
         | approach is the mistake. Incremental mastering of autonomous
         | capabilities is the way to go.
        
           | ChrisClark wrote:
           | That's exactly what my Tesla does. I need a constant torque
           | on the steering wheel or it yells at me and slowly comes to a
           | stop.
        
             | Waterluvian wrote:
             | Personally I've found this to be sufficient in my Forester.
             | Even holding the wheel but not being "there" isn't enough.
             | The car is really picky about it.
        
           | SEJeff wrote:
           | You're literally describing how the Tesla system works. It
           | requires you to keep your hand on the wheel and apply a
           | slight pressure every so often. The cabin camera watches the
           | driver and if they're looking down or at their phone, it does
           | that much more often.
           | 
           | People causing these problems almost certainly are putting
           | something over the cabin camera and a defeat device on the
           | steering wheel.
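           | 
           | A minimal sketch of how such an escalating attention
           | watchdog could work (not Tesla's actual code; every name
           | and threshold here is an invented assumption):
           | 
           |   # hypothetical watchdog loop; thresholds invented
           |   import time
           | 
           |   TORQUE_NM = 0.5       # assumed "hands on" torque
           |   RELAXED_NAG_S = 30.0  # assumed interval, attentive
           |   STRICT_NAG_S = 5.0    # assumed interval, distracted
           | 
           |   def watchdog(wheel_torque, looks_distracted, disengage):
           |       last_input = time.monotonic()
           |       while True:
           |           limit = (STRICT_NAG_S if looks_distracted()
           |                    else RELAXED_NAG_S)
           |           if abs(wheel_torque()) >= TORQUE_NM:
           |               last_input = time.monotonic()
           |           elif time.monotonic() - last_input > limit:
           |               disengage()  # warn, then slow the car
           |               return
           |           time.sleep(0.1)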
        
           | jeffnappi wrote:
           | I own a Model Y and am a pretty heavy Autopilot user. You
           | have to regularly give input on the steering wheel and if you
           | fail a few times it won't let you re-engage until you park
           | and start again.
           | 
           | Personally Autopilot has actually made driving safer for
           | me... I think there's likely abuse of the system though that
           | Tesla could work harder to prevent.
        
             | DrBenCarson wrote:
             | I personally think the issue boils down to their use of the
             | term "Autopilot" for a product that is not Autpilot (and
             | never will be with the sensor array they're using IMO.)
             | 
             | They are sending multiple signals that this car can drive
             | itself (going so far as charging people money explicitly
             | for the "self-driving" feature) when it cannot in the
             | slightest do much more than stay straight on an empty
             | highway.
             | 
             | They should be forced to change the name of the self-
             | driving features, I personally think "Backseat Driver"
             | would be more appropriate.
        
               | jhgb wrote:
               | > the issue boils down to their use of the term
               | "Autopilot" for a product that is not Autpilot
               | 
               | It is literally an autopilot. Just like an autopilot on
               | an airplane, it keeps you stable and in a certain flight
               | corridor. There's virtually no difference except for
               | Tesla's Autopilot's need to deal with curved
               | trajectories.
        
               | labcomputer wrote:
               | > There's virtually no difference except for Tesla's
               | Autopilot's need to deal with curved trajectories.
               | 
               | Well, and it actively avoids collisions with other
               | vehicles (most of the time). Airplane (and boat)
               | autopilots don't do that.
               | 
               | "But you're using the word autopilot wrong!"
        
               | Kaytaro wrote:
               | Autopilot is precisely the correct term - An autopilot is
               | a system used to control the path of an aircraft, marine
               | craft or spacecraft without requiring constant manual
               | control by a human operator. _Autopilots do not replace
               | human operators._
        
           | gccs wrote:
           | Shove a can of soda in the wheel and it will stop beeping.
        
             | mellavora wrote:
             | yes, but what if you also have to sing the jingle?
             | 
             | Damn, this 'drink verification can' is going to get us all
             | killed.
        
           | tgsovlerkhgsel wrote:
           | Tesla had a similar system, and
           | 
           | > physically altering the system (like hanging a weight from
           | the wheel maybe?)
           | 
           | was exactly what people were doing. But it's also possible to
           | be physically present, applying force, but being "zoned out",
           | even without malicious intent.
        
             | johnnyApplePRNG wrote:
             | >But it's also possible to be physically present, applying
             | force, but being "zoned out", even without malicious
             | intent.
             | 
             | I've occasionally noticed myself zoning out behind the
             | wheel of my non-self-driving car as well.
             | 
             | It's actually very common. [0]
             | 
             | [0] https://www.actuarialpost.co.uk/article/quarter-of-
             | fatal-cra...
        
           | shakna wrote:
           | > I need to be touching the wheel and applying some force to
           | it or it begins yelling at me and eventually brings me slowly
           | to a stop.
           | 
           | > I've had it for a year now and I cannot perceive of a way,
           | without physically altering the system (like hanging a weight
           | from the wheel maybe?) that would allow me to stop being an
           | active participant.
           | 
           | That's exactly what people were doing with the Tesla. Hanging
           | a weight to ensure the safety system doesn't kick in. [0][1]
           | 
           | [0] https://edition.cnn.com/2021/04/28/cars/tesla-texas-
           | crash-au...
           | 
           | [1] https://twitter.com/ItsKimJava/status/1388240600491859968
           | /ph...
        
             | Waterluvian wrote:
             | If people are consciously modifying their car to defeat
             | obvious safety systems, I have a really hard time seeing
             | how the auto manufacturer should be responsible.
             | 
             | I guess the probe will reveal what share of fatal accidents
             | are caused by this.
        
               | rcxdude wrote:
               | Well, it doesn't help when the CEO of the company
                | publicly states that the system is good enough to drive
               | on its own and those safety systems are only there
               | because of regulatory requirements.
        
         | tgsovlerkhgsel wrote:
         | It depends. As long as the resulting package (flawed self
         | driving system + the average driver) isn't significantly more
         | dangerous than the average unassisted human driver, I don't
         | consider it irresponsible to deploy it.
         | 
         | "The average driver" includes everyone, ranging from drivers
         | using it as intended with close supervision, drivers who become
         | inattentive because nothing is happening, and drivers who think
         | it's a reasonable idea to climb into the back seat with a water
         | bottle duct taped to the steering wheel to bypass the sensor.
         | 
         | OTOH, the average driver for the unassisted scenario also
         | includes the driver who thinks they're able to drive a car
         | while texting.
        
           | TacticalCoder wrote:
           | > As long as the resulting package (flawed self driving
           | system + the average driver) isn't significantly more
           | dangerous than the average unassisted human driver...
           | 
           | Shouldn't that be compared to "average driver + myriad of
           | modern little safety features" instead of "average
           | unassisted driver"? Someone who has the means to drive a
           | Tesla with the "full driving" mode certainly has the means
           | to buy, say, a Toyota full of assistance/safety features
           | (lane change assist, unwanted lane change warning and
           | whatnot).
        
             | politician wrote:
             | Why isn't defeating the self-driving attention controls a
             | crime like reckless driving? Isn't that the obvious
             | solution?
        
               | tgsovlerkhgsel wrote:
               | It almost certainly is, at least when combined with the
               | intentional inattention that follows.
               | 
               | Making it a crime isn't an "obvious solution" to actually
               | make it not happen. Drunk driving is a crime and yet
               | people keep doing it. Same with texting and driving.
        
               | politician wrote:
               | The problem is determining who is liable for damages, not
               | prevention. Shifting the liability for willfully
               | disabling a safety control puts them on notice.
               | 
               | Prevention as a goal is how we end up with dystopia.
        
               | rcxdude wrote:
               | Gonna be pretty difficult to enforce. Many US states
               | don't even enforce a minimum roadworthiness of cars on
               | the roads.
        
               | politician wrote:
               | Does that even matter? If the state doesn't care to
               | enforce its laws against reckless driving, why should the
               | manufacturer be encumbered with that responsibility?
        
           | jdavis703 wrote:
           | The average driver breaks multiple laws on every trip. Most
           | of the time no one gets hurt. But calibrating performance
           | against folks violating traffic and criminal laws sets the
           | bar too low for an automated system. We should be aiming for
           | standards that either match European safety levels or the
           | safety of air or rail travel.
        
             | Robotbeat wrote:
             | Yes, I agree. We should hold automated systems to a higher
             | standard. Unless, that is, you're proposing we ban
             | automated systems until they're effectively perfect,
             | which would perversely result in a worse outcome: being
             | stuck with unassisted driving forever.
        
             | tgsovlerkhgsel wrote:
             | I disagree. Perfect is the enemy of good, and rejecting a
             | better system because it isn't perfect seems like an absurd
             | choice.
             | 
             | I'm not saying improvements should stop there, but once the
             | system has reached parity, it's OK to deploy it and let it
             | improve from there.
        
               | bcrl wrote:
               | Except that doesn't work if you're trying to produce a
               | safe product. Investigations into crashes in the airline
               | industry have proven that removing pilots from active
               | participation in the control loop of the airplane results
               | in distraction and an increased response time when an
               | abnormal situation occurs. Learning how to deal with this
               | is part of pilots' training, plus they have a co-pilot to
               | keep an eye on things and back them up.
               | 
               | An imperfect self driving vehicle is the worst of all
               | worlds: they lull the driver into the perception that the
               | vehicle is safe while not being able to handle abnormal
               | situations. The fact that there are multiple crashes on
               | the record where Telsas have driven into stationary
               | trucks and obstacles on roads is pretty damning proof
               | that drivers can't always react in the time required when
               | an imperfect self driving system is in use. They're not
               | intrinsically safe.
               | 
                | At the very least drivers should be required to
                | complete additional training to operate these systems.
                | Like pilots, drivers
               | need to be taught how to recognize when things go awry
               | and react to possible failures. Anything less is not
               | rooted in safety culture, and it's good to see there are
               | at least a few people starting to shine the light on how
               | these systems are being implemented from a safety
               | perspective.
        
               | notahacker wrote:
               | > Perfect is the enemy of good, and rejecting a better
               | system because it isn't perfect seems like an absurd
               | choice.
               | 
               | Nothing absurd about thinking a system which has parity
               | with the average human driver is too risky to buy unless
               | you consider yourself to be below average at driving. (As
               | it is, most people consider themselves to be better than
               | average drivers, and some of them are even right!) The
               | accident statistics that comprise the "average human
               | accident rate" are also disproportionately caused by
               | humans you'd try to discourage from driving in those
               | circumstances...
               | 
                | Another very obvious problem is that an automated
                | system which kills at the same rate per mile as the
                | average human driver will tend to be driven a lot
                | more, because driving now takes no effort (and it will
                | probably replace better-than-average commercial
                | drivers long before teenagers and occasional-but-
                | disproportionately-deadly drivers can afford it).
        
           | CaptArmchair wrote:
           | > drivers using it as intended with close supervision
           | 
           | Doesn't this hide a paradox? Using a self-driving car as
           | intended implies that the driver relinquishes a part of the
           | human decision making process to the car. While close
           | supervision implies that the driver can always take control
           | back from the car, and therefore carries full personal
           | responsibility of what happens.
           | 
           | The caveat here is that the car might make decisions in a
           | rapidly changing, complex context which the driver might
           | disagree with, but has no time to correct for through manual
           | intervention. e.g. hitting a cyclist because the autonomous
           | system made an erroneous assertion.
           | 
           | Here's another way of looking at this: if you're in a self-
           | driving car, are you a passenger or a driver? Do you intend
           | to drive the car yourself or let the car transport you to
           | your destination?
           | 
           | In the unassisted scenario, it's clear that both intentions
           | are one and the same. If you want to get to your location,
           | you can't but drive the car yourself. Therefore you can't but
           | assume full personal responsibility for your driving. Can the
           | same be said about a vehicle that's specifically designed and
           | marketed as "self-driving" and "autonomous"?
           | 
           | As a driver, you don't just relinquish part of the decision
           | making process to the car, what essentially happens is that
           | you put your trust in how the machine learning processes that
           | steer the car were taught to perceive the world by their
           | manufacturer. So, if both car and occupant disagree and the
           | ensuing result is an accident, who's at fault? The car? The
           | occupant? The manufacturer? Or the person seeking damages
           | because their dog ended up wounded?
           | 
           | The issue here isn't that self-driving cars are inherently
           | more dangerous than their "dumb" counterparts. It's that
           | driving a self-driving car creates its own separate class of
           | liabilities and questions regarding responsible driving when
           | accidents do happen.
        
         | Baeocystin wrote:
         | I remember reading Donald Norman's books _decades_ ago, and one
         | of the prime examples of the dangers of automation in cars was
         | adaptive cruise control, which would suddenly accelerate
         | forward on a now-clear off-ramp, surprising the heck out of the
         | previously-complacent driver, and leading to accidents.
         | 
         | We've known for a very long time that this sort of
         | automation/manual control handoff failure is a very big deal,
         | and yet there seems to be an almost willful blindness from the
         | manufacturers to address it in a meaningful way.
        
         | [deleted]
        
         | swiley wrote:
         | We have an education problem. People have no idea what
         | computers do because they're illiterate (literacy here
         | meaning knowing at least one programming language well
         | enough to read and write in it), so they just take other
         | people's word that they can do some
         | magical thing with software updates. The most extreme examples
         | of this were the iPhone hoaxes telling people that software
         | updates provided waterproofing or microwave charging.
        
         | backtoyoujim wrote:
         | I mean, there are videos of a vehicle's occupant sitting in
         | the rear seats making food and drinks while the vehicle is
         | tricked into operating off its sensors alone.
         | 
         | It is not solely about trust and dependence; it also
         | includes the group of idiots with access to wealth and no
         | regard for human life.
        
         | bishoprook2 wrote:
         | I expect that to design self-driving you need to push the
         | limits (with some accidents) a bit with a bunch of telemetry.
         | Going from not-much to full-self-driving requires a lot of
         | design increments.
        
         | supperburg wrote:
         | Lex Fridman said they studied this and found that people don't
         | become "lulled" even after using the system for a long period
         | of time.
        
       | phoe18 wrote:
       | > "The involved subject vehicles were all confirmed to have been
       | engaged in either Autopilot or Traffic Aware Cruise Control
       | during the approach to the crashes,"
       | 
       | No mention of the deceptive marketing name "Full Self Driving" in
       | the article.
        
         | xeromal wrote:
         | I'm pretty sure because FSD is out to a limited number of users
         | at the moment. I think it totals around 1,000.
        
           | kube-system wrote:
           | This is just more evidence of the confusion that Tesla
           | marketing has created. "Full Self-Driving Capability" is the
           | literally quoted option they've been selling for years now.
        
         | rvz wrote:
         | Exactly. Some of these cars do not even have 'Driver
         | Monitoring', which means the car doesn't even track if the
         | driver has their _eyes on the road_ at all times, which puts
         | many other drivers at risk.
         | 
         | On top of that, FSD is still admittedly Level 2, not exactly
         | 'Full Self Driving'. And the controls can easily be tricked to
         | think that the driver has their 'hands on the wheel' which is
         | not enough to determine driver attentiveness while FSD is
         | switched on.
        
         | dmix wrote:
         | I checked the website and they seem to be contextualizing
         | "Full-self driving" with it coming at a future date:
         | 
         | > All new Tesla cars have the hardware needed in the future for
         | full self-driving in almost all circumstances. [...] As these
         | self-driving capabilities are introduced, your car will be
         | continuously upgraded through over-the-air software updates.
         | 
         | https://www.tesla.com/en_CA/autopilot
         | 
         | I also personally would prefer they stuck to 'autopilot' and
         | avoided the word full in 'full self-driving' and otherwise be
         | more specific about what it means.
         | 
         | Other car companies typically productize the various features
         | like lane assist, following cruise control, etc., rather
         | than bundle them into one. But that definitely makes
         | communicating it more difficult.
         | 
         | Tesla probably doesn't want to call it 'limited self-driving'
         | or 'partial self-driving'. Maybe 'computer assisted driving'
         | but that doesn't sound as appealing. I can see the difficulty
         | marketing here. But again not using 'full' as in it's complete
         | and ready-to-go would help.
        
       | joewadcan wrote:
       | This will end up being a very good thing for Tesla. They were
       | able to operate semi-autonomous vehicles for years while they
       | iterated through software versions. They are faaar from done, but
       | a likely outcome will be more stringent regulation on companies
       | that want to take a similar approach of putting betas in
       | customers' hands. This makes it way harder for automotive
       | companies to get the same leeway, putting Tesla further ahead.
        
         | jazzyjackson wrote:
         | This is good for Bitcoin
        
       | dotdi wrote:
       | Unfortunately the article doesn't mention anything about how
       | common it is for human drivers to crash into first responder
       | vehicles during the night. I'm not trying to downplay these
       | cases, as hitting emergency vehicles is very bad indeed, yet ~4
       | such crashes per year might be in the same ballpark as, or even
       | better than, the rate at which "unassisted" drivers cause such
       | crashes.
        
         | josefx wrote:
         | Might be around 98 a year if this[1] is the correct list.
         | 
         | Edit: I think the page count at the bottom of that list is off,
         | it seems to repeat the last page so it might be less.
         | 
         | [1]https://www.respondersafety.com/news/struck-by-
         | incidents/?da...
        
           | onlyrealcuzzo wrote:
           | Considering that less than 0.3% of cars in the US are Teslas
           | and that - I would guess - less than 10% of them are using
           | autopilot at any given time - they are probably hundreds
           | of times more likely to hit first responders.
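           | 
           | A rough sketch of that back-of-envelope math, using only
           | the figures in this thread (all of them assumptions):
           | 
           |   # every input here is a guess from this thread
           |   tesla_share = 0.003    # ~0.3% of US cars are Teslas
           |   ap_fraction = 0.10     # guess: share on autopilot
           |   ap_crashes = 4         # per year, into responders
           |   all_crashes = 98       # per year, from the list above
           | 
           |   exposure = tesla_share * ap_fraction      # 0.0003
           |   crash_share = ap_crashes / all_crashes    # ~0.041
           |   print(crash_share / exposure)             # ~136x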
        
         | toomuchtodo wrote:
         | Also doesn't mention that this is an inherent limitation in
         | TACC+ systems, and is specifically called out as such in Volvo,
         | BMW, and Cadillac vehicle manuals as a limitation. Much ado
         | about nothing unless regulators are going to outlaw radar based
         | adaptive cruise control (which, of course, they're not).
        
         | btbuildem wrote:
         | Indeed! I was looking for the same comparison.
         | 
         | In Canada the Highway Act states that you must move over
         | (change lanes) for stopped emergency vehicles. It seems to
         | solve that problem gracefully, leaving an empty lane between
         | the stopped vehicles and traffic.
        
         | salawat wrote:
         | Frankly, in my opinion it's hard to even crash into an
         | emergency vehicle while actually driving and paying
         | attention, given their lights have gotten so darn bright
         | it's damn near blinding. I have to slow to a crawl not out
         | of rubbernecking fascination, but out of self-preservation,
         | to adapt to the dang lighting searing my retinas at night.
         | 
         | Now, running into unlit emergency vehicles? I still think
         | that's rather difficult sans inebriation or sleep
         | deprivation.
        
           | lacksconfidence wrote:
           | The lights are actually what cause the crash. Some drivers
           | just drive straight into the lights. This is part of why
           | police, at least around here, have particular protocols
           | around how far they stop behind a car, and never stand
           | between the cop car and the car they stopped.
        
       | mshumi wrote:
       | Judging by the lack of a market reaction this morning, this is
       | mostly immaterial.
        
         | bathtub365 wrote:
         | The NASDAQ wasn't open at the time of your comment, so how can
         | you even make that determination?
         | 
         | TSLA is down almost 2% in pre-market trading at the time of
         | this comment, though.
        
         | phpnode wrote:
         | Stocks like Tesla have long been divorced from business
         | realities so I wouldn't put too much stake in that
        
           | smallhands wrote:
           | time to short Tesla stocks!
        
             | phpnode wrote:
             | Markets and irrationality and solvency quote goes here
        
               | rvz wrote:
               | Well, I did call an NKLA short 17 days ago, and I
               | ended up laughing all the way to the bank. [0]
               | 
               | [0] https://news.ycombinator.com/item?id=27996773
        
               | phpnode wrote:
               | Fortune favours the brave, occasionally!
        
         | catillac wrote:
         | I don't know much about trading, but it appears to be down
         | nearly 5% this morning as of right now. Regardless, I think
         | you're conflating trading price with whether something is
         | material in general.
        
       | darkerside wrote:
       | Long overdue. We're going to need a more rapid and iterative way
       | to do this if we're to have even a chance of autopilot-type
       | technologies succeeding over the long run. Companies and
       | regulators need to be collecting feedback and pushing for
       | improvement on a regular basis. I still don't think it's likely
       | to succeed, but if it did, this would be the way.
        
       | fallingknife wrote:
       | I always see anecdotes about Tesla crashes, but not any
       | statistics vs other cars. I tend to assume that this is because
       | they aren't more dangerous.
        
         | catillac wrote:
         | My thought would be that we hear lots of anecdotes because they
         | claim a full self driving capability, so it's particularly
         | interesting when they crash and are using these assist
         | features. Indeed when I bought my Model Y I paid extra for that
         | capability.
         | 
         | Here's more detail: https://www.tesla.com/support/autopilot
        
       | crubier wrote:
       | Predictable outcome of a sensing system fully based on deep
       | learning. Rare unusual situations don't have enough training data
       | and lead to unpredictable output.
       | 
       | I still think that Tesla's approach is the right one, I just
       | think they need to gather more data before letting this product
       | be used in the wild unsupervised.
        
         | judge2020 wrote:
         | Current TACC/auto-steer doesn't use deep learning except on the
         | newest Model 3/Y vehicles with "TeslaVision". All cars with
         | radar use the radar and radar only to determine if they should
         | stop for the car they are following.
        
           | laichzeit0 wrote:
           | How does TeslaVision work with stationary objects at night?
           | Like say a big ass truck with its lights off? Do you just
           | pray the vision system recognizes "something" is there? I
           | know they want to pursue a pure-vision system with no radar
           | input, but it seems like there will be some crazy low light /
           | low visibility edge cases you'd have to deal with.
        
             | marvin wrote:
             | How does a human detect a big ass truck with its lights off
             | at night? This is solvable with computer vision. Tesla's
             | dataset is almost nothing but edge cases, and they keep
             | adding more all the time. My money says they'll get there.
        
           | dawnerd wrote:
           | They definitely use cameras as well to determine stopping.
           | Otherwise there wouldn't have been the issue with bridges or
           | shadows causing phantom braking.
        
         | 360walk wrote:
         | I think it is necessary for the crashes to occur, to gather the
         | data required to re-train the auto-pilot. We as a society need
         | to decide whether we want to pay this cost of technological
         | advancement.
        
           | crubier wrote:
           | No. Gathering data of human drivers braking in those
           | circumstances would result in a perfectly fine dataset. This
           | idea of needing human sacrifice is bonkers.
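           | 
           | A toy sketch of that idea: harvest training labels from
           | the moments when human drivers braked hard, rather than
           | from crashes. Every field name here is invented:
           | 
           |   HARD_BRAKE_MS2 = 4.0  # assumed decel threshold, m/s^2
           | 
           |   def mine_braking_events(fleet_log):
           |       """Yield (clip, label) pairs from human driving."""
           |       for frame in fleet_log:
           |           if frame.autopilot_engaged:
           |               continue  # want the human's reaction
           |           if frame.decel_ms2 >= HARD_BRAKE_MS2:
           |               # the seconds before the brake show what
           |               # the human reacted to
           |               yield (frame.sensor_history(seconds=8),
           |                      "brake")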
        
       | thoughtstheseus wrote:
       | Ban human driving on the interstate, highways, etc. Boom, self
       | driving now works at scale.
        
       | tacobelllover99 wrote:
       | Man, Tesla is the worst. Causing more crashes and more likely
       | to catch on fire!
       | 
       | Oh wait, never mind, that's traditional ICE cars.
       | 
       | FUD is dangerous.
        
       | zebnyc wrote:
       | I was excited to read about Tesla's "autopilot" until I read the
       | details. To me as a consumer, autopilot would let me get in my
       | car after dinner in SF, set the destination as Las Vegas and wake
       | up in the morning in Vegas. Wake me up when that exists.
       | 
       | Or I can tell my car, "Hey tesla, go pickup my kid from soccer
       | practice" and it would know what to do.
        
       | MonadIsPronad wrote:
       | Oopsie. This strikes me as perhaps one of the growing pains of
       | not-quite-self-driving: common sense would dictate that manual
       | control would be taken by the driver when approaching an unusual
       | situation like a roadside incident, but we just can't trust the
       | common sense of a minority of people.
       | 
       | Tesla perhaps isn't being loud enough about how autopilot isn't
       | self-driving, and _shouldn't even be relied upon to hit the
       | brakes when something is in front of you_.
        
         | ghaff wrote:
         | What on earth makes you think it's the minority of people who
         | stop paying attention when machinery is handling some task all
         | by itself the vast majority of the time? There's plenty of
         | research that says otherwise even among highly-trained
         | individuals.
        
       | yawaworht1978 wrote:
       | I remember that video "driver is only there because of regulatory
       | rules". That is a flat out lie, safe to say so by now. The
       | autopilot accidents per distance is also cherry picked, turn on
       | autopilot everywhere , including bad weather and see how that
       | comparison goes. And the claim that the cars have all the
       | hardware for future fsd is quite out there too. It's a bit like
       | saying I have the next Michael Phelps here, he just cannot swim
       | yet.
        
       | mensetmanusman wrote:
       | I remember when Google was first presenting on driverless
       | technology about 10 years ago, and they mentioned how you have to
       | go right to full self driving, because any advanced driver
       | assistance will clash with human risk compensation behavior.
       | 
       | Risk compensation is fascinating; driving with a bike helmet
       | causes the biker and drivers around the biker to behave more
       | dangerously.
       | 
       | Is society sophisticated enough to deal with advanced driver
       | assistance? Is it possible to gather enough data to create self
       | driving ML systems?
        
         | WA wrote:
         | > Risk compensation is fascinating; driving with a bike helmet
         | causes the biker and drivers around the biker to behave more
         | dangerously.
         | 
         | Do you have a truly reliable source for that? Because I hear
         | this statement once in a while, and it feels flawed.
         | 
         | A helmet protects you from severe head injury if you are in an
         | accident. There are more reasons for accidents than reckless
         | car drivers. For example:
         | 
         | - Bad weather
         | 
         | - Driver not seeing the biker at all (no matter with or without
         | helmet)
         | 
         | - Crash between 2 cyclists
        
           | xsmasher wrote:
           | Parent did not say that helmets make you less safe. They said
           | that helmets make drivers around the biker behave more
           | dangerously.
           | 
           | https://www.bicycling.com/news/a25358099/drivers-give-
           | helmet...
        
             | brandmeyer wrote:
             | 3.5 inches on an average of ~1 meter was the measurement,
             | in a study that a single researcher performed using himself
             | as the rider.
             | 
             | This result is both weakly supported and small, and it
             | shouldn't be considered actionable.
        
         | jacquesm wrote:
         | Risk compensation probably also works the other way. I'm
         | looking forward to all new cars coming standard with a new
         | safety device that cuts traffic accidents to a small
         | fraction of what they used to be, where the only accidents
         | remaining are all fatal for the driver.
         | 
         | A nice and _very_ sharp 8" stainless steel spike on the
         | steering wheel facing the driver.
        
           | toast0 wrote:
           | > A nice and very sharp 8" stainless steel spike on the
           | steering wheel facing the driver.
           | 
           | Didn't we have those in the 50s and 60s? Maybe not sharp, but
           | collapsible steering columns are a significant improvement to
           | survivability.
        
         | barbazoo wrote:
         | > Risk compensation is fascinating; driving with a bike helmet
         | causes the biker and drivers around the biker to behave more
         | dangerously.
         | 
         | Source please
        
           | bllguo wrote:
           | I remember reading that viewpoint in this essay:
           | https://cyclingtips.com/2018/11/commentary-why-i-stopped-
           | wea...
           | 
           | There are some sources and studies linked, e.g. countries
           | with the highest rate of helmet use also have the highest
           | cyclist fatality rates
        
           | xsznix wrote:
           | Here you go:
           | 
           | https://www.sciencedirect.com/science/article/abs/pii/S00014.
           | ..
           | 
           | https://psyarxiv.com/nxw2k
        
             | barbazoo wrote:
             | I cannot open the study in the first link but the second one
             | seems to actually refute the claim instead of supporting
             | it.
             | 
             | > There is a body of research on how driver behaviour might
             | change in response to bicyclists' appearance. In 2007,
             | Walker published a study suggesting motorists drove closer
             | on average when passing a bicyclist if the rider wore a
             | helmet, potentially increasing the risk of a collision.
             | Olivier and Walter re-analysed the same data in 2013 and
             | claimed helmet wearing was not associated with close
             | vehicle passing.
        
               | xsmasher wrote:
               | Keep reading.
               | 
               | > We then present a new analysis of the original dataset,
               | measuring directly the extent to which drivers changed
               | their behaviour in response to helmet wearing. This
               | analysis confirms that drivers did, overall, get closer
               | when the rider wore a helmet.
        
       | sidibe wrote:
       | Glad the regulators are looking into this. It bothers me that now
       | Tesla seems to have no liability at all for the system not
       | working, since it's always the driver's fault for not paying
       | enough attention.
       | 
       | As Teslas get better at driving the drivers will be paying less
       | attention inevitably, Tesla needs to start being responsible at
       | some point
        
         | bob33212 wrote:
         | Every year young drivers die because they were inexperienced
         | and didn't realize they were going too fast or too slow for a
         | certain situation.
         | 
         | Once full self driving is statistically safer than humans how
         | will you not let people use it? It is like saying you would
         | rather have 10 children die because of bad driving skills
         | rather than 1 child die because they were not paying attention
         | at all times.
        
           | sidibe wrote:
           | >Once full self driving is statistically safer than humans
           | how will you not let people use it?
           | 
           | I'm fine with self-driving if/when it works (though I'm
           | pretty sure from watching FSD Beta videos shot and edited by
           | their biggest fans with a few interventions every 5 minutes,
           | this is many many many years away for Tesla). But the company
           | selling the self driving has to be responsible to some degree
           | for the mistakes it makes.
        
             | WA wrote:
             | And responsible for the marketing it puts out:
             | 
             | https://www.tesla.com/videos/autopilot-self-driving-
             | hardware...
             | 
             | "... HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF."
             | 
             | Online since 2016, debunked as a lie. Still on Tesla's
             | website.
        
           | jazzyjackson wrote:
           | As far as that goes, if we want to save lives we can just
           | regulate that semi-autonomous cars have to enforce the
           | speed limit, and that zero visibility from fog or heavy
           | rain means an automatic pull-over to wait for conditions
           | to improve.
        
           | thebruce87m wrote:
           | _Just_ statistically safer won't cut it - it will have to
           | be many orders of magnitude safer. Instead of drunk people
           | and mobile phone users dying, it will be random accidents
           | that humans would easily have avoided but are some weird
           | edge case for the ML model. It'll be cars plowing down
           | kids on trikes on a clear day, all captured in perfect HD
           | by the car's cameras and in the press the next day with
           | the crying driver blaming the car.
           | 
           | That'll be a hard thing to overcome for the public. The drunk
           | person "had it coming", but did little Timmy?
        
       | zugi wrote:
       | Teslas are the safest vehicles on the road, according to the
       | National Highway Traffic Safety Administration.
       | (https://drivemag.com/news/how-safe-are-tesla-cars-5-facts-
       | an...).
       | 
       | Teslas crash 40% less than other cars, and 1/3 the number of
       | people are killed in Teslas versus other cars.
       | 
       | Indeed once a common failure mode like this is identified it
       | needs to be investigated and fixed. Something similar happened a
       | few years ago when someone driving a Tesla while watching a movie
       | (not paying attention) died when they crashed into a light-
       | colored tractor trailer directly crossing the road. So an
       | investigation makes sense. But much of the general criticism of
       | self-driving and autopilot here seems misplaced. Teslas and other
       | self-driving vehicle technologies are saving lives. They will
       | continue to save lives compared to human drivers, as long as we
       | let them.
        
         | derbOac wrote:
         | I really wrestle with this line of reasoning. Tesla keeps
         | pointing this out, and it's appealing to me, but at the same
         | time something about it seems off to me. I can't tell if this
         | is erroneous intuition on my part blinding me to a more
         | rational assessment, or if that intuition is onto something
         | important.
         | 
         | Some top-of-my-head thoughts:
         | 
         | 1. I think to make a fair comparison of Tesla versus other
         | cars, you'd have to really ask "how much safer are _Tesla
         | owners_ in Teslas compared to other cars randomly assigned to
         | them? " That is, comparing the accident rates of Teslas
         | compared to other cars is misleading because Tesla owners are
         | not a random slice of the population. I almost guarantee that
         | if you e.g., looked at their accident rates prior to owning a
         | Tesla their accident rates would be lower than the general
         | population.
         | 
         | 2. In these autopilot situations, bringing up general accident
         | rates seems sort of like a red herring to me. The actual
         | causally relevant issue is "what would happen in this scenario
         | if someone were driving without an autopilot?" So, for example,
         | in the example of the rider who was killed when the autopilot
         | drove them into a semi, the actually relevant question is "what
         | would have happened if that driver, or someone interchangeable
         | with them, was driving without autopilot? Would they have
         | driven themselves into a semi?"
         | 
         | 3. Various experts have argued general vehicle accident rates
         | aren't comparable to Teslas because average cars are much, much
         | older. As such, you should be comparing accident rates of cars
         | of the same age, if nothing else. So, aside from the driver
         | effect pointed out earlier, you have the question of "what
         | would the accident rate look like in a Tesla or a car identical
         | to it without autopilot?"
         | 
         | 4. At some point with autopilot -- whether it be Tesla or other
         | companies -- you have to start treating it comparably to a
         | single individual. So, for example, what are the odds of Person
         | A27K38, driving the same number of miles as Tesla, having a
         | certain pattern of accidents? If you found a specific person
         | drove into first responders on the side of the road 11 times,
         | wouldn't that be suggestive of a pattern? Or would it? It's not
         | enough to ask "how often do non autopilot drivers drive into
         | first responders on the side of the road", it seems to me
         | important to ask "how often would a _single driver_ drive into
         | first responders on the side of the road, given a certain
         | number of miles driven in that same period?" At some point,
         | autopilot becomes a driver, in the sense it has a unique
         | identity regardless of how many copies of it there are? Maybe
         | that's not right but it seems like that is the case.
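         | 
         | One way to formalize point 4 is to treat autopilot as a
         | single "driver" and ask how surprising its crash count would
         | be under an average driver's rate. A sketch, with every
         | number an invented placeholder:
         | 
         |   from math import exp, factorial
         | 
         |   base_rate = 1e-9   # assumed strikes per mile, avg driver
         |   ap_miles = 3e9     # assumed total autopilot miles
         |   lam = base_rate * ap_miles   # expected strikes: 3.0
         |   observed = 11
         | 
         |   p = 1 - sum(exp(-lam) * lam**k / factorial(k)
         |               for k in range(observed))
         |   print(p)  # ~0.0003: a real pattern, if the inputs held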
        
       | sunshineforever wrote:
       | I wonder, how does the autopilot safety record compare to driving
       | in conditions similar to those in which AP is typically used
       | (open highway, good weather)?
        
       | tyingq wrote:
       | The crash in the city of Woodlands, Texas, was pretty terrifying.
       | After hitting a tree, the car caught on fire. The driver was
       | found in the back seat, presumably because he couldn't figure out
       | how to open the door to get out.
        
         | Meekro wrote:
         | Interesting! When HN discussed that story a few months ago[1],
         | the common notion was that the driver had enabled autopilot and
         | climbed into the back seat.
         | 
         | Your take seems a lot more plausible.
         | 
         | [1] https://news.ycombinator.com/item?id=26869962
        
           | tyingq wrote:
           | That discussion was because the local sheriff's office said
           | that there was nobody in the drivers seat at the time of
           | impact. No idea why they would have said that. Tesla says the
           | steering wheel was deformed in a manner consistent with a
           | person being in the driver's seat when it crashed.
        
         | bishoprook2 wrote:
         | I just read about that after your post. Even with the typical
         | lawyer hyperbole it's pretty bad.
         | 
         | It seems to me that Tesla door handles (in a world where
         | they've been designing door latches for some time) are just
         | plain ridiculous and likely unreliable but are a side effect of
         | the market the company has been selling into. Gadgets go a long
         | way with Tesla owners.
         | 
         | Obviously, things like a latch should not only work under all
         | conditions including no-power, but they should probably
         | operate the same way under all conditions. 'Emergency'
         | latches aren't going to be used during an emergency, as
         | muscle memory is too important.
        
       | TacticalCoder wrote:
       | I drive a lot across Europe: as in, really a lot, long trips
       | across several countries, several times a year. I drive
       | enough on the highways to have seen a few scary situations,
       | like the truck driver in a big curve slightly deviating out
       | of his lane and "pushing" me dangerously close to the median
       | strip, for example.
       | 
       | To me driving requires paying constant attention to the road and
       | being always ready to act swiftly: I just don't understand how
       | you can have a "self driving car but you must but be ready to put
       | your hands back on the steering wheel and your foot on the
       | pedal(s)".
       | 
       | I have nothing against many "recent" safety features, like the
       | steering wheel shaking a bit if the car detects you're getting
       | out of your lane without having activated your blinker. Or the
       | car beginning to brake if it detects an obstacle. Or the car
       | giving you a warning if there's a risk when you change lane, etc.
       | 
       | But how can you react promptly if you're not ready? I just don't
       | get this.
       | 
       | Unless it's a fully self-driving car, without even a steering
       | wheel, a car should help you focus more, not less.
        
         | [deleted]
        
         | zip1234 wrote:
         | Also, these cars know the speed limits for the road but let you
         | set cruise control/self driving above the speed limit. Seems
         | like for safety purposes that should not be allowed. Not only
         | are people paying significantly less attention but they also
         | are speeding.
        
           | Sargos wrote:
           | Going slower than traffic is actually unsafe and increases
           | the chances of collisions with other drivers.
        
             | zip1234 wrote:
             | Going slower than traffic happens all the time. Over the
             | road trucks often have speed governors set to 60-70 mph for
             | example.
        
           | filoleg wrote:
           | That's a feature accommodating realities of driving on public
           | roads, not a bug.
           | 
           | If you drive on a 60mph speed limit highway, no one is
           | driving 60mph, everyone is going around 70mph. If you decide
           | to use autopilot and it limits you to 60mph, you
           | singlehandedly start disrupting the flow of traffic (that
           | goes 70mph) and end up becoming an increased danger to
           | yourself and others.
           | 
           | Not even mentioning cases when the speed limits change
           | overnight or the map data is outdated or if a physical sign
           | is unreadable.
        
             | zip1234 wrote:
             | Over the road trucks often have speed governors, some
             | companies limit their trucks to 60 mph because it saves a
             | lot of fuel and leads to a much (50%) lower risk of
             | collisions.
        
               | filoleg wrote:
               | Apples to oranges. The stopping distance of a
               | 16-wheeler is far larger than that of a typical sedan,
               | so in
               | their case it makes sense.
               | 
               | For specific numbers (after subtracting reaction distance
               | being the same for both):
               | 
               | 55mph: car 165ft, 16-wheeler 225ft. 65mph: car 245ft,
               | 16-wheeler 454ft.
               | 
               | As you can see, the gap between a car's stopping distance
               | and a 16-wheeler's stopping distance increases with speed
               | increasing, and non-linearly at that. Not even mentioning
               | the destructive potential of a car vs. a 16-wheeler.
               | 
               | I would agree with your point if the majority of the
               | roads were occupied by 16-wheelers, but that isn't the
               | case (at
               | least in the metro area that I commute to work in).
               | 
               | Source for numbers used:
               | https://trucksmart.udot.utah.gov/motorist-home/stopping-
               | dist...
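               | 
               | Quick check on how that gap widens, using the UDOT
               | braking figures quoted above (reaction distance
               | already subtracted, in feet):
               | 
               |   braking_ft = {
               |       55: {"car": 165, "semi": 225},
               |       65: {"car": 245, "semi": 454},
               |   }
               | 
               |   for mph, d in braking_ft.items():
               |       gap = d["semi"] - d["car"]
               |       print(mph, "mph: semi needs", gap, "ft more")
               |   # -> 55 mph: semi needs 60 ft more
               |   # -> 65 mph: semi needs 209 ft more (non-linear)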
               | 
               | Note: I agree that it would be safer if everyone drove
               | the exact speed limit, as opposed to everyone going 10mph
               | above the speed limit. However, in a situation where
               | everyone is driving 10mph above the speed limit, you are
               | creating a more dangerous situation by driving 10mph
               | slower instead of driving 10mph above like everyone else.
        
           | aembleton wrote:
           | > Also, these cars know the speed limits for the road
           | 
           | Does it always get this correct, or does it sometimes read a
           | 30mph sign on a side road and then slow the car on the
           | motorway down to that speed?
        
             | zip1234 wrote:
             | I'm not sure how the cars know the speed limit. Maybe
             | someone else knows? My guess is combo of GPS/camera to
             | position correctly on road and the lookup of known speed
             | limit data. Perhaps it reads signs though?
        
               | cranekam wrote:
               | The rental car I am using now certainly a) reads road
               | signs for speed limit information, b) is definitely
               | fooled by signs on off ramps etc.
               | 
               | It's hard to imagine how speed limit systems would work
               | without some sort of vision capabilities -- a database of
               | speed limits would never be up to date with roadworks and
               | so on.
        
               | zip1234 wrote:
               | Nobody should be using autopilot driving through
               | roadworks anyways.
        
               | hermitdev wrote:
               | My car shows the speed limit of roads it knows. It uses
               | GPS and stored limits. It also doesn't know the limits of
               | non-major roads and doesn't attempt to show a limit then.
               | My car is a 2013, and I've not paid the $$ to update the
               | maps in that time (seriously, they want $200-$400 to
               | update the maps).
               | 
               | Since I bought my car, Illinois (where I live) has raised
               | the maximum limit on interstates by 10 MPH. My car
               | doesn't know about it. If my car limited me to what it
               | thought the limit was, I'd probably be driving 20 MPH
               | slower than prevailing traffic, a decidedly unsafe
               | situation.
        
             | rad_gruchalski wrote:
             | Different manufacturers probably use different systems but
             | no. BMW attempts to read the speed limit signs using the
             | frontal camera with a mix of some sort of stored info - it
             | knows that the speed limit is about to change (Mobileye?),
             | but it very often won't catch a sign in a bend
             | or when the weather is bad. Also, it does not recognize
             | time restricted speed limits, for example 30kph from 7:00
             | to 17:00 Monday to Friday so it would keep driving 30kph
             | outside of those hours while 50kph is allowed. In some
             | places in Germany, it does not recognize the city limits
             | and carries on showing 70kph for a kilometer longer than it
             | should.
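             | 
             | A toy illustration (not any manufacturer's code) of
             | fusing a stored map limit with a camera-read sign,
             | including the time-restricted plates mentioned above;
             | every structure here is invented:
             | 
             |   from datetime import datetime
             | 
             |   def effective_limit(map_kph, sign, now=None):
             |       # sign: (kph, (weekdays, hours)) or (kph, None)
             |       now = now or datetime.now()
             |       if sign is None:
             |           return map_kph      # no sign read: use map
             |       kph, restriction = sign
             |       if restriction:
             |           days, hours = restriction
             |           if (now.weekday() not in days
             |                   or now.hour not in hours):
             |               return map_kph  # plate not active now
             |       return kph
             | 
             |   # "30 kph Mon-Fri 7:00-17:00" read at 19:00 Monday:
             |   sign = (30, (range(0, 5), range(7, 17)))
             |   print(effective_limit(
             |       50, sign, datetime(2021, 8, 16, 19)))  # -> 50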
        
           | emerged wrote:
           | Almost nobody drives at or below the speed limit. It's
           | dangerous to do so in many places.
        
         | lastofthemojito wrote:
         | I learned this playing Gran Turismo video games way back when.
         | The game has long endurance races (I seem to remember races
         | that ran about 2 hours, but there may have been longer ones).
         | Eventually you get hungry or thirsty or have to use the
         | bathroom, so you pause the game, take care of business, and
         | resume. It's really easy to screw up if the game was paused
         | while your car was doing anything other than stable, straight
         | travel. A turn that I successfully handled 100 times before can
         | suddenly feel foreign and challenging if I resume there with
         | little context.
         | 
         | Obviously that's not exactly the same thing as taking over for
         | a real car when the driver assistance features give up, but
         | seems similarly challenging to take over the controls at the
         | most precarious moment of travel, without being sort of "warmed
         | up" as a driver.
        
           | jcpham2 wrote:
           | 500 laps at Laguna Seca in a manual transmission car, let's
           | go!
        
         | zemptime wrote:
         | I see a lot of comments here postulating that autopilot is a
         | terribly designed feature, from people who appear not to be
         | speaking from first-hand experience, and now I feel compelled
         | to comment, exactly following that HN pattern someone posted
         | about how HN discussions go. That said, thanks for keeping
         | this discussion focused and framed as a system design one; it
         | doesn't feel like a Tesla hate train, so I feel comfortable
         | hoppin' in and sharing. This is a little refreshing to see.
         | 
         | Anyway, perhaps I'm in a minority here, but I feel as though my
         | driving has gotten _significantly safer_ since getting a Tesla,
         | particularly on longer road trips.
         | 
         | Instead of burning energy making sure my car stays in the lane
         | I can spend nearly all my time observing drivers around me and
         | paying closer attention farther down the road. My preventative
         | and defensive driving has gone up a level.
         | 
         | > I just don't understand how you can have a "self driving car
         | but you must but be ready to put your hands back on the
         | steering wheel and your foot on the pedal(s)".
         | 
         | I've avoided hitting animals and dodged random things
         | rolling/blowing into the road at a moment's notice. This isn't
         | letting autopilot drive; it's like a hybrid act where it does
         | the rote driving and I constantly take over - to quickly pass
         | a semi on a windy day, to not pass it on a curve, or to get
         | over some lanes to avoid tire remnants in the road up ahead.
         | I'm able to watch the traffic in front and behind and find
         | pockets on the highway with nobody around me and no clumping
         | bound to occur (<3 those).
         | 
         | To your suspicion, it is a different mode of driving. Recently
         | I did a roadtrip (about half the height of the USA) in a non-
         | Tesla, and I found myself way more exhausted and less alert
         | towards the end of it. Could be I'm out of habit but egh.
         | 
         | Anyway, so far I've been super lucky. I don't think it's
         | possible to avoid all car crashes no matter how well you drive.
         | But I _for sure_ have avoided avoidable ones and taken myself
         | out of situations where they later occurred thanks to the extra
         | mental cycles afforded to me by auto-pilot. My safety record in
         | the Tesla is currently perfect and I'll try and keep it that
         | way.
         | 
         | I don't think autopilot is perfect either but I do think it's a
         | good tool and I'm a better driver for it. Autopilot has
         | definitely helped me spend better focus on driving.
        
           | somerandomqaguy wrote:
           | I think you two are talking about different things.
           | 
           | You're talking about Autopilot, which is just driver
           | assistance technologies: lane keep assistance, adaptive
           | cruise control, blind spot monitoring, etc. It's not meant
           | to replace driver attention, just to monitor sections of
           | the road that the driver can't pay attention to full time.
           | The driver still remains in control of, and attentive to,
           | the road.
           | 
           | The person you're responding to seems to be talking about
           | the Full Self Driving feature, whose initial marketing
           | implied that the driver need not be mentally engaged at all,
           | or could even be too impaired to drive normally. That was
           | later back-pedalled to say that you need to pay attention.
        
           | TacticalCoder wrote:
           | There's zero Tesla hate here and certainly zero EV hate here,
           | on the contrary: I just feel the interior build quality on
           | the Tesla could be a bit better but I'm sure they'll get
           | there.
           | 
           | I wouldn't want my comment (strangely enough, upvoted a lot)
           | to be mistaken for Tesla hate. I like what they're doing. I
           | just think the auto-pilot shouldn't give a false sense of
           | security.
           | 
           | > I've not hit animals and dodged random things
           | rolling/blowing into the road at a moment's notice.
           | 
           | > I don't think it's possible to avoid all car crashes no
           | matter how well you drive.
           | 
           | Same here... And animals are my worst nightmare: there are
           | videos on YouTube that are just terrifying.
           | 
           | I do regularly watch crash videos to remind myself of some
           | of the dangers on the road.
        
           | kwhitefoot wrote:
           | What I always tell people is that together me and my car
           | drive better than either of us on their own (Tesla Model S
           | 70D, 2015, AP1.5).
        
           | malwrar wrote:
           | This expresses the mindset I find myself in when I use
           | Autopilot. It's like enabling cruise control, you're still
           | watching traffic around you but now you don't need to focus
           | on maintaining the correct speed _or_ worry about keeping
           | your car perfectly in a lane. You can more or less let the
           | car handle that (with your hands on the wheel to guard
           | against the occasional jerky maneuver when a lane widens for
           | example) while you focus on the conditions around you.
        
             | somedude895 wrote:
             | Exactly this. I treat AP like I'm letting a learner drive.
             | Constantly observing to make sure it's doing the right
             | thing. I've been on long road trips and with AP my mind
             | stays fresh for much longer than in other cars.
        
             | throwaway0a5e wrote:
             | Exactly. It frees the driver from increasingly advanced
             | levels of mundane driving (cruise control manages just
             | speed, adaptive cruise also deals with following distance,
             | lane keeping deals with most of the steering input, etc)
             | allowing the driver to focus more on monitoring the
             | situation and the strategic portion of driving rather than
             | the tactical. Of course, this relies on the driver to
             | actually do that. They could just devote that extra
             | attention to their phone.
        
               | scrumbledober wrote:
               | My 2021 Subaru Forester does all of these things, and I
               | do feel like I am safer with them on while paying
               | attention to the rest of driving.
        
           | AndrewBissell wrote:
           | The problem is, even if your subjective idea of how Autopilot
           | affects your own driving is correct, it appears not to be the
           | case for a significant subset of Tesla drivers, enough that
           | they've been plowing into emergency vehicles at such an
           | elevated rate as to cause NHTSA to open an investigation.
           | 
           | Also, your subjective impressions may be what they are simply
           | because you have not yet encountered the unlucky set of
           | conditions which would radically change your view, as was
           | surely the case for all the drivers involved in these sorts
           | of incidents.
        
           | gugagore wrote:
           | Some people activate cruise control and then rest their right
           | foot on the floor. I activate cruise control whenever
           | possible because while it is activated, I can drive with my
           | foot resting on the brake pedal. I like being marginally more
           | responsive to an event that requires braking since I don't
           | need to move my foot over from the accelerator first.
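           | 
           | Rough, illustrative numbers (assumed, not measured) for why
           | that margin matters:
           | 
           |     speed_ms = 30.0     # ~108 km/h, ~67 mph
           |     pedal_move_s = 0.2  # assumed accelerator-to-brake time
           |     extra_m = speed_ms * pedal_move_s
           |     print(extra_m)      # 6.0 m traveled before braking begins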
        
         | rad_gruchalski wrote:
         | The most useful button on my car is the speed limiter.
         | Everything else can go.
        
         | comeonseriously wrote:
         | I agree with everything you said. I do hope that eventually the
         | tech gets to the point where it can take over full time. We
         | recently took a road trip for our vacation and the amount of
         | road rage we witnessed was ... mind boggling. Don't get me
         | wrong, not everyone is a raging asshole, but there were enough
         | to make me wonder just why so many people are so freaking
         | angry.
        
         | KronisLV wrote:
         | > But how can you react promptly if you're not ready? I just
         | don't get this.
         | 
         | You cannot, that's the simple truth. You're supposed to focus
         | on the road anyway and should be able to take over once any
         | sort of autopilot or assist system starts working erroneously,
         | yet in practice many people simply assume that those systems
         | being there in the first place means that you can simply stop
         | focusing on the road altogether.
         | 
         | It feels like the claim of a "fully self driving vehicle" is
         | at odds with actual safety, or at least will remain so until
         | the technology actually progresses far enough to be on average
         | safer than human drivers, moral issues aside. Whether that
         | will take 15, 50 or 500 years, I cannot say, however.
         | 
         | That said, currently such functionality could be good enough
         | for the driver to take a sip from a drink, fiddle with a
         | message on their phone, or mess around with the navigation
         | system or the radio - things that would get done regardless,
         | because people are irresponsible, but that can feasibly be
         | made a little bit safer.
        
           | [deleted]
        
           | cma wrote:
           | I feel like driver monitoring can keep it safe, and should
           | even be available without autopilot enabled.
           | 
           | Comma.ai makes the monitoring more strict when the system is
           | less certain or when in denser traffic.
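           | 
           | A toy sketch of that idea (invented names and numbers, not
           | comma.ai's actual logic):
           | 
           |     # Scale the allowed eyes-off-road time by how certain
           |     # the driving model is and how dense traffic is.
           |     def allowed_distraction_s(confidence, density):
           |         base_s = 6.0  # assumed baseline timeout
           |         return base_s * confidence * (1.0 - 0.5 * density)
           | 
           |     allowed_distraction_s(0.9, 0.2)  # easy conditions: ~4.9s
           |     allowed_distraction_s(0.5, 0.8)  # hard conditions: ~1.8s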
        
           | ghaff wrote:
           | It's nothing (well certainly not everything) to do with
           | people's assumptions. There's a ton of research around how
           | people simply stop paying attention when there's no reason
           | for them to pay attention 99% of the time. It doesn't even
           | need to be about them pulling out a book or watching a movie.
           | It can simply be zoning out.
           | 
           | Maybe, as you say, it's feasible today or soon to better
           | handle brief distractions but once you allow that it's
           | probably dangerous to assume that people won't stretch out
           | those distractions.
        
             | Retric wrote:
             | We have empirical data showing how safe actual level 2 self
             | driving cars are in practice. So there's no reason to work
             | from base assumptions. Yes, level 2 self driving cars cause
             | avoidable accidents, but the overall rate is very close to
             | the human rate. The only way that's happening is if they
             | are causing and preventing roughly similar numbers of
             | accidents.
             | 
             | Which means people are either paying enough attention or
             | these self driving systems are quite good. My suspicion is
             | it's a mix of both, where people tend to zone out in less
             | hazardous driving conditions and start paying attention
             | when things start looking dangerous. Unfortunately, that's
             | going to cause an equilibrium where people pay less
             | attention as these systems get better.
        
               | Brakenshire wrote:
               | > We have empirical data showing how safe actual level 2
               | self driving cars are in practice.
               | 
               | Do we? Where does that come from? The data Tesla provides
               | is hopelessly non-representative because it makes the
               | assumption that the safety of any given road is
               | independent of whether a driver chooses to switch on the
               | system there.
        
               | Retric wrote:
               | Only overall numbers actually matter here; if self
               | driving is off then that's just the default risk from
               | human driving in those conditions. Talk to your insurance
               | company, they can give you a breakdown by make, model,
               | and trim level.
        
               | SpelingBeeChamp wrote:
               | I am pretty sure that if I call Geico they will not
               | provide me with those data. Am I wrong?
        
               | Retric wrote:
               | Mine did, but I don't use Geico. If they don't give you
               | the underlying data you can at least compare rates to
               | figure out relative risks.
        
         | Faaak wrote:
         | To me they are really aids. Of course you stay focused, but I
         | found that they take out a lot of mental load: keeping the car
         | straight, constantly tweaking the accelerator, etc.
         | 
         | It just makes the trips easier on the brain, and thus, for me,
         | safer overall: it's easier to see the overall situation when
         | you've got free mental capacity.
        
         | kbshacker wrote:
         | Exactly. The only driving assistance feature I use is adaptive
         | cruise control, and I don't plan to use anything more. If I
         | trusted autonomous systems too much, I would not be ready when
         | it matters.
        
         | pedrocr wrote:
         | I drive a Tesla and don't use the self-steering feature exactly
         | because of this. What I do instead is enable the warnings from
         | the same software like the ones you describe. That is actually
         | a large gain. I'm already paying attention as I'm driving the
         | car at all times and the software helps me catch things I
         | haven't noticed for some reason. Those features seem really
         | well done as the false positives are not too frequent and just
         | a nuisance but the warnings are often valuable.
        
           | oblio wrote:
           | Does it have/use emergency braking in case of danger, if you
           | don't use self-driving?
        
             | jazzyjackson wrote:
             | Yes but they are phasing out radar in favor of vision-only.
             | Model 3 and Y have been shipping without radar braking for
             | the past few months.
        
               | wilg wrote:
               | They still do emergency braking regardless of the sensor
               | technology.
        
               | nickik wrote:
               | The vision-only system has passed all required tests for
               | certification and Tesla themselves consider it to be a
               | much safer system now.
        
             | caf wrote:
             | Yes.
        
         | robomartin wrote:
         | It is my belief that the ideal form of truly self driving
         | vehicles will not happen until vehicles can talk to each other
         | on the road to make each other aware of position and speed
         | data. I don't think this has to be full GPS coordinates at
         | all; this is about short-range relative position information.
         | 
         | A mesh network of vehicles on the road would add the ability
         | for vehicles to become aware of far more than a human driver
         | can ever know. For example, if cars become aware of a problem a
         | few km/miles ahead, they can all adjust speed way before
         | encountering the constriction in order to optimize for traffic
         | flow (or safety, etc.).
         | 
         | Of course, this does not adequately deal with pedestrians,
         | bikes, pets, fallen trees, debris on the road, etc.
         | 
         | I'm not saying cars would use the mesh network as the sole
         | method for navigation; they have to be highly capable without
         | it. The mesh network would be an enhancement layer. On
         | highways this would allow for optimization that would bring
         | forth some potentially nice benefits. For example, I can
         | envision reducing emissions through traffic flow optimization.
         | 
         | Remember that electric cars still produce emissions, just not
         | necessarily directly while driving. The energy has to come from
         | somewhere and, unless we build a massive number of nuclear
         | plants, that somewhere will likely include a significant
         | percentage of coal and natural gas power plants.
         | 
         | The timeline for this utopia is likely in the 20+ year range. I
         | say this because of the simple reality of car and truck
         | ownership. People who are buying cars today are not going to
         | dispose of them in ten years. A car that is new today will
         | likely enter into the used market in 8 to 10 years and be
         | around another 5 to 10. The situation is different with
         | commercial vehicles. Commercial trucks tend to have longer
         | service lives by either design or maintenance. So, yeah, 20 to
         | 30 years seems reasonable.
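         | 
         | For the flavor of message this would involve, a hypothetical
         | short-range beacon might look like the sketch below (field
         | names invented; real V2V efforts like DSRC/C-V2X define their
         | own formats):
         | 
         |     import json, time
         | 
         |     # Hypothetical beacon each car broadcasts a few times per
         |     # second and relays onward, so a hazard can propagate back
         |     # through traffic before anyone can see it.
         |     def make_beacon(vehicle_id, lane, speed_ms, hazard=None):
         |         return json.dumps({
         |             "id": vehicle_id,   # rotating anonymous ID
         |             "t": time.time(),
         |             "lane": lane,       # relative lane index
         |             "speed": speed_ms,
         |             "hazard": hazard,   # e.g. "stopped traffic ahead"
         |         })
         | 
         | Relaying these beacons a few hops back is what would let cars
         | adjust speed kilometers before reaching the constriction.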
        
         | mhb wrote:
         | Yes. This also makes me kind of nervous when just using normal
         | car adaptive cruise control. I feel as though my foot needs to
         | be hovering near the pedal anyway and that's often less
         | comfortable than actually pushing on the pedal and controlling
         | it myself.
        
         | hnarn wrote:
         | > like the truck driver in a big curve slightly deviating out
         | of his lane and "pushing" me dangerously close to the median
         | strip for example
         | 
         | This is a situation that you simply shouldn't put yourself in.
         | There is no reason to ever drive right next to a large vehicle,
         | on either side, except for very short periods when overtaking
         | them on a straight road.
        
           | throwaway0a5e wrote:
           | This just isn't realistically possible on most highways
           | except in the lightest traffic conditions. You are gonna
           | spend some time beside trucks whether you like it or not.
        
             | hnarn wrote:
             | Spending time right next to a truck is completely optional.
             | You can either speed up or slow down, either of them will
             | put you in a position where you are no longer right next to
             | them.
        
               | occamrazor wrote:
               | What if there is a more or less uninterrupted row of
               | trucks in the right lane?
        
               | hnarn wrote:
               | We can play "what if" all day, but I'm not interested. In
               | 99.9% of cases you can and should avoid driving next to a
               | large vehicle.
        
         | JohnJamesRambo wrote:
         | These are exactly my arguments to my girlfriend on why she
         | shouldn't use the Autopilot on our Tesla. Your mind will
         | stray; the feature is exactly meant to do that to you. The
         | feedback loop goes the wrong way. Then, boom, you don't see
         | emergency vehicles at a wreck, apparently. I do blame Elon: he
         | did the Silicon Valley thing of just promising a lot of
         | untested stuff before the laws have solidified. Uber, Lime
         | scooters, etc. The Tesla is a great car, but self-driving is
         | orders of magnitude harder than he thinks.
        
           | jays wrote:
           | Agreed. I'd also add that other car manufacturers have made
           | tradeoffs on safety issues for decades.
           | 
           | So I wonder if it's more about Tesla capitalizing on the hype
           | of self driving cars (with the expensive self-driving add-on)
           | in the short term and less about him misunderstanding the
           | magnitude of difficulty.
           | 
           | Tesla is using the proceeds from that add-on to make
           | themselves seem more profitable and to fund the actual
           | development. It's smart in some aspects, but very risky for
           | consumers and Tesla alike.
        
             | ghaff wrote:
             | If you go back a few years, there were clearly expectations
             | being set around L4/5 self-driving that have very clearly
             | not been met.
             | 
             | I still wonder to what degree this was a collective
             | delusion based on spectacular but narrow gains mostly
             | related to supervised learning in machine vision, how much
             | was fake it till you make it, and how much was pure
             | investor/customer fleecing.
        
         | ocdtrekkie wrote:
         | It can be really jarring too when a car behaves differently
         | than you expect: I regularly use cruise control on my Kia,
         | which makes driving much less stressful. It keeps the car
         | centered in the lane, more or less turns the car with the road,
         | and of course, matches the speed of the car in front of it with
         | reasonable stopping distance. I wouldn't call it "self-driving"
         | by any means, but if not for the alert that gets ticked off if
         | your hands are off the wheel too long, it'd probably go on its
         | own for quite a long time without an incident.
         | 
         | However, I have also (once, so far) experienced what happens
         | when this system encounters a poorly-marked construction zone.
         | Whilst most construction sites on the interstate system place
         | temporary road lines for lane shifts, this one solely used
         | cones. While I was paying attention and never left the flow of
         | traffic, the car actually fought a little bit against me
         | following the cones into another lane, because it didn't see
         | the cones; it was following the lines.
         | 
         | It doesn't surprise me at all that if someone gets too
         | comfortable trusting the car to do the work, even if they
         | _think_ they're paying attention, they could get driven off
         | the roadway.
        
           | hermitdev wrote:
           | I was thinking about this the other day - driving in
           | construction. The town I live in is currently doing water
           | main replacement. So, lots of torn up roads, closed lanes and
           | even single-lane only with a flagger alternating directions.
           | No amount of safety cones will make it obvious what's going
           | on.
           | 
           | How do automated systems deal with flaggers? Visibility of
           | the stop/slow sign isn't sufficient to make a determination
           | on whether it's safe to proceed (not to mention "stop"
           | changes meaning here, entirely, from a typical stop sign).
           | Often, whether or not you can proceed comes down to hand
           | gestures from the flagger proper.
           | 
           | Not that I expect any reasonable driver to be using something
           | like autopilot through such a situation, but we've also seen
           | plenty of evidence that there are unreasonable drivers
           | currently using these systems, as well.
        
             | ocdtrekkie wrote:
             | Conceivably in the somewhat-near future (10 years+), most
             | cars on the road will have some sort of ADAS system, in
             | which I'd presume it'd start to make sense for construction
             | to use some sort of digital signalling. Something like a
             | radio signal broadcast that can send basic slow/stop
             | flagging signals to a lane of traffic.
             | 
             | Of course, the problem is, if we haven't developed it
             | today, the ADAS systems of today won't understand it in ten
             | years when there's enough saturation to be practical to use
             | it. Apart from Tesla, very few car manufacturers are
             | reckless enough to send OTA updates that can impact driving
             | behavior.
             | 
             | Lane-following ADAS systems of today, mind you, can work
             | relatively fine in construction areas... provided lane
             | lines are moved, as opposed to relying solely on traffic
             | cones.
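             | 
             | As a sketch of what such a flagging broadcast could carry
             | (a purely hypothetical format; nothing like this is
             | standardized today):
             | 
             |     from dataclasses import dataclass
             | 
             |     # Invented message format for a work-zone beacon; a
             |     # real standard would need signing/authentication so
             |     # pranksters can't stop a lane of traffic.
             |     @dataclass
             |     class WorkZoneSignal:
             |         zone_id: str
             |         lane: int          # lane the order applies to
             |         state: str         # "STOP", "SLOW", "PROCEED"
             |         advisory_kph: int  # speed through the zone
             | 
             |     WorkZoneSignal("IL-94-mi-12", lane=1, state="SLOW",
             |                    advisory_kph=40)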
        
         | paul7986 wrote:
         | Fully automated self driving cars are either a pipe dream or
         | decades away, during which many more people will be killed on
         | the road in the name of technological progress.
        
           | [deleted]
        
           | hnburnsy wrote:
           | Will changes such as machine-readable road markings, car-to-
           | car communications, and traffic management systems make this
           | happen quicker?
           | 
           | For example, couldn't emergency vehicles send out a signal,
           | directly to autonomous vehicles or via a traffic management
           | system, to slow down or require the driver to take over when
           | approaching? An elementary version of this is Waze, which
           | will notify you of road hazards or cars stopped on the side
           | of the road.
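           | 
           | Receiver-side, the handling might look something like this
           | (all names assumed, for illustration only):
           | 
           |     # React to a broadcast emergency-scene alert: hand
           |     # control back early, or just shed speed, depending on
           |     # how close the scene already is.
           |     def on_emergency_alert(alert, car):
           |         if alert.distance_m < 500:
           |             car.request_driver_takeover()
           |         else:
           |             car.set_target_speed(min(car.target_speed,
           |                                      alert.advisory_speed))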
        
           | ra7 wrote:
           | Fully autonomous cars are already a reality with Waymo in AZ
           | and AutoX, Baidu in China. I don't know how safe the Chinese
           | companies are, but Waymo's safety record [1] is nothing short
           | of stellar.
           | 
           | [1] https://waymo.com/safety
        
             | ocdtrekkie wrote:
             | Waymo selected the one state willing to entirely remove any
             | safety reporting requirements for self-driving cars as the
             | place to launch their service. Regardless of what they
             | _claim_ to the contrary, if they had confidence in their
               | safety record, they would've launched it in California,
             | not Arizona.
             | 
             | Waymo has lied about the capabilities of their technology
             | regularly, and for that reason alone, should be assumed
             | unsafe. A former employee expressed disappointment they
             | weren't the first self-driving car company to kill someone,
             | because that meant they were behind.
        
               | ra7 wrote:
               | > Regardless of what they claim to the contrary, if they
               | had confidence in their safety record, they would've
               | launched it in California, not Arizona.
               | 
               | California only months ago opened up permits for paid
               | robotaxi rides. So no, they couldn't have launched it in
               | CA. If you've noticed, they actually are testing in SF
               | with a permit.
               | 
               | > Waymo has lied about the capabilities of their
               | technology regularly, and for that reason alone, should
               | be assumed unsafe.
               | 
               | What lies? Their CA disengagement miles are for everyone
               | to see, their safety report is open, they have had 0
               | fatalities in their years of operation. Seems like you
               | just made this up.
        
               | dragonwriter wrote:
               | > California only months ago opened up permits for paid
               | robotaxi rides. So no, they couldn't have launched it in
               | CA.
               | 
               | Well, yeah, that's the logic of an established business.
               | Disruptive startups flout laws rather than following
               | them.
        
               | ocdtrekkie wrote:
               | I recall a particular incident where Waymo was marketing
               | their car being able to drive a blind man to a drive-
               | thru, way before the thing could safely drive more than a
               | mile on its own. My understanding is that in 2021, it
               | still can't navigate parking lots (which would preclude
               | using it for drive-thrus).
               | 
               | Later, they were talking about how sophisticated their
               | technology was: It can detect the hand signals of someone
               | directing traffic in the middle of an intersection. Funny
               | that a few months later, a journalist got an admission
               | out of a Waymo engineer that the car wouldn't even stop
               | at a stoplight unless the stoplight was explicitly mapped
               | (with centimeter-level precision) so the car knew to look
               | for it and where to look for the signal.
               | 
               | https://www.technologyreview.com/2014/08/28/171520/hidden-ob...
               | 
               | The article is seven years old at this point, but it's
               | also incredibly humbling in how much bull- Waymo puts
               | out, especially compared to the impressions their
               | marketing team puts out. (Urmson's son presumably has a
               | driver's license by now.)
               | 
               | In at least one scenario, the former Waymo engineer,
               | upset he had failed to kill anyone yet ("I'm pissed we
               | didn't have the first death"), caused a hit-and-run
               | accident with a Waymo car and didn't report it to the
               | authorities, amongst other serious incidents:
               | https://www.salon.com/2018/10/16/googles-self-driving-cars-i...
               | Said star Waymo engineer eventually went to
               | prison for stealing trade secrets and then got pardoned
               | by Donald Trump. Google didn't fire him for trying to
               | kill people, they only really got upset with him because
               | he took their tech to Uber.
               | 
               | I'd say Waymo has a storied history of dishonesty and
               | coverups, behind a technology that's more or less a
               | remote control car that only runs in a narrow group of
               | carefully premapped streets.
        
               | ra7 wrote:
               | > I recall a particular incident where Waymo was
               | marketing their car being able to drive a blind man to a
               | drive-thru, way before the thing could safely drive more
               | than a mile on it's own.
               | 
               | How is a marketing video from 2015 relevant to their
               | safety record? They weren't even operating a public
               | robotaxi service back then.
               | 
               | > My understanding is that in 2021, it still can't
               | navigate parking lots (which would preclude using it for
               | drive-thrus).
               | 
               | Completely false. Here is one navigating a Costco parking
               | lot (can't get any busier than that) [1]. If you watch
               | any videos in that YouTube channel, it picks you up and
               | drops you off right from the parking lot. Yes, you can't
               | use it for drive-thrus, but it doesn't qualify as "lying
               | about capabilities".
               | 
               | > Later, they were talking about how sophisticated their
               | technology was: It can detect the hand signals of someone
               | directing traffic in the middle of an intersection. Funny
               | that a few months later, a journalist got an admission
               | out of a Waymo engineer that the car wouldn't even stop
               | at a stoplight unless the stoplight was explicitly mapped
               | (with centimeter-level precision) so the car knew to look
               | for it and where to look for the signal.
               | 
               | Here is one recognizing a handheld stop sign from a
               | police officer while it stopped for an emergency vehicle
               | [2].
               | 
               | [1] https://www.youtube.com/watch?v=p5CXcJD3mcU
               | 
               | [2] https://www.youtube.com/watch?v=MpDbX1FViWk&t=75s
        
               | nradov wrote:
               | The workers doing road repairs in my neighborhood don't
               | even use handheld stop signs. Just vague and confusing
               | gestures.
        
               | ra7 wrote:
               | I think in those cases a Waymo vehicle would probably
               | require remote assistance. It's a really difficult
               | scenario for a computer to make sense of.
        
             | ghaff wrote:
             | Good for Waymo, and hopefully Google keeps up this science
             | project. But it's a very limited environment, almost as
             | perfect as you could have outside of a controlled test
             | area. Those who were saying L4/5 would be at least decades
             | away seem to have been on the right track. Kids growing up
             | today are going to have to learn to drive.
        
               | ra7 wrote:
               | L5 may be decades away. I think we will see L4 in some
               | major metro areas in the US by end of this decade. SF is
               | heating up with Cruise and Waymo's heavy testing. Their
               | progress will be a great indicator for true city driving.
        
               | ghaff wrote:
               | >we will see L4 in some major metro areas in the US by
               | end of this decade
               | 
               | I think you're far more likely to see L4 on limited
               | access highways in good weather. A robotaxi service in a
               | major city seems much more problematic given all the
               | random behavior by other cars, pedestrians, cyclists,
               | etc. and picking up/dropping off people in the fairly
               | random ways that taxis/Ubers do. (And you'll rightly be
               | shut down 6 months for an investigation the first time
               | you run over someone even if they weren't crossing at a
               | crosswalk.)
               | 
               | And for many people, including myself, automated highway
               | driving would actually be a much bigger win than urban
               | taxi rides which I rarely have a need for.
        
           | andreilys wrote:
           |  _which many more people will be killed on the road in the
           | name of technological progress._
           | 
           | Seeing as car crashes are the leading cause of deaths from
           | people aged 1-54, it may be an improvement from the status
           | quo
           | 
           |  _More than 38,000 people die every year in crashes on U.S.
           | roadways. The U.S. traffic fatality rate is 12.4 deaths per
           | 100,000 inhabitants. An additional 4.4 million are injured
           | seriously enough to require medical attention. Road crashes
           | are the leading cause of death in the U.S. for people aged
           | 1-54._
        
             | ac29 wrote:
             | > Road crashes are the leading cause of death in the U.S.
             | for people aged 1-54
             | 
             | This isn't true according to the CDC. Cancer and heart
             | disease lead for the 44-54 group, and while "accidental
             | injury" does lead from 1-44, if you break down the data, in
             | many cases vehicle-based accidents are not the largest
             | single source. For example:
             | 
             | Drowning is the largest single cause in 1-4
             | 
             | Cancer is the largest single cause in 5-9
             | 
             | Suicide is the largest single cause 10-14
             | 
             | https://wisqars-viz.cdc.gov:8006/lcd/home
        
             | hn8788 wrote:
             | I'd say it depends on how many of those deaths are caused
             | by the driver doing something unsafe. I'd be more
             | comfortable with higher traffic deaths that primarily
             | affect bad drivers than a lower number of deaths randomly
             | spread across all drivers by a blackbox algorithm.
        
               | _ph_ wrote:
               | If you are texting while driving and hit a stopped car or
               | run a red light, you are very likely to kill others.
               | Actually more likely than yourself, as a side impact is
               | more dangerous than a frontal one.
        
               | jazzyjackson wrote:
               | But the car doesn't need to drive itself to avoid those
               | factors; it just needs to have radar auto braking.
        
       | kube-system wrote:
       | There are a lot of good points here in the comments already about
       | the relative safety of Tesla's system compared to other vehicles
       | and other automated driving system -- and I think they're
       | probably right.
       | 
       | The differentiating issue with Tesla's system is the way it is
       | sold and marketed. Important operational safety information
       | shouldn't be hidden in fine print. Subtly misleading marketing
       | has unfortunately become acceptable in our culture, but this idea
       | needs to stay out of safety-critical systems.
       | 
       | We need a mandate for clear and standardized labelling for these
       | features, a la the Monroney sticker. All manufacturers should
       | have to label and market their cars with something like SAE
       | J3016.
       | https://www.sae.org/binaries/content/gallery/cm/articles/pre...
        
       | kemiller wrote:
       | OK people. There have been a grand total of 11 cases in 2.5
       | years. NHTSA investigates a lot of things. How many regular
       | drivers collided with emergency vehicles in the same time frame?
        
         | jdavis703 wrote:
         | The FBI has stats on police deaths by type of death. If memory
         | serves correctly slightly more cops were killed in traffic
         | crashes than that.
         | 
         | However, I'm assuming the crashes were quite varied: anything
         | from a driver recklessly fleeing a stop to some drunk crashing
         | into a cop on the highway shoulder. Most likely these deaths
         | didn't have a systematic pattern to them that could be
         | prevented if only we knew what the root cause was.
        
         | [deleted]
        
       | kelvin0 wrote:
       | Having human drivers and assisted drivers on the same road is
       | problematic currently.
       | 
       | I think the best situation would be to have 'automated' stretches
       | of highway specially designed to 'help' self driving systems.
       | 
       | Only self driving vehicles would be allowed on such special
       | highways, and everything would be built around such systems.
        
         | ghaff wrote:
         | Who is going to pay for these dedicated stretches of highway
         | that only, presumably, relatively wealthy owners of self-
         | driving cars are going to be allowed to use?
        
           | kelvin0 wrote:
           | Any entity (individual or corporate) could use it, of course,
           | rich or not, since electric vehicles such as buses could be a
           | public form of transportation on these specially adapted
           | roads.
        
         | SCNP wrote:
         | Let me preface by saying that I hold no strong opinions on this
         | matter and my comments are purely speculative.
         | 
         | This is kind of a position I've held for a long time, though
         | about a different aspect of the problem. I think a system
         | similar to IFF in aircraft would solve all of these issues. If
         | every car knew where every other car was at all times, you
         | could easily devise a system that would be nearly flawless.
         | The issue is, there is no incremental path to this solution.
         | You would essentially have to start over with the existing
         | transportation network.
        
           | [deleted]
        
           | mattnewton wrote:
           | The problem is that you don't just need to know about every
           | other vehicle, you still need all the perceptual stuff for
           | pedestrians, bikers, baby carriages, trash, road closures,
           | traffic cops in the middle of the road, etc. All those things
           | are arguably harder to detect reliably than a somewhat
           | standard sized box of metal with pairs of lights in the front
           | and back. I think shooting for superhuman perception of all
           | these things is still where Tesla is failing.
        
             | SCNP wrote:
             | True. I guess I was thinking that if you build totally new
             | infrastructure for these new overhauled cars, you'd keep it
             | completely separate from other modes of transportation. My
             | sci-fi inclinations had me imagining tubes like Logan's
             | Run.
        
       ___________________________________________________________________
       (page generated 2021-08-16 23:01 UTC)