[HN Gopher] US moves closer to recalling Tesla's self-driving so...
___________________________________________________________________
US moves closer to recalling Tesla's self-driving software
Author : heavyset_go
Score : 233 points
Date : 2022-06-11 18:39 UTC (4 hours ago)
(HTM) web link (fortune.com)
(TXT) w3m dump (fortune.com)
| musesum wrote:
 | My use case was Autopilot steering me off the left-hand side of
 | the road at 80 MPH during a sunset [1]. My (WAG) guess is that
 | the ML corpus was based on a Model 3, while the Model Y I was
 | driving had its sensors placed slightly higher? I dunno; again,
 | 'tis a WAG.
|
 | I am a huge fan of Tesla's premise. My Dad sold Vanguard Citicars
| [2], and I worked on GM's interactive ad for the EV-1. I put a
| down payment on Aptera's original EV and again on the resurrected
| version.
|
 | But, I backed out of buying a Tesla. Even though I disabled
 | Autopilot and ran purely on cruise control, the Model Y would
 | brake for phantom obstacles, such as low rises in the road, or
 | when passing an 18-wheeler on the left. Driving from AZ to CA,
 | the phantom braking occurred every 5 miles or so, so I had to
 | drive 800 miles with a manual accelerator. Bummer!
|
| [1] https://news.ycombinator.com/item?id=31504583
|
| [2] https://en.wikipedia.org/wiki/Citicar
| chrisseaton wrote:
| > My (WAG) guess
|
| Wives-and-girlfriends?
| matheist wrote:
| Wild-ass guess
| chrisseaton wrote:
| Wild-ass guess guess?
| fragmede wrote:
| Yes, like when people say PIN number or ATM machine. Or
| HIV virus or LCD display.
|
| https://en.wikipedia.org/wiki/RAS_syndrome
| hyperbovine wrote:
 | Thank God. I for one never opted into this beta test, yet I'm
 | subjected to it every time I get on the road.
| MBCook wrote:
| This, more than anything, is what worries me. I do not own a
| Tesla. I didn't agree to anything. I'm not choosing to enable
| auto-pilot (or FSD).
|
| And yet my life is/may be in danger because of it.
|
| No other car manufacturer has ever done anything like this as
| far as I know.
| epgui wrote:
| This is not about FSD.
| robonerd wrote:
| Hyperbovine didn't mention FSD. Let me spell it out for you:
 | Autopilot is beta-grade software _at best._ It drives
| straight into parked trucks.
| KennyBlanken wrote:
| It routinely drives straight into parked emergency vehicles
| displaying a blaze of flashing lights.
| md2020 wrote:
| "Routinely" isn't even close to the truth.
| epgui wrote:
 | The company doesn't consider Autopilot to be in beta, but
 | it does label FSD that way.
| robonerd wrote:
| > _The company doesn't consider autopilot to be in beta_
|
| That only makes it even worse.
| epgui wrote:
| I think the question of whether something is a beta is
| different than whether anyone thinks it ought to be.
| There's a lot of "non-beta", but bad, software.
| robonerd wrote:
| Tesla thinking this software is production ready shows a
| complete disconnect from either reality or common
| morality.
| Barrin92 wrote:
| the difference being that most of that software doesn't
| move two tons of steel through traffic at lethal
 | velocities, so maybe accurate terminology is appropriate
| here.
| [deleted]
| md2020 wrote:
 | What about other driving assistance systems that claim to do
 | the same thing as Autopilot? My 2022 Hyundai Elantra has a
 | system that drives itself on highways. GM Super/UltraCruise
 | is available on highways. Is NHTSA gonna investigate them
 | too? Presumably they're no farther along in self-driving
 | than Tesla is.
| Vladimof wrote:
| It doesn't help that Musk tries to create confusion by naming
| things something they're not. Next thing he'll implement
| might be the Flying Car Update... which will turn lights
 | green as you drive by, using the 14 Hz infrared signal.
| [deleted]
| freeflight wrote:
| With the way Tesla markets autopilot [0] it's really no
 | surprise people use the two terms as interchangeably as
 | Tesla itself tends to do.
|
| [0] https://www.bbc.com/news/technology-53418069
| bagels wrote:
 | The autopilot page used to say "full self driving
 | hardware". They left out the distinction that the "full
 | self driving software" is not done yet.
| [deleted]
| chromejs10 wrote:
| One thing I don't quite understand is that many cars have
| autopilot-like software. Unlike tesla's which constantly requires
| putting pressure on the wheel to show you're there and paying
| attention, Ford's lets you drive indefinitely without any hands
| on the wheel. Wouldn't this same investigation be put onto other
| manufactures as a giant audit? Hitting emergency vehicles is
| obviously bad but 1) it happens in cars without autopilot and 2)
| if you're hitting one you're clearly no paying attention. it's
| not like they just appear out of no where
| donw wrote:
| The ruling class doesn't like Elon Musk.
| bigbluedots wrote:
| Seriously? If money is power, Musk _is_ the ruling class.
| heavyset_go wrote:
| > _On Thursday, NHTSA said it had discovered in 16 separate
| instances when this occurred that Autopilot "aborted vehicle
| control less than one second prior to the first impact,"
| suggesting the driver was not prepared to assume full control
| over the vehicle._
|
| > _CEO Elon Musk has often claimed that accidents cannot be the
| fault of the company, as data it extracted invariably showed
| Autopilot was not active in the moment of the collision._
|
| > _While anything that might indicate the system was designed to
| shut off when it sensed an imminent accident might damage Tesla's
| image, legally the company would be a difficult target._
| 01100011 wrote:
| > legally the company would be a difficult target
|
| Is this the case?
|
 | Look at the Toyota "Acceleration Gate" fiasco. Toyota paid
 | billions (the full amount isn't disclosed due to NDAs in
 | hundreds of cases) because of poor software. If Tesla engineers
 | failed to adhere to industry best practices (hard to do when
 | you're using deep learning in your control loop) then they'll
 | likely be liable.
|
| An overview of the Toyota scandal:
| https://users.ece.cmu.edu/~koopman/pubs/koopman14_toyota_ua_...
| kylecordes wrote:
| I remember reading that at the time, and just looked at the
| slides again.
|
 | A bug which commands wide-open throttle therefore also
 | depletes engine vacuum, leading to a lack of power assist if
 | the brakes are pressed, released, and pressed again.
|
| Drivers who have been around the block a bit and have some
| mechanical sympathy would take a moment to realize neutral
| will resolve the situation. But many other drivers would not
| realize this.
|
| Although it would not absolve Toyota from responsibility in
| such a case, I wish driver training required and tested for
| dealing with a long list of adverse situations within N
| seconds each.
| lawn wrote:
| So what happens is
|
| 1. Speed straight towards your doom.
|
| 2. Give back control a moment before collision, when it
| realizes it fucked up.
|
| 3. "It was the driver's fault"
| epgui wrote:
| If you read the actual report, you will see that NHTSA
| actually acknowledges that autopilot makes things safer
| overall, and they agree that crashes are due to misuse of the
| software. They highlight that on average, when there was a
| crash, the driver should have been able to recognize the
| hazard ~8 seconds in advance.
| lazide wrote:
| The challenge of course is what level of misuse is to be
| expected (especially once it becomes more widely
| available), and if using the software with normal levels of
| misuse results in an overall safer level of driving than
| without it.
| epgui wrote:
| It appears to be so: misuse is included in the overall
| safety assessment.
|
| But this is exactly the position of NHTSA in the report.
| They say that if Tesla can reasonably do something to
| mitigate misuse, then they should. That's what this is
| about.
| lazide wrote:
| My point is:
|
| 1) This is currently limited to a pretty restricted
| market (people who drive Teslas, which is a small subset
| of car drivers). When used more widely by a wider set of
| people, it may have a worse track record.
|
 | 2) It isn't clear that there are reasonable mitigations
 | for such misuse that would keep the system attractive
 | to use.
|
| But it's all speculation at this point.
| CSMastermind wrote:
| > If you read the actual report, you will see that NHTSA
| actually acknowledges that autopilot makes things safer
| overall
|
| Where does it say that in the report?
|
| > and they agree that crashes are due to misuse of the
| software
|
| I also do not see this in the report other than
| acknowledging that is Tesla's stance. In fact, they
| explicitly reject that as a defense, to quote:
|
| "A driver's use or misuse of vehicle components, or
| operation of a vehicle in an unintended manner does not
| necessarily preclude a system defect. This is particularly
| the case if the driver behavior in question is foreseeable
| in light of the system's design or operation."
|
| > They highlight that on average, when there was a crash,
| the driver should have been able to recognize the hazard ~8
| seconds in advance
|
| That's simply not what the report says. Here's the passage
| you're referencing:
|
| "Where incident video was available, the approach to the
| first responder scene would have been visible to the driver
| an average of 8 seconds leading up to impact."
|
| _Except_ you're leaving out some extremely important
| context.
|
| 1. This sentence is part of the expository section of the
 | report; it's in no way "highlighted".
|
| 2. This is referring _only_ to 11 specific collisions in
| which Teslas struck first responders, not the other 191
| collisions they examined.
|
| I've seen some other things you've said in this thread so I
| feel like I should address those as well.
|
| > this is only autopilot not FSD.
|
| To quote the report:
|
| "Each of these crashes involved a report of a Tesla vehicle
| operating one of its Autopilot versions (Autopilot or Full-
| Self Driving, or associated Tesla features such as Traffic-
| Aware Cruise Control, Autosteer, Navigate on Autopilot, and
| Auto Lane Change)."
|
| > this will be fixed by an over the airwaves update.
|
 | I see nothing in the report that says this. In fact, as far
 | as I can tell from reading it, the report says they're
| upgrading and escalating their investigation because of the
| seriousness of the concerns.
|
| Here's the report's conclusion in its entirety:
|
| "Accordingly, PE21-020 is upgraded to an Engineering
| Analysis to extend the existing crash analysis, evaluate
| additional data sets, perform vehicle evaluations, and to
| explore the degree to which Autopilot and associated Tesla
| systems may exacerbate human factors or behavioral safety
| risks by undermining the effectiveness of the driver's
| supervision. In doing so, NHTSA plans to continue its
| assessment of vehicle control authority, driver engagement
| technologies, and related human factors considerations."
|
| Actual report here for those curious:
| https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
|
 | And for those who don't, here's the best summary I can
| provide.
|
| * The government received 11 reports of Teslas striking
| first responders with autopilot engaged.
|
| * After an initial investigation of these reports they
| deemed them serious enough to open a broader investigation
| into the technology.
|
| * They found 191 cases where vehicles using the technology
| crashed unexpectedly.
|
 | * Of those, 85 were dismissed because of external factors or
 | lack of data (leaving 106).
 |
 | * Of those, "approximately half" could be attributed to
 | drivers not paying attention, not having their hands on the
 | wheel, or not responding to vehicle attention prompts.
 |
 | * Of the 106, "approximately a quarter" could be attributed
 | to operating the system in environments where it's not
 | fully supported, such as rain, snow, or ice.
 |
 | * That leaves 37 cases where it appears the cars
 | malfunctioned, neither prompting the driver nor taking
 | preventive action like braking until it was too late to
 | avoid the crash, or not at all.
|
| * As a result, the NHTSA is escalating their investigation
| to do an engineering analysis of these systems looking to
| better understand the defects.
| [deleted]
| zozbot234 wrote:
| Plot twist: it's always the driver's fault. Autopilot is
| glorified cruise control (the technical term is 'Level 2' on
| the SAE scale). It literally doesn't matter when Autopilot
| bails out altogether (even if it manages to automatically
| avert a crash on other occasions), the driver _must_ be fully
| in control at all times. Would you trust your cruise control
| to prevent a collision? Of course not, that would be crazy.
 | It's not what it's built for. What a worthless discussion.
| Sakos wrote:
| It seems so weird to me how people so vehemently defend
| Tesla instead of holding them to higher standards. What do
| you owe them and why?
| smokey_circles wrote:
| Well when you wrap your identity around someone else's
| idea, you get weird outputs at the edge cases
| abawany wrote:
 | Some might also be defending their stock portfolios,
 | 401(k)s, and future Tesla purchases built on TSLA price
 | appreciation.
| robonerd wrote:
| > _the driver must be fully in control at all times_
|
| Yes, that's precisely what the manual said last time I
| checked. But it's not how Elon Musk has presented it, nor
| how innumerable Tesla drivers treat it.
| cma wrote:
 | Tesla says in the video here that the driver is only there
 | because of regulators, not for ethical reasons:
|
| https://www.tesla.com/autopilot
|
 | This video is from 2018 or so, I think, and they pretended
| the driver wasn't needed.
| [deleted]
| legalcorrection wrote:
 | The problem is that people can't necessarily take over
 | fast enough. If you're not actively driving, then you
 | can't just immediately react.
| zozbot234 wrote:
| Sure, but what does that have to do with Autopilot?
| "Driver gets into a crash because they were spaced out
| and not actually paying attention to the road" is
| something that happens all the time, no need for modern
| ADAS.
| epgui wrote:
| Exactly, the most important metric/question is whether it
| improves overall safety compared to the baseline.
| wolverine876 wrote:
| That's the metric Tesla and the self-driving car industry
| would like to use, but I don't see why it's the most
| important. Also, it misconstrues 'most important' to mean
| 'all that's important'.
|
| We could just ban personal vehicles to improve overall
| safety.
| Spivak wrote:
| Because "drivers space out when using a system that
| removes the stimulus of driving that keeps them alert" is
| not solely the fault of the individual when it's not
| possible in real world scenarios to use it safely.
| googlryas wrote:
| But you're supposed to be actively driving with
| autopilot.
|
| I use autopilot every now and then, and when I do, I
| literally keep my hands on the wheel in a driving
| position, and my foot resting lightly on the pedal.
| legalcorrection wrote:
| You by definition can't be actively driving. You're just
| watching. It's active driving that keeps you alert in the
| first place.
| epgui wrote:
| The NHTSA report says that on average, an attentive
| driver would have been able to react 8 seconds before the
| crash. 8 seconds is a lot of time on the highway (where
| autopilot can be used).
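 |
 | For a sense of scale (simple arithmetic; the 70 mph figure
 | is my own assumption for illustration, not from the report):
 |
 |     mph = 70                          # assumed highway speed
 |     m_per_s = mph * 1609.344 / 3600   # ~31.3 m/s
 |     print(f"{8 * m_per_s:.0f} m")     # ~250 m covered in 8 s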
| freeflight wrote:
| Yet the autopilot system chose to only warn the drivers 1
| second before impact, which is basically no time at all
| to react to anything.
|
| So if those 8 seconds of "autopilot ain't too sure"
| actually exist, then the car needs to signal that to the
| driver so the driver can be ready to take over.
|
 | The fact that it still doesn't do that speaks volumes
 | about the kind of "progress" Tesla seems to be making,
 | none of it good.
| zozbot234 wrote:
| The crashes were not of a kind that Autopilot could have
| detected earlier. Several involved stationary objects,
 | which Autopilot seems to intentionally ignore because it
 | can't tell what's actually in the way of the car apart
| from harmless features of the surroundings that will
| never cause a collision. Which just goes to prove:
| Autopilot = glorified cruise control. You must never
| expect it to keep you safe.
| chrisseaton wrote:
| > Autopilot = glorified cruise control
|
| Isn't that what the name implies? An autopilot keeps
| heading and speed.
| lawn wrote:
| The whole concept is flawed.
|
 | We cannot expect humans to pay attention and be ready to
 | take over at a moment's notice at all times, simply
 | because that's not how our brains work.
|
 | Autopilot is in the danger zone: good enough to make your
 | brain relax, but bad enough that it still needs your brain
 | not to. So it's fundamentally unsafe.
|
 | Cruise control, in contrast, isn't good enough, so your
 | brain has to keep paying attention; otherwise you'll crash
 | very quickly.
|
| And this is all made much, much worse by Elon's and Tesla's
| irresponsible marketing, and the believers who dismiss this
| as a "worthless discussion".
| zozbot234 wrote:
| > We cannot expect humans to be able to pay attention and
| be able to take over at a moment's notice at all times
|
| And yet, this is what ordinary driving requires already.
| It's not a new requirement. The whole point of this
 | investigation is to figure out whether an SAE Level 2
 | ADAS really makes distracted driving more
| likely. The jury is still out on that, and even if it
| does it would be reasonably easy to cope with the problem
| by tuning the system to alert the driver more frequently,
| especially ahead of impending danger (such as in low
| visibility conditions, which are expressly mentioned as a
| key factor in these crashes).
| bbarnett wrote:
| _And yet, this is what ordinary driving requires already_
|
| No, it's not. Not at all.
| heretogetout wrote:
| Exactly as predicted by many, many skeptics. Of course that's
| what his statement meant.
| Bellend wrote:
| lol @ anything Elon Musk. Did he not say last week he was sending
| 1000 Starships to Mars? What a fucking tosser.
| faebi wrote:
 | I think too much focus at Tesla is on the FSD beta while barely
 | anything trickles down to standard cars. They need more
 | developers who bring new features down and make them ready for
 | a worldwide rollout. The best symptom of this is the horrible
 | state of Autopilot in Europe. Or how their automatic high beams
 | are just outright bad even though they have all the tools to
 | fix them with a nice update.
| xaduha wrote:
 | The lidar approach just has to be the winner in my opinion;
 | simpler is better.
| nemothekid wrote:
| LIDAR isn't simpler.
| xaduha wrote:
| Why not? Sure it's all complicated enough, but conceptually
| measuring distance with a laser is simple.
|
| Do that many, many times and you have a good enough picture
| of the surrounding area in a machine-readable 3d format, so
| you have a good starting point.
|
| How is that not simpler than what Tesla is doing?
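 |
 | Conceptually it's just polar-to-Cartesian conversion, done
 | millions of times per second. A minimal sketch in Python
 | (illustrative only, not any vendor's actual pipeline; the
 | names are made up):
 |
 |     import math
 |
 |     def lidar_to_points(samples):
 |         # samples: (range_m, azimuth_rad, elevation_rad) per return
 |         points = []
 |         for r, az, el in samples:
 |             x = r * math.cos(el) * math.cos(az)
 |             y = r * math.cos(el) * math.sin(az)
 |             z = r * math.sin(el)
 |             points.append((x, y, z))
 |         return points
 |
 |     # Two returns from one sweep of the surroundings:
 |     print(lidar_to_points([(12.3, 0.00, 0.0), (12.3, 0.01, 0.0)]))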
| Zigurd wrote:
| Lidar is lo-res, though it has other advantages. Sensor fusion
| is also not without complexity and problems. But camera-only
| systems seem like one of those bets that hangs out there for
| too long before it will have to be walked back, leaving a lot
| of customers with stranded technology and unmet expectations.
| keewee7 wrote:
 | A future with high-powered infrared lasers on top of every car
| will probably lead to a mysterious rise in people slowly losing
| their vision. Someone should do a study of the eyes and vision
| of people who work with LIDAR.
|
| Why not instead use multiple radars (we already put 1-2 radars
 | in cars with adaptive cruise control) to augment Tesla's
 | vision-based approach?
| kibwen wrote:
| According to this article, lidar using eye-safe lasers is
| standard: https://photonicsreport.com/blog/is-lidar-
| dangerous-for-our-...
|
| _" If a lidar system ever uses a laser with a higher safety
| class, then it could pose a serious hazard for your eyes.
| However, as it stands, according to current information and
| practices, lidar manufacturers use class 1 eye-safe (near)
| infrared laser diodes."_
| bbarnett wrote:
 | And what if you are hit in the eyes with the output of 20
| of them at once?
|
| Like on a busy highway.
|
| What if you are hit all day long, for hours, every single
| day?
|
| History is replete with studies which were used to presume
| different exposures would be fine.
| coolspot wrote:
 | You're right that it needs further study, so it doesn't
 | end up like asbestos.
 |
 | On a side note, your eyes are hit with more powerful and
 | full-spectrum radiation (from IR to UV) from a burning
 | star every day for hours (or at least they are designed
 | to cope with that).
| KennyBlanken wrote:
| You sound like the kind of internet commenter who whinges
| about how dangerous bicycles are on the road because
| someone in a car might swerve to avoid them and head-on
| collide with a tractor trailer.
| Misdicorl wrote:
| The math makes this quite implausible for a random
| scenario. Perhaps a toll collector could introduce the
| right systematics to make the scenario not ridiculous on
 | its face.
|
 | Imagine pointing a laser at something the size of an
 | eyeball more than five feet away, and then getting another
 | laser to point at the same spot at the same time. And
 | then another five. And those lasers are all attached to
 | moving cars, and the eyeball is moving too...
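 |
 | Back-of-the-envelope (treating each beam as randomly aimed,
 | which is of course a big simplification, and using rough
 | assumed numbers for pupil size and distance):
 |
 |     import math
 |
 |     pupil_area = math.pi * 0.003 ** 2       # ~6 mm pupil, in m^2
 |     distance = 1.5                          # "five feet", in m
 |     sphere = 4 * math.pi * distance ** 2    # area a beam could land on
 |
 |     p_one = pupil_area / sphere             # one random beam hits
 |     print(f"one beam: {p_one:.0e}")         # ~1e-06
 |     print(f"20 at once: {p_one ** 20:.0e}") # ~1e-120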
| SoftTalker wrote:
| Put an IR filter in the car window glass?
| md2020 wrote:
| Simpler is better, so you want to add another sensor that
| humans don't have, to do a task that only humans can currently
| do safely? This makes no sense to me.
| xaduha wrote:
| ryukoposting wrote:
| FMCW radar is more tenable for automotive applications, in my
| opinion. No lasers necessary.
| IshKebab wrote:
| Way too basic. They don't even produce an image.
| chrinic3839 wrote:
| It's a shame. I really wanted Tesla to succeed.
|
| But as time goes on, their products become less impressive and
| their CEO is not helping the brand.
| mise_en_place wrote:
| TSLA will succeed, if you believe the hypothesis that oil
| prices will continue to climb to the point that many buyers are
| priced out of that market and switch to electric vehicles. It's
 | screwed if the alternate hypothesis is true (that is, we
| continue to use oil forever).
| SoftTalker wrote:
 | Tesla succeeded in showing the world that an electric car can be
| something more than a glorified golf cart. That is frankly a
| huge achievement, compared to what came before. EVs are now the
| centerpiece of every manufacturer's plans for the coming
| decades, and that is because of Tesla. Whether they succeed as
| a brand is not really important anymore.
| robonerd wrote:
| They could turn this around if they gave up on the futurologist
 | software bullshit and strove to simply sell battery-powered
| cars. Stop selling dreams of robotaxis, just sell cars.
| bbarnett wrote:
| Or maybe use radar or real sensors, again.
| nawgz wrote:
| That doesn't sound like it would lead back to obscene
| valuations decoupled from today's reality
| extheat wrote:
 | Yes, become a boring company nobody cares about, like the rest
| of the car industry. Nobody should try and innovate with
| futuristic products and become bureaucratic behemoths like
| the government, which has an excellent track record of
| success.
| freeflight wrote:
| Having your paying customers act as alpha testers, for
| software that can potentially injure and kill other people,
| is imho not exactly the kind of "innovation" we should
| strive for.
|
 | Btw, weird dig at "the government's success", considering a
| whole lot of Musk's ventures wouldn't be around without all
| kinds of government subsidies.
| a9h74j wrote:
| MFAKP?
| rurp wrote:
 | For all of its flaws the US govt _does_ have a remarkable
 | track record. It has presided over one of the most stable
 | and productive societies in history. Sure it hasn't always
| been great, and the future outlook isn't looking so strong
| at the moment, but it has been wildly successful for the
| past few centuries.
| ken47 wrote:
| Some form of Tesla would exist under this future. TSLA the
| stock might not.
| rtlfe wrote:
| That sounds better for basically everybody except Elon.
| t_mann wrote:
| > CEO Elon Musk has often claimed that accidents cannot be the
| fault of the company, as data it extracted invariably showed
| Autopilot was not active in the moment of the collision.
|
| > NHTSA said it had discovered in 16 separate instances when this
| occurred that Autopilot "aborted vehicle control less than one
| second prior to the first impact," suggesting the driver was not
| prepared to assume full control over the vehicle.
|
| Ouch.
| gnicholas wrote:
| Would this mean that Tesla would have to provide refunds for
| customers who purchased FSD? If so, would they pro-rate the
 | refund, as is customary with lemon law refunds? On the
| flip side, could they be required to inflation-adjust the
| refunds, so that customers get back the same purchasing power
| that they spent?
|
| Separately, would this prevent Tesla from having a beta tester
| group that tries out FSD features at no cost?
| [deleted]
| the8472 wrote:
| They can still do the shadow driver thing for testing.
| epgui wrote:
| No, this is about autopilot, not FSD. And the actions being
 | considered would involve only an over-the-air update of the
| software.
| sva_ wrote:
| What a shitty website, hacking the back button to confirm if I
| want to leave.
| Beltiras wrote:
| I bought FSD version of Model 3 in the spring of 2020. I'm still
| waiting for what I purchased to be turned on. Frankly, I'd be
| psyched if they would expand on the little things instead of the
| big things. "Park the car in that spot". "Exit the parking spot".
| These would be worth the price I paid.
| toast0 wrote:
 | I don't see why you don't return the car; it doesn't have what
| you paid for.
| TulliusCicero wrote:
| The fact that we don't have reliable, fully automated parking
| yet is bizarre. I'd love a solution that automatically parallel
| parks for me, and a computer with sensors around the vehicle
| should be able to do a better job. Plus, the problem is of
| limited scope, and low speed, so you don't have to deal with
| most of the potential issues and dangers with full self driving
| on a road.
| kwhitefoot wrote:
| My 2015 Model S does parallel parking between two other cars
| quite well. It will also reverse into the space between two
| cars parked perpendicular to the road.
| causality0 wrote:
| Ford does that with Active Park Assist.
| lima wrote:
| Mercedes too, here's a demo:
| https://www.youtube.com/watch?v=hz4Bm2jui-c
| 331c8c71 wrote:
| Also Peugeot/Citroen and VW/Audi. Probably all major car
| manufacturers have it by now?
| akmarinov wrote:
| BMWs have had that for quite a while. My 2016 5 series does
| it perfectly 100% of the time. It even gets into some spots
 | that I'd consider way too short; it keeps surprising me.
|
| I use it all the time, it's quick and easy.
| pclmulqdq wrote:
| Tesla doesn't have it. Many other automakers do offer that
| functionality on their luxury lines.
| TulliusCicero wrote:
| This is weird to me because there's another comment reply
| to me saying that their Model S has this functionality.
|
| So it looks like it's more common than I thought, but also
| of mixed availability/awareness?
| causality0 wrote:
| _I bought FSD version of Model 3 in the spring of 2020_
|
| I don't understand how people keep falling for this. Sure, it
| seemed realistic enough at first but how many cracks in the
| facade are too many to ignore? In 2015 Musk said two years. In
| 2016 he said two years. In 2017 he said two years. Tesla did a
| 180 on the _fundamental requirements_ of FSD and decided it
 | doesn't need lidar just because they had a falling-out with
| their lidar supplier. That level of ego-driven horseshit is
| _dangerous_.
| bagels wrote:
| It's still not actually available to people who paid for it,
| and it doesn't actually work* (at least not to a degree where
 | it can be trusted not to crash into street signs or parked
| cars). I have no idea why anyone pays a $10k premium for
| vaporware.
| tonguez wrote:
| leobg wrote:
 | They had a falling out with Mobileye, who provided the AP
 | 1.0 hardware. It never used LIDAR. Tesla doesn't use LIDAR
 | because you need to solve vision anyway. And once you solve
 | that, LIDAR makes no more sense.
 |
 | (You need to solve vision anyway, because for that object,
 | which LIDAR tells you is exactly 12.327 ft away, you still
 | need to figure out whether it is a trashcan or a child. And
 | if it is a child, whether it is about to jump onto the road
 | or is walking away from the road. LIDAR does not tell you
 | these things. It can only tell you how far away things are.
 | It is not some magical sensor which Tesla is just too cheap
 | to employ.)
| xaduha wrote:
| > you still need to figure out whether it is a trashcan or
| a child
|
| Why? First of all don't drive into anything that moves,
| period. Secondly, if you can avoid bumping into anything at
| all, do that.
| a9h74j wrote:
 | IDK. Avoiding crashing into anything 12.327 ft away might be
| considered a feature.
| starik36 wrote:
| I don't know. The latest version is insanely impressive.
| https://www.youtube.com/watch?v=fwduh2kRj3M
|
 | I am cautiously optimistic about the future of FSD in general
| now.
| systemvoltage wrote:
| Wow, I haven't kept up with FSD but I am impressed.
| zozbot234 wrote:
| > In 2015 Musk said two years. In 2016 he said two years. In
| 2017 he said two years.
|
| Look, the message is clear enough already: FSD is two years
| away, and always will be! You've got to admire that kind of
| consistency.
| joakleaf wrote:
| Unfortunately, it is not even consistent:
|
 | * In January 2021 he said end of 2021 [2].
 |
 | * In May 2022 he said May 2023 [1].
|
| So he moved from around 2 years to 1 year.
|
| Looking at the rate of progress from Tesla FSD videos on
| YouTube, I wouldn't bet on 2023.
|
| Whenever people talk about Tesla FSD, I always like to
| point to Cruise who just got a permit to carry paying
| riders in California [3]. Honestly, their software looks
 | significantly smarter than Tesla's -- highly recommend
| their 'Cruise Under the Hood 2021' video for anyone
| interested in self-driving![4]
|
| [1] https://electrek.co/2022/05/22/elon-musk-tesla-self-
| driving-...
|
| [2] https://www.cnet.com/roadshow/news/elon-musk-full-self-
| drivi...
|
| [3] https://www.reuters.com/business/autos-
| transportation/self-d...
|
| [4] https://www.youtube.com/watch?v=uJWN0K26NxQ
| a9h74j wrote:
| 2023 will be the year of FSDIRTYA on the Tesla. (FSD is
| really two years away.)
| Geee wrote:
 | If you have been following their updates, it looks like
 | they have finally found the correct approach for FSD and it is
| improving very quickly. At this rate, it seems that it'll be
| close to level 5 this year or next year.
|
| Video of the latest version:
| https://www.youtube.com/watch?v=fwduh2kRj3M
| avalys wrote:
| No kidding!
|
| The ability to use the phone or remote to move the car forward
| or back in a straight line is super useful and a cool, novel
| feature by itself. It's also a buggy piece of shit that a few
| engineers could probably greatly improve in a month. Doesn't
| seem like Tesla cares, it's been stagnant for years.
|
| Meanwhile Tesla is still charging people $10,000 for an FSD
| function that doesn't exist.
| mvgoogler wrote:
| It's up to $12,000 now.
|
| https://www.tesla.com/model3/design#overview
| hedora wrote:
| What happens if the phone UI thread / touch screen hangs?
| pclmulqdq wrote:
| The charge isn't really for the FSD functionality. It's a
| charge to cut the line. There is a long waitlist if you don't
| pay for FSD.
| cranekam wrote:
| > The ability to use the phone or remote to move the car
| forward or back in a straight line is super useful and a
| cool, novel feature by itself.
|
| Is it? It's hard to think of a situation where moving a car
 | I'm at most a couple of hundred feet from backwards or
| forwards in a straight line by fiddling with my phone is
| superior to just getting into the car and moving it myself.
| Maybe I'm not finding myself and my car on opposite sides of
| a gorge often enough?
| avalys wrote:
| I found it useful in a variety of circumstances.
|
| Someone parked too close to your driver's door? Just back
| the car out remotely and get in.
|
| Parallel parked, then walked away from your car and noticed
 | you didn't leave enough space behind the car to exit
| without scraping your bumper? Pull it forward a little
| without getting back in.
|
| Parked in your driveway and need to move it three feet so
| you have space to get the lawnmower out of the garage? Use
| the remote.
|
| All of these use cases depend on the functionality being
| fast and hassle-free to use. It works that well about 60%
| of the time - the other 40% the phone fails to connect to
| the car, or seems to connect but the car inexplicably
| doesn't move, or gives a useless "Something went wrong"
| error message, etc.
| maybelsyrup wrote:
| epgui wrote:
| aaa_aaa wrote:
| Your second try is even less funny.
| la64710 wrote:
 | This stuff is scary and certainly more important than Twitter
| spam bots.
| djcannabiz wrote:
| epgui wrote:
| To other commenters: this is a technical audit of autopilot
| software, not Tesla FSD.
|
| And the actual context is much less of a big deal than it seems:
| the biggest plausible consequence would be forcing Tesla to push
| an over-the-air update with better driver attention monitoring or
| alerting.
|
| I encourage reading the actual report.
| dangrossman wrote:
| The worst case outcome would require Tesla to disable the
| autopilot suite entirely, for an indeterminate amount of time,
| perhaps permanently on the existing fleet of vehicles.
|
| The NHTSA is tired of Tesla's hand-waving away their safety
| investigations into Autopilot by pushing stealth updates that
| fix specific scenarios in specific places being investigated.
 | NHTSA wised up to that and independently purchased their own
| Tesla vehicles, and disabled software updates, so that they can
| reproduce those scenarios themselves.
|
| If NHTSA asks Tesla to provide system validation tests showing
| that an updated version of their software meets the design
| intent of the system, Tesla would not be able to do so. If they
| can't prove the new Autopilot software corrects the safety-
| related defects identified in the current version, then it's
| not a valid recall remedy.
|
| All evidence from their own AI/AP team and presentations is
| that there is no real design and system validation going on
| over there. They're flying by the seat of their pants,
| introducing potentially lethal regressions in every update.
| AzzieElbab wrote:
 | If disabling FSD makes Teslas less safe, then what is the
| point? Are they saying fsd can potentially go berserk? Are we
| into future crime prevention?
| hedora wrote:
| Yes, in the same way that taking down an active gunman with
| a loaded weapon is future crime prevention.
| epgui wrote:
| If you read the report, you will realize that NHTSA is
| considering requiring Tesla to do better driver attention
| monitoring, or to improve alerting. They are not considering
| banning autopilot.
| dragonwriter wrote:
| > If you read the report, you will realize that NHTSA is
| considering requiring Tesla to do better driver attention
| monitoring, or to improve alerting
|
 | If you read the report, you will realize that it says
 | nothing about what NHTSA might do if the kind of defect
 | they are focusing on is confirmed.
|
| It is certainly the kind of defect where it is _plausible_
| that better attention monitoring and alerting _might_ be at
 | least a partial mitigation, but that's about all you can
 | reasonably conclude on that from the report.
| dangrossman wrote:
| I assure you, I'd already read the report before it was
| shared here. I also assure you, there's more to the
| investigation than that.
| epgui wrote:
| Perhaps, but that's speculation at best.
| moralestapia wrote:
| 99% of everything is speculation at best, even this site
| is pretty much (high value) speculation as a service.
|
| The NHTSA has a reputation of not f*king around so I
| would definitely side with @dangrossman on this thing.
|
| As of today, Autopilot IS dangerous software and it is
| not something that should be tested live on the streets.
| [deleted]
| sumy23 wrote:
| But you've been assured with absolutely no evidence /s.
| [deleted]
| wolverine876 wrote:
| > The NHTSA is tired of Tesla's hand-waving away their safety
| investigations into Autopilot by pushing stealth updates that
| fix specific scenarios in specific places being investigated.
|
| Why isn't Tesla prosecuted for that? It's lawless!
| jquery wrote:
| It really is insane. It's one thing to have flaws, it's
 | quite another to stealthily cover them up like it's a game
 | of hide-and-seek.
| extheat wrote:
| No, that's typical software development. Find a bug in
 | circulation, then develop and deploy a fix. There are probably
| hundreds of internal issues that get fixed per normal
| protocol, as with any piece of software. Putting out a
| "recall" or alert for every modification to the code is
| pointless. What regulators need to do is keep up with the
| times. They need to have their own safety test suite which
| manufacturers can test against, and be independently
 | audited.
| hedora wrote:
| They've been accused of silently pushing updates to fix
| specific scenarios in order to gaslight the regulators.
|
| Imagine an airbag incorrectly deployed sometimes, and the
 | fix was to use a GPS geofence to disable the airbag entirely
| on the test track, but only on days when the regulator
| was trying to reproduce spurious airbag deployments, not
| on crash test days.
| bdavis__ wrote:
| No, it is not. You sure don't do it in aerospace. Each
 | change is verified, and the entire system is validated prior
| to a release.
| rtlfe wrote:
| > No, that's typical software development.
|
 | Software that controls multi-thousand-pound machines at
 | 70+ mph isn't typical, and typical practices don't
| necessarily apply.
| codebolt wrote:
 | Yes, those practices absolutely shouldn't apply to self-
 | driving cars. Good luck regression testing how a system
| change impacts the AI handling of every edge case of
| traffic.
| sidibe wrote:
| Waymo does this. Their infrastructure costs would be
| astronomical though if not attached to a company with its
 | own cloud.
| idealmedtech wrote:
 | It seems like the test suites for these deep-learning-
 | based models are themselves almost comprehensive
 | knowledge bases from which you could build more
 | traditional control software.
| cool_dude85 wrote:
| Sorry, typical software development is for a national
| regulator to find some illegal feature in your website,
| and then you disable the feature for specific IP ranges
| or geofenced areas where the regulator's office is? No, I
| don't think it is.
| Apocryphon wrote:
| Typical software development as practiced by Uber,
| perhaps
| pyrale wrote:
| > They need to have their own safety test suite which
| manufacturers can test against
|
| Coaxing regulators into producing a test that can be
 | optimized for is exactly how we got the VW scandal.
|
| > What regulators need to do is keep up with the times.
|
| Keeping up with the times sounds awfully like allowing
| insane things because some whiz kid believes there's no
| difference between a car and a website.
| Bubble_Pop_22 wrote:
| > No, that's typical software development.
|
| Cars will never be software, much like pacemakers and
 | ICDs won't ever be software.
| vkou wrote:
| > No, that's typical software development.
|
| It's not typical software development in life-critical
| systems. If you think it is, you should not be working on
| life-critical systems.
| ajross wrote:
| > introducing potentially lethal regressions in every update.
|
| Meh. I mean, I understand the emotional content of that
| argument, and the propriety angle is real enough. I really do
| get what you're saying. And if your prior is "Tesla is bad",
| that's going to be really convincing. But if it's not...
|
| The bottom line is that they're getting close to 3M of these
| vehicles on the roads now. You can't spit without hitting one
| in the south bay. And the accident statistics just aren't
| there. They're not. There's a small handful of verifiable
| accidents, all on significantly older versions of the
| software. Bugs are real. They've happened before and they'll
| no doubt happen again, just like they do with every other
| product.
|
| But the Simple Truth remains that these are very safe cars.
| They are. So... what exactly are people getting upset about?
| Because it doesn't seem to be what people claim they're
| getting upset about.
| ken47 wrote:
 | Total straw man. The question isn't whether Teslas are
 | safe to drive. The question is whether "autopilot" is safe
| to auto pilot.
| Retric wrote:
| Actual real world data says autopilot is on average at
| least as safe as the average driver in the average car.
 | Of course that's on average; in many specific situations
 | it's much worse, but conversely it means it's better in
| other situations.
|
 | How regulators deal with this is frankly tricky, as the
 | same will likely apply to all self-driving systems.
| justapassenger wrote:
 | By actual real world data, you mean cherry-picked average
| data published by Tesla, that doesn't account for any
| bias, and wasn't audited by independent third parties?
| ken47 wrote:
| Sources for claims would be appreciated.
| usrusr wrote:
| Those real world autopilot averages happen exclusively in
| the most trivial driving situations. Fair weather and
| almost exclusively on limited access roads. No real world
| average driver dataset exists that is similarly
| restricted to the subset of least error-prone driving
| situations.
| Retric wrote:
| Fair weather limited access highways is one of many
| datasets available. However, Autopilot operates in wet
| and rainy conditions so that's hardly an accurate
| assessment. Weather isn't actually that big of a
 | percentage of accidents.
|
| "On average, there are over 5,891,000 vehicle crashes
| each year. Approximately 21% of these crashes - nearly
| 1,235,000 - are weather-related" " 70% on wet pavement
| and 46% during rainfall. A much smaller percentage of
| weather-related crashes occur during winter conditions:
| 18% during snow or sleet, 13% occur on icy pavement and
| 16% of weather-related crashes take place on snowy or
| slushy pavement. Only 3% happen in the presence of fog."
|
| https://ops.fhwa.dot.gov/weather/q1_roadimpact.htm
| dragonwriter wrote:
| > Actual real world data says autopilot is on average at
| least as safe as the average driver in the average car.
|
| Unless this is on the same road and conditions, instead
| of "autopilot where and when used vs. real drivers
| everywhere and everywhen" it is meaningless, even moreso
| if it doesn't also account for "autopilot disengages
| immediately before anticipated collision so it doesn't
| count as driving when it occurred."
| fshbbdssbbgdd wrote:
| If autopilot disengages 1 second before crashing into a
| stationary object, does this count as autopilot crashing
| or the human driver?
|
| Is autopilot engaged in the places where crashes are
 | frequent, e.g. during left turns?
|
| What are the "scenario-equalized" safety stats for
| autopilot vs human drivers?
| kylecordes wrote:
| Tesla's statistics count it as autopilot if it was
| engaged within 5 seconds of the collision.
|
| It seems reasonable for a regulator to decide what that
| time span is and require all automated driving assist
| systems to report in a consistent way. I'm curious what %
| of the time a crash in a typical car occurs within N
 | seconds of cruise control, lane assist, or traffic-aware
 | cruise control being engaged.
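 |
 | The window choice matters a lot. A toy illustration (the
 | gap values below are hypothetical, purely to show the
 | mechanics of the attribution rule):
 |
 |     def attributed(gap_s, window_s):
 |         # Count a crash against the system if it was engaged
 |         # within `window_s` seconds of impact.
 |         return gap_s <= window_s
 |
 |     gaps = [0.4, 0.9, 2.5, 4.0, 7.0]  # disengagement-to-impact, s
 |     for window in (1, 5):
 |         n = sum(attributed(g, window) for g in gaps)
 |         print(f"{window}s window: {n}/{len(gaps)} attributed")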
| hedora wrote:
| The article says they're using a 1 second threshold, not
| 5, and that a substantial number of accidents fall
| between the two numbers.
| jquery wrote:
| It's also worth noting Tesla is nearly infamous at this
| point for making owners sign NDAs in exchange for repairs
| when their autopilot is likely at fault.
| bbarnett wrote:
| Really? Wow. That's a lawsuit waiting to happen.
|
| No way would I sign, and they'd fix it, or see me in
| court.
|
| And not rich guy wins US court, but Canadian court. And
| yeah, it's different.
| ev1 wrote:
| Don't you have to waive your right to sue in the US to
| purchase or boot a Tesla?
| [deleted]
| gamblor956 wrote:
| Tesla leads the world in driving fatalities related to AP
| and FSD-type systems.
|
| The entire rest of the industry has 1 fatality. Tesla has
| dozens, and 14 of those are old enough (and located in the
| right country) to be part of this investigation. (The
| multiple Tesla autopilot/FSD fatalities from 2022,
| including the 3 from last month, are not part of this
| investigation.)
| zozbot234 wrote:
| The proper comparison is AP vs. other SAE Level 2 systems
| throughout the industry.
| hedora wrote:
| So, the rest of the industry has at most one fatality?
| How does that change the conclusion?
| baryphonic wrote:
| > All evidence from their own AI/AP team and presentations is
| that there is no real design and system validation going on
| over there. They're flying by the seat of their pants,
| introducing potentially lethal regressions in every update.
|
| What is this evidence?
|
| I've seen a few talks from Andrej Karpathy that indicate to
| me a more deliberate approach.[0] "Software 2.0" itself seems
| like an approach meant to systematize the development,
| validation & testing of AI systems, hardly a seat-of-your-
| pants approach to releases. I have my own criticisms of their
| approach, but it seems there is pretty deliberate care taken
| when developing models.
|
| [0] https://youtu.be/hx7BXih7zx8
| justapassenger wrote:
 | A few years ago I worked at a very big tech company,
 | focusing on validation of AI systems.
|
| It's all smoke and mirrors. You cannot perform proper
| validation of AI systems. Rollbacks of new versions of ML
| models are very common in production, and even after very
| extensive validation you can see that real life results are
| nothing like what tests have shown.
| stefan_ wrote:
| This is the Karpathy that gave a big talk about how vision
| was superior to radar when Tesla dropped all radar units at
| the height of the chip crisis. Now they are bringing radar
| back.
|
| Give it a few years and they will probably introduce LIDAR.
| pbronez wrote:
| Tesla is bringing Radar back? First I've heard about it,
| and good news if true.
| kelnos wrote:
| > _What is this evidence?_
|
| I think the onus should be on Tesla to prove that their
| testing and validation methodology is sufficient. Until and
| unless they have done so, Autopilot should be completely
| disabled.
|
| I really don't get why the regulatory environment is so
| behind here. None of these driver assistance technologies
| (from any manufacturer, not just Tesla) should be by
| default legal to put in a car.
| phkahler wrote:
| >> They're flying by the seat of their pants, introducing
| potentially lethal regressions in every update.
|
| >> What is this evidence?
|
| Without a documented development and testing program, every
| development is essentially this.
| refulgentis wrote:
 | I see your point; to OP's point, I know a couple of people
 | who were horrified at what they saw, and it did not match
 | this public talk. Both started at least 6 months after this
 | video, and both left Tesla within 8 months, of their own
 | volition. Unfortunately, off the record.
 |
 | Not to disparage Andrej, but sometimes (frequently, even)
 | what executive leadership thinks is going on is not the
 | day-to-day reality of the team.
| plankers wrote:
 | Can confirm: a former coworker had just come from Tesla 5
 | years ago, and he had serious ethical problems with his
| work over there. Tesla is killing people through
| negligence and greed, it's pretty disgusting, but par for
| the course
| dragonwriter wrote:
| > the biggest plausible consequence would be forcing Tesla to
| push an over-the-air update with better driver attention
| monitoring or alerting.
|
| I see no basis for this conclusion, which appears to be pure
| speculation about what NHTSA might decide is necessary and
| sufficient to address the potential problems if confirmed by
| the deeper analysis. I encourage reading the actual report.
|
| > I encourage reading the actual report.
|
 | I did, and the conclusion to which you appeal does not
 | appear to be well supported by it.
| Isinlor wrote:
| Do you have the link to the report?
| dangrossman wrote:
| It's in the article:
| https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
| justapassenger wrote:
 | Tesla has no hardware for proper driver monitoring. Most
 | Model S cars have no internal camera. And the Model 3's
 | internal camera wasn't designed for it (it doesn't work in
 | the dark, cannot see through sunglasses, cannot see where
 | your eyes are actually pointing, etc.).
|
| You cannot OTA HW deficiencies.
|
| Now, will they be forced to have monitoring like that, to be on
| par with their competitors? That's a different story, and given
 | how weak US regulatory agencies are and how reckless Tesla is
 | at disregarding them, I'm pretty sure Tesla won't be hurt by
| it.
| zionic wrote:
| Because that's a stupid system. I will _never_ accept being
| under camera surveillance constantly by the car, no matter
| what stupid name people come up for it ("attention
| monitoring")
| [deleted]
| muttantt wrote:
| CamperBob2 wrote:
| What I'm afraid of is that the people in my society with large
| amounts of money, power, weapons, and influence are acting like
| a bunch of sixth-graders.
|
| I'm not asking for miracles. I just want the crazy to stop.
| jonahbenton wrote:
| How many government agencies can Elon Musk piss off at once?
| herpderperator wrote:
| > Since Tesla vehicles can have their software overwritten via a
| wireless connection to the cloud [...]
|
| Tangential, and we all knew this happens already, but wow, it
| sounds crazy that we live in a world where a car can completely
| change its operation based on someone somewhere pressing a button
| on their computer.
| mensetmanusman wrote:
| Considering the personalities involved, I wouldn't be surprised
| if a recall is issued, and then the recall is canceled when a new
| administration gets voted in.
___________________________________________________________________
(page generated 2022-06-11 23:00 UTC)