[HN Gopher] Tesla recalls nearly all 2M of its vehicles on US roads
___________________________________________________________________
Tesla recalls nearly all 2M of its vehicles on US roads
Author : bigfudge
Score : 12 points
Date : 2023-12-13 21:37 UTC (1 hour ago)
(HTM) web link (www.cnn.com)
(TXT) w3m dump (www.cnn.com)
| bigfudge wrote:
| This seems like quite a big deal. The article doesn't state the
| likely cost, but that may be less important than the substantial
| loss of face for Tesla given all the claims that have been made
| for autopilot.
| rogerkirkness wrote:
| All they have to do is an OTA update though. It's like a bug
| fix basically.
| natch wrote:
| Like a bug fix, except without the bug.
| natch wrote:
| No loss of face. Some regulators perceive a problem and have
| proposed (and mandated) some measures to alleviate their
| perceived problem.
|
| These measures will come in the form of a software update,
| which is totally business as usual for Tesla. Owners won't
| need to do anything they wouldn't already do in the comfort
| of their own homes, much less take the car in for service.
| Basically zero impact on owners. Tiny tweak for Tesla. Not a
| big deal.
|
| The measures in the update simply add more reminders for
| humans to do the things they already know they are
| responsible for, such as watching the road, on top of the
| reminders that are already given when an inattentive driver
| is detected.
| dheera wrote:
| I'm a cardiac arrest survivor with an implanted
| defibrillator. Although, after a cardiac ablation operation
| and anti-arrhythmic medications, I have been incident-free
| for a few years now and am medically cleared to drive, I
| have a LOT of anxiety around the "what if" of going into
| sudden cardiac arrest again while driving. The reality is
| that if I did, my implant would restart my heart within 20
| seconds, so with the implant the only real death risk is a
| car crash, not my heart. With a non-self-driving car, you
| would almost certainly crash if you let go of all the
| controls for 20 seconds.
|
| Tesla FSD has almost FULLY gotten rid of my anxiety because
| in any random 20-second window there's a very high, likely
| 99.99%+, probability it will drive just fine until I'm
| conscious again.
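|
| Back-of-envelope, with a made-up intervention rate just to
| show the order of magnitude (not a measured figure):
|
|   # Chance that FSD needs a takeover during one random
|   # 20-second window, assuming (hypothetically) it needs
|   # one intervention per 50 hours of easy highway driving.
|   interventions_per_hour = 1 / 50   # assumed, not measured
|   window_s = 20
|   p_bad_window = interventions_per_hour * window_s / 3600
|   print(f"{1 - p_bad_window:.4%} of windows need no takeover")
|   # -> about 99.99%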
|
| It would be distressing if they made a software modification
| that caused it to disable or "give up" if the driver isn't
| responsive. As of now the car's behavior is to keep
| self-driving while slowing to a full stop and putting on the
| emergency flashers, which I think is a reasonable way to
| deal with the situation. Pulling over onto the shoulder, or
| at least moving to the right-most lane, would be nice, but
| Teslas don't do that yet.
|
| Teslas also punish you by not allowing you to use autopilot
| for a while, which I hate because it often punishes me on
| very long straight roads where there is no need to apply
| force to the steering wheel. On top of that, if you get 5
| "strikes" on FSD you are punished by being forced to use a
| crappier, worse driving algorithm for a week. That boggles
| my mind, because as an engineer you'd think you should give
| the best self-driving algorithms to the least reliable
| drivers, but Tesla punishes the supposedly least reliable
| drivers by handing them less reliable software and forcing
| them to use it for a week.
|
| A lot of other cars' active lane keep systems will "give up"
| and let the car crash if the driver goes unconscious and
| stops touching the wheel, which is about the most unsafe
| thing they can do. I truly hope Tesla isn't pushed by NHTSA
| toward implementing it that way.
| 1970-01-01 wrote:
| This is an interesting edge case, thanks for sharing it. This
| brings up the point that there are many common medical events
| that FSD mitigates or contains that receive very little
| attention. The only one I can remember reading about is from
| 2016 when someone needed the Tesla to drive itself to the
| nearest hospital.
|
| https://www.theguardian.com/technology/2016/aug/08/tesla-mod...
| basil-rash wrote:
| It's tough. On one hand I feel for your condition and I can see
| how in your case the autopilot is a net benefit, but on the
| other hand this sort of sentiment is just reinforcing the "my
| car is driving, so it's ok that I switched my brain off"
| colloquialism.
|
| Just in the past week I've encountered multiple Teslas being
| driven by someone with no observable mental capacity: in one
| case, a Tesla floored its way into a left turn at a four-way
| stop intersection while my friend and I were walking
| directly in its path on the crosswalk. Luckily, the Tesla
| eventually did slam on its brakes in the middle of the
| intersection (blocking all
| other traffic), but the weirdest part for me was that when I
| looked at the driver and gave the universal "what gives" hand
| motion, which I'd usually expect to be met with the "my b"
| wave, she had a completely blank expression. It was as if she
| had no recognition of anything that was happening at that
| moment, much less the idea that accelerating into pedestrians
| at a crosswalk is dangerous and illegal.
|
| In another, a Tesla was being driven at an impossibly slow rate
| on a main road (~10 mph on a 40 limit), and blocking tons of
| cars behind it. I honked at it for maybe 10 seconds before they
| finally pulled into the center divider and let people pass,
| then proceeded to merge back into the lane. I figured maybe
| they were trying to make it home on a dead battery or something
| - annoying but whatever. Fast forward like 10 minutes and I'm
| coming back the other way after dropping my friend off, and I
| see the same Tesla now going the other direction at ~10 mph,
| but this time swerving back and forth erratically. My suspicion
| is the "driver" was plastered drunk (it was ~1am) and tried to
| get autopilot to take them home, but for whatever reason it
| wasn't doing it so now the drunkard was (very slowly) careening
| through the streets.
| dheera wrote:
| > It's tough. On one hand I feel for your condition and I can
| see how in your case the autopilot is a net benefit, but on
| the other hand this sort of sentiment is just reinforcing the
| "my car is driving, so it's ok that I switched my brain off"
| colloquialism.
|
| I definitely don't use it that way. I'm always actively
| driving the car and very frequently override the FSD (in a
| fair percentage of cases for safety reasons, in a larger
| percentage of cases because it's being annoying to other
| drivers e.g. not leaving enough space for other cars to
| zipper merge or not leaving space for a car to parallel
| park).
|
| For me I'm always the driver, FSD is a backup for the "what
| if", not the other way around.
|
| But I agree, many people may use it the other way around.
| That's really the way it _should_ be long term, but the tech
| isn't there yet.
|
| I really wish they would stop using the driver facing camera
| and wheel force to determine if the driver is attentive. It
| very frequently judges against me because my arm is light, so
| I'm punished even though I'm gripping the wheel. The driver
| facing camera is blocked easily by the sun shade so that also
| causes inadvertent punishment. I shouldn't have to look at
| the sun just to appease the camera. I wish they had some kind
| of capacitive sensing lining the steering wheel instead if
| that's what they need to be compliant.
| basil-rash wrote:
| > I really wish they would stop using the driver facing
| camera and wheel force to determine if the driver is
| attentive.
|
| Interesting, I didn't know they had a driver facing camera
| like that - that makes my encounter with that woman make a
| bit more sense. If she is required to have her eyes open, be
| looking forward, and keep hands on the wheel for autopilot
| to work, there's nothing to stop her from maintaining that
| position while being lost in a totally separate daydream,
| with no recognition of what's going on around her, which is
| in line with what I observed. Frightening.
|
| As for the capacitance sensing, wouldn't that not work with
| gloves?
|
| In general I don't think there's really any way to
| mechanically determine if someone is "paying attention". So
| any system that relies on driver attention and disaster
| intervention for safe operation is destined to be unsafe,
| and in my opinion and observation much more unsafe than a
| purely human driver.
| dheera wrote:
| > if she is required to be eyes open, looking forward,
| hands on wheel for autopilot to work
|
| Once FSD switches on, the behavior I've gathered is this: if
| it thinks you're looking away from the road for too large a
| percentage of the time (you're allowed some percentage of
| time to look at the mirrors etc.), or if you don't apply
| force to the steering wheel for a certain period of time, it
| classifies you as not attentive and starts warning you with
| a flashing blue gradient on the screen and small-font text
| at the bottom saying "Please pay attention to the road" or
| similar.
|
| The weird thing is, because my arm is too light for the
| force sensor to register, and the camera's judgement is
| easily obstructed by the sun shade or by sunglasses, I have
| to pay attention to the screen to avoid missing the flashing
| blue gradient, and jerk the steering wheel as soon as I see
| it. (Yeah, even if there are trucks on both sides, it
| requires you to jerk the wheel slightly to prove you're
| there, or you get punished.)
|
| If you don't respond to the flashing blue gradient within
| several seconds, it beeps for about 2 seconds, turns fiery
| red, and punishes you with an FSD strike (5 strikes and your
| software is downgraded to a more dangerous version for a
| week), and you're not allowed to use self-driving until the
| next time you exit and re-enter the car.
|
| Sometimes I'm too distracted by the road to notice these
| stupid gradients on the screen, and I get an FSD strike.
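|
| Roughly, the escalation behaves like the sketch below (this
| is purely my own guess at the logic from observed behavior;
| the state names and thresholds are invented, not Tesla's
| code):
|
|   # Hypothetical attention-monitoring escalation, pieced
|   # together from observed behavior only.
|   import time
|
|   WARN_AFTER_S = 15    # assumed: warn after this long unattended
|   STRIKE_AFTER_S = 5   # assumed: strike if warning is ignored
|   MAX_STRIKES = 5      # observed: 5 strikes = downgrade for a week
|
|   class AttentionMonitor:
|       def __init__(self):
|           self.strikes = 0
|           self.last_attentive = time.monotonic()
|           self.warning_since = None
|
|       def update(self, gaze_ok: bool, torque_ok: bool):
|           now = time.monotonic()
|           if gaze_ok or torque_ok:
|               self.last_attentive = now
|               self.warning_since = None
|               return "ok"
|           if now - self.last_attentive < WARN_AFTER_S:
|               return "ok"
|           if self.warning_since is None:
|               self.warning_since = now
|               return "warn"      # flashing blue gradient + text
|           if now - self.warning_since < STRIKE_AFTER_S:
|               return "warn"
|           self.warning_since = None
|           self.strikes += 1      # red screen; FSD off until re-entry
|           if self.strikes >= MAX_STRIKES:
|               return "downgrade" # older, worse stack for a week
|           return "strike"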
|
| But there are really no better cars on the market that
| serve my medical needs. Almost all the alternative cars'
| lane keep products are programmed to crash, not drive, if
| the driver loses consciousness. So while I hate Tesla's
| driver attentiveness implementation, it's still the only
| brand with a lane keep implementation that's designed to
| actually keep the driver alive.
|
| > As for the capacitance sensing, wouldn't that not work
| with gloves?
|
| Capacitive sensing can easily work with gloves. It doesn't
| need to be pixel-accurate like a mobile phone screen; it
| just needs to sense whether or not there's a hand there, and
| that can easily be done through gloves.
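|
| Something as simple as a presence threshold would do; here's
| a made-up sketch of what I mean (the numbers are invented
| for illustration, not any real sensor spec):
|
|   # Hypothetical hand-presence check for a capacitive wheel
|   # liner; baseline and delta values are invented.
|   BASELINE_PF = 50.0    # assumed reading with no hand nearby
|   HAND_DELTA_PF = 5.0   # assumed extra capacitance from a hand;
|                         # a glove just shrinks this delta a bit
|
|   def hand_on_wheel(reading_pf: float) -> bool:
|       # Presence only; no phone-style pixel accuracy needed.
|       return reading_pf - BASELINE_PF >= HAND_DELTA_PF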
| fkarg wrote:
| Discussed already earlier today. Technically a recall, but for
| Tesla just an OTA.
| metadat wrote:
| Thanks, here's the discussion:
|
| _Tesla Recalls 2M Cars to Fix Autopilot Safety Flaws_
|
| https://news.ycombinator.com/item?id=38625652 (200 comments)
___________________________________________________________________
(page generated 2023-12-13 23:01 UTC)