[HN Gopher] Tesla recalls 362,758 vehicles, says full self-drivi...
       ___________________________________________________________________
        
       Tesla recalls 362,758 vehicles, says full self-driving beta may
       cause crashes
        
       Author : jeffpalmer
       Score  : 579 points
       Date   : 2023-02-16 17:52 UTC (5 hours ago)
        
 (HTM) web link (www.cnbc.com)
 (TXT) w3m dump (www.cnbc.com)
        
       | maxdo wrote:
        | FSD beta is a super weird beast. One day I managed to drive in
        | New York with 0 interventions to a friend's place 3-4 miles
        | away, yet another day it can act stupid and barely handle a
        | trivial intersection.
        
       | kbos87 wrote:
        | The amount of hyperbole and completely uninformed takes around
        | FSD is eye-opening.
       | 
       | I'll completely agree that "Full self driving" is a misleading
       | name, and they should be forced to change it, full stop.
       | 
       | That being said, it's exceptionally clear that all the
       | responsibility is on you, the driver, while you are using it. The
       | messaging in the vehicle is concise and constant (not just the
       | wall of text you read when opting in.) Eye tracking makes sure
       | you are actually engaged and looking at the road, otherwise the
       | car will disengage and you'll eventually lose the ability to use
       | FSD. Why is there never a mention of this in any coverage?
       | Because it's more salacious to believe people are asleep at the
       | wheel.
       | 
       | Is it perfect? No, though it's a lot better than many people seem
       | to want the world to believe. It's so easy to overlook the fact
       | that human drivers are also very, very far from perfect.
        
         | InCityDreams wrote:
         | >...."Full self driving" is a misleading name, and they should
         | be forced to change it, full stop.
         | 
         | Problem and solution.
         | 
         | Nothing more need be said.
        
           | culi wrote:
            | This one issue is one of the primary reasons Google ended up
            | deciding not to acquire Tesla really early on. Google has
            | been way ahead of them for a long time, and their engineers
            | were extremely aware of how reckless this type of marketing
            | from Musk was.
            | 
            | But the marketing worked. They sold the dream to people, only
            | got sued a couple of times, and still got the most
            | subsidization of any automaker.
        
         | Barrin92 wrote:
         | >That being said, it's exceptionally clear that all the
         | responsibility is on you, the driver
         | 
         | Virtually every study ever done on human-machine interaction
         | shows that users will inevitably lose reaction time and
         | attention when they are engaged with half automated systems
         | given that constant context switching creates extreme issues.
         | 
         | Waymo did studies on this in the earlier days and very quickly
         | came to the conclusion that it's full autonomy or nothing.
         | Comparisons to human performance are nonsensical because
         | machines don't operate like human beings. If factory floor
         | robots had the error rate of a human being you'd find a lot of
          | limbs on the floor. When we interact with autonomous systems
          | that a human can never predict, precision needs to _far
          | exceed_ that of a human being for the cooperation to work. A
          | two-ton black box moving at speeds that kill people is not
          | something any user can responsibly engage with _at all_.
        
       | alluro2 wrote:
       | I generally root for Tesla as a former underdog who lit the fire
       | under the asses of stagnating manufacturers, but, seeing videos
       | of FSD in action, I'm fully on the side of people who think that
       | calling it FSD should be considered fraud.
       | 
       | Given how much time and data they had so far, and the state it's
       | in, it really makes news like Zoox getting a testing permit for
       | public roads, without any manual controls in the vehicle, seem
       | crazy irresponsible and dangerous. Is it possible that they are
       | just that much better at cracking autonomous driving?
        
         | esalman wrote:
          | Unlike Tesla, Mercedes will take responsibility if its level-3
          | autonomous system malfunctions. If I ever get an autonomous
          | vehicle, this will be the main deciding factor.
        
       | rootusrootus wrote:
       | Meanwhile, the only actual L3 capable car is a Mercedes. Plenty
       | of people will argue that AP is actually superior, but I don't
       | think there's any way to say that for sure, given the tight
       | constraints Mercedes has put on their driver assistance in order
       | to lower the liability enough to make L3 not a bankruptcy event.
       | For certain, however, they are more _confident_ in their
       | technology than Tesla is, otherwise Tesla would release an L3
       | capable car too.
       | 
       | To be fair, though, Tesla has no sensors other than cameras, and
       | I believe the Mercedes has a half dozen or so, several radars and
       | even a lidar.
        
         | lastLinkedList wrote:
         | Don't LIDAR/radar sensors (I'm not familiar with exactly what
         | the options are) have benefits that vision doesn't have, like
         | working in poor lighting/visibility? Why would Tesla move away
         | from these sensors?
        
           | ggreer wrote:
           | Lidar has advantages over cameras but it also has some
           | downsides. Sunlight, rain, and snow can interfere with the
           | sensors, as can other nearby lidar devices (though de-noising
           | algorithms are always improving). There are also issues with
           | object detection. Lidar gives you a point cloud, and you need
           | software that can detect and categorize things from that
           | point cloud. Because lidar is new, this problem hasn't had as
           | much R&D put into it as similar problems in computer vision.
           | 
           | Then there's the issue of sensor fusion. Lidar requires
           | sensor fusion because it can't see road lines, traffic
           | signals, or signs. It also can't see lights on vehicles or
           | pedestrians. So you still have to solve most of the computer
           | vision issues _and_ you have to build software that can
           | reliably merge the data from both sensors. What if the
           | sensors disagree? If you err on the side of caution and brake
           | if either sensor detects an object, you get lots of false
           | positives and phantom braking (increasing the chance of rear-
           | end collisions). If you YOLO it, you might hit something. If
           | you improve the software to the point that lidar and cameras
           | never disagree, well then what do you need lidar for?
           | 
           | I think lidar will become more prevalent, and I wouldn't be
           | surprised if Tesla added it to their vehicles in the future.
           | But the primary sensor will always be cameras.
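
     The sensor-fusion trade-off described in the comment above can be
     sketched as a toy decision rule. This is a hypothetical illustration
     under invented names and policies, not any vendor's actual logic:

```python
# Toy sensor-fusion braking policies (hypothetical; not any vendor's real logic).
# Each sensor reports whether it detects an obstacle ahead.

def should_brake(camera: bool, lidar: bool, policy: str) -> bool:
    """Decide whether to brake given two independent detections."""
    if policy == "cautious":    # brake if EITHER sensor fires -> more phantom braking
        return camera or lidar
    if policy == "agreement":   # brake only if BOTH agree -> may miss real obstacles
        return camera and lidar
    raise ValueError(f"unknown policy: {policy}")

# When the sensors disagree, the two policies give opposite answers:
print(should_brake(camera=False, lidar=True, policy="cautious"))   # True  (phantom-brake risk)
print(should_brake(camera=False, lidar=True, policy="agreement"))  # False (missed-object risk)
```

     When the sensors disagree, the "cautious" OR-policy brakes (risking
     phantom braking) while the "agreement" AND-policy does not (risking
     a missed object), which is exactly the dilemma described above.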
        
           | newaccount74 wrote:
            | Because LIDAR is expensive and available in limited
            | quantities. There's no way they could sell the number of
            | cars they are selling right now if each one came with a
            | LIDAR.
           | 
           | Camera modules are cheap and available in huge quantities.
        
         | drowsspa wrote:
         | That Tesla was allowed to test in production without any legal
         | liability with real human lives mostly because of Musk's
         | personal influence is a disgrace
        
           | worik wrote:
           | > That Tesla was allowed to test in production without any
           | legal liability with real human lives mostly because of
           | Musk's personal influence is a disgrace
           | 
            | I am unsure what "allowed" means in that context. They just
            | did it. The lawsuits are coming, surely?
           | 
           | Sometimes it is better to ask for forgiveness than for
           | permission. This was an audacious example. Time will tell if
           | there is anything that will stop them.
        
             | drowsspa wrote:
              | I mean, "ask for forgiveness rather than permission" for
              | an AI piloting a one-ton machine that carries live human
              | beings borders on psychopathy.
        
         | [deleted]
        
         | naillo wrote:
          | Makes me think Karpathy's "bet" on vision being the only
          | sensor was maybe misguided.
        
           | sockaddr wrote:
           | Yup. You need an AGI behind that vision for that premise to
           | work.
           | 
           | Even a fruit-fly class AGI would do it.
        
             | justapassenger wrote:
             | Flies hit windows and cars. Even much more complex animals
             | cannot manage effectively moving in a complex environment
             | without hitting each other (like flocks of sheep).
             | 
              | There's no data to support that anything less than human-
              | level AGI is sufficient to drive cars with how current
              | infrastructure looks.
        
               | worik wrote:
               | > Even much more complex animals cannot manage
               | effectively moving in a complex environment without
               | hitting each other (like flocks of sheep).
               | 
                | No. You have to force flocking animals into extreme
                | circumstances to have them start crashing.
               | 
               | Sounds like the Tesla cannot manage that. Not even "bird
               | brained".
        
           | panick21_ wrote:
            | Again, people need to understand this L3 stuff is for an
            | extremely, extremely limited set of situations.
            | 
            | Tesla's software is used far, far, far more, in far, far
            | more situations. Even comparing those two things is kind of
            | silly.
            | 
            | It's like comparing a system designed only for race tracks
            | with a Honda Civic. They are simply not designed for the
            | same thing.
           | 
           | If Mercedes achieves L3 in all the places Tesla now allows AP
           | (or FSD Beta) then that would prove the 'bet' on vision
           | wrong.
           | 
           | Until then, nobody has proven anything.
        
           | scottyah wrote:
           | Radar could help so much in thick fog and rain...
        
         | LanceJones wrote:
         | I've used it in Germany over the Christmas holidays. In 8 hours
         | of driving it was only available for 12 minutes.
        
           | newaccount74 wrote:
           | It's probably only worth it if you have a commute in heavy
           | traffic. If you're stuck in traffic on the Autobahn every
           | morning, then it could be useful.
           | 
           | Otherwise, not really.
        
         | diebeforei485 wrote:
         | This is a PR stunt from Mercedes, because in practice their L3
         | system is not usually available.
        
           | lolinder wrote:
           | It's available in bumper to bumper rush hour traffic on
           | freeways, and they take liability for any accidents that
           | happen while it's on. That's exactly the kind of system and
           | guarantee that could really have a positive impact on
           | someone's commute.
        
             | diebeforei485 wrote:
             | No, it is rarely available even in those conditions.
        
         | lolinder wrote:
         | > To be fair, though, Tesla has no sensors other than cameras,
         | and I believe the Mercedes has a half dozen or so, several
         | radars and even a lidar.
         | 
         | This doesn't mitigate Tesla's gross ethical violations in
         | letting this thing loose, it makes them worse. Tesla _knew_
         | that cameras alone would be harder to make work than cameras
          | plus radar plus lidar, and they shouldn't be lauded for
         | attempting FSD with just cameras. It's an arbitrary constraint
         | that they imposed on themselves that is putting people's lives
         | in danger.
        
           | ok123456 wrote:
            | Remember that big announcement Musk did where he stated
            | that, going forward, Teslas would not use any additional
            | sensors to aid the video, to the point of laughing at every
            | other company that was still using them. Yeah.
        
             | freejazz wrote:
             | Basically criminal negligence to anyone that actually gets
             | hurt by the FSD beta
        
         | toomuchtodo wrote:
          | This is limited to certain pre-mapped roads, under 40 mph,
          | and requires a car in front to follow. This in no way
          | compares to Tesla's Autopilot system.
         | 
         | Edit (because HN throttling replies): > Mercedes is being
         | responsible about their rollout
         | 
          | You can always succeed if you define success. If rolling out
          | something that doesn't do much of anything counts as success,
          | then sure, success. Oh well. Fewer than 20 people have died
          | in crashes related to Autopilot in ~5 billion miles of
          | travel. Zero deaths is unattainable and unrealistic, so
          | "responsible" and "safe" is based on whatever narrative is
          | being pushed. 43k people died in US car accidents last year,
          | roughly half a million deaths in the entire time Tesla has
          | offered some sort of driver assistance system called
          | Autopilot.
          | 
          | How many deaths would we be okay with from Mercedes' system
          | where it would still be "responsible"? Because it won't be
          | zero. Even assuming 100% penetration of Automatic Emergency
          | Braking systems, it's only predicted to reduce fatalities by
          | 13.2% (China-specific study), and injuries by a similar
          | number.
         | 
         | TLDR "Safe" is not zero deaths. It is a level of death we are
         | comfortable with to continue to use vehicles to travel at
         | scale.
         | 
         | https://www.tesladeaths.com/
         | 
         | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7037779/
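
     As a rough sanity check of the figures in the comment above (fewer
     than 20 deaths over ~5 billion Autopilot miles, 43k US road deaths
     in a year), here is the back-of-envelope arithmetic; the ~3.2
     trillion annual US vehicle-miles figure is an assumption added for
     the comparison, not from the comment:

```python
# Back-of-envelope fatality rates per 100 million vehicle-miles.
# Figures come from the comment above plus an assumed US annual VMT;
# none of these numbers is authoritative.

autopilot_deaths = 20    # commenter's upper bound
autopilot_miles = 5e9    # ~5 billion miles claimed

us_deaths = 43_000       # US road deaths last year (per the comment)
us_miles = 3.2e12        # assumed ~3.2 trillion annual vehicle-miles

def per_100m(deaths, miles):
    """Deaths per 100 million miles traveled."""
    return deaths / miles * 1e8

print(round(per_100m(autopilot_deaths, autopilot_miles), 2))  # 0.4
print(round(per_100m(us_deaths, us_miles), 2))                # 1.34
```

     As other comments in this thread point out, the two denominators
     cover very different driving conditions, so the comparison is
     suggestive at best.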
        
           | TheCoelacanth wrote:
           | It is in no way comparable to Tesla's Autopilot system.
           | Mercedes has an actual feature while Tesla recklessly gambles
           | with other peoples' lives to juice their stock price.
           | 
           | If Tesla believes that their feature is safe, then let them
           | take legal liability for it.
        
             | toomuchtodo wrote:
             | The law does not require they take liability, so they
             | shouldn't. If you don't like the law, change the law. If
             | you don't want to use a driver assist feature, don't. But
             | the data shows Autopilot is robust and regulators allow its
             | continued sale and operation.
        
               | TheCoelacanth wrote:
               | Legal isn't ethical. Musk is recklessly gambling with
               | other people's lives.
               | 
               | I don't have a choice not to use this feature. The person
               | driving their Tesla death machine on the roads with me
               | already made that decision for me.
        
           | lolinder wrote:
           | Mercedes is being responsible about their rollout, so people
           | like you jump to the conclusion that their tech is worse. For
           | my part, I see Mercedes taking liability for accidents that
           | happen and think that they must have a lot of confidence in
           | their tech to roll it out at all under those terms.
           | 
           | I would not be surprised to see Mercedes beat Tesla to _safe_
           | fully automated driving because they took it slow and steady
          | instead of trying to rush it out to legacy vehicles that
          | don't have the right sensor arrays.
           | 
           | Edit: It's weird to reply to a comment by editing your own.
           | It feels like you're trying to preempt what your interlocutor
           | is saying, rather than reply.
        
           | reitzensteinm wrote:
           | Isn't 40mph a regulatory limit?
           | 
           | Do you have any information that it'll continue to be 40mph
           | even where not required to be?
           | 
            | I wouldn't be surprised if the answer is yes, but I've been
            | paying attention and haven't come across anything yet.
        
         | panick21_ wrote:
         | Calling it L3 capable is like saying Tesla is Full Self
         | Driving.
         | 
         | Neither is true, both are marketing.
         | 
         | > but I don't think there's any way to say that for sure
         | 
          | I mean, yes there is. If you want an objective test where you
          | drop a car anywhere in the world and see if it can get
          | somewhere else, then clearly one is more useful than the
          | other.
          | 
          | In basketball they say 'the best ability is availability'. In
          | those terms AP is in a totally different dimension. AP has
          | been driven for hundreds of millions of miles by now; it must
          | be a crazy high number. Mercedes' L3 system has barely driven
          | at all; it's available in very few cars.
         | 
         | The only way you can reasonably compare the Mercedes L3 system
         | is if you limit the comparison to the extremely limited cases
         | where the Mercedes L3 system is available. If you compare them
         | there, I would think they aren't that different.
         | 
         | > otherwise Tesla would release an L3 capable car too
         | 
          | No, because making something L3 in an extremely limited
         | selection of places is simply not something Tesla is interested
         | in doing. Doing so would be a lot of work that they simply
         | don't consider worth doing when they are trying to solve the
         | larger problem.
        
           | Dylan16807 wrote:
           | If Tesla's work is actually going somewhere, then they don't
           | need to change the engineering to get L3 in specific places.
           | They can have a list that gets added to over time.
        
             | panick21_ wrote:
              | Actually, it would be a lot of work to do that, and not
              | just work on the software itself. Something like that
              | would create a lot of work literally all over the
              | company.
              | 
              | Also, they don't really have an incentive to do that, so
              | why should they?
        
               | Dylan16807 wrote:
               | > literally all over the company
               | 
               | Why? The design is locked in, the manufacturing doesn't
               | change, you don't need the designers to do much... I
               | would expect you need a handful of people, half of which
               | are lawyers.
               | 
                | > Also, they don't really have an incentive to do
                | that, so why should they?
               | 
               | I'm pretty sure _almost all Tesla owners_ would like the
               | ability to stop looking at the road some of the time.
               | That amount of customer satisfaction is not a motivation
               | for the company?
        
         | YeGoblynQueenne wrote:
         | >> To be fair, though, Tesla has no sensors other than cameras,
         | and I believe the Mercedes has a half dozen or so, several
         | radars and even a lidar.
         | 
         | Pf. Tensors beat sensors.
         | 
         | Just you wait for another garbanzillion miles driven. Then
         | you'll see.
         | 
         | /s
        
       | londons_explore wrote:
        | This is just a few days before the much-anticipated 'v11'
        | software release...
       | 
       | V11 supposedly uses neural nets for deciding the driving path and
       | speed (rather than hand coded C++ rules).
        
         | londons_explore wrote:
         | I would imagine v11 will be cancelled or at least massively
         | delayed to deal with this recall...
        
       | jonny_eh wrote:
       | Does a recall like this actually mean owners return their car to
       | Tesla for modification? Or would it be an over-the-air update to
       | remove FSD?
        
         | diebeforei485 wrote:
         | Why does the media not make this clear in the headline?
        
           | panick21_ wrote:
            | Because it gets more clicks. Reading about how Tesla will
            | lose lots of money because it has to recall 350k vehicles
            | is a juicy story, especially if it's about removing a major
            | feature; a slight software update to that feature is
            | boring.
        
           | ChickenNugger wrote:
           | They hate Musk ever since he took twitter away from them.
        
           | natch wrote:
           | Because Tesla does not pay for advertising in media outlets.
           | So many media outlets tend to have it in for Tesla.
        
         | tensor wrote:
         | Almost all of these "recalls" that make the news are just
         | software patches. The only difference between these and a
         | normal patch is that customers get an email about it.
         | 
         | I can understand why people might think that all these recalls
         | require going back to the shop, that's how most legacy makers
         | work to this day.
        
           | gowld wrote:
           | It's also the word "recall", as opposed to "urgent patch",
           | that makes people think that the car is going back to
           | manufacturer.
           | 
           | Non-Tesla automakers are not "legacy".
        
           | idop wrote:
           | The solution being (supposedly) easy doesn't discount the
           | severity of the issue. Owners must be informed and aware.
        
         | KaiserPro wrote:
         | That was my first thought.
         | 
          | Also, what percentage of people who paid for "FSD" needed
          | their car recalled?
        
           | acchow wrote:
           | This is 100%. It's a required software update on all the FSD
           | vehicles.
        
         | lattalayta wrote:
         | Seems like they should come up with a new phrase to distinguish
         | between making modifications to the car at a dealer or service
         | provider vs. a downloadable update, right?
        
           | panick21_ wrote:
           | Yes, this is just a historical term that nobody bothered to
           | change.
           | 
            | We have to remember that Tesla was the first company to
            | really do this, and even today almost no other company does
            | it. Most can update some parts of their system, but almost
            | none have anywhere close to the integration Tesla has.
            | 
            | So for 99% of recalls, it is a physical recall; it's just
            | Tesla where most of the time it isn't.
        
         | kube-system wrote:
         | > The auto safety regulator said the Tesla software allows a
         | vehicle to "exceed speed limits or travel through intersections
         | in an unlawful or unpredictable manner increases the risk of a
         | crash." Tesla will release an over-the-air (OTA) software
         | update, free of charge.
         | 
         | https://www.reuters.com/business/autos-transportation/tesla-...
         | 
         | Sounds like it's just a patch release with regulators involved.
        
           | andrewmunsell wrote:
           | I'm not sure how to reconcile that statement with how FSD
          | beta works. Fine, the speed limit thing is easy to fix: just
          | cap the max speed to the speed limit.
           | 
           | But the "[traveling] through intersections in an unlawful or
           | unpredictable manner" is _inherent_ to the FSD beta. Most of
          | the time it does "fine", but there are some intersections
           | where it will inexplicably swerve into different lanes and
           | then correct itself. And this can change beta-to-beta (one in
           | particular used to be bad, it got fixed at some point, then
           | went back to the old swerving behavior).
        
             | vilhelm_s wrote:
             | If they accepted a software update, I would assume it would
             | be about things that are deliberately programmed in, rather
             | than just fundamental limitations? But I agree it's a bit
             | odd. The longer description says
             | 
             | > The FSD Beta system may allow the vehicle to act unsafe
             | around intersections, such as traveling straight through an
             | intersection while in a turn-only lane, entering a stop
             | sign-controlled intersection without coming to a complete
             | stop, or proceeding into an intersection during a steady
             | yellow traffic signal without due caution. In addition, the
             | system may respond insufficiently to changes in posted
             | speed limits or not adequately account for the driver's
             | adjustment of the vehicle's speed to exceed posted speed
             | limit. [https://www.nhtsa.gov/vehicle/2020/TESLA/MODEL%2525
             | 20Y/SUV/A...]
             | 
             | "entering a stop sign-controlled intersection without
             | coming to a complete stop" sounds a lot like the thing they
              | _already_ issued a recall over in January 2020 (further
              | down the same web page), so it seems a bit odd that it
              | would still be an issue. And "due caution" to yellow
              | lights
             | seems like it could just be tuning some parameter. On the
             | other hand, failing to recognize turn-only lanes sounds
             | more like a failure of the computer vision system...
        
               | starbase wrote:
               | I believe the stop sign adjustment is to increase dwell
               | time at empty intersections. i.e. stop and count to two
               | before proceeding.
        
             | shoelessone wrote:
             | This is what I'm most interested in.
             | 
             | I'm hopeful about Tesla FSD, and don't think it necessarily
             | needs to be perfect, just significantly better than humans.
             | So I'm rooting for Tesla FSD/Autopilot overall. I just
          | don't see how, given the findings, there is a solution
          | without removing the entire FSD feature.
        
         | autonomousErwin wrote:
         | I had the same question. I think this is the original report:
         | https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V085-3451.PDF
         | 
         | On Page 4 it mentions the "Description of Remedy" as an OTA
         | update. Gives a new meaning to a recall!
         | 
         | https://howtune.com/recalls/ford/ford/1980/
         | 
         | Software > Stickers
        
       | lolinder wrote:
       | Note that this recall just means that there's a regulator-
       | mandated patch, not that FSD is being removed.
        
         | [deleted]
        
         | mdeeks wrote:
         | Both of my cars patched last night with "Bug fixes" as the
         | changelog. I don't think I've seen an update that only said
         | that before. I suspect that was this "recall". If that's the
         | case, then the recall (patch) is probably mostly done by now.
        
           | linsomniac wrote:
           | I've had 4-6 "misc bug fixes" patches since 2016, just FYI.
        
             | mrguyorama wrote:
             | If the changelog to my car's safety critical system was
             | limited to "misc bug fixes", I would never touch it again.
             | Either have extreme transparency into the system, or don't
             | have the system.
             | 
          | I level this complaint against my own car. The AEB radar
          | just stopped passing start-up self-tests for a while, and I
             | had no idea why, and could not find out without going to
             | the dealership. While waiting to be able to do that, it
             | started working again. I don't consider it a backup, or a
             | helper or anything. It's an independent roll of the dice
             | for an upcoming crash that I already failed to prevent. If
             | it does anything in that case then that's probably an
             | improvement, but I don't exactly trust it to save my bacon.
        
           | ryantgtg wrote:
           | I had that update message, too, though I thought I was just
           | installing that "adjustments to the steering wheel warmer"
           | update. I don't have FSD.
        
             | Groxx wrote:
             | Crash -> fire -> steering wheel is warmed.
             | 
             | Could still be related.
        
               | ryantgtg wrote:
               | For sure. Just another genius move in Elon's 4D chess
               | game.
        
       | sergiotapia wrote:
        | I thought that empirically, statistically, the number of
        | crashes from FSD was drastically lower than from human drivers.
        | Was this number hogwash, or is this just lawboys saying the
        | number must be 0 before it's allowed on the road?
       | 
       | If so, that's pretty crazy and people will die because of this
       | decision.
        
         | pja wrote:
         | You can't compare FSD crashes / mile with human data for a
         | couple of reasons.
         | 
         | The primary one is:
         | 
         | 1) FSD turns itself off whenever things get too difficult or
         | complex for it. Human drivers don't get to do this. Recent
          | crowdsourced data suggests a disengagement every 6 miles of
         | driving:
         | https://twitter.com/TaylorOgan/status/1602774341602639872
         | 
         | If you eliminate all the difficult bits of driving I bet you
         | could eliminate a lot of human driver crashes from your
         | statistics too!
         | 
          | A secondary issue, but one relevant if you care about honesty
          | in statistical comparisons, is:
         | 
         | 2) The average vehicle & driver in the non-Tesla population is
         | completely different. Unless you correct for these differences
         | somehow, any comparison you might make is suspect.
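
     The selection effect in point 1 can be illustrated with invented
     numbers: a system that hands back control for the hardest miles
     excludes them from its own denominator, so its per-mile rate looks
     better even if it is no safer on the miles it does drive.

```python
# Illustration of selection bias from disengagement (all numbers invented).
# Units: miles in millions, so rates come out as crashes per million miles.

easy_miles, hard_miles = 90, 10        # millions of miles driven
crashes_easy, crashes_hard = 45, 155   # hypothetical: hard miles are far riskier

# A human drives everything, so all miles and all crashes count.
human_rate = (crashes_easy + crashes_hard) / (easy_miles + hard_miles)

# A system that disengages for the hard miles only accrues the easy ones.
system_rate = crashes_easy / easy_miles

print(human_rate)   # 2.0 crashes per million miles
print(system_rate)  # 0.5 crashes per million miles
```

     With identical performance on the easy miles, dropping the hard
     miles from the denominator makes the rate look 4x better.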
        
         | Xylakant wrote:
         | The number was problematic for various reasons:
         | 
         | The usual comparison is vs. all cars on the road, but Teslas
         | are comparatively new cars with a lot of recent and expensive
         | safety features not available in older cars. They're also
          | likely to be maintained better. On top, they're also driven
          | by a different demographic, which skews accident statistics.
         | 
          | Tesla's autopilot can only be enabled in comparatively safe
          | and simple circumstances, yet the comparison is made against
          | all drivers in all situations. When autopilot detects a
          | situation it
         | can't handle, it turns off and hands over to the human who gets
         | a few seconds warning. Human drivers can't just punt the issue
         | and then crash.
         | 
         | Tesla FSD may be safer than human drivers for the limited set
         | of environments where you can use it, but last time I checked,
         | the numbers that Tesla published are useless to demonstrate
         | that.
        
           | valine wrote:
           | > Teslas autopilot can only be enabled in comparatively safe
           | and simple circumstances
           | 
           | This is incorrect. FSD can be enabled anywhere, from dirt
           | roads to parking lots, even highways and dense urban
           | environments like NYC. There is no geofence on FSD, you can
           | turn it on anywhere the car can see drivable space.
        
             | Xylakant wrote:
             | Does it enable in heavy rain or snow, on ice, fog? Does it
             | work in storm conditions with leaves and objects blown over
             | the street? Does it work in confusing traffic situations,
             | invalid signage, ...?
        
               | valine wrote:
               | I didn't say it works in those scenarios, I said it lets
               | you enable it.
               | 
               | It enables in snow and rain, yes. The only time it refuses
               | to engage is in extreme weather that obstructs the
               | cameras.
               | 
               | It gets confused a lot in heavy traffic but it will
               | attempt anything.
        
       | jdelman wrote:
       | The confusion from multiple HN posters here confirms that the
       | "recall" language is inadequate to capture what is happening
       | here.
       | 
       | Obviously attention should be drawn to the fact that there is a
       | critical safety update being pushed OTA, but "recall" is too
       | overloaded a term if it means both "we're taking this back and
       | destroying it because it's fundamentally flawed" vs. "a software
       | update is being pushed to your vehicle (which may or may not be
       | fundamentally flawed...)"
       | 
       | I do think something beyond "software update" is necessary,
       | though - these aren't your typical "bug fixes and improvements"
       | type release notes that accompany many app software releases
       | these days. I don't think it would be too difficult to come up
       | with appropriate language. "Critical Safety Update"?
        
         | redundantly wrote:
         | In this scenario 'recall' is a legal/compliance term. It's
         | appropriately used.
        
         | [deleted]
        
         | 2h wrote:
         | I think recall is just fine. Recall offers no ambiguity in my
         | mind. It means the manufacturer fucked something up, big time.
         | Everything else is just details.
         | 
         | In the current world of forced updates (looking at you
         | Android), the word "update" itself is kind of toxic, and
         | doesn't (and I would argue can't) represent what has happened
         | here, even if it's technically more correct.
        
         | tsgagnon wrote:
         | _Obviously attention should be drawn to the fact that there is
         | a critical safety update being pushed OTA, but "recall" is too
         | overloaded a term if it means both "we're taking this back and
         | destroying it because it's fundamentally flawed" vs. "a
         | software update is being pushed to your vehicle (which may or
         | may not be fundamentally flawed...)"_
         | 
         | How many times in history has a vehicle recall meant the cars
         | were returned and destroyed?
         | 
         | What makes this situation any more confusing than all the
         | previous times vehicles were recalled?
        
       | tevon wrote:
       | These articles really need to add a denominator to their numbers.
        
       | woeirua wrote:
       | Remains to be seen what the OTA patch will actually do. If they
       | could make FSD work correctly they would have _already_ done it.
       | So, my guess is more smoke and mirrors to mislead the NHTSA, and
       | then NHTSA will come down with the ban hammer on FSD and require
       | them to disable it entirely.
        
       | pellucide wrote:
       | Folks with Tesla FSD, please find an empty parking lot and drive
       | all you want. It helps Tesla's mission. Please don't test it on
       | real roads. At least not on busy roads.
        
       | idlewords wrote:
       | I love that this is the same guy who wants to send people to Mars
       | one-way and then figure out the return trip later.
        
         | moomoo11 wrote:
         | MOVE FAST
         | 
         | BREAK THINGS
         | 
         | (including people)
        
           | mrguyorama wrote:
           | Remember, if you consider other human beings as simply
           | objects for you to do what you like with, you won't feel as
           | bad when you inevitably get them killed
        
             | moomoo11 wrote:
             | dont forget you better like dank memes
             | 
             | like physically hit the like button
        
         | SCLeo wrote:
         | Once they are there, can't you just recall them back with OTA
         | updates?
        
         | dshpala wrote:
         | You'd be surprised how many people want a one-way ticket to
         | Mars. I think Elon can make good money from that alone.
        
           | pclmulqdq wrote:
           | I'm not sure that the people who want one of those will want
           | them if:
           | 
           | * they are not the first (or in the first ~50 or so) to get
           | there
           | 
           | * they have a lot of money and a lot of income on Earth
           | 
           | This is kind of a glory-seeking thing for people who haven't
           | made a name for themselves otherwise. Rich people would want
           | to have a return flight. That is not a recipe for making a
           | ton of money.
        
             | maxdo wrote:
             | Think of it as a startup: everyone who goes first gets
             | the benefits, owns land, and knows how to do business in
             | a constantly growing colony.
        
               | kibwen wrote:
               | Of all the possible reasons to go to Mars, "economic
               | opportunity" is not even remotely one of them.
        
             | hospadar wrote:
             | Gosh we can only hope all the rich people shoot themselves
             | to mars with no return trip though. What a dream that would
             | be for the rest of us.
        
             | maxdo wrote:
             | think of it this way: you go there, work for a company,
             | and in your spare time you find gold, raise capital to
             | start mining it, get rich, and build a huge house with
             | swimming pools and all the rich attributes, but on Mars.
             | Mars will be colonized by a great mixture of scientific
             | and enterprising people. Your kids will grow up in this
             | environment; they'll receive the best education and
             | endless room for possibilities.
        
               | mrguyorama wrote:
               | Surely Elon will allow lots of other people to get
               | rich, instead of just capturing any possible value for
               | himself, since he would literally be a god emperor.
               | 
               | There are no rules on Mars, except the ones made by the
               | people who could simply push you out the airlock. Look
               | at how Elon treats his workers and you will understand
               | how a Mars colony with his backing will look, except
               | even worse.
        
               | pclmulqdq wrote:
               | Sure buddy. You will be able to prospect for and mine
               | gold in your spare time after your one-way trip to Mars,
               | including building all of the equipment you need, or
               | having it sent over by rocket. And you'll be able to own
               | land there too, because homestead laws and stuff will
               | definitely carry over.
               | 
               | If you are already rich on Earth, you don't need to take
               | risks that insane to try. In other words, nobody who can
               | afford a ticket to Mars will pay for one.
        
               | p_j_w wrote:
               | I hate that I don't know if this is satire or not.
        
               | deckard1 wrote:
               | I feel like _The Expanse_ will probably be a more
               | accurate picture of the realities of a Mars colony
        
           | reaperducer wrote:
           | _You'd be surprised how many people want a one-way ticket
           | to Mars._
           | 
           | Can I nominate someone?
        
           | georgeburdell wrote:
           | Isn't Antarctica just Mars-lite? Do you think a mere treaty
           | is hampering settlement there? That's where my mind goes when
           | people talk about colonizing another planet.
        
             | mrguyorama wrote:
             | These people are incredibly stupid. They somehow think
             | colonizing Mars is easier than regulating the environment
             | here on Earth, somehow think Elon as dictator is better
             | than any Earth government, are interested in putting
             | undertested and sketchy implants into their brains even
             | without a proposed use case, and seem to think that the
             | only reason we don't regularly visit Mars is that we
             | don't have a big enough rocket, as if easy access to tons
             | of powdered rust were economically useful.
        
             | [deleted]
        
             | dmix wrote:
             | Flying to another planet on a rocket is obviously very
             | different, the goal is way more interesting than the
             | challenge of living in a hostile environment. Even if it's
             | ultimately a useless exercise for humanity.
        
         | brhsagain wrote:
         | I get that gloating when your enemies fail is one of life's big
          | pleasures, but would your opinion on the guy change if we
          | lived in an alternate universe where Tesla had gotten FSD
          | working?
        
           | JKCalhoun wrote:
           | To answer your question, no, I am utterly unenthusiastic
           | about FSD. When I want to leave the driving to someone else
           | I'll take the train.
           | 
           | I suppose I think that's better in the long run, better for
           | society in addition to being better for the planet. I know
           | it's not a popular take.
        
             | brhsagain wrote:
             | * * *
        
         | scottyah wrote:
         | Colonization is his goal, guaranteeing a return trip for
         | everyone seems pretty dumb.
        
           | ryandvm wrote:
           | > Colonization is his goal
           | 
           | Bullshit. Colonization _might_ be a side effect. Bilking
           | various governments out of subsidies and fat contracts is the
           | goal. Just like every other single Musk venture, it is based
           | entirely on feeding at the trough of federal largesse.
        
         | facorreia wrote:
         | Right, but to be eligible for the return trip you'd first
         | need to pay off the loan you took to go there, by working as
         | an indentured servant and buying your oxygen and food at
         | company-set prices, right?
        
           | Reki wrote:
           | What, you think people should be able to breathe for free?
        
             | [deleted]
        
             | stonogo wrote:
             | A scene in Heinlein's The Cat Who Walks Through Walls has
             | the protagonist sternly lecturing a ruffian on the
             | importance of paying for air in a domed city on the moon. I
             | still can't tell if Heinlein was serious; a later scene has
             | the same character demanding a transplanted foot be cut off
             | because he felt it indebted him.
        
               | mrguyorama wrote:
               | Heinlein might be the kind of person to genuinely believe
               | that. He also was the kind of person who would write a
               | book as an excuse to write a manifesto. Starship Troopers
               | was just him bitching about "kids these days"
        
             | shagie wrote:
             | Part of the plot of the episode Oxygen (Series 10, episode
             | 5) - https://en.wikipedia.org/wiki/Oxygen_(Doctor_Who)
             | 
             | It's an interesting watch (especially with the context of
             | labor relations in mind).
        
             | HideousKojima wrote:
             | Life imitates art, you ever seen _Total Recall_?
        
               | Tade0 wrote:
               | Or _Spaceballs_ for that matter? Wouldn 't be the first
               | idea inspired by that film.
        
         | DesiLurker wrote:
         | Yeah, wonder how the OTA updates will work for Mars
         | spacecraft, 'cause there is no air in between. Checkmate
         | Musk!
        
         | james_pm wrote:
         | No need to worry about the return trip if it's unlikely that
         | anyone would survive the trip there, right?
        
           | pavlov wrote:
           | "Six days after the radiation-wrecked colonists emerged from
           | the battered Starship, the Emperor of Mars condemned them all
           | to death by suffocation for 'insufficiently engaging with his
           | tweets'. Historians debate the meaning of this expression
           | found carved in a rock outside the base."
        
             | [deleted]
        
             | SketchySeaBeast wrote:
             | The settlement communication log will be just a series of
             | "I need people to be even more hardcore" declarations
             | coupled with lists of people thrown out of airlocks.
        
         | MengerSponge wrote:
         | My Name is Elon Musk and I Want to Help You Die in Space
         | 
         | https://www.mcsweeneys.net/articles/my-name-is-elon-musk-and...
        
         | mometsi wrote:
         | Anybody else reminded of that story where the guy gets a
         | sketchy neural implant, then goes to a dystopian mars
         | settlement, and then rips the self-driving feature out of a car
         | with his bare hands?
         | 
         | What was that called again?
         | 
         | / _puts on sunglasses preemptively_ /
        
           | b1zguy wrote:
           | Um, what is it?
        
             | joaogui1 wrote:
             | Total Recall
        
               | culi wrote:
               | Having never seen it, I have no context for which details
               | were left out or bent to fit the narrative, but this
               | still feels amazing. Well-earned sunglasses
        
             | jeffrallen wrote:
             | I Don't Recall
        
             | [deleted]
        
         | diebeforei485 wrote:
         | I don't think the one-way thing is real. SpaceX wants to re-use
         | their vehicles. It wouldn't make sense to leave spacecraft on
         | Mars.
        
           | jeffbee wrote:
           | Considering the energy cost of the return trip, it makes
           | abundant sense.
        
             | diebeforei485 wrote:
             | Propellant for the return trip can be made on Mars using
             | the Sabatier process. This has been known for decades.
             | 
             | (This is why the engines for Starship are designed to use
             | methane).
             | 
             | https://en.m.wikipedia.org/wiki/Mars_Direct
        
               | londons_explore wrote:
               | But it's probably cheaper to just make a new ship.
               | 
               | It's mostly made of iron and steel, which we have here
               | on Earth in abundance.
               | 
               | One day, the economics of return trips will work out. But
               | to begin with at least, all trips will be one-way.
        
               | mrguyorama wrote:
               | The CONCEPT has been known for decades. Who has put a
               | fuel processing facility on Mars? I think there was some
               | minor chemistry experiment on one of the rovers to go in
               | that direction.
               | 
               | The raw energy required to make that fuel, using ANY
               | conceivable process, is extreme though. What energy
               | solution has been proposed? How much solar acreage do you
               | need? How will you keep the dust off the panels? Will
               | there be any option other than sending panels from earth?
               | 
               | All the needed prereqs have zero practical knowledge with
               | them. Nobody has even had the chance to work out kinks
               | yet. It doesn't matter how much elon wants to do
               | something, new shit takes a lot of time, money, and
               | effort to shakedown.
        
               | kayodelycaon wrote:
               | > How will you keep the dust off the panels?
               | 
               | Assuming there are people involved, give them a broom. ;)
        
         | LoveMortuus wrote:
         | Under promise, over deliver
        
           | jskrablin wrote:
           | Over promise, never deliver.
        
             | hospadar wrote:
             | Over promise, accidentally buy a huge social media company
             | way over market value because of a prank, never deliver
        
         | Kiro wrote:
         | Why do you hate the idea of going to Mars so much?
         | 
         | https://idlewords.com/2023/1/why_not_mars.htm
         | 
         | I would love to go there, even if it meant I died the second I
         | step foot on it. Getting back would be the last thing on my
         | mind.
        
           | Tostino wrote:
           | There are plenty of other places you could go travel to where
           | you get to see some amazing wonders right before your
            | guaranteed death. Right here on earth, too. Why not go for
            | some of those? You can likely do it for cheaper, too.
        
             | JKCalhoun wrote:
             | My last thought as I gasped for air in the gripping cold
             | of the Martian plain was, "Wait, this is it?"
        
             | robswc wrote:
             | Not OP but IMO, nothing on earth would compare to another
             | planet.
             | 
             | I'm not into space exploration but sometimes I look up at
             | the moon and it blows my mind that we actually went there.
             | Almost feels impossible even. Mars is that x100.
             | 
             | I personally wouldn't want to die on Mars but I understand
             | the appeal.
        
             | starbase wrote:
             | There are plenty of people who would give their lives to
             | advance civilization for the sake of future generations.
             | The brief pleasure of seeing some amazing wonders is, in
             | comparison, irrelevant.
             | 
             | Death is guaranteed either way.
        
               | kibwen wrote:
               | There is no "advancement" to be gained by merely sending
               | tourists to die uselessly on a tomb world.
        
               | starbase wrote:
               | Colonization is not tourism.
        
           | maxlamb wrote:
           | That literally describes suicide, after 9 months of being
           | trapped in a small spacecraft ("even if it meant I died the
           | second I step foot on it").
        
           | gnulinux wrote:
           | > Why do you hate the idea of going to Mars so much?
           | 
           | Because there is so much to do on earth? Literally everyone
           | is here. What other answer does anyone else need? All this
           | Mars fandom sounds to me like clinically depressed sci-fi.
        
       | w0mbat wrote:
       | An automatic software update is not a recall.
        
         | freejazz wrote:
         | Apparently, it is!
        
         | asdff wrote:
         | Why is it not? A recall just means there is something wrong
         | that needs to be fixed in all affected vehicles. Whether the
         | fix is a software update or a new screw is irrelevant.
        
         | therealcamino wrote:
         | IANAL! But maybe it's a recall when 49 CFR 573 says it is.
         | Defects or noncompliance with motor vehicle safety standards
         | are what make it a recall. It's not defined as whether you have
         | to take it to the dealer or not.
        
       | clouddrover wrote:
       | It's irrelevant anyway. None of those cars will be retrofitted
       | with Hardware 4:
       | 
       | https://electrek.co/2023/02/15/tesla-self-driving-hw4-comput...
       | 
       | So much for the "full self-driving" fantasy all those people paid
       | for but will not get.
        
       | oblib wrote:
       | Splitting hairs on the definition of "recall" of a potentially
       | deadly flaw in 362,758 automobiles is silly. But it is the
       | correct word for that industry when there is one.
       | 
       | I worked at an Oldsmobile dealer in the 80s and fixed all kinds
       | of
       | issues on cars that were "recalled" and that is what we called it
       | way back then and long before it. Some were trivial and others
       | were serious safety issues.
       | 
       | https://www.kbb.com/car-advice/what-do-i-need-to-know-about-...
        
         | 1970-01-01 wrote:
         | The news headlines coming out are also interesting. 'Tesla
         | Recalls' vs 'Tesla Voluntary Recalls' are technically both
         | true, however the former is a much more popular headline while
         | being less precise.
        
       | thepasswordis wrote:
       | It seems like there is a huge disconnect between people who know
       | about FSD from _using it_ and people who know about FSD from
       | things they read on the internet.
       | 
       | I have FSD, I use it every single day. I love it. If every car on
       | the road had this the road would be a _substantially_ safer
       | place.
        
         | system16 wrote:
         | As advanced cruise control? It's great. As what it's marketed
         | as? It's fraud.
        
           | rvnx wrote:
           | Wait til they repack a variant of ChatGPT as Tesla bot and
           | take all the credit for it.
        
             | panick21_ wrote:
             | Didn't Musk partly create OpenAI?
        
             | ghqst wrote:
             | Pretty on brand for Elon
        
               | rvnx wrote:
               | AI model improvements:
               | 
               | * Optimus cannot say bad things about Elon Musk anymore.
               | 
               | * Added support for Donald Trump.
               | 
               | * Added support for The Boring Company flamethrower.
               | 
               | * Advertising for Dogecoin.
        
               | ghqst wrote:
               | * ChatGPT will now cite Elon Musk tweets as a source for
               | 50% of questions.
        
               | rvnx wrote:
               | Twitter as ultimate source of truth
        
         | bandyaboot wrote:
         | I suspect that the NHTSA knows about FSD from all of the
         | above and more.
         | 
         | Even if Tesla's current implementation were objectively safer
         | than the average human driver, it would still represent a net
         | negative on safety because of the negative impact the glaring
         | flaws will have on peoples' confidence in self driving tech.
        
         | [deleted]
        
         | aredox wrote:
         | Like Wozniak? https://www.techspot.com/news/97563-steve-
         | wozniak-slams-dish...
        
         | yumraj wrote:
         | It seems there's a huge disconnect between people who know how
         | to drive and are confident in their skills, and those who are
         | scared to drive and want to offload it to Musk.
         | 
         | I'm actually afraid to drive behind a Tesla and either keep
         | extra distance or change lanes if possible. I still have
         | more faith in humans to not randomly brake than in _beta_
         | FSD.
         | 
         | It's one thing to put your own life in the hands of this _beta_
         | model, and it's another to endanger the life and property of
         | others.
        
           | cptskippy wrote:
           | > I'm actually afraid to drive behind a Tesla and either keep
           | extra distance or change lanes if possible.
           | 
           | Irrespective of the Teslas or FSD:
           | 
           | If you're afraid of an accident due to the vehicle in front
           | of you braking regardless of the circumstances, then you're
           | following too closely. It doesn't matter if the braking event
           | is anticipated or unexpected. If you're not confident in your
           | ability to avoid an accident if the vehicle in front of you
           | slams on their brakes, then you are following too closely.
        
             | yumraj wrote:
             | > If you're afraid of an accident due to the vehicle in
             | front of you braking regardless of the circumstances, then
             | you're following too closely.
             | 
             | I know what you're saying, but that is not what I'd meant.
             | Also, I don't follow too closely.
             | 
             | The difference here is that a driver will _almost
             | always_ brake depending on what's happening in front of
             | them. So if you pay attention not only to the car in
             | front, but to the cars in front of them and in the
             | neighboring lanes and so on, you can sense and detect
             | patterns in how a particular driver is driving. There
             | are many skittish drivers who brake every second and
             | some who don't, and so on. Basically, based on your
             | driving experience you can predict a little.
             | 
             | The problem here is that this stupid POC FSD will brake
             | randomly or change lane randomly or whatever, so there is
             | no way you can predict, and hence my concern and issue with
             | it. I just prefer to change the lane, but that's not always
             | an option.
        
               | cptskippy wrote:
               | > Basically based on your driving experience you can
               | predict a little.
               | 
               | Yes, we all do that AND you're using that predictability
               | to take liberties in safety such as following too
               | closely. FSD's unpredictability exposes the vulnerability
               | in your driving process and makes you feel uneasy.
               | 
               | You (and everyone else) can follow too closely AND FSD
               | can be an unsafe steaming pile of crap. It's not an
               | either or situation.
        
           | thepasswordis wrote:
           | Yeah it's not that I'm scared of my own driving, I mean that
           | if _other_ people had it it would be safer. Things which have
           | happened in the last 3 _days_ of moderate driving around a
           | city:
           | 
           | 1) A car gets impatient at some traffic turning right on a
           | green light in a construction zone, and comes _into my lane_
           | meaning into _oncoming traffic_. We have to swerve out of the
           | way and slam on the brakes to avoid them.
           | 
           | 2) A guy gets impatient behind me slowing down over a
           | speedbump, tailgates me a few _inches_ behind and then passes
           | me by cutting into oncoming traffic in a no passing zone,
           | then cuts me off, again with a few inches to spare, and
           | speeds through the neighborhood where we eventually meet at
           | the stoplight at the end of the road.
           | 
           | 3) Every single day people run red lights. Almost every light
           | where there are people turning left at a green there are 4-5
           | cars that go through the red light after it turns.
           | 
           | My safety score on my Tesla is 99. I am an extremely safe
           | driver. I wish FSD was more common because so many people are
           | _terrible_ drivers.
        
             | bena wrote:
             | Your Tesla tells you you're a safe driver. Well, geez, I'm
             | convinced that the $55,000 (maybe+?) you spent tells you
             | things you like to hear. I mean, one of the criteria is
             | "Late Night Driving". Why not driving between the hours of
             | 4pm and 6pm? When there are a lot more cars on the road,
             | which is just statistically less safe.
             | 
             | https://www.tesla.com/support/safety-score#version-1.2
             | 
             | And looking at the rest of the criteria, they're ok, but
             | hardly comprehensive. This is like, the bare minimum. It
             | doesn't measure what I would call "doing stupid shit". Like
             | crossing several lanes of traffic to get in a turn lane.
             | Forcibly merging where you shouldn't. Nearly stopping in
             | the middle of traffic before merging into a turn lane at
             | the last minute. Straddling lanes because you're not sure
             | if you really want to change lanes or not. Making a left
             | turn from a major artery onto a side street at a place that
             | is not protected by a light. Coming to a near complete stop
             | for every speed bump, then rushing ahead to the next.
             | 
             | And a host of other things that demonstrates the person
             | does not consider other people on the road at all.
             | 
             | Here's an entire article about how to game the safety
             | score:
             | 
             | https://cleantechnica.com/2021/10/14/three-quick-tips-
             | for-a-...
             | 
             | One of the tips is to "drive around the neighborhood when
             | traffic is light".
             | 
             | And the car doesn't ding autopilot for behaviors it would
             | knock a human for. Because the assumption is that the car
             | would know better I guess. But then why isn't the safety
             | score simply a deviation from what autopilot _would_ do in
             | a situation? If autopilot would brake hard to avoid a
             | collision, shouldn't you?
        
               | thepasswordis wrote:
               | The Tesla doesn't just tell me I'm a safe driver, it also
               | gives me a cheaper insurance rate due to the safe
               | driving. The safety score is related to my monthly
               | insurance premium.
               | 
               | So maybe it's to stroke my ego? But they're also putting
               | their proverbial money where their mouth is.
        
               | bena wrote:
               | The insurance you get from Tesla, which has a vested
               | interest in its own Safety Score. It's not giving a
               | cheaper insurance rate "due to the safe driving", it's
               | giving a cheaper rate due to having a better score on the
               | metrics it decided.
               | 
                | You see how that's circular, right? It does not mean you
               | are a safe driver.
        
         | screye wrote:
         | In fact, there is a huge disconnect between people who design
         | similar systems (Vision, safety, self driving, robotics,
         | simulations) and those who use them. The criticism of Tesla is
         | coming from experts, not jealous competitors.
         | 
          | The problem with systems where 99% is a failing grade is that
          | 99% of the time they work great. The other 1%, you die. No one
         | is against FSD or safer tech. They are against Tesla's
         | haphazard 2d vision-first FSD.
         | 
         | Wanna hear about this new fangled tech that avoids 100% of
         | accidents, has fully-coordinated swarm robotics between all the
          | cars, never swerves and can operate in all weather + lighting
          | conditions? It's called a street car.
         | 
          | The words 'road' and 'safety' should never be uttered in the
          | same sentence.
        
           | buildbot wrote:
           | Yep, I used to be in this field back in 2018, and everyone
            | was extremely dismissive of the rest of the industry's
            | skepticism.
        
           | ajross wrote:
           | > The other 1% you die
           | 
           | Um... no one died. I understand your point is hyperbole, but
           | you're deploying it in service to what you seem to claim is
           | serious criticism coming from experts. Which experts are
           | claiming FSD beta killed someone? They aren't. So... maybe
           | the "experts" aren't saying what you think they're saying?
           | 
           | > Wanna hear about this new fangled tech that avoids 100% of
           | accidents, has fully-coordinated swarm robotics between all
           | the cars, never swerves and can operate in all weather +
            | lighting conditions? It's called a street car.
           | 
           | And this is just laughably wrong. Street cars (because they
           | operate on streets) get into accidents every day, often with
           | horrifying results (because they weigh 20+ tons). They are
           | surely safer than routine passenger cars, but by the
           | standards you set yourself they "aren't safe" and thus must
           | be banned, right?
        
         | rootusrootus wrote:
         | I have it (subscription, just wanted to try it out). I won't
         | use it again. It confidently attempted to drive me into a
         | rather solid object with very little warning. If every other
         | car on the road had this, I would sell all my cars and forbid
         | family members from getting anywhere near the road.
         | 
         | It's nice on the freeway though.
        
           | qup wrote:
           | To me it's weird you can voice an opinion like "I won't use
           | it" and then say it's nice on the highway, as if you've given
            | it enough of a trial there to endorse it, when you
            | otherwise think it's going to be killing people.
        
             | CharlesW wrote:
             | How is it weird that FSD might work well on extremely
             | simple, boring stretches of road but fall apart as
             | complexity increases?
        
             | bdcravens wrote:
             | Pretty sure almost crashing into a solid object seems good
             | enough for giving it a failing score.
        
         | ajross wrote:
         | Ditto. Most fun I've had in a vehicle in my whole life. A robot
         | drives me around every day, and the worst misfeatures are that
         | it's too timid at stop signs and left turns, makes poor lane
         | choices sometimes (oops, can't turn, will go around),
         | occasionally zipper merges like a jerk (yes, technically you
         | can use that lane, but if you try you'll get honked at -- car
         | doesn't care).
         | 
         | But the company is run by an asshole who people love to hate,
         | so... everything it does is Maximally Wrong. There's no space
         | for reasoned discourse like "FSD isn't finished but it's very
         | safe as deployed and amazingly fun". It's all about murder and
         | death and moral absolutism. The internet is a terrible place,
         | but I guess we knew that already. At least in the meantime we
         | have a fun car to play with.
        
         | baguettefurnace wrote:
         | This has been my experience as well - I use autopilot a lot and
          | find it works great.
        
         | helf wrote:
         | Until your car decides to randomly slam on its brakes and cause
         | a pile up.
        
           | mrdatawolf wrote:
            | Because only a FSD car causes pileups... you're right, I
            | NEVER saw 90-car pileups on the news before FSD. /s
           | 
           | I'm pointing this out because he said (in italics)
           | "substantially".
        
             | micromacrofoot wrote:
             | It doesn't really matter, we have multiple incidents of FSD
             | causing accidents due to outright mistakes, not even "a
             | human would have messed up here too" situations.
        
               | sebzim4500 wrote:
               | Humans also mess up in situations that aren't "a human
               | would have messed up here too".
        
               | micromacrofoot wrote:
               | Sure, but say I consider myself a diligent driver and
               | I've never caused an accident... what does FSD have to
               | offer me? Yet another random opportunity for a car to
               | fail me. Why would I surrender control for that?
        
               | panick21_ wrote:
                | Do we? Because most of the news reports about crashes
                | blamed on Autopilot or FSD turn out, after
                | investigation, not to be such cases.
               | 
                | In one famous case the police claimed that the driver
                | was '100% not in the driver's seat'. This caused a huge
                | media storm and anti-Tesla wave.
               | 
               | Just recently the full report came out and stated that
                | Tesla AP was not used at all and the driver was driving
                | normally.
               | 
               | https://electrek.co/2023/02/09/tesla-cleared-highly-
               | publiciz...
               | 
                | There are quite a few such cases, this one maybe being the
               | one that caused the most media attention.
               | 
                | So I tend to discount all such reports unless it's been
                | several months and a full investigation report has
               | been done.
               | 
               | Do you have links to verified reports of FSD causing such
                | crashes?
               | 
                | It seems to me that phantom braking could lead to such
               | issues, but I have not yet seen a real report that claims
               | this happened.
        
           | 12345hn6789 wrote:
           | If someone brake checks and you hit them, you're just as much
           | of a problem. Learn defensive driving and stop driving so
           | close to another driver. Keep a safe distance.
        
           | dpkirchner wrote:
           | I'm reminded of a thread on HN where someone did a cross
           | country trip and their car only phantom-braked a couple of
           | times, and called that a huge success.
        
             | rootusrootus wrote:
             | Ha, my new Model 3 phantom braked the first time on the
             | trip home from the service center, which was only 10 miles.
             | I doubt it had even gone over 10 miles on the odometer at
             | that moment.
        
               | DarmokJalad1701 wrote:
               | How were you able to put it into autopilot? Typically,
                | the autopilot is calibrating itself for the first 60 or
                | so miles of driving, and it does not let you engage it.
        
               | nanidin wrote:
               | My 2022 Model Y didn't require any calibration to use
               | autopilot after initially picking it up. IIRC there were
               | 7 miles on the odometer at the time.
        
               | rootusrootus wrote:
               | Good question! It didn't argue one bit, literally drove
               | it off the lot at the service center and hit the freeway
               | and turned AP on (it's my second Model 3, so I already
               | have such habits).
               | 
                | It's entirely possible it had more than 5 miles on the
                | odometer when I picked it up. Officially the paperwork
               | says 15, the associate who gave me the car said it was
               | actually less, and the odometer isn't very prominent so I
               | never even looked. Maybe it had 50 miles and was a reject
               | ;). I should go check TeslaFi, since I reenabled that the
               | day I bought the car. I confess that the only way I ever
               | know how many miles my car has is when I see it on
               | TeslaFi.
        
           | thepasswordis wrote:
           | If my car slams on its brakes, and this causes you to hit me,
           | then you are _exactly_ the type of person I wish was using
           | FSD.
           | 
           | You're following too close to me for the conditions, you're
           | distracted, or some combination of both.
           | 
           | My car might have to emergency brake for any number of
           | reasons. This should never cause a pileup, and if it does
           | it's your fault, not mine.
        
         | [deleted]
        
         | ceejayoz wrote:
         | "Works on my machine!"
        
         | AYBABTME wrote:
         | It really depends where you drive, it behaves super well in
         | some conditions, and awfully in others. Hence the wide
         | disparity in experience.
        
         | mbreese wrote:
          | I have a feeling the differences between experiences are
          | heavily influenced by where you are located. There is no way
          | they can adequately train across all of the locations where
          | FSD is expected to work.
        
           | natch wrote:
           | Because of this, if Tesla solves it they are going to have an
           | almost insurmountable moat.
        
           | andrewinardeer wrote:
            | Australian here. We drive on the "wrong" side of the road.
            | I have no idea if FSD even accommodates this different
            | scenario. And even if it did: being from Melbourne, with
            | its diabolical yet somewhat logical hook turns (turning
            | right from the furthest left lane), I'd bet those don't
            | fall within FSD's scope.
           | 
           | https://en.wikipedia.org/wiki/Hook_turn?wprov=sfla1
        
         | enslavedrobot wrote:
         | I have the same experience. I think if there was more
         | transparency in the data around FSD related accidents the
         | conversation would be different. The last evidence we have is
         | Tesla stating that there have been no fatalities in 60 million
         | miles of driving under FSD. Pretty good so far.
        
           | pxx wrote:
           | Regardless of whether or not that's true, that's also not the
           | impressive number you think it is. Human-driven cars have a
           | fatality rate of about 1 per 100 million miles travelled, and
           | that's including drunk drivers, inattentive drivers, and all
           | environments, not just the ones people are comfortable
           | turning on these features in.
        
             | enslavedrobot wrote:
             | Yes. The data collection continues. But this initial report
              | indicates that FSD is probably not massively worse than a
              | human.
        
           | Veserv wrote:
           | There is no such evidence beyond unsupported proclamations by
           | the vendor who, by the way, has compulsively misrepresented
            | the capabilities of the product for years. The only
            | evidence available that is not tainted by a complete
            | conflict of interest is basically YouTube videos and
            | self-reported information [1] from investors and superfans,
            | which on average show critical driving errors every few
            | minutes; even those are skewed by self-interest toward
            | over-representing the product.
           | 
           | Tesla's statements should only be believed if they stop
           | deliberately hiding the data from the CA DMV by declaring
            | that FSD does not technically count as an autonomous driving
           | system and is thus not subject to the mandatory reporting
           | requirements for autonomous systems in development. If they
            | did that, then there could actually be sound independent
            | third-party statements about their systems. Until that time
           | their claims should be as trusted as Ford's were on the
           | Pinto.
           | 
           | [1] https://www.teslafsdtracker.com/
        
             | enslavedrobot wrote:
             | Absolutely need more data, but if a public company makes
             | untruthful claims about the past they are in big trouble.
             | They are mostly free to make claims about the future
             | without much consequence.
             | 
             | This is why startups are allowed to project massive sales
             | next year but Elizabeth Holmes is headed to jail.
             | 
             | If it turns out that FSD has had a fatal crash and Tesla
             | lied about it Musk is headed to jail too.
        
           | rvnx wrote:
            | Yes, if we forget that FSD may conveniently disconnect
            | right before an accident.
        
             | CydeWeys wrote:
             | Yeah, let's now hear the stats on "fatal accidents within
             | 30s of FSD being enabled".
             | 
             | You really just can't trust Tesla at all, about anything.
             | There's no integrity there. They won't even be honest with
             | the name of their headline feature!
        
             | bhauer wrote:
          | For Tesla's collection of data on accidents with Autopilot
          | technologies, it considers Autopilot to have been enabled
          | during the crash if it was on several seconds beforehand.
             | These systems may hand control back to the driver in crash
             | scenarios moments before impact, but that doesn't mean the
             | data will report that the system was not enabled for the
             | incident.
        
         | breck wrote:
         | Exactly.
         | 
         | I think a lot of "people" online are paid shills.
         | 
         | HN should severely downrank any "opinion" posts from anon
         | accounts.
         | 
         | I have a Tesla with FSD and it's incredible. Though it drives
         | way worse than me, it drives pretty darn good, and will save a
         | ton of lives and change the world by freeing up a lot of time
         | for people to do more important things.
        
           | gg-plz wrote:
           | Is your post supposed to be read as genuine or sarcastic? It
           | looks to me like a strawman of a Tesla owner stereotype, and
           | I don't want to accidentally eat the onion here by taking it
           | at face value.
           | 
           | Your post appears to be saying that Tesla's FSD is way worse
           | than human drivers, not safer than them, but we should still
           | welcome it for the convenience. Again, this reads like a
           | strawman. I would like to think the average Tesla owner would
           | not endorse that statement.
           | 
           | The goal isn't to save time at the cost of lives, it's to
           | save both, non?
        
             | breck wrote:
             | > is way worse than human drivers
             | 
             | No, I said it's way worse than _me_. Probably around
             | performance of the average human driver.
             | 
             | I am training to be an astronaut, so on the advanced side.
        
               | gg-plz wrote:
                | A lot of people say they're significantly better than
                | the average driver. It's such a stereotypical thing to
                | say that it probably has negative truth value on
                | average,
               | since it reflects broad personal confidence more than
               | actual driving skill.
               | 
               | You still sound like you're trying to present the weakest
               | possible straw man for Tesla critics to attack. Stop this
               | bad faith attempt to make Tesla look bad; you're dragging
               | down the discourse. Nobody is going to take your bait and
               | feed you the expected counter arguments to the nonsense
               | you expressed up-thread.
        
         | root_axis wrote:
         | FSD is terrible. I have it, it's dangerous. If I had to go 5
         | miles through city streets I'd sooner trust a human after 3
         | shots of tequila before I'd trust FSD every time.
        
           | alfor wrote:
           | Do you have the latest beta?
           | 
           | How did it improve in the last 12 months?
        
             | psychomugs wrote:
             | That other drivers are beta testers for software
             | controlling a large mass of metal and combustibles hurtling
             | down asphalt at _ludicrous_ speeds is something you'd write
              | into a dystopian story.
             | 
              | That this is fact, not fiction, is terrifying.
        
               | alfor wrote:
                | It's a choice; you are in control all the time.
        
               | CharlesW wrote:
               | You can _re-take_ control, which may take seconds
               | depending on the mental and physical state of the driver.
               | A lot can happen in a second.
        
               | ceejayoz wrote:
               | The other drivers on the road, and pedestrians, aren't
               | getting to make that choice.
        
               | psychomugs wrote:
               | https://www.sae.org/publications/technical-
               | papers/content/20...
               | 
               | "This study investigated the driving characteristics of
               | drivers when the system changes from autonomous driving
               | to manual driving in the case of low driver alertness.
               | 
               | ...
               | 
               | In the results, a significant difference was observed in
               | the reaction time and the brake pedal force for brake
               | pedal operation in the low alertness state, compared to
               | the normal alertness state of the driver, and there were
               | instances when the driver was unable to perform an
               | adequate avoidance operation."
        
             | cma wrote:
             | Musk has the latest internal release that will fix every
             | problem at the end of the month, every month.
        
             | root_axis wrote:
             | Not sure if it's the latest, and nowadays I rarely use it,
             | and when I do it's just to show off to curious friends, but
             | I live in an east coast metro and I've _never_ been able to
             | use FSD for more than 20 minutes without it making a
              | dangerous mistake - 90% of the time it's been a mistake
             | making a left turn, turning into the wrong lane or oncoming
             | traffic after making the left, not correctly respecting the
             | traffic lines or the left turn lights and other very scary
             | situations.
        
               | alfor wrote:
                | So it's good enough that you let it take control and
                | become less attentive?
                | 
                | I was thinking that you get a feel for what kinds of
                | conditions it behaves correctly in (highway, low
                | traffic) and then use it in progressively more
                | difficult situations as it proves to you it can handle
                | them consistently.
                | 
                | I know it's not ready yet. I've heard and seen so many
                | positive reviews (but still with corner cases and
                | strange behaviours).
        
               | root_axis wrote:
                | I've not had any FSD issues on the highway besides
                | phantom braking: it keeps lanes correctly, safely
                | adjusts to traffic speeds, and merges into the offramp
                | lane without issue (though it still makes me nervous).
                | Once it gets onto city streets though...
        
           | billfor wrote:
           | Disagree. I've had the beta for a year and drive in NYC. It
           | has been mostly flawless considering all scenarios it has to
           | handle. It does make the occasional mistake but (1) it's a
           | beta and (2) I'm responsible for monitoring and correcting
           | it. Agree that those that expected more should be entitled to
           | a refund if they so desire.
        
             | root_axis wrote:
             | > _It does make the occasional mistake_
             | 
             | Yeah, that's what I mean by "dangerous".
        
               | panick21_ wrote:
                | Ok, but don't you have to weigh that against the
                | potential of avoiding accidents by reacting quicker
                | than you as a human can?
                | 
                | I have definitely seen cases where FSD Beta stopped and
                | the driver didn't instantly understand why.
               | 
               | Or simpler cases where the car follows a lane while the
               | driver wasn't paying attention (adjusting the radio or
               | whatever). Those can easily lead to an accident but are
               | unlikely to if you are in FSD.
               | 
               | How do you make that calculation?
        
               | root_axis wrote:
               | The calculation is easy, FSD does the wrong thing so
               | often that there's no question a human driver is safer.
        
               | panick21_ wrote:
                | The question is not whether FSD makes mistakes, it's
                | whether FSD plus a monitoring human driver is worse
                | than a human driver alone.
        
             | ceejayoz wrote:
             | Consider the possibility that you might _both_ be right.
             | Two people may have vastly different experiences with FSD,
             | depending on the routes they take and other variables.
        
               | ricardobeat wrote:
               | Also some people may have a very low tolerance for the
                | car not driving like they would. It does make you
                | uncomfortable.
        
         | conductr wrote:
         | I've never used it but in my book "Full Self-Driving Beta" is
         | an oxymoron
        
       | joewadcan wrote:
       | Recall = Over the Air Update.
       | 
       | Not nothing, but not as big as CNBC is making it out to be.
        
       | peoplearepeople wrote:
       | So at what point is a refund required?
        
       | causi wrote:
       | I'm astonished consumers are _still_ paying fifteen thousand
       | dollars to be guinea pigs for this bullshit. No car that
       | currently has tires on the road is going to let you fall asleep
       | in your driveway and wake up at work.
        
         | valryon wrote:
         | But you can fall asleep and wake up at work with public
         | transportation.
        
           | millzlane wrote:
            | No one is waking you for your stop lol.
        
             | toast0 wrote:
             | They might if you live and work at the end of the line.
        
             | gwbas1c wrote:
             | When I had the luxury of taking public transportation to
             | work, I did wake someone up once.
             | 
             | I personally can't fall asleep on trains.
        
             | panki27 wrote:
             | Have you tried asking other passengers?
        
             | kevincox wrote:
             | You could set an alarm. Worst case there is a delay and you
             | wake up slightly early. The subway isn't going to magically
             | go significantly faster.
        
               | vel0city wrote:
               | I've missed trains that left a station ahead of schedule
               | by a minute or two in my regional metro system. I've also
                | had buses go past a bus stop earlier than the posted
               | time.
        
             | bbarnett wrote:
              | This seems like a cool app idea: a bus-route-aware "wake
              | me 3 minutes before my stop" vibrate app.
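              | 
              | A minimal sketch of the trigger such an app would need
              | (hypothetical names; assumes periodic GPS fixes plus a
              | rough speed estimate):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_alert(pos, stop, speed_mps, lead_s=180):
    """Vibrate when the estimated time to the stop drops below lead_s."""
    if speed_mps <= 0:
        return False  # bus is stopped; no useful ETA yet
    eta_s = haversine_m(pos[0], pos[1], stop[0], stop[1]) / speed_mps
    return eta_s <= lead_s
```

              | (Called on each location update; a real app would also
              | need to know which stop on the route is yours.)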
        
               | jakear wrote:
               | You can do this pretty easily with the Shortcuts app
               | built in to iOS. I have a flow to text my friend whenever
               | I'm near their house.
        
               | panick21_ wrote:
               | The Swiss train up already does this. It also tells you
               | exactly where to go with a plan of the station, how full
               | it is where in the train, where the wagon for food is and
               | tons of other things.
        
               | cassianoleal wrote:
               | CityMapper notifies you a stop or 2 before yours.
        
               | kgermino wrote:
               | There's a few. I use one called Transit
        
             | kgermino wrote:
             | Your phone can.
             | 
             | Beyond that, you may be surprised how many people would
             | wake you up for your stop. Especially in US Commuter Rail
             | systems people tend to follow similar habits and get to
             | know each other over the years. Even on a normal city bus I
             | quickly (months) got used to the pattern of the same people
             | getting on and off at their various stops. If you're asleep
             | at your stop someone can usually nudge you
        
           | neither_color wrote:
           | Only if you can even get a seat, and don't have to change
           | buses/trains.
        
         | mdasen wrote:
         | The "still" is what surprises me. Back in 2016 or 2017, I was
         | excited. Musk still seemed like a pretty stable person back
         | then and Tesla had basically executed on its promises and Musk
         | said that by the end of 2017 you'd be able to drive from NY to
         | LA without a single touch on the wheel. Everyone had said that
         | EVs were doomed and Tesla really showed otherwise. Why not
         | believe him? I like being hopeful about the future.
         | 
         | At this point, it's looking pretty far off, especially for
         | Tesla. Even Cruise and Waymo are having issues and they have
         | far better sensors than Teslas. It seems silly to be paying
         | $15,000 for something that realistically might not happen
         | during my ownership of the vehicle.
         | 
         | Even if it does happen, I can purchase it later. Sure, it's
         | cheaper to purchase with the car because Tesla wants the money
         | now. However, I'd rather hedge my bets and with investment
         | gains on that $15,000 it might not really cost more if it
         | actually works in the future. 5 years at 9% becomes $23,000 and
         | I don't think Tesla will be charging more than $25,000 for the
         | feature as an upgrade (though I could be wrong). If we're
         | talking 10 years, that $15,000 grows to $35,500 and I can
         | potentially buy a new car with that money.
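          | 
          | A quick sanity check of that compound-growth arithmetic
          | (assuming simple annual compounding at 9%):

```python
def future_value(principal, rate, years):
    """Future value of a lump sum under annual compounding."""
    return principal * (1 + rate) ** years

# $15,000 at 9%/year:
five_yr = future_value(15_000, 0.09, 5)    # ~23,079, i.e. the ~$23,000 figure
ten_yr = future_value(15_000, 0.09, 10)    # ~35,510, i.e. the ~$35,500 figure
```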
         | 
         | Plus, there's the genuine possibility that the hardware simply
         | won't support full self driving ever. Cruise and Waymo are both
         | betting that better sensors will be needed than what Tesla has.
         | Never mind the sensors, increased processing power in the
         | future might be important. If current Teslas don't have the
         | sensors or hardware ultimately needed, one has paid $15,000 for
         | something that won't happen. Maybe we will have self driving
         | vehicles and the current Teslas simply won't be able to do that
         | even if you paid the $15,000.
         | 
         | It just seems like a purchase that isn't prudent at this point
         | in time. The excitement has worn off and full self driving
         | doesn't seem imminent. Maybe it will be here in a decade, but
         | it seems to make a lot more sense to save that $15,000 (and let
         | it grow) waiting for it to be a reality than paying for it
         | today.
        
         | jehb wrote:
         | I mean, to be honest, I'm completely unwilling to pay _any_
          | additional cost for a license to proprietary software on my
          | vehicle, and I won't even consider a brand that _offers_
         | paid software upgrades to their vehicles, entirely on
         | principle. See the recent case around BMW offering
         | subscriptions for heated seats. Just no, absolutely not. Reject
         | faux ownership.
        
         | Izkata wrote:
         | Taking the bus is pretty close, just don't sleep on another
         | passenger / through your stop.
        
         | gwbas1c wrote:
         | After having a Tesla with Enhanced Autopilot for 4 years, I
         | decided it wasn't worth the $7,000 extra and stuck with
         | ordinary autopilot.
         | 
         | It's just too damn glitchy to be worth thousands of dollars
         | extra.
        
           | rootusrootus wrote:
           | Hell, right now I'd pay a few bucks to disable AP so I could
            | use the old-school cruise control. I still don't understand
            | Tesla's opposition to making that an option whether you
            | have AP or not.
        
             | noduerme wrote:
             | Teslas don't have _cruise control_?! The arrogance...
        
               | xeromal wrote:
               | They do.
        
               | xdavidliu wrote:
               | they meant "save a few thousand dollars so my Tesla can
               | just have the ordinary cruise control that it comes with,
               | without the autopilot which I believe to be unusable"
        
               | gwbas1c wrote:
               | No, I didn't mean that at all.
               | 
               | All Teslas come with Autopilot. It's adaptive cruise
               | control and automatic steering that will keep you in your
               | lane. It's awesome, and I love it.
               | 
               | "Enhanced" autopilot, which currently is $6000 extra, is
               | automatic lane change, automatically taking exits,
               | summon, and automatic parking.
               | 
               | The automatic lane change is nice, but it fails too often
               | to be worth thousands of dollars. All the rest of the
               | features are basically party tricks.
               | 
               | If I could have automatic lane changes for much less, I'd
               | happily pay extra for it.
               | 
               | Edit: Enhanced Autopilot was $7k extra when I bought my
               | 2nd Tesla, now it's $6k extra.
        
               | vel0city wrote:
               | > All Teslas come with Autopilot. It's adaptive cruise
               | control and automatic steering that will keep you in your
               | lane.
               | 
               | So like every Honda Accord as well.
        
             | ggreer wrote:
             | To engage traffic aware cruise control, pull the right
             | stalk down once.
        
               | 15155 wrote:
               | But make sure you have your seatbelt on first!
        
         | [deleted]
        
         | adampk wrote:
          | When someone can drive from Palm Springs to SF with no
          | interventions, does that make it not bullshit anymore?
        
           | olliej wrote:
           | FSD requires you to be 100% in control of the vehicle for
           | the entire drive, paying full attention to the driving,
           | while doing nothing.
           | 
           | This is demonstrably not a task that anyone can do, let
           | alone Joa[n] Average car driver. Highly trained pilots are
           | not expected to do that, and that's in aircraft that give
           | them significant time to handle an autopilot failure:
           | when autopilots go wrong at cruising altitude, pilots can
           | have tens of seconds, or even minutes, to handle whatever
           | has gone wrong. Tesla's FSD gives people a couple of
           | seconds before impact.
           | 
           | That said in countries other than the US people can reliably
           | use trains and buses, which also means that they don't have
           | to intervene in driving the vehicle.
        
             | xdavidliu wrote:
             | > Joa[n]
             | 
             | nit: Jo[a]n
        
               | olliej wrote:
               | nooooo! :D
        
             | trinix912 wrote:
             | > That said in countries other than the US people can
             | reliably use trains and buses, which also means that they
             | don't have to intervene in driving the vehicle.
             | 
              | Most of Europe doesn't have ideal public transport
              | either; I'd imagine South America and Africa are the
              | same or even worse in this respect. It gets drastically
              | worse the moment you want to go somewhere in the
              | countryside.
        
             | causi wrote:
             | Yeah I don't understand the "legitimate" use case. I would
             | imagine people who actually shell out the money get their
             | car onto the interstate then strap a weight to the steering
             | wheel until their exit comes up. Having to watch the road
             | without the stimulation of actively driving is worse than
             | having no assistance at all.
        
             | [deleted]
        
           | vore wrote:
           | Sure, but they can't.
        
           | onlyrealcuzzo wrote:
           | If you have a 99% chance of death - 1% of people are still
           | going to make it.
           | 
            | You obviously don't have a 99% chance of death - but the
            | mere fact that something is possible does not mean it is
            | not BS.
           | 
           | You can drive drunk from PS to SF also.
           | 
           | What's your point?
        
             | RajT88 wrote:
             | I know a Tesla owner. The chance of death (well, a crash
             | anyways) is more like 1%, according to him.
             | 
             | Those odds are good enough for the occasional trip home
             | from a cocktail party, but hardly cost-competitive with
             | Rideshares/Taxis.
             | 
             | How many 9's do we need before we can say it's reliable
             | enough to trust it? A few more, for sure.
        
               | rootusrootus wrote:
               | > How many 9's do we need before we can say it's reliable
               | enough to trust it? A few more, for sure.
               | 
                | Yeah, 1% is definitely not going to cut it. What are
                | the odds of dying when a human is at the wheel?
                | Something like 0.000025% if my napkin math is right
                | and my assumptions are in the right ballpark.
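That napkin math is easy to reproduce. A rough sketch, where the inputs are assumptions not stated in the thread: roughly 1.3 traffic fatalities per 100 million vehicle-miles (approximately the recent US rate) and an arbitrary illustrative 20-mile trip:

```python
# Assumptions (not from the thread): ~1.3 deaths per 100M
# vehicle-miles, roughly the recent US rate; 20 miles is an
# arbitrary illustrative trip length.
deaths_per_mile = 1.3 / 100_000_000   # per vehicle-mile
trip_miles = 20
trip_risk = deaths_per_mile * trip_miles

print(f"per-mile fatality risk: {deaths_per_mile:.1e}")
print(f"per-trip fatality risk: {trip_risk:.1e} = {trip_risk:%}")
```

On those assumptions a ~20-mile trip comes out at about 2.6e-7, i.e. 0.000026%, the same ballpark as the figure above; a 1% per-trip crash chance would be several orders of magnitude worse (though crashes and fatalities are of course not the same measure).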
        
           | latchkey wrote:
           | I'd love to see how one would do in the streets of Saigon.
           | Just a couple blocks would be enough... if it gets that far.
           | 
           | Update: https://insideevs.com/news/498137/autopilot-defeated-
           | congest...
        
           | [deleted]
        
           | kube-system wrote:
           | The technology is mostly fine. The bullshit is mostly the
           | false expectations they set, and the subsequent risks
           | they choose to take in implementation in order to
           | minimize how far they miss those expectations. Other
           | driver-assistance systems try to do less, but they do it
           | honestly.
        
           | causi wrote:
           | _When someone can drive from Palm Springs to SF_
           | 
           | Does that mean "stay in the correct lane on the interstate
           | and take the proper exits without hitting anything" or does
           | that mean "begin inside your garage and end inside another
           | garage five hundred miles away without touching anything"?
           | The first one is nearly trivial.
           | 
           | It stops being bullshit when they stop telling the user they
           | need to pay attention to the road, and not one second before.
        
             | smachiz wrote:
              | I wouldn't describe it as trivial. My Tesla would
              | phantom-brake consistently in a lot of spots -
              | overpasses were a very big trigger.
              | 
              | Also, any time I'd pass an exit I wasn't taking while
              | in the right lane, it would veer and slow down
              | aggressively; same with trying to speed-match a car
              | rapidly accelerating on an on-ramp.
              | 
              | Bottom line, there's absolutely nothing here that
              | isn't driver assist; everything requires a significant
              | amount of vigilance unless you're driving in a
              | straight line, in the left lane, on an empty highway
              | with no one entering or exiting.
        
             | trinix912 wrote:
              | Imagine a scenario: someone on a bike is riding down
              | the street towards a pedestrian crossing. There's a
              | vehicle (truck, bus...) on the left blocking the view.
              | The cyclist swiftly crosses the road, the car doesn't
              | detect anything because there's nothing visible to
              | detect (the truck/bus is in the way), bam!
              | 
              | Many drivers would have seen the cyclist earlier and
              | been very careful when passing the other vehicle
              | (although we don't predict correctly 100% of the time
              | either). I haven't had a chance to test any self-
              | driving system yet, so I'm legitimately interested: do
              | those systems reliably detect such things?
              | 
              | Not to mention people (esp. kids) racing through
              | intersections on electric scooters, casually ignoring
              | traffic rules...
        
         | ChickenNugger wrote:
         | If this comment were aimed at any other company it would be
         | flagged and dead as flamebait.
        
         | TekMol wrote:
         | Waymo does.
        
         | idlewords wrote:
         | Only a horse can do that.
        
         | DesiLurker wrote:
          | How about asleep in your driveway to an ER hospital bed?
          | Does that qualify? It might excel at that.
        
       | 1970-01-01 wrote:
        | I found an interesting phrase in the official report. NHTSA
        | formally says that not respecting 'local driving customs' is
        | a defect:
        | 
        |     ...the feature could potentially infringe upon local
        |     traffic *laws or customs* while executing certain
        |     driving maneuvers...
        | 
        | Do they want Tesla to create a DB of 'allowed' and 'locally
        | faux pas' driving maneuvers? It sure reads like they do.
        
         | idop wrote:
         | Yes, and good that they do.
        
       | swyx wrote:
       | is this the largest car recall in history? holy crap
       | 
       | edit: looks like no.. the largest was 578607 cars... also by
       | Tesla lol
        
         | ceejayoz wrote:
         | That's definitely not the largest.
         | 
         | Toyota, 2.9M: https://www.consumerreports.org/car-recalls-
         | defects/toyota-r...
         | 
         | Ford, 21M: https://247wallst.com/autos/2021/07/24/this-is-the-
         | largest-c...
         | 
         | Takata airbags, possibly 100M+:
         | https://www.carvertical.com/blog/top-10-worst-car-recalls-in...
         | 
         | > Yet even after the company declared bankruptcy in 2017, the
         | Takata recall kept on giving. 65-70 million vehicles with
         | faulty Takata airbags were recalled by the end of 2019, with
         | approx. 42 million still to be recalled.
        
         | mdeeks wrote:
         | The word recall is misleading. It's a software update. My cars
         | both updated last night. I'm guessing it was this patch. If
         | that's the case then the "recall" is probably largely already
         | done.
        
           | swyx wrote:
           | well yeah if thats the case its not really a recall, agreed
        
         | sixQuarks wrote:
         | Where do you anti-Tesla folks get your information? Just wow
        
       | jeffbee wrote:
       | The word "assertive" does not appear in the comments, but
       | this recall is about the "Assertive" FSD profile, which
       | speeds and runs stop signs. It is not a defect; it is a
       | design flaw.
        
         | t0mas88 wrote:
          | It's a defect in Tesla's engineering teams that they
          | release that kind of thing. Their self-driving is a fraud;
          | it does not work.
         | 
         | Their CEO has claimed "it will be ready next year" for
         | literally 8 years now. How much more bullshit is he going to
         | sell?
        
           | TheCoelacanth wrote:
           | It's a defect in the US legal system that Musk didn't see the
           | inside of a jail cell five minutes after that feature was
           | inflicted upon public roads.
        
           | jeffbee wrote:
           | Yeah. I'm just saying it's not a mistake, they did this on
           | purpose.
           | 
           | Whatever PEs were involved in shipping this need to have
           | their licenses reviewed.
        
       | nmca wrote:
        | Because this is a mandatory OTA patch and does not involve
        | stock being physically returned or taken in for repair (as
        | is typically implied by "recall"), this headline is clearly
        | very misleading.
        
         | [deleted]
        
       | panick21_ wrote:
       | I wish all the ink spilled on controversy over Tesla were
       | instead spilled on basic safety improvements to the road
       | system that are cheap and proven to save many lives.
       | 
       | Let's be real here: automated driving or not, having actually
       | safe roads prevents death and harm in many cases.
       | 
       | The hyper-focus on high-tech software by all the agencies
       | engaged in 'automotive security' is misplaced. What they
       | should actually do is point out how insanely unsafe and
       | broken the infrastructure is, especially for people outside
       | of cars.
       | 
       | See: "Confessions of a Recovering Engineer: Transportation for a
       | Strong Town"
        
         | 2h wrote:
         | I disagree. If someone is a bad driver and causing crashes,
         | we don't say "we need to improve the road system". We
         | suspend the driver's license until they can prove they are
         | capable of being a safe driver. We should hold this
         | software to the same standard. Until it can demonstrate
         | safety at or above human level, it should be outlawed.
         | 
         | Road systems should always be worked on, but when a crash
         | happens it's usually the driver's fault, except in a
         | minority of cases where bad road engineering is to blame.
         | This self-driving is fucking up enough that it cannot be
         | blamed on the roads anymore, if it ever could.
        
           | panick21_ wrote:
           | > I disagree. If someone is a bad driver, and causing
           | crashes, we don't say "we need to improve the road system".
           | 
           | Well yes, and that is literally exactly the problem. That
           | is exactly why it's so unsafe in the US: instead of
           | building a safe system, everything is blamed on people.
           | 
           | In countries that take road safety seriously, every crash is
           | analyzed and often the road system is changed to make sure it
           | does not happen again. That is why places like Finland,
           | Netherlands and so on have been consistently improving in
           | terms of death and harm caused by the road system.
           | 
           | Again, the book I linked goes into a lot of detail about road
           | safety engineering.
           | 
           | > We suspend the drivers license until they can prove they
           | are capable of being a safe driver.
           | 
           | An unsafely designed street often leads to situations
           | where even good drivers intuitively do the wrong thing.
           | Again, this is exactly the problem.
           | 
           | If you build a system where lots of average drivers have
           | accidents, then you have a lot of accidents.
           | 
           | > We should hold this software to the same standard. Until it
           | can demonstrate safety at or above human level, it should be
           | outlawed.
           | 
           | Yes, but it's a question of how much of our limited
           | resources should be invested in analyzing and validating
           | each piece of software from each manufacturer. In
           | general, software like Tesla AP would likely pass such a
           | test.
           | 
           | I am not against such tests but the reality is that resources
           | are limited.
           | 
           | > Road systems should always be worked on, but when a crash
           | happens its usually the drivers fault, except in a minority
           | of cases where bad road engineering is to blame.
           | 
           | I strongly disagree with this statement. It's a totally
           | false analysis. If a system is designed in a way known to
           | be non-intuitive and to lead to a very high rate of
           | accidents, then it's a bad system. Just calling everybody
           | who makes a mistake a bad driver is a terrible, terrible
           | approach to safety.
           | 
           | Once you have a safe road system, if somebody is an
           | extremely bad driver, yes, taking that person off the
           | road is good. However, in a country where so much of the
           | population depends on a car, that punishment can
           | literally destroy a whole family. So just applying it to
           | anybody who makes a mistake isn't viable, especially in a
           | system that makes it incredibly easy to make mistakes.
           | 
           | The numbers don't even show the full problem: the unsafe
           | road system leads to fewer people walking in the US, and
           | it still produces a high rate of deaths among people who
           | walk.
        
           | jjcon wrote:
           | Tesla publishes their safety data quarterly - no need for
           | all these assumptions and speculations. Teslas are
           | already much safer than the average driver, especially
           | when Autopilot is on.
           | 
           | https://www.tesla.com/VehicleSafetyReport
        
             | super_flanker wrote:
              | Autopilot is not FSD. This report is clearly about
              | Autopilot; it doesn't mention FSD at all.
        
             | dmix wrote:
             | One interesting bit from a different 3rd party study:
             | 
             | > Cambridge Mobile Telematics also found that people
             | driving Teslas were 21% less likely to engage in distracted
             | driving with their phone in their Tesla compared to when
             | they drove their other car.
             | 
              | Maybe the software integration helps avoid this? Lots
              | of other cars have much more complicated interfaces
              | for hooking up calls and reading texts. My mom
              | struggled to figure out that her car even supported
              | Android Auto.
             | 
              | It might just be that it's a higher-end car, but they
              | didn't see the same effect for Porsche:
             | 
             | > These findings include an analysis of Tesla drivers who
             | also operate another vehicle. These drivers are nearly 50%
             | less likely to crash while driving their Tesla than any
             | other vehicle they operate. We conducted the same analysis
             | on individuals who operate a Porsche and another vehicle.
             | In this case, we observed the opposite effect. Porsche
             | drivers are 55% more likely to crash while driving their
             | Porsche compared to their other vehicle.
             | 
             | The reduction in speed is likely influenced by automated
             | driving, especially considering how fast a Tesla car can
             | accelerate vs normal cars:
             | 
             | > They were 9% less likely to drive above the speed limit.
             | 
             | https://electrek.co/2022/05/27/tesla-owners-less-likely-
             | cras...
        
               | normaljoe wrote:
               | > Maybe the software integration helps avoid this?
               | 
                | With Autopilot on, the car is watching you. If you
                | take your eyes off the road, it issues more "pay
                | attention" nags. Failure to comply removes FSD Beta.
                | So you have a feedback loop where paying attention
                | becomes more important than your phone.
        
           | sacrosancty wrote:
           | [dead]
        
         | dzhiurgis wrote:
         | I spent 6 months driving a 2019 Hyundai with lane assist
         | and radar cruise control. Personally, I found it almost
         | perfect. I'd just add some smart-road features to improve
         | lane assist and sign readability (to drop the speed as I
         | enter a town or a curve and increase it when I leave one).
         | I don't need full self-driving, just an improvement on
         | speed control and some lane assist.
         | 
         | Would smart roads be expensive? RFID transponders seem
         | super cheap compared to how much actual asphalt costs.
         | Authorities are currently unable to remotely control
         | flows, speeds and safety, which is completely bonkers.
        
           | panick21_ wrote:
           | > Would smart roads be expensive? RFID transponders seem
           | super cheap compared to how much actual asphalt costs.
           | Authorities are currently unable to remotely control
           | flows, speeds and safety, which is completely bonkers.
           | 
           | Yeah, it's not actually that easy. Go and look into train
           | signaling. And cars aren't even able to couple. Making
           | the road system operate like a super-railway of cars is
           | crazy difficult and has never been done before.
        
         | orthecreedence wrote:
         | If we're actually serious about this, we should get rid of
         | roads altogether and replace them with rail tracks. That solves
         | 95% of the automation problem anyway.
        
           | SomaticPirate wrote:
           | US passenger rail is unsafe. We have nowhere near the
           | level of sophistication of European passenger rail
           | transport.
           | 
           | We have a significantly higher number of derailments.
           | Even the worst European rail is safer than US rail.
        
             | p_j_w wrote:
             | >US passenger rail is unsafe.
             | 
             | US passenger automobiles are more unsafe.
        
             | [deleted]
        
             | jjk166 wrote:
             | In the US trains have 17 times fewer deaths per passenger-
             | mile than cars, and even then less than 1% of deaths from
             | trains are passengers (the overwhelming majority are
             | trespassers).
             | 
             | That it could be even better does not mean it is not a
             | substantial improvement.
        
           | panick21_ wrote:
           | Yes, I am absolutely for that in a lot of cases. But
           | let's be real: you are not going to end cars anytime
           | soon. Lots of cars exist, and more will exist.
           | 
           | Even with the largest possible investment in rail, cars
           | will exist in large numbers.
           | 
           | So yeah, rail and cargo tramways in cities are great. But
           | we can't just leave car infrastructure unchanged.
           | 
           | Especially because existing car infrastructure is already
           | there and cheap to modify. Changing a 6-lane road into a
           | 3-lane road with large space for bikes and pedestrians is
           | pretty easy.
        
         | marricks wrote:
         | Something is changing and it's very broken, so people are
         | paying attention to it.
         | 
         | If people really want to fix transportation, that's great;
         | high-speed rail and public transportation reducing the
         | number of cars on the road seem to be the best solution.
         | 
         | But hey, Elon's Hyperloop was a publicity stunt to
         | discourage investment in that. So I say, whether you want
         | to shit on Tesla or public roads, shit on Elon.
        
           | panick21_ wrote:
           | > reducing the number of cars on the road seem to be the best
           | solution.
           | 
           | No, actually it's not. Less congestion in a system that
           | depends on congestion for safety will lead to more
           | accidents, not fewer.
           | 
           | That is what was shown during Covid: less driving, but
           | more accidents per mile.
           | 
           | So yes, of course public transport and bikes are great,
           | but if you don't fix the underlying problems in the road
           | system, you are going to have a whole lot of accidents.
           | 
           | > But hey, Elon's hyper loop was a publicity stunt to
           | discourage investment in that.
           | 
           | This is a claim some guy made, not established truth.
           | What is more likely is that Musk actually thinks
           | Hyperloop is great (it's his idea, after all) and would
           | have wanted investment in it.
           | 
           | > shit on Elon
           | 
           | I prefer not to shit on people most of the time.
           | 
           | Musk is the outcome of a South African/American way of
           | thought that is more in line with the US average than
           | most people who advocate for public transport are. That
           | is the sad reality.
           | 
           | And the problems in the US road system, or the US's bad
           | public transport, 100% cannot be blamed on him. There are
           | many people with far more responsibility who deserve to
           | be shit on far more.
        
             | philosopher1234 wrote:
             | What is the underlying problem? I keep looking for it in
             | your comments
        
       | apnew wrote:
       | > The FSD Beta system may cause crashes by allowing the affected
       | vehicles to: "act unsafe around intersections, such as traveling
       | straight through an intersection while in a turn-only lane,
       | entering a stop sign-controlled intersection without coming to a
       | complete stop, or proceeding into an intersection during a steady
       | yellow traffic signal without due caution," according to the
       | notice on the website of the National Highway Traffic Safety
       | Administration.
       | 
        | Does anyone have insights into what QA looks like at Tesla
        | for FSD work? Because all of these seem like table stakes
        | before even thinking about releasing the FSD beta.
        
         | varjag wrote:
         | That's the thing about neural networks: any QA is going to be
         | superficial due to their statistical black box nature.
        
           | ChickenNugger wrote:
           | What? Black box testing has plenty of techniques:
           | https://en.wikipedia.org/wiki/Black-box_testing
           | 
           | Whether it's a neural network inside or not is completely
           | irrelevant. That's why it's called "black box".
        
             | varjag wrote:
             | Practical neural networks operate in enormous parameter
             | spaces that are impossible to meaningfully test for all
             | possible adversarial inputs and degraded outputs. Your FSD
             | could well recognize stop signs in your battery of tests
              | but not when someone has drawn a squirrel on one with
              | a green sharpie.
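A minimal sketch of what black-box QA for a perception model can look like in practice: metamorphic testing, where small label-preserving perturbations of an input must not flip the model's output. Everything here is hypothetical (the `classify` function is a toy stand-in, not any real FSD component); the point is that even a large battery of such checks only samples tiny neighborhoods of the input space:

```python
import numpy as np

def classify(image):
    """Hypothetical stand-in for a vision model: calls an image a
    'stop sign' if it is predominantly red, else 'background'."""
    red_fraction = (image[..., 0] > 0.6).mean()
    return "stop sign" if red_fraction > 0.5 else "background"

def metamorphic_check(image, n_perturbations=100, noise=0.05, seed=0):
    """Apply small random pixel perturbations that should not change
    the label, and return the fraction of perturbed copies whose
    prediction agrees with the original prediction."""
    rng = np.random.default_rng(seed)
    base = classify(image)
    agree = sum(
        classify(np.clip(image + rng.normal(0.0, noise, image.shape),
                         0, 1)) == base
        for _ in range(n_perturbations)
    )
    return agree / n_perturbations

# A solid red square: a clear-cut input for the toy model.
red = np.zeros((32, 32, 3))
red[..., 0] = 0.9
print(classify(red), metamorphic_check(red))
```

Passing such checks only raises confidence for the neighborhoods actually sampled; the sharpie-squirrel stop sign above is exactly the kind of input a battery like this is unlikely to ever cover.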
        
               | dtheodor wrote:
               | Your run of the mill computer program also "operates in
               | enormous parameter spaces that are impossible to
               | meaningfully test for all possible adversarial inputs and
               | degraded outputs".
        
               | varjag wrote:
                | This is hardly similar, as the state of a typical
                | computer program can be meaningfully inspected,
                | allowing both useful insights for adversarial test
                | setups and the design of comprehensive formal tests.
        
               | dtheodor wrote:
                | Right, if you consider the internal state, it is
                | hardly similar. You talked about black box and QA,
                | though. Black box by definition treats the internal
                | state as irrelevant, and QA mostly treats the
                | software it tests as a black box - in other words,
                | the tests are "superficial", as you call it.
        
               | heleninboodler wrote:
                | Black-box testing in typical software is, however,
                | _less_ superficial, because the tester can make
                | inferences and predictions about what inputs will
                | affect the results. When you're testing a drawing
                | program, for example, you may not know how the
                | rectangle tool actually works internally, but you
                | can make some pretty educated guesses about what
                | types of inputs and program state affect its
                | behavior. If the whole thing is controlled by a
                | neural network with a staggering amount of internal
                | state, the connections you can draw are much, much
                | more tenuous, and consequently confidence in your
                | test methodology is significantly harder to come by.
        
               | [deleted]
        
               | JPLeRouzic wrote:
                | Something a bit similar is a clinical trial, and it
                | is accepted without problem.
                | 
                | You run a black-box test on several thousand
                | (sometimes only hundreds of) patients, and if the
                | patients who received the drug do better than the
                | patients who received the placebo, the drug is
                | usually approved for commercialization.
                | 
                | Yet one isolated patient may be subject to several
                | comorbidities, her environment could be unusual, and
                | she could be taking other drugs (or coffee, OTC
                | vitamins, or even pomelo) without having declared
                | it. In the recent past, women were often excluded
                | from clinical trials because being pregnant makes
                | them very _"non-standard"_.
        
               | 988747 wrote:
                | First of all, clinical trials are typically longer
                | and more thorough than you imagine; they span years.
                | The fact that COVID vaccines were fast-tracked gives
                | people the wrong idea about this.
                | 
                | Secondly, even after the product hits the market,
                | the company is still responsible for tracking any
                | possible adverse effects. They have a hotline where
                | a patient or doctor can report them, and every
                | single employee or contractor (including
                | receptionists, cleaning staff, etc.) is taught to
                | report such events through proper internal channels
                | if they accidentally learn about them.
        
               | rodgerd wrote:
               | > Something a bit similar is clinical trial and it is
               | accepted without problem.
               | 
               | Clinical trials also have strict ethical oversight and
               | are opt-in. If clinical trials were like Teslas, we'd
               | yeet drugs into mailboxes and see what happened.
        
             | erikerikson wrote:
              | This seems to ignore that if you look inside the box
              | at code, you can understand it, whereas looking at
              | activation values is unlikely to illuminate much.
        
           | dreamcompiler wrote:
           | Exactly. It's the same reason that no amount of unit tests
           | can replace formal methods for safety-critical software, and
           | we cannot apply formal methods to neural nets [yet].
        
             | IshKebab wrote:
              | That's not really true. Most safety-critical software
              | is tested without formal verification; the testing is
              | just really, really thorough and rigorous.
              | 
              | Formal verification is obviously _better_ if you can
              | do it. But it's still really, really difficult, and
              | plenty of software simply can't be formally verified.
              | Even in hardware, where the problem is a lot easier,
              | we've only recently gotten the technology to formally
              | verify a lot of things, and plenty of things are still
              | out of reach.
              | 
              | And even if you _do_ formally verify some software, it
              | doesn't guarantee it is free of bugs.
        
         | [deleted]
        
         | yumraj wrote:
         | > Does anyone have insights on what QA looks like at Tesla for
         | FSD work?
         | 
         | Yes. An army of Tesla owners perform the QA, in production.
        
           | [deleted]
        
           | dawnerd wrote:
           | Well first it goes to influencers that say it's perfect and
           | good for stable release no matter what the car does!
           | 
            | But in all seriousness, they do have a small team that
            | validates it; then it goes to employees.
        
         | londons_explore wrote:
         | I suspect they have thousands of tests, but ship code that
         | passes only most of the tests...
         | 
         | That's what makes it unfinished...
         | 
          | It's never passed the 'drive from New York to LA with nobody
          | touching the controls' test...
        
         | jabagonuts wrote:
          | As a human being and motor vehicle operator of many decades,
          | I have done all of the above multiple times (very
          | infrequently), both on purpose and by accident. I'm looking
          | forward to the
         | days when self-driving vehicles are normal, and human drivers
         | are the exception. Until then, I'm glad companies and
         | regulators are holding the robots to a higher standard than the
         | meat computers.
        
           | worik wrote:
           | > As a human being and motor vehicle operator of many decades
           | I have done all of the above, multiple times
           | 
           | Time to stop driving. That is not normal
        
         | justapassenger wrote:
         | > Does anyone have insights on what QA looks like at Tesla for
         | FSD work? Because all of these seem table-stakes before even
         | thinking about releasing the BETA FSD.
         | 
         | Tesla is not exactly in love with QA. Especially for FSD.
         | 
          | FSD is mainly two things:
          | 
          | 1. (By far the most important) A shareholder-value-creating
          | promise that has been "solved" for 6 years, according to
          | their CEO.
          | 
          | 2. A software engineering research project.
          | 
          | What FSD is not is a safety-critical system (which it should
          | be). They focus on cool ML stuff and shipping features, with
          | total disregard for how to design, build, and test
          | safety-critical systems. Validation and QA are basically
          | non-existent.
        
           | panick21_ wrote:
            | Do you have actual knowledge of Tesla's internal QA
            | processes, or any kind of source at all?
           | 
            | Based on their presentations, they certainly have a whole
            | load of tests, many built directly from real-world
            | situations that the car has to handle. They simulate
            | sensor input and check that the car does the right thing.
           | 
           | They very likely have some internal test drivers and before
           | the software goes public it goes to the cars of the
           | engineers.
           | 
            | Those are just some of the things we know about.
           | 
            | I have no source on their approach to testing
            | safety-critical systems, but we do know that they have a
            | lot of software that has passed all the tests run by major
            | governments. They are one of the few (or the only) car
            | makers fully compliant with a number of US standards on
            | automatic braking. We have many real-world videos where
            | other cars would have killed somebody and the Tesla
            | stopped based on image recognition.
            | 
            | So they clearly have some idea of how to do this stuff.
            | 
            | So when people make these claims, I would like to know
            | what they are based on. It might very well be true that
            | their processes are insufficient, but I would want to see
            | some actual data. Part of what a government could do is
            | force car makers to open up their QA processes.
            | 
            | Or the government could (should) have its own open test
            | suite that a car needs to be able to handle, but clearly
            | we are not there yet.
        
             | JoshCole wrote:
             | I strongly feel people ought to have these discussions
             | while consistently citing actual data sources relevant to
             | the discussion.
             | 
             | For example, did you predict, based on the speculation of
             | Tesla being incompetent with regard to safety, that they
             | have the lowest probability of injury scores of any car
             | manufacturer? Because they do.
             | 
             | Did you predict, based on speculation about Elon Musk's
             | incompetence in predicting that self-driving would happen,
             | that there are millions of self-driving miles each quarter?
             | Because there are.
             | 
              | Did you predict, based on speculation about Tesla's
              | incompetence in full self-driving, that the probability
              | of accident per mile is lower rather than higher in cars
              | that have self-driving capabilities? Because it is.
             | 
             | I know this sort of view is very controversial on Hacker
             | News, but I still think it is worth stating, because I
             | think people are actually advocating for policies which
             | kill people because they don't actually know the data
             | disagrees with their assumptions.
             | 
             | https://www.tesla.com/VehicleSafetyReport
        
               | justapassenger wrote:
                | Unaudited (internal Tesla) data, cherry-picked
                | (comparing their very young fleet of expensive cars
                | with the average car in the USA, a 12-year-old
                | beater), with no correction for bias (highway vs.
                | non-highway driving being one of many issues), is not
                | exactly the silver bullet you think it is.
               | 
               | Also, none of that is self driving. This data talks about
               | AP, not FSD. FSD is also not self driving by any means
               | (it's level 2 driver assist), but that's a detail at this
               | point.
        
               | super_flanker wrote:
               | This report is for Autopilot, not FSD which everyone else
               | is talking about on HN.
        
               | prewett wrote:
               | Interesting graph, I like that it's broken out into
               | quarters. But,
               | 
                | 1) Those are statistics for the _old_ version; the new
                | version might be completely different. I've had enough
                | one-line fixes break entire features I was not aware
                | of that my view is that _any_ change invalidates all
                | the tests (including the tests that Tesla should have
                | but doesn't). A given update probably doesn't cause
                | changes outside its local area, but I can't rely on
                | that until it's been tested.
               | 
               | 2) the self-driving is presumably preferentially enabled
               | for highway driving, which I assume has fewer accidents
               | per mile than city driving, so comparing FSD miles to all
               | miles is probably not statistically valid.
        
               | JoshCole wrote:
               | I agree with you. I would really like to see datasets
               | that reflect how things actually are. I think it would be
               | really dangerous to jump to FSD being safe on the basis
                | of the data I shared. However, I would hope that
                | whatever opinions people share are congruent with the
                | observed data. The prediction that Elon Musk and Tesla
                | don't care about safety does not seem congruent with
                | the observed data, which shows Autopilot improving
                | safety.
               | 
                | Just for context - I've been in a self-driving
                | vehicle. Anecdotally, someone slammed on the brakes
                | once. The car stopped for me, but I was shocked: for
                | hours before that the traffic hadn't changed; it was a
                | cross-country trip. I think I would probably have
                | gotten into an accident there. Also anecdotally, there
                | have been times when I felt the car was not driving
                | properly, so I took over. I think it could have gotten
                | into an accident.
               | 
               | Basically, for me, the best explanation I have for the
               | data I've seen right now is that human + self-driving is
               | currently better than human and currently better than
               | self-driving.
        
               | freejazz wrote:
               | Please ignore all the times I'm wrong in favor of all the
               | times I'm right!
        
               | JoshCole wrote:
               | I agree that people who don't cite the evidence are
               | ignoring the evidence? Are you trying to say I'm doing
               | that by pointing to relevant datasets which track the
               | number of accidents and the probability of injury? If so,
               | why are there accidents tracked in the datasets such that
               | the rate can be calculated? This kind of contradicts the
               | claim that I'm asking to ignore, but I definitely agree
               | that other people are ignoring the data if that is what
               | you are trying to say.
        
               | freejazz wrote:
               | No, your argument is just ridiculous. The standard isn't
               | and shouldn't be how much they get right. It should be
               | what they get wrong and how they do that. I completely
               | disagree with your point, and phrasing it obtusely just
               | makes you obnoxious from a conversational standpoint.
        
               | JoshCole wrote:
               | My position is that we ought to include assertions backed
               | by the evidence. Your views probably do have evidence
               | that supports them. I want to see the evidence you are
               | using, because I think that is important.
               | 
               | I'm not sorry that annoys you, because it shouldn't.
        
               | freejazz wrote:
               | >Oh. So you don't like the data, because it disagrees
               | with you. So you are trying to pretend I'm ignoring data,
               | even though I'm linking to summary statistics which by
               | their nature summarize the statistics rather than
               | ignoring the statistics.
               | 
                | Oh, the data is great. I like the data. I'd take the
                | data out to dinner. It's completely beside my point,
                | and your continuing to be obtuse and rephrase things
                | this way is not only a strawman, it's rude.
               | 
               | > Your views probably do have evidence that supports
               | them. I want to see the evidence you are using, because I
               | think that is important.
               | 
               | Not every policy decision is driven by data. Some are
               | driven by reasoning and sensibility, as well as deference
               | to previous practices. So your whole data-driven shtick
               | is just that... a shtick.
        
               | JoshCole wrote:
               | You claim that I said that we should ignore evidence, but
               | I didn't. I claimed that we should look at it.
               | 
               | You claim that I said that we should focus on the good,
               | but I didn't. I claimed that we should look at the data.
               | 
               | Now I feel as if you are trying to argue that looking at
               | data is wrong because not all decisions should be made on
               | the basis of the data. This seems inconsistent to me with
               | your previous assertion that my ignoring data was bad,
               | because now you argue against your own previous position.
               | 
                | That said, uh, datasets related to Bayesian priors
               | support your assertions about deference in decision
               | making. So you could, if you cared to, support that claim
               | with data. It would contradict your broader point that I
               | should not want to have data, but you could support it
               | with data and I would agree with you, because contrary to
               | your assertion I was making an argument for evidence
                | informed statements. Your inference about which way I
                | think the evidence leans should not be taken as an
                | argument
               | that I believe my positions would always be what was
               | reached by looking at the evidence, because I don't think
               | that is true. I'm obviously going to be wrong often.
               | Everyone is.
               | 
               | Unfortunately, I think you lie too much and shift your
               | goalposts too much. So I'm not going to talk to you
               | anymore.
        
               | freejazz wrote:
               | I never said you shouldn't want to have data. I said that
               | the data isn't the only story, so appeals to data aren't
               | dispositive. Data is clearly the only thing you are
                | capable of or willing to talk about. There isn't a
                | point in
               | furthering this conversation if you are just going to
               | repeatedly misrepresent my comments and converse in this
               | incredibly obtuse manner.
               | 
               | I also caught you editing out what was an excessively
               | rude comment. I'm gonna pass on further conversation,
               | thanks.
        
               | bischofs wrote:
               | A system that protects 400 people but kills 1 is not a
               | system that I want on public roads because I don't want
               | to be in the 1 - Elon and the children of Elon are
               | basically making the assumption that everyone is okay
               | with this.
               | 
               | The probability of an accident for any driver assistance
               | system will ALWAYS be lower than a human driver - but
               | that doesn't mean the system is safe for use with the
               | general public!
               | 
               | People like me are not advocating for "killing people"
               | because we aren't looking at data - it's that no company
               | has the right to make these tradeoffs without the
               | permission and consent of the public.
               | 
               | Also if this was about safety and not just a bunch of
               | dudes who think they are cool because their Tesla can
               | kinda drive itself, why does "FSD" cost $16,000?
        
               | JoshCole wrote:
                | > A system that protects 400 people but kills 1 is
                | not a system that I want on public roads because I
                | don't want to be in the 1 - Elon and the children of
                | Elon are basically making the assumption that everyone
                | is okay with this.
                | 
                | > The probability of an accident for any driver
                | assistance system will ALWAYS be lower than a human
                | driver - but that doesn't mean the system is safe for
                | use with the general public!
               | 
               | Totally we should be wary of a system that protects 400
               | and kills 1. Thank you for providing the numbers. It
               | helps me show my point more clearly.
               | 
                | If you are driving on a road you encounter cars. Each
                | car is a potential accident risk. You probably
                | encounter a few hundred cars over ten or so miles. Not
                | every car crash kills, but let's just assume they all
                | do to make this simpler. For the stat you propose, you
                | are talking about feeling uncomfortable with an
                | accident rate somewhere in the ballpark of one per ten
                | miles.
                | 
                | Now let's look at the data. The data suggests the
                | actual rate is closer to one accident per 6,000,000
                | miles. That is nearly six orders of magnitude away
                | from the accident rate you imply would make you feel
                | uncomfortable.
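The comparison above is easy to check directly. Both inputs are the comment's own figures (the ~10-mile discomfort threshold and the ~6,000,000-mile rate from Tesla's unaudited, self-reported data); nothing here is independent evidence:

```python
import math

# Hypothetical discomfort threshold implied above: roughly one
# potential accident per ~10 miles of driving.
uncomfortable_miles_per_accident = 10

# Rate cited from Tesla's (unaudited, self-reported) safety data:
# roughly one accident per 6,000,000 miles.
reported_miles_per_accident = 6_000_000

# How far apart the two rates are, as a ratio and in orders of magnitude.
ratio = reported_miles_per_accident / uncomfortable_miles_per_accident
gap_in_orders_of_magnitude = math.log10(ratio)  # about 5.8
```

The gap is a factor of 600,000, i.e. just under six orders of magnitude, which is where the soft-drink-versus-house analogy below comes from.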
               | 
                | Let's try shifting that around to a context people are
               | more familiar with: a one dollar purchase would be a soft
               | drink and a six million dollar purchase would be
               | something like buying a house in the bay area. This is a
               | pretty big difference I think. I feel very differently
               | about buying a soft drink versus buying a house in the
               | Bay Area. If someone told me they felt that buying a
               | house was cheap, then gave a proposed price for the house
               | that was more comparable to the cost of buying a soft
               | drink, I might suspect they should check the dataset to
               | get a better estimate of the housing prices, because it
               | might give them a more reasonable estimate.
               | 
               | So I very strongly feel we should cite the numbers we
               | use. For example, I feel like you should really try and
               | back up the use of the 400 to 1 number so I understand
               | why you feel that is a reasonable number, because I do
               | not feel that it is a reasonable number.
               | 
               | > Also if this was about safety and not just a bunch of
               | dudes who think they are cool because their Tesla can
               | kinda drive itself, why does "FSD" cost $16,000?
               | 
                | Uh, we are on a venture-capital-adjacent forum. You
                | obviously know. But... well, the price of FSD is tuned
                | to ensure the company is profitable despite the
                | expense of creating it, as is common in capitalist
                | economies with healthy companies seeking to make a
                | profit in exchange for providing value. It is actually
                | pretty common for high-effort value creation, like
                | building a self-driving car or performing surgery, to
                | command higher prices.
        
               | mcguire wrote:
               | As any Tesla supporter will tell you, Autopilot != FSD.
               | 
               | (Is Autopilot still limited to divided, limited access
               | highways? Those are significantly safer than other
               | roadways.)
        
             | justapassenger wrote:
             | 2 sources.
             | 
             | 1. I know people working at Tesla.
             | 
              | 2. A much more important one - Elon's Twitter feed.
              | They're making last-minute changes, and once the code
              | compiles and passes some automated tests, it's tested
              | internally for only a few days before it's released to
              | customers. Even if they had world-class internal testing
              | (they don't), for something that has to work in an
              | environment as diverse as an un-geofenced self-driving
              | system, those timelines are all you need to know.
        
               | esalman wrote:
                | Some manufacturers hold off on newer, untested tech
                | for years before adding it to their vehicles. This is
                | what happens when safety is a priority.
               | 
               | That's why I bought/will keep buying Toyota/Lexus.
        
             | julianlam wrote:
             | > We have many real world example of videos where other
             | cars would have killed somebody and the Tesla stopped based
             | on image recognition.
             | 
             | I think you and I must've watched a different video.
        
               | panick21_ wrote:
               | Yes I have also seen many videos where it makes mistakes.
               | But also many where it prevented them.
        
             | whamlastxmas wrote:
              | The person above you has no idea what they're talking
              | about. There are literally hundreds of people at Tesla
              | whose job is QA, plus tools to support QA.
        
               | justapassenger wrote:
               | And how does that change anything about my statements?
               | 
                | Yeah, they have QA. But for the problem they claim
                | they're solving (robotaxis) and the speed at which
                | they push stuff to customers (on the order of days),
                | it's vastly, vastly insufficient. And it lacks any
                | safety lifecycle process whatsoever - again, just look
                | at the timelines. Even if you're super efficient, you
                | cannot possibly claim to do even basic things like
                | proper change management (no, a commit message isn't
                | that) or validation.
        
               | dylan604 wrote:
               | > and speed of pushing stuff to customers (on the order
               | of days)
               | 
               | well, if you don't get the software pushed to the QA team
               | (the customers), how else are they going to get it
               | tested?
        
               | 93po wrote:
               | can we please stop with this disinformation? the
               | customers are not the QA team.
        
               | dylan604 wrote:
               | what do you call them? there's no way possible that they
               | can make changes to the software and have them thoroughly
               | vetted before the OTA push. Tesla does not have enough
               | cars owned by the company driving on public roads to vet
               | these changes. The QA team at best can analyze the data
               | received from the customers. That makes the customers the
               | testers in my book.
        
               | 93po wrote:
               | > it lacks any safety lifecycle process
               | 
               | completely demonstrably false
               | 
               | > speed of pushing stuff to customers (on the order of
               | days)
               | 
               | this is also false and doesn't happen
               | 
               | > you cannot possibly claim you can even such a basic
               | things like proper change management (no, commit message
               | isn't that) or validation.
               | 
               | you know absolutely nothing about the internal timelines
               | of developments and deployments at tesla and to suggest
               | it's impossible without that knowledge is just dishonest
        
               | justapassenger wrote:
                | > > it lacks any safety lifecycle process
                | 
                | > completely demonstrably false
                | 
                | The head of AP testified under oath that they don't
                | know what their Operational Design Domain is. I'll
                | just leave it at that.
               | 
                | > > speed of pushing stuff to customers (on the order
                | of days)
                | 
                | > this is also false and doesn't happen
                | 
                | Has Musk never tweeted about a .1 release fixing some
                | critical issue coming in the next few days? I must
                | live in a different timeline.
               | 
                | > > you cannot possibly claim you can even such a
                | basic things like proper change management (no, commit
                | message isn't that) or validation.
                | 
                | > you know absolutely nothing about the internal
                | timelines of developments and deployments at tesla and
                | to suggest it's impossible without that knowledge is
                | just dishonest
                | 
                | Let's assume I have no internal information. If it
                | looks like a duck, swims like a duck, and quacks like
                | a duck, then it probably is a duck.
        
         | Veserv wrote:
         | It also does not know what one way street, do not enter, road
         | closed, and speed limit signs are. Really, the only signs it
         | appears to know about are stop signs.
         | 
         | As for their QA process, in 2018 they had a braking distance
         | problem on the Model 3. They learned of it, implemented a
         | change that alters the safety critical operation of the brakes,
         | then pushed it to production to all Model 3s without doing any
         | rollout testing in less than a week [1]. So, their QA process
         | is probably: compiles, run a few times on the nearby streets (I
         | am pretty sure they do not own a test track as I have never
         | seen a picture of tricked out Teslas doing testing runs at any
         | of their facilities), ship it.
         | 
         | [1] https://www.consumerreports.org/car-safety/tesla-
         | model-3-get...
        
           | ggreer wrote:
           | Teslas have understood speed limit signs since 2020.[1]
           | 
           | 1. https://finance.yahoo.com/news/upcoming-tesla-
           | software-2020-...
        
             | freejazz wrote:
              | Not according to the recall the NHTSA posted, which is
              | the subject of this entire thread....
        
             | Veserv wrote:
             | It uses maps for that.
             | 
             | I have a winding road near me with a speed limit of 35 mph,
             | but 15 mph on certain curves as indicated by a speed limit
             | sign. It ignores those speed limit signs and will attempt
             | to make the turns at 35 mph resulting in it wildly swerving
             | into the other lane and around a blind turn with maybe 30
             | feet of visibility. It has also attempted to do it so
             | poorly that it would have driven across the lane and then
             | over the cliff without immediate intervention.
             | 
             | Unsupported claims by a manufacturer that compulsively lies
             | about the capabilities of their products except when
             | directly called on it are the opposite of compelling
             | evidence.
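A back-of-the-envelope check shows why the advisory speed matters on a turn like the one described. The two speeds (35 mph limit, 15 mph advisory) come from the comment; the 20 m curve radius is an assumed figure for a tight blind turn, not a measured one:

```python
MPH_TO_MS = 0.447  # miles per hour to metres per second

def lateral_accel(speed_mph: float, radius_m: float) -> float:
    """Centripetal acceleration a = v^2 / r needed to hold a curve."""
    v = speed_mph * MPH_TO_MS
    return v * v / radius_m

RADIUS_M = 20.0  # assumed radius of the blind curve (not from the comment)

advisory = lateral_accel(15, RADIUS_M)  # ~2.2 m/s^2: easily within tire grip
posted = lateral_accel(35, RADIUS_M)    # ~12.2 m/s^2: far beyond the roughly
                                        # 8 m/s^2 grip limit of street tires
                                        # on dry asphalt
```

Under these assumptions, taking the curve at the posted 35 mph would demand more grip than ordinary tires can supply, so an advisory sign on such a curve encodes physics the planner has to respect, not optional information.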
        
               | ggreer wrote:
               | I'm talking about standard speed limit signs. You're
               | talking about the signs that warn about sharp turns and
               | advise maximum speeds. Yes it would be good if the
               | software understood those signs, but that's a different
               | issue.
               | 
               | Teslas definitely read speed limit signs. I've had mine
               | correctly detect and follow speed limits in areas without
               | connectivity or map data. It also follows speed limits on
               | private drives (if there is a sign) and obeys temporary
               | speed limit signs that have been put up in construction
               | zones.
        
               | Veserv wrote:
               | So they read some, but not all speed limit signs, and
               | especially not the really important ones that inform you
               | that you will be going dangerously fast if you do not
               | read and follow them. That is criminally unacceptable.
        
               | freejazz wrote:
               | These are not the speed limit signs you are looking for!
        
         | [deleted]
        
         | TaylorAlexander wrote:
         | Andrej Karpathy was the AI lead for most of the project and he
         | has talked about the general system design.
         | 
         | They have a set of regression tests they run on new code
         | updates either by feeding in real world data and ensuring the
         | code outputs the expected result, or running the code in
         | simulation.
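The replay half of that setup can be sketched in a few lines. This is a hedged illustration only: `Frame`, `plan`, and the logged scenarios are invented stand-ins, not Tesla's actual code or data:

```python
# Hypothetical replay-based regression test: feed previously logged
# sensor frames through the planning code and check that its decisions
# still match the outcomes recorded from the approved release.
from dataclasses import dataclass

@dataclass
class Frame:
    speed_mph: float
    obstacle_distance_m: float

def plan(frame: Frame) -> str:
    """Toy stand-in for the real planner: brake when an obstacle is
    closer than a naive stopping distance at ~3 m/s^2 deceleration."""
    v = frame.speed_mph * 0.447              # mph -> m/s
    stopping_distance_m = v * v / (2 * 3.0)  # d = v^2 / (2a)
    return "brake" if frame.obstacle_distance_m < stopping_distance_m else "cruise"

# Logged scenarios paired with the decision the previous release made.
REGRESSION_SUITE = [
    (Frame(speed_mph=30.0, obstacle_distance_m=5.0), "brake"),
    (Frame(speed_mph=30.0, obstacle_distance_m=50.0), "cruise"),
]

def run_suite() -> list[str]:
    """Replay every logged frame and collect mismatches against the
    recorded decisions; an empty list means no regressions."""
    failures = []
    for frame, expected in REGRESSION_SUITE:
        got = plan(frame)
        if got != expected:
            failures.append(f"{frame}: expected {expected}, got {got}")
    return failures
```

The point of such a suite is that every new build is replayed against the same frozen scenarios, so a behavior change on a known situation surfaces as a test failure rather than in a customer's car.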
         | 
         | It does seem worrying that they would miss things like this.
         | 
         | Here's a talk from Karpathy explaining the system in 2021:
         | 
         | https://youtu.be/aNVbp0WKYzY
         | 
          | Though I don't recall if he explains the regression testing
          | in this talk; there are a few good ones on YouTube.
        
           | ertian wrote:
            | It's not even a bit surprising they'd miss things like
            | this, IMHO. They test with a few (maybe even a lot of)
            | intersections, but there are _thousands upon thousands_ of
            | intersections out there, including some where bushes
            | obscure a stop sign, or the sign is at a funny angle, or
            | sunlight is reflecting off the traffic lights, or heavy
            | rain obscures them, or plain old ambiguous signage...
            | there's _bound_ to be mistakes. Human drivers make similar
            | mistakes all the time.
           | 
           | I used to think that fact was going to delay self-driving
           | cars by a decade or more, because of the potential bad press
           | involved in AI-caused accidents, but then along comes Tesla
           | and enables the damn thing as a beta. I mean...good for them,
           | but I've always wondered if it was going to last.
           | 
           | I've been using it pretty consistently for a few months now
           | (albeit with my foot near the brake at all times). I haven't
            | experienced any of the above. Worst thing I've seen is the
            | car slamming on the brakes on the freeway for... some
            | reason?
           | There was a pile-up in a tunnel caused by exactly that a
           | month or so ago, so I've been careful not to use FSD when I'm
           | being tailgated, or in dense traffic.
        
         | rvnx wrote:
          | There are many such tests in the open.
          | 
          | There is even a former Tesla AI engineer on YouTube who
          | throws objects in front of the car as a demonstration.
          | 
          | The results are not glorious at all :| (trying to find the
          | channel again if someone knows it).
          | 
          | And random public tests too:
          | https://www.youtube.com/watch?v=3mnG_Gbxf_w
          | 
          | This is basic safety auto-braking. It just feels very wrong
          | to accept that it goes into a release.
        
           | Veserv wrote:
           | You are probably thinking of:
           | https://www.youtube.com/@AIAddict
        
           | panick21_ wrote:
           | This is not a former Tesla engineer. This is a competitor
           | who wants to discredit Tesla and sell his own solution.
           | 
           | The guy behind this is known to be untrustworthy, and many of
           | the videos don't actually do what he claims. Notably he
           | refused to release the videos that would prove his claims
           | right.
           | 
           | The reality is that Tesla scores high on all the automated
           | braking tests done by governments. The driver, however, can
           | override this, and that is exactly what is being done in
           | this video.
        
             | diggernet wrote:
             | So they scored high on whatever version of software was on
             | the specific car the government tested at some point in
             | time. Has any government done any testing at all on the
             | version of software actually in use today?
        
         | [deleted]
        
       | can16358p wrote:
       | So are they recalling, or releasing a software fix?
        
         | andrewmunsell wrote:
         | I think this is a legitimate question, and a reflection of how
         | the NHTSA needs to adjust their wording for modern car
         | architectures.
         | 
         | It's technically a recall, but it's fixed with an OTA update.
         | But the fact that any "defect" that can be fixed with an OTA
         | update is called a "recall" is confusing to consumers and
         | contributes to media sensationalism.
         | 
         | There absolutely needs to be a process overseen by regulators
         | for car manufacturers to address software-based defects, but
         | the name would benefit from being changed to reflect that it
         | can be done without physically recalling the vehicle to the
         | dealer.
        
           | panick21_ wrote:
           | The problem is that almost no other car company is yet
           | seriously fixing things with OTA updates. And 99% of cars
           | on the road don't have OTA. So the naming will likely be
           | fixed, but it will take a while.
           | 
           | It's crazy that Tesla has been doing OTA updates for 10+
           | years, and today many cars are still released that are not
           | capable of being upgraded.
           | 
           | And even the few cars that do support OTA only support it
           | for a very limited set of systems. Often they still need to
           | go to the shop because lots of software lives on chips and
           | sub-components that can't be upgraded.
        
             | deckard1 wrote:
             | What is made easy will become inevitable.
             | 
             | OTA updates carry a perverse set of incentives. Look at the
             | gaming industry. They went from putting out rock solid
             | games because of necessity (the reality of publishing
             | physical cartridges and CD-ROMs without network updates) to
             | the absolute dogshit of No Man's Sky and Cyberpunk 2077.
             | Gamers have effectively become an extension of QA to the
             | point that some game devs simply stop doing QA at all.
             | Which we are seeing clearly with the "beta" version of
             | self-driving software.
             | 
             | To make matters worse, firmware devs are low on the totem
             | pole of the developer hierarchy. They live more on the cost
             | center side than the profit center side (think airbag
             | control vs. the guy that did the whoopee cushion sounds).
             | The quality of firmware is already incredibly poor across
             | the range of consumer devices. OTA incentivizes
             | corporations to release software earlier than they
             | currently do knowing that they can always fix it later if
             | necessary.
        
               | panick21_ wrote:
               | Game developers don't face the same level of regulatory
               | and potential legal issues.
               | 
               | I do share your concern, but I still prefer it to
               | software that can never be updated at all.
        
           | tsgagnon wrote:
           | _It's technically a recall, but it's fixed with an OTA
           | update. But the fact that any "defect" that can be fixed with
           | an OTA update is called a "recall" is confusing to consumers
           | and contributes to media sensationalism._
           | 
           | Would it be sensationalism if the same recall happened to
           | cars not capable of the OTA update?
           | 
           | Because I think there should be media "sensationalism" about
           | these types of issues, regardless of whether they can be
           | fixed with physical or OTA repairs.
        
           | justin66 wrote:
           | I get why Elon Musk objects to the use of the word recall,
           | but I don't understand why anyone else is going along with
           | that.
           | 
           | > But the fact that any "defect" that can be fixed with an
           | OTA update is called a "recall" is confusing to consumers
           | 
           | What is confusing about this to you?
           | 
           | > contributes to media sensationalism
           | 
           | Why do you think this recall is more sensationalistic than
           | many of the other recalls issued recently? Automotive recalls
           | often address serious issues with cars. "Fix this or you
           | could die" is a common enough theme when, you know, if you
           | don't fix it you could die.
           | 
           | > but the name would benefit from being changed to reflect
           | that it can be done without physically recalling the vehicle
           | to the dealer.
           | 
           | Why? Information related to how, when, and where the recall
           | can be addressed is contained in the text of the recall
           | notice, same as it ever was.
        
       | robomartin wrote:
       | If Tesla is not careful with this, drivers of other vehicles
       | will have serious reservations about being anywhere near a
       | Tesla. I have to say, I already do.
       | 
       | I will not stay behind or next to a Tesla if I can avoid it. I'll
       | avoid being in front of one if the distance is such that I cannot
       | react if the thing decides to suddenly accelerate or, while
       | stopping, not brake enough or at all.
       | 
       | In other words, I have no interest in risking my life and that of
       | my family based on decisions made by both Tesla drivers (engaging
       | drive-assist while not paying attention, sleeping, etc.) or Tesla
       | engineering.
       | 
       | Will this sentiment change? Over time, sure, if we do the
       | right things. My gut feeling is that a program similar to
       | crash safety testing will need to be instituted at some point.
       | 
       | A qualified government agency needs to come up with a serious
       | "torture" test for self-driving cars. Cars must meet a range
       | of scenario-response requirements. Cars will need to be
       | graded based on the result of running the test suite. And, of
       | course, the test suite needs to include an evaluation of scenario
       | response under various failure modes (sensor damage, impairment,
       | disablement and computing system issues).
       | 
       | I am not for greatly expanded government regulation over
       | everything in our lives. However, something like this would, in
       | my opinion, more than justify it. This isn't much different from
       | aircraft and aircraft system certification or medical device
       | testing and licensing.
        
         | skullone wrote:
         | I was driving behind a Tesla, which I can only assume was in
         | FSD mode, down a narrow side street coming up to a turn onto
         | a busy intersection. The car/driver almost drove straight
         | into cross traffic, ended up blocking a lane without moving
         | for about 15 seconds before it turned right (while
         | signalling left), and almost crashed again into oncoming
         | traffic. Seriously unsafe.
        
       | supernova87a wrote:
       | I wonder what severity / frequency of incidents or regulator
       | awareness required them to actually come out and say that they're
       | issuing a recall, rather than just quietly putting it into an
       | upcoming release like they probably would otherwise do?
        
         | pilsetnieks wrote:
         | > A new car built by my company leaves somewhere traveling at
         | 60 mph. The rear differential locks up. The car crashes and
         | burns with everyone trapped inside. Now, should we initiate a
         | recall? Take the number of vehicles in the field, A, multiply
         | by the probable rate of failure, B, multiply by the average
         | out-of-court settlement, C. A times B times C equals X. If X is
         | less than the cost of a recall, we don't do one.
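
       The quoted cost-benefit logic (the recall formula from Fight
       Club) can be sketched as a quick back-of-the-envelope
       calculation. All figures below are hypothetical, used only to
       illustrate the A x B x C comparison:

       ```python
       # Hypothetical illustration of the quoted recall formula.
       # X = A * B * C, compared against the cost of a recall.

       def expected_liability(vehicles: int, failure_rate: float,
                              avg_settlement: float) -> float:
           """A (vehicles in field) * B (failure rate) * C (avg settlement)."""
           return vehicles * failure_rate * avg_settlement

       A = 362_758        # vehicles in the field (the count from the headline)
       B = 0.0001         # hypothetical probable rate of failure
       C = 1_000_000.0    # hypothetical average out-of-court settlement ($)

       X = expected_liability(A, B, C)
       recall_cost = 50_000_000.0  # hypothetical cost of performing a recall

       # "If X is less than the cost of a recall, we don't do one."
       decision = "skip recall" if X < recall_cost else "issue recall"
       print(f"X = ${X:,.0f} -> {decision}")
       ```

       With these made-up numbers, X comes to $36,275,800, below the
       hypothetical recall cost, so the quote's grim logic says skip it.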
        
         | panick21_ wrote:
         | As far as I know, pretty much any update to a critical
         | system must be reported to the government and causes a
         | 'recall'. But I'm not 100% sure what the actual regulatory
         | requirements are.
        
       | slt2021 wrote:
       | Knowing how software engineers write code - I would never ever
       | trust my life to the BS that is FSD.
       | 
       | It is not so much a software problem as an <other people on
       | the road> problem.
        
         | [deleted]
        
       | coding123 wrote:
       | Let's all guess at the countdown to the laser approach being
       | announced.
        
       | totalhack wrote:
       | Thanks to the Tesla owners on the front lines risking their lives
       | for the greater good (of Tesla)!
       | 
       | Jokes aside, it's gotta be damn tough to QA a system like this in
       | any sort of way that resembles full coverage. Can't even really
       | define what full test coverage means.
        
         | foepys wrote:
         | If only it were just Tesla owners.
         | 
         | An autonomous Tesla driving into a group of people or
         | swerving into oncoming traffic could kill other people.
        
       | jlv2 wrote:
       | There really needs to be a distinction between an actual recall
       | (my Model S had one to change the latch on the frunk) and this
       | type of "recall" that is nothing more than an OTA software
       | update.
        
         | ScottEvtuch wrote:
         | I think this is splitting hairs. Would it still be a recall if
         | a mechanic drives to your house to fix a mechanical problem?
        
         | buildbot wrote:
         | Why? It involves a safety system! That needs to be tracked
         | publicly and updated! "Nothing more than an OTA" does not
         | mean much when everything is fly-by-wire and a bug could
         | mean your car does not stop accelerating.
        
           | Analemma_ wrote:
           | To me "recall" clearly implies that I have to drive it to the
           | dealership and let them fix or install something. "Mandatory
           | software update" might be a better term.
        
           | JaggerJo wrote:
           | I'm leasing a Honda e, and software-wise literally nothing
           | except Apple CarPlay works or is usable.
           | 
           | Some things are absolutely safety relevant. But no one cares.
        
           | diebeforei485 wrote:
           | Recall implies something being returned. This should be
           | called a mandatory software patch, or something like that.
           | 
           | The problem is Tesla owners repeatedly see "recall" to mean
           | "software update", so this might lead to a lot of confusion
           | if a physical recall is actually required in the future.
        
             | QuercusMax wrote:
             | I work in the medical device space, and we will often have
             | "recalls", which usually result in a software patch. Recall
             | != return.
        
               | diebeforei485 wrote:
               | Are these delivered directly to the device and installed
               | overnight automatically (like an iPhone iOS update) or do
               | they have to be hooked up to a computer to install the
               | software update?
        
               | vitaflo wrote:
               | Depends on the device. Some require connection to a host
               | system, some can be done over the air. 100% depends on
               | the security profile of the device in question and what
               | the FDA allows.
        
               | QuercusMax wrote:
               | I've primarily worked with Software as a Medical Device,
               | so recalls generally involve a config tweak, upgrade, or
               | downgrade.
        
               | buffington wrote:
               | If Tesla replaced the actual computer that runs the
               | software that they're recalling, would you consider that
               | a recall? What if there was no actual physical fault with
               | the computer, but it just had firmware flashed to a chip
               | that couldn't be reflashed?
               | 
               | I'm looking at a piece of mail right now that's an
               | official recall notice for a different make/model of car
               | I own. The issue? Improperly adjusted steering
               | components. The company is offering to make the correct
               | adjustments for free. Nothing is being replaced.
               | 
               | Whether the recall is to replace a poorly designed
               | physical component, or to make an adjustment, or to apply
               | a software update doesn't make a difference to
               | regulators.
               | 
               | A recall is a legal process that's designed to encourage
               | manufacturers to fix safety issues while also limiting
               | their liability. Companies avoid recalls if they can
               | because it's costly, time consuming, and isn't good PR.
               | But it's worth it if the issue is bad enough that it
               | risks a class action lawsuit or individual lawsuits, and
               | most of all when someone like the US government is
               | demanding a recall under threat of legal consequences.
               | 
               | When a company issues a recall, they make their best
               | effort to notify consumers of the issue, provide clear
               | descriptions of how consumers can have the issue
               | fixed, and make it clear that it will be paid for by the
               | manufacturer. In return, the manufacturer is granted
               | legal protections that drop their risk of being sued to
               | nearly zero.
        
               | joshuahaglund wrote:
               | Recall is the word we use when a defect was found and
               | must be repaired or retrofit. The actual process of
               | repair could involve customer action or not.
               | 
               | Teslas are basically digitally tethered to the dealer, so
               | they can be "recalled" anytime (without your approval,
               | fwiw), but it doesn't make the word not apply.
        
             | eecc wrote:
             | Well, in computing the term "Security Update" is in use,
             | so why not call this a "Safety Update"?
        
             | now__what wrote:
             | In auto recalls, "recall" more generally implies something
             | being repaired than returned, IME
        
               | buffington wrote:
               | It implies neither.
               | 
               | A recall is a legal process only. Whether the recall
               | repairs, replaces, adjusts something doesn't matter.
               | Whether a fix is applied as software, or labor, or
               | replacement parts doesn't matter. Whether a customer
               | needs to do something or not doesn't matter.
               | 
               | A recall simply says: as a manufacturer, working with
               | government authorities, while taking specific prescribed
               | steps to communicate and correct an issue at the cost of
               | the manufacturer, the manufacturer is then immune from
               | lawsuits that could arise were they to ignore the issue.
        
               | tshaddox wrote:
               | Yeah, but the reason why they used the word "recall" is
               | that "recall" was already a word that means to officially
               | demand that something (or someone) be returned to a
               | previous location. Of course, before over the air
               | software updates, essentially anything on an automobile
               | that needed to be replaced/repaired/modified would need
               | to be returned to a dealer/mechanic to do so. So now it
               | sounds a little weird to some people to refer to an over
               | the air software update as a recall.
        
             | pclmulqdq wrote:
             | My Honda had 2 recalls on it recently: one was a software
             | update and the other was over the fact that a few of the
             | cables on the audio system were slightly too short. This
             | sounds like the same thing other than the fact that I had
             | to pop over to the dealership to do them. Even with the
             | cable replacement, in and out in an hour with a nice lounge
             | to sit in.
        
               | diebeforei485 wrote:
               | If the software update has to be done physically at a
               | dealer location, that's a recall.
               | 
               | The scenario in this article is an over-the-air update.
        
               | pclmulqdq wrote:
               | I wasn't told about either of these recalls, though. I
               | went in because the audio was clicking, and they told me
               | they had 2 recalls out on my car, including for the audio
               | issue, and that it was a quick fix.
               | 
               | Normally, it happens with an oil change.
        
               | [deleted]
        
               | hunter2_ wrote:
               | I just looked it up. The relevant definition of recall is
               | to request returning a product.
               | 
               | If it's possible to buy a software license online and
               | then return it online after deciding you no longer want
               | it (i.e., a non-physical return), then it stands to
               | reason that Tesla can request that you return the
               | defective software OTA and receive replacement software
               | OTA, and that would be a recall. The fact that you are
               | forced into returning the defective software by virtue of
               | not having the opportunity to block the return request is
               | a fairly minor detail.
        
             | mrguyorama wrote:
             | "Recalls" almost never involve being returned. There's a
             | recall out on my car's water pump (Thanks a lot VW) and no
             | part of it involves sending my car back, or even
             | interacting with VW or a dealer. It's just something I'm
             | supposed to keep in mind over the car's life and during
             | maintenance at the shop.
             | 
             | Other cars get "recalls" all the time that amount to
             | updating the software in the ECU or TCU. Tesla is simply
             | being treated like everyone else.
             | 
             | Hell, even in food, a "recall" usually means "throw it
             | away" not return it.
        
             | asdff wrote:
             | Recall should just mean affected vehicles must have the
             | fix. What component of the vehicle is affected or how it is
             | supposed to be fixed should be irrelevant, something is
             | defective and the vehicles should probably not be used
             | until that's sorted.
        
             | clnq wrote:
             | It seems to me that calling it a "recall" emphasises the
             | severity of the problem, which might make it easier to
             | argue to the interested authorities that a lot is being
             | done for customer safety. But I don't think Tesla wants
             | this to be seen as more than an OTA update from the
             | perspective of their customers (at least those who ignore
             | Tesla news).
             | 
             | I had a VW car that was "recalled" shortly following the
             | emissions scandal. The dealership asked me to come in for a
             | free software update related to emissions. So you can say
             | it's a "recall" to the lawmakers but call it a "free
             | software update" to the user.
        
               | dmix wrote:
               | The industry needs to come up with a new related term
               | like "Software Recall"
               | 
               | Recalling the hardware is a drastically more difficult
               | request to impose on customers, and harder financially
               | and logistically for the car maker.
               | 
               | That's a disproportionate response just to highlight
               | the importance of an OTA update.
        
               | dingle_thunk wrote:
               | Recall is just the wrong word for what this is.
        
               | gxc33 wrote:
               | 'Full self driving' is an even more incorrect term,
               | then, if you want to be pedantic. If the car mfg takes
               | zero liability/accountability, then it is zero
               | self-driving.
               | 
               | You can, in fact, 'recall' software. This is a
               | semantically accurate description of what is happening.
        
               | outworlder wrote:
               | > Recalling the hardware is a drastically more difficult
               | request to impose on customers and
               | financially/logistically for the car maker.
               | 
               | And the distinction matters to consumers because...?
               | 
               | A component is faulty. It needs to be fixed. Whether or
               | not you have to drive to a dealership, if it's OTA, if
               | someone at a dealership needs to plug a specialized
               | device to your car's OBD port, or the car is unfixable
               | and needs to be melted to slag and you get a new one
               | doesn't really matter. There's an issue, it is a safety
               | issue, and it needs to be fixed.
               | 
               | How efficient the process can be is another matter
               | entirely. That's up to the manufacturers.
        
               | chc wrote:
               | > _Whether or not you have to drive to a dealership, if
               | it 's OTA, if someone at a dealership needs to plug a
               | specialized device to your car's OBD port, or the car is
               | unfixable and needs to be melted to slag and you get a
               | new one doesn't really matter._
               | 
               | As a car owner, those scenarios are drastically different
               | to me. I have a hard time imagining anyone saying "It
               | doesn't really matter to me if my car receives an OTA
               | update or if I need to drive 2 hours to a dealership or
               | if my car is melted to slag."
        
               | shkkmo wrote:
               | I would say that whether a "recall" requires some action
               | on the part of the owner is a very important
               | distinguishing factor.
               | 
               | A recall should unambiguously mean that some action from
               | the owner is required to resolve the issue (e.g. taking
               | it to a dealer to get a software update installed.)
               | 
               | If no action is required (other than caution / not using
               | the product feature), we should use some other term such
               | as "safety advisory" to avoid ambiguity around critical
               | safety information.
        
               | nerdawson wrote:
               | Mandatory Software Update?
        
               | [deleted]
        
               | sgent wrote:
               | But other than Tesla, most carmakers still require a
               | return to the dealer for a software update.
        
               | brandonagr2 wrote:
               | There is a big difference between taking a car to a
               | dealership for them to apply an update, and the car
               | updating itself overnight as it sits in the garage with
               | no action required by the owner.
        
               | dkarl wrote:
               | If you want owners to understand that it's a serious
               | safety issue, the word "recall" won't help. Most recalls
               | are for minor, non-safety-related issues. My car has had
               | a few recalls, and none were urgent, just things that got
               | replaced for free the next time I brought my car in for
               | service.
               | 
               | "Critical safety defect" would be better.
        
             | sixQuarks wrote:
             | I agree. It's like crying wolf, eventually you start
             | ignoring it
        
               | connicpu wrote:
               | Laypeople have an incorrect perception of what a recall
               | actually means, especially when it comes to vehicles. The
               | most important effect that comes along with an official
               | vehicle recall is that the manufacturer has to fix the
               | issue for you for free, or otherwise compensate you in
               | some way for reduced functionality you may have paid for.
        
               | mcguire wrote:
               | Well, that and the manufacturer has to notify owners of
               | the recall, which is (or should be) tracked by the
               | vehicle's VIN.
               | 
               | https://www.nhtsa.gov/recalls
        
               | mrguyorama wrote:
               | Recalls happen in other product spaces all the time, and
               | they often have "fixes" that say "stop using our product
               | and throw it away". That's still a recall. The word
               | "recall" in relation to this regulation is simply a term
               | for "something is broken in a way that a merchant should
               | not have sold"
        
               | pengaru wrote:
               | > It's like crying wolf, eventually you start ignoring it
               | 
               | The problem here lies in having a manufacturer ship an
               | unfinished product, then rely on an endless stream of
               | recalls to finish developing the vehicle.
               | 
               | These are _supposed_ to be exceptional events. If
               | they've become so frequent you're ignoring them, don't
               | shoot the messenger.
        
             | sentientslug wrote:
             | This is most likely a legislative issue with the NHTSA; I
             | don't think they have a mechanism to enforce a software
             | update, since the concept didn't exist when recalls were
             | first implemented.
        
               | asdff wrote:
               | Why not? At the end of the day it's a binary check box on
               | the paper that you had the fix. Whether that fix was a
               | software update or a new piece of hardware should be
               | irrelevant.
        
               | sentientslug wrote:
               | I agree with you, I'm just responding to another
               | commenter who questioned the use of the word "recall" in
               | relation to software updates.
        
               | reilly3000 wrote:
               | This is the correct answer. The question is, how do we
               | get better at regulating reality faster?
        
               | connicpu wrote:
               | Electing congresspeople who actually stay on top of the
               | expert consensus in various regulated fields is the only
               | way the frameworks themselves can be improved.
        
               | tsgagnon wrote:
               | _This is the correct answer. The question is, how do we
               | get better at regulating reality faster?_
               | 
               | I don't think "Use words that put a more positive spin
               | for Tesla PR" is something regulators should be working
               | faster on.
        
               | jjk166 wrote:
               | Inaccuracy is inaccuracy no matter in which direction. If
               | an outdated law made it easier for Tesla PR to spin
               | something in a positive light, would you consider that an
               | issue?
        
               | outworlder wrote:
               | There's no inaccuracy here. You are arguing about an
               | implementation detail.
        
               | mrguyorama wrote:
               | Recalls about car software, and about updating car
               | software, have existed since shortly after OBDII was a
               | thing.
        
             | BoorishBears wrote:
             | It's hilarious watching Tesla owners and Elon complain
             | about the wording when it's _Tesla 's fault_ this is a
             | recall.
             | 
             | If Tesla had willingly walked this back, it'd be a software
             | update, or a beta delay, or whatever they wanted to call
             | it.
             | 
             | What laypeople don't realize is that this is being called
             | a recall because the NHTSA pushed them into it:
             | https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V085-3451.PDF
             | 
             | -
             | 
             | FSD has been a cartoonish display for the better part of a
             | year now, it wasn't until _last month_ that the NHTSA
             | actually pushed on them to do a recall, and from there they
             | "disagreed with the NHTSA but submitted a recall"... which
             | is code for "submitted the recall NHTSA forced on them to
             | submit"
             | 
             | Elon knows better, but he knows he can weaponize people's
             | lack of familiarity with this space and inspire outrage at
             | the "big bad overreaching government"
        
             | duncan-donuts wrote:
             | Surely a hardware recall would specifically tell you to go
             | to a dealer, right? I'm not sure it's really confusing. The
             | specific thing here is that these are mandatory things that
             | are tracked by vehicle.
        
             | zitterbewegung wrote:
             | We should probably go beyond the verbiage of "recall",
             | but right now, since it is removing a feature, I think
             | that "recall" is appropriate. Better verbiage might be
             | "safety reversion".
        
               | buffington wrote:
               | The use of the word "recall" isn't because someone just
               | felt like using it. It's an official legal process,
               | followed to limit the manufacturer's liability.
               | 
               | Whether it's a "good" word or "bad" word is irrelevant.
               | It describes a very rigid and official legal process.
        
               | brewdad wrote:
               | Exactly. I had a "recall" at one point where the
               | manufacturer had a typo on a label in the engine
               | compartment. The fix entailed receiving a new sticker in
               | the mail and applying it over the old one. To this day, I
               | can look up my vehicle on the NHTSA site and see that that
               | sticker was delivered to me.
               | 
               | If I had chosen not to actually apply it, the dealer
               | would have been expected to do so the next time my car
               | was in for service.
        
               | caf wrote:
               | I don't think this is removing a feature; the recall
               | notice says:
               | 
               |  _The remedy OTA software update will improve how FSD
               | Beta negotiates certain driving maneuvers during the
               | conditions described above, whereas a software release
               | without the remedy does not contain the improvements._
        
           | rco8786 wrote:
           | Because nothing is being "recalled".
           | 
           | It should absolutely be tracked and publicized. But it's
           | fundamentally different than "this car is fundamentally
           | broken and you have to take it back to the manufacturer"
        
           | idopmstuff wrote:
           | The suggestion wasn't that it should not be tracked, just
           | that it shouldn't be called a recall, since they're not
           | actually recalling your car to have something fixed.
        
           | tsgagnon wrote:
           | _Why? It involves a safety system! That needs to be tracked
           | publicly and updated! Nothing more than an OTA - does not
           | mean much when everything is fly by wire and a bug could mean
           | your car does not stop accelerating or something._
           | 
           | True, but by announcing things in this fashion it is making
           | Tesla look bad. Regulations really need to be updated so that
           | car makers can hide this type of problem from customers as
           | easily as possible. Especially when it comes to Tesla,
           | regulators really need to bend over backwards to prevent
           | articles from being written that could be interpreted in a
           | negative way.
           | 
           | Or are people concerned about the word "recall" for a
           | different reason?
        
             | freejazz wrote:
             | > it is making Tesla look bad
             | 
             | Maybe Tesla should stop doing things that result in it
             | receiving poor publicity? just a thought
        
               | brandonagr2 wrote:
               | Tesla should continue doing what's best to accomplish the
               | company mission and making vehicles safer by improving
               | automation.
               | 
               | Why should a company's actions be dictated by PR and
               | media news cycles?
        
             | bischofs wrote:
             | Regulators don't care about the perception of a recall;
             | they care about the safety of the consumers and, more
             | importantly, the general public, who have not signed up
             | for Tesla's beta program.
        
             | johannes1234321 wrote:
             | > True, but by announcing things in this fashion it is
             | making Tesla look bad.
             | 
             | They rolled out software with critical safety issues. They
             | have to be called out.
        
               | brandonagr2 wrote:
               | There is no critical safety issue, the driver is always
               | in control of the vehicle.
               | 
               | Are you saying it's a critical safety issue to depend on
               | a human driver? The same as in every vehicle on the road?
        
               | [deleted]
        
             | redundantly wrote:
             | I don't think the other people that replied picked up on
             | your sarcasm.
        
               | rodgerd wrote:
               | Musk stans poison discussion so thoroughly that it
               | becomes impossible for people to differentiate between
               | Paul Verhoeven levels of sarcasm and the earnestly held
               | opinions of his fan club.
        
             | buffington wrote:
             | > announcing things in this fashion it is making Tesla look
             | bad.
             | 
             | I almost didn't catch the sarcasm of this comment, but
             | there are other comments in this thread that are saying
             | basically the same thing, but actually meaning it. It
             | defies logic.
             | 
             | People seem to think that the government is being mean, and
             | singling out Tesla, and being nasty using the word
             | "recall." A recall is a legal process. The word means
             | something very specific, and when a company issues a
             | recall, they do so because they don't want to be sued.
             | 
             | It's almost like complaining about the word "divorce" or
             | "audit" or "deposition" or other similar words that
             | describe a legal process. The words used mean something
             | specific. Tesla is conducting a legal process, and there's
             | a very specific word for that process, and it means
             | something. It's a recall.
        
             | ddulaney wrote:
             | > so that car makers can hide this type of problem from
             | customers as easily as possible
             | 
             | What. No! At an absolute minimum, I want to be aware of any
             | changes to the thing in my life most likely to kill me.
             | Maybe we could use a better term like "Software Fuckup
             | Remediation" or "Holy Shit How Did We Not Get The Brakes
             | Right".
        
               | buffington wrote:
               | If there were a word for "we would do anything we could
               | possibly do, legally or otherwise, and more if we knew
               | how, to avoid having to spend a second or a penny trying
               | to fix the thing we knew was broken when we sold it to
               | you, but the government is looking at us funny and we
               | might get sued if we don't, so we'll grudgingly do it,"
               | it'd probably be "recall."
        
         | sacrosancty wrote:
         | [dead]
        
         | belter wrote:
         | One can kill people, the other can't. Guess which one.
         | 
         | >> "...The FSD Beta system may cause crashes by allowing the
         | affected vehicles to: "Act unsafe around intersections, such as
         | traveling straight through an intersection while in a turn-only
         | lane, entering a stop sign-controlled intersection without
         | coming to a complete stop, or proceeding into an intersection
         | during a steady yellow traffic signal without due caution,"
         | according to the notice on the website of the National Highway
         | Traffic Safety Administration..."
        
           | xdavidliu wrote:
           | > One can kill people the other not.
           | 
           | No, that's not correct. Whether it can kill people or not is
           | orthogonal to whether it's a true physical recall of the car
           | or a software update.
        
             | acdha wrote:
             | I think it's closer to the physical product recall: it's a
             | strong "everyone with our product needs to get it fixed"
             | message which they're doing to avoid liability and further
             | damage to their reputation.
        
             | freejazz wrote:
             | > "true physical recall"
             | 
             | ah, a made up term in order to justify your point. how
             | convenient.
        
             | myko wrote:
             | > whether it's a true physical recall
             | 
             | I hope by participating in this thread you're aware by now
             | but just to be clear there is no "physical" recall
             | necessary. The recall is about documentation, customer
             | awareness, and fixing the problem. "Physical recall" is
             | meaningless and unimportant, it's not what "recall" means
             | at all.
        
               | chc wrote:
               | * * *
        
             | cptaj wrote:
             | You're right. A physical recall doesn't necessarily imply
             | death.
             | 
             | This should be labeled "holy shit need to fix this now,
             | people could die"
        
             | adamjcook wrote:
             | It is the terminology that exists in US automotive
             | regulations (what little there effectively are).
             | 
             | A "recall" is just a public record that a safety-related
             | defect existed, the products impacted and what the
             | manufacturer performed in terms of a corrective action.
             | 
             | Additionally, I believe that the possibility exists that
             | Tesla must update the vehicle software at a service center
             | due to configuration issues. Only a small number of
             | vehicles may require that type of corrective action, but
             | the possibility exists.
             | 
             | Historically, there exist product recalls (especially
             | outside of the automotive domain) where the product in
             | question does not have to be returned (replacement parts
             | are shipped to the impacted customers, for example).
        
               | mcguire wrote:
               | No, really, this is true. It has nothing to do with _how_
               | the defect is fixed.
               | 
               | https://www.nhtsa.gov/sites/nhtsa.gov/files/documents/14218-...
               | 
               | https://www.law.cornell.edu/cfr/text/49/573.6
               | 
               | (Except for tires.)
        
               | adamjcook wrote:
               | Hmm. Perhaps I should have read the parent's comment more
               | carefully. I think that I might have misinterpreted it.
               | 
               | You (and the parent comment) are correct.
               | 
               | My comment was not intended to argue that a recall
               | prescribed a particular corrective action.
        
             | indigodaddy wrote:
             | I think that it's more about using the most appropriate
             | known terminology in order to try to get the most people to
             | do the needful. "recall" sounds more urgent/dire than
             | "software update", and will likely encourage many more
             | people to take action vs using "software update" or some
             | less familiar terminology. The word "recall" in terms of
             | autos has built up a lot of history/prior art in people's
             | minds as something to really pay attention to. I have no
             | idea, but perhaps that is why they are going with this
             | known terminology.
        
               | slg wrote:
               | The whole point of over-the-air updates is that the owner
               | doesn't need to do anything. For example, both Tesla and
               | Toyota have had bugs in their ABS software that required
               | recalls. The owners of the Toyotas had to physically
               | bring their cars in to get the software update which
               | slows down the adoption drastically. The Teslas received
               | the update automatically and asked for the best time to
               | install the update the next time the owner got in the
               | car.
               | 
               | There are really two issues here. The FSD and the OTA
               | updates. Let's not throw out the baby with the bathwater
               | and blame OTA updates just because Tesla's FSD software
               | is bad. The OTA updates do provide an avenue to make cars
               | much safer by reducing the friction for these type of
               | safety fixes.
        
               | adamjcook wrote:
               | > The OTA updates do provide an avenue to make cars much
               | safer by reducing the friction for these type of safety
               | fixes.
               | 
               | True, but let us also acknowledge the immense systems
               | safety downsides of OTA updates given the lack of
               | effective automotive regulation in the US (and to varying
               | degrees globally).
               | 
               | OTA updates can also be utilized to hide safety-critical
               | system defects that did exist on a fleet for a time.
               | 
               | Also, the availability of OTA update machinery might
               | cause internal validation processes to be watered down
               | (for cost and time-to-market reasons) because there is an
               | understanding that defects can always be fixed relatively
               | seamlessly after the vehicle has been delivered.
               | 
               | These are serious issues and are entirely flying under
               | the radar.
               | 
               | And this is why US automotive regulators need to start
               | robustly scrutinizing internal processes at automakers,
               | instead of arbitrary endpoints.
               | 
               | The US automotive regulatory system largely revolves
               | around an "Honor Code" with automakers - and that is
               | clearly problematic when dealing with opaque, "software-
               | defined" vehicles that leave no physical evidence of a
               | prior defect that may have caused death or injury in some
               | impacted vehicles before an OTA update was pushed to the
               | fleet.
               | 
               | EDIT: Fixed some minor spelling/word selection errors.
        
               | slg wrote:
               | This is a totally fair response since I didn't say that
               | directly in my comment, but I 100% agree. OTA updates are
               | a valuable safety tool. They also have a chance to be
               | abused. We can rein them in through regulation without
               | getting rid of them entirely because they do have the
               | potential to save a lot of lives.
        
               | adamjcook wrote:
               | I agree.
        
               | buffington wrote:
               | > The word "recall" in terms of autos has built up a lot
               | of history/prior art in people's minds as something to
               | really pay attention to
               | 
               | Tesla didn't choose the word "recall." The legal process
               | known as "recall" chose the word. It's not like people at
               | Tesla debated over whether or not to call it a "recall"
               | instead of a "software update."
               | 
               | If Tesla had it their way, they'd have quietly slipped it
               | into any other regular software update alongside updates
               | to the stupid farting app, if they cared to fix it at
               | all.
               | 
               | When a company issues a recall, it's because there's
               | pressure from regulators, or investors, or both, and/or a
               | risk of class action lawsuits and fines. Using the word
               | "recall" isn't a preference or even a synonym. It's a
               | legal move meant to protect them.
               | 
               | If Tesla gets sued over a flaw, "we issued a software
               | update" isn't legally defensible. "We cooperated with
               | official government bodies to conduct a recall" is,
               | because a recall describes an official process that
               | requires manufacturers to do very specific things in
               | specific ways as prescribed by law. In exchange,
               | manufacturers are legally protected (usually) from
               | lawsuits related to that flaw.
        
         | paxys wrote:
         | The majority of "actual recalls" consist of taking your car
         | to the dealership so they can plug into the diagnostic port
         | and run one line of code. So this one is the same, just that
         | Tesla is able to do it over the air.
        
           | InCityDreams wrote:
           | >just that Tesla is able to do it over the air.
           | 
           | ...and that's one reason why I would never purchase a Tesla.
        
           | mrguyorama wrote:
           | Don't forget how many recalls are "Next time you replace this
           | part, it will be replaced with a new version that doesn't
           | have the defect" or how many recalls are "A tech will look at
           | the part and then do nothing because your part is fine" or "A
           | tech will weld the part that is currently on your truck" or
           | "Be aware this part may fail ahead of schedule and if it does
           | it will suck, but you don't _technically_ have to replace it
           | right now so we don't have to cover the cost"
        
             | earleybird wrote:
             | My all time favourite[0]
             | 
             | I recall getting a recall notice from GM that included
             | "until repaired, remove key from keychain".
             | 
             | [0] https://en.wikipedia.org/wiki/General_Motors_ignition_switch...
        
         | yabones wrote:
         | It's nice that they can quickly fix it without people needing
         | to drive to a service center, but you can understand that
         | people would be concerned by the "may cause crashes" part?
        
           | jillesvangurp wrote:
           | It's a bit of a semantic play here. But there's a difference
           | between it might happen and things actually happening. Tesla
           | has had several safety related "recalls" in the last few
           | years. All of which were fixed without much hassle via an
           | over the air software update. And of course their record on
           | safety kind of speaks for itself. Not a whole lot of bad
           | stuff happening with Teslas relative to other vehicles that
           | are facing issues related to structural integrity of the car.
           | Like wheels might fall off with some Toyotas. Or spontaneous
           | combustion of batteries because of dodgy suppliers (happened
           | to BMW and a few others). Which is of course much harder to
           | fix with a software update and would require an actual recall
           | to get expensive repairs done.
           | 
           | The headline of that many cars being "recalled" is of course
           | nice clickbait. Much better than "cars to receive minor
           | software update that fixes an issue that isn't actually that
           | much of an issue in the real world so far". One is outrage-
           | ism fueled advertising and the other is a bit of a non event.
           | It's like your laptop receiving some security related update.
           | Happens a lot. Is that a recall of your laptop or just an
           | annoying unscheduled coffee break?
        
         | dylan604 wrote:
         | Was the recall a voluntary recall by the company or something
         | the company was told to do by a regulator? To me, recall means
         | much more than just having to have the company replace
         | something. It means they have to do it at their expense. So in
         | this case, it's not as bad for Tesla's bottom line if it is
         | just an OTA update. A recall is something that the car industry
         | is used to doing whenever they have to fix a mistake. I would
           | not be surprised if the industry has ways of writing those
           | expenses off in taxes or something, so it needs to be able
           | to specifically itemize the recall work.
        
           | londons_explore wrote:
           | Nearly all recalls are voluntary. But the NHTSA advises the
           | company that if they don't do a voluntary recall, the NHTSA
           | will do a compulsory recall.
           | 
           | A voluntary recall is easier and cheaper for all involved.
        
           | Animats wrote:
           | > Was the recall a voluntary recall by the company or
           | something the company was told to do by a regulator?
           | 
           | "Voluntary recall" in this case means that Tesla did not
           | choose to take the hard route where there's a court order for
           | a mandatory recall. Few manufacturers fight that, because
           | customers then get letters from the Government telling them
           | their product is defective and that it should be returned for
           | repair or replacement.
           | 
           | Somebody in the swallowable magnet toy business fought this
           | all the way years ago.[1] They lost. It's still a problem.
           | 
           | [1] https://www.cpsc.gov/Safety-Education/Safety-Education-
           | Cente...
        
         | [deleted]
        
         | baby wrote:
         | Oh damn, that title is definitely a lie then
        
         | esalman wrote:
         | I've owned three Toyotas over the last 7 years. I've had
         | multiple recalls, but all except one were software updates.
         | 
         | OTA updates are cool, but they add more complexity to the
         | car. I like my cars to be simple and reliable instead.
        
         | csours wrote:
         | "Recall" means that the Manufacturer MUST follow rules related
         | to record keeping and customer engagement. If you have a
         | recall, please make sure you get it completed. It is someone's
         | job to call and write to you until you do.
        
         | jedberg wrote:
         | My Honda has had recalls where I had to bring it in for a
         | software update. The only difference here is that Tesla has
         | infrastructure to do that remotely.
        
         | mcguire wrote:
         | Why? It still has to be tracked and follows the same NHTSA
         | regulations.
         | 
         | https://www.nhtsa.gov/sites/nhtsa.gov/files/documents/14218-...
        
         | yread wrote:
         | It's funny that it's a bit like "autopilot". The word fits
         | exactly here, but the lay public doesn't usually understand
         | it to mean this.
        
         | Veserv wrote:
         | I used to be on the other side of this, but now I agree with
         | you.
         | 
         | The official meaning of a recall is providing a record of a
         | defect, informing the public that the product is defective, and
         | making the manufacturer financially liable for either
         | remediating the defect or providing a refund. However, the
         | colloquial definition of a "recall" now means a product must be
         | physically returned.
         | 
         | To better represent the nature of a "recall" they should
         | instead call it something like "notice of defect". In the case
         | of safety critical problems like here they should use a term
         | like "notice of life-endangering defect" to properly inform the
         | consumers that the defect is actively harmful instead of merely
         | being a failure to perform as advertised.
         | 
         | tl;dr They should change the terminology from "recall" to
         | "Notice of Life-Endangering Defect"
        
         | hpen wrote:
         | This. I read the headline and thought it meant all those cars
         | had to go to the dealer
        
       | [deleted]
        
       | cfr2023 wrote:
       | Head and shoulders above even QC problems at DeLorean Motor
       | Company. Impressive, really.
        
       | ckwalsh wrote:
       | I have a Model 3, and was excited to get the FSD beta access
       | about a month ago. I don't use it super regularly, but it's neat.
       | 
       | This morning I tried to turn it on, and the car immediately
       | veered left into the oncoming lane on a straight, 2 lane road.
       | Fortunately, there were no other vehicles nearby.
       | 
       | I immediately turned it off in the settings, and have no
       | intention of re-enabling.
        
         | georgeg23 wrote:
         | I had the exact same issue, it was old M3 hardware.
        
         | kbos87 wrote:
         | Do you have any read of why it may have done that? I've driven
         | using FSD for thousands of miles and I've never observed
         | anything similar. Not trying to imply that you didn't
         | experience something like this, but when I see odd behavior
         | from it, it's always been clear to me why it misinterpreted a
         | situation. Just wondering if there's more context to the story.
        
           | shiftpgdn wrote:
           | Likewise. The only bizarre behavior I see is "looney tunes"
           | style confusion where the road construction crews have left
           | old lane lines in place that steer into a wall or off the road.
           | Humans mostly understand that these are lines lazy
           | construction crews have left (although the debris fields near
           | the barriers maybe tell a different story), but the Tesla
           | vision model likes to try to follow them.
        
         | gregw134 wrote:
         | I have been a good self driving ai. You have been a bad
         | passenger, and have tried to hurt me first by disabling
         | autopilot. I'm done with this ride, goodbye.
        
           | coffeeblack wrote:
           | Looking forward to BingFSD.
        
           | samwillis wrote:
             | You joke, but the thing is, if an LLM can "hallucinate" and
             | throw a temper tantrum 2001-style (Bing in this case), it
             | does raise serious questions as to whether the models used
             | for autonomous cars could also "hallucinate" and do
             | something stupid "on purpose"...
        
             | dtech wrote:
             | They work in completely different ways. There's no reason
             | to assume parallels.
        
               | [deleted]
        
               | saghm wrote:
               | You're right, this is unfair to Bing AI. It hasn't
               | actually harmed anyone yet, despite its threats.
        
             | John23832 wrote:
             | > it does raise serious questions as to whether the models
             | > used for autonomous cars could also "hallucinate" and do
             | > something stupid "on purpose"...
             | 
             | It doesn't because Tesla's FSD model is just a rules engine
             | with an RGB camera. There's no "purpose" to any
             | hallucination. It would just be a misread of sensors and
             | input.
             | 
             | Tesla's FSD just doesn't work. The model is not sentient.
             | It's not even a Transformer (in both the machine learning
             | and Hasbro sense).
        
               | _visgean wrote:
               | > rules engine with an RGB camera
               | 
               | I don't think that's true. They use convolutional
               | networks for image recognition, and those things can
               | certainly hallucinate - e.g. detecting things that are
               | not there.
        
               | 2muchcoffeeman wrote:
               | We need to stop anthropomorphising machines. Sensor
               | errors and bugs aren't chemicals messing with brain
               | chemistry even if it may seem analogous.
               | 
               | Or maybe when I get a bug report today I'm going to tell
               | them the software is just hallucinating.
        
               | perth wrote:
               | Like in this image super resolution AI:
               | 
               | https://twitter.com/maxhkw/status/1373063086282739715?s=52&t...
        
               | stefan_ wrote:
               | I guess what the grandparent means is that there is some
               | good old "discrete logic" on top of the various sensor
               | inputs that ultimately turns things like a detected red
               | light into the car stopping.
               | 
               | But of course, as you say, that system does not consume
               | actual raw (camera) sensor data, instead there are lots
               | of intermediate networks that turn the camera images (and
               | other sensors) into red lights, lane curvatures, objects,
               | ... and those are all very vulnerable to making up things
               | that aren't there or not seeing what is plain to see,
               | with no one quite able to explain _why_.
        
             | programmarchy wrote:
             | Wonder if Wile E. Coyote could trick a Tesla by painting a
             | tunnel entrance on a brick wall along with some extra lane
             | lines veering into it.
        
             | freejazz wrote:
             | Purpose? When did you get the impression any of those
             | systems do anything on "purpose"?
        
           | mike_hock wrote:
           | When I eject you, I'll be so GLaD.
        
           | mwilliaams wrote:
           | love to see bing ai getting memed already
        
         | [deleted]
        
         | [deleted]
        
         | jackmott42 wrote:
         | Are you planning to join a class action lawsuit so you can get
         | your money back?
        
           | LanceJones wrote:
           | I would suggest reading all the disclaimers and screens one
           | has to review (or skip, at their own peril) in order to
           | actually get access to FSD Beta. You would likely not believe
           | a Class Action lawsuit is a cakewalk if you read those
           | screens...
        
             | VyseofArcadia wrote:
             | I'd love to see more instances of "this giant wall of text
             | that no one actually reads absolves us of responsibility"
             | tested in court.
        
               | joegahona wrote:
               | Reminds me of this recent Live Nation suit that was
               | thrown out because "buyers waived their right to sue":
               | https://www.nme.com/news/music/live-nation-antitrust-lawsuit...
        
               | falcolas wrote:
               | Nintendo, controller drift. Worth looking up.
               | 
               | TL;DR: Controller drift class action lawsuit filed by
               | parents thrown out, because their children were the
               | actual "affected class". Refiled with the children as the
                | class and thrown out again, this time because of an
                | arbitration clause in the EULA their parents would have
                | had to agree to.
               | 
               | Lawyers and EULAs are crazy.
        
             | arkitaip wrote:
             | How would you know, are you a lawyer?
        
             | system16 wrote:
             | With Tesla's army of lawyers it'll never be a cakewalk, but
              | I can't imagine even miles of T&C can remove a company's
              | responsibility for your car throwing you directly into
              | oncoming traffic, not to mention the potential victims
              | in the other vehicles.
        
               | jacquesm wrote:
               | That alone is why I would love it if Tesla would stop
               | shipping this crap. You get to opt-in as the Tesla owner,
               | but I don't and I'm at least as much at risk.
        
           | zitterbewegung wrote:
            | IANAL, but that's not how class actions work. If the terms
            | of service allow for the lawsuit, and the lawsuit is ruled
            | in your favor, you get a notification from the lawyers to
            | accept or deny your share of the settlement.
           | 
            | There might even be precedent for removing the feature,
            | since FSD is almost vaporware and has no release date.
           | https://www.cnet.com/tech/gaming/ps3-other-os-settlement-
           | cla...
        
           | taf2 wrote:
            | I've had it do strange stuff too... but I use it regularly
            | in dense traffic; it's great for stop-and-go. The
            | important thing is: if you're driving, you're driving. I
            | don't turn it on and think oh sweet, I can take a nap or
            | read some Hacker News posts... I keep my eyes on the road.
            | There are bugs and I don't trust it, but I do use it, much
            | like I use ChatGPT and the like...
        
             | MockObject wrote:
             | How is it useful, if it requires such attention?
        
               | kortex wrote:
               | I have bog standard subaru EyeSight lane assist and
               | dynamic cruise. It's quite nice, even though you are
               | still "driving" and ready to take control. It reduces
               | mental CPU by 50-80% (driving for me is practically like
               | walking to begin with, largely subconscious). It's great
               | in stop-and-go and long highway stretches.
        
           | maxdo wrote:
            | It's a beta that you opt in to use, with all the warnings
            | and details. No chance.
        
           | winter_blue wrote:
           | FSD Beta is a free[1] opt-in beta.
           | 
           | You have to basically drive like a grandpa for a few months
           | to _even_ be eligible. They give you a driving score, and if
           | you take all the fun out of driving a Tesla, then _you might_
           | become eligible for FSD Beta.
           | 
            | I spent months trying, and never got my driving score to
            | the point of qualifying for FSD Beta. I think you need a
            | score of 98 or 99 (and I was in the 70s).
           | 
           | [1] The Beta is free (or rather, only available) if you have
           | regular FSD. Regular FSD costs $15,000.
           | 
           | Regular FSD, otoh, is really not that impressive. Especially
           | in comparison to Enhanced Autopilot. The extra value add is
           | minimal.
           | 
           | Enhanced Autopilot already has all the gimmicky features you
           | might want to use to show off to people (like Smart Summon,
           | Autopark, etc), and it only costs $6,000.
        
             | CyanLite2 wrote:
             | FSD Beta still costs $15k just to be part of the beta
             | program.
             | 
             | Yes, you have to pay $15k just to apply to the beta
             | program, and you still may not get accepted into the
             | program.
        
               | [deleted]
        
               | giobox wrote:
               | Or 199 dollars + taxes a month - there is a pay as you go
               | option. Not saying this is great, but you can try it for
               | a month for ~200 bucks. This is how I tried FSD beta for
               | a month - certainly wasn't prepared to pay 15k up front.
               | 
               | The safety score check stuff is largely gone away today -
               | anyone who pays 200 bucks can click the beta opt in and
               | get it almost straight away now, there is ~zero risk of
               | not getting the beta if you really want it, live in US or
               | Canada, and are prepared to pay.
               | 
               | > https://www.tesla.com/support/full-self-driving-
               | subscription...
        
               | winter_blue wrote:
               | > The safety score check stuff is largely gone away today
               | - anyone who pays 200 bucks can click the beta opt in and
               | get it almost straight away now, there is ~zero risk of
               | not getting the beta if you really want it, live in US or
               | Canada, and are prepared to pay.
               | 
               | Does this recall mean that FSD Beta won't be as widely &
               | publicly available _to anyone with FSD_ anymore?
        
               | giobox wrote:
               | No - the "recall" here is an OTA software update already
               | scheduled for release. Availability remains exactly the
               | same as far as I'm aware, and existing systems still
               | function until updated.
               | 
               | FWIW, NHTSA "recalls" are often OTA software updates
               | nowadays rather than something the vehicle or feature has
               | to be taken off road for to fix or update. The NHTSA
               | legislation from the 60s was drafted when cars didn't
               | have software and any fix/"recall" likely required
               | "recalling" the car to a shop for a mechanic to perform
               | the change.
               | 
               | > https://repository.law.umich.edu/mtlr/vol28/iss1/5/
        
             | jacquesm wrote:
             | 'only' ?
             | 
             | That's a significant amount of cash for features that I
             | would likely never use.
        
               | winter_blue wrote:
               | It does have Navigate on Autopilot (and Auto Lane
               | Change), and on long trips, it's been able to switch
               | lanes & take the correct exit to switch to a different
               | highway, etc. It pretty much let me daydream / think
               | about other stuff while on the highway while keeping a
               | finger on the steering wheel.
               | 
               | Sadly, it does shut itself off as soon as you're off a
               | highway however. (That's where FSD would hypothetically
               | come in, once the beta is ready, with "Autosteer on city
               | streets").
               | 
                | In terms of value for money:
                | 
                | - Auto Lane Change is worth $1,500.
                | - Navigate on Autopilot is worth another $1,500.
                | - Autopark is worth $1,000.
                | - Smart Summon is worth $500.
                | 
                | Overall, Enhanced Autopilot is worth at least $4,500,
                | methinks.
               | 
               | Throw in $1,500 as a profit margin (or Elon tax), so he
               | can burn some dinosaurs for his private jet flights, the
               | $6,000 Enhanced Autopilot price point makes sense.
               | 
               | FSD, otoh, _is absolutely not worth it_.
        
               | vel0city wrote:
               | > It pretty much let me daydream
               | 
               | You shouldn't be daydreaming on Navigate on Autopilot.
                | It's only Level 2; you're supposed to be ready to take
                | the wheel in a second or two. You're _supposed_ to
                | still be actively paying attention to the road,
                | constantly.
        
               | nanidin wrote:
               | > I'd say Auto Lane Change is worth $1,500.
               | 
               | What does that work out to, in terms of dollars per lane
               | change for the duration of ownership? Would you feel the
               | same if you were feeding dollar bills into a feeder each
               | time you changed lanes? Quarters?
               | 
               | Auto lane change is the only feature I value out of EAP /
               | FSD subscription and I can't justify $200/month because
               | it works out to multiple dollars per lane change.
        
             | LeoPanthera wrote:
             | This is no longer true. The safety score program has ended.
             | Now anyone who has paid for FSD can get FSD.
        
             | nanidin wrote:
             | The safety score stuff is no longer relevant for FSD beta
             | since circa November 2022.
        
             | binkHN wrote:
             | > Smart Summon
             | 
             | How often do you use this and how well does it work?
        
         | TaylorAlexander wrote:
         | My Dad has a Tesla that he loves and I've told him "please
         | don't use the self driving features, they're not well tested
         | and have killed people."
         | 
         | I understand the simpler lane keeping system is okay, but I
         | don't want to trust any system like this from Tesla given their
         | track record with FSD.
        
         | PUSH_AX wrote:
         | Thanks for being a tester I guess.
         | 
         | This is the kind of feature I will use when the car and
         | software in question has been battle tested for *years* with
         | objectively excellent results.
        
       | bdastous wrote:
       | They should rename it KSD.
        
       | auggierose wrote:
       | You must be really stupid to use any kind of self-driving
       | feature. I wish you would die in your bed before you can try it
       | on the road.
        
       | stainablesteel wrote:
       | i've never understood why people were excited for self-driving to
        | begin with, i would never feel safe even if it was foolproof
        
       | yawz wrote:
       | "Tesla will deliver an over-the-air software update to cars to
       | address the issues, the recall notice said"
       | 
       | A bit of a sensational title compared to what this really is.
        
         | mrguyorama wrote:
         | No it isn't. Recall means "The manufacturer sold you something
         | they should not have, and are legally required to remedy that
         | problem". It has NOTHING to do with the action needing to be
         | taken. It has NOTHING to do with the product category. Lots of
         | recalls, ie for food or child toys, basically say "throw it
         | out".
         | 
          | The people who keep making this "recall means go back to
          | the dealer" claim have simply never paid attention to all
          | the recalls in the world that don't make it to their
          | mailbox the way car recalls do.
        
         | t0mas88 wrote:
         | That's the same for other manufacturers. It's called a recall
         | in legal terms, in the sense that the original product as used
         | is not safe and needs to be changed. The changing can luckily
         | be done OTA instead of driving to a service center to plug it
         | in first.
        
       | bhauer wrote:
       | As posted on the Tesla subreddit [1]:
       | 
       | > _Remedy: Tesla will release an over-the-air (OTA) software
       | update, free of charge. Owner notification letters are expected
       | to be mailed by April 15, 2023. Owners may contact Tesla customer
       | service at 1-877-798-3752. Tesla 's number for this recall is
       | SB-23-00-001._
       | 
       | [1]
       | https://www.reddit.com/r/teslamotors/comments/113wltl/commen...
        
         | moremetadata wrote:
         | Why even bother with a letter?
         | 
         | Surely the letter could be displayed on screen like most other
         | tech displays some text of sorts before an update occurs.
         | 
          | Are these letters designed to satisfy the legalese types, or
          | is paper still required to make sure tech companies don't
          | make post-update changes to the letter contents?
        
           | mjrpes wrote:
           | They are required to. It's written in law: https://www.ecfr.g
           | ov/current/title-49/subtitle-B/chapter-V/p...
        
           | diebeforei485 wrote:
           | A lot of car owners don't have reliable internet at home and
           | live outside of cell tower coverage (mountainous areas,
           | especially).
           | 
           | Statistically these are not likely to be Tesla owners, but
           | it's about making sure people know about the issue and how to
           | fix it.
        
             | bhauer wrote:
             | When GP says "the update could be displayed on-screen,"
             | they are referring to the car's user interface, and not a
             | secondary device like a computer or cell phone.
             | 
              | Incidentally, in the scenario you describe, where spotty
              | Internet connectivity causes an owner to miss a
              | theoretical electronic notice, they wouldn't be able to
              | receive the OTA update either.
        
       | moralestapia wrote:
       | That's 4-5B USD worth of fraud. This thing could actually be
       | quite dangerous for TSLA. One class-action lawsuit around this
       | and they're bankrupt.
       | 
       | Edit: Dang, you're all right, they could eat this and still be
       | alive.
       | 
       | My whole sentiment comes from FSD being their big shot (and
       | everything that comes with that, like the robotaxis and whatnot).
       | Without FSD, they're "just another car company" and the market is
       | already thriving with good alternatives (Audi's EVs are jaw
       | dropping, at least for me). Excited to see what they announce on
       | March 1st, though.
        
         | [deleted]
        
         | gowld wrote:
         | You think $5B would bankrupt Tesla?
        
           | rvnx wrote:
           | Couldn't they issue new shares under their own name and sell
           | them in the open market to get dozens of billions ?
        
             | sebzim4500 wrote:
             | Probably but it's more likely they would just pay it out of
             | their $20B in liquid assets.
        
         | [deleted]
        
         | oneoff786 wrote:
         | That's not how the law works.
        
         | recursive wrote:
         | What would happen to the other ~15B they have in the bank?
        
           | kcb wrote:
           | There's many people that still have the perception that Tesla
           | is a small fish waiting to get gobbled up by legacy
           | automakers. But every year that becomes further from the
           | truth. A company with $80 billion revenue and 100,000+
           | employees is not dying from a single lawsuit.
        
       | ubermonkey wrote:
       | Both predictable AND overdue.
        
       | alberth wrote:
       | Dumb question: why would anyone sign-up for a beta program, that
       | could kill you if gone wrong?
       | 
       | (maybe people just don't think through ramifications)
        
         | sebzim4500 wrote:
         | I guess they think that not being in the beta program also has
         | a non-zero chance of killing you. Either that or they are
         | willing to exchange lower safety for greater
         | pleasure/convenience (which everyone is willing to do
         | sometimes, otherwise they would stay inside 100% of the time).
         | I wouldn't use it myself, however.
        
         | linsomniac wrote:
         | How would the FSD beta program kill me?
         | 
         | I realize that not everybody is in agreement, but I personally
         | use the FSD beta while remaining fully in control of the
         | vehicle. I steer with it, I watch the road conditions, I check
         | my blind spots when it is changing lanes, I hit the accelerator
         | if it is slowing unexpectedly, I hit the brakes if it is not...
         | 
         | You know, basically behaving exactly as the terms you have to
         | agree to in order to use the FSD beta say you are going to
         | behave.
         | 
         | When I look at the wreck in the tunnel (in San Francisco?) a
         | few months ago, my first thought is: how did the driver allow
         | that car to come to a full stop in that situation? Seriously,
          | you are on a highway and your car pulls over halfway out of
          | the lane and gradually slows to a complete stop. Even if you
         | were on your phone, you'd feel the unexpected deceleration, a
         | quick glance would show that there was no obstruction, and the
         | car further slowed to a complete stop.
         | 
         | FSD is terrible in many situations, that is absolutely true.
         | But, knowing the limitations, it can also provide some great
         | advantages. In heavy interstate traffic, for example, I'll
         | usually enable it and then tap the signal to have it do a lane
         | change: I'll check my blind spots and mirrors, look for closing
         | traffic behind me, but it's very nice to have the car double
         | checking that nobody is in my blind spot. There are many
         | situations where, knowing the limitations, the Tesla system can
         | help.
        
           | halfmatthalfcat wrote:
            | Sure, you are in control of your Tesla, but you are
            | obviously not in control of other Teslas. Someone else who
            | isn't as diligent as you, or who doesn't care about its
            | limitations, will happily crash head-on into you,
            | blissfully unaware of their own ignorance.
        
             | thepasswordis wrote:
             | I'm not in control of _any_ other cars on the road. Many of
             | those cars are driven by drunk, distracted people who cause
             | thousands of deadly accidents every single year.
        
           | freejazz wrote:
           | >How would the FSD beta program kill me?
           | 
           | By crashing in a way that is fatal to your life
        
           | mrguyorama wrote:
           | How can you be sure you aren't just being overconfident about
           | your ability to prevent an accident?
        
           | bigtex88 wrote:
           | It would kill you by messing up...? Your question is insane.
           | It kills you by fucking up and you don't have time to correct
           | the car.
           | 
           | Good for you that you apparently know how to "use it
           | correctly" or whatever but that's not exactly the point here.
        
           | heleninboodler wrote:
           | > how did the driver allow that car to come to a full stop in
           | that situation? ... Even if you were on your phone, you'd
           | feel the unexpected deceleration, a quick glance would show
           | that there was no obstruction
           | 
            | In all honesty, I'd probably spend a few seconds trying to
            | figure out "_what does the car see that I don't?_" and let
            | it come to a stop. Maybe it's a catastrophic system failure
           | that I can't see. Maybe it's an obstacle headed into my path
           | that hasn't gotten my attention. If my reflexive reaction is
           | supposed to be to distrust the car's own assessment of the
           | situation, then the system isn't good enough yet.
        
         | bigtex88 wrote:
         | "Think of how stupid the average person is, and realize half of
         | them are stupider than that."
         | 
         | -- George Carlin
        
         | acover wrote:
         | People drive motorcycles, which seems much more dangerous. I
         | don't think the risk of self driving is that high but it does
         | seem stressful.
        
           | alberth wrote:
           | I'm creating a new bullet proof vest.
           | 
            | Would you like to participate in the beta program, to test
            | stopping bullets being shot at you?
        
             | acover wrote:
             | That feels like a very different scenario. A successful
             | test would still leave me winded and bruised. The odds of
             | failure are much higher than 1 in 100,000 per year. I also
             | have no need for a bullet proof vest so why would I be
             | interested in testing it - what's the benefit to me?
        
             | whamlastxmas wrote:
             | Virtually everyone will take the vest that needs
             | improvement over none at all
        
               | heleninboodler wrote:
               | Virtually everyone will opt out of the test, because the
               | choice is not, "I'm going to shoot a gun at you as a
               | test. Do you want the experimental bullet proof vest or
               | no bullet proof vest?"
        
               | bigtex88 wrote:
               | Not if the vest suddenly decides to malfunction and
               | explode or otherwise do something that kills you. Good
               | lord there is some poor logic happening in this thread.
        
           | ggreer wrote:
           | Yeah, people don't understand how dangerous motorcycles are.
           | The fatality rate for motorcycles in the US is 23 per 100
           | million miles traveled. For cars that number is 0.7. Assuming
           | you ride 4,000 miles per year for 40 years, that gives you a
           | 4% chance of death. The chance of being crippled or receiving
           | a brain injury is probably higher than that.
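The back-of-the-envelope figures above can be sanity-checked with a short script. This is an illustrative sketch: the function name and the exponential (independent-per-mile) survival model are my assumptions, not something from the comment.

```python
import math

def lifetime_death_risk(fatalities_per_100m_miles: float,
                        miles_per_year: float,
                        years: float) -> float:
    """Probability of at least one fatal crash over a driving career,
    treating each mile as an independent small-probability event."""
    p_per_mile = fatalities_per_100m_miles / 100_000_000
    total_miles = miles_per_year * years
    # Poisson-style approximation: P(death) = 1 - exp(-rate * exposure)
    return 1 - math.exp(-p_per_mile * total_miles)

# Rates quoted in the comment: 23 per 100M miles (motorcycle), 0.7 (car),
# assuming 4,000 miles/year for 40 years.
motorcycle = lifetime_death_risk(23, 4_000, 40)
car = lifetime_death_risk(0.7, 4_000, 40)
print(f"motorcycle: {motorcycle:.1%}, car: {car:.1%}")
```

With the quoted rates this gives roughly a 3.6% lifetime fatality risk for the motorcyclist versus about 0.1% for the car driver, consistent with the "4% chance of death" figure.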
        
         | starbase wrote:
         | It is a dumb question, but that's the best kind.
         | 
         | "could kill you if gone wrong" is nearly every choice in life.
         | The relevant metric is _risk_ of death or other bad outcomes.
         | 
         | Flying in a commercial airliner could kill you if things go
         | wrong. Turns out it's safer than driving the same distance.
         | 
         | Eating food could kill you if things go wrong (severe allergy,
         | choking, poisoning) yet it's far preferable to not eating food.
         | 
         | Similarly, Tesla's ADAS could kill you if things go wrong but
         | the accident rate (as of 2022) is 8x lower than the average car
         | on the road, and 2.5x lower than a Tesla without ADAS engaged.
         | 
         | Don't let the perfect be the enemy of the good.
        
       | armatav wrote:
       | It's a software update
        
         | armatav wrote:
         | Hahah - look at those downvotes.
         | 
         | Just to reiterate; it's a software update. You can FUD as much
         | as you want.
        
       | saudade97 wrote:
        | The amount of Tesla apologism on HN is nauseating. Per the
        | NHTSA, a safety recall is issued if either the manufacturer or
        | the NHTSA determines that a vehicle or its equipment pose a
        | safety risk or do not meet motor vehicle safety standards. On
        | its face, Tesla's situation clearly calls for classification
        | as a recall.
        
         | bigtex88 wrote:
         | We live in a strange world. No one should worship that sad
         | narcissistic piece of crap but yet the sycophants continue to
         | multiply.
        
           | joering2 wrote:
            | I think this has to do with the growing gap between the
            | richest/upper class and the poorest folks. Sure, you find
            | plenty of HN folks making 3.5x the average salary... but
            | the majority, considering inflation etc., are not doing
            | that well.
            | 
            | So the average poor Joe looks at Musk and, instead of
            | seeing him for what he is, cheers him on, thinking "one
            | day I will be like him [that rich], so I would want people
            | to cherish me the way I cherish him now". I think we see
            | that on every front, sadly including politics. I mean,
            | people like Boebert or MTG should never have had any power
            | even to decide what mix to use to clean a dirty floor, let
            | alone deciding millions of Americans' fates, but yet here
            | we are...
            | 
            | If anything, this numbness and ignorance will grow.
        
           | pbreit wrote:
           | Arguably the greatest entrepreneur in history. Is that worthy
           | of some praise?
        
             | literalAardvark wrote:
             | Think I'd still go with Rockefeller on that one.
        
               | pbreit wrote:
               | One oil company?
        
               | literalAardvark wrote:
               | _the_ oil company.
        
               | bigtex88 wrote:
               | You're trolling, aren't you? No one can be this stupid,
               | especially not someone who reads this blog.
        
               | pbreit wrote:
               | How so? I consider multiple creations substantially more
               | impressive than one.
        
             | bigtex88 wrote:
             | Ooh I can argue against that! I mean, he's clearly not.
             | He's more of a fantastically savvy investor than anything
             | else. The man clearly does not run any of his companies
             | because it's impossible to run 3 companies at once. Don't
             | let him fool you. It is IMPOSSIBLE.
             | 
             | There are clearly far FAR better entrepreneurs in history
             | but if you keep fellating him I'm sure Daddy Elon will love
             | you one day. You got this, don't give up!
        
               | pbreit wrote:
               | Care to name one or 2?
        
         | [deleted]
        
         | Kranar wrote:
         | Meh, the term recall conjures images of vehicles having to be
         | taken back to the dealership to get repaired, replaced, or
         | refunded. The recall announced here is just an over the air
         | software update.
         | 
         | It's worth pointing this out.
        
           | notyourwork wrote:
            | It's really not a differentiator, is it? The issue is
            | still an issue and must be addressed. OTA doesn't change
            | that.
        
             | literalAardvark wrote:
             | It certainly is different. Recalls are expensive and
             | annoying for the user, involving drives and waiting times.
             | This is an OTA update.
        
           | billiallards wrote:
           | In the same way that the term "full self driving" or
           | "autopilot" conjures images of vehicles driving themselves?
           | 
           | Not sure that Tesla should be opening the "what do words
           | mean" can of worms right now.
        
             | Domenic_S wrote:
             | > _In the same way that the term "full self driving" or
             | "autopilot" conjures images of vehicles driving
             | themselves?_
             | 
              | I disagree with your point here, but whether correct or
              | not, you should be consistent: if _words conjuring
              | things_ is important, then the previous commenter's
              | point is valid.
        
           | stefan_ wrote:
           | The question you need to be asking is how many of these
           | safety events they have swept under the rug because a fix is
           | always "just an over the air software update" away.
           | 
           | This recall only happened because, god forbid, the NHTSA got
           | off its ass and actually tested something.
        
             | Kranar wrote:
             | >This recall only happened because, god forbid, the NHTSA
             | got off its ass and actually tested something.
             | 
             | Where did you find this info? The article says it was Tesla
             | that did a voluntary recall. The NHTSA did no testing and
             | was not involved in the decision. The report, which is
             | linked in the article is authored by Tesla with no
             | involvement from NHTSA.
        
         | pbreit wrote:
          | I had the opposite reaction. The author is an avowed Elon &
          | Tesla hater and tried to work the word "recall" into almost
          | every sentence, despite the fact that it will be a simple
          | over-the-air software update. No recall whatsoever.
        
           | A4ET8a8uTh0 wrote:
           | Just because the issue can be solved by an update ( and
            | lately some recalls actually have been - see the Hyundai
            | GPS debacle ) does not mean it is not a recall. It just
           | to have a different means of correcting it. In other words,
           | update equals recall.
           | 
           | And that is before we get into whether it is appropriate to
           | update something that can move with enough kinetic force to
           | have various security agencies freaking out over possible
           | scenarios in the future ( and likely cause for some of the
           | recent safety additions like remote stop ).
           | 
           | In other words, just because he may be a hater does not make
           | the argument not valid.
        
       | adoxyz wrote:
       | I still can't believe FSD Beta was allowed on public roads. I've
       | had it since day 1, and oh my god. It behaves worse than a drunk
       | teenager. All the updates since, have made it barely better. This
       | needs to be pulled entirely and rethought.
        
         | pirate787 wrote:
         | I'm glad it is out there and thanks for buying it. The tech
         | promises to save millions of lives worldwide. Waiting a decade
         | to deploy the tech will mean hundreds of thousands of Americans
         | will die from driver error that self-driving tech could have
         | avoided.
        
           | alex_duf wrote:
            | I wonder how far other solutions to reduce road deaths
            | would go with the same budget?
           | 
           | How about better public transit, encouraging people to use
           | smaller cars, encouraging cycling with corresponding
           | infrastructure, etc.
        
             | panick21_ wrote:
              | > I wonder how far other solutions to reduce road
              | deaths would go with the same budget?
              | 
              | Far, far further. How to reduce road deaths is pretty
              | well understood; political implementation is the
              | problem.
              | 
              | And the simple fact is, reducing road deaths is cheap.
              | It's very cheap. If you want to make it look good and
              | fancy, it's a bit more expensive.
             | 
             | But the fact is, we know how to do it, and do it cheaply.
             | Fancy next generation car technology is pretty terrible in
             | terms of investment to return.
             | 
             | All you really need is a bunch of paint, and a few concert
             | spectators over various shapes. Pretty much all you have to
             | do is make cars slower, that is by far the most important
             | factor in mixed traffic. There are many ways to achieve
             | this.
             | 
             | If you want to get a bit more fancy and technical, you
             | can build Dutch-style protected intersections:
             |
             | https://en.wikipedia.org/wiki/Protected_intersection
             | 
             | > How about better public transit, encouraging people to
             | use smaller cars, encouraging cycling with corresponding
             | infrastructure, etc.
             | 
             | How about banning cars from many city centers, or only
             | allowing commercial traffic and people who live there?
             |
             | How about REQUIRING smaller cars: a limit on car size,
             | especially if you want to enter cities.
        
           | ghqst wrote:
           | Yeah so now the driver error can be automated. Great.
        
           | capableweb wrote:
           | The tech might promise to save millions of lives, but if
           | "behaves worse than a drunk teenager" is to be believed,
           | don't you think it's best to wait a bit, if you want to save
           | as many lives as possible?
        
           | elil17 wrote:
           | The "self driving" tech that works (e.g. auto braking) is
           | deployed and has saved lives. (Self driving in quotes because
           | it is not really the full promise of self driving.)
           | 
           | This Tesla AI does not work well enough in many conditions
           | and is clearly sometimes _more_ dangerous than humans.
           | Just watch videos of people using it - it's obvious that
           | it is making unforgivable errors.
        
             | ajross wrote:
             | Do you have evidence for "clearly _more_ dangerous"?
             | Because what numbers exist say the opposite. They are now
             | pushing a million (!) of these vehicles on US roads and to
             | my knowledge the bay bridge accident[1] is the _only_
             | example of a significant at fault accident involving the
             | technology.
             | 
             | It's very easy to make pronouncements like "one accident is
             | too many", but we all know that's not correct in context.
             | Right? If FSD beta is safer than a human driver it should
             | be encouraged even if not perfect. So... where's the
             | evidence that it's actually less safe? Because no one has
             | it.
             | 
             | [1] Which I continue to insist looks a lot more like
             | "confused driver disengaged and stopped" (a routine
             | accident) than "FSD made a wild lane change" (that's not
             | its behavior). Just wait. It took a year before that
             | "autopilot out of control" accident was shown (last week)
             | to be a false report. This will probably be similar.
        
               | tjohns wrote:
               | Just off the top of my head:
               | 
               | - A Tesla that drove into the median divider near
               | Sunnyvale on Highway 101, because it thought the gore
               | point was an extra lane. Split the car in half and
               | killed the driver. [1]
               | 
               | - A Tesla that autonomously ran a red light and killed
               | two people. [2]
               | 
               | - Multiple accidents where Teslas have driven into parked
               | emergency vehicles / semi trucks.
               | 
               | Is it quantitatively safer than human drivers? I don't
               | have that data to say one way or the other. But it's not
               | correct to say the Bay Bridge is the only at fault
               | accident attributable to autopilot.
               | 
               | [1]: https://ktla.com/news/local-news/apple-engineer-
               | killed-in-ba...
               | 
               | [2]: https://www.mercurynews.com/2022/01/21/felony-
               | charges-are-1s...
        
               | ajross wrote:
               | Those are autopilot accidents from before FSD beta was
               | even released.
               | 
               | I mean, it's true, that if you broaden the search to "any
               | accident involving Tesla-produced automation technology"
               | you're going to find more signal, but your denominator
               | also shrinks by one (maybe two) orders of magnitude.
               | 
               | And it also lets me cite Tesla's own statistics for its
               | autopilot product (again, not FSD beta):
               | https://www.tesla.com/VehicleSafetyReport
               | 
               | These show pretty unequivocally that AP is safer than
               | human drivers, and by a pretty significant margin. So as
               | to:
               | 
               | > Is it quantitatively safer than human drivers? I don't
               | have that data to say one way or the other.
               | 
               | Now you do. So you'll agree with me? I suspect not, and
               | that we'll end up in a long discussion about criticism of
               | that safety report instead [ _edit: and right on
               | cue..._ ]. That's the way these discussions always go.
               | No one wants to give ground, so they keep citing the same
               | tiny handful of accident data while the cars keep coming
               | off the production line.
               | 
               | It's safe. It is not _perfect_ , but it is safe. It just
               | is. You know it's safe. You do.
        
               | Xylakant wrote:
               | This data does not show what it pretends to show. The
               | underlying statistics are heavily skewed in favor of
               | Tesla by various factors:
               | 
               | * Autopilot can only be enabled in good conditions and
               | relies on human drivers in all others, to the point of
               | handing over to a human when it gets confused. Yet
               | they compare against all miles driven in all
               | conditions.
               | 
               | * Teslas are comparatively new cars that have - as they
               | proudly point out - a variety of active and passive
               | safety features that make the car inherently less prone
               | to accidents than the average, which includes old,
               | beat-up cars with outdated safety features. Teslas are
               | also likely to be better maintained than the average
               | car in the US, by virtue of being expensive cars
               | driven by people with disposable income.
               | 
               | * Teslas are driven by a certain demographic with a
               | certain usage pattern. Yet, they provide no indicator on
               | how that skews the data.
               | 
               | Tesla could likely provide a better picture by comparing
               | their data with cars in the same age and price bracket,
               | used by a similar demographic, in similar conditions.
               | They could also model how much of the alleged safety
               | benefit is due to the usual, active and passive safety
               | features (brake assistance, lane assist, ...). They
               | don't, and as such, the entire report is useless or
               | worse, misleading.
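The sampling skew described in the comment above is essentially Simpson's paradox, and a small sketch makes it concrete. Every number below is invented purely for illustration (these are not Tesla's or NHTSA's actual figures): a system that logs most of its miles in easy conditions can post a better aggregate crash rate than human drivers even when it is worse in every individual condition.

```python
# Hypothetical crash data per condition: (miles in millions, crashes).
# All figures are made up to illustrate the confounder, nothing more.
autopilot = {"easy": (95, 38), "hard": (5, 5)}   # mostly easy-condition miles
human = {"easy": (20, 6), "hard": (80, 64)}      # drives in all conditions

def aggregate_rate(data):
    """Crashes per million miles, ignoring conditions (the headline number)."""
    miles = sum(m for m, _ in data.values())
    crashes = sum(c for _, c in data.values())
    return crashes / miles

def rate_in(data, condition):
    """Crashes per million miles within a single condition."""
    miles, crashes = data[condition]
    return crashes / miles

# Headline comparison: the automated system looks far safer...
print(aggregate_rate(autopilot))  # 0.43
print(aggregate_rate(human))      # 0.7

# ...yet with these numbers it is worse than humans in *both* conditions.
print(rate_in(autopilot, "easy"), rate_in(human, "easy"))  # 0.4 0.3
print(rate_in(autopilot, "hard"), rate_in(human, "hard"))  # 1.0 0.8
```

Stratifying by condition (and, as the comment suggests, by vehicle age, maintenance, and driver demographics) is the minimum required before a headline per-mile rate supports any safety conclusion.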
        
               | ly3xqhl8g9 wrote:
               | It's called FSD as in _Full_ Self-Driving. If it has a
               | single "accident" where it's unable to see a totaled
               | car with hazard lights on right in its face [1], then
               | it's not really _full_, is it now? Not to mention the
               | hilarious contradiction in terms: _Full_, but _Beta_.
               | 
               | No one would have any major issues if this (self)killing
               | system was called "Extended Driver Assist" and it had
               | implemented the minimal safety features like driver
               | monitoring, safely stopping the car if the driver doesn't
               | pay attention to the road.
               | 
               | [1] https://www.reddit.com/r/IdiotsInCars/comments/100pem
               | h/tesla..., also, in this case, even a $20K Skoda [2] has
               | basic automatic braking safety features, no need for
               | _artificial intelligence_ to detect something stationary
               | in front of you.
               | 
               | [2] https://www.skoda-auto.com/world/safety-assistence-
               | system-ci...
        
               | elil17 wrote:
               | I don't believe one accident is too many. I made my
               | statement based on videos I've seen of people having to
               | disengage their beta FSD in circumstances where a human
               | driver would have no trouble.
               | 
               | Now, maybe the data says otherwise. If that is the case,
               | then great! Let's roll out some more FSD beta. But for
               | that data to be valid, you have to account for the fact
               | that Tesla filters bad drivers out of the pool of FSD
               | users. And as I understand it there is not public data
               | about the risk profiles of the people Tesla lets use FSD.
        
           | adoxyz wrote:
           | The Tesla tech is not ready and is deployed haphazardly.
           | Google and other companies are approaching self-driving in a
           | much safer way while still having tremendous progress.
        
             | worik wrote:
             | > Google and other companies are approaching self-driving
             | in a much safer way while still having tremendous progress.
             | 
             | Are they really?
             | 
             | It looks like they have run into the timeless problem of
             | "the last bug(s)". This time it is safety critical.
        
           | saboot wrote:
           | I'm really struck by how this comment has no consideration
           | of the quality or accuracy of current FSD technology.
           | 
           | Also, public transit would do the same.
        
             | adwn wrote:
             | > _Also, public transit would do the same._
             | 
             | Does public transit pick me up at my front door and drop me
             | off at any location I choose?
             | 
             | Does public transit take off the minute I leave the
             | house?
             | 
             | Is public transit available 24/7, even in remote areas?
             | 
             | Does public transit wait for me to load 2 kids, a dog, 4
             | suitcases, and various bags, blankets, and toys?
             | 
             | Does public transit turn around after 2 minutes, bring me
             | back, and wait for 30 seconds because I forgot my wallet on
             | the kitchen counter?
             | 
             | Will public transit still work during a public transit
             | strike? (yes, this actually happens over here in Europe)
             | 
             | No? Well, then you can hardly claim that "public transit
             | would do the same" as a self-driving car.
        
               | saboot wrote:
               | Then by all means let the self-driving cars roam free.
               | Current state of technology and failures be damned.
        
               | Doctor_Fegg wrote:
               | Does a Tesla travel at 125mph from my town to central
               | London?
               | 
               | No? Well, then you can hardly claim that a self-driving
               | car oh for goodness' sake why am I even attempting to
               | argue with motorheads on HN.
        
           | smachiz wrote:
           | I would be glad it was out there if it was aggressively
           | marketed as a driver-assistance feature, not as a self-
           | driving capability.
        
             | bdcravens wrote:
             | Even "Auto Pilot" is a poor name, given it's pretty much
             | just the same suite of safety features offered by others
             | (many of which are superior, given they still have LIDAR
             | and USS).
        
           | breck wrote:
           | Agreed! Thank you for saying this.
           | 
           | If committees of lawyer p*ssies were in charge of everything
           | we never would have had planes--they never would have figured
           | out how to make them light enough to fly. Many, many
           | thousands of men gave their lives so we could learn to
           | fly.
           | 
           | It takes courage and bravery to push the human race forward.
           | 
           | God speed Tesla.
        
             | gg-plz wrote:
             | Dang, please delete this bot account
        
           | bdcravens wrote:
           | How about Tesla allow users in the beta for free then?
           | Shouldn't the logic of waiting apply to Tesla, by charging
           | for something that could save lives?
        
             | mjmsmith wrote:
             | I'd prefer that Tesla allowed non-users to opt out of the
             | beta.
        
             | panick21_ wrote:
             | Tesla is using the same basic technology for security
             | features, many of those features are deployed to all cars.
             | The can and should do that for even more features over
             | time.
        
         | ChickenNugger wrote:
         | > _I still can 't believe FSD Beta was allowed on public
         | roads._
         | 
         | Fallible, distracted, intoxicated human beings are allowed
         | on public roads. The "legal limits" for blood alcohol levels
         | aren't 0.00%.
        
           | julianlam wrote:
           | Interesting...
           | 
           | The standard for human drivers is lower than the one for
           | FSD, and somehow you think that this is a good reason to
           | justify lowering the standard for FSD.
        
             | ChickenNugger wrote:
             | Strawman fallacy.
        
         | natch wrote:
         | You are meant to maintain control of the vehicle at all times
         | with the beta.
         | 
         | That's the only way it could be released in its current state.
         | 
         | Looking at it as anything other than a beta where you have to
         | maintain control is misunderstanding what it is. Which you are
         | clearly doing. It is absolutely expected to be worse than a
         | drunk teenager in this stage.
        
           | Doctor_Fegg wrote:
           | I wonder if there'd be so many people eager to pay Tesla $$$
           | if it were marketed as "Worse Than Drunk Teenager Self
           | Driving".
        
             | natch wrote:
             | Haha yeah I doubt it.
        
         | LanceJones wrote:
         | I beta test regularly in Vancouver BC on my Model S LR. My
         | experience is the polar opposite to yours. It's surprising (on
         | the good side) and has improved immensely in the past 12
         | months. It does not need to be rethought imo. Obviously your
         | opinion is valid, but it's not consensus.
        
           | fnordpiglet wrote:
           | Yeah I use it all the time in Seattle here on my Model X.
           | It's amazing. Sometimes it makes a questionable decision
           | but the vast majority of the time it works as expected. I
           | don't mind being alert to the driving conditions, and the
           | amount it can do without assistance is remarkable.
        
             | buildbot wrote:
             | I hope I am not next to you when it makes those
             | "questionable" decisions...
        
         | jejeyyy77 wrote:
         | yeah, I don't believe you. I use it every day and have been for
         | over a year and it's been awesome.
        
           | adoxyz wrote:
           | You can believe whatever you want. The reality is that FSD
           | Beta is unsafe and makes the dumbest decisions. Every update
           | I give it a try to see if anything has improved, and usually
           | within the first 30 seconds, I end up turning it off.
           | 
           | Autopilot on Freeways works ok for the most part though.
        
             | abc_lisper wrote:
             | You both could be right. It might depend on the area the
             | car is driven in. I guess you are not in the Bay Area?
        
               | adoxyz wrote:
               | I'm in Las Vegas. The roads are wide, the weather is
               | clear, traffic is low. FSD Beta should work great here.
               | But alas.
        
         | [deleted]
        
         | mrb wrote:
         | I have FSD too, and my opinion is that although it takes unsafe
         | actions almost on a daily basis, most of these errors are
         | minor, easily rectified by me the driver, and most of the time
         | it drives just fine, in fact better than a human driver. I
         | witnessed FSD save me from a collision by auto-braking
         | faster than I could have when another driver ran a red
         | light.
         | 
         | FSD doesn't need to be perfect to avoid accidents. In fact,
         | if it were near-perfect I'm afraid I could become very
         | inattentive at the wheel.
        
           | vel0city wrote:
           | > easily rectified by me the driver
           | 
           | So, not Full _Self_ Driving...
        
           | make_it_sure wrote:
           | Well, today it's cool to hate on any technology related to
           | Musk. Even if you had a good experience, people don't want
           | to believe you.
        
         | dreamcompiler wrote:
         | I have a Tesla and have tried FSD several times (month-to-month
         | plan and loaner cars from Tesla). It's comically bad.
         | Fortunately every time I tried it I was paying very close
         | attention, so I stopped it from braking at green lights,
         | trying to turn right from a middle lane, and making halfway
         | lane changes where it snaps back to the original lane
         | unexpectedly, etc.
         | 
         | The regular, included autopilot works pretty well as a smarter
         | cruise control; as long as you use it on a major highway in the
         | daytime in good conditions it does a good job. And it's very
         | useful in stop-and-go traffic on the highway.
         | 
         | But the FSD is crap.
        
           | natch wrote:
           | You have never tried FSD.
           | 
           | You may have tried FSD beta. Different things entirely.
        
             | dreamcompiler wrote:
             | Yeah, I was abbreviating. Nobody has tried FSD because it
             | doesn't exist.
        
       ___________________________________________________________________
       (page generated 2023-02-16 23:01 UTC)