[HN Gopher] Tesla FSD data is getting worse, according to beta t...
       ___________________________________________________________________
        
       Tesla FSD data is getting worse, according to beta tester self-
       reports
        
       Author : mfiguiere
       Score  : 472 points
       Date   : 2022-12-14 15:36 UTC (7 hours ago)
        
 (HTM) web link (electrek.co)
 (TXT) w3m dump (electrek.co)
        
       | nova22033 wrote:
       | https://www.thedrive.com/news/38129/elon-musk-promised-1-mil...
       | 
       | "Next year for sure, we will have over a million robotaxis on the
       | road," said Musk on October 21, 2019. "The fleet wakes up with an
       | over-the-air update. That's all it takes."
       | 
       | https://twitter.com/PPathole/status/1249209968877862919
       | 
        | Elon Musk @elonmusk * Apr 12, 2020
        | Replying to @PPathole and @Tesla
        | Functionality still looking good for this year. Regulatory
        | approval is the big unknown.
        
       | [deleted]
        
       | rafaelero wrote:
       | Tesla FSD may be bad, but no way it is that much worse than
       | Cruise. Looks like they are inflating their data.
        
         | bagels wrote:
         | Why do you say this?
        
           | rafaelero wrote:
           | Are you kidding? 40k km with no disengagement? This is nuts.
           | I don't think we have reached that level yet. And if we did,
           | I can't believe Tesla engineers would be so oblivious to that
           | that they wouldn't update their technology. The likeliest
           | explanation is that the article's chart is using some weird
           | metric that is not comparable between automakers.
        
             | ahelwer wrote:
             | You've given no actual reasons you find this hard to
             | believe, just running on 100% vibes. The data reported to
             | regulators doesn't lie. Tesla really is that far behind the
             | competition.
        
         | threeseed wrote:
         | I would not be surprised at all.
         | 
          | The biggest problem with FSD is bounding box detection: as
          | seen in previous videos, it routinely fails to identify
          | objects in the road correctly.
         | 
          | A problem that LiDAR (which Cruise uses) is especially good
          | at solving.
        
           | rafaelero wrote:
           | That magnitude of a difference would be pretty noticeable and
            | I haven't seen Cruise users posting amazing reviews on
           | YouTube in comparison to Tesla's. Maybe I am using an
           | incorrect proxy, but it's hard to believe that Tesla would be
           | so far off.
        
             | threeseed wrote:
             | The difference _is_ noticeable.
             | 
             | You only have to look at Cruise versus FSD videos to see
             | how the latter struggles with reliable object detection.
             | 
             | Also this is about Cruise the robo-taxi company and not the
             | GM SuperCruise technology which isn't what I would consider
             | a peer to FSD.
        
             | [deleted]
        
         | bin_bash wrote:
         | Cruise has cars driving out without a driver in San Francisco
         | right now. Tesla is a world away from being able to do anything
         | close to that.
        
       | bin_bash wrote:
       | I just tried FSD last night in suburban Dallas for the first time
       | with light traffic and it was harrowing. Drove in the wrong
       | lanes, almost hit a trash can, accelerated way too fast on side
       | streets, and it made a right turn at a red light without stopping
       | or even coming close to slowing down. This was a 3 mile drive.
       | 
       | I've been using autopilot on the freeway for years and that's
       | been mostly fine--but I'm never going to use FSD on streets
       | again.
        
         | influxmoment wrote:
         | Is your experience the norm though? Uncut YouTube videos show a
         | reliable FSD in many conditions
        
         | SketchySeaBeast wrote:
         | Did you actually keep it on those whole 3 miles? I would have
         | stopped after the first near miss.
        
           | bin_bash wrote:
           | I disengaged twice because I was terrified of what it might
           | do but turned it back on. I'm generally a very early adopter
           | with these kinds of things (I'm the type of person to always
           | turn on "Beta" mode regardless of what I'm working with). So
           | I have a high tolerance for things not working the way they
           | should.
           | 
           | This is unbelievably terrible though. I really regret
           | purchasing it.
        
             | tqi wrote:
             | > I'm generally a very early adopter with these kinds of
             | things
             | 
             | Which in turn makes everyone else around your car an early
             | adopter?
        
               | Cshelton wrote:
               | By that logic, all drivers are early adopters every time
               | a new driver gets their license and enters the "driving
               | pool".
        
               | d23 wrote:
               | We should require a test of one's ability to drive based
               | on some basic standards before issuing a license. We
               | should come up with a series of rules for what happens if
               | someone does not adhere to these standards, as well as a
               | mechanism of enforcement if they violate those rules.
        
               | [deleted]
        
               | ska wrote:
                | Isn't that why lots of jurisdictions have constraints
               | on the learning driver (e.g. graduated licensing of some
               | sort) and/or visibility requirements (e.g. car has to
               | have a "learner" sticker of some sort) so that other
               | drivers know?
        
               | maximus-decimus wrote:
                | I've literally never heard of such stickers.
        
               | philjohn wrote:
               | Hence why in the UK they're strongly advised to have a P
               | plate so other drivers are aware they are a new driver.
               | 
               | I suppose everyone should just assume a Tesla is about to
               | do something silly and drive defensively ...
        
               | sifar wrote:
                | They probably are to some extent, and you know where
                | the liability would lie should something bad happen.
                | What about this case? There is no sense of
                | responsibility or realization of the danger they are
                | introducing _at scale_.
               | 
               | Even when they actually admit that they have failed at it
               | [0]. I am not sure if they are aware of the doublespeak
               | in this admission.
               | 
               | [0] Failure to realize a long-term aspirational goal is
               | not fraud.
        
               | witheld wrote:
                | I think this logic still holds.
                | 
                | Normally, each person puts one new driver on the road
                | per lifetime.
                | 
                | When you beta test a baby driving robot, you're now at
                | two new drivers per life! And the Tesla doesn't seem to
                | be learning faster than a human!
        
             | Zigurd wrote:
             | This makes me feel like running an Android beta on my daily
             | use phone just isn't that daring.
        
               | mcculley wrote:
               | This thread and the other comments on this post are
               | amazing. One cannot sell a car without a seatbelt, but
               | Tesla can use their customers for beta testing a
               | dangerous system that drives a whole car around.
        
             | jacquesm wrote:
             | It would seem like a return and a refund for selling a
             | defective product is in order.
        
               | masklinn wrote:
               | I expect that's exactly why you're paying to be a beta
               | tester: they can keep your money and not deliver
               | anything.
               | 
               | The one time I really beta tested a for-profit product,
               | not only did I get it for free, I actually got a rebate
               | on the final product (it was pycharm, and jetbrains gave
               | me a free license for a year, which they got back many
               | times over as I renewed yearly ever since).
               | 
               | Though I guess the early accesses I got for kickstarters
               | were kinda like paying for a beta in a way.
        
               | jacquesm wrote:
               | I don't think 'beta tester' is a recognized class in
               | terms of consumer law. You're a customer, a merchant, a
                | manufacturer or a bystander. Besides that, for something
               | that costs that kind of money you can simply expect it to
               | work.
        
         | fiddlerwoaroof wrote:
         | I've been using FSD for a month and my experience with it has
         | been mostly great. I was skeptical initially because of the
         | prevalence of this sort of comment online but it didn't match
         | my experience. There are predictable situations where it can't
         | navigate but you get used to them and anticipate them.
         | Otherwise, it is a nice upgrade in functionality over EAP that
         | generally makes the car nicer to operate.
        
       | ljlolel wrote:
       | Worse is that Tesla is no longer releasing the safety data
       | quarterly as promised. Not exactly hard to report this as they
       | already have been doing it. Suspicious to stop reporting since
       | 2021.
       | 
       | We want this data!
       | 
       | https://www.tesla.com/VehicleSafetyReport
        
         | geocar wrote:
          | Am I misreading that? They're comparing accidents of Tesla
         | owners against people belonging to different demographics and
         | circumstances!?
         | 
         | An equivalent improvement in driving safety might also be had
         | by eliminating drivers under 30 and since those aren't Tesla
         | drivers (the average Tesla owner is a 54 year old white man
         | making over $140,000 with no children) this self-driving
         | malarky would have zero safety impact.
        
           | ljlolel wrote:
            | Their growing number of more affordable cars is probably
            | why their safety data is getting worse and why they're not
            | releasing it any more (I support Tesla, by the way; I just
            | wish it were transparent about safety).
        
       | ajross wrote:
        
         | Scea91 wrote:
         | Out of curiosity, are you perhaps driving on streets that are
         | likely to be heavily represented in the training data?
         | 
         | I would be interested in seeing how performance differs between
         | cities and countries.
        
         | jjulius wrote:
         | >But this software is working folks. It's going to happen. As I
         | see it our choice is either to celebrate the advent of new
         | technology or hide behind comment vote buttons trying to
         | pretend that it isn't.
         | 
         | I don't believe people don't want this technology to happen, or
         | don't think it's going to happen; I think the pressure and the
         | criticism is there because people want it to be safe. When it
         | comes to safety, I'd rather have FSD be subject to more
         | criticism than it needs to be, not less.
        
           | ajross wrote:
           | The linked article isn't about safety data, FWIW. FSD beta
           | has been deployed on hundreds of thousands of cars for over a
           | year now. If there was any safety data to measure, it would
           | be shining like a beacon at this point. Which is why the
           | people (Taylor Ogan and Fred Lambert absolutely qualify,
           | FWIW) with an anti-Tesla schtick need to scrap for
           | editorialized content like this[1].
           | 
           | The truth is that this is deployed as a supervised system,
           | and as such it's at least as safe as the driver operating it.
           | I can absolutely attest to situations where the car noticed
           | and reacted to a safety situation before I did. I think on
           | the whole I'm a safer driver monitoring my car vs. operating
           | it.
           | 
           | [1] In this case, measuring self-reported "disengagement"
           | data of amateur drivers in general traffic vs. the controlled
           | conditions reported by professionals operating vehicles in
           | limited environments and with limited functionality. You
           | can't take a Waymo or Cruise on the freeway to try it out,
           | they won't operate in parking lots or unpainted roads or
           | construction areas, hell my understanding is that they fence
           | off and disallow most unprotected left turns! You can get a
           | Tesla to try any and all of that (and I do! regularly! and
           | sometimes it fails!). I mean, come on, this isn't a fair
           | comparison and you know it.
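            | 
            | To make the comparability point concrete, here is a minimal
            | sketch with made-up numbers (not figures from any regulator
            | filing): the same "miles per disengagement" ratio comes out
            | wildly different depending on the operating domain and on
            | what gets logged as a disengagement.
            | 
            |   # Hypothetical fleets; every number below is invented
            |   # purely for illustration.
            |   fleets = {
            |       # Geofenced robotaxi: easy, pre-mapped miles only,
            |       # no freeways, few unprotected left turns.
            |       "robotaxi": {"miles": 800_000, "diseng": 20},
            |       # Consumer beta: arbitrary roads, amateur drivers,
            |       # every nervous tap of the wheel logged.
            |       "consumer_beta": {"miles": 800_000, "diseng": 4_000},
            |   }
            | 
            |   for name, f in fleets.items():
            |       rate = f["miles"] / f["diseng"]
            |       print(f"{name}: {rate:,.0f} miles per disengagement")
            | 
            |   # 40,000 vs. 200 miles per disengagement: a 200x gap that
            |   # says nothing about relative safety unless road mix and
            |   # the definition of "disengagement" are held constant.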
        
             | jjulius wrote:
             | >The linked article isn't about safety data, FWIW.
             | 
             | Que? Disengagement can occur for a variety of reasons, up
             | to and including preventing the car from performing a
             | hazardous/deadly move. This is absolutely about safety.
        
         | leesec wrote:
         | Not sure why this was downvoted lol. If everyone can happily
         | share their stories of autopilot failing without being
         | downvoted why not a positive case?
        
           | frosted-flakes wrote:
            | Because AJ explicitly asked to be downvoted.
        
           | TulliusCicero wrote:
           | It's pretty damn obvious why it's being downvoted, or did you
           | not make it through the whole comment?
           | 
           | > But this software is working folks. It's going to happen.
           | As I see it our choice is either to celebrate the advent of
           | new technology or hide behind comment vote buttons trying to
           | pretend that it isn't.
           | 
           | > For me, I'm happier watching it than I would be being a
           | luddite. I'll take my downvotes now, thanks.
           | 
           | Snarkily dismissing everyone else's experiences as invalid
           | isn't a great way to endear yourself.
        
           | pmarreck wrote:
           | It's one thing to report your positive experience. It's
           | another to presume that everyone else who is reporting a
           | negative one is just a complainer, because you're assuming
            | that you and they have equivalent experiences. "This software
           | is working, folks!" is an example of that; so is the
           | presumption that the complainers are simply caught up in
           | "Tesla/Musk hate"
        
           | [deleted]
        
         | pmarreck wrote:
         | The downvoting is for this: You presume that your own
         | experience is the same as everyone else's (regardless of
         | differing driving conditions, etc.), and thus, that everyone
         | else is being overly critical of their (assumed to be similar)
         | experiences.
         | 
         | There's a term for this: Gaslighting.
        
           | bilvar wrote:
           | > You presume that your own experience is the same as
           | everyone else's
           | 
           | Aren't the negative experience comments doing the EXACT same
           | thing?
        
             | ryanwaggoner wrote:
             | No? Surely you've had a negative experience with a product
             | or business and recognized that not every other person out
             | there has had the same experience.
             | 
             | The people here sharing negative experiences aren't
             | suggesting that the positive experiences others report are
             | just the result of being a Musk fan.
        
               | kjksf wrote:
               | The person who was downvoted didn't say that negative
               | experiences are from Musk haters.
               | 
               | He correctly predicted that his positive experience will
               | be downvoted by Musk haters even though it's as valid as
               | negative experiences shared by other people that were not
               | downvoted.
               | 
               | That's clear bias and so obvious that it can be called
               | ahead of time.
               | 
                | He didn't write anything less valid than the upvoted
               | complaints and shouldn't be downvoted by anyone.
        
               | ncallaway wrote:
               | > The person who was downvoted didn't say that negative
               | experiences are from Musk haters.
               | 
               | I got the...pretty strong implication from it. I didn't
               | downvote them, but it did rub me the wrong way for
               | exactly the implication that the people with negative
               | experiences are from Musk haters.
               | 
               | There are two sentences that, taken together, seem very
               | dismissive of people who have negative experiences:
               | 
               | > Everyone hates it (and given the dumpster fire at
               | Twitter right now, everyone really loves to hate Tesla).
               | 
               | > But this software is working folks.
               | 
               | "Everyone hates it" and the parenthetical reference to
               | Twitter really leans into the idea that people who
               | dislike it feel that way for ideological reasons and not
               | their own personal experience with the vehicle.
               | 
               | That's strongly reinforced by the next sentence that
               | states "This software is working folks", which is a flat
               | categorical statement that contradicts the experiences of
               | the people who dislike it.
               | 
               | So, I didn't downvote, but if I did it would be exactly
               | for the implication that the people who dislike it didn't
               | have valid reasons for disliking it.
        
               | ryanwaggoner wrote:
               | I downvoted because I downvote anyone who preemptively
               | complains / invites downvotes and assumes that they'll be
               | doled out in bad faith. It pollutes the conversation and
               | is the cheapest form of sophistry. It's no different than
               | setting up an argument that ends with "and if you
               | disagree, that just shows how right I am!"
        
           | ajross wrote:
           | Please don't. I don't think anyone in the linked article or
           | in this thread is lying. I think they all have real
           | experiences to relate and we should listen to them. Given
           | that, can you please retract your needlessly inflammatory
           | accusation of gaslighting? Thanks.
           | 
           | All I'm saying is that my experience makes it very clear that
           | what problems remain with this technology are resolvable
           | ones. Consumer cars _are_ going to drive themselves. Period.
           | And given existing evidence, it seems reasonably clear to me
            | that Tesla is going to get there first (though I wouldn't be
           | shocked if it's Waymo, Cruise seems less like a good bet).
           | 
           | And given that interpretation, I find the kind of grousing
           | going on in this thread unhelpful. If I show you a bug in
           | your iPhone, would you immediately throw it out and declare
           | that no one could possibly market a smartphone? That's how
           | this thread sounds to me.
        
             | TulliusCicero wrote:
             | > needlessly inflammatory
             | 
             | > As I see it our choice is either to celebrate the advent
             | of new technology or hide behind comment vote buttons
             | trying to pretend that it isn't.
             | 
             | > For me, I'm happier watching it than I would be being a
             | luddite. I'll take my downvotes now, thanks.
             | 
             | Pot, meet kettle.
        
               | ajross wrote:
               | I think the fact that my post now sits flagged and
               | invisible, yet you're still commenting on it, more or
               | less bears out the point I was making. I'm genuinely
               | sorry you took offense, really I am. I thought that was
               | fairly gentle, honestly. It was intended to be a fun way
                | to point out the tribalism at work here. But... some
                | things are maybe just too fun to hate?
               | 
                | Let me repeat: _I love my car._ I'm not here to hate.
               | I'm here to try to explain how great this thing is.
               | Really, you have no idea how fun it is to have a robot
               | drive you around town. And, comments like yours and the
               | rest here make it clear that you're really missing out on
               | that kind of joy in your rush to hate.
               | 
               | Call a friend and get a ride in an FSD beta car. You
               | really might change your mind.
        
             | greendave wrote:
             | > All I'm saying is that my experience makes it very clear
             | that what problems remain with this technology are
             | resolvable ones. Consumer cars are going to drive
             | themselves. Period. And given existing evidence, it seems
             | reasonably clear to me that Tesla is going to get there
             | first (though I wouldn't be shocked if it's Waymo, Cruise
             | seems less like a good bet).
             | 
             | Perhaps I'm unreasonably dense, but I do not see the link
             | (direct or otherwise) between 'it works well for me under
             | the following circumstances' and 'it will work well for
             | (almost) everyone, under all (reasonable) circumstances'.
             | The former can be true, without any guarantee of the
             | latter. Given all the hyperbole and promises that have been
              | made thus far, it's hardly surprising that folks are
             | inclined to be skeptical, particularly if their own
             | experiences are as poor as many in this thread indicate.
             | 
             | What I do agree with is that cars are going to be allowed
             | to drive themselves, irrespective of how well they can do
             | it, because the politicians and the bureaucrats apparently
             | are unable to resist the siren song of 'progress'. That
              | much we've seen in SF with Cruise among many other places.
             | Whether this is a good thing is a different question
             | entirely...
        
               | kjksf wrote:
               | Jules Verne predicted space flight decades before
               | technology made it possible.
               | 
                | In 1903, the New York Times predicted that airplanes would
               | take 10 million years to develop
               | (https://bigthink.com/pessimists-archive/air-space-
               | flight-imp...).
               | 
                | Many people who saw the first actual flights from the
                | Wright brothers dismissed them as a parlor trick that
                | would not amount to anything practical.
               | 
               | Technology has a long history of exponential
               | improvements. At first it improves slowly and then
               | suddenly.
               | 
               | Sure, everyone (not just Tesla) was expecting self-
                | driving to happen by now. It hasn't happened yet.
               | 
               | I share the OP's perspective that the maturity of self-
               | driving is already so high that it's a matter of time for
               | it to become viable to deploy at scale.
               | 
                | The initial DARPA challenge was the Wright brothers'
                | first flight. We're closer to a plane that can travel
                | halfway across the Atlantic. Not quite there yet, but
                | it's just a matter of time until it can cross it.
        
         | ncallaway wrote:
         | Do you disagree with the article's call for Tesla to publicly
         | release the same kind of disengagement data that others release
         | publicly, so that we don't have to just compare anecdotes as if
         | they were data?
         | 
         | This thread is full of people who are describing their own
         | personal experiences, which mostly tilt negative, with some
         | positive notes like yours. But the article makes the point that
         | a bunch of anecdotes is *not a replacement for data*, and the
         | data we have (which is seriously flawed), shows that the
         | average experience is not as good as yours.
         | 
         | Maybe your experience is an outlier compared to the data that's
         | been collected? Maybe your experience is the norm and the Tesla
         | data would corroborate your experience. We can't really know
         | unless Tesla releases their data.
         | 
         | But they refuse to.
        
           | [deleted]
        
       | devmor wrote:
       | I briefly tested FSD out on a Model 3, for about a mile and a
       | half on what was mostly a clean, recently painted 2-lane road. It
       | seemed to be okay until I got to the point the lines broke for an
       | exit and then it tried to drive me into a concrete median.
       | 
       | I trust the lane-keep feature on the old 2017 Corolla I rented
       | more than FSD.
        
       | gzer0 wrote:
       | Let me know of any other system that is even remotely close to
       | being able to do the following:
       | 
       | https://www.youtube.com/watch?v=qFAlwAawSvU
        
         | klabb3 wrote:
         | An average human brain with 1000 hours of training.
        
       | killjoywashere wrote:
       | I have been driving Tesla FSD 10.69 for the last few weeks and it
       | is far from perfect. It's definitely beta level software, and
       | unfortunately, there's no real distinction between beta web
        | frontend and beta life-critical ADAS.
       | 
        | I routinely have to take control. It definitely accelerates too
       | hard on some sleepy side streets. It's extremely conservative
       | about speed and lane changes. It's terrible at anticipating
       | slowing where humans obviously would, like brake lights stacking
       | back toward you on a curving highway: the driver can see the
       | braking problem coming from a thousand yards ahead, maybe miles.
       | Summon is slightly better than a parlor trick. FSD is
       | approximately useless in parking lots.
       | 
       | But....
       | 
       | It's not completely unusable. There are definitely segments of a
       | drive I can mentally relax (back to my normal driving level of
       | attentiveness).
       | 
       | Having gotten two kids through their driving tests recently, I'm
       | struck how much FSD 10.69 resembles a 16 year old with their
       | early driver's permit. They're terrible. But they are learning
       | really fast.
       | 
       | Similarly, Tesla is now getting millions, if not billions, of
       | disengagements coming back for retraining. The next release is
       | going to be better.
        
         | VBprogrammer wrote:
         | > can mentally relax (back to my normal driving level of
         | attentiveness).
         | 
         | I'm not sure if this was intentionally funny but it made me
         | laugh.
        
           | rootusrootus wrote:
           | It sounds accurate to me. If you are a defensive driver, you
           | will most likely find that even basic autopilot increases
           | your stress substantially. It happily drives you full speed
           | into situations your brain easily recognizes as risky.
           | Personally I'd probably have palpitations if I tried FSD.
        
             | bestcoder69 wrote:
              | Damn I can't wait to try Neuralink.
        
       | sanedigital wrote:
       | When our Model 3 got access to FSD, my 6 year-old desperately
       | wanted to try it out. I figured a low-traffic Sunday morning was
       | the perfect time to test it out, so we headed out to grab Mom a
       | coffee downtown.
       | 
       | The car almost caused an accident 3 separate times.
       | 
        | The first was when it almost drove into oncoming traffic at an
       | intersection where the road curves north on both sides.
       | 
        | The second was when it failed to understand a fork on the right
       | side of the road, swerved back and forth twice, and then almost
       | drove straight into a road sign.
       | 
       | The third, in downtown, was when a brick crosswalk confused it on
       | a left turn, causing it to literally drive up onto the corner of
       | the intersection. Thank God there weren't pedestrians there.
       | 
       | When we upgraded to a Y, I turned down FSD. I don't imagine I'll
       | be paying for it ever again.
        
         | hn_throwaway_99 wrote:
         | Honestly, I feel this way about pretty much all "driver assist"
         | systems in the wild today.
         | 
         | That is, I fully understand that to get to Levels 3, 4, and 5,
         | you need to pass through levels 1 and 2 in autonomous driving.
         | But the issue is that I feel like these systems are basically
         | at the "slight drunk teenager" stage, with you as the driver
         | having to ensure they don't mess up too badly. Honestly, unless
         | I can, say, read a book or do something else, these systems
         | (I'm specifically referring to an advanced cruise control/lane
         | keeping system) right now just require me to pay MORE attention
         | and they stress me out more.
         | 
         | Fully understand we need folks to help train these systems,
         | but, at least for me, they currently make the driving
         | experience worse.
        
           | influxmoment wrote:
            | For the Tesla driver assistance specifically (non-FSD),
            | it's more advanced and reasonably reliable. I find it helps
            | a great deal to reduce fatigue on long drives. It is nearly
            | flawless on highways, and watching to see that the car is
            | safe is much less fatiguing than constant centering and
            | monitoring of the accelerator. Seeing that the car is at
            | the right speed takes less mental energy than constantly
            | controlling power to hold the right speed.
        
             | roofone wrote:
             | Given the potential consequences of a mistake, it feels
             | like there's still a pretty big difference between "nearly
             | flawless" and flawless.
             | 
             | Speed control I'm fine with and is obvs. a mature tech that
             | has been around for decades. Maybe it's the way I drive,
             | but I find lane assist a liability -- especially on curves.
             | More than once the car swerved unexpectedly one way or the
             | other going around a bend. After the 2nd time that
             | happened, I shut it off.
        
               | burnished wrote:
               | I suspect the difference in experience might be
               | attributable to differences in the environment. I went
               | cross country in a model Y and noticed that it did not
               | handle one lane turning into two lanes with any grace -
               | but I also drove across entire states where that didn't
               | come up. It wouldn't surprise me if some experiences were
               | regional to an extent.
        
               | vel0city wrote:
               | Lane assist isn't supposed to entirely keep you in the
                | lane on its own, it's supposed to just help tug you in
               | the right direction as a hint in case you weren't paying
               | perfect attention. It's usually not supposed to steer the
               | car entirely on its own.
        
           | chris11 wrote:
           | FSD marketing has always seemed sketchy to me. Though I like
           | Open Pilot. It's a much smaller scope for an L2 system to
           | keep you in your lane and keep you a safe distance from the
           | next car. It works well for highway driving.
        
           | josefresco wrote:
           | > Honestly, I feel this way about pretty much all "driver
           | assist" systems in the wild today.
           | 
           | I've found adaptive cruise control to be a simple, noticeable
           | improvement.
        
           | vel0city wrote:
           | I personally really enjoy my ADAS systems on my cars even
           | though it's not to the read a book or take a nap level of
           | automation. It's really just cruise control on steroids. Do
           | you see value in regular cruise control systems, even though
           | it's not 100% automated?
           | 
           | When I'm in stop and go traffic, I really like not having to
            | constantly go back and forth between the brake and gas
           | pedal, I can just pay attention to what's happening in and
           | around the lane and let the car keep a good follow distance.
           | 
           | I've gone over 100 miles at a time without having to touch
           | the gas or brake pedal, even through stop and go traffic.
        
           | tablespoon wrote:
           | > Honestly, I feel this way about pretty much all "driver
           | assist" systems in the wild today.
           | 
           | My five year old Honda has a very limited driver assist
           | system (radar cruise control + lane centering), which (in my
           | opinion) is very good at what it's trying to do. It has _no_
           | pretensions of being a  "self-driving" system, but it very
           | successfully manages to reduce some of the stress of driving
           | and make _my_ driving better. I think the key point is it
           | only automates the fine adjustments and provides alerts, but
           | is very careful to never allow the driver to rely on it too
           | much.
        
             | dont__panic wrote:
             | I feel the same way about my Subaru's EyeSight system. It
             | helps me stay in the lane, and annoys me if I get
             | distracted and cross a line. It slows down the car
             | automatically when it detects an obstacle ahead. It speeds
             | up and slows down automatically to maintain a mostly-steady
             | speed when I set cruise control.
             | 
             | Until autonomous vehicles reach "read a book or fall
             | asleep" levels, this is all I'm interested in. No thank you
             | to any dumb "autopilot" system that I can't actually trust,
             | but _tries_ to control my wheel.
        
         | ryandvm wrote:
        
           | [deleted]
        
           | [deleted]
        
           | monero-xmr wrote:
           | A single cancer drug from idea to FDA approval kills untold
           | numbers of rodents and monkeys. Price we are willing to pay
           | as humans.
        
           | bilvar wrote:
            | Could you please not propagate fake news? The signal-to-
            | noise ratio on the internet is low enough already.
        
           | yreg wrote:
            | ~1500 animals died at Neuralink, the vast majority being
           | rodents. These are not large numbers.
           | 
           | https://www.reuters.com/article/factcheck-neuralink-
           | fabricat...
        
             | pmarreck wrote:
             | This factcheck looks legitimate. "Over 280 sheep, pigs and
             | monkeys" is unfortunately not very specific though.
        
               | shkkmo wrote:
               | It does show that the "3000 monkeys" claim is
               | misinformation since the real number is over an order of
               | magnitude less.
        
             | bearmode wrote:
             | It's a huge number for this sort of research. _Huge_.
        
             | lostlogin wrote:
             | That article also mentions an ongoing federal investigation
             | into the poor treatment of animals there.
        
             | falcolas wrote:
             | > These are not large numbers.
             | 
             | Only when compared to completely different industries, the
             | most commonly cited being education or food.
             | 
             | When compared to its own industry, the numbers are still
             | large.
        
               | yreg wrote:
               | Its own industry being medical research? Perhaps somewhat
               | more important than eating meat or beauty products?
        
         | mentalpiracy wrote:
         | Serious question for you: the FSD near-catastrophically failed
         | twice before you got to town. Why did you continue to use FSD
         | as you entered a more populated area?
        
         | sam36 wrote:
         | That sounds really terrible. I honestly thought that Tesla FSD
          | was better than that. It just reminds me of the opinions I had
         | (and still have) like 15+ years ago when I'd get into
         | discussions on random forums about self driving cars. I mean
          | sure, perhaps 100 years down the road, when every vehicle is
          | required to be in a fully autonomous mode with interlinked
          | wireless communication, maybe that would work.
         | 
         | But that is not what we have right now. Right now every driver
         | on the road is exposed to a potential instant tragedy that is
         | unavoidable. I mean, what is a self driving car going to do if
         | a large semi drops a huge log or freight right in front of you?
         | You have one second to impact. You can either attempt to hold
         | your line and drive over the object, potentially
         | killing/injuring the passengers. Or you can swerve left into
         | oncoming traffic. Or you can swerve right into a busy two way
         | frontage road.
         | 
         | No matter which choice is taken, I guarantee there will be a
          | lawsuit. Perhaps one way forward would be something similar
          | to what medical companies did in the 1980s with the
         | creation of the "vaccine courts". Maybe we need some kind of
         | new "autonomous driving" court which would be staffed with
         | experts who would then decide what kind of cases have merit.
         | That would at least better shield the parent companies and
         | allow them to continue innovating instead of worrying about
         | potential litigation.
        
           | adrr wrote:
            | It's fine in 90% of scenarios. Those 10% are scary. Mine
            | tried to do a right turn at a red light last week but
            | didn't properly detect the cross traffic. That type of
            | scary. It would be nice if cars talked to each other so it
            | didn't just have to rely on vision.
        
         | fnordpiglet wrote:
         | This was my experience with the beta up until about 3 months
         | ago. Since then it's remarkably improved.
        
         | dehrmann wrote:
         | Pointing out the obvious, it's extremely negligent for Tesla to
         | have this feature available. It's not even close to ready for
         | general use.
        
           | musha68k wrote:
            | The profit motive is so clear in this case, and criminal.
            | Collecting telemetry for free so you get to improve your
            | ever-elusive model, literally paid for by your "customers"
            | and potentially with their (and "collateral") lives. It's
            | horrendous.
        
           | akmarinov wrote:
           | And who's going to stop them? Certainly not the US
           | government.
           | 
           | Enjoy beta testing FSD as an unwilling pedestrian.
        
             | kibwen wrote:
             | If I see a Tesla in the wild, I shoot its tires out before
             | it has a chance to strike. That's the American way.
             | 
             | (I assume this is why the Cybertruck features bulletproof
             | glass, in case my self-firing gun (now in beta)
             | misidentifies the tires.)
        
           | brandonagr2 wrote:
           | It's extremely negligent for the driver to let the car drive
           | onto a sidewalk. As the driver you are solely responsible for
           | everything the car does.
        
         | foobarian wrote:
         | One problem with this experiment is that, I'm guessing, you
         | monitored the car very closely, ready to take over at any
         | moment, and, in fact, did so in a few cases.
         | 
         | But this raises the question, what would have actually happened
         | if you just let the car be? Would it have gotten into
         | accidents? Or maybe it just drives in this alien unfamiliar way
         | but is actually safer?
        
           | AlexandrB wrote:
           | > Or maybe it just drives in this alien unfamiliar way but is
           | actually safer?
           | 
           | As long as the roads are a mixed environment of autonomous
           | and non-autonomous vehicles, driving in an unfamiliar way is
           | by definition unsafe because other road users can't
           | anticipate what your car is going to do. That's not even
           | mentioning pedestrians or cyclists.
        
           | rurp wrote:
           | Sounds like great questions to answer in a safe controlled
           | test environment before letting it loose on public roadways.
        
         | thepasswordis wrote:
         | I had a similar experience when I first got FSD.
         | 
         | What I realized was just that I was being scared of the
          | computer. It _wasn't_ about to drive into traffic, and it
          | _wasn't_ about to crash into anything like that.
         | 
         | What was happening was that I was rightly being overly cautious
         | of the _beta_ program, and taking control as soon as there was
         | really anything other than driving in a straight line on an
         | empty road.
         | 
         | Over time, it became a dependable, usable copilot for me. I
         | would massively miss FSD if it was gone.
        
         | bmitc wrote:
         | Why give such a company even more of your money though?
        
         | judge2020 wrote:
         | > Thank God there weren't pedestrians there.
         | 
         | It would've still stopped - the FSD system doesn't override the
         | background tasks that power the AEB systems that stop the car
         | from hitting humans or cars.
        
           | onlyrealcuzzo wrote:
           | Except that time FSD drove right into a semi truck and killed
           | the guy sleeping in his car...
        
             | blendergeek wrote:
             | Citation needed.
             | 
             | Unless you are referring to these [0] incidents from 2016,
             | before FSD was released. That was Tesla Autopilot (which
             | comes standard on all Tesla vehicles now).
             | 
             | Also, FSD uses active camera monitoring to make sure the
              | driver is paying attention, so no, you can't sleep in your
              | car while FSD is activated.
             | 
             | [0] https://www.wired.com/story/teslas-latest-autopilot-
             | death-lo...
        
             | sharpneli wrote:
             | You mean it would disable FSD when it would see that impact
             | was inevitable so Tesla could claim that FSD had nothing to
             | do with the accident.
        
               | judge2020 wrote:
               | > To ensure our statistics are conservative, we count any
               | crash in which Autopilot was deactivated within 5 seconds
               | before impact,
               | 
               | https://www.tesla.com/VehicleSafetyReport
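                | 
                | As a minimal sketch of how that counting rule could be
                | applied to a set of crash records (the variable names
                | here are hypothetical, not Tesla's actual schema):
                | 
                |   from typing import Optional
                | 
                |   WINDOW_S = 5.0  # seconds, per the quoted methodology
                | 
                |   def ap_crash(deact_s: Optional[float]) -> bool:
                |       # deact_s: seconds between Autopilot deactivation
                |       # and impact; None means it was never engaged.
                |       # Count the crash if Autopilot was on at impact
                |       # (0 s) or switched off within 5 s before it.
                |       return deact_s is not None and deact_s <= WINDOW_S
                | 
                |   crashes = [0.0, 3.2, 12.0, None]
                |   print(sum(ap_crash(c) for c in crashes))  # -> 2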
        
               | lostmsu wrote:
               | Would make sense with 30+ seconds depending on the level
               | of the warning
        
           | andsoitis wrote:
           | > It would've still stopped
           | 
            | We don't know that. And Tesla cannot claim it.
        
             | 93po wrote:
             | We also don't know the opposite but that sure as heck won't
             | stop people claiming they know it as a fact and cause the
             | publication of dozens of news articles over nothing.
        
               | worik wrote:
               | It is the precautionary principle.
        
           | westpfelia wrote:
           | > It would've still stopped
           | 
           | Would it have though?
           | 
           | https://www.youtube.com/watch?v=3mnG_Gbxf_w
           | 
           | At this point I can't imagine buying anything Tesla related.
           | Or anything related to Elon Musk for that matter. He's a
           | grifter who has managed to stand on the shoulders of giants
           | and call himself tall.
        
           | mcculley wrote:
           | AEB cannot magically instantaneously stop a car that FSD has
           | put into the wrong lane. If you believe otherwise, please do
           | not drive. Cars are dangerous in the hands of those who do
           | not take risks seriously.
        
         | [deleted]
        
         | linsomniac wrote:
         | Tesla Autopilot significantly decreases fatigue on long trips.
         | I have on numerous occasions (I think 10+) driven 2,400 mile
         | trips. You absolutely have to stay aware during the use of
         | Autopilot, but it really helps decrease cognitive load in many
         | instances during a cross country trip.
        
         | sakopov wrote:
         | I don't have a Tesla but I do follow FSD development and a few
         | people who test updates on Youtube. It really seems like your
         | experience with FSD will vary depending on where you live. I
         | see a guy testing FSD updates in downtown Seattle and
          | neighboring residential streets, where it seems very
          | impressive driving in traffic and on one-way and narrow
          | streets with cars parked
         | on both sides. But then I also see it do some very bizarre
         | moves in other cities. I don't know how Tesla collects and uses
         | self-driving data but it seems like it's different from
         | location to location.
        
           | cactusplant7374 wrote:
           | > I see a guy testing FSD updates in downtown Seattle
           | 
           | In downtown Seattle doesn't it drive into the monorail
           | columns?
        
         | tsigo wrote:
         | So to be clear, your car almost drove itself into oncoming
         | traffic with your child in it and your first instinct was "Hey,
         | maybe give it two more tries"?
        
           | timeon wrote:
           | Even bought another car from the company.
        
             | avereveard wrote:
              | The full self driving package acts like grammar errors in
              | spam emails: it self-selects for people with no
              | understanding of the technology. I fully expect that the
              | wilder shenanigans in Tesla's future will be targeted at
              | them directly.
        
           | pschuegr wrote:
           | I assume that you have snarky comments to spare for the
           | company who legally put this option in his hands as well, is
           | that right?
        
           | blendergeek wrote:
           | Every car "nearly drove itself into oncoming traffic" if the
            | driver doesn't take over. It's not like he climbed into the
           | backseat and said, "Tesla, takeover". No, he let the car help
           | with the driving, but maintained control of the vehicle to
           | ensure the safety of the child.
        
             | sircastor wrote:
             | So, third-hand story, about 20 years ago, from an
             | acquaintance who heard it from a Police officer who dealt
             | with the following situation:
             | 
             | This police officer was responding to an RV which had run
             | off the road, and was speaking with the driver. The driver,
             | a bit shook up from the experience explained that he was
             | driving down the highway, turned on the cruise control, and
             | got up to make himself a sandwich...
        
               | mcguire wrote:
               | This is literally the story of a Berke Breathed Bloom
               | County (or whatever the follow-on was) comic strip.
        
               | mikestew wrote:
               | "3rd-hand story", from a friend of a friend...uh, huh.
               | Unless I'm missing a heaping bucket of "ironically", you
               | could have just said "there's an old urban legend..."
               | instead of passing off an old joke as something true.
        
               | sircastor wrote:
               | Well, up until a moment ago, I legitimately believed it
               | to be true - even though I was a few steps removed from
               | it. Live and learn I guess.
        
               | sgc wrote:
                | I have been duped like this before too! You believe a story
               | that just doesn't ring right when you tell it to somebody
               | else years later. Teaching / communicating corrects a lot
               | of errors.
        
               | drivers99 wrote:
               | I remember my grandma telling that story maybe 30-40
               | years ago. Gotta be an urban legend. Yup:
               | https://www.snopes.com/fact-check/cruise-uncontrol/
        
               | VBprogrammer wrote:
               | I've no doubt this has probably happened in real life at
               | some point but it's practically a fable by now.
               | 
                | I think the Simpsons did it at least 30 years ago.
        
               | ryantgtg wrote:
               | I saw a "trying out my parking assist" (I think it was
               | with a Hyundai) video the other day where the guy didn't
               | realize that the function only assists with steering and
               | not the pedals. So he backed right into a car.
        
             | masklinn wrote:
             | > Every car "nearly drove itself into oncoming traffic" if
             | the driver doesn't takeover.
             | 
             | Those other cars don't claim to drive themselves.
        
               | max51 wrote:
               | that makes absolutely no difference in the context of the
               | comments above.
               | 
               | If a product being overhyped prevents you from using it
               | after you paid for it, you're gonna have to live with no
               | computer, no phone, no internet, no electricity, no cars,
               | no bikes.
        
               | jchw wrote:
               | I empathize that people are frustrated with the marketing
               | claims of this particular feature, which are clearly
               | bunk, but the point of the post you're replying to is not
               | to defend it, it's to defend that the other commenter is
               | not being negligent and putting their child in danger...
        
               | alistairSH wrote:
               | Maybe not his kid, assuming he has more faith in Tesla's
               | crash-worthiness than its FSD.
               | 
                | But he'd definitely be risking other road users and
               | pedestrians if that car keeps trying to run up sidewalks
               | and cause other havoc on the roadway.
        
               | jchw wrote:
               | If you are fully attentive, you can correct course once
               | you realize a mistake is being made. My Honda Civic has
               | adaptive cruise control and lane keep, and I run into
               | issues with it reasonably often. I'm not complaining:
               | after all, it's not marketed as much more than glorified
               | cruise control. And either way, turning it on is not a
               | risk to me. With any of these features, in my opinion,
               | the main risk is complacency. If they work well enough
               | most of the time, you can definitely get a false sense of
               | security. Of course, based on some of the experiences
               | people have had with FSD, I'm surprised anyone is able to
               | get a false sense of security at all with it, but I
               | assume mileages vary.
               | 
               | Now if the car failed in such a way that you couldn't
               | disengage FSD, THAT would be a serious, catastrophic
               | problem... but as far as I know, that's not really an
               | issue here. (If that were an issue, obviously, it would
               | be cause to have the whole damn car recalled.)
               | 
               | All of this to say, I think we can leave the guy alone
               | for sharing his anecdote. When properly attentive, it
               | shouldn't be particularly dangerous.
        
             | cactusplant7374 wrote:
             | If FSD decides to do max acceleration and turn the wheels,
             | can you stop it in time? Zero to 60 is under 3 seconds,
             | right?
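              | 
              | Rough numbers (a back-of-the-envelope sketch; assumes
              | constant acceleration from a standstill and a ~1 second
              | human reaction time):
              | 
              |   MPS_60MPH = 26.8         # 60 mph in metres per second
              |   accel = MPS_60MPH / 3.0  # 0-60 in ~3 s -> ~8.9 m/s^2
              |   t_react = 1.0            # reaction time in seconds
              | 
              |   # Distance and speed reached before the driver even
              |   # starts to intervene: d = a*t^2/2, v = a*t.
              |   d = 0.5 * accel * t_react ** 2
              |   v_mph = accel * t_react * 2.237
              |   print(f"~{d:.1f} m, ~{v_mph:.0f} mph at reaction")
              |   # -> ~4.5 m, ~20 mph at reaction
              | 
              | So in the worst case the car covers roughly a car length
              | and is already doing about 20 mph before a reasonably
              | attentive driver can even begin to respond.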
        
               | jonfw wrote:
               | If your brake lines burst, will you be able to coast
               | safely to a stop?
               | 
               | Every piece of technology has a variety of failure modes,
               | some more likely than others. FSD is not likely to take
               | aim at a pedestrian and floor it, just like your brakes
               | aren't likely to explode, and neither of you are
               | irresponsible for making those assumptions
        
               | Eisenstein wrote:
               | What if it decides to short the batteries and set the car
               | on fire? Can you stop it from doing that?
               | 
                | I think you are making up scenarios that no reasonable
               | person would assume. There is a difference between
               | 'getting confused at an intersection and turning wrong'
               | and 'actively trying to kill the occupant by accelerating
               | at max speed while turning the steering'.
        
           | [deleted]
        
           | elmomle wrote:
           | Comments like these disincentivize people from sharing
           | honestly. I have full confidence that OP was telling the
            | truth when saying it was a relatively safe / low traffic
           | environment, and I fully imagine they were paying attention
           | and ready to intervene when FSD made mistakes.
        
             | wpietri wrote:
             | This is true, but they also disincentivize random amateurs
             | from conducting uncontrolled safety experiments with
             | children in the car. I think blame-free retrospectives and
             | lack of judgement are important tools, but I also think
             | they are best used in a context of careful systemic
             | improvement.
        
             | cactusplant7374 wrote:
             | > I fully imagine they were paying attention and ready to
             | intervene when FSD made mistakes
             | 
             | Is that enough? The software could decide to accelerate and
             | switch lanes at such a fast rate that the driver wouldn't
             | have time to intervene. It hasn't happened _yet_ to my
             | knowledge. But it may happen.
        
             | sho_hn wrote:
             | > Comments like these disincentivize people from sharing
             | honestly.
             | 
              | As an automotive engineer: Agreed. Realistic experience
              | reports are useful, and that also includes, e.g., what
              | drivers are willing to attempt and how they risk-rate.
        
             | kolbe wrote:
             | People sharing anecdotes isn't productive either. Someone
             | talking about "almost crashes" is a terribly subjective
             | thing. We have thousands of hours of youtube video of FSD.
             | We have some data. And the value add of one commenter's
             | experience is virtually zero.
        
               | rurp wrote:
                | I strongly disagree. It's interesting to hear a
                | thoughtful recounting of an HNer's experience.
               | 
               | Tesla releasing the actual raw data would be much more
               | helpful, but of course they are refusing to do that, most
               | likely because it would betray how overhyped, unreliable
               | and dangerous the software is.
        
               | kolbe wrote:
               | What do you want them to release? What does "raw data"
               | mean to you? Does Waymo release this raw data?
        
               | albertopv wrote:
                | Teslas have been driven for millions of hours at least,
                | if not billions; proportionally speaking, thousands of
                | hours of YouTube videos are anecdotes as well. What about
                | Tesla releasing complete real data? What are they scared
                | of? Until then, Tesla's claims can't be taken seriously.
        
               | lordnacho wrote:
               | At the very least anecdotes are a place to start thinking
                | about what data to collect. And whatever you think of it,
               | it's established in modern debates that people bring
               | anecdotes as a way to motivate discussion. Maybe it's
               | wrong without a proper statistical study, but it's what
               | people do and have done since forever.
        
           | dsfyu404ed wrote:
           | Pressing a button and then paying careful attention to watch
           | the system perform is VERY different from just engaging the
           | system and trusting it to do its thing.
           | 
            | I think the software is a half-baked gimmick, but come on,
            | the "look guys, I care about children too" variety of in-group
            | signaling with a side of backseat parenting adds less than
            | nothing to the discussion of the subject at hand.
           | 
           | And INB4 someone intentionally misinterprets me as defending
           | the quality of Tesla's product, I'm not.
        
           | voganmother42 wrote:
            | Well, you don't need to go far to find others defending the
            | risk to others as well ("it wouldn't have gone up the curb if
            | there was a person there"). It's interesting to see how
            | cavalier people normally are with making that judgement for
            | others, especially for their offspring.
        
             | three_seagrass wrote:
             | Also consumer bias, or "post-purchase rationalization" -
             | i.e. humans overly attribute positivity to goods/services
             | from brands they buy from.
             | 
             | Even when it's as bad as throwing you in the wrong lane of
             | traffic.
        
           | Lendal wrote:
           | Yes, because it's not something you just try out on a whim. I
           | personally paid $10,000 for the option, and it's
           | nonrefundable. You also have a human desire to help out, be
           | part of something bigger, do your part in the advancement of
           | science & engineering. So yes, you overcome adversity, you
           | keep on trying, and you teach those values to your kids.
           | 
            | Unfortunately, it increasingly looks like the experiment has
           | failed. But not because of us. We're pissed because Musk
           | isn't doing his part in the deal. He's not pulling his
           | weight. At this point, he's becoming more and more an anchor
           | around the neck of technological progress. That didn't need
           | to happen. It didn't need to be this way. So yeah, we're
           | pissed off, not just because of the money we paid, but also
           | because we feel like we were defrauded by his failure to pull
           | his own weight in the deal.
           | 
           | I wouldn't be surprised to see him make his escape to the
           | Bahamas before this decade is up.
        
             | cactusplant7374 wrote:
             | > You also have a human desire to help out, be part of
             | something bigger, do your part in the advancement of
             | science & engineering. So yes, you overcome adversity, you
             | keep on trying, and you teach those values to your kids.
             | 
             | Why not restrict your beta testing to a closed course? A
             | race track? Some land you bought out in the middle of the
              | desert? Have the kids make some stop signs, plow your
             | new roads, etc.
             | 
             | No one else on the road is consenting to a technology that
             | could rapidly accelerate and course correct into their
             | vehicle at some undetermined time.
        
             | rurp wrote:
              | Unfortunately, paying that money also provides an incentive
              | to lie about how advanced the functionality is and to hide
              | unflattering data.
              | 
              | Funding responsible self-driving research seems like a
              | great use of money to me, but testing a flawed system in
              | the wild does not.
        
             | selectodude wrote:
             | I'm just confused that she'd buy _another_ Tesla after that
             | experience.
        
               | HWR_14 wrote:
               | Well, she already had paid to install the at home charger
               | for Teslas.
        
               | [deleted]
        
             | mancerayder wrote:
             | >Yes, because it's not something you just try out on a
             | whim. I personally paid $10,000 for the option, and it's
             | nonrefundable.
             | 
             | For those that didn't buy it, you can 'rent' it for a
             | monthly subscription fee.
        
           | solardev wrote:
           | Anyone can make more kids. Not everyone can do more science!
        
             | iab wrote:
             | New life motto, thank you
        
             | pnut wrote:
             | I know this is a joke, but not everyone can make more kids.
        
             | f1refly wrote:
             | We've experiments to run / there is research to be done /
             | on the people who are still ali~ive!
        
             | twic wrote:
             | -- Cave Johnson
        
           | jollyllama wrote:
           | No harm, no foul. GP's child learned a valuable lesson which
           | may serve the child well in the decades to come: don't trust
           | self-driving.
        
             | someweirdperson wrote:
              | I guess the lesson is more for the parent, unless the child
              | ends up getting $10k less worth of ice cream.
        
           | jjulius wrote:
           | Come on, this feels overly aggressive. Circumstances are
           | nuanced, we don't know to what degree of danger any of these
           | situations posed to the child, only the parent does. Judge
           | not lest ye, and such.
        
             | idiotsecant wrote:
             | Beta testing the guidance system for a 4500 lb steel slug
             | in a pedestrian environment is one thing.
             | 
             | Deciding that you want to put your family _into_ that steel
             | slug for the very first test seems to me to be an entirely
             | different level of poor decision making.
        
             | wpietri wrote:
             | Ah yes, surely the meaning of that bible verse is, "Don't
             | ask mildly difficult questions based in any sort of moral
             | stance." Because we all know that book is famously opposed
             | to performing any sort of moral analysis.
        
               | jjulius wrote:
               | There's asking questions about the circumstances to
               | better understand before casting judgement, and then
               | there's sarcastically implying that OP is a bad parent
               | for endangering their child without asking any actual
               | questions about what happened.
        
               | wpietri wrote:
               | That was not sarcasm, which generally requires words used
               | in contradiction to the normal meaning. E.g., if somebody
               | makes a dumb mistake, the response, "nice work, Einstein"
                | would be sarcastic. This was at worst mocking, but it
                | wasn't even hyperbolic, given that what was written
                | was a literal description of what the guy did.
               | 
               | Regardless, you haven't answered the point about the
               | quote. "Judge not lest ye be judged" does not mean we
               | have to empty-headedly refrain from any sort of moral
               | criticism. In context, it's about hypocrisy, reminding us
               | to apply our standards to ourselves as stringently as we
               | do others. I think it's only appropriate here if tsigo
               | somehow indicated he would happily endanger his own
               | children, which I don't see any sign of.
        
               | jjulius wrote:
               | Semantics that ultimately don't change the crux of my
               | point, even if I disagree with some of them, but thank
               | you for clarifying.
        
               | idontpost wrote:
        
           | sanedigital wrote:
           | Yes, this is exactly what happened. Afterwards, I went home
           | and lit several candles on my Elon Musk altar, and prayed to
           | the ghost of Steve Jobs that he forgive my reluctance for not
           | completely sacrificing ourselves to Big Tech.
           | 
           | In reality, I had both hands on the wheel and nobody was ever
           | actually in danger. But thanks for your faux-concern.
        
             | mentalpiracy wrote:
             | > nobody was ever actually in danger
             | 
             | not sure you're qualified to make that assertion, simply
             | based on the series of choices you've described yourself
             | making here.
        
             | ryanwaggoner wrote:
             | To be fair, you did say that it literally drove up onto the
             | corner of the intersection and "thank god there were no
             | pedestrians there", which does not make it sound like you
             | were in full control at all times, but rather that it was
             | lucky no one was there or they would have been hit.
        
         | stickfigure wrote:
         | How long ago was that?
         | 
         | A friend of mine bought a Model Y and got access to FSD about
         | six months ago. I've spent a fair bit of time in this car in
         | the bay area and... I'm impressed? It doesn't drive like a
         | professional but it feels safe.
         | 
         | My friend says it's been improving even in just the time he's
         | had it. So maybe it used to be a lot worse?
         | 
         | I'm not in the market for a new car but the next time I am, FSD
         | is going to be a big factor. Even if it's just as good as it is
         | right now.
        
           | jsolson wrote:
           | If it's anything like the original Autopilot was: yes.
           | 
           | I had one of the first Model S vehicles that was autopilot-
           | capable. Early enough that autopilot itself was added later.
            | The early versions were... intense. Not just in the "if you
            | take your hands off the wheel it might try to kill you"
            | sense, but also in the "even using this as adaptive cruise
            | with lane-keeping, sometimes it will suddenly try to veer off
            | the road and murder you" sense. Even when working "as
           | intended" it would routinely dip into exit ramps if you were
           | in the right lane. As a result, I didn't use it all that
           | often, but over not a lot of time it improved pretty
           | dramatically.
           | 
           | At this point my wife and I are on newer Model 3s, and our
           | experience with autopilot (not FSD) as a frequently-used
           | driver assist has been... uneventful? Given the _original_
           | autopilot experience, though, neither of us is particularly
           | eager to try out FSD yet. Autopilot strikes a good balance
           | for us in terms of being a useful second pair of eyes and
           | hands while unambiguously requiring us to drive the damn car.
        
         | GaryNumanVevo wrote:
         | Maybe I'm old fashioned but I've got no intention of buying /
         | enabling FSD on my 2020 Model X. I just want a car that I can
         | drive from point A to point B. I'm not even that risk averse,
         | but enabling a beta feature on the road with a bunch of other
         | drivers who barely know what they're doing is a stretch.
        
       | jsight wrote:
       | This seems like a pretty typical video in an area that FSD
       | doesn't handle well: https://www.youtube.com/watch?v=xIvfNHwZsCc
       | 
       | I bet this is extremely common.
       | 
       | To be fair, we've never seen a test of Waymo or Cruise on the
       | same road, as they are only available on certain specific roads
       | that have been well tested. I'm not making a judgment here as to
       | whether that is good or bad.
        
         | jfoster wrote:
         | There's so many decent quality FSD videos on YouTube. Why watch
         | the one where you can hardly see what's in front of the car?
        
           | jsight wrote:
           | Credibility. That's Jason Hughes who owns a third party shop
           | that repairs Teslas. He has a long history of reporting
           | fairly, both positive and negative. He also lives in a fairly
           | rural area that hasn't gotten a lot of attention from FSD
           | YTers.
           | 
           | The fact that some of his first FSD trips showed major issues
           | is very telling.
        
       | ryanwaggoner wrote:
       | I don't understand how regulators and insurance companies are so
       | asleep at the wheel (hehe) around this. It seems insane to me
       | that you can just start selling a car that you claim can drive
       | itself without any kind of certification or safety data being
       | provided.
        
       | 93po wrote:
        
         | stnmtn wrote:
          | What an incredibly insipid comment. I'm sure if you extrapolate
         | anything to the logical extreme it sounds silly; it doesn't
         | make a point beyond displaying that you think so highly of this
         | car company that you want to disregard all criticism of it.
        
       | p0pcult wrote:
       | Has anyone posted on "free speech absolutism" Twitter 2.0 yet?
       | Wonder what happens to that account?
        
       | agumonkey wrote:
        | Honestly surprised - some YouTube fans did test recent updates
        | (recent as in 6 months ago), and the SDV software was capable of
        | dealing with complex, high-speed, irregular traffic
        | crossroads/ramps.
        
       | dcow wrote:
       | Anecdotally this matches my experience too. I'm not even in the
       | FSD beta and with every software update plain old auto-pilot gets
       | less intelligent. I don't know if regulators are restricting what
       | can be done, if Tesla is self-censoring to avoid PR issues, or
       | what, but instead of being helpful like it used to be it's mostly
        | just frustrating. I find myself using it less and less.
       | 
        | All that said, this comes from someone who otherwise loves their
        | Tesla. It has saved many thousands on gas, hasn't had any
        | reliability or "panel gap" issues, and I would likely buy a Tesla
        | again.
        
       | jefft255 wrote:
       | I'm the first to complain about Tesla overselling their self-
       | driving abilities, but I think the numbers used in this article
        | are misleading. Miles-per-disengagement comparisons between
        | Waymo (to pick one) and Tesla are like apples to oranges. Waymo
        | operates in a closed fashion, in a laboratory-like setting (1),
        | and I don't believe for one second that their miles per
        | disengagement would be significantly better than Tesla's in the
        | real-world open settings that Tesla's FSD is being tested in.
        | These numbers mean nothing when the environments used to compute
        | them are so vastly different.
        | 
        | (1) Here's what I think is the case for Waymo's operations:
        | - Phoenix AZ only (maybe one more city?)
        |   -- Amazing sunny weather
        |   -- Clear lane markings
        |   -- Can overfit to said city with HD maps, hand-coding road
        |      quirks, etc.
        | - Waymo employees only
        |   -- Not to sound too tinfoil-hat, but can we really trust this
        |      data?
        | - Even within Phoenix, some filtering will happen as to which
        |   routes are possible
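        | 
        | To make the mix effect concrete, here's a minimal sketch with
        | made-up per-environment numbers (purely hypothetical, not real
        | fleet data) showing how the same system can report wildly
        | different miles per disengagement depending on where it is
        | allowed to drive:
        | 
        |     # hypothetical figures for illustration only
        |     miles = {"mapped_sunny_suburb": 100_000, "unmapped_dense_city": 5_000}
        |     disengagements = {"mapped_sunny_suburb": 2, "unmapped_dense_city": 50}
        | 
        |     def miles_per_disengagement(envs):
        |         return sum(miles[e] for e in envs) / sum(disengagements[e] for e in envs)
        | 
        |     print(miles_per_disengagement(["mapped_sunny_suburb"]))  # 50000.0
        |     print(miles_per_disengagement(list(miles)))              # ~2019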
        
       | hello_friendos wrote:
        
       | [deleted]
        
       | jquery wrote:
       | After seeing what musk did to twitter it's pretty clear the
       | emperor has no clothes, and his main asset is being a tyrant and
       | taking the surplus value of the 80-100 hour weeks his peons do
       | for their lord.
       | 
       | Musk is no Steve Jobs, despite clearly seeing himself that way.
       | Full Self Driving will get nowhere under Musk.
        
         | 93po wrote:
         | Comparing Musk to Jobs is ridiculous. The scope of what they've
         | done and what they're aiming for is vastly different. One
         | person made cool phones, the other one is putting humans on
         | mars. Why would you think Elon wants to be a computer seller?
         | 
         | > his main asset is being a tyrant and taking the surplus value
         | of the 80-100 hour weeks his peons do for their lord.
         | 
         | Cool, I agree, can we start holding everyone accountable that
         | does this? Can we maybe start with the ones that use actual
         | slave labor? Where are the dozen articles a day about those
         | people?
         | 
         | > Full Self Driving will get nowhere under Musk.
         | 
         | It's gotten farther than literally any other organization on
         | the planet.
        
           | rootusrootus wrote:
           | > the other one is putting humans on mars
           | 
           | Hasn't yet. And IF that happens, and if it turns out to be
            | SpaceX that enables it, thank Gwynne. Musk is many things, but
           | this idea that he's some kind of genius engineer is extremely
           | insulting to all the actual rocket engineers.
        
       | wnevets wrote:
       | Does anyone else remember being told Tesla owners would be able
       | to rent their cars out automatically to Uber/Lyft when they
       | weren't using them because of the FSD?
        
       | manv1 wrote:
       | People (and technical people) like to believe that one data point
       | isn't important, that it's an anecdote and doesn't matter.
       | 
       | But as a lawyer once said, "I convict with anecdotes."
       | 
       | In any case the reports are interesting because with them you can
       | start to understand the shape of the autopilot's capabilities and
       | limitations. IMO the individual reports (like the ones here) are
       | nice and detailed; the kind of summary data the article wants is
       | pointless, because it doesn't include any real detail in it. Your
       | autopilot worked on a 400 mile stretch of empty Kansas highway,
       | big deal.
        
       | cramjabsyn wrote:
       | Rushing half finished cars out the door, and over promising on
       | capabilities/updates is such a flawed model. I am amazed by how
       | many people bought in, and will be really interested to see how
       | many people replace with another Tesla as legacy brands bring EVs
       | to market.
        
       | izzydata wrote:
       | Under no circumstance will I ever let a vehicle I am in drive
       | itself. If you can't nearly mathematically prove the software
       | will function as expected I am not putting my life in its hands.
       | From the sounds of it a lot of it is controlled by what is
       | effectively a machine learning black box.
       | 
       | No thanks.
        
       | diebeforei485 wrote:
       | Obviously not all miles are equal, and the others in the chart
       | (Waymo, etc) only run their vehicles in specific areas that they
       | have previously mapped onto their own system.
        
       | Groxx wrote:
       | If it disengages rather than eagerly plowing into something,
       | that's almost certainly for the better. This could _easily_ be
       | explained by a renewed focus on safety after massive amounts of
       | negative press, and that would be a good thing.
       | 
       | Is that the reason? Not a clue! Am I optimistic about Tesla's
       | FSD? Not even slightly! But the "data" here is extremely shaky
       | and there are many possible explanations beyond "getting worse".
        
       | gwbas1c wrote:
       | I have a Model 3 with enhanced autopilot. My wife wanted a Model
       | Y. I find the enhanced autopilot so glitchy that I decided to
       | just stick with regular autopilot.
       | 
       | Guess what: I like regular autopilot better!
        
       | danso wrote:
       | Earlier this year when Mercedes announced its Level 3 "Drive
       | Pilot" system [0], a lot of Tesla stans mocked its limitations,
        | which, to be honest, _are_ quite numerous on the face of it:
       | 
       | - Only allowed on limited-access divided highways with no
       | stoplights, roundabouts, or other traffic control systems
       | 
       | - Limited to a top speed of less than 40 mph
       | 
       | - Operates only during the daytime and in clear weather
       | 
       | But the big promise from Mercedes is that it would take legal
        | liability for any accidents that occur during Drive Pilot's
       | operation, something that Tesla doesn't appear to be even
       | thinking about wrt Autopilot and FSD.
       | 
       | I would love someone to goad/challenge Tesla to step up to
       | Mercedes. If FSD is so much better than Drive Pilot, then why
       | doesn't Tesla agree to provide a "safe mode" for FSD, that
       | operates with the exact same restrictions as Mercedes' D-P, and
       | offers the same legal protections to any users who happen to get
       | into accidents during "safe mode" FSD operation?
       | 
       | [0] https://www.roadandtrack.com/news/a39481699/what-happens-
       | if-...
        
         | drstewart wrote:
         | But can Mercedes definitively prove their technology will never
         | result in an accident, like people in this thread are demanding
         | of Tesla?
        
           | lolinder wrote:
           | They're willing to take liability for it, so they're
           | confident enough that their legal team and accountants are
           | satisfied. If Tesla were at _that_ point I think most people
           | here would be content, no need to definitively prove
           | anything.
        
             | drstewart wrote:
             | Ah yes, all's I have to do to get justice if I get killed
             | by a Mercedes run amok is to take on a multi-billion dollar
             | legal team. That makes me confident.
        
               | jonfw wrote:
               | A big company being responsible for a crash is best case
               | scenario. You would much rather sue Mercedes than Joe
               | Shmoe for an accident, no doubt. They've got deep
               | pockets, and your local courts are not particularly
               | friendly to them
        
               | hedora wrote:
               | Well, if you have life insurance, your insurance company
               | will be the ones suing Mercedes, and likely have an even
               | scarier legal team.
        
               | tiahura wrote:
               | There's no subrogation for life insurance.
        
               | Mordisquitos wrote:
               | The fact that they accept liability is precisely to avoid
               | [your next of kin] needing to _" take on a multi-billion
               | dollar legal team"_ if you get killed by a Mercedes run
               | amok. Now then, if on the other hand you were to get
               | killed by a _Tesla_ run amok...
        
           | dekhn wrote:
           | Nobody expects that any self-driving car technology will
           | never result in an accident. That's impossible and not a
           | reasonable goal.
        
             | twblalock wrote:
             | A lot of people seem to think that even a single accident
             | is unacceptable. Quite a few of the comments on this site
             | and others about self-driving cannot be explained without
             | understanding that the poster has that belief, at least
             | implicitly.
             | 
             | We are lucky that our ancestors were not so risk averse,
             | because if they were we would not have cars at all, or
             | airplanes.
        
               | dekhn wrote:
               | Yes, I see some people promote that idea but that was
               | never the expectation on the part of the self-driving car
               | creators, or the regulators. They also don't expect cars
               | to be able to solve complex philosophical questions
               | regarding trolleys. Nor does the general public have that
               | expectation.
        
           | justapassenger wrote:
           | Any person who knows what they're talking about is asking
           | Tesla to have an appropriate development process for safety
           | critical systems.
           | 
            | Tesla's systems are unsafe by default if they don't follow a
            | safety life cycle. And they don't - I saw dick-sharing apps
            | that had better life cycle processes.
        
           | JustSomeNobody wrote:
           | You can't prove a negative. Nobody sensible has ever demanded
           | this of Tesla.
        
         | [deleted]
        
         | Fricken wrote:
         | I'd been saying for years before anyone had L3 out that the
         | working definition of L3 is simply that the manufacturer will
         | assume liability while the vehicle is in charge.
        
         | lolinder wrote:
         | Yep.
         | 
         | Mercedes has these limitations not because their tech is less
         | capable than Tesla's but because Mercedes is a real car company
         | with real engineers and a gold-standard reputation to maintain.
         | 
         | Tesla, in contrast, is a software company that is trying to
         | take "move fast and break things" into the two-ton 75mph
         | vehicle space, with predictable results.
        
           | traceroute66 wrote:
           | > Mercedes is a real car company with real engineers and a
           | gold-standard reputation to maintain.
           | 
           | Not only that, but they have a quality reputation to uphold
           | with their domestic (German) and regional (European) market.
           | 
            | Your average discerning German car buyer (i.e. the sort who
            | has a Porsche 911, or a higher-end Audi/BMW/Merc, in their
            | garage) will swiftly tell you about numerous problems with the
            | Tesla before they've even sat in it.
           | 
           | Panel gaps, for example. They mean a lot to your average
           | discerning German, and your average Tesla has them by the
           | bucket load.
           | 
           | In fact, the German in-joke is that the reason Tesla built a
           | factory in Germany is so that the Germans could (try to)
           | teach them how to fix the panel gaps. :-)
        
             | ChuckNorris89 wrote:
             | _> so that the Germans could (try to) teach them how to fix
             | the panel gaps_
             | 
             | Not a Tesla owner or fan but I have a question that's
             | really bugging me now: Which customers really care about
             | panel gaps?
             | 
             | Do average joes, SUV driving soccer moms or suburban white
             | collar workers go around with a ruler measuring their panel
             | gaps with others like "yeah your Tesla is cool and all, but
             | sorry, my Audi A5 has much tighter panel gaps which is what
             | matters most you know"?
             | 
              | When did the panel gap become "the benchmark" indicative of
              | car quality beyond the body shell?
             | 
             | Like, if the panel gaps are the only thing you can find
             | wrong in a car, then it must be a really really good car,
             | right?
             | 
              | Is there any proven evidence that the panel gaps correlate
             | to quality and reliability of the rest of the car, or is it
             | just a myth of the car enthusiasts community that got
             | spread around and went too far? I get it, some Tesla are
             | unreliable and built poorly, but it's not because they have
             | big panel gaps. The reverse is also true for many cars, so
             | this isn't a rule.
             | 
             | Sure, if you want to measure and compare panel gaps, then
             | by all means go ahead and measure panel gaps, but please
              | don't pretend they mean anything more than that, and that
              | they're somehow indicative of the car's overall quality
              | and reliability, because so far there hasn't been any proof
             | of this correlation.
        
               | pandaman wrote:
                | Re: panel gaps. People do tend to notice when their cars
                | whistle and sport leaves, hair from the car washer's
                | brush, and other debris on their bodies.
        
               | qzx_pierri wrote:
               | No one cares about panel gaps. A lot of people on HN
               | despise Elon and become relentless pedants when
               | discussing any of his products/initiatives.
        
               | tqi wrote:
               | Aren't those cars (McLaren/Ferrari) also horribly
               | uncomfortable and lacking in a lot of other amenities
               | (like sound systems or tech)? It feels like those cars
               | are a completely different category of good, and trying
               | to measure them on the same scale is misguided.
               | 
               | To me, panel gaps are a proxy for how much faith you have
               | in your consistency and quality control.
        
               | ChuckNorris89 wrote:
               | _> To me, panel gaps are a proxy for how much faith you
               | have in your consistency and quality control._
               | 
                | I beg to differ. Modern German cars might have panel gaps
                | tighter than a nun's fanny, but their reliability,
                | especially after the warranty is over, is so awful that in
                | no way can I say they represent quality. Those quality
                | cars went away in the late '80s and early '90s, when the
                | engineers got replaced by the bean counters and cars
                | became white goods with many critical parts outsourced to
                | the lowest-bidding contractor: parts that must look "cool"
                | in the showroom but fail the second the warranty runs out,
                | or many times even before that.
               | 
               | To me, the panel gaps are a superficial metric of quality
               | and prove nothing of substance that goes beyond body
               | shell.
               | 
               | Why don't we measure quality by how reliable a car is
               | over time and how long it lasts? Surely that would prove
               | good consistency and quality control on the
               | manufacturer's side, no?
               | 
                | Tight panel gaps only show how much effort the
                | manufacturer has put into the body, but say nothing about
               | the quality and reliability of the electronics and
               | mechanics, which is what really matters in a car for most
               | people.
        
               | iancmceachern wrote:
               | Tesla was 19 out of 24 for reliability. BMW was 10th, in
               | a recent survey.
               | 
               | https://www.google.com/amp/s/www.newsweek.com/tesla-
               | ranked-n...
        
               | ChuckNorris89 wrote:
               | Does that prove a direct correlation between reliability
               | and panel gaps, or could it be merely a coincidence?
               | 
               | According to the article:
               | 
               |  _> "the Audi E-Tron and Volkswagen ID.4 were singled out
               | as being unreliable"_
               | 
               | So if Audi and VW are also unreliable then the panel gaps
                | prove to be a poor signal for reliability, which was my
               | original point.
               | 
               | Edit after your reply below: Sure, Tesla has poor
               | reliability, but not because it has poor panel gaps.
               | Those two can be completely disconnected. Just because
               | they coincide sometimes, doesn't make this a rule of
               | thumb like some car snobs try to convince you of.
               | 
               | You can easily have cars with great panel gaps that are
               | incredibly unreliable, and vice versa. Panel gaps mean
               | nothing more than panel gaps.
        
               | iancmceachern wrote:
               | I'm just pointing out that we can have both reliability
               | and tight panel gaps. We have the technology. They are
               | separate things.
               | 
               | As are different models within a single manufacturer. I
               | love my Ford, would highly recommend it, but would never
               | buy or recommend a Pinto or a Bronco II.
               | 
               | The point people typically make is that Tesla has
               | uncommonly poor panel gaps, which point to poor quality
               | and tolerance control in their manufacturing. This is a
               | complex skill that automakers have been refining for 100
               | years. It is indicative of something, just as the quality
               | of paint job indicates the care and quality with which a
               | hot rod was built.
        
               | watwut wrote:
               | Aren't those sport cars, basically? Luxury sport cars
               | that don't sell comfort at all - they sell power, speed
               | and "I am cool cause I am powerful and fast" look.
        
               | prepend wrote:
               | I care about panel gaps and one of the reasons I didn't
               | consider teslas when buying a new car is the build
               | quality.
               | 
               | Not sure how much this matters as Tesla's sales are
               | really high, but I think this basic stuff is important.
        
               | [deleted]
        
               | pookeh wrote:
               | The super rich buy supercars for the increase in their
               | own perceived value (i.e wow look how rich this dude is)
               | 
               | The average joe buys a car because of the value it gets
               | them (because every dollar matters)
        
               | ChuckNorris89 wrote:
               | Umm, except the cars the average joe buys depreciate in
                | value, while the supercars the rich buy usually appreciate
               | in value, kind of like art, so wouldn't it make more
               | sense that panel gaps are more important for that market?
               | 
               | Does having tighter panel gaps help with the resale value
               | for the average joes?
        
               | shuckles wrote:
               | This isn't true. Your average super car does not
               | appreciate in value when you consider factors like
               | maintenance and the fact that you have to buy a bunch of
               | other garbage to even be put on the delivery list for a
               | desirable car's production. For example, actually buying
               | a top spec 911 isn't feasible if you don't have a good
               | relationship with your dealer.
               | 
               | Notwithstanding the fact that the market for super cars
               | is nothing like the market for Teslas or 5ers.
        
               | nluken wrote:
               | Supercars absolutely depreciate in value minus select
               | limited releases (which holds true for non-supercars as
               | well). Look at standard Lamborghini Gallardos, Ferrari
               | F430s/458s, and Aston Martins of any model and you will
               | see that some of these cars are worth less than half what
               | their original buyers paid for them.
        
               | ChuckNorris89 wrote:
               | Fair point, that was a bad example on my end. Edited.
               | 
               | But my point stands, that panel gap as a benchmark alone
               | is no measure of quality or any other metric.
        
               | epistasis wrote:
               | I think that most car buying is motivated by far far more
               | than delivered value. There's so much status and image
               | wrapped up in cars that thigh there are some who care
               | little about the car, nearly everyone chooses something
               | that fits their perception of themselves.
               | 
               | The reason that there are so many super-expensive pickup
               | trucks on the road is not because people are hauling
               | around things that require a pickup, for example. And
               | when combined with the face that pickup beds are becoming
               | increasingly useless...
        
               | Bhilai wrote:
               | > Do average joes, SUV driving soccer moms or suburban
               | white collar workers go around with a ruler measuring
               | their panel gaps with others like "yeah your Tesla is
               | cool and all, but sorry, my Audi A5 has much tighter
               | panel gaps which is what matters most you know"?
               | 
               | Average Joes and soccer moms in a typical suburban area
               | cannot even afford Teslas and are extremely happy with
                | their Odysseys, Siennas and Pacificas. Even an Audi A5 is
                | cheaper than a Model 3 in most cases, and Audi's ride
                | quality and cabin noise are far better than in any
                | Tesla I have ridden in. Audi interiors (though not as
               | good as its other German competitors) beat Tesla by a
               | mile.
        
             | sho_hn wrote:
             | > In fact, the German in-joke is that the reason Tesla
             | built a factory in Germany is so that the Germans could
             | (try to) teach them how to fix the panel gaps. :-)
             | 
             | Speaking more seriously on that front, there's been the
             | acquihire of Grohmann Engineering.
        
             | purpleblue wrote:
             | I currently have a Tesla Model Y and a Porsche Macan. The
             | Macan feels more luxurious, but the Model Y is easier to
             | drive because of its unearthly acceleration.
             | 
             | The biggest thing that Tesla got right besides the
             | acceleration is the value-add features like Sentry mode,
              | the tight integration with the phone, and things like Walk-
              | away locking (although I would very much prefer it lock
              | when I'm closer to the car, because it's about 40-60 ft
              | away when it locks, and that makes me nervous).
             | 
             | The build quality is cheaper, the sound system sucks, and I
             | generally despise how many things are tied to the screen. I
             | want buttons and knobs for the air system, to have to hunt
             | for that on the screen is very dangerous, I hate it.
             | 
             | The Porsche feels more luxurious but it's mindblowing how
             | they get so many things wrong. The dashboard is much too
             | complicated with a lot of redundant buttons. Something as
             | simple as there's no place to put my phone, there's only
             | extremely awkward locations that cause my phone to fly
             | around the car when I go around any turn, which is so dumb
             | in 2022. The backup camera has way too narrow field of
             | view, I've almost backed into 2 cars in parking lots,
             | something the Tesla got right. At least they have Apple
             | Carplay, but activating it is extremely annoying.
             | 
              | I've also had BMWs before (X3 and X6) and overall my favorite
             | car of all time is the X6.
        
               | Melatonic wrote:
                | Honestly I do not get how they get away with so much
               | touchscreen - that should be illegal honestly. You very
               | quickly learn the buttons and knobs in your car for basic
               | activities and do not even need to look. Not to mention a
               | single point of failure with that touchscreen!
        
             | heipei wrote:
             | > In fact, the German in-joke is that the reason Tesla
             | built a factory in Germany is so that the Germans could
             | (try to) teach them how to fix the panel gaps. :-)
             | 
             | As a German I can certainly say that folks here really pay
             | attention to where cars are manufactured. Tesla Model Y's
             | available on various platforms are boldly advertised as
             | "Manufactured in Grunheide". At the same time people are
             | aware that some domestic models (such as the Mercedes EQS
             | SUV) are solely manufactured in the US and then shipped
              | over, which are sometimes perceived as lower quality as a
             | result.
        
             | elorant wrote:
             | Fuck panel gaps. Just look at the quality of the interior.
             | You're paying 100k for a car that has worse quality than a
              | Ford. Everything is plastic, there are no buttons or knobs
             | anywhere, no panel in front of the driver, the leather on
             | the seats doesn't feel like leather etc. I mean I get it
             | that half the price of the car is in batteries and R&D, but
             | still you can't even compare it to a 50k Volvo. It's just
             | crap. And now that the big manufacturers are moving into
              | electric cars Tesla's got a lot of serious competition to
             | face from companies who know how to treat a customer who's
             | paying big bucks.
        
               | ryantgtg wrote:
               | Serious competition is good!
               | 
               | I just replaced my Audi Q5 with a Tesla Model Y (which...
               | wasn't $100k). No panel gaps, no other problems, and the
               | overall quality feels nicer than the Audi. Shrug!
               | 
               | Anyway, yeah, the next few years look really exciting for
               | consumer EVs. So many announcements in 2022.
        
               | Something1234 wrote:
               | The Audi Q5 had issues with the infotainment IMO. It
               | wasn't stock android auto like in my Hyundai. The voice
               | assistant button always routed to the audi voice thing
                | (completely worthless; I know what I want Google to do,
                | just connect me to Google). The buttons felt
                | overcomplicated.
               | 
               | Although one feature I really liked was that it would
                | tell me what the biggest consumers of power were. I would
               | definitely like to have a car with a heated steering
               | wheel in the winter here, but that isn't in the cards.
        
               | xxpor wrote:
               | If you're paying 100k for a model 3, you're getting
               | screwed.
        
               | cutenewt wrote:
               | I was seriously considering a Model 3 for my new car, but
               | I'm so glad I passed.
               | 
                | On the Model 3 subreddit there are endless posts about
                | reliability issues, including ones I never thought would
                | be a problem.
                | 
                | Some recent ones include the driver-side mirror housing
                | falling off and snow entering the trunk compartment.
        
               | sixQuarks wrote:
               | Have you looked at forums for other cars? There's tons of
               | complaints for every model. I've had my model 3 for 5
               | years with zero issues.
        
               | brewdad wrote:
               | I can't recall mirrors falling off or trunks so poorly
               | sealed as to allow snow inside with other car makers.
               | 
                | Maybe on a Yugo, but nothing that purports to be anything
                | better than a cheap-as-possible econobox.
        
           | areoform wrote:
           | I feel that casting TSLA as a company without "real
           | engineers" isn't helpful nor is it truthful.
        
             | lolinder wrote:
             | For me, the measure of whether an engineer counts as "real"
             | is if they're empowered to say no.
             | 
             | You can have hundreds of qualified people who are called
             | "engineers" and would be excellent in another environment,
             | but if the culture is "If I say jump, you ask how high" you
             | don't have an engineering culture, you have an autocracy.
        
           | misiti3780 wrote:
           | Tesla isnt a real car company? Please climb back under your
           | rock.
        
           | jsjohnst wrote:
           | > Mercedes is a real car company with real engineers and a
           | gold-standard reputation to maintain
           | 
           | Totally agree with all your points against Tesla, but "gold-
           | standard reputation" for Mercedes? Based on what? They are
            | consistently rated as one of the worst brands reliability-
            | wise (Tesla usually being worse, but still).
        
             | lcnPylGDnU4H9OF wrote:
             | "Nothing's more expensive than a cheap Mercedes."
             | 
             | Because they have a reputation of breaking down a lot, and
             | a "cheap" Mercedes is still a Mercedes which is fixed using
             | Mercedes-priced parts.
        
               | SoftTalker wrote:
               | It's interesting. I only buy older used cars. Mercedes is
               | a brand I have owned a couple of times. Outside of a few
               | specific engines and models, they are mechanically very
                | reliable and very solidly built. Moreover, the older cars
                | especially are quite easy for a home mechanic to work on.
                | Parts are readily available and are not really
               | more expensive than for any other car I've owned, which
               | includes several other German as well as Japanese and
               | American brands.
               | 
               | If you are in that market, and stick with the older
               | models that are proven to be reliable, they are pretty
               | safe buys. Like any used car, a lot depends on the care
               | given by the prior owner, but people who buy Mercedes
               | cars new tend to have at least above-average income, and
               | can afford to maintain them properly.
        
           | haberman wrote:
           | Agree. Which makes it all the more frustrating that popular
           | press focuses on irrelevant distractions, like the fact that
           | if you try really hard, you can defeat the protections
           | designed to ensure that the driver is ready to take over:
           | https://www.consumerreports.org/autonomous-driving/cr-
           | engine...
           | 
           | A car is never going to prevent a determined individual from
           | doing stupid things. But it is a big problem that people who
           | are trying to be responsible are misled about what Tesla's
           | "Full Self Driving" can actually deliver.
        
           | revnode wrote:
           | So Mercedes' solution is to offer a product that isn't
           | usable? Why bother releasing it?
        
             | tsimionescu wrote:
             | It's the other way around. Tesla is selling an extremely
             | expensive beta test that you can't use at all anywhere in
             | any conditions with any kind of safety expectations.
             | 
             | Mercedes is selling a product that has a small set of well-
             | defined cases where it can actually be used.
        
             | lolinder wrote:
             | It's perfectly usable in its intended scope: it allows you
             | to focus on other things while driving in heavy traffic.
             | When they're confident that they can do so safely, they'll
             | extend it to other situations.
             | 
             | Mercedes explains the purpose in their press release:
             | 
             | > Conditionally automated driving on suitable motorway
             | sections where traffic density is high
             | 
             | > During the conditionally automated journey, DRIVE PILOT
             | allows the driver to take their mind off the traffic and
             | focus on certain secondary activities, be it communicating
             | with colleagues via In-Car Office, surfing the internet or
             | relaxing while watching a film. In DRIVE PILOT mode,
             | applications can be enabled on the vehicle's integrated
             | central display that are otherwise blocked while driving.
             | 
             | https://group-media.mercedes-
             | benz.com/marsMediaSite/en/insta...
        
             | binarymax wrote:
             | It's useable in certain situations that have a high
             | probability of safety, and allows them to capture data and
             | grow the program safely over time.
        
             | ahakki wrote:
              | With its current limitations, the only application for
             | Mercedes' solution I can think of is during heavy traffic
             | on highways. But calling it "not usable" does seem a bit
             | harsh.
             | 
             | Of course if you prefer, move fast and brake... maybe
        
               | ethanbond wrote:
               | Traffic on highways is also by far the most frustrating
               | part of driving basically for the same reasons it's an
               | easy-ish target for automation, so seems like a pretty
               | good place to start.
               | 
               | IMO that's just good product strategy.
        
             | vanilla_nut wrote:
             | Tesla's solution offers a product that occasionally tries
             | to kill you and people around you. The only reason it
             | doesn't is because drivers are forced to pay attention and
             | take over at a moment's notice at all times.
             | 
             | Mercedes' solution is a car company taking actual
             | responsibility for their software. If they feel the
             | lawsuits/insurance claims/legal snafus are worth the risk,
             | that means their software is probably pretty damn good in
             | that limited scope. Otherwise they could literally bankrupt
             | the company with lawsuits! That's a lot more confidence
             | inspiring to me than Elon's repeated pie-in-the-sky claims.
        
             | newaccount74 wrote:
             | When I commuted in the city, there was a traffic jam almost
             | every day, and I'd be stuck 15-20 minutes driving at
             | walking speed. On especially bad days it could be up to
             | 45min.
             | 
             | If I could have read my emails in that time it would have
             | been really nice.
        
               | pantalaimon wrote:
               | The proper solution to this is trains
        
               | philippejara wrote:
                | Everyone knows that public transport is the solution to
                | it, but I can't buy a railroad track and a train to take
                | to work, so people are going to do what they can do.
        
           | _the_inflator wrote:
           | Tesla's mockery during the bull market is finally coming to
           | an end.
           | 
           | Being cocky and funny without delivering great results is
           | simply embarrassing.
           | 
           | I think Mercedes and other automakers have a good chance to
           | bypass Tesla now since Elno is captivated by his Twitter
           | acquisition.
        
             | rnk wrote:
             | I think you mean to say Elmo
        
               | klipt wrote:
               | Come now, Elmo and Elon may both be childish Muppets, but
               | Elmo at least has the excuse of actually _being_ a child
               | in an educational kids show.
        
             | jethro_tell wrote:
             | IDK, it seems like the people doing the work might have a
             | better grasp on delivering for tesla than musk. I don't
             | imagine tweeting promises that the engineers know they
             | can't keep is that useful.
        
             | bnjms wrote:
             | > I think Mercedes and other automakers have a good chance
             | to bypass Tesla now since Elno is captivated by his Twitter
             | acquisition.
             | 
             | I've been disappointed in Elon for some time and feel
             | society may have already taken him for all of his good
             | ideas.
             | 
             | So I'd bet the opposite. This is a chance for Tesla to
             | recover and match the established auto makers. And the only
             | chance they'll get.
        
               | snotrockets wrote:
               | So far, Tesla has been incapable of stepping up to be a
               | major car manufacturer. There are many small car
                | manufacturers, some of which build exquisite cars that are
               | technological and design marvels, but what Ford realized
               | early, and Nagoya perfected (just consider how many
               | modern op practices originated there!) is that it's not
               | about the machine, but about the infrastructure.
               | 
               | This is both pre-sale, where you have to build a lean,
               | mean, fast pipeline from vendors to assembly, and post-
               | sales, where you have to have service infrastructure that
               | spans continents, if not the world.
               | 
                | Tesla so far shows very little sign of being able to do
               | either. And just like Whitley failed and ended up being
               | bought by Mumbai, my personal bet (caveat lector!) is
               | that Tesla-the-brand might survive, but Tesla-the-car-
               | manufacturer would end up a subsidiary of a Chinese car
               | manufacturer, who has the car manufacturing chops, but
               | can't build a brand.
        
               | tiahura wrote:
               | VW made about 4.5 Billion in profit in Q3. Tesla made
               | about 3.3 Billion.
               | 
               | Seems pretty major to me.
        
               | erikstarck wrote:
               | Yeah, and highest margin by far in the auto industry.
               | While growing 50% per year.
        
               | HWR_14 wrote:
               | They are the highest margin because EVs are still supply
               | limited. How smart people think that margin will survive
               | once there is competition amazes me.
        
               | tiahura wrote:
               | Nobody thinks it will.
               | 
               | However Tesla makes about $9000 in profit per car. If
               | that is cut in half it still beats Toyota's $1200 by a
               | wide margin.
        
               | davidcbc wrote:
               | Now that Elon has ruined his carefully crafted PR image
               | of a real life Tony Stark we'll see how long that growth
               | continues.
        
               | Domenic_S wrote:
               | Only the terminally online think people buy Teslas
               | because of Elon
        
               | brewdad wrote:
               | I will never buy a Tesla because of Elon.
        
               | XorNot wrote:
                | No one thinks people buy Teslas because of Elon.
                | 
                | But people absolutely will _not buy_ Teslas because of
                | Elon.
               | 
               | Brands work hard to avoid negative sentiment for a
               | reason.
        
               | epistasis wrote:
               | I have a Tesla, and love it (at least as much as I could
               | ever love a car, I'd prefer a car-free life honestly).
               | 
               | The worst thing about it isn't the panel gaps or
               | reliability (haven't had any problems). The worst part is
               | Elon Musk and his fans. Shortly after getting the car
               | three and a half years ago, I was leaving an outdoor
               | party and a man who was a Musk superfan was doing that
               | waving of arms of worship that you sometimes see fans do
                | at metal concerts, and it was just embarrassing.
               | Previously the same man had been gushing about full self
               | driving, and I said there was no chance it would be
               | delivered on time, if ever, and he professed his undying
               | trust in Musk.
               | 
               | Combined with Musk's recent anti-Ukraine efforts, his
                | hyper-partisan paranoia that he's trying to push on
               | Twitter, his hate for trans people, his hate for
               | biological science exhibited throughout the pandemic and
               | even today in his "jokes" about prosecuting Fauci, Musk
               | is waging cultural war against every single aspect of my
               | identity.
               | 
               | I hate Musk so so so much, and I know he had almost
               | nothing to do with the creation of the car I like, but it
                | still pains me every time I get in it to know that I
               | helped such a despicable person make a ton of money.
               | Never again will I buy a Tesla, especially since there
               | are now competitors. I'm sure I would hate all the rest
               | of the auto execs almost as much if I knew as much about
               | them as I know about Musk, but the nice thing is that I
                | _don't_ know a damn thing about Stellantis's CEO, from
               | their name to their former partners. And there's a lot of
               | value to that, as a customer.
               | 
               | Maybe Musk is just trying to win over conservatives and
                | jackasses to buy Teslas, but I doubt it. I think he's
               | just a dangerous fool.
        
               | api wrote:
               | "You die a hero or live long enough to become a villain."
        
           | MomoXenosaga wrote:
           | The difference between breaking Firefox nightly and writing
           | software for a nuclear power plant.
        
         | [deleted]
        
         | davidkuennen wrote:
         | It's actually 80 mph in Germany now [0], which makes this such
         | a great and useful feature. It really feels like the future.
         | 
         | [0] https://www.therobotreport.com/un-allows-autonomous-
         | vehicles...
        
         | andreyk wrote:
         | I've been testing FSD, and I WISH their system did more to
         | limit its use in bad conditions. The perception (based on
          | dashboard visualization) is much worse during rain, yet Tesla
          | lets you keep using FSD even in heavy rain.
        
         | clashmoore wrote:
          | Tesla was found to be deactivating the autopilot mode in the
          | second before a crash [0]. I suspect the reason is a dubious
          | one: it lets Tesla declare that none of its cars were in
          | autopilot/FSD mode when involved in a crash.
         | 
         | [0] PDF https://static.nhtsa.gov/odi/inv/2022/INOA-
         | EA22002-3184.PDF
         | 
         | "The agency's analysis of these sixteen subject first responder
         | and road maintenance vehicle crashes indicated that Forward
         | Collision Warnings (FCW) activated in the majority of incidents
         | immediately prior to impact and that subsequent Automatic
         | Emergency Braking (AEB) intervened in approximately half of the
         | collisions. On average in these crashes, Autopilot aborted
         | vehicle control less than one second prior to the first
         | impact."
        
           | agumonkey wrote:
            | I fail to understand how anyone at the top of any
            | serious company would think bailing out at the last second
           | would absolve them of anything.
        
         | Geee wrote:
         | This 'level 3' is just a very cheap marketing trick. The system
         | is a very simple highway traffic jam assistant. This trick
         | plays with the misconception that ADAS levels actually
         | determine how advanced the system is. They get to claim 'level
         | 3' with a very simple system by assuming liability in those
         | conditions. It's just marketing, and has nothing to do with
         | actual capabilities of the system.
        
           | mosseater wrote:
           | I mean personally the company assuming liability means A LOT
           | more to me than how much the system can do. It's one thing to
           | say your system can drive down a slick and curvy mountain
           | road, and another thing to say you'll cover all liability if
           | the car drives itself off the mountain. It's easy to write
            | software that runs most of the time. These are our lives that
           | we're talking about.
        
             | Sakos wrote:
             | Feels like the people here saying that Mercedes assuming
             | liability doesn't matter are the same people who say it's
             | your own fault if you lose your job and your healthcare and
             | become poor.
        
         | gzer0 wrote:
          | Here is the full list of restrictions for Drive Pilot's legal
          | liability to take effect:
          | 
          |   - Roads need to be premapped ahead of time with LiDAR
          |   - Roads need to be pre-approved
          |   - Car cannot go above 37 MPH
          |   - Limited-access divided highways with no stoplights
          |   - No roundabouts
          |   - No traffic control systems whatsoever
          |   - No construction zones
          |   - Only operates during daytime
          |   - Reasonably clear weather
          |   - No overhead obstructions
         | 
         | It is actually _illegal_ to be going that slow on a highway, in
         | Texas at least. This would simply be too dangerous to even
         | allow.
         | 
         | Let me know of any other system that is even remotely close to
         | being able to do the following:
         | 
         | https://www.youtube.com/watch?v=qFAlwAawSvU
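          | 
          | To make the scope concrete, the whole feature amounts to an
          | operational-design-domain gate over that list. A minimal
          | sketch in Python (all names here are hypothetical, not
          | Mercedes' actual API):
          | 
          |   from dataclasses import dataclass
          |   
          |   @dataclass
          |   class Conditions:
          |       premapped_approved_highway: bool  # LiDAR-premapped road
          |       speed_mph: float
          |       traffic_controls_present: bool    # lights, signs, etc.
          |       roundabout: bool
          |       construction_zone: bool
          |       daytime: bool
          |       clear_weather: bool
          |       overhead_obstruction: bool
          |   
          |   def may_engage(c: Conditions) -> bool:
          |       # True only if every published restriction is met
          |       return (c.premapped_approved_highway
          |               and c.speed_mph <= 37
          |               and not c.traffic_controls_present
          |               and not c.roundabout
          |               and not c.construction_zone
          |               and c.daytime
          |               and c.clear_weather
          |               and not c.overhead_obstruction)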
        
           | gnicholas wrote:
           | So it works on freeways when there's congestion, and the
           | speed of traffic is < 37 MPH? Sounds like adaptive cruise
           | control with lane keep (and insurance coverage, which isn't
           | nothing).
        
             | macspoofing wrote:
             | >Sounds like adaptive cruise control with lane keep (and
             | insurance coverage, which isn't nothing).
             | 
             | Without the restriction that hands are on-wheel and driver
             | is paying attention to the road. That's a BIG difference.
        
           | vel0city wrote:
           | > It is actually illegal to be going that slow on a highway,
           | in Texas at least.
           | 
           | I've been at a dead stop on many highways in Texas, along
           | with hundreds of other cars around me. I see such things
           | happening pretty often outside my office window.
           | 
            | Honestly, times when I'm going <37 MPH on a controlled-access
            | highway are some of the most annoying driving, and exactly
            | what I'd like to have completely automated. That usually means
            | I'm in stop-and-go traffic, some of the most grating time to
            | drive. Both of
           | my cars are mostly there, keeping safe distances and coming
           | to a stop with cruise control, but definitely not completely
           | automated yet.
        
           | snotrockets wrote:
           | Texas has done a lot to increase the dangers caused by their
           | roads, and don't seem like they plan to reroute. Just look at
           | the current plans for the I-35, ffs.
        
           | mcguire wrote:
           | " _It is actually illegal to be going that slow on a highway,
           | in Texas at least. This would simply be too dangerous to even
           | allow._ "
           | 
           | You don't live in Austin or Houston, do you? :-)
        
           | lolinder wrote:
           | People have mentioned this in other sub-threads, but it's
           | explicitly intended for stop-and-go traffic jams:
           | 
            | > Mercedes-Benz has announced approval of their "Drive Pilot"
           | system, in Germany, which does fully autonomous operation in
           | highway traffic jam situations.
           | 
           | > ...
           | 
           | > The Mercedes car provides the traffic jam assist function
           | -- only on German motorways to start -- below 60 km/h. While
           | people debate whether they want to drive their car or not,
           | nobody likes driving in a traffic jam, and everybody hates
           | the time wasted in them. As a luxury feature, this will let
           | drivers make more productive use of that time.
           | 
           | https://www.forbes.com/sites/bradtempleton/2021/12/13/merced.
           | ..
        
             | slg wrote:
             | But stop and go traffic jams in perfect conditions can
             | already be handled properly by numerous companies' adaptive
             | cruise control and lane keeping systems. I'm not sure why I
             | should be impressed with Mercedes' tech here. The
             | impressive aspect is that they are standing behind the tech
             | by taking on liability, but that could easily just be
             | considered a marketing expense rather than actual
             | confidence in the technology. We have all heard the auto
             | manufacturer anecdote from Fight Club. The math these
             | companies do is based off money and not lives saved.
        
               | HWR_14 wrote:
               | It's not a jump in technology. It's the result of a slow
               | growth of that technology to "mature enough to take
               | liability for". Which is a better way to move tons of
               | machine around under computer control.
        
               | lolinder wrote:
               | Based on all the information I've seen, adaptive cruise
               | control with lane keeping is all that Tesla is reliable
               | at as well. The main difference between them and Mercedes
               | is that Tesla is willing to put out tech that is known to
               | be unreliable and let their customers take the fall for
               | it.
        
               | retSava wrote:
               | Perhaps this is just a rebrand of that already common
               | tech? Kind of how some manufacturers claim "we have AI!"
               | just based on something simple like adapting to a moving
               | average.
        
               | yardie wrote:
               | I haven't driven every car but none of the cars I've
               | driven could actually handle stop and go traffic. They
               | will certainly stop but leave it to you to press the
               | accelerator to go again. Now on a highway with medium to
               | light traffic they are plenty capable of managing it.
        
               | ummonk wrote:
               | The difference is that they're explicitly allowing the
               | driver to stop paying attention to driving, which reduces
               | fatigue, wasted time etc. It's actual level 3 self-
               | driving tech rather than mere driver assistance tech.
               | 
               | Of course, other driver assistance systems might be close
               | to on par with it, but a system that successfully
               | navigates stop and go traffic 99% of the time is very
               | different from a system that successfully navigates stop
               | and go traffic 100% of the time, in terms of driver
               | attention required.
        
               | slg wrote:
               | I'm not sure level 3 is any safer than level 2. Level 3
               | still requires a driver to intervene if the car requests
                | it. But going from not paying attention to actively
                | driving isn't something that can happen instantly.
                | Imagine you are
               | playing some game on your phone and alarms start going
               | off in the car. You need to be able to process what those
               | alarms are saying, assess the situation, and take control
               | of the car. How quickly can people do that? Likely not
               | fast enough to avoid any urgent issues. A driver in a
               | level 2 system should already be paying attention so they
               | should be able to respond quicker.
               | 
               | And yes, I understand that drivers can get lazy with a
               | level 2 system. But if the selling point of Mercedes is
               | taking over liability from the driver, I am mostly
               | concerned how the system would benefit me as a driver and
               | I regularly use my car's level 2 features while paying
               | attention.
        
               | froh wrote:
                | The difference is that level 2 requires you to take over
                | at any time, immediately, while level 3 allows you to do
                | something else and gives you some time (for Drive Pilot:
                | 10 seconds) to take over. 10 seconds is quite some time
                | in contrast to having to take over immediately.
        
               | burnished wrote:
                | I think for the prescribed use case the situation where
               | you require human intervention is where the traffic jam
               | clears up and it's time to drive at highway speed again.
               | Not an emergency. I'm having a hard time imagining a
               | situation where you would need to speedily regain
                | complete control to avert a crisis that a human driver
                | wouldn't also fail to avert.
        
               | diebeforei485 wrote:
               | I find the distinction between level 2 and level 3 to be
               | unhelpful. How long do humans have to take over? Anything
               | less than 20 seconds is not very feasible IMO.
               | 
               | Taking liability is an interesting PR move, but I don't
               | think it matters in stop and go traffic where speeds are
               | relatively low and damage is typically minimal if any.
        
               | tsimionescu wrote:
               | Collisions at 40mph / 60km/h are no laughing matter. And
               | as far as I understand, the Mercedes system will let you
               | know well ahead of time if you have to take over, as that
               | would only be required as you leave the designated area.
               | Taking over to drive faster than the limit for drive
               | pilot would never be a requirement.
        
               | ummonk wrote:
               | If 10 seconds isn't enough to orient yourself and take
               | over after the car alerts you to do so, you shouldn't be
               | driving a car.
        
               | macspoofing wrote:
               | >But stop and go traffic jams in perfect conditions can
               | already be handled properly by numerous companies'
               | adaptive cruise control and lane keeping systems.
               | 
               | And do those adaptive cruise control/lane keeping systems
               | allow the driver to take their hands off the wheel and
               | stop paying attention to the road?
        
               | vel0city wrote:
               | Stop and go traffic jams aren't _completely_ automated by
                | most ADAS systems. For one, you're still completely
               | liable for it failing. Secondly, most of those lane keep
               | assists will still let your car wander out of the lane if
               | you really don't pay attention, they mostly just tug at
               | the wheel to help you notice drift or will beep at you.
               | Finally, a lot of those will require manual intervention
               | for it to start moving again after a full stop.
               | 
                | Mercedes' implementation takes the legal liability. It
               | will definitely stay in its own lane without any driver
               | input. It will continue going again after a full stop all
               | on its own.
        
           | tablespoon wrote:
           | > It is actually illegal to be going that slow on a highway,
           | in Texas at least. This would simply be too dangerous to even
           | allow.
           | 
           | So does the Texas Highway Patrol ticket everyone in a traffic
           | jam? They almost certainly don't, which should be a big clue
           | that it's not so simple.
           | 
           | This appears to be the actual law in Texas:
           | 
           | https://texas.public.law/statutes/tex._transp._code_section_.
           | ..
           | 
           | > (a) An operator may not drive so slowly as to impede the
           | normal and reasonable movement of traffic, except when
           | reduced speed is necessary for safe operation or in
           | compliance with law.
           | 
           | > ...
           | 
           | > (c) If appropriate signs are erected giving notice of a
           | minimum speed limit adopted under this section, an operator
           | may not drive a vehicle more slowly than that limit except as
           | necessary for safe operation or in compliance with law.
           | 
           | It's not safely operating a vehicle to go 40mph in 10mph
           | traffic.
        
         | pkaye wrote:
         | > But there's one key difference: Once you engage Drive Pilot,
         | you are no longer legally liable for the car's operation until
         | it disengages.
         | 
         | What if they disengage right before an accident in order to
         | transfer the liability to you?
        
           | froh wrote:
            | They don't. The Mercedes drives on for at least ten seconds
            | until you take over in that traffic-jam-on-highway scenario,
            | under all circumstances.
        
           | bastawhiz wrote:
           | If the car knows that it's about to be in an unavoidable
           | accident and it is at fault, it has acknowledged that it has
           | fucked up. To think that Mercedes wouldn't find itself in an
           | expensive legal battle the first time that this happens would
           | be ridiculous.
           | 
           | But I would expect that the disengagement is much less abrupt
           | than what Autopilot/FSD do. From [0]:
           | 
           | > After consulting with the engineer in the passenger seat, I
           | closed my eyes completely, and just eight or nine seconds
           | later a prompt popped up asking me to confirm I was still
           | alert. I ignored it, which soon started the 10-second
           | countdown toward disengagement.
           | 
           | Which makes sense: if the point of the system is for you to
           | be able to turn around and help your kids for a few seconds
           | or watch a TV show on the center console, they simply can't
           | expect that they can ding and have you regain control
           | instantly.
           | 
           | [0] https://www.motorauthority.com/news/1136914_mercedes-
           | drive-p...
        
         | r053bud wrote:
         | 40 MPH is absolutely too slow for a highway, dangerous even. Is
         | this for surface roads?
        
           | gnicholas wrote:
           | > _limited-access divided highways with no stoplights,
           | roundabouts, or other traffic control systems_
           | 
           | Yeah I don't understand where exactly this would be usable,
           | at least around where I live. If it's a divided highway, it
           | would have to have stoplights. Are there places where divided
           | highways have stop signs?
        
             | snowwrestler wrote:
             | A limited-access divided highway does not have stoplights
             | or stop signs, or any intersections at all. Cars enter and
             | exit the roadway exclusively via on- or off-ramps.
        
             | gen220 wrote:
             | My guess is that it's intended to be used in the entire
             | State of Connecticut, between the hours of 8AM and 10AM and
             | 4PM and 6PM. i.e. situations where the highway is doing its
             | best impression of a Dunkin' Drive-Thru.
             | 
             | Signed, slightly-jaded person who drives the Boston<>NYC
             | track enough to be slightly-jaded.
             | 
             | ---
             | 
             | Or, Wareham -> Barnstable on Cape Cod, on any weekend
             | morning for 6 months out of the year. Or 101 in the CA Bay
             | Area during rush hour.
             | 
              | Basically, any time+place where the thought of driving
             | elicits an audible moan from the people then and there.
        
             | toast0 wrote:
             | A stop sign is a traffic control system, FWIW. They're
             | saying a freeway, more or less, although the low top speed
             | means really a freeway during congestion.
        
             | kingnothing wrote:
             | It's intended for use on interstates and highways during
             | stop and go traffic jams.
        
               | gnicholas wrote:
               | So it's basically adaptive cruise control with lane
               | keeping? I guess they don't have to worry about turns
                | that are too sharp (which can be troubling for lane-keep
               | systems) because they're limiting it to freeways that are
               | meant to be driven at 70 MPH, but only when the speed of
               | traffic is half that.
        
               | Linosaurus wrote:
               | They are also trying to convince regulators that they,
               | not you, are legally responsible for any incidents. As
               | long as you are ready to take over with a ten second
               | warning.
        
             | numpad0 wrote:
             | Limited-access divided highway in this context means
             | freeways and toll roads with walls to roadsides. It is
             | generally considered acceptable to operate dangerous robot
              | machines in fenced-off areas with enough precautions, and
              | that isn't much different in philosophy from a self-driving
              | car on such a highway.
        
             | vel0city wrote:
             | I've been on many rural roads with divided county highways
             | with stop signs.
             | 
             | And as others have mentioned it's still a traffic control
             | device.
        
             | someweirdperson wrote:
             | > Yeah I don't understand where exactly this would be
             | usable, at least around where I live.
             | 
             | German autobahn.
        
               | gnicholas wrote:
               | < 37 MPH?
        
               | dual_dingo wrote:
               | Stop and go is an all too common thing on the Autobahn,
               | often during rush hour in areas near large cities.
        
               | kwhitefoot wrote:
               | If you are thinking of places like Hamburg then the terms
               | and conditions forbid it because the motorways in the
               | Hamburg area are all construction zones and have been for
               | at least the last five years that I have driven through
               | them.
        
           | [deleted]
        
           | Karppu wrote:
           | From my understanding it's initially meant for use e.g. in
           | slow moving traffic jams on highways. They're working towards
           | getting it approved for up to 130kmh.
           | 
           | There's some very general info here:
           | https://www.roadandtrack.com/news/a39481699/what-happens-
           | if-...
        
             | judge2020 wrote:
             | I don't see how 'approval' is required for _anything_. Not
             | for lane-keeping at high speeds, and not for Mercedes to
             | take liability for accidents while their technology is
             | active at high speeds.
        
               | dual_dingo wrote:
               | Luckily, the law disagrees in most countries and you need
               | to get a type approval before you can sell a new car type
               | or new assistive technologies. I would not want to live
                | in a world where this is not the case.
        
             | bscphil wrote:
             | > From my understanding it's initially meant for use e.g.
             | in slow moving traffic jams on highways.
             | 
             | I'm guessing the feasibility of this is very city-
              | dependent. In LA, the usual traffic pattern is "drive 60-70
             | mph for 20 seconds, slow down to 5 mph for 60 seconds,
             | repeat". (Granted that that's because of poor drivers, it
             | would be better to have a consistent speed of 30 mph, but
             | there's nothing Mercedes can do about that either way.)
        
           | trgn wrote:
            | Just to piggyback on the 40mph callout: I would like to
            | see self-driving cars never really drive more than 5mph under
           | the speed limit. It would have a great calming effect on
           | traffic. If they combine that with very conservative
           | acceleration, it would be even better, much less of that
           | rushing and accordion effect that's causing so many crashes.
           | 
           | Instead, Tesla fsd, at least from the youtube videos, looks
           | like it's driving like a BMW-driver. Way way way too
           | aggressive.
           | 
            | The biggest contributors to car crashes are speed and not
            | enough distance from the car in front. If self-driving cars would
           | exaggerate the basic premises of safe driving, low speed, low
           | acceleration, long distance, ... it would be really good for
           | traffic overall imho.
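            | 
            | For intuition on why those two factors dominate, here is a
            | rough stopping-distance estimate (standard reaction-plus-
            | braking kinematics; the parameter values are only
            | illustrative assumptions):
            | 
            |   def stopping_distance_m(speed_kmh, reaction_s=1.5,
            |                           friction=0.7, g=9.81):
            |       # reaction distance plus braking distance, dry road
            |       v = speed_kmh / 3.6              # km/h -> m/s
            |       return v * reaction_s + v ** 2 / (2 * friction * g)
            |   
            |   for kmh in (50, 80, 110):
            |       print(kmh, round(stopping_distance_m(kmh), 1))
            |   # ~34.9 m, ~69.3 m, ~113.8 m -- braking distance grows
            |   # with the square of speed, so modest speed cuts buy a
            |   # lot of margin.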
        
             | vel0city wrote:
             | Driving significantly slower than the pace of traffic is
             | dangerous. If the average pace of traffic is 5mph over but
             | your car won't go faster than 5 under, you're now going
             | 10mph less than everyone around you.
             | 
             | Speed differentials kill.
        
               | trgn wrote:
               | > Speed differentials kill.
               | 
                | That's the cope people use for habitual speeding. 10mph is
                | not a significant differential. 30 vs 60 on a highway,
               | sure, 55 vs 65, not at all.
               | 
               | You should actually try it once. Go five under the speed
               | limit and keep generous distance with the car in front of
               | you. You'll barely notice it. Traffic will be ahead of
               | you, you won't pass anybody. The biggest thing to get
               | over is the ego-thing.
               | 
                | As you are doing this, also pay attention to your
                | capacity to act on any emergency stop you may have to
                | make (dog sprinting across, car slamming its brakes,
                | ...) and how much more time and capacity you will
               | have to respond.
               | 
                | The other thing is that people's minds immediately go to
                | multi-lane highways. Never the other 70-80% of driving,
               | in town, single lanes, where going slower is always
               | manifestly better.
        
               | vel0city wrote:
               | I've seen it many times before. I got people bunching up
               | behind me, riding my bumper, cutting me off, swerving
               | around me, causing near misses in other lanes as they cut
               | other people off trying to pass. It causes backups near
               | ramps to get on and off highways, backups which often
               | result in rear end collisions, partially because...ding
               | ding ding _speed differentials_.
               | 
                | Also, the capacity to react in an emergency is more
                | about following distance than speed. And yeah, as speed
               | increases a driver needs to increase their follow
               | distance. Something that I agree loads of people fail at
               | doing and then complain about their ADAS systems always
               | slamming on the brakes suddenly.
               | 
               | > where going slower is always manifestly better.
               | 
               | Just tell that to all the cyclists going < 20mph in
               | 40-50MPH roads. They're way safer going that speed than
               | those fools driving their cars near the speed limit. It's
               | often not safe for them, partially because... _speed
                | differentials_. To solve this, we shouldn't just
               | restrict cars to only go cycling speeds, we should build
               | infrastructure so similar speed traffic is grouped
               | together and separate, reducing... _speed differentials_.
               | 
               | If I started driving my car 5mph in a 40mph road, I'd
               | probably cause more accidents than if I just went along
               | with traffic at 43mph.
               | 
               | Speed differentials kill.
        
               | watwut wrote:
                | The overwhelming majority of the time I was driving on a
                | highway, the right lane went below the speed limit. That
                | makes for quite a lot of cars that go below it.
               | 
                | And I used to drive exactly the speed limit (as measured by
                | GPS) and that made me among the _faster_ cars on the
                | highway. Only a few cars went faster than me.
               | 
                | I made an effort to slow down lately and can confirm that
                | the biggest and only issue to overcome is the ego and the
                | knee-jerk "being there faster makes you a better driver"
                | kind of thinking.
        
               | trgn wrote:
               | > Just tell that to all the cyclists going < 20mph in
               | 40-50MPH roads
               | 
                | That's again a 20-30 speed differential, not to mention a
               | huge difference in weight. We're talking about a 5-10 one
               | between cars.
               | 
               | Also, if it's a heavy freight truck going 20 in a 40mph
                | single lane, yeah, no issue at all with that speed
                | differential, is there? Maybe the problem here is the
                | inattentive, impatient drivers plowing through the
               | cyclist?
               | 
               | > acting in capacity to react in an emergency is more
               | about following distance than speed
               | 
               | The cars in front are not the only hazards.
               | 
               | Overall, I think you're making it too extreme. I'm not
               | saying you should be going 20 on a highway. I'm saying
               | going 5 under a posted speed limit is actually very
               | reasonable, and it's what self driving cars (and human
               | drivers) should do. It will reduce crashes. I think we
               | disagree there.
        
           | fnordpiglet wrote:
           | So it's adaptive cruise control? That's an absurdly low bar.
        
             | ender341341 wrote:
             | If they're claiming legal responsibility for it causing any
             | crashes I'd say they're setting a pretty high bar as far as
             | confidence goes.
        
         | dheera wrote:
         | I really wish they would cut this "Level" nonsense, that system
         | was invented by business people, not engineers.
         | 
          | Interventions are more nuanced than just "Did the driver
          | intervene". Many times I intervene and take control while using
          | Tesla FSD not for safety reasons, but to be nice to other
         | drivers, or for a smoother ride. It tends to love passing other
         | cars on the right and not letting cars into merges, for
          | example. It also brakes a little hard at traffic jams,
          | not nearly to a point where it would be a safety
         | issue, but when I see a traffic jam far ahead I would begin
         | decelerating much, much earlier just for the comfort of myself
         | and passengers.
         | 
         | That said, FSD is nowhere near ready, I do have a huge number
         | of safety related interventions as well, but reducing this to a
         | number like L3 or L4 is trying to oversimplify a problem that
         | isn't simple.
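          | 
          | A per-reason breakdown would say a lot more than a single
          | number. A toy sketch of what I mean (hypothetical categories,
          | not Tesla's actual telemetry):
          | 
          |   from collections import Counter
          |   from enum import Enum, auto
          |   
          |   class Reason(Enum):
          |       SAFETY = auto()     # took over to avoid a hazard
          |       COMFORT = auto()    # e.g. late, hard braking into a jam
          |       COURTESY = auto()   # e.g. letting another car merge in
          |       ROUTE = auto()      # awkward lane choice, passing on right
          |   
          |   def per_1000_miles(reasons, miles):
          |       # turn a list of logged takeover reasons into rates
          |       counts = Counter(reasons)
          |       return {r.name: 1000 * counts[r] / miles for r in Reason}
          | 
          | Reporting only the SAFETY rate versus reporting the total would
          | give very different pictures of the same car.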
        
           | vel0city wrote:
           | ADAS Levels were standardized by the Society of Automotive
           | _Engineers_. Are you saying the Society of Automotive
           | _Engineers_ are all just business people and not engineers?
        
         | esalman wrote:
         | > But the big promise from Mercedes is that it would take legal
          | liability for any accidents that occur during Drive Pilot's
         | operation
         | 
         | I am a FSD skeptic but I might be sold on this.
        
       | hacoo wrote:
       | I'm no Tesla fanboy, but it should be said that publicly reported
        | miles per disengagement metrics are complete bullshit. Different
       | companies have wildly different criteria for "disengagement".
       | Those reporting 1000mi+ between disengagements (at least in urban
       | settings) are only doing so because they aren't counting the vast
       | majority of events.
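        | 
        | To illustrate how much the counting policy alone moves the
        | headline number, with made-up figures for a single
        | 10,000-mile log:
        | 
        |   miles = 10_000
        |   events = {"safety": 4, "comfort": 30, "planner": 66}
        |   
        |   count_everything = miles / sum(events.values())   # 100 mi/event
        |   count_safety_only = miles / events["safety"]      # 2,500 mi/event
        | 
        | Same drive log, a 25x gap, purely from what gets counted.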
        
         | calcifer wrote:
         | > Those reporting 1000mi+ between disengagements (at least in
         | urban settings) are only doing so because they aren't counting
         | the vast majority of events.
         | 
         | Source? Do you work at one of those companies?
        
       | cjdoc29 wrote:
       | I will say, FSD has genuinely gotten better over the last year. I
       | would not trust it to drive without a driver monitoring it. But
       | I've had ~20mi trips end-to-end (surface streets and freeways)
       | without requiring disengagement. It's been a great driver assist
       | - but it's not true FSD.
       | 
       | That said, it still does dumb things like:
       | 
       | * Getting in the left-most turn lane when I will make a right
       | turn immediately after a left. This usually results in a
       | disengagement or me having to change lanes immediately after
       | turning.
       | 
       | * Changing lanes, but the process sometimes makes it so that I
       | change lanes in an intersection. This is illegal.
       | 
       | * Not stopping at the crosswalk on the right-most lane on a red.
       | I expect the car to stop firmly at the intersection and then
       | slowly creep out.
        
       | pmarreck wrote:
       | Musk has tweeted about the importance of transparency/candor
       | https://twitter.com/elonmusk/status/1598858533608431617?s=46...
       | and yet does not release recent FSD safety data. This is simply
       | hypocritical.
        
       | Animats wrote:
       | The California DMV has been licensing autonomous vehicles since
       | 2014. There are three categories of license. Testing with a
       | safety driver is the learners permit: must have licensed driver
       | ready to take over, no driving for hire, no large vehicles. About
       | 45 companies have a learner's permit. Driverless testing is the
       | next step up: no driver in the vehicle, but a remote link and
       | observer. 7 companies have that permit.
       | 
       | Finally comes deployment: no driver, paying passengers, remote
       | monitoring.[1] Three companies are far enough along for
       | driverless car deployment in California: Waymo, Cruise, and Nuro.
       | Waymo is going about 13,000 miles between disconnects now. The
       | remote monitoring center can reset.
       | 
       | Waymo is still being very cautious. Only in Phoenix, AZ is the
       | service really deployed. There's a service in San Francisco, but
       | you have to sign up as a "trusted tester" and it's mostly Google
       | employees. When it goes live in SF, then this is real. Waymo has
       | a new vehicle they're showing. It's an electric van with no
       | steering wheel. Not clear how far away deployment is.
       | 
       | Tesla isn't even trying to qualify for autonomy in California any
       | more. They've given up. They used to whine about being "over-
       | regulated". What they hated was having to report all disconnects
       | and accidents to DMV, which publishes them.
       | 
       | [1] https://www.dmv.ca.gov/portal/vehicle-industry-
       | services/auto...
       | 
       | [2] https://www.theverge.com/2022/11/21/23471183/waymo-zeekr-
       | gee...
        
         | killjoywashere wrote:
         | > partnered with Chinese automaker Geely
         | 
         | Are you telling me there are no American companies they could
         | have partnered with? Why would you willingly give US dollars to
         | the CCP for a free market R&D effort?
        
           | Animats wrote:
           | Waymo has previously partnered with Chrysler/FCA/Stellantis,
           | and Jaguar. They don't seem to be strongly committed to a
           | vehicle maker.
        
         | d23 wrote:
         | > Tesla isn't even trying to qualify for autonomy in California
         | any more. They've given up. They used to whine about being
         | "over-regulated". What they hated was having to report all
         | disconnects and accidents to DMV, which publishes them.
         | 
         | Sunlight as a disinfectant indeed.
        
       | outside1234 wrote:
       | It is not hard to imagine that Tesla is going to die at this
       | rate.
       | 
        | Where's the moat? Traditional car makers are catching up on
       | electric and already have superior build quality. And if
       | anything, Tesla is behind on FSD.
        
         | jsight wrote:
         | My other car with TACC and autosteer claims to be a pro pilot,
         | but can't handle a shallow curve and also accelerates towards
         | stopped traffic fast enough to trigger its own collision
         | warning.
         | 
         | Tesla will be fine.
        
         | rootusrootus wrote:
         | Supercharger network is the only moat. In all other respects
         | other EVs beat Tesla feature for feature.
         | 
         | Maybe the gov't will squash FSD and Tesla will have an epiphany
         | and decide to refocus their efforts on the basics where they're
         | quickly falling behind. Hell, maybe they'll start including the
         | better technology the competition relies on, and try to stick
         | to just the things they're good at. Which is ... I don't know.
         | Image seekers?
        
         | bin_bash wrote:
         | Still superchargers for now
        
           | foepys wrote:
            | In Europe there is regulatory pressure for all publicly
            | accessible charging stations to be open to all vehicles.
           | 
           | If superchargers are Tesla's main advantage, they are not
           | only in competition with auto makers but with all electricity
           | providers in an increasingly competitive market.
        
         | mrcwinn wrote:
         | Tesla is certainly behind on FSD. What that team has produced
         | so far is nothing short of amazing, but gosh, their boss sure
         | hasn't helped their credibility.
         | 
         | I can say that after experiencing the software of the BMW i4
         | and the Audi e-tron -- holy cow, Supercharger is certainly not
         | the only advantage of a Tesla. Audi's is laughably bad. BMW's
         | is better than Audi's. Both are much worse than Tesla.
         | 
         | The obvious retort there is: but Tesla's competitors will get
         | better. And sure, they will. You're right. But Tesla likewise
         | has their own R&D investments and unreleased products.
         | 
         | I think sometimes when we talk about competitive landscape, we
         | tend to say "Company B will catch up to Company A" while in our
         | minds falsely imagining Company A as being static. Both Company
         | A and B are moving, but in Tesla's case, they had a ten year
         | jump on tightly integrated hardware + software. I think it's
          | likely that continues to yield dividends.
        
           | Der_Einzige wrote:
           | Tesla's software is also shit compared to most of the
           | American brands, Porsche, and with 22+ MY vehicles, even
           | Toyota/Lexus have better infotainment/software/sensors.
           | 
           | It's literally just a good battery and good charging. That's
           | all Elon has, and that advantage will die in 5 years.
        
       | hax0ron3 wrote:
        | It has been obvious to me ever since Musk promised, a few years
        | ago, that fully self-driving robotaxis would be on the roads
        | within a year, that he is a con artist and/or ridiculously
        | over-optimistic about self-driving technology.
       | 
       | But is this why I am seeing this article on Hacker News now? Or
       | am I seeing it now because Musk pissed off a bunch of people who
       | disagree with his political stances?
        
       | maxdo wrote:
        | You are comparing apples to burgers. Mercedes' system attacks a
        | narrow scope (gaps in traffic, approved roads, certain speed
        | conditions, etc.), and they bring a lot of expensive hardware to
        | do it. Tesla is trying to build general AI. Whether they will
        | succeed or not is a debate. But basically Mercedes' strategy is to
        | do something NOW with expensive hardware, where Tesla's strategy
        | is to do way more in the future with cheap, 5-year-old hardware.
        
       | Sohcahtoa82 wrote:
       | One of the comments points out:
       | 
        | > Comparing Tesla's disengagements for its FSD beta, which can
       | drive anywhere, against autopilots that work only on (certain)
       | highways is not very comparable. At least compare highway to
       | highway disengagements.
       | 
       | I own a Tesla, but do not have FSD (Not interested in it at all
       | beyond being a party trick). After 3 years and 20,000 miles, I
       | can say that I've never had to disengage AP because of imminent
       | danger, though I HAVE had many times where the lane is splitting
       | and AP can't decide which way to go, so I had to take control.
       | Likewise, when a lane is merging, it would sometimes jerk left
       | and right a bit and I'd take over. Both cases were more for
       | comfort than safety.
       | 
       | Total, I'd probably average 5 miles per disengagement, but that's
       | just so I can change lanes since the base AP does not include
       | lane changing.
        
         | KyleJune wrote:
         | I have FSD and owned my M3 for the same amount of time. With
         | FSD you get Navigate on Autopilot that handles lane changes on
         | the highway. It has the same issues you see. The lane changing
          | isn't great; it wants to change lanes more frequently than I do
          | and will often try going into the fast lane even though there
          | is significantly faster traffic coming up that I would be slowing
          | down. I often have to disable Navigate on Autopilot to keep it
         | in lane because it will suggest changing lanes, I'll cancel,
         | then a minute later it will suggest it again.
         | 
         | I'm hoping once the two stacks are merged, that will improve.
         | 
          | The other problem I have on highways is I can't really use it
          | in stop-and-go traffic. If I do, it will frequently accelerate
          | too fast from being stopped, then have to brake hard when traffic
         | reaches a stop again. Too jerky for my comfort.
         | 
         | Overall, when I can use it, it makes driving less stressful. I
          | drive 600-mile round trips a few times per year and am able to
          | have it engaged the vast majority of the time on highways.
         | 
         | FSD Beta on city streets requires too frequent overrides for it
         | to be anything more than a party trick at the moment. Even in
         | very rural areas it will have problems of either not changing
         | lanes when it should or changing lanes when it shouldn't.
         | 
         | Auto parking worked great for me initially but something
         | changed and it is never able to detect parking spots for me.
         | The few times it does it just flickers on the screen then
         | disappears, resulting in me not being able to use the feature.
         | I'm not sure if this is because of a problem with sensors or if
         | it's related to me being in the FSD Beta. I've been in the FSD
         | Beta for over a year now.
        
         | Waterluvian wrote:
         | Something interesting to me, as someone with a 2020 base-level
         | Forester, is that you describe my experience completely. It
         | works great on most highways, but needs me when lanes split or
         | merge.
         | 
         | Of course it's not 1:1. AP has other features, such as a much
         | nicer UI that better reports what it thinks it sees, and I
         | think has sexy lane changing features and such.
         | 
          | But it makes me feel that FSD was more a business necessity to
          | keep a competitive advantage against cheap cars. Soon many
         | cheap cars will have AP-like features at comparable fidelities.
        
         | dummydata wrote:
         | I have the exact same experience with AP. I trust it on
         | highways until a merge or lane split is ahead. I think it's so
         | silly that the decision making isn't more robust..
         | 
         | If I can't trust AP with simple road rules, then why bother
         | upgrading to FSD?
        
           | fiddlerwoaroof wrote:
           | FSD handles road rules significantly better: the road up to
           | my house has a complicated five-way intersection with a
           | railroad crossing that EAP could never handle. FSD navigates
            | it perfectly. For my 40-minute commute, there are about three
            | spots where there are predictable road issues that force me to
           | disengage (no safety issue, just the car gets really
           | hesitant) but otherwise it's obviously an upgrade from non-
           | FSD.
        
         | tzs wrote:
         | > I can say that I've never had to disengage AP because of
         | imminent danger, though I HAVE had many times where the lane is
         | splitting and AP can't decide which way to go, so I had to take
         | control.
         | 
         | This suggests a trivia question, which I do not know the answer
         | to: what lane in the US goes the farthest without a split or
         | dead-end? Similar question for other countries.
         | 
         | I'd count most freeway exit ramps as splits from the adjacent
         | lane, but would not count intersections with cross streets as
         | splits unless the two roads intersect at a small angle. It is
         | the actual lane that matters, not its name, so if a lane has a
         | name change when it crosses some political boundary it still
          | counts as the same lane.
         | 
         | My first guess would be it is on some long freeway like I-10.
         | My second guess would be it is on some rural road through the
         | middle of nowhere.
        
           | vl wrote:
            | The US is surprisingly bad at this though: in Germany, you get
            | on the Autobahn and you know the lane you are in is going to go
            | forever. In the US you get on a highway, get to the leftmost
            | lane, it gets to the city, you drive straight and suddenly
           | find yourself in the exit lane on the right? Who designs
           | highways this way?!?!
        
       | defterGoose wrote:
       | I was _brake checked_ by a Tesla while driving home on the 10W
       | the other day. My first thought was,  "what the hell man, you're
       | in the #1 lane without a car in sight in front of you?!". Then I
       | realized that the car had probably disengaged AP and this
       | involved the cruise control disengaging and slowing (i.e.
       | braking) the car.
       | 
       | I had already known that Tesla drivers using AP was a major cause
       | of slow, too-far-following-distance driving (which is why I take
       | every opportunity to cross in front of a Tesla doing 60ish), but
       | this was a whole 'nother level of literally "fuck you, pay
       | attention until you get home, asshole".
       | 
       | /Rant.
       | 
       | P.S. Tesla drivers are absolutely the new Prius drivers.
       | 
       | Edit: and the fact that the behavior of "if there's a car
       | following don't slow abruptly" isn't a 0-th level if loop in the
       | FSD algorithm is completely inexcusable. Shit needs to be
       | regulated _now_.
        
       | kossTKR wrote:
       | Knowing little about how the FSD chip works, is the fact that it
       | runs locally and not on some super cluster a bottleneck in
       | performance right now?
       | 
        | The fact that it's only in the last year or so that
        | Stability/OpenAI and efficient hardware like the aforementioned
        | processors can produce relatively fast and accurate results
        | in the NN sphere makes me wonder if it's just "too early" to have
        | self-driving cars.
       | 
       | ChatGPT still requires a supercluster to run and as a layman one
       | would think video analysis with near zero latency would be even
       | more demanding.
       | 
        | Also, how does it compare to Apple's newer ARM-based
        | architecture and Neural Engine, which as far as I know is a good
        | example of in-house design (not production, of course)?
        
         | ody4242 wrote:
         | Imho, if Tesla could train a huge neural network to demonstrate
          | that self-driving is possible with their SW stack, they would
          | have done that already to let Elon Musk have his road show
          | with its capabilities.
         | 
         | (Running ChatGPT does not require a supercluster, training the
         | model does)
        
         | quenix wrote:
         | I don't think it would be feasible to do cloud analysis on a
         | live video stream in self-driving context. The latency would
         | simply be too high, and a single internet hiccup would result
         | in a dangerous situation. So I think all the models run
         | locally.
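          | 
          | Rough numbers on why (all of these values are assumptions,
          | not measurements):
          | 
          |   speed_mps = 65 * 0.447      # ~65 mph in metres per second
          |   round_trip_s = 0.150        # optimistic cellular round trip
          |   server_inference_s = 0.050  # model time on a remote cluster
          |   
          |   blind_m = speed_mps * (round_trip_s + server_inference_s)
          |   print(f"{blind_m:.1f} m travelled before the answer arrives")
          |   # ~5.8 m per decision, before any dropped packets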
        
       | narrator wrote:
       | As a stock market investor that is purely self-interested in
       | predicting the direction of stocks and who could not give a crap
       | about your opinion on Elon's latest tweet, it's EXTREMELY
       | interesting to see TSLA flip from getting a pass on everything to
       | getting bad PR on everything. It's like a case study in a fall
       | from grace of a company because of political factors.
       | 
       | I feel like a good way to play the market with all this is to
       | look at mainstream media news service articles about TSLA from
       | before and after the Twitter purchase and train a classifier to
       | spot the sentiment difference between the before and after
       | articles. It would serve as an early warning sign that the
       | company has been switched from being boosted to being trashed.
       | 
       | One funny thing is before the Twitter purchase, alt-right outlets
       | like ZeroHedge were constantly mocking Tesla and predicting its
       | imminent demise during its gigantic run up while mainstream
       | outlets were thoroughly praising it. Now, it's the other way
       | around.
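       | 
       | A rough sketch of what I mean, using an off-the-shelf sentiment
       | model (the article lists are placeholders; any decent classifier
       | would do):
       | 
       |     from statistics import mean
       |     from transformers import pipeline
       | 
       |     classifier = pipeline("sentiment-analysis")  # generic pretrained model
       | 
       |     def mean_sentiment(articles):
       |         results = classifier(articles, truncation=True)
       |         # Fold POSITIVE/NEGATIVE labels into one [-1, 1] score, then average.
       |         return mean(r["score"] if r["label"] == "POSITIVE" else -r["score"]
       |                     for r in results)
       | 
       |     pre = ["..."]    # TSLA coverage from before the Twitter purchase
       |     post = ["..."]   # TSLA coverage from after the Twitter purchase
       |     print(mean_sentiment(pre), mean_sentiment(post))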
        
         | bhauer wrote:
         | It truly is fascinating to what degree news coverage is
         | politically biased. Partisanship rules everything.
         | 
         | As you point out, the stark flip-flop of roles between left and
         | right-leaning coverage of Tesla is almost hilarious when
         | observed from any "neutral" point of view. Even in this HN
         | comment thread, the emotional injection of political points
         | throughout is both funny and disheartening.
        
       | sethd wrote:
       | I know that a lot of Tesla fans view FSD as a sort of tip jar to
       | throw extra money towards the cause but I have a feeling
       | regulators and gov attorneys will come to see it quite
       | differently.
        
         | bin_bash wrote:
         | Yeah I would've been one of these fans until I tried FSD
         | yesterday (see my other comment). I suspect now that it's been
         | GA for a week or so a lot of people are going to change their
         | mind like I did.
        
         | ProjectArcturis wrote:
         | I'm kind of amazed there hasn't been more government action on
         | this.
        
       | torartc wrote:
       | Calling it FSD ever should be considered straight up fraud.
        
       | xnx wrote:
       | Good example of the 2nd-order effects of how being "aggressive"
       | on putting self-driving into the hands of drivers can cost lives.
       | Not only can Tesla "F"SD cost lives directly through accidents,
       | it could sour people's opinions of more mature driving systems
       | like Waymo's and delay our shift to safer, automated drivers.
        
       | rootusrootus wrote:
       | I think 'awful' is underselling it a bit here. The fact this is
       | allowed to be in widespread use in the hands of amateur 'beta
       | testers' is horrifying. I'm surprised USDOT hasn't put the kibosh
       | on it yet.
        
       | sshine wrote:
       | I've commuted a bit on the German highways. I only run Autosteer,
       | which is cruise-control + lane-control, and it disengages almost
       | every time there's roadwork.
       | 
       | It is very bad at handling ambiguity when the road has both white
       | and yellow navigation lines. There's a number of scenarios, some
       | of which involve criss-crossing lines, some where they run mostly
       | in parallel.
       | 
       | The car sometimes wants to follow the white lines when the yellow
       | are actually overrides. When the yellow lines are wrong (and
       | should rightfully have been scraped off), this always leads to
       | disengagement. For stretches where the yellow lines are correct,
       | and the older white lines are off by half a meter so that there's
       | effectively not enough space between the white lines and the
       | concrete side, autosteer will sometimes try to switch to the
       | white lines, which would lead to the car smashing into the
       | concrete side.
       | 
       | And sure, autosteer != FSD. But it's a much simpler problem that
       | the Tesla still basically fails at.
       | 
       | Sometimes, in fog, cruise-control will abruptly brake. Very
       | nerve-wracking.
       | 
       | Relying fully on vision has its downside. Listening to Karpathy
       | on Lex Fridman's podcast, it sounds like this was cost-cutting. I
       | can't believe that "nope, that big white thing isn't a wall" is
       | not objectively better navigation.
        
         | cvak wrote:
         | I tried VW (+Skoda), BMW, and Stellantis (Citroen/DS/Peugeot
         | models) "Autosteer" in the past year (changing companies,
         | waiting for a new company car to be built while having Hertz
         | rentals), plus I had autosteer with adaptive CC in a Skoda from
         | 2018.
         | 
         | It has gotten a lot better since the 2018 one, which tried to
         | crash itself on several roadworks; I actually almost scratched
         | the car in the first 100km I had it because of that.
         | 
         | My latest BMW works perfectly fine in German roadworks with
         | bright yellow lanes. Also, all the Stellantis cars I've driven
         | (all 2020+ models) worked fine in those conditions.
         | 
         | The camera system has problems when there's fog or direct sun
         | at certain angles. Radar has problems when there's a lot of
         | rain and rain splash from trucks, or in snowy conditions when
         | the front radar gets frozen.
         | 
         | But it's awesome as assistive system.
        
           | cjrp wrote:
           | Skoda/VW lane assist feels like it would bounce you back and
           | forth between the lane markings if you just let it do its
           | thing; it's more useful as a backup system to stop you
           | crossing the line when you have a lapse in concentration
           | (which is how they describe it, I believe).
        
             | sz4kerto wrote:
             | Yep, its function is basically: if your child distracts you
             | and you don't pay attention for a few seconds, it'll try to
             | prevent you from sliding into the other lane. Which is
             | fine.
        
         | judge2020 wrote:
         | > And sure, autosteer != FSD. But it's a much simpler problem
         | that the Tesla still basically fails at.
         | 
         | They've publicly admitted that the entire autosteer codebase
         | hasn't been touched in over 4 years, since they fully pivoted
         | to getting FSD in a usable level 2 state.
        
         | ajmurmann wrote:
         | This comment made me wonder how they train the car for
         | different sets of traffic rules. Some of this goes beyond
         | formal rules. On most US highways you are technically supposed
         | to overtake on the right, but it's totally common. In contrast
         | in much of Europe this is actually enforced and the convention
         | is that you only leave the rightmost lane while overtaking.
         | 
         | Do you do 100% of training on data from the same region? Are
         | there low-level skills that all translate and don't bring the
         | risk of training contradictory expectations? Can you just
         | annotate training data with the rule set/location that applies?
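         | 
         | Naively I picture every training sample being tagged with its
         | rule set and that tag being fed to the model as an extra input
         | (purely a guess at how it could look, not how Tesla does it):
         | 
         |     # Each recorded clip carries the jurisdiction it came from.
         |     sample = {
         |         "camera_frames": "...",     # raw sensor data
         |         "driver_actions": "...",    # labels to imitate
         |         "rule_set": "DE_autobahn",  # e.g. "US_highway", "DE_autobahn"
         |     }
         |     # With rule_set as a conditioning input, "keep right except to
         |     # pass" can be learned per region instead of being averaged
         |     # across contradictory conventions.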
        
           | D13Fd wrote:
           | > On most US highways you are technically supposed to
           | overtake on the right
           | 
           | You mean the left?
        
             | ajmurmann wrote:
             | Yeah, sorry, I omitted the "not". In practice both seem
             | believable looking at traffic here.
        
             | 9dev wrote:
             | I think they meant "aren't technically supposed to", that
             | makes more sense.
        
           | Barrin92 wrote:
           | when it comes to decision-making there is a lot of hardcoded
           | rules in these systems. They don't learn the speed limits or
           | to stop at red lights. The learning is all in the perception
           | and object recognition but a lot of the behavior is just
           | explicit rules.
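           | 
           | In caricature, something like this sits on top of whatever the
           | perception net outputs (all names invented):
           | 
           |     def apply_hard_rules(light_state, speed_limit_kph, planned_kph):
           |         """Clamp the planner's proposal with non-negotiable rules."""
           |         if light_state == "red":
           |             return 0.0  # stopping for red is a rule, not learned
           |         return min(planned_kph, speed_limit_kph)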
        
         | Zigurd wrote:
         | This is an example of the risks of vertical integration. Some
         | of what Tesla needed/chose to do itself is paying off. More
         | integration into a single electronics system puts Tesla ahead
         | of an industry trend with many unique features.
         | 
         | But rolling your own ADAS _and_ FSD might result in having
         | spent a lot for a result that is inferior to what ADAS
         | suppliers can sell to other car OEMs, with no substantial
         | benefit to the FSD effort, which is much riskier, and with the
         | added, perhaps gratuitous, risk of trying to do FSD with only
         | camera sensors. The risk there is not just that cameras alone
         | are a dicey approach to real-time sensing, but that the data
         | all your cars are collecting is worth a lot less without a
         | LIDAR point cloud providing 3D data.
        
         | dmix wrote:
         | I'm most curious how Teslas autosteer compares to CommaAI (or
         | other offerings). I've been seriously considering letting
         | CommaAIs set of supported cars to try it out.
         | 
         | It'd be cool if there was a yearly competition to rank the
         | basic lane keeping/speed matching/turning. Eventually when it's
         | actually full-self-driving they can rank FSD.
        
           | chroma wrote:
           | This[1] is one of the videos on comma.ai's website.[2] I
           | count at least 10 disengagements. It doesn't even follow the
           | road in some cases.
           | 
           | 1. https://youtube.com/watch?v=NmBfgOanCyk
           | 
           | 2. https://comma.ai/openpilot
        
           | techwizrd wrote:
           | My brother uses CommaAI's openpilot on his Honda HR-V, and
           | it's really quite interesting. It does pretty well where
           | we've tested in Virginia and Maryland, but we haven't
           | compared it head-to-head against Tesla's offering.
        
         | nosianu wrote:
         | Talking about "lane assist" on a German Autobahn - I had a bad,
         | if not worse anecdote with a French (the model) rental car one
         | or two years ago. I did not know that assistant was on, but had
         | I known I would have thought nothing of it.
         | 
         | Because what happened really surprised me: Entering a
         | construction zone _the assistant took the wheel away from
         | me(!!!)_. I was steering manually, normally, but the assistant
         | insisted I follow the previous line in the road, that lead
         | right into the construction zone. Which was behind a concrete
         | divider.
         | 
         | I actually had to _fight_ the  "assistant", with not
         | insignificant use of force on the wheel, to not crash sideways
         | into that divider wall.
        
           | ridgered4 wrote:
           | I swear my car was (gently) fighting my steering here due to
           | the lane keep feature before I turned it off. I frequently
           | drive around potholes and dead animals and don't need the car
           | trying to fix that for me.
        
         | rootusrootus wrote:
         | Bit by bit, it gets harder to figure out how buying a Tesla is
         | ever the right answer. Autosteer (beta) is worse than the
         | competition, traffic aware cruise control is timid to the point
         | of braking for things which don't even exist, you don't get
         | basic things like ultrasonic sensors, radar, rain-sensing
         | wipers, or even carplay. And even when they do include
         | something that sounds halfway cool, it turns out to be
         | underwhelming -- the matrix headlights are all of 100 pixels,
         | which would have been cool about 10 years ago. And they aren't
         | even enabled. Given how poor automatic high beams are, do we
         | really want to see how Tesla matrix headlights would work in
         | real life?
        
           | dsfyu404ed wrote:
           | People will pay more for cars that project images they want
           | to project. This explains pretty much all the car preference
           | patterns for people who have enough money sloshing around
           | that they don't need to purely focus on maximizing utility
           | for cost but who don't have "I don't care what you think"
           | money.
        
           | tzs wrote:
           | > Bit by bit, it gets harder to figure out how buying a Tesla
           | is ever the right answer
           | 
           | Great car or great charging network. Pick one.
           | 
           | If you pick charging network get a Tesla. Otherwise get
           | something else.
        
             | mbesto wrote:
             | This.
             | 
             | I really want to sell my Model S (for like an Audi for
             | example), but no one can compete with its battery:
             | 
             | https://docs.google.com/spreadsheets/d/1k1DOw-
             | NwvW8E8tQeXlac...
        
               | judge2020 wrote:
               | Or rather, the Hummer EV has the battery of two Model S
               | cars! It's just twice as inefficient.
        
             | mmmmmbop wrote:
             | Their charging network is open to all cars, not just Tesla.
        
               | rootusrootus wrote:
               | Not in the US.
        
               | tzs wrote:
               | ...in some countries in Europe [1]. It's not quite there
               | yet in the US [2].
               | 
               | [1] https://www.tesla.com/support/non-tesla-supercharging
               | 
               | [2] https://cars.usnews.com/cars-
               | trucks/features/superchargers-o...
        
               | vel0city wrote:
               | "All cars" is absolutely not true. If I drive up to a
               | Tesla Supercharger with my Mach E in the US, I'd have no
               | way to charge my car. The Tesla charger has a proprietary
               | plug that needs a proprietary app with a proprietary
               | payment network to start the charge.
               | 
               | Tesla chargers _are not_ open, in the slightest. Publishing
               | the physical plug specs online doesn't make them open to
               | everyone. There's still lots of Tesla IP that covers those
               | specs and designs. Tesla won't license it to other
               | automakers unless those automakers essentially agree never
               | to enforce any of their IP.
        
               | mmmmmbop wrote:
               | Thanks for pointing that out. I genuinely was under the
               | impression that it was open to all cars, so I learned
               | something new today. It made me realize I fell prey to
               | Tesla's PR.
        
           | starik36 wrote:
           | > Autosteer (beta) is worse than the competition
           | 
           | Autosteer on our Tesla Model 3 is actually pretty great. I
           | use it all the time because it makes driving a lot less
           | tiring, especially over long distances. I don't know where
           | you get your information.
           | 
           | As for the competition, our other car, a 2022 Kia Forte, now
           | comes with an Autosteer-like feature as well. With that one,
           | unless the lines are pristinely painted, you are literally
           | taking your life into your own hands.
        
             | cyri wrote:
             | same here! i love the autosteer in my Sept 2019 Model 3.
        
       | buildbuildbuild wrote:
       | Ranking by miles per disengagement without context as to which
       | roads the vehicles drove on does not feel like the most useful
       | metric. (On a track? In a well-mapped city? Cross-country on
       | highways?)
        
         | cma wrote:
         | Limiting to a well mapped city should be a point in favor of
         | Cruise and Waymo, they are operating it with more restraint.
         | Normalize for the raw difficulty of the area (city vs highway,
         | etc.), but don't penalize for taking steps to make their
         | operation area safer (hd mapping it).
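         | 
         | Even something as crude as weighting the miles would be better
         | than the raw number (weights invented for illustration):
         | 
         |     DIFFICULTY = {"highway": 1.0, "suburban": 2.0, "dense_city": 4.0}
         | 
         |     def adjusted_miles_per_disengagement(miles_by_road_type, disengagements):
         |         weighted = sum(DIFFICULTY[k] * m
         |                        for k, m in miles_by_road_type.items())
         |         return weighted / max(disengagements, 1)
         | 
         |     # A fleet driving only dense city streets gets credit for the
         |     # harder miles:
         |     print(adjusted_miles_per_disengagement({"dense_city": 10_000}, 5))  # 8000.0
         |     print(adjusted_miles_per_disengagement({"highway": 10_000}, 5))     # 2000.0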
        
       | jmacd wrote:
       | The price for this software is $15,000 USD at the present time.
        
       | ff317 wrote:
       | The main interesting charts they're showing are about
       | disengagement rates, but this is a pretty sketchy comparison,
       | both between manufacturers and over time with Tesla's FSD as
       | well.
       | 
       | "Disengagement" is going to happen for different reasons for all
       | of these different cases. There's the axis that runs from merely
       | inconveniencing others (e.g. "I disengaged because I was holding
       | up traffic by being embarrassingly slow through a turn at an
       | intersection") to the ones where a serious accident was averted
       | (e.g. "I hit the brakes moments before the car caused a head-on
       | collision). There's the scenario differences that feed into
       | thresholds for deciding to disengage: professional safety drivers
       | on planned tasks that have been given a sort of "rules of
       | disengagement", vs everyday normal humans getting groceries and
       | intervening whenever they feel like it for whatever reason.
       | There's the fact that many competitors are operating in tight
       | geofenced, HD-mapped areas, while FSD is operating real-world all
       | over the country.
       | 
       | I agree that Tesla's lack of transparency is troubling here, but
       | this article seems to be trying to pressure them to increase it
       | by taking a dishonest tack and stacking all the unknowns against
       | them in the worst possible way.
       | 
       | They have published basic accident statistics (in the past), and
       | those have generally shown their automation to be a net win. In
       | their own published stats from Q4 21 (latest available on
       | https://www.tesla.com/VehicleSafetyReport ), the comparison they
       | make is this:
       | 
       | * NHTSA data says the whole US averaged one accident per 484K
       | miles.
       | 
       | * All Tesla cars on the road averaged one accident per 1.59M
       | miles (so, ~3.3x better? This could also be caused by the profile
       | of Tesla drivers rather than any real car safety difference -
       | income, where they live, etc)
       | 
       | * Tesla cars with some form of Autopilot engaged averaged one
       | accident per 4.31M miles driven (~2.7x better than non-Autopilot
       | Tesla data, and overall ~9x better than the national average).
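       | 
       | For reference, the ratios those published figures imply:
       | 
       |     us_avg = 484_000       # NHTSA: one accident per 484k miles, all US driving
       |     tesla_all = 1_590_000  # Tesla fleet overall: one per 1.59M miles
       |     tesla_ap = 4_310_000   # Autopilot engaged: one per 4.31M miles
       | 
       |     print(tesla_all / us_avg)    # ~3.3x fewer accidents per mile than US average
       |     print(tesla_ap / tesla_all)  # ~2.7x better than Teslas without Autopilot
       |     print(tesla_ap / us_avg)     # ~8.9x better than the national average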
        
         | ncallaway wrote:
         | Is there a reason there aren't stats for 2022 on the
         | VehicleSafetyReport link?
         | 
         | Is that a typical delay (e.g. were Q1, Q2, Q3 21 reports not
         | available in December of 2021), or have they stopped or
         | suspended publishing the reports?
         | 
         | I'm just not familiar enough with the site to know what is
         | normal.
         | 
         | > Tesla cars with some form of Autopilot engaged averaged one
         | accident per 4.31M miles driven (~2.7x better than non-
         | Autopilot Tesla data, and overall ~10x better than the national
         | average).
         | 
         | Do you know if "accident" encompasses all accidents, or is
         | there a threshold level for severity? I'd be curious if the
         | same patterns hold for "serious accidents" defined as someone
         | (in any vehicle in the accident) being seriously injured or
         | killed.
        
       | ivraatiems wrote:
       | I want to understand the minds of engineers at Tesla - or
       | anywhere - who willingly put this kind of stuff out into the
       | world. How do you convince yourself Tesla FSD is safe enough to
       | ship? Are you just so worried about your boss being mad at you
       | that you don't care? Are you so bought-in that you ignore obvious
       | signs what you're doing is dangerous?
       | 
       | It's engineering malpractice to test this product on public
       | roads, in my view. It's beyond malpractice that the government -
       | at city, state, and federal levels - is allowing it.
       | 
       | (To be clear, I am not against self-driving cars in all
       | instances. I am talking specifically about the Tesla FSD, which
       | has been dangerous since launch and isn't getting better.)
        
         | BLanen wrote:
         | This is what happens when no-one takes their ethics classes
         | seriously because naive "innovation good".
        
         | brandonagr2 wrote:
         | You misunderstand what it does today. It doesn't make the car
         | autonomous. It is a driver assistance system that must be
         | constantly monitored by the driver; it is perfectly functional
         | and helpful if you understand what it does.
        
           | maximinus_thrax wrote:
           | You're right.
           | 
           | However, this is copy-pasted from Tesla's website:
           | 
           | "Tesla cars come standard with advanced hardware capable of
           | providing Autopilot features, and full self-driving
           | capabilities--through software updates designed to improve
           | functionality over time."
           | 
           | Who's to blame for this misunderstanding?
        
         | robotburrito wrote:
         | I wonder if it's the case of, "someone will ship this even if I
         | quit and lose my income, so I may as well do it."
        
           | lamontcg wrote:
           | The individual engineers working there probably make whatever
           | part of the code that they're working on much better--before
           | they get burned out and quit after a year or three and then
           | that part of the codebase rots.
        
         | chadlavi wrote:
         | I would guess the actual engineers know it's not nearly ready,
         | and it's product/marketing people (or senior leadership) who
         | are forcing it to be released too early
        
           | bavila wrote:
           | If the engineers know it is not ready and they still deploy
           | it despite knowing it is reasonably foreseeable that this
           | system will cause injury or death, that is still on them.
           | Unless the product/marketing people are holding the engineers
           | at gun point, the engineers are free to refuse and quit if
           | need be. I have quit jobs (with no prospects lined up) and
           | turned down offers in the past due to ethical objections, so
           | I am not particularly sympathetic to the "just doing my job"
           | defense.
        
         | jfoster wrote:
         | There's been a lot of cars on the road with FSD for quite a
         | while now. You say it's dangerous, but do you have any data to
         | substantiate that? Even a single accident?
        
         | lamontcg wrote:
         | Maybe abusing workers into 80 hour a week death marches really
         | doesn't produce good results?
        
         | jandrese wrote:
         | The even more insane part is that the newest FSD turns off the
         | distance sensors and tries to rely entirely on computer
         | vision, even though it has blind spots and computer vision is
         | one of those fields where success is still measured in
         | statistical deviance. The newest Teslas don't even ship with
         | the sensors anymore. It's like Elon is setting them up to fail.
        
         | wickedsickeune wrote:
         | It's the same way that the news can get biased without
         | corruption. If you disagree with the boss, they can find
         | someone who agrees.
        
           | mzs wrote:
           | Also H-1Bs are kind-of trapped.
        
         | justapassenger wrote:
         | My take - Tesla doesn't really employ people who understand
         | safety (and I don't mean that as a ding for people working
         | there - most software engineers just have no idea what is
         | required to build a safety-critical system). As a result, people
         | likely think they're doing things correctly.
        
         | ff317 wrote:
         | The counterpoint goes something like this (not that I
         | necessarily buy it, but this is what I infer to be Tesla's
         | reasoning):
         | 
         | 1) We're only going to fully "solve" self-driving with ML
         | techniques to train deployed NNs; it can't be done purely in
         | human-written deterministic code because the task is too
         | complex.
         | 
         | 2) Those NNs are only going to come up to the necessary levels
         | of quality with a _ton_ of very-real-world test miles. Various
         | forms of  "artificial" in-house testing and simulation can help
         | in some ways, but without the real-world data you won't get
         | anywhere.
         | 
         | 3) Deploying cars into the real world (to gather the above)
         | without some kind of safety driver doesn't seem like a great
         | path either. There's no backup driver to take over and
         | intervene / unstick the car, and so far driverless taxi fleet
         | efforts have been fairly narrowly geofenced for safety, which
         | decreases the scope of scenarios they even get data on vs the
         | whole real-world driving experience.
         | 
         | 4) Therefore, the best plan to acquire the data to train the
         | networks is to use a ton of customers as safety drivers and let
         | them test it widely on the real routes they drive. This is
         | tricky and dangerous, but if it's not too dangerous and the
         | outcome saves many lives over the coming years, it was worth
         | it.
        
           | ivraatiems wrote:
           | I could see something like this being their logic - maybe not
           | with neural networks/machine learning specifically, but
           | certainly "the only way to get to where we want to go is to
           | do this".
           | 
           | My counter-counter-point would be that there's plenty of
           | other companies that are doing this more safely, and also
           | that ends don't justify the means when those means involve
           | killing pedestrians.
        
             | kjksf wrote:
             | Those other companies are rapidly going bankrupt because
             | the economics of doing it the non-Tesla way seem
             | impossible.
             | 
             | Zoox was bought by Amazon for $1 billion, which seems like a
             | lot, but that was the amount of money invested into the
             | company, so it was sold at cost to Amazon.
             | 
             | Argo.ai just shut down. VW and Ford spent several billion
             | dollars on that.
             | 
             | Drive.ai shut down and was acqui-hired by Apple for the car
             | project that was reportedly just pushed to 2026.
             | 
             | aurora is publicly traded and is on the ropes, reportedly
             | trying to find a buyer before they run out of cash.
             | 
             | We'll see how long GM and Google will be willing to put ~$2
             | billion a year into Cruise / Waymo. I don't see them
             | generating significant revenue any time soon.
             | 
             | Tesla and comma.ai have a model where they make money while
             | making progress. Everyone else just burns unholy amounts of
             | capital and that can last only so long.
        
               | SketchySeaBeast wrote:
               | So we're arguing it's better to offer a FSD that crashes
               | rather than go bankrupt because maybe one day it won't?
        
               | sangnoir wrote:
               | It's worse than that in my reading; the argument is
               | entirely neutral on crashes, the only metric of success
               | presented is not going out of business!
        
               | kjksf wrote:
               | No, I'm arguing that Waymo, Cruise and others following
               | similar strategy will go bankrupt before delivering a
               | working product and Tesla / Comma.ai won't.
               | 
               | As to crashes: the disengagement part of your rebuttal is
               | an implied claim that Waymo / Cruise are perfectly safe.
               | 
               | Which they are not.
               | 
               | FSD has been deployed on 160 thousand cars. No fatalities
               | so far. No major crashes.
               | 
               | Cruise has 30 cars in San Francisco and you get this:
               | 
               | > Driverless Cruise robotaxis stop working
               | simultaneously, blocking San Francisco street
               | 
               | > Cruise robotaxis blocked traffic for hours on this San
               | Francisco street
               | 
               | Another Cruise robotaxi stopped in the muni lane.
               | 
               | Waymo car also stopped in the middle of the road.
               | 
               | Neither FSD or Cruise or Waymo had fatalities.
               | 
               | They all had cases of bad driving.
               | 
               | This is not Safe-but-will-go-bankrupt vs. not-safe-but-
               | won't-go-bankrupt.
               | 
               | It's: both approaches are unsafe today, but one has a path
               | to becoming safe eventually and the other doesn't, if only
               | because of the economic reality of spending $2 billion a
               | year with no line of sight to breaking even.
               | 
               | https://www.theverge.com/2022/7/1/23191045/cruise-
               | robotaxis-...
               | 
               | https://techcrunch.com/2022/06/30/cruise-robotaxis-
               | blocked-t...
        
               | SketchySeaBeast wrote:
               | > As to crashes: the disengages part of your rebuttal is
               | implied claim that Waymo / Cruise are perfectly safe.
               | 
               | I didn't mean to suggest that. I was responding to your
               | words here:
               | 
               | > Tesla and comma.ai have a model where they make money
               | while making progress.
               | 
               | I'm saying that it's not OK for a car company to keep
               | going with dangerous self driving just because it can
               | afford to.
               | 
               | > FSD have been deployed on 160 thousand cars. No
               | fatalities so far. No major crashes.
               | 
               | That doesn't seem to be the case[1]. Though now we're
               | going to squabble about definitions of "major" and also
               | how this reporting is happening.
               | 
               | [1]
               | https://www.latimes.com/business/story/2022-07-14/elon-
               | musk-...
        
               | robotresearcher wrote:
               | That's how we got cars, planes, medicine, bridges, and
               | ... almost everything.
               | 
               | We can't wait for perfection. The question is how much
               | risk are we willing to absorb.
        
               | SketchySeaBeast wrote:
               | If someone told you that they were going to revolutionize
               | bridge building but it was going to take a bunch of
               | catastrophes to get there, how would you feel about it?
        
           | uvdn7 wrote:
           | By this reasoning, shouldn't Tesla pay users instead to
           | enable FSD and collect data for them?
        
             | shagie wrote:
             | A tangent to that thought... "do you want people to be
             | financially incentivized to get into novel situations to
             | test scenarios where FSD was lacking data?"
             | 
             | I recall Waze had some point system to help gather
             | positioning / speed data for side roads that it would try
             | to have you go get with icons... and those were just fake
             | internet points.
        
             | judge2020 wrote:
             | It seems they're doing quite well on their financials by
             | offering access to FSD as a subscription. The misconception
             | here is that FSD is needed for them to collect data - they
             | collect autopilot sensor data on all cars regardless of FSD
             | or not.
        
           | _Algernon_ wrote:
           | >4) Therefore, the best plan to acquire the data to train the
           | networks is to use a ton of customers as safety drivers and
           | let them test it widely on the real routes they drive. This
           | is tricky and dangerous, but if it's not too dangerous and
           | the outcome saves many lives over the coming years, it was
           | worth it.
           | 
           | Maybe you should use specifically trained test drivers, who
           | are acutely aware of the limitations and know how to deal
           | with them, not random people who have been told through
           | intentional snake oil marketing by a billionaire with a god
           | complex who needs to feed his ego that the car can drive
           | itself.
           | 
           | It's insane that governments allow these vehicles on the
           | road.
           | 
           | Also, that kind of the-end-justifies-the-means reasoning has
           | led to a lot of catastrophic developments in history. Let's
           | not go there again.
        
             | dcow wrote:
             | I appreciate being principled about ends not justifying the
             | means. But in my experience this principle is not applied
             | universally by people. It's cherry-picked as what amounts
             | to a non-sequitur when deployed in a discussion. Don't get
             | me wrong, I wish it were a universally held and enforced
             | moral principle, but it's not.
             | 
             | Anyway, the reality is that Teslas are _safer_ than any
             | other car on the market right now, despite the _scary
             | voodoo tech_. So it seems in this case the means are also
             | justified. If auto-pilot and FSD were causing more accidents
             | than humans, we'd be having a different conversation about
             | ends justifying means, I surely agree.
        
           | lostmsu wrote:
           | IMHO, if they are allowed to use the public as a guinea pig,
           | the data they collect should be available for everyone.
        
           | ot wrote:
           | The fact that the other companies which went with a more
           | thoughtful roll-out, delaying the time to market, got a much
           | better track record, is a strong counter-counter-point IMO.
        
             | jlundberg wrote:
             | The thing is that Tesla FSD is trying to solve a different
             | problem than cars driving in geographically limited areas.
             | 
             | Thus comparing the disengagement rates does not make sense.
        
           | doliveira wrote:
           | Yeah, move fast and break things; you can just do beta
           | testing with live, real human beings.
           | 
           | That Tesla is even allowed to do it speaks volumes about the
           | unchecked power Elon's influence wields.
        
           | bentcorner wrote:
           | I feel like you could enable FSD for every Tesla car in a
           | "backseat driver" mode and have it mirror actions the driver
           | does (so it doesn't have control but you're running it to see
           | what it _would_ do, without acting on it), and you watch for
           | any significant diversions. Any time FSD wanted to do
           | something but the driver did something else could have been a
           | real disengagement.
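           | 
           | Sketching that out (the names and thresholds are hypothetical):
           | 
           |     STEER_THRESHOLD_DEG = 15.0
           |     SPEED_THRESHOLD_MPS = 3.0
           | 
           |     def virtual_disengagement(planned, actual):
           |         """planned/actual: dicts with 'steer_deg' and 'speed_mps'."""
           |         steer_diff = abs(planned["steer_deg"] - actual["steer_deg"])
           |         speed_diff = abs(planned["speed_mps"] - actual["speed_mps"])
           |         # A large divergence marks the moment a real driver
           |         # "would have" disengaged.
           |         return (steer_diff > STEER_THRESHOLD_DEG
           |                 or speed_diff > SPEED_THRESHOLD_MPS)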
        
             | bhauer wrote:
             | They had been doing that, and called it "shadow mode" [1].
             | I suspect it's no longer being done, perhaps they reached
             | the limit of what they can learn from that sort of
             | training.
             | 
             | [1] https://www.theverge.com/2016/10/19/13341194/tesla-
             | autopilot...
        
               | judge2020 wrote:
               | When it's in 'real mode', any disengagement or
               | intervention (ie. using the accelerator pedal without
               | disengaging) is logged to the car and sent to Tesla for
               | some data analysis, and this has been a thing for a
               | while. Of course we don't know just how thorough their
               | data science plays into FSD decision making and what
               | interventions they actually investigate.
        
             | gretch wrote:
             | I don't think that would work due to "bad" drivers. We all
             | drive differently than we know we should drive in certain
             | circumstances (e.g. road is completely empty in the middle
             | of rural new mexico)
             | 
             | For example, you can imagine FSD would determine to go
             | straight down a straight lane with no obstacles - that
             | would be the correct behavior. Now imagine in real life the
             | driver takes their hand off the wheel to adjust the radio
             | or AC, and as a result the car drifts over and lightly
             | crosses the lane marker - this doesn't really matter because
             | it's broad daylight and the driver can see there's nothing
             | but sand and rocks for 2 miles all around them. What's the
             | machine conclude?
        
               | kevincennis wrote:
               | I forget who it was (maybe George Hotz) that said
               | something to the effect of "All bad drivers are bad in
               | different ways, but all good drivers are good in the same
               | way".
               | 
               | The point being made was basically that in the aggregate
               | you can more or less generalize to something like "the
               | tall part of the bell curve is good driving and
               | everything on the tails should be ignored".
               | 
               | Since learning happens in aggregate (individual cars
               | don't learn - they simply feed data back to the
               | mothership), your example of a single car errantly
               | turning the wheel to adjust the radio would fall into the
               | "bad in different ways" bucket and it would be ignored.
        
               | gretch wrote:
               | "All bad drivers are bad in different ways, but all good
               | drivers are good in the same way".
               | 
               | I accept that as a plausible hypothesis to work off of
               | and see how far it goes, but I would not bank on it as
               | truth.
               | 
               | I'll give another example, I think a significant portion
               | of the time, people roll through stop signs (we'll say,
               | 25% of the time? intuitive guess). I do it myself quite
               | often. This is because not all intersections are built
               | the same - some intersections have no obstacles anywhere
               | near them and you can tell that duh, there's no cars
               | coming up at the same time as me. Other intersections are
               | quite occluded by trees and what not.
               | 
               | I'm fine with humans using judgement on these, but I
               | would not trust whatever the statistical machine ends up
               | generalizing to. I do not think rolling a stop sign makes
               | you a 'bad' driver (depending on the intersection).
               | Still, if I knew I was teaching a machine how to drive, I
               | would not want it to be rolling stop signs.
        
               | sushisource wrote:
               | The theory would be that this washes out in the noise. It's
               | a simplification, but on average, most of the people most
               | of the time are not doing that - why would it zone in on
               | the rare case and apply that behavior?
        
               | gretch wrote:
               | Well zoning in on the rare cases is the difference
               | between what we (as in society's collective technology,
               | not tesla) have today and full reliable self-driving.
               | 
               | Even in the anecdotes throughout the rest of the comment
               | section, there's a lot of people that said "yeah I tried
               | FSD for a limited period of time and it worked for me".
               | Because we're not saying that taking FSD outside right
               | now will kill you within 5 minutes. We're saying that
               | even if it kills you 0.01% - that's pretty sketch.
               | 
               | The general principle is that all of the drivers that
               | have been recruited to be 'teachers' to the machine are
               | not aware that they are training a machine. As a result,
               | they are probably doing things that are not advisable to
               | train on. This doesn't even just apply to machines - how
               | you drive when you are teaching your teenage child is
               | probably different than the things that you do on a
               | regular basis as an experienced driver. If you are not
               | aware that you are actually teaching someone something,
               | that's a dangerous set of circumstances.
        
             | MontyCarloHall wrote:
             | I believe this is exactly how Comma trains OpenPilot.
        
       | leetharris wrote:
       | I've had FSD for about 3-4 months and it can do the vast majority
       | of my drives without disengagement.
       | 
       | There are some very obvious things it still struggles with like
       | construction and roads without clearly defined lines. But I just
       | don't use it on those routes.
       | 
       | I think Tesla made a mistake trying to take a "boil the ocean"
       | approach that will work literally anywhere instead of focusing on
       | key areas first. It ensures you need to 100% nail it before it
       | can be considered "released" as opposed to releasing with a
       | specific set of criteria to meet.
        
         | misiti3780 wrote:
         | I agree, I have been testing it for a year or so.
         | 
         | 1. I have many, many multi-mile, zero-intervention drives.
         | 
         | 2. It is getting noticeably better every release.
         | 
         | Their video model will be very good in the future if the current
         | release pattern keeps up. Not sure what it will take to be
         | certified by the USG as level 5, but it seems like it will be at
         | the top of this chart in the future.
        
           | croes wrote:
           | Or they will hit a wall with no further progress.
        
             | misiti3780 wrote:
             | not likely for a long time, given they are getting more
             | data every day.
        
               | croes wrote:
               | What if it's not a data problem?
               | 
               | That's the same reasoning as thinking a faster CPU or
               | more RAM would solve the problem.
               | 
               | More data could even lead to worse models.
        
               | jcranmer wrote:
               | Getting more data is useless if it's not the right data.
        
               | rafaelero wrote:
               | They are getting the right data because their users are
               | reporting exactly the instances where it fails.
        
               | croes wrote:
               | Knowing where it fails isn't the same as knowing what is
               | right.
        
           | oliveshell wrote:
           | >Their video model will be very good in the future if the
           | current release pattern keeps up
           | 
           | It appears they're abandoning the video-only approach:
           | 
           | https://techcrunch.com/2022/12/07/tesla-appears-to-be-
           | turnin...
        
             | misiti3780 wrote:
             | sure, but it's still going to use a video model, with
             | additional sensors.
        
       | helf wrote:
       | I just, you know, steer my Volt. With my hands. And eyeballs.
       | 
       | If I need to do something on a drive I get my wife to drive.
       | 
       | I honestly do not understand the obsession with self driving or
       | driver assist.
       | 
       | I drove a Ford with active lane assist and having the car
       | physically move the steering wheel is the worst.
       | 
       | But apparently I'm one of those weirdos who actually enjoys
       | driving.
        
         | [deleted]
        
         | Zigurd wrote:
         | Stopping humans from driving cars could save many lives. A
         | million people worldwide die in road accidents every year.
         | Half are cyclists or pedestrians.
         | 
         | But I tend to believe Gill Pratt, at Toyota, who says he
         | doesn't know how to make level 5 AVs, more than I believe Elon.
        
           | rootusrootus wrote:
           | A third of all fatalities are caused by drunk drivers. That's
           | easier to solve than self-driving cars. Another big chunk are
           | weather related, driving at night, driving while sleepy, etc.
           | The median driver is pretty safe, there are better ways to
           | spend money than building automated car toys for them to play
           | with.
        
             | Zigurd wrote:
             | That may be a US number. Other places that have stricter
             | drunk driving laws, or that culturally have less drinking,
             | can also have much worse road safety.
             | 
             | AVs can also solve other problems like efficiency and
             | surge-ability in public transportation. Not just a toy for
             | the rich.
        
           | Drunk_Engineer wrote:
           | While nobody knows how to make a level 5 AV, we do know how
           | to eliminate road deaths:
           | 
           | https://www.theguardian.com/world/2020/mar/16/how-
           | helsinki-a...
        
         | leetharris wrote:
         | I drive 150+ mile trips all the time. Autopilot is a game
         | changer.
         | 
         | FSD is far less impactful on city streets as I enjoy city
         | driving, but it gets mind numbing on long highway drives.
        
           | rootusrootus wrote:
           | > I drive 150+ mile trips all the time. Autopilot is a game
           | changer.
           | 
           | Ha, AP puts me in situations (especially at merge points) I'd
           | never put myself in. It certainly doesn't make the drive less
           | stressful. Quite the opposite. And my wife won't even let me
           | turn on AP now when she and the kids are in the car (this
           | would be most road trips) because it's startling when the car
           | brakes suddenly for an overpass.
        
         | mplanchard wrote:
         | People are super dangerous in cars, so it's nice to have
         | technologies to make driving safer.
         | 
         | I also enjoy driving, but I love the adaptive cruise control in
         | my Toyota, because it makes the most frustrating kind of
         | driving (lots of traffic on the highway) super easy to deal
         | with. It makes me much less hesitant to make the trip from
         | Austin to San Antonio, because rather than having to be hyper-
         | vigilant for two hours due to constantly shifting stop-and-go
         | traffic, I can let the car control the speed and instead focus
         | more on watching out for crazy drivers.
         | 
         | I am not sure whether I feel the same way about full self-
         | driving. At that point, in my mind, you'd be better off taking
         | a train, since you're very likely to become super
         | distracted/tired with nothing to focus on. That said, if it
         | makes the roads safer in aggregate, I'm not against it.
         | 
         | Given the kind of information we see in this article and other
         | factors, though, I'll never buy a Tesla. I'll wait until it
         | hits the major car manufacturers.
        
           | TheCoelacanth wrote:
           | It is nice to have technology to make driving safer. Maybe
           | someday Tesla can get some.
        
             | brandonagr2 wrote:
             | https://www.tesla.com/blog/model-y-earns-5-star-safety-
             | ratin...
        
           | Der_Einzige wrote:
           | Btw, radar powered adaptive cruise control on my 2017 Lexus
           | is excellent. Isn't jerky, can start from a complete stop
           | again, and is reliable.
           | 
           | I feel bad for Tesla owners who are stuck with shitty
           | cameras. Lex Fridman should have given a way bigger
           | tongue-lashing to Karpathy for being so stupid.
        
           | rootusrootus wrote:
           | > People are super dangerous in cars
           | 
           | That is a very subjective description. One fatality per
           | 100,000,000 miles actually seems pretty good, given how
           | complex driving can be. And even then, that number includes
           | the shitty drivers that account for most wrecks. The median
           | driver is quite good, objectively.
        
             | mplanchard wrote:
             | It is subjective, but almost everything you can compare
             | cars to is less dangerous.
             | 
             | Relative to trains, buses, and airplanes, cars are super
             | dangerous.
             | 
             | Relative to motorcycles, they're safe.
        
               | rootusrootus wrote:
               | Relative to trains & buses (airplanes is a separate niche
               | entirely), cars are a lot more useful too. That is
               | inextricably tied to why they are more dangerous.
        
             | c-cube wrote:
             | Does that also include pedestrians and cyclists killed by
             | drivers?
        
               | rootusrootus wrote:
               | Yes, it does.
        
             | mplanchard wrote:
             | I think it's also important to note that fatalities are not
             | the only way to define dangerous. Plenty of accidents leave
             | people disfigured, badly injured, permanently disabled,
             | with lifelong chronic pain, and so on. While I'd certainly
             | prefer any of that to dying, I'd rather have neither if I
             | could choose.
        
         | Blackthorn wrote:
         | > I honestly do not understand the obsession with self driving
         | or driver assist.
         | 
         | Self driving would be a complete game changer for the disabled.
        
           | beAbU wrote:
           | What benefit does this bring over public transport or
           | something like taxis/ride sharing services?
           | 
           | If I can be allowed to grossly generalize here: would the
           | elderly and the disabled be in a position to afford a vehicle
           | as expensive as this? Would they /need/ a permanent vehicle
           | to take them to wherever they need to be, if they already
           | have everything they need right there (hospice, retirement
           | complex etc)? Remember: I'm grossly generalizing here, but I
           | think my point stands.
           | 
           | The way I see it: I've been making use of FSD vehicles to
           | take me to work and back for almost a decade now. Sometimes
           | even the vehicle arrives right to my front door and takes me
           | directly to my destination. The fact that it's wetware
           | instead of software has no material impact on my experience
           | or expectations. Thus I'll always go for the cheapest option.
           | 
           | In my view FSD is trying to solve a problem that only exists
           | in the mind of the isolated techie who forgets about the real
           | world sometimes.
        
             | Blackthorn wrote:
             | > What benefit does this bring over public transport or
             | something like taxis/ride sharing services?
             | 
             | No offense but do you own a car? Or do you only ever take
             | public transport and taxis / ride-shares all the time?
             | 
             | Being able to do things on your own, when you want to, on
             | your own schedule, especially in places that are not cities
             | and have _much_ worse infrastructure on both public transit
             | and ride sharing, is immensely freeing.
        
           | andrewmunsell wrote:
           | Or, the elderly. How many times have we seen accidents due to
           | a confused individual driving down the wrong side of the
           | street or mistaking the accelerator for the brake?
           | 
           | Tesla may not ever get FSD working with its current approach,
           | but if a competitor can do it (even geofenced like how Waymo
           | and Cruise operate today!) and the economies of scale could
           | kick in to make it more affordable than Ubering everywhere,
           | then it would be a huge benefit to those that are even unable
           | to use public transportation.
        
           | gnu8 wrote:
           | For the disabled people with money anyway. How are disabled
           | people who live entirely from disability subsidies supposed
           | to buy a Tesla?
        
             | Blackthorn wrote:
             | I'm not talking about Tesla, just self driving in general,
             | which is what the comment was asking about. I don't trust
             | Tesla to ever deliver an actual self driving product.
        
         | upofadown wrote:
         | You can't use your smart phone while driving. People tend to
         | use their smart phones anyway which has caused a big increase
         | in distracted driving and the resultant carnage.
         | 
         | What's the world coming to when some schmuck on public transit
         | can do things like play games or even get work done where a
         | driver can not do that? What's worse is that smart phones make
         | public transit a lot more usable and attractive in general.
         | Self driving cars keep the automotive transportation mode more
         | competitive.
        
         | rafaelero wrote:
         | > I honestly do not understand the obsession with self driving
         | or driver assist.
         | 
         | What a lack of imagination.
        
         | lkbm wrote:
         | I'm one of those weirdos who enjoys cycling, and the biggest
         | threat to my life, by far, is humans driving cars. Drunk
         | humans, tired humans, humans eating food, humans otherwise
         | distracted, or humans just being human.
         | 
         | When I think about the people I know who died under age forty,
         | I think it's majority car accidents. At least a plurality.
         | 
         | This holds roughly true in the statistics: car accidents have
         | been at least in the running for leading cause of death for
         | Americans under 30 for years. We need to address this issue,
         | and self-driving cars are rapidly approaching viability as a
         | solution.
         | 
         | Even if we don't eliminate human drivers, having viable
         | alternatives means we can be strict about who drives. Right
         | now, for so, so many people, driving is a necessity, and thus
         | we're unwilling to restrict it to people who are actually good
         | and safe at driving. If you get in an accident, your insurance
         | company will raise your rates, but you keep your license. If
         | you get in another, well...they raise them some more. It's really
         | quite difficult to actually _lose_ your license in the US. My
         | friend had to get a breathalyzer after _multiple_ DUIs, but he
         | got to keep his license through all of it.
         | 
         | If driving were seen as an optional privilege, we could limit
         | it to people who do it safely, and perhaps those people would
         | only do it while sober and alert given the option to use self
         | driving when they aren't.
        
       | pcurve wrote:
       | It's interesting how quickly the narrative has turned against
       | Tesla in recent months, not just against Musk but also with
       | respect to the performance of their cars.
        
       | maxcan wrote:
       | This is relatively consistent with my experience using the FSD
       | Beta in and around Miami with one major caveat.
       | 
       | With two exceptions, all of my disengagements have been "quality
       | of life" disengagements where I disengaged for reasons other than
       | safety:
       | 
       | * to take a different route or get in the correct lane
       | * to be polite / courteous to a pedestrian who wasn't yet in
       |   the crosswalk
       | * to move more quickly through a construction zone or other
       |   traffic irregularity
       | 
       | In none of those cases did I believe that the car was going to
       | cause any risk of an accident; at worst there was a risk of
       | pissing someone off (which, to be fair, in a city like Miami
       | with far more firearms than judgement could be fatal, but not
       | due to the car
       | at all). So yes, these are disengagements, but they weren't
       | dangerous.
       | 
       | There is one case where I do get dangerous disengagements. At the
       | intersection pictured below, when I'm in the position of the
       | white van, Tesla sometimes mistakes a green on the far traffic
       | light (which is for the other road, intersecting at a sharp
       | angle) for my own signal and proceeds even though my light is
       | red. It has happened twice; I haven't noticed it on the latest
       | version yet. Intersection here:
       | https://www.google.com/maps/@25.750348,-80.2061284,3a,75y,26...
        
         | ccorda wrote:
         | The original data source of all this breaks this out into
         | "Critical Disengagements" (CDE) and Non-Critical:
         | * Critical: Safety Issue (avoid accident, taking red light,
         |   unsafe action)
         | * Non-Critical: Non-Safety Issue (wrong lane, driver courtesy,
         |   merge issue)
         | 
         | That both Fred from Electrek and Taylor from Snow Bull ignore
         | this distinction shows me their intent is less than neutral.
         | 
         | Looking at the original data sources [1], FSD seems to be
         | improving over time at the metric that matters most:
         | * City miles per CDE have gone from ~50 to ~120.
         | * % of drives over 5 miles with no CDE have gone from 72% to
         |   93%.
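         | 
         | For context, here is a minimal Python sketch of how those two
         | figures could be computed from self-reported drive logs; the
         | field names and sample values below are illustrative
         | assumptions, not the tracker's actual schema:
         | 
         |   # Each record: city miles driven on one FSD drive and the
         |   # number of critical disengagements (CDE) during it
         |   drives = [
         |       {"miles": 12.4, "cde": 0},
         |       {"miles": 3.1,  "cde": 1},
         |       {"miles": 8.7,  "cde": 0},
         |   ]
         | 
         |   # Miles per critical disengagement (assumes at least one CDE)
         |   total_miles = sum(d["miles"] for d in drives)
         |   total_cde = sum(d["cde"] for d in drives)
         |   miles_per_cde = total_miles / total_cde
         | 
         |   # Share of 5+ mile drives with no critical disengagement
         |   long_drives = [d for d in drives if d["miles"] > 5]
         |   clean = sum(1 for d in long_drives if d["cde"] == 0)
         |   pct_clean = 100 * clean / len(long_drives)
         | 
         |   print(f"City miles per CDE: {miles_per_cde:.1f}")
         |   print(f"% of 5+ mile drives with no CDE: {pct_clean:.0f}%")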
         | 
         | I've been pleasantly surprised that human oversight while the
         | software improves seems to be a viable approach. FSD doesn't
         | appear particularly close to Waymo/Cruise at the moment, but
         | it's not as if they are crashing left and right (you'd
         | certainly hear about it if they were).
         | 
         | Personally, I have it and don't find it enjoyable to use -- but I
         | also don't feel unsafe when using it. Highway autopilot on the
         | other hand I find immensely reliable and valuable.
         | 
         | [1] https://www.teslafsdtracker.com/
         | https://twitter.com/eliasmrtnz1
        
           | lastofthemojito wrote:
           | It's sort of meta to this whole discussion, but it has been
           | interesting to see "journalists" like Fred Lambert and the
           | evolution of their coverage.
           | 
           | Fred was an early Tesla fanboy and investor; he used to gush
           | over Tesla and Elon Musk in his writing at Electrek. It
           | seemed like most of the time when Tesla was mentioned in an
           | article, he also mentioned TSLA, the stock ticker...but he
           | didn't seem to do that as regularly for, say, Ford or GM.
           | 
           | It was rumored that Fred had a direct line to Elon and there
           | were also occasionally public Twitter interactions between
           | the two, but Fred gradually became more critical of Elon,
           | especially over FSD stuff. After one critical article, Elon
           | blocked Fred on Twitter: https://twitter.com/fredericlambert/
           | status/14176561698297487...
           | 
           | Fred certainly hasn't forgotten about it: https://twitter.com
           | /FredericLambert/status/15186308301381795...
           | 
           | So, Tesla fanboy/blogger comes up with some valid criticism,
           | Elon cuts him off, which in turn probably makes him more
           | critical of Tesla/Elon.
        
       | naillo wrote:
       | Feels great seeing all these reputational attacks on the one
       | guy making the biggest impact on the coming climate disaster.
        
         | izzydata wrote:
         | Tesla isn't even the largest manufacturer of electric vehicles
         | and their impact on climate change is not as great as it seems
         | like it should be. Not to mention that launching rockets into
         | space is like driving a million gasoline vehicles.
        
         | yks wrote:
         | Same guy fueling proof-of-work cryptocurrency mania? It is not
         | entirely clear to me which direction his net contribution to
         | carbon levels will end up going.
        
       | adoxyz wrote:
       | I was one of the first people to get FSD Beta access and have
       | given it a try every single time they pushed an update, and
       | honestly, it's unusable and dangerous.
       | 
       | The car just does not behave like a regular driver in any
       | capacity. It's a neat trick to show when there is nobody on the
       | road, but besides that, I have lost all faith in FSD ever coming
       | out in any meaningful way. I only paid $2k for FSD ($7k total w/
       | Enhanced Autopilot), but even that is too much for what FSD
       | actually is.
        
       | aidenn0 wrote:
       | "Don't look at the data, give me lots of money to try it
       | yourself" is something you expect to hear from a cult leader, not
       | an engineer.
        
       | seshagiric wrote:
       | First-time Tesla owner here, and I recently had the opportunity
       | to experience their full self-driving feature. My findings:
       | 
       | 1. When it works, it's just great. Maintaining lanes, keeping to
       | the speed limit, holding distance to other cars, auto park and
       | summon are all delightful.
       | 
       | 2. Auto park in my home garage is a bit too slow to use, but
       | parallel parking is cool.
       | 
       | 3. Even without radar, it worked on most roads and in different
       | lighting conditions.
       | 
       | 4. Random speeding up / slowing down is definitely present. Not
       | to the point of jamming into the vehicle in front of you, but
       | the sudden acceleration can surprise you. More dangerous was
       | random slowing down at traffic signals, even when green, and at
       | traffic circles.
       | 
       | 5. Overall it's definitely good enough to show the promise of
       | self-driving. They are not there yet, but I would place them
       | among the most advanced considering the range of functionality.
       | 
       | 6. Still, the $15k is simply way too much. You are better off
       | with the occasional $200-per-month subscription, e.g. for a long
       | trip.
        
       | omgomgomgomg wrote:
       | Seems like they are in an irreversible development hell of bugs.
       | Probably the whole software and hardware infrastructure was
       | badly planned and/or implemented.
       | 
       | If they are developing it and it is getting worse, that simply
       | means there is a lot of technical debt and more still to
       | surface. The guy leading it left as soon as his stock vested;
       | why not stick around and write history?
       | 
       | It probably all went wrong when Waymo parted ways with them and
       | they thought they would fork the code and then improve it.
       | 
       | This also means that new devs and engineers can't make sense of
       | the code and product; we have all inherited code bases like
       | these, haven't we?
       | 
       | I've seen many such people who deploy much less than an MVP and
       | then eventually face the consequences.
       | 
       | All the MVPs out there that survive their beginnings at least
       | have a solid software architecture.
        
       ___________________________________________________________________
       (page generated 2022-12-14 23:01 UTC)