[HN Gopher] Honda's now selling the first production car with le...
       ___________________________________________________________________
        
       Honda's now selling the first production car with level 3 self-
       driving
        
       Author : nradov
       Score  : 304 points
       Date   : 2021-03-05 03:03 UTC (19 hours ago)
        
 (HTM) web link (www.thedrive.com)
 (TXT) w3m dump (www.thedrive.com)
        
       | ianai wrote:
       | Is anyone concerned with the possibility of a hacker taking over
       | such vehicles?
        
       | speedgoose wrote:
       | The interior design is old fashioned and very bland on this Honda
       | A8 "flagship". It doesn't look like the future but more like the
       | past.
       | 
       | The Honda E interior is a lot more interesting in my opinion. The
        | Honda E may not have level 3 self-driving in traffic jams yet, but
       | it does have a virtual aquarium.
        
       | callesgg wrote:
       | [X] Doubt.
        
       | fblp wrote:
       | > "Honda claims Sensing Elite was tested in 10 million unique
       | simulated scenarios and 800,000 miles of real-world testing
        | before it decided the tech was ready for primetime"
       | 
       | This is not much compared to Tesla's 3 billion autopilot miles
       | and Waymo's 20 million driven miles.
        
         | minikites wrote:
         | Is quantity the only metric?
        
           | numpad0 wrote:
           | Would you tolerate a probabilistic answer to that?
        
           | testrun wrote:
            | With visual deep learning, quantity is absolutely crucial.
           | 
           | Quality of course matters too, but the more information you
           | have in different environments, the better.
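
            (A minimal sketch of why quantity matters: empirically, test
            error for deep networks tends to fall roughly as a power law
            in dataset size, with diminishing returns. The constants
            below are invented for illustration.)

                # Toy scaling-law-style curve: error = floor + a * n^(-b)
                def estimated_test_error(n, floor=0.02, a=5.0, b=0.35):
                    return floor + a * n ** -b

                for n in (10_000, 100_000, 1_000_000, 10_000_000):
                    print(f"{n:>12,} examples -> "
                          f"est. error {estimated_test_error(n):.3f}")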
        
             | rootusrootus wrote:
             | Tesla isn't capturing everyone's autopilot miles. They're
             | not even capturing incidents. They don't have anywhere near
             | enough bandwidth for that.
        
           | arthurcolle wrote:
           | With supervised learning data set size.. ? Uhh
        
           | andrewnc wrote:
           | For deep learning it seems to be....
        
         | selcuka wrote:
          | True, but the results of an experiment are important too.
          | Maybe they reached a satisfactory level of confidence more
          | quickly than other manufacturers.
        
       | deagle50 wrote:
       | I would be pleasantly surprised if it's better than OpenPilot.
       | I'm 99% hands off on the highway, 50-75% in the burbs.
        
         | arthurcolle wrote:
         | Tesla Model X ?
        
           | Daho0n wrote:
           | Tesla can't do that (I have no idea if openpilot can either).
        
           | Issaclabs wrote:
           | Openpilot by comma.ai is what the comment refers to
        
           | deagle50 wrote:
           | 2015 Genesis with C2 running OpenPilot.
        
       | marshmallow_12 wrote:
        | The supersonic Eurofighter Typhoon has autopilot. Are self-
        | driving cars more complex than the most advanced airplanes in
        | existence?
        
         | Toutouxc wrote:
         | Typhoon's only flight inputs are basically: desired pitch rate,
         | desired roll rate, desired yaw rate, desired power output. The
         | whole airplane is completely controlled by a computer (all of
         | the control surfaces, engine controls), which gives you an
         | artificially stable, crazy overpowered, laws-of-aerodynamics-
         | defying video-game-like toy to navigate in a huge and
         | completely empty 3D environment. So, yes, self driving cars are
         | infinitely more complex.
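
          (A toy sketch of the rate-command scheme described above: the
          pilot commands a rate, and the flight computer drives the
          surfaces to track it. The gain and the one-line "airframe"
          response are invented; an illustration, not flight software.)

              # Toy rate-command loop: track a commanded roll rate.
              def simulate_roll_rate(cmd_dps, steps=50, dt=0.02, kp=2.0):
                  rate = 0.0  # current roll rate, deg/s
                  for _ in range(steps):
                      error = cmd_dps - rate
                      aileron = kp * error        # proportional control
                      rate += aileron * dt * 5.0  # crude airframe model
                  return rate

              print(simulate_roll_rate(30.0))  # converges toward 30 deg/s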
        
         | rodgerd wrote:
          | How often does the Typhoon routinely travel within a few
          | metres of humans, or of other vehicles whose performance
          | varies from "the same" to "a fraction of your speed"?
         | 
         | How many traffic intersections are there in the sky?
         | Roundabouts? Wildlife? Domestic livestock?
        
         | jjmorrison wrote:
         | Yes - by several orders of magnitude.
        
       | jackson1442 wrote:
       | Slightly o/t but I think it's interesting that in the imagery for
       | this supposedly futuristic car we see a user interface filled
        | with 3d gradients etc., design elements you might have
       | expected several years ago in the era of the serif Google logo
       | and iOS 6.
       | 
        | You'd think that in a car, with its limited display
        | size/resolution and the glanceability required, the UI/UX
        | designers would spring for a flatter, colorful option so you can
        | simply touch the orange square for music or something along those
        | lines.
       | 
       | Not that touch controls in a car are a great UX in the first
       | place, but you'd think they would have moved past the "everything
       | is the same colored 3d rect with a glyph in the middle" phase of
       | UI development.
        
       | cachvico wrote:
       | Let's hope nobody will lose their head over this.
        
       | bearjaws wrote:
       | 100 cars is not a production car.
        
       | acd wrote:
       | "While the vehicle is under the control of the system, the driver
       | can watch television/DVD on the navigation screen or operate the
       | navigation system to search for a destination address, which
       | helps mitigate driver fatigue and stress while driving in a
       | traffic jam."
       | 
       | "Please do not overestimate the capabilities of each Honda
       | Sensing Elite function and drive safely while paying constant
       | attention to your surroundings. Please remain in condition where
       | you can respond to the handover request issued by the system, and
        | immediately resume driving upon the handover request."
       | 
       | How can you immediately respond and pay constant attention if you
       | are watching a DVD?
        
         | Karawebnetwork wrote:
          | I am guessing that it's more about being alert enough to take
          | back the wheel once the car starts beeping and displaying that
          | the sensors are no longer able to distinguish the road.
          | Hopefully this would also pause the media player.
         | 
         | I have the 2019 Accord with the regular Honda Sensing and this
         | happens in multiple situations. For example, the car will turn
         | off automatic cruise control if the weather becomes too severe.
         | Snow can also obscure the sensors in winter.
         | 
          | Some roads are also too damaged (no lines, no barriers) for
          | the lane keep assist to work, and you need to control the
          | steering manually when this happens. But that is probably not
          | an issue for the Honda Sensing Elite.
        
       | nkoren wrote:
       | Self-driving is a Hard Problem. I'm sure that there's an
       | incredible amount of sophistication going into a system like
       | this, and I have sincere respect for all the engineers working on
       | it.
       | 
       | Nonetheless, o, the hype! Some caveats:
       | 
       | 1. 100 vehicles is not "production."
       | 
       | 2. "Please ... drive safely while paying constant attention to
       | your surroundings." is not Level 3 Automation. The point of Level
       | 3 automation is explicitly that it should allow the driver to
       | _not_ pay attention to their surroundings under limited
       | circumstances, while the vehicle fully does the driving. The
       | human remains the fallback mode: they must stay awake and alert
       | so that they can take over control if the vehicle requests it.
       | However they are NOT required to maintain a constant supervisory
        | function. If they are, then that's Level 2 automation.
       | 
       | The problem with Level 3 has always been that it requires the
       | vendor to take liability for unsupervised autonomous driving
       | (like levels 4 and 5), but only some of the time, and the driver
       | to take liability the rest of the time. The handover from one to
       | the other is fraught, with the humans almost certainly being the
       | weak link, since they're not all THAT good at staying alert in
       | the first place, and get much much worse when they don't have
       | supervisory duties to keep them awake. For this reason, many
       | vendors have (wisely, IMHO) chosen to skip Level 3 automation
       | entirely.
       | 
       | So my synopsis of this would be:
       | 
       | 1. Honda's engineers have come up with some genuinely interesting
       | new capabilities for autonomous driving, albeit falling well
       | short of what would be needed for full Level 5 autonomy.
       | 
       | 2. Honda's management would like to convert those innovations
       | into some good PR for the company.
       | 
       | 3. Honda's lawyers, actuaries, and human-factors people have
       | realised that there's no way in hell mass deployment of Level 3
       | automation is a good idea, and so to limit the potential damage,
       | are only allowing 100 vehicles to be shipped, to pre-qualified
       | and well-trained buyers, and then furthermore attaching
       | disclaimers which knock this down to L2.
       | 
       | 4. L5 Autonomy remains a long ways away.
        
       | hasa wrote:
        | Why don't we have automated roads instead? Containers running in
        | scheduled slots.
        
         | mperham wrote:
         | We call them buses.
        
         | amelius wrote:
         | You mean like railways?
        
         | jfoster wrote:
         | I think this would be a great way to bring some automation to
         | some roads.
         | 
         | Tesla/Boring almost did it when they were considering having
         | electric sleds moving cars through Boring tunnels.
         | 
         | If there's a schedule, pre-planned route, and a fenced off
         | road, there's barely any need for fancy sensor suites & AI. The
         | most difficult part of it is probably that the road then
         | couldn't be used by all types of vehicles.
        
       | etaioinshrdlu wrote:
       | Do we know what compute stack they are using for this? What types
       | of processors and software?
        
       | ajhurliman wrote:
       | I'm not sure I trust the tech of a company that's asking me to
       | watch a DVD in 2021.
        
         | bootlooped wrote:
         | Well, to be fair you're probably not going to be on wifi while
         | driving, so maybe a DVD isn't such a bad idea.
        
           | thekyle wrote:
           | I think lots of modern cars (especially the high-tech and
           | expensive ones) have in-car Wi-Fi.
        
             | M2Ys4U wrote:
             | That's to supply wifi to passengers. The car itself needs
             | an uplink, and that'll be a cellular one
        
       | gok wrote:
       | > conditional automation, which means a car can read its
       | environment and make decisions based on what it sees
       | 
       | It's frustrating how technically illiterate the press is.
        
       | SenHeng wrote:
        | What's not mentioned in all the other English-language articles
        | is that Honda's system only works on highways at speeds _below_
        | 50 km/h. It's a system specifically designed for dealing with
        | traffic jams on highways, and is not comparable to, say, Tesla's
        | autopilot.
       | 
       | https://asia.nikkei.com/Business/Automobiles/Honda-launches-...
       | 
       | > _It can free drivers from driving in congested traffic on an
       | expressway when travelling slower than 50 kilometers per hour._
       | 
       | > _The system automatically accelerates, brakes and steers while
       | monitoring the vehicle 's surroundings, using data from high-
       | definition mapping and external sensors._
        
         | ckastner wrote:
         | That significantly reduces the risk of fatalities.
         | 
         | Even a head-on collision (as we've seen with Teslas) with a
         | concrete pylon should be survivable at those speeds.
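
          (The physics behind that: crash energy scales with the square
          of speed, so a 50 km/h impact carries a quarter of the energy
          of a 100 km/h one. A quick back-of-the-envelope check, assuming
          an 1,800 kg vehicle:)

              # Kinetic energy E = 0.5 * m * v^2
              mass_kg = 1800  # assumed mid-size sedan
              for speed_kmh in (50, 100):
                  v = speed_kmh / 3.6  # km/h -> m/s
                  energy_kj = 0.5 * mass_kg * v ** 2 / 1000
                  print(f"{speed_kmh} km/h -> {energy_kj:.0f} kJ")
              # 50 km/h -> ~174 kJ; 100 km/h -> ~694 kJ (4x as much)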
        
         | glaucon wrote:
         | What's weird is that the accompanying video (embedded in the
         | same page and another one here
         | https://youtu.be/PGLBiORNgOE?t=53) clearly shows aspects of the
          | system being used at speeds well over 50 km/h.
        
       | yholio wrote:
       | > It does all this with zero input from the driver, who Honda
       | says can "watch television/DVD on the navigation screen or
       | operate the navigation system to search for a destination
       | address."
       | 
       | So another batch of dangerous "half-self-driving" cars is hitting
       | the streets, encouraging drivers to disengage from the wheel but
       | ignoring the fact that distracted people tend to remain
       | distracted much longer than their automated car desires.
       | 
       | There is no such thing as "Level 3 automation", because driving
        | is a holistic intellectual endeavor that cannot be
        | compartmentalized. You cannot drop a distracted driver into a
        | life-or-death traffic situation and expect them to correct
        | course efficiently.
       | 
       | Driving includes route planning, optical recognition of static
       | and dynamic road conditions, trajectory prediction of other
        | traffic participants, as well as being aware of the dynamic
       | parameters of your own vehicle and predicting how your inputs
        | will reflect in your future trajectory. The brain is essentially a
       | giant prediction machine, as explained yesterday by an article
       | here on HN: https://news.ycombinator.com/item?id=26341218
       | 
       | This is a continuous loop where an attentive driver maintains
        | state information over a long time window, from seconds to
        | hours: this is a slippery and curvy road section, a certain road
       | sign was spotted a mile ago, the gray Honda following drives
       | erratically, there is a kid with a ball and a cat on the
        | sidewalk, the wheel is slightly tilted to the right to match the
        | trajectory.
       | 
       | There is no way to partition this activity into "simple" self-
        | driving tasks that can be automated and "complex" things a human
        | can drop in and do. It's all or nothing. Once the driver looks
        | away from the road for half a minute, he has lost the critical
        | state information that affords him human-level driving accuracy,
        | and it would take minutes to warm up again, including a gradual
        | increase of control authority. His role in a "Level 3 self-
        | driving car" is essentially that of the monkey who takes the
        | blame.
        
       | jjmorrison wrote:
       | If I read this correctly, the point being made is that Honda got
       | approval to allow drivers not to look at the road when in a
       | traffic jam, which makes it technically level 3 when the car is
       | in a traffic jam?
       | 
       | It's a bit of an overstatement to say Honda is leading self-
       | driving. This seems like a great regulatory win in Japan, but not
       | a very interesting technical achievement.
        
         | goshx wrote:
          | This is a paid ad. "Leading" by doing less than the
          | competition and making only 100 cars available. This article
          | is a joke. Kudos to Honda, but this ain't leading.
        
           | kaba0 wrote:
            | As if Tesla were not surrounded by the most ridiculous
            | marketing attempts.
        
         | dmingod666 wrote:
         | My industry leading AI can make sure the car remains motionless
         | when it's parked. It doesn't even need a driver to be present.
         | ;)
        
         | refulgentis wrote:
         | That is a _very_ good way of putting it, thank you, clarified
         | my similar sense that something was 'off'
        
         | jliptzin wrote:
         | It reads like something that someone at a hedge fund put out
         | right after shorting Tesla
        
       | akg_67 wrote:
        | Are these Honda cars using Cruise's self-driving system? I
        | thought they had a partnership for development, and that Cruise
        | cars were supposed to enter Japan in 2021.
        | 
        | Only 100 Honda cars with the Level 3 system seems like a
        | marketing gimmick.
       | 
        | Here is a Toyota demo from May 2020 on a Tokyo metropolitan
        | expressway, showing entry, lane changes, and overtaking, with a
        | reporter:
       | 
       | https://toyotatimes.jp/en/chief_editor/027.html
       | 
        | The Toyota page also has a better illustration of the different
        | levels of driving automation.
        
       | sychoptah wrote:
        | Full self-driving, right? Like you need 3 people remotely
        | operating that vehicle 24/7: at 8 working hours each, that's 3
        | people per day. Wink wink (inside joke for the Honda guys).
        | 
        | Another one of my ideas: do you know why Musk is building
        | Starlink? To operate a fleet of cars remotely all over the world
        | using remote operators, and to avoid paying telco operators for
        | connectivity. With LEO orbits, latency is a piece of cake. The
        | only question is the trustworthiness of those operators. The
        | price for auto-driving will be the payments to operators,
        | calculated from each country's GDP, paying operators the local
        | "minimum" wage. Problems with unemployment "solved".
        
       | Bluestein wrote:
       | > This allows a car equipped with a Level 3 system like Honda
       | Sensing Elite to act on its own accord (no pun intended)
       | 
       | Heh ...
        
       | potatochup wrote:
       | At least from this video [1], it looks like it has the following
       | features:
       | 
       | - hands-free overtaking of slower vehicles when at highway speeds
       | 
       | - a "traffic jam mode" that is hands-off up to a certain speed,
       | and plays a video on the center display
       | 
       | This seems... not that impressive? I haven't seen many real
       | reviews yet though, just a bunch of press releases filmed on
       | closed courses.
       | 
       | [1] https://www.youtube.com/watch?v=_hBwmFbpNCA
        
         | Geee wrote:
         | Yeah, it's like a high school science project compared to
         | https://www.youtube.com/watch?v=7psq48HE-QQ
        
         | BoorishBears wrote:
          | If it's not impressive, it's because a certain player in the
          | field has distorted the standards of rigor for self-driving
          | for so many people with "Autopilot".
          | 
          | The fact that you can watch a video while it's active is a
          | tiny difference to the driver (especially since some people
          | are already watching videos using LKA), but a HUGE difference
          | in terms of what the system is doing: the system is driving
          | with full responsibility on Honda.
          | 
          | To the point that instead of seeing existing LKA as a step
          | behind this, you might as well see them as existing on
          | different planes, where progress towards A does not affect
          | progress towards B meaningfully.
        
           | potatochup wrote:
           | > the system is driving with full responsibility on Honda
           | 
            | Yeah. I guess we'll have to wait for real reviews to see
            | under what conditions this operates (or more importantly,
            | fails to operate and hands control back to the user).
        
         | cecja wrote:
         | not that impressive?
        
       | timwaagh wrote:
        | Honda owners will seek out traffic jams so they can watch
        | Netflix on the boss's time.
        
       | TheRealSteel wrote:
       | "Just 100 cars will be made available with the technology in
       | Japan, and they will cost the equivalent of $101,900. None of
       | these Legends will make their way to the U.S."
       | 
       | I'm assuming if it won't come to the US it won't come to the UK,
       | Canada or Australia either.
       | 
       | Is 100 units really a "production car"? I don't agree.
       | 
       | Also:
       | 
       | "It does all this with zero input from the driver, who Honda says
       | can "watch television/DVD on the navigation screen or operate the
       | navigation system to search for a destination address."
       | 
       | and yet:
       | 
       | "Please do not overestimate the capabilities of each Honda
       | Sensing Elite function and drive safely while paying constant
       | attention to your surroundings. Please remain in condition where
       | you can respond to the handover request issued by the system, and
       | immediately resume driving upon the handover request."
        
         | atty wrote:
         | This mixed messaging of "you don't need to be looking at the
         | road!", "but you should be looking at the road." Is going to
         | lead to more deaths. As someone who works for one of the major
         | global car companies, I'm very concerned for how breathlessly
         | we (as an industry) talk about things like "autopilot" instead
         | of "driver assist", and a million other flashy marketing terms
         | that make it difficult to just understand what the system is
         | and is not capable of handling.
        
           | cycomanic wrote:
            | I blame this largely on Tesla essentially driving the
            | industry like a herd of cattle. This is/was great when it
            | came to accelerating the development of electric cars, and I
            | realise that the software development community really likes
            | the "move fast and break stuff" mantra, but there are
            | reasons why engineering fields with chartered engineers have
            | processes like FMEA: people die if things fail.
           | 
            | A friend who is an engineer at a premium car manufacturer,
            | working on sensors for self-driving, tells me that the
            | internal policy inside the manufacturer is that the only
            | competitor that matters is Tesla, so they only compare
            | themselves to Tesla. The same friend also believes that real
            | self-driving is still many years out; even sensors that can
            | deal with general weather conditions do not exist yet.
        
             | [deleted]
        
             | kolinko wrote:
             | How did your friend arrive at the conclusion that we don't
             | have good enough sensors?
             | 
             | IMHO AI to analyse the data is what's missing - we have
             | cameras that are just as good as human eyes, so we know for
             | a fact that the current sensors are enough to drive in
             | general weather conditions.
        
               | KaiserPro wrote:
               | Cameras don't give you depth.
               | 
               | You have two choices:
               | 
                | o have a sensor that gives you depth directly (i.e.
                |   laser, radar, or lenticular array)
                | 
                | o try to infer depth from stereo cameras
               | 
                | Lidars are expensive and power-hungry. Radar is cheap
                | and mature, but doesn't give you anywhere near as much
                | resolution.
                | 
                | Lenticular only has a tiny range. There are other time-
                | of-flight sensors, but they are either not production-
                | ready, expensive, or both.
               | 
                | What Tesla have chosen to do is kinda try to merge radar
                | and monocular object detection to give a higher-frame-
                | rate depth estimation of _objects_. However, it's
                | expensive to develop, unreliable, and terrible in corner
                | cases (i.e. if it sees an unknown object, it can't place
                | its depth). Humans can do this because we've had years
                | of training; Tesla can't.
               | 
                | Tesla's fancy cruise control is dangerous. The
                | autonomous driving stuff it's trying to develop is even
                | more dangerous. Instead of finding the sensor and
                | processing spec needed to drive safely, they are trying
                | to use the sensors and GPU they already have. It's not
                | going to work and is unsafe.
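
                (A minimal sketch of the triangulation behind stereo
                depth, with invented camera parameters; it also shows why
                stereo degrades with range, since depth error grows with
                the square of distance:)

                    # Stereo triangulation: Z = f * B / d. One pixel of
                    # disparity noise costs roughly Z^2 / (f * B) metres.
                    focal_px = 1000.0   # assumed focal length, pixels
                    baseline_m = 0.3    # assumed camera separation, m
                    for disparity_px in (60.0, 6.0, 3.0):
                        depth_m = focal_px * baseline_m / disparity_px
                        err_m = depth_m ** 2 / (focal_px * baseline_m)
                        print(f"{disparity_px:4.0f} px -> {depth_m:6.1f} m "
                              f"(~{err_m:.2f} m per px of noise)")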
        
               | saalweachter wrote:
               | > Now humans can do this, because we've had years of
               | training...
               | 
               | Humans can _mostly_ do this. Like 90% of UFO sightings
               | are people seeing the moon and getting confused and
               | thinking it is a nearby object following them.
        
               | ricardobayes wrote:
               | There are surprisingly sophisticated (and precise)
               | machine learning algorithms for mono-camera depth
                | estimation. It's not a problem anymore. Furthermore,
                | most companies use lenses of various focal lengths. The
                | real problem lies in sensor fusion: which sensor's input
                | do you believe most, especially if one or more is
                | occluded/dirty?
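
                (One textbook answer to "which sensor do you believe":
                weight each estimate by the inverse of its variance, so
                noisier or occluded sensors get less say. A minimal
                sketch with invented numbers; real stacks use
                Kalman-style filters over time.)

                    # Inverse-variance fusion of distance estimates.
                    def fuse(readings):
                        # readings: list of (estimate, variance) pairs
                        weights = [1.0 / var for _, var in readings]
                        weighted = sum(w * est for (est, _), w
                                       in zip(readings, weights))
                        return weighted / sum(weights)

                    radar = (41.0, 0.5)   # metres; low variance
                    camera = (45.0, 8.0)  # noisier here (e.g. glare)
                    print(fuse([radar, camera]))  # ~41.2, near the radar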
        
               | amelius wrote:
               | I have yet to see any ML algorithm that can reach 99.999%
               | accuracy or better (we need better).
        
               | evdoks wrote:
               | Why do we even need such accuracy? Humans are extremely
               | bad at estimating distance and nevertheless are quite ok
               | drivers. ML needs to make sure that the car is not
                | bumping into things or people, which is a very different
               | task.
        
               | amelius wrote:
               | Yes, so "can/cannot bump into" is a classification tasks
               | which needs quite some accuracy.
               | 
               | A human on the other hand can recognize another human
               | with near perfect accuracy.
        
               | myself248 wrote:
               | A camera is a light sensor, a camera is not an object
               | sensor. Cameras as light sensors definitely exist.
               | 
               | A camera feeding into phenomenally sophisticated AI
               | trained on all sorts of weather conditions with
               | windshield wiper smear and glare and everything else,
               | which can turn those horrible images into a picture of
               | what actually lies ahead, is an object sensor. Those
               | aren't good enough yet.
        
           | spullara wrote:
           | It doesn't lead to more deaths. See Tesla for proof. This is
           | a super bad take.
        
             | totalZero wrote:
             | https://www.tesladeaths.com/
        
               | spullara wrote:
               | Ok, reduce that to where autopilot is engaged.
        
               | ggreer wrote:
               | It should be noted that that site is run by a group of
               | pseudonymous people who do not hide the fact that they
               | have been shorting Tesla for years. They count every
               | death involving a Tesla. If someone jumps in front of a
               | manually piloted Tesla and is run over before the driver
               | can react, it is added to the spreadsheet. They're
               | holding the company to an impossible standard. Moreover,
               | they don't actually care about reducing deaths involving
               | Tesla vehicles. They want Tesla to fail so that they can
               | profit. It's a purely selfish act.
               | 
               | The most prominent member of the TSLAQ people and (IIRC)
               | the first to publicize a spreadsheet of Tesla deaths is
               | Elon Bachman (a pseudonym). He has also been a charlatan
               | on COVID. A year ago he said, "After 4 months of white
               | hot Coronavirus panic, the disaster is visible everywhere
               | except in the data. Total deaths, rounded to the nearest
               | percent, remain 0% of annual flu deaths, and serious
               | cases are falling"[1]
               | 
               | I tried to bet him up to $1,000 that US COVID deaths
               | would exceed 25,000 by the end of 2020.[2] He ignored me
               | and continued to deny the harm caused by the disease.
               | 
               | One simply cannot trust anything he's involved in.
               | 
                | 1. https://twitter.com/ElonBachman/status/1237030292340314112
               | 
               | 2. https://twitter.com/ggreer/status/1237164317142736896
        
               | Pyramus wrote:
               | > They want Tesla to fail so that they can profit.
               | 
               | A more realistic take is: They think Tesla is massively
               | overvalued and should be held accountable for fraudulent
               | corporate behaviour.
               | 
               | > He has also been a charlatan on COVID.
               | 
               | Kind of ironic given that Elon himself was (is?) one of
               | the first vocal Covid deniers.
               | 
               | Check out TC's Chartcast [1] in case you are interested
               | in TSLAQ, beware it's a deep rabbit hole.
               | 
               | I've previously tried to sum up Tesla's red flags here
               | [2].
               | 
               | [1] https://www.buzzsprout.com/758369
               | 
               | [2] https://news.ycombinator.com/item?id=26065075
        
               | ggreer wrote:
               | Musk has never been a covid denier. He understood that
               | covid wasn't as dangerous as many claimed, especially for
               | younger and healthier people. The only thing close to
               | covid denialism that I can find is Musk's decision to
               | resume production at Tesla's Fremont plant in defiance of
               | Alameda county lockdown orders. This was in May of 2020
               | after months of failed negotiations with government
               | officials. It's important to note that unlike Bachman,
               | Musk had skin in the game. He asked that if the
               | government sent cops to shut down the factory, only he be
               | arrested.[1] The factory resumed production and nobody
               | was arrested. A week or so later, Alameda county changed
               | its rules to allow the factory to operate legally.
               | 
               | 1.
               | https://twitter.com/elonmusk/status/1259945593805221891
               | "Tesla is restarting production today against Alameda
               | County rules. I will be on the line with everyone else.
               | If anyone is arrested, I ask that it only be me."
        
               | Pyramus wrote:
               | Here's a selection of tweets for you, by a celebrity many
               | still trust and consider a "scientific genius":
               | 
               | @elonmusk Mar 6, 2020
               | 
               | The coronavirus panic is dumb
               | 
               | https://twitter.com/elonmusk/status/1236029449042198528
               | 
               | @elonmusk Mar 19, 2020
               | 
               | Based on current trends, probably close to zero new cases
               | in US too by end of April
               | 
               | https://twitter.com/elonmusk/status/1240754657263144960
               | 
               | @elonmusk Mar 19, 2020
               | 
               | Kids are essentially immune, but elderly with existing
               | conditions are vulnerable. Family gatherings with close
               | contact between kids & grandparents probably most risky.
               | 
               | https://twitter.com/elonmusk/status/1240758710646878208
               | 
               | @elonmusk Jun 29, 2020
               | 
               | There are a ridiculous number of false positive C19
               | tests, in some cases ~50%. False positives scale linearly
               | with # of tests. This is a big part of why C19 positive
               | tests are going up while hospitalizations & mortality are
               | declining. Anyone who tests positive should retest.
               | 
               | https://twitter.com/elonmusk/status/1277507826529660928
        
               | ggreer wrote:
               | The only screw-up I see is his prediction that the US
               | would conquer the spread by the end of April. All the
               | other info is pretty accurate, especially considering how
               | little we knew back in March. Compared to most
               | authorities and experts, he did a pretty good job.
               | Remember two weeks to flatten the curve? Remember when
               | the official line was that masks don't work?[1][2][3]
               | Remember when the WHO said that travel bans are a bad
               | idea?[4]
               | 
               | Musk was calling the panic dumb at the same time as the
               | WHO was saying, "Our greatest enemy right now is not the
               | coronavirus itself. It's fear, rumours and stigma."[5] If
               | we judge him by the same standards as every major
               | institution (media, governments, NGOs), Musk is far less
               | deserving of criticism. And unlike those institutions,
               | Musk never claimed to be an expert on the topic.
               | 
               | 1. https://twitter.com/WHOWPRO/status/1243171683067777024
               | 
               | 2.
               | https://twitter.com/UNGeneva/status/1244661916535930886
               | 
                | 3. https://web.archive.org/web/20200312104152if_/https://twitte...
               | 
               | 4. https://twitter.com/WHO/status/1224734993966096387
               | 
               | 5. https://twitter.com/WHO/status/1233418231261646849
        
               | dheera wrote:
               | > They think Tesla is massively overvalued
               | 
               | > fraudulent corporate behaviour.
               | 
               | These two things have nothing to do with each other.
               | 
               | > and should be held accountable for
               | 
               | Holding a company accountable should go through the
               | courts, not personal profit.
               | 
               | I'd be happy if short selling weren't an option. If you
               | believe something is overvalued, stay out. You're not
               | wanted.
               | 
               | If you believe a company did something wrong, file a
               | lawsuit.
        
               | alisonkisk wrote:
               | You haven't made an argument against short selling.
        
               | dheera wrote:
               | Pessimism is counterproductive to the long-term
               | development of humanity and rapid deployment of electric
               | vehicles and clean energy. Armchair pessimists don't
               | deserve a place in the stock market. If you think Tesla
               | is doing something wrong, either (a) go work there and
               | fix it or (b) start your own Tesla competitor. Short
               | selling isn't constructive.
        
               | mercurysmessage wrote:
               | What's wrong with short selling? The only reason why
               | people like Elon Musk don't like short selling is because
               | they know that their companies are massively overvalued.
        
               | Geee wrote:
               | The problem is profiting from spreading misinformation.
               | Tesla short-sellers run large operations that fabricate
               | and spread lies. It's usually easier to spread
               | misinformation than to debunk it.
        
               | mercurysmessage wrote:
                | And companies don't spread misinformation? Not saying
                | it's right in either case, but companies, Tesla
                | included, aren't shining beacons of morality and
                | truthfulness.
        
               | Geee wrote:
               | It's not in their interest to lie in the long term. Those
               | companies who lie are scams and they're spotted pretty
               | quickly (e.g. Theranos, Nikola). Elon has made some
                | missteps, such as the 'funding secured @420' tweet, which he
               | was punished for. Public companies have to be pretty
               | careful in their communications. Making promises with too
               | optimistic schedules is not lying, it's just an error in
               | forecasting, and is common in technology.
        
               | Pyramus wrote:
               | That couldn't be further from the truth. Theranos
               | survived for 15 years. Wirecard 22 years. Enron 10+
               | years. Fraud gives you a massive competitive advantage.
               | 
                | Not too long ago I thought of Musk as a misunderstood
               | genius, now I'm pretty certain he knows exactly what he
               | is doing (for the most part). If you look at all the
               | oddities surrounding Tesla, there are clear patterns
               | emerging.
               | 
               | Plainsite has a good summary [1].
               | 
               | [1] https://www.plainsite.org/realitycheck/tsla.pdf
        
               | Geee wrote:
                | That report is a bunch of horseshit. Lots of words without
               | any substance. He even cherry-picked some data to "prove"
               | that Tesla's sales are declining. Everyone can see the
                | actual progress that Tesla and SpaceX have made. Their
               | cars are winning awards and they're innovating and
               | building new factories as fast as they can. Who cares if
               | they miss a couple of estimates. SpaceX can deliver
               | payload to orbit with much lower cost than competitors.
               | 
               | True though that those companies survived for too long.
        
               | FireBeyond wrote:
               | And Tesla will happily release misinformation too.
               | 
               | One of the last fatalities, Tesla was more than happy to
               | push out a press release based on telemetry, saying
               | "Autopilot wasn't at fault, the driver was inattentive -
               | the vehicle even told him to put his hands on the
               | steering wheel!".
               | 
               | They somehow neglected to mention that the steering wheel
               | alert was triggered, ONCE, and FOURTEEN MINUTES before
               | the crash.
               | 
                | Misinformation is not a good thing. But let's not pretend
               | that Tesla is some downtrodden underdog just trying to
               | make our lives better.
               | 
               | Also, if you have an accident in your Tesla, you'll have
               | a lot of fun trying to get any telemetry information from
               | them, even if Tesla isn't a named party and you're just
               | dealing with the other involved person. You'll need
               | multiple subpoenas and expect them to resist releasing
               | any data as "proprietary".
               | 
               | But should your telemetry from an accident be able to be
               | spun (correctly or otherwise) into a Get out of Jail Free
               | card for Tesla, expect it to be released to the media
               | without your consent or authorization (I'm sure it's
               | buried in Section 48, Subsection 24, Paragraph 14c iii
               | that you consent, but still).
        
               | kwhitefoot wrote:
               | That's an interesting table. I would say that it
               | reinforces the case that Autopilot (actually labelled
               | Autosteer in the UI of my 2015 Model S) is actually not
               | dangerous.
               | 
               | I presume that the aim of the site is the opposite
               | though.
        
               | vletal wrote:
                | Have you noticed how sparse the "autopilot claimed"
                | column is? And how even sparser the "verified autopilot"
                | column is? It seems like the authors are trying very
                | hard to inflate the total numbers.
        
               | madamelic wrote:
               | Not to mention the spreadsheet admits Autopilot was
               | released in 2015 but they are still including deaths from
               | 2013.
               | 
               | This can't be more blatant.
        
               | FireBeyond wrote:
                | Autopilot might not have been around then, but up until
                | very recently Tesla refused to participate in a lot of
                | auto safety testing, saying it believed the testing
                | regime was "flawed".
               | 
               | I have no issue with keeping track of auto deaths from a
               | company who is claiming that their vehicles are safe
               | while preventing anything but the legal minimum necessary
               | tests from occurring.
        
             | shafyy wrote:
              | I'm not sure why you are getting downvoted, this is true.
        
             | 6gvONxR4sf7o wrote:
             | The stats I've seen quoted about Tesla are super misleading
             | and not at all apples to apples comparisons.
        
             | FireBeyond wrote:
              | Tesla's miles have the advantage of being inherently
              | autopilot-capable, with the system turning off when things
              | get dicey for it.
              | 
              | Human drivers do not have those advantages.
              | 
              | So Tesla is comparing against drivers in situations where
              | AP doesn't even try to drive the vehicle because it can't.
             | 
             | This is amazingly misleading. And Tesla knows it.
        
             | cgriswald wrote:
             | The issue isn't whether having the technology saves more
             | lives than not having the technology. The issue is whether,
             | _given the technology_ , the marketing creates a perception
             | that results in unsafe usage of the technology, costing
             | lives.
        
             | kaba0 wrote:
              | It is easy, almost trivial, to drive long hours on
              | freeways where almost nothing happens. We don't have any
              | statistics on dangerous close encounters, or on whether
              | they remain close encounters with Teslas.
        
               | jiofih wrote:
               | We have all sorts of statistics, every car maker keeps
               | track of disengagements and events. What are you talking
               | about?
        
               | kaba0 wrote:
                | Accidents per mile is not too informative. For example,
                | do Teslas fare better than an average human driver
                | against, e.g., someone running a red light in front of
                | the car? Or a suddenly overturned car on the freeway? Do
                | we have enough data to answer these questions?
        
               | jiofih wrote:
                | I don't understand this kind of question. Obviously you
                | need some statistical measure to compare safety, and
                | incidents per mile (or vehicle-years, or hours driven)
                | is an accurate way to estimate accident rates. All we
                | care about is that on average it crashes less than human
                | drivers. If there are specific conditions where it fares
                | worse, those would be pretty obvious to address, and if
                | safety improves for the average case, it is a given that
                | the situations _when crashing_ will be weirder than
                | usual.
               | 
                | That aside, the answer to the first one is _yes_.
                | Because of sonar and the cameras, a Tesla can see
                | traffic two or three cars ahead and will initiate
                | braking way earlier than a human would in the case of
                | someone else running a red light - there are a few
                | videos of this exact situation available on YouTube. As
                | for the second, probably not enough data, but in
                | absolute numbers humans are responsible for nearly all
                | of those cases so far; they happen weekly.
               | 
                | By the way, the overturned truck was not a fatal
                | accident - the car triggered emergency braking and the
                | driver came out without a scratch. The fatal one was a
                | couple of years ago, when a Model S ran _under_ a white
                | truck making an unsafe u-turn.
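
                (For what it's worth, this is roughly what such a rate
                comparison looks like, with a crude confidence interval
                on a Poisson rate ratio. All counts and mileages below
                are invented.)

                    import math

                    # Crash-rate ratio between two fleets, with a
                    # log-scale approximate 95% confidence interval.
                    def rate_ratio_ci(c_a, mi_a, c_b, mi_b, z=1.96):
                        ratio = (c_a / mi_a) / (c_b / mi_b)
                        se = math.sqrt(1 / c_a + 1 / c_b)
                        return (ratio,
                                ratio * math.exp(-z * se),
                                ratio * math.exp(z * se))

                    # e.g. 50 crashes in 1B miles vs 400 in 2B miles
                    print(rate_ratio_ci(50, 1e9, 400, 2e9))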
        
               | 6gvONxR4sf7o wrote:
               | > All we care about is that on average it crashes less
               | than human drivers.
               | 
                | All we care about is that on average it crashes less
                | than human drivers _in comparable conditions_. If you
                | compare a 40-year-old driving a Tesla on autopilot on a
                | sunny freeway to an 18-year-old driving a 1995 beater
                | without modern safety features, no ADAS, etc., in a
                | snowy busy intersection, autopilot doesn't have to be
                | that great to look better. If you compare Tesla on
                | autopilot to "average human miles" you are rolling some
                | very invalid comparisons into the averages. You can only
                | usefully compare averages if they're averaged over
                | comparable contexts.
               | 
                | A less extreme example, if my memory serves, is the
                | before/after comparison Tesla did when rolling out ADAS
                | alongside autopilot: they compared autopilot-available +
                | ADAS-engaged to autopilot-unavailable + ADAS-
                | unavailable, or something along those lines, if I
                | remember correctly.
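
                (The pitfall described above is classic confounding. A
                tiny invented example where system A wins the aggregate
                comparison even though it is worse in every individual
                condition, purely because it logs most of its miles in
                easy conditions:)

                    # (crashes, miles) per condition -- invented numbers
                    data = {
                        "A": {"sunny": (20, 1e9), "snowy": (6, 1e7)},
                        "B": {"sunny": (1, 1e8), "snowy": (360, 9e8)},
                    }
                    for name, conds in data.items():
                        crashes = sum(c for c, _ in conds.values())
                        miles = sum(m for _, m in conds.values())
                        rate = crashes / miles * 1e8  # per 100M miles
                        print(name, "aggregate:", round(rate, 1))
                        for cond, (c, m) in conds.items():
                            print("  ", cond, round(c / m * 1e8, 1))
                    # A wins in aggregate (~2.6 vs ~36.1) yet loses in
                    # both strata (2.0 vs 1.0 sunny, 60.0 vs 40.0 snowy).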
        
               | kaba0 wrote:
                | Incidents per mile is fine with a big enough data set
                | --- but I disagree that it is enough in the case of a
                | new technology that can potentially kill. Say Teslas are
                | absolutely safe on highways, much safer than human
                | drivers, but have a tendency to hit pedestrians much
                | more often than humans do. They could then have a better
                | incidents-per-mile record even though no sane person
                | would legalize them in this case.
               | 
               | > If there are specific conditions where it fares worse
               | those would be pretty obvious to address
               | 
                | Like, if (inSpecificSituation) { payMoreAttention(); } ?
                | This is the actually hard part of the problem, not
                | braking when something is close and staying in the lane.
                | They can't even create adequate test environments for
                | the many, many special cases that can trivially happen
                | in a city.
               | 
               | Also, how does it see traffic two or three cars ahead? It
               | sounds like a marketing gimmick but correct me if I'm
               | wrong.
        
               | 6gvONxR4sf7o wrote:
               | > Incidents per miles is fine with a big enough data set
               | 
                | Only if you're comparing comparable conditions. If you
                | compare autopilot in the sun to humans in the snow,
                | infinite miles won't make the autopilot/human comparison
                | any more valid.
        
               | FireBeyond wrote:
                | Tesla's miles driven are self-selecting, because AP
                | won't engage when it can't cope with poor conditions.
                | 
                | Humans don't get that luxury.
                | 
                | You can't measure Tesla as (deaths / miles driven on
                | good-for-AP roads in acceptable conditions) and humans
                | as (deaths / miles driven on all roads in all
                | conditions) and then act as if the two are comparable or
                | prove "improved safety", because they aren't.
        
             | swarnie_ wrote:
              | The issue is, when there is a crash or a death, could that
              | situation have been avoided if the driver had been looking
              | ahead with two hands on the wheel, like you would in any
              | other car without the marketing hype?
              | 
              | I've seen more than enough crashes and deaths in Teslas
              | that were perfectly avoidable, provided the driver wasn't
              | watching their iPad.
        
               | kwhitefoot wrote:
               | Really? I've been driving for over forty years (about 15k
               | km per year) and I have never seen a death involving any
               | car let alone the relatively rare Tesla.
               | 
               | I've seen emergency vehicles attending a crash perhaps
               | once or twice a year but never seen the crash occur. In
               | fact the only ones I have personal first hand experience
                | of are the time I rear-ended a car, the two times that I
               | was rear ended, and the time I slid off the road on an
               | icy corner. All three rear ending events were relatively
               | low speed incidents (under 30 km/h) and would quite
               | likely have been mitigated or even completely avoided if
               | the cars in question had had automatic emergency braking.
               | The icy corner was my own fault for not thinking ahead.
               | 
               | So, to me, your statement, without some more context,
               | sounds like an exaggeration.
        
               | swarnie_ wrote:
                | I too have been driving for some time now, but we have
                | this thing called the internet, where you can see and
                | share information.
                | 
                | Lemme help you out friend - https://www.youtube.com/results?search_query=tesla+autopilot...
        
               | kwhitefoot wrote:
                | So you didn't mean: "I've seen more than enough crashes
                | and deaths in Teslas"
               | 
               | you meant: "I have heard about .." or "I have seen
               | accounts of .."
               | 
               | Those are not quite the same thing.
        
           | TameAntelope wrote:
            | It doesn't say you should be looking at the road, simply
            | that you should be aware of your surroundings and that you
            | should remain ready to retake control if the vehicle asks
            | you to.
            | 
            | That's different.
        
             | piva00 wrote:
              | How can you stay aware of your surroundings going 60+ km/h
              | without paying attention to the road? Even more so if you
              | are required to take control in an instant; you have to be
              | completely enveloped by your spatial awareness.
             | 
             | I have raced go-karts, I've raced GT cars, there is
             | absolutely no way to keep aware of your surroundings if
             | your focus isn't constantly on the road and checking
             | mirrors, full stop.
        
               | TameAntelope wrote:
               | The language here is critical. The question is where your
               | focus lies, and at level 3 autonomous driving, there are
               | situations where your focus can be elsewhere besides the
               | specific road conditions.
               | 
               | You are assuming "aware" means "ready to take over in an
               | instant". That is not what it means in this context.
        
           | lukebuehler wrote:
           | Yes, it should be quite simple:
           | 
           | Self-driving: car comes with no steering wheel.
           | 
           | Lane/drive assist: car has steering wheel and you have to
           | steer the whole time--with occasional nudges from the AI.
        
           | bryanlarsen wrote:
           | The Honda is level 3 only during traffic jams -- it's limited
            | to 50 km/h and only on the expressway. If it's completely
           | incompetent the worst that will happen is some minor body
           | damage. It's not going to cause any deaths.
        
           | judge2020 wrote:
           | > talk about things like "autopilot" instead of "driver
           | assist"
           | 
           | As stated countless times before, autopilot in planes isn't
            | even geared towards handling most flight scenarios or
           | challenging conditions. Tesla is technically correct to call
           | it this, even though this naming is confusing to consumers
           | who think "plane flies itself" and think it means their car
           | can drive itself in all conditions and avoid at-fault
           | incidents - which most likely will indeed lead to more
           | deaths.
        
             | yread wrote:
              | I can't set a target altitude or heading on this thing; it
              | can't use ILS or follow VORs. It doesn't make sense to
              | call it autopilot.
        
               | nelox wrote:
               | The passengers have no idea there are no pilots around
               | when things don't go as expected
        
             | jfoster wrote:
             | Remember how dangerous the roads were when the car
             | companies producing cars without gear shifts started
             | calling their cars "automatic" as opposed to manual?
        
               | totalZero wrote:
               | This is ridiculous. A transmission failure and a self-
               | driving failure have very different implications.
               | Automatic transmissions don't have pedestrians walking
               | through them, etc.
        
               | brabel wrote:
               | You say this is ridiculous but your premise that these
               | new "self-driving" cars are "walking trough pedestrians"
               | is even more ridiculous and not backed by any data of
               | real world experiments currently being undertaken.
        
               | upbeat_general wrote:
               | I think the point was that automatic transmissions are
               | far simpler systems that are harder to screw up. And if
               | you do screw them up it's bad but not driving into
               | pedestrians bad.
               | 
                | No, there aren't many self-driving cars that have driven
                | into pedestrians, but Uber ATG's car notably did, and
                | there is certainly a lot of room for failure.
        
               | jfoster wrote:
               | Meanwhile, humans are driving into pedestrians every
               | minute of every day, but yes, get upset about what Tesla
               | called their system...
        
             | nitinreddy88 wrote:
             | Consumers are not as tech-savvy as the Tesla or HN world.
             | Every word used in marketing should be chosen wisely or we
             | will end up with consequences.
        
             | [deleted]
        
             | maxerickson wrote:
             | What are the rules to this "technically correct"?
             | 
             | It's not even flying the car!
        
               | ineedasername wrote:
               | You need the latest Tesla firmware upgrade to activate
               | flight. Or an unfinished bridge combined with lane
               | tracking and an inattentive driver.
        
               | ck2 wrote:
               | I understood that reference.
               | 
               | https://jalopnik.com/a-tesla-stan-dmd-me-to-show-what-
               | they-t...
        
             | ncallaway wrote:
             | > As stated countless times before, autopilot in planes
             | isn't even geared towards handling most flight scenarios
             | 
             | As stated countless times in response, it doesn't matter
             | what autopilot actually does when the concern is the public
             | perception of the marketing.
             | 
             | If "autopilot" sounds to the general public like "it drives
             | itself" (which is the simple etymology of the word), then
             | it doesn't matter one flying fig what autopilot on planes
             | actually does.
        
               | spullara wrote:
               | Autopilot saves lives even in its current state.
        
               | kaba0 wrote:
               | Based on what?
               | 
               | What could save lives is intelligent automatic braking -
               | because that is something we are currently capable of.
               | Humans as a species are terrible at paying attention to
               | boring tasks and reacting quickly - so these gimmicky
               | autopilot features are dangerous if anything.
        
               | spullara wrote:
               | It does that as well. Have you ever driven one?
        
               | brabel wrote:
               | Most standard cars already come with automatic emergency
               | braking.
        
               | please-reread wrote:
               | Is that so? Now if they also could brake...
        
               | NavinF wrote:
               | That's exactly what self driving cars do, silly.
        
               | kaba0 wrote:
               | They do it as well, on top of some other non-safe
               | features. Less is sometimes more. Braking in faster-than-
               | human reaction time when a pedestrian steps out in front
               | of us is great - it doesn't have to be combined with full
               | self driving, which is simply still really far away.
        
               | labcomputer wrote:
               | Why isn't there the same level of hand wringing over
               | Ford's "co-pilot"? Surely a copilot is even more capable
               | than an autopilot.
        
               | FireBeyond wrote:
               | Common perception of a co-pilot is that a co-pilot
               | _assists_ the pilot.
               | 
               | It's another perception/reality thing. I realize that the
               | co-pilot shares the load, and is capable of being the
               | pilot in command, or of having control of the aircraft.
               | But one term implies self-piloting (Tesla), the other an
               | intelligent assistant pilot.
        
               | ben_bai wrote:
               | That would imply that people know aircraft-slang. Which
               | they clearly don't.
        
               | xmprt wrote:
               | Because unlike auto-pilot the implication of co-pilot is
               | that you're still primarily in charge of
               | flying/driving/piloting the vehicle.
        
               | jiofih wrote:
               | It's not, the co-pilot might fly the plane for most of
               | the flight.
        
               | NavinF wrote:
               | Umm no. Here's how it works
               | 
               | me: "you have the controls"
               | 
               | co-pilot: "I have the controls"
               | 
               | me: "you have the controls"
               | 
               | I am no longer flying the aircraft or even paying much
               | attention to it. I could even get up and walk away from
               | the controls. That's why co-pilots exist
        
               | ncallaway wrote:
               | The response I had earlier in the thread is absolutely
               | applicable to this.
               | 
               | It doesn't matter. At. All. How the operations work
               | inside the cockpit of an airplane. What actually happens
               | in an airplane is 0% important to the conversation.
               | 
               | What actually matters is the public at large's perception
               | of the term "autopilot" and their perception of the term
               | "co-pilot". If the public perceives "co-pilot" as being
               | less functional than "autopilot" then it _does not matter
               | at all_ that in aviation a co-pilot is actually more
               | capable than an autopilot.
        
               | ncallaway wrote:
               | This assumes that no one is hand-wringing over Ford's
               | marketing. I haven't seen as much of Ford's marketing as
               | I have of Tesla's, so I personally haven't written as
               | many words about it.
               | 
               | But I have the same concerns about _all_ marketing that
               | positions cars as being capable enough that people don't
               | have to pay attention. Until we hit true Level 4 or Level
               | 5 self driving cars, I'm extremely concerned about the
               | public's perception that they don't have to operate a
               | motor vehicle. *Especially* if the vehicle is capable of
               | enough automated driving that the driver doesn't need to
               | pay attention during most of the operation of the
               | vehicle.
               | 
               | To your specific point though, I think in general the
               | public would view something marketed at "autopilot" as
               | more capable than something marketed as "co-pilot",
               | despite the significant capability advantage that a "co-
               | pilot" actually offers over an "autopilot" inside the
               | cockpit of an airplane.
        
               | madamelic wrote:
               | The difference between Ford and Tesla is that Ford is
               | "one of the good ones".
               | 
               | Not interested in leading the pack; happy enough to
               | follow, crush smaller competitors with their bloat,
               | declare bankruptcy, and take in hundreds of billions in
               | handouts because of their irresponsibility and
               | ineptitude.
        
               | z3t4 wrote:
               | Just like tobacco companies are forced to print "this
               | leads to cancer" on their products, self driving cars
               | should have labels like "car crashing into wall" and "car
               | killing other human".
        
               | jiofih wrote:
               | Please link to reports of a Tesla running over someone in
               | AP.
        
               | Pyramus wrote:
               | Let's not forget that Tesla is well aware the car doesn't
               | "drive itself" and has classified AP/FSD for
               | legal/liability reasons as a level 2/3 system.
               | 
               | From a marketing perspective however, robotaxis are just
               | around the corner.
        
             | ineedasername wrote:
             | The average consumer is not familiar enough with the limits
             | of automated avionics to understand the limits of
             | commercial jets' autopilot.
             | 
             | If the average consumer was also an airline pilot then it
             | would make sense to use an industry term with nuanced
             | meaning. Instead, the average consumer assumes a
             | straightforward meaning of the term, especially with pop
             | culture throwing out phrases like "planes practically fly
             | themselves these days".
             | 
             | Claiming "autopilot is technically correct" would only
             | approach being true if Tesla & others went into great
             | detail to educate buyers about the limits of the avionic
             | autopilot features referenced by the term, and difficulties
             | of flying a plane in non-optimal conditions, so they
             | actually understood the "technical" meaning and could apply
             | it correctly to the car they're buying.
        
               | spullara wrote:
               | It tells you the limitations every time you engage it.
               | Stop this FUD. It is better than humans being in control.
        
               | Kbelicius wrote:
               | No it is not [0], stop this shilling.
               | 
               | [0] https://www.forbes.com/sites/bradtempleton/2020/10/28
               | /new-te...
        
           | NavinF wrote:
           | Meh. It only needs to be better than the average driver.
           | Several companies have already achieved that
        
             | AshamedCaptain wrote:
             | Actually, it only needs to be better than the average
             | mammal.
             | 
             | And I am not sure any level of AI has reached that level
             | yet.
        
             | cycomanic wrote:
             | Yes, in sunny California/Arizona weather on wide US roads.
             | My experience with even driver assist on small twisty
             | roads in the snow tells me things are nowhere close. The
             | car was accelerating in situations (e.g. coming over the
             | crest of a small incline with a corner at the end) where
             | no driver would ever accelerate and which would have
             | resulted in some bad situation without intervention. We
             | have seen Teslas getting confused by forks in the road on
             | the highway.
        
               | daveswilson wrote:
               | I don't think anybody's sanctioning the use of driver
               | assist features on small twisty roads in the snow yet,
               | are they? If so please share what you were driving - that
               | would be interesting even if it didn't work well at the
               | time. Most of these things, AFAIK, are in the LKAS, ACC,
               | LDW range and meant for highways and there are mutterings
               | about that in the manual. I sometimes use ACC and/or LKAS
               | on country roads but only with very low expectations.
               | 
               | Teslas at least get better over time. If it got confused
               | over a fork in the road, there's a decent chance that
               | after a near-future firmware update that same car will no
               | longer be confused. A few folks have acknowledged that
               | they're now smart enough to slow down for curves (even
               | sometimes a bit generously), and they are among the few
               | that work pretty well off-highway (most of the time).
        
               | PraetorianGourd wrote:
               | It may not be marketed that way directly, but calling
               | something "full self driving" strongly implies that self
               | driving is fully supported in all scenarios. I have not
               | seen any Tesla marketing that says "full self driving,
               | on big roads with clear markings in dry/slightly damp
               | weather".
               | 
               | Words matter
        
             | dotancohen wrote:
             | The average driver got to that level after years of
             | practice. With this system in place, the new average
             | driver will have the skills of someone who just barely got
             | their driver's license years ago and has never used it
             | since. And that person will be expected to pilot the
             | vehicle only in conditions so dire that the computer
             | cannot handle them.
             | 
             | Perhaps the use of this system should be licensed
             | separately from a standard drivers license, akin to an IFR
             | rating. And to keep the rating, manual driving must be
             | performed periodically and logged as well.
        
               | NavinF wrote:
               | If we have evidence that this is an issue, I'm on board
               | with that.
               | 
               | If it turns out this is not an issue and we unnecessarily
               | made self driving cars less common, that would have the
               | same effect as shooting a few hundred random people every
               | day (3,700 people die every day from crashes)
               | 
               | We'll find out in a couple of years
        
               | mathgeek wrote:
               | > 3,700 people die every day from crashes
               | 
               | Can you provide your source please? Closest I could find
               | is https://www.cdc.gov/injury/features/global-road-
               | safety/index... which states that "[e]very day, almost
               | 3,700 people are killed globally in crashes involving
               | cars, buses, motorcycles, bicycles, trucks, or
               | pedestrians. More than half of those killed are
               | pedestrians, motorcyclists, or cyclists." I didn't find a
               | number for just motor vehicle crashes, and since the
               | rates are three times higher in developing countries, it
               | seems like the cost of ownership is going to be a huge
               | factor.
        
             | azureel wrote:
             | There is also the case of "liability". An average driver
             | can drive the car into a tree with no problems at all. No
             | one except the driver is responsible there.
             | 
             | But if a company sells a product with the name "Autonomous
             | Driver", then that company is liable if the product
             | malfunctions (i.e. drives a car into a tree).
        
               | NavinF wrote:
               | Let the courts decide who's liable. I'm sure that in
               | practice this would be decided on a case by case basis
               | with logs from the car's computer.
        
               | dagw wrote:
               | _would be decided on a case by case basis with logs from
               | the car's computer._
               | 
               | If I was a car manufacturer and I knew that my log files
               | would undoubtedly be used as evidence against me in a
               | criminal negligence trial, I would think very hard about
               | what I did and didn't put into those logs.
        
               | Vespasian wrote:
               | But if the car is in full auto mode, the driver is not
               | expected to pay attention or look at the road; the car
               | has decided it can drive autonomously in this situation.
               | 
               | At least partial liability will have to fall on the OEM;
               | that's why nobody is making any binding promises
               | (including Tesla).
        
             | dheera wrote:
             | I think the bar needs to be much higher than "average".
             | 
             | Although beating the average is arguably "good enough" to
             | deploy in a collective sense, we live in a society of
             | individual actors, and if I'm an above average driver, it
             | needs to be better than me, not better than average, for me
             | to want to use it.
             | 
             | Assuming a society in which each person individually and
             | rationally chooses whether or not to use it, if you want
             | 99% of people to use it, your software needs to be better
             | than the 99th percentile driver.
        
               | stormbrew wrote:
               | This assumes a lot about the correctness of people's
               | perceptions of their own driving skills. My experience is
               | 99% of drivers think they're above average, and that
               | obviously can't be true.
        
               | throwaway0a5e wrote:
               | In my experience there are two kinds of people who'll
               | say they're good drivers, and they're both good but with
               | wildly different definitions of "good".
               | 
               | You've got Jose the 35yo MRI service technician who logs
               | 100k/yr for work and has had so much time to get good he
               | can tell exactly what traffic is doing, is highly capable
               | of predicting what is about to happen, and preemptively
               | makes moves based on that. He knows his insurance company
               | would crucify him if they saw how he flings an overloaded
               | Transit Connect through an on-ramp or parallel parks by
               | braille, so if you press the issue he'll tell you he's
               | good at getting where he's going but that the bureaucrats
               | who write the state driver's manual wouldn't like him.
               | 
               | And then you've got Karen the elementary school teacher
               | nearing retirement who logs 10k/yr, 5k of which are spent
               | looking at her phone. She has spent a cumulative one hour
               | of her life above the speed limit despite spending much
               | more than that on highways where the traffic flow is well
               | above the speed limit. She follows every rule in the book
               | to the letter, gets honked at daily and once a week she
               | has a story about "some asshole" she got in a conflicting
               | situation with. She doesn't know how far down the gas
               | pedal on her 4Runner goes but oh boy does that brake
               | pedal get a workout when "oh crap almost missed my turn".
               | She swears up and down that she's a good driver because,
               | of the seventeen fender benders she's been in, she was
               | only at fault in the three that were caught on camera or
               | where witnesses stopped.
               | 
               | Which definition of "good" do you want near you when
               | there's 3" of snow on the road?
        
               | kwhitefoot wrote:
               | > Which definition of "good" do you want near you when
               | there's 3" of snow on the road?
               | 
               | When there is snow on the road you should not be near
               | either of them.
               | 
               | You should be far enough behind that you can stop when
               | the car in front gets into difficulties. The rule of
               | thumb on a dry road is that you should be three seconds
               | behind. At 100 km/h (about 60 mph) that's 83 m (about 270
               | ft), say 17 car lengths (for my Tesla S that is). If
               | there is snow on the road perhaps it would be wise to
               | allow more distance.
               | 
               | See, for instance,
               | https://www.driveincontrol.org/drivingtips/the-three-
               | second-...
        
               | gunnarmorling wrote:
               | > My experience is 99% of drivers think they're above
               | average, and that obviously can't be true.
               | 
               | It actually can be true, depending on the distribution
               | of skills you assume. Say you have 100 drivers, 99 with
               | the same skill level and one driver who is worse. 99%
               | will be better than the average. I'll see myself out ;)
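               | 
               | A tiny Python sketch of that point (the skill numbers
               | are made up purely for illustration):
               | 
               |   skills = [10] * 99 + [1]   # 99 equal drivers, 1 worse
               |   mean = sum(skills) / len(skills)        # 9.91
               |   above = sum(s > mean for s in skills)   # 99
               |   print(mean, above)  # 99 of 100 are "above average"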
        
               | NavinF wrote:
               | If you really think you're better than a self driving
               | car, then don't buy one. Or do you want the gov't to
               | choose for you?
               | 
               | You should be happy that other people can buy them
               | because they'll no longer crash into you ;)
        
           | daveswilson wrote:
           | Goodness gracious, doesn't anyone in the auto industry ever
           | even look at how the word "autopilot" has been used in the
           | aviation and marine industries? It has _never_ meant that the
           | vehicle operator gets to abandon their responsibilities.
           | 
           | However I agree that the mixed messaging is sad and
           | irresponsible, and this article's author could have been more
           | circumspect.
        
             | loudmax wrote:
             | How the term "autopilot" is used by professionals in the
             | aviation and marine industries isn't the main point here.
             | What matters is what non-professionals think when they hear
             | the term "autopilot". If you market something as
             | "autopilot" don't be surprised that general consumers don't
             | have the same nuanced understanding of the term as
             | professional pilots.
        
           | Dumblydorr wrote:
           | More important than short-run deaths are longer-term road
           | deaths. Their messaging is flawed, but hundreds of thousands
           | die in car accidents yearly... The rollout of these cars
           | needs to be smooth and quick to prevent longer-term deaths.
           | It's incumbent upon early adopters to be responsible for the
           | sake of society.
        
             | mrkwse wrote:
             | Regarding this, I strongly believe the correct approach is
             | L3 capability but only expressed through L2 features.
             | 
             | We're not at a point where it's safe for drivers to move
             | their attention from driving, but the technology is mature
             | enough that it can and should intervene wherever possible
             | to avoid collisions and other incidents.
             | 
             | The expectation that drivers can divert their attention
             | from driving to perform other tasks, on the assumption
             | that they can resume control at short notice, is extremely
             | misguided. But this should not prevent the sensor and
             | software tech that could enable L3 autonomy from enabling
             | more effective accident avoidance technology that
             | intervenes in situations where the driver may drive in a
             | careless or dangerous way.
        
           | throwaway0a5e wrote:
           | >Is going to lead to more deaths.
           | 
           | It'll lead to the same number of deaths and slightly more
           | complicated lawsuits.
           | 
           | At a statistical level Nobody(TM) is heeding fine print
           | warnings.
        
           | jeffreygoesto wrote:
           | YipYipYip. This is the blue wart in the green sea [0]. The
           | system drives until you should have taken over and died. The
           | situations that are detectable as faulty are typically not
           | the lethal ones. As far as I remember, Volvo declined to
           | implement Level 3 years ago [1], which is the only grown-up
           | and responsible answer to this problem.
           | 
           | [0] https://www.sae.org/binaries/content/gallery/cm/articles/
           | pre...
           | 
           | [1] https://auto2xtech.com/volvo-to-skip-level-3-autonomous-
           | mode...
        
             | hkhjghg wrote:
             | >YipYipYip...
             | 
             | Is this a Sesame Street reference?
             | 
             | https://youtu.be/KTc3PsW5ghQ?t=97
        
               | jeffreygoesto wrote:
               | YipYipYip =|-D
               | 
               | Com...pjutaaa.
        
             | Gravityloss wrote:
             | Thanks for linking to good source material.
             | 
             | > When the feature requests, _you must drive_
             | 
             | Is there some kind of expected delay from the system
             | requesting driver attention to the driver assuming
             | control? I.e. something like 3 seconds?
        
               | buran77 wrote:
               | That's the problem, no such system can guarantee it gives
               | you a minimum amount of time. Some situations appear and
               | develop in less than 3 seconds (think small child jumping
               | in front of your car from between parked cars). You might
               | only have 1 s to react, and if you are not "plugged in"
               | and fully attentive at that instant, you've missed the
               | window to react.
               | 
               | The more you "disconnect" as a driver, the less likely
               | you are to take over in a split second and this has been
               | proven again and again with humans. A system that works
               | 90% of the time is probably the worst because it's good
               | enough to give you confidence but bad enough that it's a
               | false one. The driver is in the position to buy the
               | advertised "you can disconnect" feature but then only be
               | able to use it under _explicit_ risk of harm to
               | themselves and others. It actually increases the mental
               | load as drivers will try to both  "disconnect" (type on
               | the phone, watch a movie) while also "paying attention to
               | the road". Of course both can't be reasonably done at the
               | same time so both experiences are sacrificed.
               | 
               | It's like encouraging people to take a taxi after
               | drinking but then expecting them to take over whenever
               | the driver makes a mistake and holding them responsible
               | and accountable for the outcome. You won't enjoy the
               | drink, or the drive.
               | 
               | These half-way solutions are great for marketing and for
               | people who are all about the hype. But not only are they
               | the bad compromises, they're also fueled by bad and
               | incomplete data, and by marketing departments wanting to
               | sell more. Almost no self driving outfit provides data
               | on how many times a human made the slightest correction
               | while self driving was enabled, or on what percentage of
               | all possible driving conditions was covered by those
               | self driving miles.
               | 
               | My personal philosophy is that cars are either self
               | driving or they're not: either they can match an average
               | human driver in all conditions they're expected to
               | encounter over the vehicle's lifetime, or they're just
               | assisting the driver. If it's self driving except when
               | it's not, then it's just like a student driver. You
               | wouldn't mistake a student for a perfect driver just
               | because of their perfect safety record, given that the
               | supervisor corrects their every mistake.
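               | 
               | For a sense of scale, a rough Python sketch of the ground
               | covered during a takeover (the speeds and the 1 s
               | context-switch latency are assumptions):
               | 
               |   def takeover_distance_m(speed_kmh, takeover_s=1.0):
               |       # metres travelled before the human is driving
               |       return speed_kmh / 3.6 * takeover_s
               | 
               |   for v in (30, 50, 100):
               |       print(v, round(takeover_distance_m(v), 1))
               |   # 30 -> 8.3 m, 50 -> 13.9 m, 100 -> 27.8 m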
        
               | omilu wrote:
               | >situations appear and develop in less than 3 seconds
               | (think small child jumping in front of your car from
               | between parked cars)
               | 
               | I trust full self driving tech to handle this situation
               | better than a human driver more often than not. The
               | computer doesn't get distracted and has better reaction
               | time.
        
               | buran77 wrote:
               | > full self driving tech
               | 
               | Full self driving tech is presumably (and by definition)
               | fully able to handle things by itself. We're talking
               | about "partial self driving tech" (SAE L3) which relies
               | on the driver taking over when it doesn't know what to
               | do. The context switching for a regular human absolutely
               | kills the reaction time in these situations, making this
               | an "I'll mostly drive myself, but when I can't, you're
               | almost guaranteed to cause a crash" type of issue.
        
               | 6pac3rings wrote:
               | Using the Tiger Woods driving experience as the informal
               | industry use case standard to focus on is about as good
               | as the informal "two golf bags" carrying-capacity
               | standard.
        
               | drran wrote:
               | > Some situations appear and develop in less than 3
               | seconds (think small child jumping in front of your car
               | from between parked cars). You might only have 1s to
               | react and if you are not "plugged in" and fully attentive
               | at that instant you missed the window to react.
               | 
               | Yeah, humans are weak at this task, which leads to
               | numerous deaths on roads. Can we develop a tech for
               | this? Something that will watch the road and assist the
               | driver.
        
               | mavhc wrote:
               | Humans already kill 3000 people every day while driving
        
               | rayiner wrote:
               | Americans drive more than 3 trillion miles per year.
        
               | ben_w wrote:
               | Politics is emotional, and emotions do not always agree
               | with Utilitarian ethics.
               | 
               | Self driving doesn't just need to be safer, people also
               | need to _feel_ that it is.
        
               | pbhjpbhj wrote:
               | That's exactly the sort of task I'd expect an automated
               | vehicle to perform better on. If they can't beat humans
               | at that they're not ready for the public road.
               | 
               | Of course hand-off could be for liability purposes, like
               | 
               | Car: "I'm sorry Dave, I'm about to plow into a school,
               | this is your problem now"
               | 
               | Dave: [looks up briefly from his game of candy crush
               | before dying in a crash]
        
               | perl4ever wrote:
               | >That's exactly the sort of task I'd expect an automated
               | vehicle to perform better on
               | 
               | If the AI could talk, it wouldn't be so much "I'm sorry
               | Dave" as "I cannae change the laws of physics!"
               | 
               | An AI can _think_ about the situation for several
               | million instructions' worth, if it is running at >GHz
               | speeds, but it can't prevent a crash given a few
               | milliseconds, because of inertia. And for the same
               | reason, it's futile to hand off to a human.
        
               | jellicle wrote:
               | Press release: "While the self-driving feature of our car
               | had been engaged earlier in the drive, the driver was in
               | control at the time of the crash."
        
               | jeffreygoesto wrote:
               | The problem is, they can't do this today with the
               | necessary classification quality. Elaine was killed
               | because someone at Uber wanted these cars to drive so
               | badly. It is also about culture and prediction. If the
               | other participants behave differently from your
               | prediction, it can get hazardous quickly at highway
               | speeds. I can't find a better link [0], but maybe it's
               | good enough to get the point across. The term is
               | "warning dilemma" and means that you only have perfect
               | information at the time of impact. Before that, the
               | earlier you ask whether there will be a problem, the
               | worse the classification is. But asking early is
               | necessary for the human to have a chance to react.
               | 
               | [0] https://www.researchgate.net/publication/326568066_To
               | wards_a...
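               | 
               | A toy Python illustration of the dilemma (not the model
               | from the linked paper; the error-growth constant is made
               | up): the earlier you warn, the more often you are wrong.
               | 
               |   def p_false_alarm(horizon_s, error_growth=0.3):
               |       # stand-in: prediction uncertainty grows with the
               |       # warning horizon, saturating toward 1
               |       return 1 - 1 / (1 + error_growth * horizon_s)
               | 
               |   for t in (0.5, 1, 3, 5):
               |       print(t, round(p_false_alarm(t), 2))
               |   # 0.5 -> 0.13, 1 -> 0.23, 3 -> 0.47, 5 -> 0.6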
        
               | Aeolun wrote:
               | From what I remember, Elaine was mostly killed because
               | she stepped in front of a moving car in the middle of the
               | night.
               | 
               | I'm not convinced that's the best example, since the AI
               | might have actually done better if it had full control of
               | the car instead of defaulting to 'please take over now'
               | mode.
        
               | mbreese wrote:
               | If I remember correctly, the car's built-in system for
               | emergency braking in this scenario was disabled. So any
               | other car of that type would have avoided the accident.
               | However, because the AI apparently knew better, it was
               | disabled and someone died.
               | 
               | AI has the problem that not only does it need to observe
               | the environment and determine what to do, but it also
               | has to predict the actions of non-rational actors
               | (humans). We aren't always predictable. AI isn't always
               | predictable either. The woman who died may have assumed
               | that a car driven by a person would have stopped.
        
               | jeffreygoesto wrote:
               | Right. I could have stated it better; what I meant was
               | that the (safety) driver was not paying (enough)
               | attention to the road and was basically driving Level 3
               | (the car drives and it will beep whenever I need to take
               | over, so just relax and browse the web). It was
               | impossible to react on such short notice, and the
               | warning was _not_ given 10 seconds in advance. I
               | personally think the system sensed her as a succession
               | of standing targets and never predicted any velocity
               | vector. Braking for static radar targets is not done,
               | because with the clutter you get everywhere you would
               | not drive at all. So the misclassification and
               | misprediction were not detectable by the system (in its
               | world model, everything looked consistent) and it warned
               | too late.
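               | 
               | A toy Python sketch of that failure mode (not Uber's
               | actual stack; the numbers are invented): if each
               | detection starts a fresh track instead of being
               | associated with the previous one, the estimated velocity
               | stays zero and the target looks static.
               | 
               |   positions = [30.0, 28.6, 27.2]  # range per cycle, m
               |   dt = 0.1                        # radar cycle time, s
               | 
               |   # associated track: velocity from successive positions
               |   pairs = zip(positions, positions[1:])
               |   vels = [(b - a) / dt for a, b in pairs]
               |   print(vels)  # about -14 m/s each: closing fast
               | 
               |   # re-initialized each cycle: no history, so v = 0,
               |   # indistinguishable from static clutter -> no braking
               |   print([0.0] * len(positions))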
        
               | scrose wrote:
               | Who would the car optimize for? Saving the kid or saving
               | the driver from getting rear-ended or swerving into an
               | object?
        
               | Asooka wrote:
               | See kid => hit brakes. This is already way better than a
               | driver who doesn't notice the kid and doesn't hit the
               | brakes. The inside of the car is very well designed to
               | keep the occupant alive in case of collision, the car
               | should prioritise not hitting other people. I hope self-
               | driving technology will at some point be regulated
               | regarding what the car should attempt to do in those
               | situations. With more vehicles having collision avoidance
               | technology, the car behind you won't rear-end you because
               | its AI will react fast enough while also refusing to
               | drive dangerously close to the car in front.
        
               | mannykannot wrote:
               | This sort of _assistance_ is an excellent use of current
               | technology, and a way to move forward to true autonomy.
               | What needs to stop here is the two-faced, fatally
               | ambiguous marketing, and if manufacturers cannot act
               | responsibly, regulation will be needed (more than the
               | current regulation for responsible driving, which only
               | kicks in after the damage is done.)
        
               | Gravityloss wrote:
               | The systems probably already have some kind of an
               | estimate of "surprise possibility in the next 3 s".
               | 
               | For example, on a narrow road with a building right at
               | the corner, anything might happen as you go around it.
               | You don't know what's around the corner. So the system
               | will either go slowly enough that there's ample warning
               | time, or require driver attention before attempting the
               | turn, because the driver needs to be ready to react fast
               | to potential new information.
               | 
               | In contrast, a wide highway is an easier-to-estimate
               | environment, visibility is better, etc. So as long as
               | other cars are far away and the velocity differences are
               | not huge, the driver can stay inattentive.
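               | 
               | One way such an estimate could look, as a rough Python
               | sketch (the 5 m/s^2 deceleration and 1 s reaction time
               | are assumptions): cap speed so the car can always stop
               | within its sight distance.
               | 
               |   import math
               | 
               |   def max_speed_kmh(sight_m, decel=5.0, reaction_s=1.0):
               |       # solve v*t + v^2/(2a) = sight distance for v
               |       a, t = decel, reaction_s
               |       v = a * (-t + math.sqrt(t * t + 2 * sight_m / a))
               |       return v * 3.6
               | 
               |   print(round(max_speed_kmh(10)))   # ~22 km/h, blind corner
               |   print(round(max_speed_kmh(150)))  # ~123 km/h, open highway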
        
               | buran77 wrote:
               | The challenge self driving cars face is that being
               | "overall better" is probably not enough. Regressions in
               | certain segments would hardly be acceptable for anyone.
               | You can't tell people "FSD cars lowered overall deaths by
               | 10% but we now kill 30% more toddlers because [technical
               | reason]", or that "highway deaths are going up", or
               | basically any segment that anyone could reasonably care
               | about. Cars have to match and exceed human drivers in
               | every category for them to be acceptable, and even 99%
               | there may not be enough.
        
               | sorbits wrote:
               | Another point is that there is a psychological difference
               | between sitting behind the wheel in a vehicle knowing
               | that around 10 in 100,000 people get themselves killed
               | each year, and then sitting in a machine that kills 10
               | out of 100,000 people each year.
               | 
               | In the first situation, you feel in control, and at
               | least think that you can do what is necessary to avoid
               | ending up in the statistics.
               | 
               | Personally I would be very skeptical about trusting my
               | life to a machine that has a non-zero chance of getting
               | me killed, even if the machine, on average, performs
               | better than a human, because most of us think that we are
               | better than the average driver.
        
               | Aeolun wrote:
               | Unless, you know, there's a reflective truck passing by.
               | Or faulty recognition of lines on the road steers you
               | straight into a barrier.
        
             | madamelic wrote:
             | This is the blue wart in the green sea.
             | 
             | The system gives you more power and you'll kill yourself.
             | Similar situations with horse-drawn carriages are not as
             | lethal. As far as I remember, Volvo declined to use
             | horseless carriages years ago, which is the only grown-up
             | and responsible answer to this problem.
        
               | beanders wrote:
               | I do not see how this gives more power to the driver.
               | Decreasing responsibility is not equivalent.
               | 
               | Edit: sp
        
             | sitkack wrote:
             | Volvo is absolutely making the right call here. There is
             | too much irrational exuberance around SDC.
             | 
             | The obvious answer is to apply AI to the entire system, not
             | just the car navigating the road
             | 
             | * Is the driver paying attention, how much attention should
             | they be paying verses the conditions?
             | 
             | * What is the environment like, how should I react, refuse
             | to drive? Go slowly? Honk?
             | 
             | We already have self driving vehicles; they are called
             | mules. Why not make artificial mules instead of an
             | artificial Pole Position? The car should be an active
             | participant, more of a copilot than a blind automaton.
             | 
             | Self driving cars should be _shown_ to have passed a
             | repeatable rigorous adversarial gauntlet before being
             | allowed on the road. And the individual cars should have to
             | be re-certified every 6 months by retaking the test.
        
               | dm319 wrote:
               | As a side note - I think humans should require retaking
               | their test on a regular basis. Maybe every 5 years, and
               | just a short 'refresher' test on the way home from work.
               | It would help find those who simply can't drive safely
               | anymore, and may dissuade others who don't think it's
               | worth going through that process. It would also take a
               | lot of stress and anxiety out of the problem for
               | families who know that mildly demented grandad really
               | shouldn't be on the road because he can hardly see at
               | night, but find it hard/impossible to actually do
               | something about it.
        
               | leetcrew wrote:
               | there's a deeper problem here in the US. a large portion
               | of the workforce needs to drive to work. a lot of these
               | people are not very safe drivers, through some
               | combination of incapability, unwillingness, or ignorance.
               | it's hard to strike a good balance between keeping unsafe
               | drivers off the road and not upending people's
               | livelihoods.
        
               | kmonsen wrote:
               | Also the driving test is a joke in the first place so
               | there are a ton of unsafe drivers.
        
               | leetcrew wrote:
               | I'm sure this varies state-to-state. I wouldn't say my
               | driving test was a joke (in the sense of easy), but the
               | hard parts had very little to do with driving safely. the
               | whole thing took place in a parking lot at 5 mph. I
               | failed the test my first two times because it was really
               | difficult to perfectly parallel park my dad's massive
               | truck.
               | 
               | I think it would be okay to make the initial test a
               | little harder, and ideally more focused on real world
               | driving safety. the real problem is if you let someone
               | have a license, they plan their life around it, and then
               | you yank the rug out from under them because they
               | couldn't back up 100 feet in a straight line ten years
               | later.
        
               | j1dopeman wrote:
               | In my state, the test never touches the highway or any
               | busy roads so you don't have to learn how to merge, stay
               | in your lane, drive at speed, etc. Actually, most
               | highways around here explicitly prohibit permit
               | holders. I personally
               | know more than one person who had their first experience
               | on the highway after they had a license and it was an
               | absolute disaster. They were white knuckled, driving
               | 20-30mph under the limit and every merge was a close
               | call. I think the only thing they actually test is
               | whether you can parallel park or not.
        
               | fuzzer37 wrote:
               | Maybe getting to work should be an incentive for
               | practicing safe driving. You can't get to work if you
               | crash your car either.
        
               | leetcrew wrote:
               | sure, that's not unreasonable. but with the current
               | situation in the US, you would need to drastically expand
               | welfare, public transit, or both if you really want to
               | lean into the "driving is a privilege" idea.
        
           | csharptwdec19 wrote:
           | It's because they know that self-driving cars are the way to
           | guarantee a revenue stream for the life of the vehicle.
           | 
           | Screw actual utility; this is about revenue, plain and
           | simple.
        
           | craftinator wrote:
           | > talk about things like "autopilot" instead of "driver
           | assist"
           | 
           | I actually find driver assist technologies almost as
           | damaging. For example, lane assist (car centers itself in
           | lane) can cause the car to veer when lanes are mis-painted.
           | After having it turned on for a few minutes without issues,
           | you begin to relax, then the car decides to take an off-ramp
           | or swerve towards the shoulder.
           | 
           | As a comparative example, let's look at automatic gearboxes
           | vs manual. As far as I know, there are no auto gearboxes
           | that will occasionally require you to hit the clutch in
           | order to successfully change gears. They're either
           | completely auto, completely manual, or auto with manual
           | override. Having something halfway between automatic and
           | manual is just ASKING for problems; that being said, an auto
           | gearbox that is expected to work only 99% of the time also
           | doesn't really exist, which is why self driving is hard.
           | 
           | I'd much rather have better alerting and imminent-danger
           | capabilities: more ubiquitous and accurate blind spot
           | sensors, lane change cameras, brake alarms (or even auto-
           | braking, if it's highly reliable). These technologies will
           | allow the overall self-driving landscape to improve (because
           | there's so much overlap; self-driving NEEDS all of these
           | sensing techs, and needs them to be REALLY good in order to
           | work) and mature over time, while road infrastructure,
           | mapping, legislation, and public opinion catch up.
        
         | FireBeyond wrote:
         | Sounds a lot like Tesla's marketing.
         | 
         | "Summon your car while dealing with a fussy child*."
         | 
         | (* do not summon your vehicle while distracted, maintain
         | constant visual contact with the vehicle)
         | 
         | "Full self driving capability*"
         | 
         | "* vehicle has all equipment capable of self driving, but may
         | be limited by laws or regulations in your area"
         | 
         | and so on. Tesla is full of nudge-nudge wink-wink and
         | disclaimers that walk back their ledes, so what exactly is
         | problematic about Honda's statement about limitations?
        
         | samstave wrote:
         | I wonder if cars will be able to be put into an "overly
         | cautious mode" for any of the following: inclement weather,
         | children inside, or inebriated passengers who may not be in
         | any condition to respond to situations. Also a "HOSPITAL NOW"
         | voice function and a "POLICE NOW" voice function, as well as
         | other one-word voice commands: "Car, Home", "Car, Work", etc.
        
         | loeg wrote:
         | It's super weird to only sell 100 of these in a single market.
         | If it does what they say it does, they should be able to make a
         | killing as the first mover. Just more reason to be skeptical.
        
           | NotSammyHagar wrote:
           | I'm guessing it's more of the idea that Japan will tolerate
           | 100 possibly dangerous vehicles to further their engineering
           | goals.
        
             | Erlich_Bachman wrote:
             | That's still incredibly sketchy.
             | 
             | What any sane engineer would do is keep testing these cars
             | with a safety driver and gather enough information/miles
             | to claim higher statistical safety than a human driver,
             | while verifying that those miles were in fact driven
             | without intervention (or with predictable and managed
             | intervention). Once they had those numbers, they would
             | know for sure that the car is safe and could
             | release/manufacture hundreds of thousands/an unlimited
             | number of units to the market. If they didn't, it means
             | either the car really isn't tested or it doesn't do what
             | they claim it does.
             | 
             | Releasing 100 vehicles just doesn't make sense, unless the
             | claims are somehow shady or misrepresentative.
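             | 
             | A rough Python sketch of the statistics involved
             | (assuming, roughly, the US rate of ~1 fatal crash per 100M
             | miles; the 95% confidence level is also an assumption):
             | 
             |   import math
             | 
             |   # How many crash-free miles before we can say, with 95%
             |   # confidence, the car is at least as safe as humans?
             |   # Under the human rate, P(0 crashes) = exp(-miles*rate).
             |   human_rate = 1 / 100_000_000
             |   miles = -math.log(0.05) / human_rate
             |   print(f"{miles:.1e}")  # ~3.0e+08 crash-free miles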
        
           | rodgerd wrote:
           | No different to the Toyota Century, or the Nissan President,
           | or (to a lesser degree) Alphard etc. The major Japanese car
           | makers have quite a few vehicles (or options) that don't get
           | exported, but whose features show up later in a Lexus or
           | similar.
        
             | fomine3 wrote:
             | The Toyota Century/Crown/Alphard/etc. are cars designed
             | specifically for Japan, but the Honda Legend isn't. It
             | doesn't sell well there; I believe its primary market is
             | the US or China (where it's sold as the Acura RLX).
        
               | apocalyptic0n3 wrote:
               | Worth noting the Acura RLX has been discontinued with no
               | successor planned. The Legend will continue to be sold in
               | select markets, but as of right now, plans have only been
               | announced for Japan.
        
           | jfk13 wrote:
           | I'm not sure "make a killing as the first mover" is the best
           | choice of wording when the topic is self-driving cars.
        
           | fomine3 wrote:
           | Is a very careful initial launch for such a product weird? I
           | don't think so.
        
           | Geee wrote:
           | It's just a marketing piece. Tesla's FSD is way more
           | advanced than just a "traffic jam pilot", but they don't
           | market it as level 3. That's low-hanging fruit for marketing.
        
             | therouwboat wrote:
             | I guess you are technically right, since Elon promised
             | level 5 last year.
        
               | FireBeyond wrote:
               | "Full self driving, coast to coast, this year, no hands
               | on the steering wheel!"
               | 
               | He also promised it by the end of 2018.
               | 
               | Oh, and before that, 2016.
        
               | Geee wrote:
               | Who cares? A few years here and there don't matter.
               | Here's a video of the current FSD beta driving from San
               | Francisco to Los Angeles without driver intervention:
               | https://www.youtube.com/watch?v=dQG2IynmRf8
        
               | FireBeyond wrote:
               | I hear this repeatedly from people. "Who cares, it's
               | marketing, he's cheerleading", as if Elon's quotes don't
               | have a material impact on Tesla's stock price. They'll
               | then cry and complain about "the shorts, the shorts", and
               | how unfair it is that they're having an impact on stock
               | prices.
               | 
               | Coast to coast means places like Iowa, Pittsburgh in
               | winter, downpours, poor roads.
               | 
               | Not LA to SF on a perfect, cloudy (no sun glare) day on
               | a well-maintained interstate. Or a 14-minute video from
               | a Tesla fan site that says "Here's a brief video of an
               | 8-hour drive where, we swear, there was no intervention".
        
           | FPGAhacker wrote:
           | It's possible they are selling these at a loss to seed or
           | whet the market. If it takes off then maybe it will reappear
           | at a profitable price.
        
           | rsynnott wrote:
           | > If it does what they say it does, they should be able to
           | make a killing as the first mover
           | 
           | Of course, if it _doesn't_ do what they say it does, they
           | will, ah, also make a killing. Or many killings. You can
           | understand the caution.
        
         | bayindirh wrote:
         | WRC has (or had) a rule for car homologation which required at
         | least 100 road-going copies to be made for some higher-end
         | tiers (this is why we have the Lancer Evo & Impreza WRX on the
         | roads). There may be another rule/law in the automotive
         | industry where 100 units are considered "serial production".
        
         | mkl95 wrote:
         | They really are making it seem like it's level "2.5" self-
         | driving that was rounded up to 3.
        
         | divbzero wrote:
         | Perhaps it's no coincidence both examples involve the
         | navigation system? Makes it easier to flash an alert and return
         | the driver's attention to the road.
        
           | stubish wrote:
           | I would have thought some sort of display on or in the
           | windscreen, but maybe that is too distracting and causes
           | panic?
        
             | spullara wrote:
             | Sound is way better, especially if you aren't already
             | looking out the windscreen.
        
         | carlsborg wrote:
         | Solving self-driving in dense slow-moving traffic on narrow
         | city roads is a simpler problem than on high-speed freeway-
         | style roads, mainly because the risks are much lower - you
         | won't get injured at 20 mph in modern cars. Dings and dents
         | can be compensated relatively cheaply.
         | 
         | Much of urban Europe with pre-industrial-revolution cities and
         | part of urban Asia is slow moving traffic. Slow, compared to
         | California/Atlanta/Houston non-rush-hour freeways.
        
           | kwhitefoot wrote:
           | Surely not. Motorway driving is very simple compared to
           | urban driving. The consequences of a collision are probably
           | less serious in slow moving traffic, but the risk of them
           | happening is much higher. And the density of traffic and of
           | pedestrians makes the job much more complicated.
        
         | helsinkiandrew wrote:
         | > It does all this with zero input from the driver
         | 
         | Isn't this different from things like Tesla's autopilot, in
         | that in certain conditions the Honda will take full
         | responsibility for driving and the driver can concentrate on
         | other things, but needs to be ready when the car tells them to
         | take control/responsibility? I'm guessing these conditions
         | could be quite specific and limited - like traffic jams and
         | freeway driving when the car is effectively in a "convoy".
        
         | Melting_Harps wrote:
         | > Is 100 units really a "production car"? I don't agree.
         | 
          | Considering you couldn't even get into Group B rally
          | homologation with those numbers (200) in the 80s, or Dakar
          | race-spec production (2500) in the 90s-2000s, I don't think
          | it is, either.
         | 
         | > It's possible they are selling these at a loss to seed or
         | whet the market. If it takes off then maybe it will reappear at
         | a profitable price.
         | 
          | Exactly. It seems more like a mass-produced prototype, or a
          | proper limited-edition model, and if I'm honest this looks
          | exactly like an Accord/Insight with all the JDM goodies we
          | don't get in the West. But this is about marketing.
         | 
          | I wish them well. Honda (auto) has a ton of muck on their
          | face due to their failure to, once again, make any inroads
          | in Formula 1 after their epic success with McLaren in the
          | 80s -- this last time around was so hard to watch, and the
          | mere utterance of 'GP2 engine' will live in infamy forever.
          | F1's raison d'etre, for Honda, is to trickle technology down
          | to the production car, but with V6 turbos and KERS systems
          | pretty much maxed out, they need something like this to
          | justify their R&D budgets.
         | 
          | I'm not sure how to feel about it, to be honest: had this
          | been on a limited-edition EV Insight it would definitely be
          | a step in the right direction. Whereas this will fall into
          | obscurity as more manufacturers move to EVs.
        
           | ohgodplsno wrote:
           | > you couldn't even get into Group B rally homoligation with
           | those numbers (200)
           | 
           | Unless your name is Lancia and you move 100 cars from one
           | parking spot to the other while you're treating the officials
           | to a lunch with loooots of wine.
        
             | Melting_Harps wrote:
             | > Unless your name is Lancia and you move 100 cars from one
             | parking spot to the other while you're treating the
             | officials to a lunch with loooots of wine.
             | 
              | I really wish Clarkson would do more of these kinds of
              | stories; the trio would be the ideal team to make a
              | docu-series about the Motorex and GTR fiasco!
              | 
              | It's such an insane story that I cannot believe they're
              | on the 100th version of Fast and Furious but this story
              | has never been told.
             | 
             | Having lived through it myself, I was in the early drifting
             | community in SoCal back in 2002 and a regular on Fresh
             | Alloy/NICO since 2000. I even saw a few of those cars that
             | got sold at the old meetups at Life Plaza before the canyon
             | runs, the Bee*R rev limit kits on those R chassis [0] were
             | so absurd back then when they first came out, they sounded
             | like the meanest rally cars.
             | 
              | My memory is fuzzy after all these years, and I can't
              | remember if it was Big Bird or Blackbird, but we saw it
              | being tuned on the freeway in LA doing high-speed wangan
              | runs, and the helicopters may or may not have showed up.
              | Good times.
             | 
              | I'm going to miss the ICE days because of this, but...
              | they're dinosaurs, and we can always tell the story from
              | the glory days. I'm just glad I was born when I was,
              | because I think that is a culture that has since peaked
              | and is now in a very sad descent. Other than Tesla and
              | Rimac, I don't see anyone even trying to make EVs
              | anything more than a soulless, utilitarian computer
              | enclosure with wheels to get you from point A to B.
             | 
             | 0: https://www.youtube.com/watch?v=7_pUReXK3zM
        
         | m463 wrote:
         | I can think of a few reasons to be japan only:
         | 
         | - probably easier to have a japan-only dataset
         | 
         | - probably easier to get engineers to work on this at
         | headquarters
         | 
         | - japan highway markings might be more predictable than say 50
         | states all with subtly different rules and state of repair
         | 
         | - us liability law
        
         | jackson1442 wrote:
         | The second quote seems to insinuate that you should remain in a
         | state where you can drive and can use the context of your
         | surroundings. For example, if you see flashing lights, you will
         | likely get a handoff request because there are emergency
         | vehicles which suggests an unpredictable situation.
         | 
          | To me, it appears to mean "don't sleep or 'drive'
          | intoxicated." The car can presumably predict when it needs
          | to hand off with enough notice to allow you to regain the
          | full focus needed to drive.
        
           | Griffinsauce wrote:
           | > The car can presumably predict when it needs to handoff
           | with enough notice to allow you to regain the full focus
           | needed to drive.
           | 
           | This is the critical assumption. And:
           | 
           | > suggests an unpredictable situation.
           | 
           | All driving situations are unpredictable. From the kid
           | running into the street after their ball to the drunk guy in
           | a truck careening into the same crossroad.
           | 
           | We can never reach this dream of "watching a movie while the
           | car drives" without railroading our roads in some form or
           | agreeing on a new distribution of risk.
        
         | dubeye wrote:
        | This is where Tesla is underestimated, I think. There is a huge
         | difference between a high volume, sellable product, and a
         | highly priced concept designed for PR. You could argue Tesla
         | has both, but at least they are attempting to place this
         | technology into the hands of consumers.
         | 
        | The unavoidable penalty for this is a longer period of driver
        | responsibility, and perhaps never reaching level 3. It's a
        | trade-off Tesla buyers seem willing to accept.
        
         | rsj_hn wrote:
         | I really think this is an example where the litigious nature of
         | the US is going to drag on self-driving efforts. There is
         | simply no way to deploy this tech without learning from
         | crashes, and there will be crashes, people will get killed, and
         | the US market is the last place where you'd want to be a
         | manufacturer if that happens.
        
           | JumpCrisscross wrote:
           | This isn't an issue. Congress can create safe harbours if it
           | becomes a problem.
        
             | rsj_hn wrote:
             | That's like saying Congress can reform our liability laws
             | and turn it into more of a Japanese system.
             | 
              | Well, OK, sure, it's a _possibility_, but the reason
              | why that won't happen is that we as a nation are much
              | more litigious and expect to have a right to sue. We
              | love to blame big business and make them pay.
             | 
             | People have been clamoring for reform of medical liability
             | for decades - still no progress on that front, either.
             | 
             | Safe harbours are not politically popular here, and you
             | tend to get them only in very niche areas that are below
             | the radar of most people. Auto accidents and traffic injury
             | law, not so much.
        
               | kaba0 wrote:
                | The US is one of the most pro-company countries I can
                | imagine. The amount of lobbying that benefits only Big
                | Co is ridiculous from a European point of view - I
                | don't see it as a negative to even increase the
                | liability of companies. They should very well be
                | responsible for everything they do, and it should not
                | cost/risk innocent lives to further a private
                | company's profits without public benefit!
        
               | ALittleLight wrote:
                | Haven't there already been self-driving car accidents
                | that killed people? And also - is the US behind in
                | the self-driving race?
        
               | spullara wrote:
                | Fewer than cars driven by people, and no.
        
               | kaba0 wrote:
                | Rocket launchers kill far fewer people than guns, but
                | I doubt it is reasonable to conclude that the former
                | is less dangerous/deadly...
        
               | RcouF1uZ4gsC wrote:
               | Do you know how many lawyers are in Congress or donate to
               | politicians? Trial lawyers as a group are big donors to
                | the Democratic Party. There is no way Congress is
                | going to do anything to make it harder for lawyers to
                | sue people.
        
               | upbeat_general wrote:
                | Probably true, but I'd think Google, GM, and every
                | other company working on this tech - which is probably
                | going to be a $100B+ industry - would have at least
                | some influence here.
        
               | nradov wrote:
               | Several US states have actually imposed caps on medical
               | malpractice damages.
        
           | jcims wrote:
           | People have already been killed. I just don't see it as that
           | big of an impediment.
        
         | teruakohatu wrote:
         | > Is 100 units really a "production car"? I don't agree.
         | 
         | Sounds more like a beta test.
        
           | dexterdog wrote:
            | Tesla has been doing a much larger beta test and charges
            | $9k for the sign-up.
        
             | [deleted]
        
             | anonymousiam wrote:
             | I felt like Tesla did a cash grab when they recently
             | offered an upgrade from "auto pilot" (which I pre-ordered
             | on the forthcoming Cyber Truck) to "full auto". This
             | upgrade raised my estimated delivery price by $10k. I went
             | ahead and did it, but WTF? My initial estimate already
             | included the $9k for "auto pilot" so now I'm paying $19k
             | for self driving?
        
               | jiofih wrote:
               | Autopilot is included in the base price, it has never
               | cost 9k. The FSD price went up from 7k to 10k last year.
        
               | totalZero wrote:
               | I loled at this, not at all because I find joy in your
               | situation, but because it's so difficult to picture (A)
               | someone who buys a cybertruck being price sensitive, or
               | (B) someone who is like "well, ok I guess" in the face of
               | a $9k price hike on a truck.
        
         | greenyoda wrote:
         | > I'm assuming if it won't come to the US it won't come to the
         | UK, Canada or Australia either.
         | 
         | Since Japan drives on the left side of the road[1], a car
         | manufactured for the Japanese market would have its steering
         | wheel on the opposite side of a US or Canadian vehicle. It
         | would be OK for the UK or Australia, which also drive on the
         | left, but the cars can only be leased for now[2], and the
         | provisions of the lease would probably disallow taking the car
         | to another country.
         | 
         | [1] https://en.wikipedia.org/wiki/Left-_and_right-
         | hand_traffic#W...
         | 
         | [2] https://asia.nikkei.com/Business/Automobiles/Honda-
         | launches-...
        
           | TheRealSteel wrote:
            | I was more getting at the idea that if they aren't
            | interested in the US market, they're unlikely to be
            | interested in the other ones I mentioned either, rather
            | than commenting on sidedness.
           | 
           | Yes, technically it's easier to bring this car to other RHD
           | countries but I'm assuming Honda has plenty of manufacturing
           | capacity to do that if they want to. It just seems like they
           | don't want to.
        
           | 1e100 wrote:
           | It's rare, but I have occasionally seen left hand drive cars
           | in Australia
           | 
           | https://www.qld.gov.au/transport/registration/register/left
        
           | Waterluvian wrote:
            | I'm not sure, but I bet it's perfectly legal to drive a
            | right-hand-drive car in North America.
        
             | loeg wrote:
             | Mail carriers often have right-hand drive in NA.
        
             | rsj_hn wrote:
              | Yes, it is, but you can't sell a right-hand-drive car
              | here. Individuals have to import it, but sometimes this
              | happens.
        
               | olyjohn wrote:
               | There are no laws against selling right hand drive cars
               | in the US.
        
               | adolph wrote:
               | Seems to be a lively market: http://www.texasjdm.com/
               | 
               | Jeep has a RHD Wrangler for 2021 and I'm not certain if
               | they ever paused making them.
        
               | sjwalter wrote:
               | Your comment elides that Jeep exclusively markets these
               | (afaik) to USPS for rural delivery drivers. Though they
               | are widely available in the secondary market, are you
               | sure they sell them directly to consumers?
        
               | Spooky23 wrote:
               | Rural mail carriers are contractors who buy their own
               | vehicles. Anyone can order that jeep or a RHD Subaru.
        
               | sjwalter wrote:
               | Great to know, thanks.
        
             | fomine3 wrote:
              | Possibly the self-driving system on a car sold in JP/UK
              | is optimized for left-side traffic, and vice versa.
        
             | [deleted]
        
         | lastofthemojito wrote:
          | Honda seems like a very conservative company. It's released
          | a couple of low-volume "production" cars as it's dipped its
          | toes into electric cars as well. The Honda Fit EV was a real
         | production car, available to the general public (and I did see
         | a couple of them on the roads in the US). But it was only
         | available to lease, and Honda only leased about 1000 of them
         | IIRC. Seems like a hedge - it gets them some experience with
         | running an electric vehicle program, and data on how their
         | vehicles do in the wild, but in a limited enough number that
         | there won't be any serious harm to their reputation if the
         | vehicles are deficient in some way.
         | 
         | So where Tesla might want to put this sort of tech into the
         | hands of as many customers as possible, as soon as possible,
         | Honda is being Honda and entering the water very slowly.
        
         | sudosysgen wrote:
         | It seems to make sense. You can be distracted, but should be
         | able to take back control when the car asks you to.
        
           | throwawayboise wrote:
           | That doesn't really work. It will take seconds for you to
           | orient yourself to what is going on if you're not paying
           | attention. At 70 MPH that's about 200-300 ft of travel.
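            | 
            | A back-of-envelope check (a rough sketch; the 2-3 s
            | reorientation time is the assumption above, not a measured
            | figure):
            | 
            |     MPH_TO_FPS = 5280 / 3600   # feet per second, per mph
            |     speed_fps = 70 * MPH_TO_FPS          # ~102.7 ft/s
            |     for t_s in (2.0, 3.0):               # assumed seconds to reorient
            |         print(f"{t_s:.0f} s at 70 mph = {speed_fps * t_s:.0f} ft")
            |     # 2 s -> ~205 ft, 3 s -> ~308 ft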
        
             | adrr wrote:
              | Level 3 (eyes off) can perform emergency actions and
              | doesn't require drivers to pay attention. It will handle
              | all immediate-response situations. Tesla is level 2
              | (hands off) because it will run into things like
              | stationary fire trucks.
        
               | ecpottinger wrote:
                | I think you have answered the question of why they are
                | only releasing 100 cars. Tesla has a million cars on
                | the road, yet everyone talks about the car that runs
                | into a truck or fire engine.
                | 
                | Considering the odds, you would likely see no
                | accidents if you looked at just 100 Tesla cars.
                | 
                | This would be more impressive if they put 100,000 cars
                | on the road and we then saw the stats.
        
               | kwhitefoot wrote:
               | > everyone talks about the car that runs into a truck or
               | fire-engine.
               | 
               | Not to mention that plenty of non-self driving cars run
               | into stationary vehicles on both motorways and ordinary
               | roads.
               | 
               | A fireman I spoke to says it happens all the time and
               | that the fire engine is parked upstream of the incident
               | that they are dealing with so that errant vehicles run
               | into it rather than the fire crew.
               | 
                | It is a matter of lively debate in the UK right now
                | with regard to 'Smart Motorways', which have no hard
                | shoulder; several people have been killed in
                | collisions with stationary vehicles.
               | 
               | In fact even on motorways with hard shoulders a number of
               | people have died because they stopped on the hard
               | shoulder because of a breakdown and another vehicle
               | strayed onto the hard shoulder and rear ended them at 70
               | mph.
        
               | adrr wrote:
                | As an owner of a Tesla with FSD: my car would have run
                | into a stalled car in the carpool lane, and has tried
                | to change lanes into a concrete barrier. Tesla is
                | level 2 and needs your attention to be safe.
        
               | kaba0 wrote:
               | I would prefer roads that are not the test lab of some
               | dystopian experiment for some rich people.
        
               | jiofih wrote:
               | You might be disappointed to learn that human drivers run
               | into firetrucks way more often than Teslas.
               | 
               | The current rate as of October is:
               | 
               | Tesla Autopilot Accidents: 1 out of 4,530,000 Miles; US
               | Average: 1 out of 479,000 Miles
        
               | adrr wrote:
                | Most accidents happen where Tesla Autopilot isn't
                | being used, like city driving and exiting and entering
                | the freeway. Sitting in one lane on the freeway is the
                | safest thing you can be doing, and any car with lane
                | centering and adaptive cruise control can do it.
                | That's why those numbers are a joke. I use Autopilot
                | every day on a 45-mile commute, and my only accident
                | was entering the freeway, where my car and another car
                | merged into the same lane.
        
               | [deleted]
        
               | jiofih wrote:
               | Nothing better than A/B testing within the same
               | population, right?
               | 
               | > In the 4th quarter, we registered one accident for
               | every 3.45 million miles driven in which drivers had
               | Autopilot engaged. For those driving without Autopilot
               | but with our active safety features, we registered one
               | accident for every 2.05 million miles driven. For those
               | driving without Autopilot and without our active safety
               | features, we registered one accident for every 1.27
               | million miles driven.
               | 
               | Even if you assume most accidents happen in that last
               | group, city driving with AP off, that ratio (3.45:1.27)
               | is better than the overall estimated proportion of city
               | vs highway accidents for all vehicles, at 2.3:1. At a
               | minimum, AP is making highway driving 10-20% safer, and
               | obviously not causing any new city accidents when it's
               | off.
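                | 
                | A quick sanity check of that ratio arithmetic (the
                | mileage figures are the ones quoted above; the framing
                | of the comparison is mine, not Tesla's):
                | 
                |     miles_per_accident = {
                |         "AP on": 3.45e6,
                |         "no AP, with safety features": 2.05e6,
                |         "no AP, no safety features": 1.27e6,
                |     }
                |     ratio = (miles_per_accident["AP on"]
                |              / miles_per_accident["no AP, no safety features"])
                |     print(f"AP vs. bare baseline: {ratio:.2f}:1")
                |     # ~2.72:1, vs. the ~2.3:1 city:highway split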
        
               | adrr wrote:
                | AP doesn't switch lanes, so you turn it off during
                | those events. It's not an A/B test, since AP is used
                | in a very specific scenario, while a person drives in
                | the situations AP is incapable of handling, like
                | dealing with a stalled vehicle on the road.
        
               | kaba0 wrote:
                | How confident are we in the first number? Accidents
                | usually do not just happen out of nowhere on a
                | straight road; they happen when something unexpected
                | occurs. How many unexpected cases were recorded in the
                | first number vs. the second? Simply because of the
                | number of Teslas on the road, I doubt there were many.
        
               | jiofih wrote:
               | That's where the "per mile" part comes in.
        
               | FireBeyond wrote:
               | We're absolutely not, but Tesla fans love to repeat this
               | quote, no matter how many times it is explained to them.
               | 
               | You cannot at all equate "subset of miles where AP is a
               | possibility, be it good conditions/road/weather", where
               | AP will turn off or not be available because it can't
               | work, with "all the miles driven by humans in all
               | conditions", and say in any way with a straight face,
               | "Look, safer!". But that's what Elon, and a large number
               | of people do.
        
               | upbeat_general wrote:
                | Not sure how this factors in, but Autopilot primarily
                | works only on the highway.
                | 
                | It's certainly possible that city driving is safer and
                | also more complex to automate, but it's something that
                | needs to be considered.
        
               | jiofih wrote:
                | The ratio of city vs. highway accidents is 2.3 to 1.
                | At worst the Tesla number is cut in half.
        
         | stubish wrote:
         | The bit about watching DVDs is for the traffic jam mode, where
         | it is crawling forward locked to the car in front of you.
         | Pretty much the same as automatic parking features. Thankfully
         | they won't recommend having a snooze though, since you will
         | need to take control at the end of the jam or at an
          | intersection. The warning about overestimating capabilities
          | applies to the rest of the features, where failing to
          | respond to a request to take control will make the car
          | assume you are asleep and perform an emergency stop. It
          | seems to be a bunch of apps that you have to select and
          | turn on/off manually, rather than 'autonomous driving';
          | i.e., it assists you in driving, rather than you assisting
          | the car to drive itself.
        
           | dmingod666 wrote:
           | First question is, where do you buy a DVD? Second question,
           | where do you put it in?
        
             | wil421 wrote:
             | You put it in your Honda Legend of course!
        
       | The_rationalist wrote:
        | Does it work in rain, fog, snow, and at night?
        
       | ricardobayes wrote:
        | I have worked for one of the leading self-driving software
        | companies. I think self-driving is now going through the same
        | tech curve as all new tech out there: we have passed the peak
        | of inflated expectations (it will work everywhere, cars won't
        | even have steering wheels), have gone through the trough of
        | disillusionment (crashes, moral dilemmas, companies going bust
        | or merging because expensive research didn't yield enough
        | fruit), and are now on the slope of enlightenment: limited
        | release of limited functionality. Self-driving is essentially
        | an insurance problem: instead of the drivers, the car brands
        | (or their OEMs) need to assume responsibility. This is
        | probably not going to fly with the current car pricing model.
        
       | cryptoz wrote:
        | Confusing... Honda specifically suggests you watch a DVD (???)
        | while also asking you to pay complete attention to your
        | surroundings.
        | 
        | I'm not sure this messaging is safe, much less the car itself.
        
         | ben_bai wrote:
          | It lets you watch a movie in "traffic jam mode", i.e. when
          | the vehicle is stopped or moving slowly because of heavy
          | traffic. As soon as it's moving faster, you are not allowed
          | to watch a movie, because you need to watch the road.
          | 
          | Edit: it's a joke. Level 3? No way.
         | https://www.youtube.com/watch?v=_hBwmFbpNCA
        
         | Danieru wrote:
         | > while also asking you to pay complete attention to your
         | surroundings
         | 
          | No. That is the true benefit of Level 3: you do not need to
          | pay attention. The car is paying attention and will notify
          | you when you must retake control.
          | 
          | Tesla's Level 2 "autopilot" will not tell you before it
          | drives into a wall. Tesla drivers must constantly watch the
          | road and ask themselves, "Do I think my car is trying to
          | kill me?" When they think, "Yeah, my car appears to be
          | trying to drive under that parked semi-truck. I don't think
          | we fit, I better take over."
          | 
          | Meanwhile, Honda's Level 3 will say: "Please take over. I
          | see a semi-truck and I don't think we'll fit."
        
           | throwawayboise wrote:
           | L2: If you're not paying attention, you'll run into the semi
           | and be dead before you even know it.
           | 
           | L3: Your car will warn you and you'll have enough time to
           | look up, realize what is happening, and say "oh shit"
        
           | cryptoz wrote:
            | Sorry, that doesn't make sense to me, nor does it square
            | with the direct Honda quotes from the article.
           | 
           | > "There is a limit to the capabilities (e.g. recognition
           | capability and control capability) of individual functions of
           | Honda Sensing Elite," reads an official statement. "Please do
           | not overestimate the capabilities of each Honda Sensing Elite
           | function and drive safely while paying constant attention to
           | your surroundings. Please remain in condition where you can
           | respond to the handover request issued by the system, and
           | immediately resume driving upon the handover request."
           | 
            | You cannot be watching a DVD and then immediately resume
            | driving on request. It is not humanly possible, in my
            | opinion. There is a critical context-switching time that
            | even the best people will be slow to handle.
           | 
           | I'll admit I don't understand the L2/L3 difference - and
           | maybe that is part of the messaging and the problem. Your
           | demeaning quotes about Tesla drivers seem to fit 100% with
           | how Honda wants you to behave in their car. They are telling
           | you the system could fail and you need to be ready. But you
           | can't "be ready" without already being ready - if you have to
           | 'get ready' by quickly stopping your attention to the DVD and
           | changing to the road, it may be too late.
        
             | [deleted]
        
       | liquidify wrote:
        | I have a really hard time understanding what the differences
        | are between level 2 and 3. The article says little about the
        | distinction. It claims Tesla isn't there but Honda is, yet it
        | doesn't really say exactly why, or how Honda was able to
        | surpass Tesla. Is this really believable when they don't have
        | a fleet of cars to test on like Tesla does?
        
         | kube-system wrote:
          | Here are the official descriptions:
         | 
         | https://www.sae.org/news/press-room/2018/12/sae-internationa...
         | 
         | TL;DR: In levels 0-2 you are driving the car and the
         | electronics are helping to manipulate the controls. In levels
         | 3-5, you are handing over the responsibility to drive to the
         | car. You do not need to babysit the car to make sure it's not
         | trying to do something stupid.
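          | 
          | As a rough lookup table (my paraphrase of the SAE J3016
          | summary, not its official wording):
          | 
          |     SAE_LEVELS = {
          |         0: "no automation: driver does everything",
          |         1: "assistance: steering OR speed is assisted",
          |         2: "partial: steering AND speed assisted; driver supervises",
          |         3: "conditional: car drives; driver takes over on request",
          |         4: "high: car drives itself within a defined domain",
          |         5: "full: car drives itself everywhere",
          |     }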
        
         | qznc wrote:
          | An accident happens in a traffic jam. A person gets injured.
          | If it was a Tesla, the driver gets sued. If it was a Honda
          | Legend, Honda gets sued.
          | 
          | This changes the legal situation for car manufacturers
          | dramatically and has technical implications. If the power
          | for the controlling device in your car fails, normal cars
          | have the driver as a fallback to brake. Honda needs to have
          | some technical redundancy somehow.
        
           | Spooky23 wrote:
           | Tesla has demonstrated that is not the case.
           | 
            | Tesla acts as a third party. They leverage a tort system
            | not designed to accommodate the type of telemetry they
            | have, to aggressively find fault with the driver's
            | actions.
        
         | maxerickson wrote:
         | Level 2: driver is expected to override mistakes made by
         | vehicle.
         | 
         | Level 3: driver is expected to take control, with brief notice,
         | when vehicle cannot determine correct action.
         | 
         | https://www.nhtsa.gov/technology-innovation/automated-vehicl...
        
         | jowday wrote:
         | The article is terrible at describing it - and really, the
         | entire level scale is terrible - but there is a real important
         | distinction between level 2 and level 3.
         | 
          | Level 2 means that the car automates several different
          | parts of the driving task, but that you must be ready to
          | take over at a moment's notice - you still have to keep
          | your hands on the wheel and your attention focused on the
          | driving task in case you need to intervene. This is what
          | Autopilot and the FSD beta are. You can turn them on, but
          | you have to keep your hands on the wheel and your attention
          | focused. This also means you're legally liable for any
          | accidents, since the onus is still on the driver to
          | intervene.
         | 
          | Level 3 basically means that, in certain situations, you
          | can take your attention off driving completely and play on
          | your phone, read a book, etc. The car might still need you
          | to take over in certain situations, but it will warn you in
          | advance and give you enough time to transition back to
          | driving. Companies have largely shied away from pursuing
          | level 3 systems because it puts the legal burden on them -
          | if the car gets in a crash while operating at 'level 3',
          | the onus is on the company, since the driver was not told
          | to be ready to intervene.
         | 
         | WRT your point about how they could 'surpass' Tesla, two
         | things:
         | 
          | Level 3 can be pretty restrictive - I'm not sure on
          | specifics, but they could offer this on, say, only the ten
          | largest interstates in Japan, and it would still qualify as
          | level 3 self-driving. The 'levels' aren't really a strict
          | progression in capabilities. Level 2, 3, and 4 solutions
          | are really three separate problems to solve, not
          | successively better versions of the same system. It's
          | conceivable to me that a legacy automaker could focus on
          | getting this working on a few major interstates and have a
          | system they're confident in after a few years, since they
          | don't have to worry about nearly as many edge cases as a
          | general highway ADAS solution.
         | 
         | Tesla's data/fleet advantage is overstated in their messaging.
         | Most of the stuff presented in their autonomy day presentation
         | doesn't actually exist (follow some Tesla reverse engineers for
         | more info). Self-driving/ADAS is not, by and large, a data
         | problem.
        
           | goshx wrote:
            | I can already use Autopilot in a traffic jam without
            | having to touch the wheel all the time. It seems to me the
            | difference here is purely legal/regulatory, not the
            | technology.
        
           | gfodor wrote:
            | If what you wrote is the full description, it's kind of
            | funny that we are going to gauge the capability of a
            | self-driving system by how willing the manufacturer is to
            | put itself on the line, instead of by data-driven
            | analysis of the systems involved.
        
             | jimbokun wrote:
             | Data can be faked and fudged.
             | 
             | Legal liability tells you how confident the manufacturer
             | really is in its technology.
        
               | rodgerd wrote:
               | Exactly. I don't care what your CEO tells me about being
               | able to self-deliver in a couple of years. Show me what
               | you'll be liable for.
        
             | jowday wrote:
             | That's how this industry works, unfortunately.
             | 
             | It's widely accepted that the disengagement stats companies
             | are required to report to government agencies are
             | essentially meaningless. There's no real standardization in
              | how any of this stuff is reported. There's no way to do
              | independent, verifiable, data-driven analysis of the
              | systems.
             | 
              | Waymo's the only company that's put out a really robust
              | deep dive into their safety stats - the white paper
              | they put out was really pretty comprehensive.
        
           | durkie wrote:
           | > Self-driving/ADAS is not, by and large, a data problem.
           | 
           | can you explain more?
        
         | cryptoz wrote:
          | The whole article is confusing and self-contradictory on
          | several fronts. Level 3 is described as decision-making,
          | which is not specific enough to differentiate it from any
          | other self-driving initiative in production, including
          | Tesla, the supposed anti-example quoted.
        
         | teruakohatu wrote:
          | Given that it is limited to Japan, I wonder if the Level 3
          | driving will be limited to highways and major cities that
          | have been 3D-mapped by Honda in advance. Self-driving
          | without the kind of data Tesla has is doable if the
          | solution is less generalized.
        
           | liquidify wrote:
           | I've often wondered why we haven't been spending some effort
           | developing optimized test areas which integrate helpful
           | elements for self driving cars into the roads. Obviously, we
           | wouldn't want a car that couldn't go everywhere, but it seems
           | like at a national level, we could implement something into
           | the highway system that would make it easy for interstate
           | self driving vehicles.
        
         | cachvico wrote:
         | As I understand it:
         | 
         | level 1 = expect car to go into ditch if you're not careful
         | 
         | level 2 = hands on steering wheel but it should be ok (Tesla)
         | 
         | level 3 = hands off steering wheel, looking around, expect to
         | have to take over control (Honda's beta test)
         | 
         | level 4 = either taking over control rarely needed, or vehicle
         | has limited ability (e.g. can't drive in snow)
         | 
         | level 5 = you can sleep
         | 
         | The confusion lies around whether we're allowed to watch TV -
         | however I'm fairly sure no country or state has passed
         | legislation allowing us to take our eyes off the road while
         | driving.
        
           | liquidify wrote:
           | Yeah that was an especially confusing part.
        
           | bryanlarsen wrote:
            | My understanding is that level 4 is full self-driving,
            | but not everywhere. It will refuse to drive where it
            | can't, and will stop safely if conditions change. So you
            | can safely sleep.
        
             | anticensor wrote:
              | If it refuses to go there altogether, then it is level
              | 5. If it goes anyway and parks safely while waiting for
              | you to take over, then it is level 4.
        
         | numpad0 wrote:
          | NHTSA/SAE/UNECE refused to listen to Google and defined the
          | technology as advancing through their Level system, which
          | allows progressively slower response times for the
          | driver/operator.
          | 
          | Level 2 requires instantaneous takeover should the system
          | fail, a Level 3 system must tolerate a ~30 s delay, etc.,
          | in feature-checkbox style.
          | 
          | Automakers are fully committed to the Level system, but it
          | does not agree with the reality that algorithms don't fail
          | gradually the way a turbine engine would, so Level 3 is
          | just going nowhere.
          | 
          | Your confusion around Level 3 comes from the fact that
          | Level 3 does not exist. So no worries.
        
           | jowday wrote:
           | +1
           | 
           | I'm not confident L3 actually exists. The only real levels
           | are 2 and 4.
        
             | Sebb767 wrote:
             | > I'm not confident L3 actually exists
             | 
             | It's pretty "easy", actually, by making the take-over
             | optional. I.e. you need the driver to take over within a
             | few seconds if you can't navigate, but you'll fall back to
             | a safe stop if the driver can't take over. So your auto-
             | pilot doesn't need to handle all situations (well), but you
             | can safely be inattentive; it might just cause a sub-
             | optimal response.
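              | 
              | Sketched as handover logic (the state names and grace
              | period are illustrative assumptions, not from any spec):
              | 
              |     def on_cannot_navigate(seconds_until_driver_responds):
              |         GRACE_PERIOD_S = 10.0  # assumed takeover window
              |         if seconds_until_driver_responds <= GRACE_PERIOD_S:
              |             return "HANDOVER: driver resumes control"
              |         # driver unresponsive: minimal-risk fallback
              |         return "FALLBACK: slow down and stop safely"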
        
       | caseyf7 wrote:
       | Traffic jam driving sounds great. Are there any cars out now in
       | the US that can do this? Which one is the best?
        
         | jackson1442 wrote:
         | The Kia Niro EV can drive pretty well on highways at L2, it's
         | actually almost difficult to use at night because you really
         | are just holding the steering wheel.
         | 
         | It doesn't do lane changes or anything fancy, mind you, but I
         | definitely like how it manages traffic jams; while normally I
         | find them stressful, in adaptive cruise mode it's more like
         | just idle time.
         | 
          | You do, of course, have to keep your hands on the wheel, as
          | that is the standard for anything built in the US at the
          | moment. If you got something like OpenPilot you could
          | probably get away with hands-off.
        
         | jiofih wrote:
         | Tesla. As a bonus, it can do it even without the traffic jam!
        
       | yawaworht1978 wrote:
       | I have a general question about self driving tech/software. I
       | assume the data from the cameras/lidars is processed via
       | something like a "stream". How can a company or their devs make
       | sure that stream never enters a buffering state? Similar to a
       | video platform when the viewer gets a loading spinner in the
       | middle of a movie or a stuttering frame rate. Self driving tech
       | processes much more data and it sounds to me like this is bound
       | to happen.
        
         | jeffreygoesto wrote:
          | You can either ensure you have enough processing power to
          | stay within cycle budgets, or prioritize objects and try to
          | process the n most meaningful ones. Getting these priorities
          | wrong can get you into trouble. But at least the framerate
          | is ok =;-). This is normal business in realtime systems.
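          | 
          | A minimal sketch of that budget-and-prioritize pattern (the
          | names, the 20 Hz rate, and the scoring are illustrative,
          | not from any real AV stack):
          | 
          |     import time
          | 
          |     CYCLE_BUDGET_S = 0.050  # e.g. a 20 Hz perception/control loop
          | 
          |     def process_frame(objects, importance, track):
          |         """Track as many objects as the cycle budget allows,
          |         most important first; the rest are dropped this cycle
          |         rather than stalling the loop (no buffering)."""
          |         deadline = time.monotonic() + CYCLE_BUDGET_S
          |         for obj in sorted(objects, key=importance, reverse=True):
          |             if time.monotonic() >= deadline:
          |                 break  # budget exhausted: skip low-priority objects
          |             track(obj)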
        
       | daguava wrote:
       | I wanted to view a page about honda and got visually assaulted by
       | no less than 5 massive and un-dismissable Ford ads before I could
       | read the first words of the article. This site can eat shit and
       | die.
        
         | eps wrote:
         | Use an ad blocker if your religion allows it.
        
       | Danieru wrote:
        | What is notable is level 3. Not the features, but the legal
        | aspect of claiming drivers can now be distracted while this
        | mode is engaged.
        | 
        | Tesla's "Auto"pilot does more, while pretending to enforce an
        | always-alert driver. This has resulted in a couple of wrongful
        | deaths no one at Tesla has been prosecuted for.
        | 
        | Honda, meanwhile, is taking advantage of the Japanese
        | government's legal changes, which make level 3 legally
        | tenable. Drivers need not watch the road, but they must
        | respond when the car asks them to. This means Honda is
        | promising your Honda will not silently drive you into a wall,
        | as Tesla's Autopilot is known to do.
        
         | [deleted]
        
         | ineedasername wrote:
          | It won't silently drive them into a wall, but at 65 mph it
          | might only give them just enough time to look up and panic
          | as they hit the wall, internal alarms blaring for their
          | attention. Lane keeping combined with construction and
          | curves in the road won't be a good mix for distracted
          | drivers in these cars.
        
           | stubish wrote:
            | How far ahead do the sensors work? Maybe there is lots of
            | time at 100 km/h. They certainly need to look far enough
            | ahead to detect a stationary object in the road and come
            | to a complete stop, as does a human. A car coming towards
            | you in the wrong lane at 100 km/h is a different kettle
            | of fish though (as it is for a human).
            | 
            | Over here, the speed limit through road works is 40 km/h.
        
             | ineedasername wrote:
             | In the US on highways the speed limit near construction
             | generally drops from the usual 65mph/105kmh down to 50 or
             | 55mph/85kmh. I think the federal reference is 45mph to
             | 55mph, but in my experience it's usually at the highest
             | level, 55mph, unless there's really active work with crew
             | on the ground. If you're accustomed to 40kmh, I can only
             | imagine how insane these speeds probably seem to you. (And
             | I'm not saying you'd be wrong in that)
             | 
              | Overall, the problem of course comes down to reaction
              | speed. An _attentive_ human with hands on the wheel
              | might respond in 250ms. A computer obviously can respond
              | faster. But some problems won't give you any more than
              | that 250ms to respond: situations where faster sensors
              | or CPU are meaningless in giving the driver more
              | warning, because there is literally only 250ms from
              | onset of stimulus to the time a reaction is needed if
              | catastrophe is to be avoided. Meaning that if the
              | computer _can't_ respond without human intervention, an
              | inattentive driver is screwed, because their 250ms
              | reaction speed assumed they were _already_ paying
              | attention. To break it down in rough estimates:
             | 
              | 1) The car detects & makes its own assessment and
              | raises an alarm. 1ms? 10ms? It almost doesn't matter.
             | 
             | 2) The human receives & registers the alert. This is the
             | initial 250ms reaction time.
             | 
             | 3) Only then can the real work begin: The human, from a
             | cold start, needs to take in the entire situation and
             | understand what's going on. This is more than just reaction
             | speed. This is task switching, which is cognitively more
             | costly, especially for un-practiced tasks, and the task
             | here is rapid analysis & synthesis of a panic inducing
             | situation that most people never experience. 750ms is
             | probably about right, based on research that shows the
             | cognitive burden of task switching can be even higher [0]
             | 
             | So, a full 1000ms, a full second.
             | 
             | This is way, way too slow. As I said at the beginning,
             | there are moments driving where that 250ms is required to
             | avoid catastrophe. When those situations arise then there
             | is simply not enough time for the car to get help of an
             | inattentive driver to respond. It's simply not possible.
             | Either the car makes the decision, or the human has just
             | enough time to go wide-eyed before it's over.
             | 
             | Basically the only situations where the car's notifications
             | can facilitate meaningful human response are those that can
             | be foreseen at least a full 1000ms in advance. Certainly
             | this encompasses a lot of difficult scenarios, but not many
             | of the most dangerous ones. Any time a split-second
             | decision saved the life of a driver? That same driver, if
             | inattentive when the car's alert comes in, is now dead.
             | 
              | There is simply no self-driving car that can safely
              | allow human inattention until the car can make a whole
              | lot more of its own decisions.
             | 
             | In the meantime, if overzealous marketing & breathless
             | hyperbole don't continue to oversell these capabilities,
             | then computer-assisted driving by _attentive_ drivers
             | should greatly improve safety.
             | 
             | [0] https://www.semanticscholar.org/paper/Task-switch-
             | costs-subs...
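              | 
              | Putting the rough numbers above together (the stage
              | timings are the estimates from the text, not
              | measurements):
              | 
              |     alert_s       = 0.010   # car detects and raises the alarm
              |     register_s    = 0.250   # human registers the alert
              |     task_switch_s = 0.750   # cold-start assessment of the scene
              |     total_s = alert_s + register_s + task_switch_s   # ~1.0 s
              | 
              |     speed_mps = 65 * 1609.34 / 3600   # 65 mph is ~29 m/s
              |     print(f"distance covered: {total_s * speed_mps:.0f} m")  # ~29 m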
        
         | Robotbeat wrote:
         | Of course, since they're only releasing 100 of them, that's a
         | low bar. Tesla has sold over a million vehicles.
        
         | judge2020 wrote:
         | > while pretending to enforce an always alert driver.
         | 
          | To be fair amongst automakers, not many other mass-market
          | cars are different (right now) - they only auto-steer if
          | the user applies torque to the wheel periodically, and beep
          | at the driver if torque isn't applied every 10/30 seconds
          | (30 seconds is only set for interstates, from my
          | experience). Only sometime this year will the Ford Mustang
          | Mach-E get an optional $600 upgrade to enable
          | hands-off-the-wheel driving, which uses hardware (already
          | installed) in the steering wheel to monitor the driver's
          | road attentiveness[0].
         | 
         | 0: https://www.slashgear.com/ford-active-drive-assist-
         | mustang-m....
         | 
         | e: no other -> not many other - apparently Cadillac has eye
         | tracking
        
           | ineedasername wrote:
            | How does it determine torque application if you're on a
            | straightaway? Or does it detect hands-on-wheel?
            | (Preferably both.)
        
             | theshrike79 wrote:
              | My Hyundai applies a bit of resistance to the steering
              | wheel, so I need to apply the opposite torque.
              | 
              | It won't drive into a ditch if I let go, but it will
              | start howling after 20-30 seconds of not touching the
              | wheel.
        
             | bootlooped wrote:
              | With Honda you must move the wheel a bit, even if it's
              | not necessary for correcting the car's direction. It
              | does not detect whether your hands are merely on the
              | wheel; that would be nicer, though.
        
           | jjmorrison wrote:
            | I agree - I don't know how it used to work, but today,
            | Teslas are pretty pushy about keeping you alert. I can't
            | imagine blaming my car for crashing when I'm using
            | Autopilot.
        
           | GhostVII wrote:
           | Cadillac supercruise has eye tracking that makes sure you are
           | watching the road.
        
             | BadOakOx wrote:
             | Doug Demuro has a nice video about this:
             | https://youtu.be/AhthZ5rxQJs
        
           | cvak wrote:
            | Since the last Passat facelift, all* VWs have a capacitive
            | steering wheel, where it is enough to just have a hand on
            | it. I also think other carmakers have moved to this in the
            | past ~2 years.
        
         | gfodor wrote:
         | Your last sentence should read:
         | 
         | ... as previous versions of Tesla autopilot have done.
         | 
         | Talking about any one of these systems via their branding as if
         | they are a singular fixed thing (and not a network of software
         | versions and models under continual update) leads to bad mental
         | models.
        
           | Danieru wrote:
           | Here is your "but this time it is fine" Autopilot attempting
           | to murder yet another driver just 9 months ago:
           | https://www.youtube.com/watch?v=LfmAG4dk-rU
        
             | jcims wrote:
             | What a bunch of zombies holy moly.
        
             | ineedasername wrote:
              | It looks like the car at least _tried_ to stop. I'm
              | sure some software engineer at Tesla saw that and said
              | "Yes! A real-world UAT on the latest _don't hit
              | stationary objects_ point release! Now I can close out
              | that story in Jira."
        
             | zhte415 wrote:
             | I suppose we need more captcha images of overturned trucks
             | rather than a brain behind the wheel.
        
             | jiofih wrote:
             | Here is a human driving into 36 cars:
             | https://m.youtube.com/watch?v=u5KgLVh-4Mg
             | 
             | Here is another doing a burnout and crashing into a store,
             | a month ago: https://m.youtube.com/watch?v=tJu5TZ6rwXg
             | 
             | Human-driven truck runs straight into a 100-car pile up, 1
             | month ago: https://m.youtube.com/watch?v=VqNy7v5YekM
             | 
             | Beat that, Tesla!
        
               | bildung wrote:
                | This isn't how statistics works. Teslas are only
                | about 0.1% of all cars in the US, and only a fraction
                | of those have Autopilot on. If Autopilot is as safe
                | as a human driver, we'd expect to see about 10,000
                | cases of human stupidity for every case of Autopilot
                | stupidity.
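                | 
                | The expected ratio in numbers (the 0.1% fleet share is
                | the figure above; the share of Tesla miles with AP
                | engaged is an assumption):
                | 
                |     tesla_share = 0.001   # Teslas as a share of US cars
                |     ap_fraction = 0.1     # assumed share of Tesla miles on AP
                |     expected = (1 - tesla_share) / (tesla_share * ap_fraction)
                |     print(round(expected))  # ~9990, i.e. roughly 10,000:1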
        
               | jiofih wrote:
               | That's exactly my point. There are millions of Teslas on
               | the road, a single incident _nine months ago_ means very
               | little. You will find thousands of these for humans in
               | the past month alone.
               | 
               | Teslas per-mile incident rate is currently about 1/10th
               | that of human drivers.
        
               | bildung wrote:
               | _> There are millions of Teslas on the road,_
               | 
               | It's only 1.4 million ever sold worldwide.
               | 
                |  _> Tesla's per-mile incident rate is currently about
                | 1/10th that of human drivers._
               | 
               | But the reason for that is obvious, isn't it? It is
               | strongly encouraged to activate autopilot in trivial
               | situations (e.g. lane assist on interstates), whereas
               | most dangerous conditions (rain, freezing, bad roads,
               | road works and so on) and their higher incident share is
               | almost completely in the human driver column.
        
               | jiofih wrote:
               | These cars have driven about 4 billion miles with AP on,
               | only six confirmed AP fatalities total since 2016. If it
               | was such a hazard you'd expect to see hundreds or
               | thousands of serious incidents.
        
             | jacquesm wrote:
              | What is really amazing about that video is the people
              | who must obviously be witnessing a really bad crash but
              | just keep on driving.
        
           | jowday wrote:
           | Here's a Tesla hitting a highway wall from last week.
           | 
           | https://youtu.be/XgzpAN4qsmg
        
             | Geee wrote:
             | How can any AI make sense from such a low-quality video?
             | Why isn't there some kind of lens hood that blocks the sun
             | glare?
        
             | goshx wrote:
              | I'd probably hit that wall too. I wonder how many
              | non-Teslas have hit the same spot.
        
             | ineedasername wrote:
             | Yes, but that's not the old version. The latest versions
             | are failing in new and interesting ways. /s
        
             | ogre_codes wrote:
                | The Tesla should have caught that.
                | 
                | But that is the stupidest way of temporarily (I hope)
                | re-laning a highway ever. What were they thinking?
                | 
                | One of those situations where Musk's insistence on
                | not using LIDAR is clearly wrong.
        
               | jiofih wrote:
                | The car has radar, which can see that wall. It's most
                | likely a problem of _what to do_, which humans face
                | just the same - do you swerve into the right lane,
                | potentially causing a much worse accident?
        
               | DaiPlusPlus wrote:
               | Can radar see concrete? I thought radar was only reliable
               | for metal obstacles.
        
               | ogre_codes wrote:
                | I believe Teslas rely 100% on cameras for this type
                | of maneuver. Regardless, it didn't need to swerve
                | into the next lane, just move to the right 6 inches
                | within the same lane.
        
               | aembleton wrote:
               | It could check its side cameras first.
        
               | Shivetya wrote:
                | LIDAR would not have made any difference if the car
                | is not coded to handle the situation - let alone, how
                | in the hell is that even legal? We can only assess
                | what the car saw if he was lucky enough to capture
                | the in-car screen data, if not full telemetry from
                | its recordings.
               | 
               | I own a TM3 and I do not have the "FSD beta" that is in
               | limited testing. I will say that the car-display portion
               | of the UI really does not impress me in terms of what it
               | shows and does not show.
               | 
               | As in, it is damn happy to show cones and garbage cans,
               | but I have had instances where it showed me cones in
               | front of and behind a vehicle yet did not draw the
               | vehicle in between. Same with a floating garbage can,
               | where sometimes the loader was visible but the truck
               | itself would wink in and out.
               | 
               | I really wish people would quit ascribing supernatural
               | abilities, let alone superiority, to LIDAR. Plus I
               | really want to see studies on the impact hundreds of
               | LIDAR-equipped vehicles would have on people, animals,
               | and even insect life. (Then again, how many vehicles in
               | traffic would it take to cause an issue for them?)
        
               | ogre_codes wrote:
               | The only reason I mentioned LIDAR is that I strongly
               | suspect the Tesla didn't avoid the pillar because the
               | angle of the sunlight made it difficult to see.
               | 
               | I fully agree with you that that spot on the road is
               | pretty horrible, which is why I mentioned it. But
               | dealing with crappy roads is the inherent problem any
               | self-driving car has. It's the car's job to not hit
               | things.
               | 
               | I own a Model Y and I feel like Tesla's autopilot gets a
               | worse rap than it deserves. But there is an inherent
               | disconnect between what the car is capable of and what
               | some people presume it is capable of. Tesla doesn't help
               | this with their branding.
               | 
               | Why did the driver just let his car plow into the pillar
               | instead of taking control? Were they even looking up at
               | the time? Would you have let your car drive into a
               | pillar like that? I'd like to think I would have taken
               | over in time, but perhaps not.
        
             | shock-value wrote:
             | Interesting, I think this is the spot:
             | https://www.google.com/maps/@34.0357838,-118.169596,3a,75y,2...
             | 
             | Seems like when the Street View pic was taken this was not
             | a lane. I'm shocked it was converted into one. Seems very
             | dangerous, for human or AI.
        
         | FeepingCreature wrote:
         | > This means Honda is promising your Honda will not silently
         | drive you into a wall as the Tesla autopilot is known to do.
         | 
         | If Tesla had only sold 100 vehicles, I assume it could also
         | have avoided having them drive into walls...
        
           | ecpottinger wrote:
           | Stats matter: Tesla has a million cars on the road. Choose
           | a random 100 and the odds are very favorable that you'd see
           | no accidents.
           | 
           | If Honda had a million cars on the road, I would not be
           | surprised if the accident rate was well over that of even
           | human drivers.
        
             | 2rsf wrote:
             | Maybe, but in Tesla's case it's a known bug/design choice
             | that can probably be consistently reproduced and not just a
             | random edge case.
        
             | upbeat_general wrote:
             | Is that the case? Most people get into an accident at least
             | once in their lifetime and most people don't drive for 100
             | years.
             | 
             | It certainly wouldn't be statistically significant, but
             | over the course of a year or two you'd expect a couple of
             | accidents across 100 cars.
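             | 
             | Rough numbers, assuming the oft-quoted insurance estimate
             | of about one claimed accident per 18 driver-years:
             | 
             |   import math
             | 
             |   cars, years = 100, 2
             |   expected = cars * years / 18  # ~11 accidents expected
             |   # Poisson chance of seeing zero accidents at that rate
             |   print(math.exp(-expected))    # ~1.5e-05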
        
       ___________________________________________________________________
       (page generated 2021-03-05 23:03 UTC)